Gemma-2-27B-It?

#1 by christopherthompson81 - opened

Any plans to do the same with Gemma-2-27B-It?

UCLA Artificial General Intelligence Lab org
•
edited Jun 30

@christopherthompson81 We are experiencing some generation issues with Gemma-2-27B-It and vllm. We saw similar issues reported in this discussion: https://huggingface.co/google/gemma-2-27b-it/discussions/10

We are planning to do the 27B as soon as stable generation for Gemma-2-27B-It is available in a transformers and vllm release.
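For anyone who wants to check whether generation looks sane on their own setup once new releases land, here is a minimal sketch using the transformers chat pipeline (the bf16 dtype, prompt, and generation settings are placeholder assumptions, not a fixed test configuration):

```python
# Minimal sanity check for Gemma-2-27B-It generation with transformers.
# Assumes access to google/gemma-2-27b-it and enough GPU memory for bf16.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="google/gemma-2-27b-it",
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a haiku about open models."}]
out = pipe(messages, max_new_tokens=64, do_sample=False)

# The pipeline returns the full chat; the last message is the model's reply.
print(out[0]["generated_text"][-1]["content"])
```

If the output looks degenerate, you are probably still hitting the generation issues discussed in the linked thread.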

Hey there,

Any word on the 27B? The 9B is absolutely phenomenal; I can't wait to see what you do with the 27B :)

Any updates? I love the SPPO model for Gemma 2 9B; I've been waiting for a decent 27B finetune to take the spot as my favorite model.

Hi! Gemma-2-27B is already working well on vllm with a 4k context length, and sglang supports 8k. Should we wait for the SPPO 27B version?
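For reference, a rough sketch of offline generation with vLLM at a 4k context might look like this (the FlashInfer attention-backend variable and the sampling settings are assumptions about what a working setup needed at the time, not something confirmed in this thread; the chat template is skipped for brevity):

```python
# Rough sketch: offline Gemma-2-27B-It generation with vLLM at 4k context.
import os
# Assumption: the FlashInfer backend was reportedly needed for Gemma 2's
# logit soft-capping in early vLLM builds; set it before importing vllm.
os.environ.setdefault("VLLM_ATTENTION_BACKEND", "FLASHINFER")

from vllm import LLM, SamplingParams

llm = LLM(model="google/gemma-2-27b-it", max_model_len=4096)
params = SamplingParams(temperature=0.7, top_p=0.9, max_tokens=256)

outputs = llm.generate(["Explain SPPO in one paragraph."], params)
for o in outputs:
    print(o.outputs[0].text)
```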
