unable to use custom inference endpoint

#9
by HonestAnnie - opened

"The checkpoint you are trying to load has model type gemma2 but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date."

Pinning a newer Transformers release via a `requirements.txt` should fix it:

```
transformers>=4.44.0
```

Reference: https://huggingface.co/docs/inference-endpoints/guides/custom_dependencies

I've created a PR: https://huggingface.co/BAAI/bge-multilingual-gemma2/discussions/11
