
Model file not found

#6
by TPelc - opened

I'm running the basic example:

from ctransformers import AutoModelForCausalLM
llm = AutoModelForCausalLM.from_pretrained("TheBloke/Falcon-180B-Chat-GGUF", model_file="falcon-180b-chat.q4_K_M.gguf", model_type="falcon", gpu_layers=2)
print(llm("AI is going to"))

and getting: Model file 'falcon-180b-chat.q4_K_M.gguf' not found in '/root/.cache/huggingface/hub/models--TheBloke--Falcon-180B-Chat-GGUF/snapshots/669d5768e34820b453b1a966ef16d0c80bb7a914'

I was able to run TheBloke/Llama-2-70B-chat-GGUF with the same configuration.
What could the issue be?


Ah yeah, sorry, those ctransformers examples don't work out of the box for this model because the model files are split and require manual rejoining. Download the parts you want manually, then check the README for instructions on how to join them. After that you can load the joined file from a local path with ctransformers.
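
Roughly, once the parts are downloaded, something like the sketch below should work. The split filenames used here are placeholders, not the repo's exact names; check the README for the real part names and the recommended join command (e.g. cat on Linux/macOS).

from pathlib import Path
import shutil
from ctransformers import AutoModelForCausalLM

# Assumed placeholder pattern for the downloaded split parts -- see the README for the real names.
parts = sorted(Path(".").glob("falcon-180b-chat.Q4_K_M.gguf-split-*"))
joined = Path("falcon-180b-chat.Q4_K_M.gguf")

# Concatenate the parts in order into a single GGUF file, streaming to avoid loading them into memory.
with joined.open("wb") as out:
    for part in parts:
        with part.open("rb") as src:
            shutil.copyfileobj(src, out)

# Load the joined file from the local path instead of the Hub repo.
llm = AutoModelForCausalLM.from_pretrained(str(joined), model_type="falcon", gpu_layers=2)
print(llm("AI is going to"))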

I see
Thank you for the explanation and all your efforts to serve the community.
