The model path in the example is wrong; it should be Q4_0-00001-of-00001 instead of 00009.
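For reference, a download consistent with the corrected filename could look like the sketch below. It uses the huggingface_hub Python API rather than the CLI shown later in the thread, and it assumes the repo really does expose the file at Q4_0/Q4_0-00001-of-00001.gguf, as the title states.

```python
# Sketch only: fetch the single-split Q4_0 GGUF named in the title.
# Assumes the file lives at Q4_0/Q4_0-00001-of-00001.gguf in the repo;
# adjust the filename if the repo layout differs.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="LiteLLMs/Meta-Llama-3-8B-GGUF",
    filename="Q4_0/Q4_0-00001-of-00001.gguf",
    local_dir=".",
)
print(model_path)  # local path of the downloaded file
```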

#1 by henrywang0314

The example command is:

```
huggingface-cli download LiteLLMs/Meta-Llama-3-8B-GGUF Q4_0/Q4_0-00001-of-00009.gguf --local-dir . --local-dir-use-symlinks False
```
but when I download and use the Q4_0-00001-of-00001 model instead, an AssertionError occurs:
File "/home/apteam/.local/lib/python3.10/site-packages/llama_cpp/llama.py", line 340, in init
assert self.model is not None
AssertionError
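The assert at llama.py line 340 trips when llama.cpp hands back a null model handle, which usually means the GGUF file could not be loaded, e.g. a truncated or partial download, or a split whose other shards are missing. A rough sanity check before constructing Llama, assuming the local path implied by the download command above, might be:

```python
# Rough sanity check before loading; the local path is an assumption based on
# the download command above (--local-dir . keeps the Q4_0/ prefix).
import os

from llama_cpp import Llama

model_path = "./Q4_0/Q4_0-00001-of-00001.gguf"

# A truncated or partial file is a common cause of the AssertionError above:
# llama.cpp cannot parse it, Llama.__init__ receives a null model pointer, and
# `assert self.model is not None` fails.
size_gb = os.path.getsize(model_path) / 1e9
print(f"GGUF size: {size_gb:.2f} GB")  # a Q4_0 8B model should be roughly 4-5 GB

llm = Llama(model_path=model_path, n_ctx=2048)
print(llm("Q: What is 2 + 2? A:", max_tokens=8)["choices"][0]["text"])
```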
