llamacpp 2048 context tokens

#7
by Hasaranga85 - opened

I get the following warning for nomic-embed-text-v1.5.Q8_0.gguf:

main: warning: model was trained on only 2048 context tokens (8192 specified)

Command: llama-embedding.exe -m nomic-embed-text-v1.5.Q8_0.gguf -c 8192 -b 8192 --rope-scaling yarn --rope-freq-scale .75 -f vector_question.txt --embd-output-format json
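For context, the JSON produced by --embd-output-format json is typically consumed downstream, e.g. to compare two embedded texts by cosine similarity. Below is a minimal Python sketch of that step; the sample JSON shape is a hypothetical stand-in for illustration, not necessarily the exact schema llama-embedding emits:

```python
import json
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical stand-in for the tool's JSON output; the real schema may differ,
# so adjust the keys below to whatever your llama.cpp build actually writes.
raw = '{"data": [{"embedding": [0.1, 0.2, 0.3]}, {"embedding": [0.1, 0.2, 0.31]}]}'
parsed = json.loads(raw)
vectors = [item["embedding"] for item in parsed["data"]]
print(cosine_similarity(vectors[0], vectors[1]))
```

Note that if the 8192-token context is not actually effective (i.e. the rope-scaling flags are being ignored), embeddings of long inputs would silently degrade, which is why the warning is worth resolving.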
