llama.cpp CPU backend crashes

#1
by mtasic85 - opened

I just reported a bug when running this model with the CPU backend.
Please see the reported issue here:
https://github.com/ggerganov/llama.cpp/issues/9315

That's quite interesting; usually when this happens I fail to make the imatrix, but there must be some other part missing.
