VM Operation issue

#21
by Amit2balag - opened

Hi Folks,

I had an application deployed with intfloat/multilingual-e5-large as the embedding model; after deployment in the VM it worked as well as it did locally. Now that I have intfloat/multilingual-e5-large-instruct configured as the embedding model, the VM is not even able to load it. Going through the documentation and model card, neither the memory consumption nor the on-disk model size is stated anywhere. Can anyone help with these two pieces of information?

My VM has 4 GB RAM and 50 GB storage. Please suggest the required VM configuration. I'm using llama3-8b for text generation.
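A rough way to estimate the weight memory yourself (a sketch, not an official figure: it assumes the instruct model shares the roughly 560M-parameter scale of its xlm-roberta-large backbone, and counts weights only, not activations or tokenizer overhead):

```python
# Approximate weight memory for a transformer checkpoint, given a
# parameter count and the byte width of the stored dtype.
def model_memory_gib(num_params: int, bytes_per_param: int = 4) -> float:
    """Return approximate weight memory in GiB."""
    return num_params * bytes_per_param / 1024**3

# Assumed parameter count (xlm-roberta-large scale); not from the model card.
params = 560_000_000

print(f"fp32 weights: {model_memory_gib(params, 4):.2f} GiB")
print(f"fp16 weights: {model_memory_gib(params, 2):.2f} GiB")
```

By this estimate the fp32 weights alone approach 2 GiB, so loading the embedding model alongside anything else on a 4 GB VM would be tight even before runtime overhead is counted.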

Regards
