LM Studio / ollama support

#3 · opened by burdie

Can this model be used with LM Studio or ollama?

Yes, there is a GGUF version you can use: https://huggingface.co/mradermacher/Flux-Prompt-Enhance-GGUF
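For anyone who wants to sanity-check the GGUF outside of LM Studio or Ollama, here is a minimal llama-cpp-python sketch. The filename and the `enhance prompt:` prefix are assumptions based on the base model's usage, and whether your llama.cpp / llama-cpp-python build supports this model's architecture may depend on the version you have installed:

```python
# Minimal sketch: load the Flux-Prompt-Enhance GGUF with llama-cpp-python.
# Assumptions: the filename below matches the quant you downloaded from
# mradermacher/Flux-Prompt-Enhance-GGUF, and your llama.cpp build supports
# the model's architecture.
from llama_cpp import Llama

llm = Llama(
    model_path="Flux-Prompt-Enhance.Q8_0.gguf",  # assumed filename, adjust to your download
    n_ctx=512,
)

# The base model appears to expect a plain text prefix, not a chat template.
prompt = "enhance prompt: a cat sitting on a windowsill"
out = llm(prompt, max_tokens=256, temperature=0.7)
print(out["choices"][0]["text"])
```

If this loads and generates, the same file should work in any frontend built on the same llama.cpp version; if it fails at load time, the frontends will likely fail too, regardless of the preset.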

@gokaygokay Thanks for the quick reply. I actually downloaded this version, but I could not find the correct preset to use in LM Studio.

Yeah, I did not try it myself; I just saw the GGUF version and thought it might work with LM Studio since it's llama.cpp-based.

Thanks for your effort. I gave Ollama a try too, without any luck; it returns an error.

I guess the preset must be set somehow for it to understand how it should handle the model/system/user combination.

I tried with Msty as well; I think it maybe needs a JSON config to understand how to talk to the model?
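For what it's worth, the preset confusion may be because the base model (I believe gokaygokay/Flux-Prompt-Enhance) is a seq2seq text2text model, so there is no system/user chat template to set; it seems to just take an `enhance prompt: ` prefix. If LM Studio, Ollama, or Msty keep erroring out, a fallback is to run the original (non-GGUF) weights with transformers. The model id and prefix below are assumptions taken from the base repo:

```python
# Fallback sketch: run the original weights with transformers instead of the GGUF.
# Assumptions: the base repo is gokaygokay/Flux-Prompt-Enhance and it is a
# text2text (seq2seq) model driven by an "enhance prompt: " prefix rather than
# a system/user chat template.
from transformers import pipeline

enhancer = pipeline(
    "text2text-generation",
    model="gokaygokay/Flux-Prompt-Enhance",
)

short_prompt = "a cat sitting on a windowsill"
result = enhancer("enhance prompt: " + short_prompt, max_length=256)
print(result[0]["generated_text"])
```

If that works while the GGUF route does not, the problem is most likely runtime support for the architecture rather than a missing preset or JSON template.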
