Can you provide Quantized (GGUF) versions of this model?
#4 opened by Impulse2000
Hello, I really love this model! Could you provide quantized versions (2-, 4-, 6-, and 8-bit, plus fp16 quants) so that I can run it on my machines?
@Impulse2000 I have some here:
https://huggingface.co/bartowski/Starling-LM-7B-beta-GGUF
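For anyone landing here later: once you have picked a quant file from that repo, a typical way to run it locally is with llama.cpp. This is a minimal sketch; the exact filename and the Q4_K_M choice below are assumptions, so check the repo's file list before copying it.

```shell
# Download one quant file from the repo (Q4_K_M is a common
# size/quality trade-off; the exact filename is an assumption --
# verify it against the repo's "Files" tab).
huggingface-cli download bartowski/Starling-LM-7B-beta-GGUF \
  Starling-LM-7B-beta-Q4_K_M.gguf --local-dir .

# Run it with the llama.cpp CLI (built from the llama.cpp sources):
# -m selects the model file, -p the prompt, -n the number of tokens.
./main -m Starling-LM-7B-beta-Q4_K_M.gguf -p "Hello" -n 64
```

Smaller quants (Q2/Q4) trade quality for memory; fp16 is essentially unquantized and needs the most RAM or VRAM.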
Thanks!
Impulse2000 changed discussion status to closed