| Name | Quant method | Bits Per Weight | Size | Max RAM/VRAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [normistral-7b-warm-instruct.Q3_K_M.gguf](https://huggingface.co/norallm/normistral-7b-warm/blob/main/normistral-7b-warm-instruct.Q3_K_M.gguf) | Q3_K_M | 3.89 | 3.28 GB | 5.37 GB | very small, high loss of quality |
| [normistral-7b-warm-instruct.Q4_K_M.gguf](https://huggingface.co/norallm/normistral-7b-warm/blob/main/normistral-7b-warm-instruct.Q4_K_M.gguf) | Q4_K_M | 4.83 | 4.07 GB | 6.16 GB | medium, balanced quality |
| [normistral-7b-warm-instruct.Q5_K_M.gguf](https://huggingface.co/norallm/normistral-7b-warm/blob/main/normistral-7b-warm-instruct.Q5_K_M.gguf) | Q5_K_M | 5.67 | 4.78 GB | 6.87 GB | large, very low quality loss |
| [normistral-7b-warm-instruct.Q6_K.gguf](https://huggingface.co/norallm/normistral-7b-warm/blob/main/normistral-7b-warm-instruct.Q6_K.gguf) | Q6_K | 6.56 | 5.54 GB | 7.63 GB | very large, extremely low quality loss |
| [normistral-7b-warm-instruct.Q8_0.gguf](https://huggingface.co/norallm/normistral-7b-warm/blob/main/normistral-7b-warm-instruct.Q8_0.gguf) | Q8_0 | 8.50 | 7.17 GB | 9.26 GB | very large, extremely low quality loss |

### How to run from Python code
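As a minimal sketch, one of the GGUF files from the table above could be loaded with the `llama-cpp-python` package (an assumption here; any GGUF-capable runtime works). The file name, context size, and prompt below are illustrative, not prescriptions from this repository:

```python
# Minimal sketch, assuming `pip install llama-cpp-python` and that the
# Q4_K_M file from the table above has already been downloaded locally.
from llama_cpp import Llama

llm = Llama(
    model_path="normistral-7b-warm-instruct.Q4_K_M.gguf",  # any file from the table
    n_ctx=2048,       # context window size
    n_gpu_layers=0,   # set > 0 to offload layers to GPU (requires a GPU build)
)

output = llm(
    "Hva er hovedstaden i Norge?",  # illustrative Norwegian prompt
    max_tokens=128,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```

The larger quants in the table trade RAM for lower quality loss; the "Max RAM/VRAM required" column indicates which files fit a given machine. The exact prompt template the instruct-tuned model expects is not shown here; consult the model card.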