Converted from mistralai/Mistral-7B-v0.1 and quantized to 4 bits.
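The card does not include the conversion script, so as a rough illustration of what 4-bit quantization does to the weights, here is a minimal NumPy sketch of symmetric blockwise 4-bit quantization (the block size, symmetric scheme, and function names are assumptions, not the actual method used for this repository):

```python
import numpy as np

def quantize_q4(weights, block=32):
    # Symmetric 4-bit quantization: each block of `block` values shares
    # one fp32 scale, and values are stored as integers in [-8, 7].
    w = weights.reshape(-1, block)
    scale = np.abs(w).max(axis=1, keepdims=True) / 7.0
    scale[scale == 0] = 1.0  # avoid division by zero for all-zero blocks
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize_q4(q, scale, shape):
    # Recover an fp32 approximation of the original weights.
    return (q.astype(np.float32) * scale).reshape(shape)

w = np.random.randn(4, 64).astype(np.float32)
q, s = quantize_q4(w)
w_hat = dequantize_q4(q, s, w.shape)
```

Each 4-bit value plus a per-block scale replaces a 16- or 32-bit weight, which is where the roughly 4x size reduction over fp16 comes from; the reconstruction `w_hat` is close to `w` but not exact.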
Downloads last month: 6

Inference API (serverless) is not available; the repository is disabled.
Model tree for BricksDisplay/Mistral-7B-v0.1-q4
- Base model: mistralai/Mistral-7B-v0.1
- Quantized: this model