---
license: other
inference: false
base_model: mistralai/Mistral-Small-Instruct-2409
base_model_relation: quantized
tags:
- p22
- ov
- llmware-chat
- green
---
# mistral-small-instruct-2409-ov
**mistral-small-instruct-2409-ov** is an OpenVINO int4 quantized version of mistral-small-instruct-2409, a 22B-parameter general-purpose chat/instruct research model from Mistral.

This model is licensed under the Mistral Research License (MRL), which can be found here. Per this license, the model is available for research use only.

We are including it in this collection to test the speed and quality of a 4-bit OpenVINO-quantized 22B-parameter model running locally on an AI PC.
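As a rough sketch of what that local test loop can look like, the snippet below downloads the quantized weights and runs them through the OpenVINO GenAI pipeline. The repo id `llmware/mistral-small-instruct-2409-ov`, the device string, and the prompt are illustrative assumptions, not part of this card.

```python
# Minimal sketch: run the int4 OpenVINO build locally with openvino-genai.
# Assumes `pip install openvino-genai huggingface_hub` and that the repo id
# below is where this quantized model is hosted (assumption for illustration).
from huggingface_hub import snapshot_download
import openvino_genai as ov_genai

# Download the OpenVINO IR files and tokenizer to a local folder
model_path = snapshot_download("llmware/mistral-small-instruct-2409-ov")

# "GPU" targets the integrated GPU on an AI PC; use "CPU" or "NPU" as available
pipe = ov_genai.LLMPipeline(model_path, "GPU")

print(pipe.generate("What are the key benefits of running a 22B model locally?",
                    max_new_tokens=256))
```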
## Model Description
- Developed by: mistralai
- Quantized by: llmware
- Model type: mistral-small-instruct-2409-ov
- Parameters: 22 billion
- Model Parent: mistralai/mistral-small-instruct-2409
- Language(s) (NLP): English
- License: MRL (research use only; not for commercial use)
- Uses: General use
- Quantization: int4
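For readers who prefer the Hugging Face interface over the OpenVINO GenAI pipeline, a possible alternative loading path via optimum-intel is sketched below; again, the repo id and prompt are illustrative assumptions.

```python
# Alternative sketch: load the same int4 OpenVINO IR through optimum-intel
# (pip install optimum[openvino]); repo id is an assumption for illustration.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

repo_id = "llmware/mistral-small-instruct-2409-ov"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = OVModelForCausalLM.from_pretrained(repo_id)  # defaults to CPU device

inputs = tokenizer("Explain int4 weight quantization in two sentences.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```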