This is a Llama 2 architecture model trained on the FineWeb dataset. It has roughly 500 million parameters, uses the Llama 2 tokenizer, and was trained with Andrej Karpathy's llama2.c code.
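As a rough sanity check on the "~500 million parameters" figure, the size of a Llama-2-style transformer can be estimated from its dimensions. The sketch below uses the standard Llama parameter-count formula (MHA attention with four `dim × dim` projections, a SwiGLU MLP with three projections, RMSNorms) with illustrative dimensions chosen only to land near 500M; they are assumptions, not this model's published config.

```python
# Back-of-the-envelope parameter count for a Llama-2-style transformer.
# The dimensions passed in below are illustrative ASSUMPTIONS, not the
# actual configuration of this model.
def llama_param_count(vocab, dim, n_layers, ffn_dim, tied_embeddings=True):
    emb = vocab * dim                              # token embedding table
    head = 0 if tied_embeddings else vocab * dim   # untied output projection
    attn = 4 * dim * dim                           # Wq, Wk, Wv, Wo (no biases)
    mlp = 3 * dim * ffn_dim                        # gate, up, down (SwiGLU)
    norms = 2 * dim                                # two RMSNorms per block
    return emb + head + n_layers * (attn + mlp + norms) + dim  # + final norm

total = llama_param_count(vocab=32000, dim=1280, n_layers=24, ffn_dim=3456)
print(f"~{total / 1e6:.0f}M parameters")  # prints "~517M parameters"
```

With the Llama tokenizer's 32,000-token vocabulary, a hidden size of 1280 across 24 layers gives a model in the ~500M range.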