This model is in exl2 format.
Yi-34b-200K-alpaca-rpv3-4bpw-hb6-exl2
- base model: Yi-34B-200K
- LoRA: Yi-34b-alpaca-cot-lora
- LoRA: limarpv3-yi-llama-34b-lora
- Instruction template: Alpaca (prompt format shown below)
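For reference, a standard Alpaca-style prompt looks like the following; this is the common Alpaca preamble, so adjust it to whatever your frontend's Alpaca template actually uses:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Response:
```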
Description
- This is a test quantization for ExLlamaV2
- 4.15 bpw, quantized with:
  `python convert.py -i Yi-34b-200K-alpaca-rpv3 -c exl2/0000.parquet -o Yi-34b-200K-alpaca-rpv3-4bpw-hb6-exl2 -hb 6 -l 4096 -b 4.15`
- See the ExLlamaV2 conversion documentation for details on `convert.py`
- calibration dataset: WikiText-2-v1
- For oobabooga/text-generation-webui, add `--trust-remote-code` to CMD_FLAGS.txt and use the ExLlamav2_HF loader to load the model (see the loading sketch below)
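Outside the webui, the quantized weights can also be loaded directly with the exllamav2 Python package. The snippet below is a minimal sketch, not a supported recipe: the local model path, example instruction, and sampling settings are assumptions, and the exact API may vary between exllamav2 versions.

```python
# Minimal sketch: load the exl2-quantized model with the exllamav2 package.
# Model path, prompt, and sampling settings below are placeholder assumptions.
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "Yi-34b-200K-alpaca-rpv3-4bpw-hb6-exl2"  # local path to the quantized weights
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

# Alpaca-style prompt, matching the instruction template noted above.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nSummarize the plot of Hamlet in two sentences.\n\n"
    "### Response:\n"
)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple(prompt, settings, 200))
```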