
EXL2 quantization of maywell/PiVoT-0.1-early

Branches (see the download sketch after this list):

- main : 8bpw h8
- 6bh8 : 6bpw h8
- 4bh8 : 4bpw h8
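
Each quant lives on its own branch, so a specific bitrate can be fetched by revision. A minimal sketch using huggingface_hub (the repo id IHaBiS/PiVoT-0.1-early-exl2 and the branch names are taken from this card; swap the revision for the quant you want):

```python
from huggingface_hub import snapshot_download

# Grab the 6bpw/h8 quant by downloading the "6bh8" branch (revision).
local_dir = snapshot_download(
    repo_id="IHaBiS/PiVoT-0.1-early-exl2",
    revision="6bh8",
)
print(local_dir)  # path to the downloaded model files
```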

VMware/open-instruct was used as the calibration dataset.
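
The calibration parquet can be fetched from that dataset repo with huggingface_hub. A minimal sketch, with the caveat that the in-repo filename below is an assumption; check the repo's file listing for the actual path ("0000.parquet" in the commands below is simply the local file the conversion script was pointed at):

```python
from huggingface_hub import hf_hub_download

# Download one calibration parquet from the VMware/open-instruct dataset repo.
parquet_path = hf_hub_download(
    repo_id="VMware/open-instruct",
    filename="0000.parquet",  # hypothetical in-repo path, adjust as needed
    repo_type="dataset",
)
print(parquet_path)
```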

Quantization settings (the 8bpw pass writes measurement.json, which the 6bpw and 4bpw passes reuse via -m):

```sh
python convert.py -i models/maywell_PiVoT-0.1-early -o PiVoT-0.1-early-temp -cf PiVoT-0.1-early-8bpw-h8-exl2 -c 0000.parquet -l 4096 -b 8 -hb 8
python convert.py -i models/maywell_PiVoT-0.1-early -o PiVoT-0.1-early-temp2 -cf PiVoT-0.1-early-6bpw-h8-exl2 -c 0000.parquet -l 4096 -b 6 -hb 8 -m PiVoT-0.1-early-temp/measurement.json
python convert.py -i models/maywell_PiVoT-0.1-early -o PiVoT-0.1-early-temp3 -cf PiVoT-0.1-early-4bpw-h8-exl2 -c 0000.parquet -l 4096 -b 4 -hb 8 -m PiVoT-0.1-early-temp/measurement.json
```
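
Once a quant is downloaded, it can be loaded for inference with exllamav2. A minimal sketch following the library's example scripts (class and method names may differ between exllamav2 versions; the model directory is whatever local path the download produced):

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Point model_dir at a downloaded quant directory (local path, adjust as needed).
config = ExLlamaV2Config()
config.model_dir = "PiVoT-0.1-early-8bpw-h8-exl2"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # allocate cache as layers are loaded
model.load_autosplit(cache)               # split weights across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8

print(generator.generate_simple("Hello, my name is", settings, 64))
```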

Below this line is the original readme.

PiVoT-0.1-early


Model Details

Description

PiVoT is a fine-tuned model based on Mistral 7B. It is a variation of Synatra v0.3 RP, which has shown decent performance.

The OpenOrca dataset was used to fine-tune this PiVoT variation. Arcalive Ai Chat Chan log 7k, ko_wikidata_QA, kyujinpy/OpenOrca-KO, and other datasets were used for the base model.

Follow me on Twitter: https://twitter.com/stablefluffy

Consider supporting me in making these models alone: https://www.buymeacoffee.com/mwell or with a Runpod Credit Gift 💕

Contact me on Telegram: https://t.me/AlzarTakkarsen

