Update README.md
README.md
@@ -8,6 +8,16 @@ tags:
 - nsfw
 license: cc-by-nc-4.0
 ---
+
+exl2 version of [Norquinal/PetrolLM-CollectiveCognition](https://huggingface.co/Norquinal/PetrolLM-CollectiveCognition)
+used dataset : [wikitext](https://huggingface.co/datasets/wikitext)
+quantized by IHaBiS
+
+command : python convert.py -i models/Norquinal_PetrolLM-CollectiveCognition -o Norquinal_PetrolLM-CollectiveCognition-temp -cf Norquinal_PetrolLM-CollectiveCognition-6bpw-h8-exl2 -c 0000.parquet -l 4096 -b 6 -hb 8 -ss 4096 -m Norquinal_PetrolLM-CollectiveCognition_measurement.json
+
+
+Below this sentence is original model card
+
 ## What is PetrolLM-Claude-Chat?
 PetrolLM-Claude-Chat is the [CollectiveCognition-v1.1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B) model with the [PetrolLoRA](https://huggingface.co/Norquinal/PetrolLoRA) applied.
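For convenience, below is a minimal sketch of loading the resulting 6 bpw (head bits 8) exl2 weights with the exllamav2 Python API. The local directory name, prompt, and sampling values are illustrative assumptions and are not part of this commit:

```python
# Minimal exllamav2 loading sketch (assumed local path and sampling settings).
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "Norquinal_PetrolLM-CollectiveCognition-6bpw-h8-exl2"  # path to the downloaded quant
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)                # load weights, splitting across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8                 # assumed values; tune for your use case
settings.top_p = 0.9

print(generator.generate_simple("Once upon a time,", settings, 200))
```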