# Model Summary
Llama3-8B-COIG-CQIA is an instruction-tuned language model for Chinese and English users, built on Meta-Llama-3-8B-Instruct, with abilities such as role-playing and tool use.
- Developed by: [Wenfeng Qiu](https://github.com/summit4you)
- License: [Llama-3 License](https://llama.meta.com/llama3/license/)
- Base Model: Meta-Llama-3-8B-Instruct
- Model Size: 8.03B
- Context length: 8K
# 1. Introduction
Training framework: [unsloth](https://github.com/unslothai/unsloth)
Training details:
- epochs: 1
- learning rate: 2e-4
- learning rate scheduler type: linear
- warmup_steps: 5
- cutoff len (i.e. context length): 2048
- global batch size: 2
- fine-tuning type: full parameters
- optimizer: adamw_8bit
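The hyperparameters above could be wired into an unsloth + TRL run roughly as follows. This is a hypothetical sketch under the usual unsloth notebook pattern, not the card's actual training script: the dataset loading and prompt-formatting step is omitted, and `dataset` is a placeholder for a COIG-CQIA split already formatted into a `text` column.

```python
# Hypothetical training sketch (assumptions: base checkpoint name, output_dir,
# and `dataset` are placeholders; hyperparameters are taken from the card).
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-Instruct",  # assumed checkpoint name
    max_seq_length=2048,                       # cutoff len from the card
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,        # placeholder: a formatted COIG-CQIA split
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        num_train_epochs=1,
        learning_rate=2e-4,
        lr_scheduler_type="linear",
        warmup_steps=5,
        per_device_train_batch_size=2,  # global batch size 2 on one device
        optim="adamw_8bit",
        output_dir="outputs",
    ),
)
trainer.train()
```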
# 2. Usage
Use the `model-unsloth.gguf` or `model-unsloth-Q4_K_M.gguf` file with `llama.cpp`, or with a UI-based system such as `GPT4All` (you can install GPT4All from [here](https://gpt4all.io/index.html)).
Here is an example using the `llama-cpp-python` bindings:
```python
from llama_cpp import Llama

model = Llama(
    "/Your/Path/To/Llama3-8B-COIG-CQIA.Q8_0.gguf",
    verbose=False,
    n_gpu_layers=-1,  # offload all layers to the GPU
)

system_prompt = "You are a helpful assistant."

def generate_response(_model, _messages, _max_tokens=8192):
    _output = _model.create_chat_completion(
        _messages,
        stop=["<|eot_id|>", "<|end_of_text|>"],
        max_tokens=_max_tokens,
    )["choices"][0]["message"]["content"]
    return _output

# Example conversation
messages = [
    {
        "role": "system",
        "content": system_prompt,
    },
    {"role": "user", "content": "你是谁?"},  # "Who are you?"
]

print(generate_response(_model=model, _messages=messages), end="\n\n\n")
```
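For multi-turn chat, the assistant's reply must be appended to the message list before the next user turn. Below is a minimal sketch of that history handling; the model call itself is replaced by a placeholder string so only the bookkeeping is shown, and `add_turn` is a helper introduced here for illustration.

```python
# Sketch of multi-turn history handling (no model required).
def add_turn(messages, role, content):
    """Return a new message list with one more chat turn appended."""
    return messages + [{"role": role, "content": content}]

history = [{"role": "system", "content": "You are a helpful assistant."}]
history = add_turn(history, "user", "你是谁?")  # "Who are you?"

# In practice, obtain the reply by calling the chat-completion helper above
# with `history`; a placeholder string stands in for it here.
reply = "我是 Llama3-8B-COIG-CQIA。"  # "I am Llama3-8B-COIG-CQIA."
history = add_turn(history, "assistant", reply)
history = add_turn(history, "user", "你能做什么?")  # "What can you do?"
```

Keeping the full `history` list and passing it to every `create_chat_completion` call is what gives the model its conversational context.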
The model card's YAML front matter:

```yaml
---
license: mit
datasets:
- m-a-p/COIG-CQIA
language:
- zh
- en
metrics:
- accuracy
pipeline_tag: text2text-generation
tags:
- finance
- legal
- medical
- code
- biology
---
```