Locutusque committed on
Commit • 1e7d056
1 Parent(s): ab72a2d
Update README.md

README.md CHANGED
@@ -1,13 +1,4 @@
 ---
-tags:
-- merge
-- mergekit
-- lazymergekit
-- M4-ai/tau-0.5B
-- Qwen/Qwen1.5-0.5B
-base_model:
-- M4-ai/tau-0.5B
-- Qwen/Qwen1.5-0.5B
 license: cc-by-sa-4.0
 datasets:
 - Locutusque/UltraTextbooks-2.0
@@ -29,7 +20,7 @@ inference:
 - **Dataset:** UltraTextbooks-2.0
 - **Model Size:** 0.5B parameters
 - **Model Type:** Language Model
-- **Training Procedure:** Further pre-training of Qwen1.5-0.5B on UltraTextbooks-2.0
+- **Training Procedure:** Further pre-training of Qwen1.5-0.5B on UltraTextbooks-2.0.
 
 ## Model Use
 tau-0.5B is designed to be a general-purpose language model with enhanced capabilities in the domains of machine learning, mathematics, and coding. It can be used for a wide range of natural language processing tasks, such as: