Update README.md
README.md CHANGED
@@ -1,9 +1,20 @@
 ---
-license:
+license: cc
 language:
 - en
 ---
 
+# Update 2023-12-19
+
+In light of the [dataset contamination issue among the merged models](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/474)
+raised by the community in recent days, in particular
+[berkeley-nest/Starling-LM-7B-alpha](https://huggingface.co/berkeley-nest/Starling-LM-7B-alpha),
+[Q-bert/MetaMath-Cybertron-Starling](https://huggingface.co/Q-bert/MetaMath-Cybertron-Starling), and
+[janai-hq/trinity-v1](https://huggingface.co/janai-hq/trinity-v1),
+we decided to remake the model without the models mentioned above.
+Additionally, their CC-BY-NC-4.0 license is restrictive and thus not suitable for an open model.
+
+
 # Model Description
 This is an experiment to test merging 14 models using DARE TIES 🦙
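For context on the merge method referenced in the README, below is a minimal, illustrative sketch of a DARE-TIES merge in Python. The base model name, the model list, the drop rate, and the uniform per-model weighting are assumptions made for illustration only; the README does not state the actual merge configuration.

```python
# Minimal, illustrative DARE-TIES merge of several fine-tunes of a shared base model.
# All model names, the drop rate, and the uniform weighting below are placeholders,
# NOT the configuration used for this model.
import torch
from transformers import AutoModelForCausalLM

BASE = "mistralai/Mistral-7B-v0.1"              # assumed common base model (placeholder)
MODELS = ["org/finetune-a", "org/finetune-b"]   # placeholders for the merged fine-tunes
DROP_RATE = 0.9                                 # DARE: probability of dropping each delta entry

def load_state_dict(name):
    return AutoModelForCausalLM.from_pretrained(name, torch_dtype=torch.float32).state_dict()

base_sd = load_state_dict(BASE)

# DARE: compute each model's delta from the base, randomly drop entries,
# and rescale the survivors by 1 / (1 - p) so the expected delta is unchanged.
deltas = []
for name in MODELS:
    sd = load_state_dict(name)
    delta = {}
    for key, base_w in base_sd.items():
        d = sd[key] - base_w
        keep = (torch.rand_like(d) >= DROP_RATE).to(d.dtype)
        delta[key] = d * keep / (1.0 - DROP_RATE)
    deltas.append(delta)

# TIES: per parameter, elect the dominant sign across models and average
# only the deltas that agree with it, then add the result onto the base weights.
merged = {}
for key, base_w in base_sd.items():
    stacked = torch.stack([d[key] for d in deltas])   # [num_models, *param_shape]
    elected = torch.sign(stacked.sum(dim=0))          # elected sign per parameter
    agree = (torch.sign(stacked) == elected).to(stacked.dtype)
    summed = (stacked * agree).sum(dim=0)
    count = agree.sum(dim=0).clamp(min=1.0)
    merged[key] = base_w + summed / count

# `merged` can then be loaded into a model instantiated from the base config.
```

In practice a merge like this is usually configured declaratively (e.g. with mergekit's `dare_ties` merge method) rather than hand-rolled; the sketch only shows what DARE (drop-and-rescale of parameter deltas) and TIES (sign election before averaging) each contribute.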