Files changed (1)
  1. README.md +58 -0
README.md CHANGED
@@ -1,3 +1,61 @@
  ---
+ base_model:
+ - flemmingmiguel/MBX-7B-v3
+ - paulml/NeuralOmniWestBeaglake-7B
+ - FelixChao/Faraday-7B
+ - paulml/NeuralOmniBeagleMBX-v3-7B
+ tags:
+ - mergekit
+ - merge
  license: apache-2.0
+ language:
+ - en
  ---
+
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/eDLmpTkM4vuk8HiQcUzWv.png)
+
+ # To see what will happen.
+
+ [Join our Discord!](https://discord.gg/aEGuFph9)
+
+ [GGUF FILES HERE](https://huggingface.co/Kquant03/Samlagast-7B-GGUF)
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ### Merge Method
+
+ This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, with [paulml/NeuralOmniBeagleMBX-v3-7B](https://huggingface.co/paulml/NeuralOmniBeagleMBX-v3-7B) as the base model.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [flemmingmiguel/MBX-7B-v3](https://huggingface.co/flemmingmiguel/MBX-7B-v3)
+ * [paulml/NeuralOmniWestBeaglake-7B](https://huggingface.co/paulml/NeuralOmniWestBeaglake-7B)
+ * [FelixChao/Faraday-7B](https://huggingface.co/FelixChao/Faraday-7B)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ models:
+   - model: paulml/NeuralOmniWestBeaglake-7B
+     parameters:
+       weight: 1
+   - model: FelixChao/Faraday-7B
+     parameters:
+       weight: 1
+   - model: flemmingmiguel/MBX-7B-v3
+     parameters:
+       weight: 1
+   - model: paulml/NeuralOmniBeagleMBX-v3-7B
+     parameters:
+       weight: 1
+ merge_method: task_arithmetic
+ base_model: paulml/NeuralOmniBeagleMBX-v3-7B
+ parameters:
+   normalize: true
+   int8_mask: true
+ dtype: float16
+ ```
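
For readers unfamiliar with the method referenced in the diff above: task arithmetic forms each contributing model's "task vector" as its parameter delta from the base checkpoint, then adds a weighted sum of those deltas back onto the base. The snippet below is a simplified NumPy illustration of that formula only, not mergekit's implementation; the assumption that `normalize: true` rescales the weights to sum to 1 is ours.

```python
import numpy as np

# Toy stand-ins for one flattened parameter tensor from each checkpoint.
base = np.array([0.10, -0.20, 0.30])  # paulml/NeuralOmniBeagleMBX-v3-7B (base)
finetuned = {
    "NeuralOmniWestBeaglake-7B": np.array([0.12, -0.25, 0.28]),
    "Faraday-7B":                np.array([0.08, -0.18, 0.35]),
    "MBX-7B-v3":                 np.array([0.11, -0.22, 0.31]),
    "NeuralOmniBeagleMBX-v3-7B": np.array([0.10, -0.20, 0.30]),
}
weights = {name: 1.0 for name in finetuned}  # weight: 1 for every model in the config

# Assumed effect of `normalize: true`: rescale the weights so they sum to 1.
total = sum(weights.values())
weights = {name: w / total for name, w in weights.items()}

# Task arithmetic: theta_merged = theta_base + sum_i w_i * (theta_i - theta_base)
merged = base + sum(w * (finetuned[name] - base) for name, w in weights.items())
print(merged)
```

Under that same assumption, listing the base model itself with `weight: 1` contributes a zero task vector, so its main effect is to scale the other models' deltas down through normalization.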
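To reproduce the merge, one would save the configuration above to a YAML file and run it through mergekit. The sketch below follows the usage shown in the mergekit README (`MergeConfiguration`, `run_merge`, `MergeOptions`); the file name `samlagast.yml` and the output directory are placeholders, and the exact Python API may differ between mergekit versions, so treat this as a starting point rather than the card author's procedure.

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "samlagast.yml"   # the YAML configuration from the card, saved to disk
OUTPUT_DIR = "./Samlagast-7B"  # placeholder output directory

# Parse the YAML into mergekit's configuration object.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge, writing the merged weights and tokenizer to OUTPUT_DIR.
run_merge(
    merge_config,
    out_path=OUTPUT_DIR,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
    ),
)
```

The CLI equivalent would be roughly `mergekit-yaml samlagast.yml ./Samlagast-7B --cuda`.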
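For completeness, a minimal inference sketch with Hugging Face transformers. The repository id `Kquant03/Samlagast-7B` is an assumption inferred from the GGUF link above and may not be exact; `float16` matches the `dtype` set in the merge configuration.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Kquant03/Samlagast-7B"  # assumed repo id, inferred from the GGUF link

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches `dtype: float16` in the merge config
    device_map="auto",
)

prompt = "Explain what a model merge is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```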