---

# Gemma 2 Aeria 9B

This is a merge of pre-trained language models created using [Mergekit](https://github.com/cg123/mergekit).

### Merge Method

This model was merged using the DARE TIES merge method, with princeton-nlp/gemma-2-9b-it-SimPO as the base model.

The following YAML configuration was used to produce this model:

```yaml
base_model: princeton-nlp/gemma-2-9b-it-SimPO
models:
  - model: princeton-nlp/gemma-2-9b-it-SimPO
  - model: lemon07r/Gemma-2-Ataraxy-v2-9B
    parameters:
      density: 0.5
      weight: 1
merge_method: dare_ties
parameters:
  int8_mask: true
chat_template: auto
dtype: bfloat16
```
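For intuition about what `dare_ties` does with the `density: 0.5` and `weight: 1` values above: DARE randomly drops entries of each model's task vector (fine-tuned weights minus base weights) and rescales the survivors by `1 / density` before the TIES-style sign resolution and merge. Below is a minimal NumPy sketch of just that drop-and-rescale step; the `dare_sparsify` helper and the array values are illustrative, not part of mergekit's API.

```python
import numpy as np

def dare_sparsify(delta, density, rng):
    # DARE: keep each entry with probability `density`,
    # rescale survivors by 1/density so the expected value is unchanged.
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

rng = np.random.default_rng(0)

base = np.zeros(8)                                           # stand-in base weights
finetuned = np.array([0.4, -0.2, 0.1, 0.0, 0.3, -0.5, 0.2, -0.1])
delta = finetuned - base                                     # the task vector

sparse = dare_sparsify(delta, density=0.5, rng=rng)          # density: 0.5
merged = base + 1.0 * sparse                                 # weight: 1
```

Each surviving entry of `sparse` is exactly twice the corresponding entry of `delta` (the `1 / 0.5` rescale), and dropped entries are zero, so the merged model's expected update matches the original task vector.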