Column schema:

| Column | dtype | Range / classes |
|---|---|---|
| eval_name | string | length 9–97 |
| Precision | string | 5 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 3 classes |
| Architecture | string | 53 classes |
| Model | string | length 355–611 |
| fullname | string | length 4–89 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 27–81.3 |
| Hub License | string | 35 classes |
| Hub ❤️ | int64 | 0–4.88k |
| #Params (B) | int64 | 0–238 |
| Available on the hub | bool | 2 classes |
| Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| date | string | length 0–26 |
| Chat Template | bool | 2 classes |
| ARC | float64 | 19.7–87.5 |
| HellaSwag | float64 | 20.7–92.8 |
| MMLU | float64 | 17.8–89.4 |
| TruthfulQA | float64 | 27.9–82.3 |
| Winogrande | float64 | 47.2–91.5 |
| GSM8K | float64 | 0–88.2 |
| Maintainers Choice | bool | 2 classes |
Data rows:

| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Merged | MoE | Flagged | date | Chat Template | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Maintainers Choice |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| sail_Sailor-1.8B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [sail/Sailor-1.8B-Chat](https://huggingface.co/sail/Sailor-1.8B-Chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-1.8B-Chat) | sail/Sailor-1.8B-Chat | 2a3bbb343ffba05985f26f66e2d3ee8e695a2e94 | 38.762 | apache-2.0 | 3 | 1 | true | true | true | true | 2024-03-11T05:50:26Z | false | 35.750853 | 57.120096 | 38.310721 | 38.711002 | 59.116022 | 3.563306 | false |
| sail_Sailor-4B_float16 | float16 | 🟢 pretrained | 🟢 | Original | Qwen2ForCausalLM | [sail/Sailor-4B](https://huggingface.co/sail/Sailor-4B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-4B) | sail/Sailor-4B | bc4d4e338bf7e64e52dd05c69bc7e893a21d9dad | 44.192669 | apache-2.0 | 6 | 3 | true | true | true | true | 2024-03-02T21:20:15Z | false | 44.453925 | 69.527982 | 38.994512 | 37.02023 | 66.061563 | 9.097801 | false |
| sail_Sailor-4B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [sail/Sailor-4B](https://huggingface.co/sail/Sailor-4B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-4B) | sail/Sailor-4B | bc4d4e338bf7e64e52dd05c69bc7e893a21d9dad | 43.715538 | apache-2.0 | 6 | 3 | true | true | true | true | 2024-03-03T05:19:59Z | false | 43.856655 | 69.508066 | 37.449372 | 37.017661 | 65.66693 | 8.794541 | false |
| sail_Sailor-4B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [sail/Sailor-4B-Chat](https://huggingface.co/sail/Sailor-4B-Chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-4B-Chat) | sail/Sailor-4B-Chat | 462e04484d1b1dd9c4dffe4f3d2d313e01a7abda | 45.795954 | apache-2.0 | 1 | 3 | true | true | true | true | 2024-03-11T05:49:17Z | false | 45.051195 | 68.362876 | 43.957957 | 42.08648 | 66.219416 | 9.097801 | false |
| sail_Sailor-7B_float16 | float16 | 🟢 pretrained | 🟢 | Original | Qwen2ForCausalLM | [sail/Sailor-7B](https://huggingface.co/sail/Sailor-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-7B) | sail/Sailor-7B | f8a0533c4818d021a7dbf985b9779d0a640bae6b | 53.816504 | apache-2.0 | 27 | 7 | true | true | true | true | 2024-03-02T21:19:58Z | false | 49.829352 | 76.209918 | 54.840281 | 40.118356 | 69.37648 | 32.52464 | false |
| sail_Sailor-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [sail/Sailor-7B](https://huggingface.co/sail/Sailor-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-7B) | sail/Sailor-7B | f8a0533c4818d021a7dbf985b9779d0a640bae6b | 53.879128 | apache-2.0 | 27 | 7 | true | true | true | true | 2024-03-03T05:18:28Z | false | 49.829352 | 76.209918 | 54.653296 | 40.083896 | 69.1397 | 33.358605 | false |
| sail_Sailor-7B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [sail/Sailor-7B-Chat](https://huggingface.co/sail/Sailor-7B-Chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-7B-Chat) | sail/Sailor-7B-Chat | c7bd0a5e9ec309952f4b8187399314d618da8496 | 54.807878 | apache-2.0 | 5 | 7 | true | true | true | true | 2024-03-11T05:47:09Z | false | 52.303754 | 75.014937 | 56.238743 | 44.090855 | 70.797159 | 30.40182 | false |
| saishf_Fett-Eris-Mix-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [saishf/Fett-Eris-Mix-7B](https://huggingface.co/saishf/Fett-Eris-Mix-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Fett-Eris-Mix-7B) | saishf/Fett-Eris-Mix-7B | 287e1bc2ca35ba1978cfe1040d9183d530b23c0c | 71.65924 | cc-by-nc-4.0 | 2 | 7 | true | false | true | true | 2024-03-06T13:49:48Z | false | 68.771331 | 87.333201 | 63.650898 | 71.911396 | 80.820837 | 57.467779 | false |
| saishf_Fett-uccine-11B-Experiment_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [saishf/Fett-uccine-11B-Experiment](https://huggingface.co/saishf/Fett-uccine-11B-Experiment) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Fett-uccine-11B-Experiment) | saishf/Fett-uccine-11B-Experiment | b0673c461432527942cf2e82ffdca34360098712 | 63.087826 | agpl-3.0 | 0 | 10 | true | false | true | true | 2024-02-28T15:11:42Z | false | 63.139932 | 85.391356 | 59.715967 | 69.916961 | 74.585635 | 25.777104 | false |
| saishf_Fimbulvetr-Kuro-Lotus-10.7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saishf/Fimbulvetr-Kuro-Lotus-10.7B](https://huggingface.co/saishf/Fimbulvetr-Kuro-Lotus-10.7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Fimbulvetr-Kuro-Lotus-10.7B) | saishf/Fimbulvetr-Kuro-Lotus-10.7B | b41d174c2041e8661086e4eb939480641a5c66dc | 72.726661 | cc-by-nc-4.0 | 16 | 10 | true | false | false | true | 2024-02-13T05:33:28Z | false | 69.539249 | 87.870942 | 66.994477 | 60.950702 | 84.135754 | 66.86884 | false |
| saishf_Kuno-Lake-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [saishf/Kuno-Lake-7B](https://huggingface.co/saishf/Kuno-Lake-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Kuno-Lake-7B) | saishf/Kuno-Lake-7B | ee6af302f1aa7b49a89f79ae2ae15e3a357099f0 | 73.564243 | cc-by-nc-4.0 | 2 | 7 | true | false | true | true | 2024-02-13T09:17:03Z | false | 71.843003 | 88.149771 | 64.758266 | 66.830417 | 84.45146 | 65.35254 | false |
| saishf_Kuro-Lotus-10.7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saishf/Kuro-Lotus-10.7B](https://huggingface.co/saishf/Kuro-Lotus-10.7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Kuro-Lotus-10.7B) | saishf/Kuro-Lotus-10.7B | ec748dade16858ef2fb3c712c78de748d165a21c | 71.90494 | cc-by-nc-4.0 | 4 | 10 | true | false | true | true | 2024-02-13T09:14:12Z | false | 68.686007 | 87.512448 | 66.640258 | 58.265558 | 84.21468 | 66.11069 | false |
| saishf_Llama4Some-SOVL-4x8B-L3-V1_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [saishf/Llama4Some-SOVL-4x8B-L3-V1](https://huggingface.co/saishf/Llama4Some-SOVL-4x8B-L3-V1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Llama4Some-SOVL-4x8B-L3-V1) | saishf/Llama4Some-SOVL-4x8B-L3-V1 | 965db04bc06f157c121316ee883d0d09c9c9eab9 | 66.757536 | cc-by-nc-4.0 | 2 | 24 | true | false | false | true | 2024-05-12T14:43:12Z | false | 61.945392 | 79.376618 | 65.485549 | 51.481469 | 75.690608 | 66.56558 | false |
| saishf_Merge-Mayhem-L3-V2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saishf/Merge-Mayhem-L3-V2](https://huggingface.co/saishf/Merge-Mayhem-L3-V2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Merge-Mayhem-L3-V2) | saishf/Merge-Mayhem-L3-V2 | d40f39f4201f5c11d9a91311029fff84d6909265 | 55.838124 | cc-by-nc-4.0 | 3 | 8 | true | false | true | true | 2024-05-07T15:54:06Z | false | 61.68942 | 80.033858 | 66.825295 | 51.503014 | 74.901342 | 0.075815 | false |
| saishf_Merge-Mayhem-L3-V2.1_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saishf/Merge-Mayhem-L3-V2.1](https://huggingface.co/saishf/Merge-Mayhem-L3-V2.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Merge-Mayhem-L3-V2.1) | saishf/Merge-Mayhem-L3-V2.1 | ac3d3a0b3b4911530ccae3941cb14252a17083c6 | 67.274306 | cc-by-nc-4.0 | 0 | 8 | true | false | true | true | 2024-05-09T09:08:48Z | false | 62.286689 | 79.824736 | 67.577194 | 52.910959 | 75.769534 | 65.276725 | false |
| saishf_Multi-Verse-RP-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [saishf/Multi-Verse-RP-7B](https://huggingface.co/saishf/Multi-Verse-RP-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Multi-Verse-RP-7B) | saishf/Multi-Verse-RP-7B | ca05b22adfc6ef9a9af7d2a07d617ac8684b1b9a | 74.73297 | cc-by-nc-4.0 | 3 | 7 | true | false | true | true | 2024-03-13T10:17:59Z | false | 72.354949 | 88.368851 | 63.935929 | 73.18839 | 84.135754 | 66.41395 | false |
| saishf_Neural-SOVLish-Devil-8B-L3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saishf/Neural-SOVLish-Devil-8B-L3](https://huggingface.co/saishf/Neural-SOVLish-Devil-8B-L3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Neural-SOVLish-Devil-8B-L3) | saishf/Neural-SOVLish-Devil-8B-L3 | 6c53c200a066c7c36516d3c16765db2ed76439bb | 72.223811 | cc-by-nc-4.0 | 9 | 8 | true | false | true | true | 2024-05-28T13:02:22Z | false | 69.112628 | 84.773949 | 69.023869 | 59.051561 | 78.295185 | 73.085671 | false |
| saishf_Ortho-SOVL-8B-L3_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saishf/Ortho-SOVL-8B-L3](https://huggingface.co/saishf/Ortho-SOVL-8B-L3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Ortho-SOVL-8B-L3) | saishf/Ortho-SOVL-8B-L3 | a11b95f57c1f33e6e26c573c08017d43aa64c425 | 65.675895 | cc-by-nc-4.0 | 2 | 8 | true | false | true | true | 2024-05-12T14:43:28Z | false | 60.153584 | 77.853017 | 64.708157 | 50.034656 | 74.664562 | 66.641395 | false |
| saishf_SOVL-Mega-Mash-L3-8B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saishf/SOVL-Mega-Mash-L3-8B](https://huggingface.co/saishf/SOVL-Mega-Mash-L3-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__SOVL-Mega-Mash-L3-8B) | saishf/SOVL-Mega-Mash-L3-8B | bf2d42047f6fc31bd9406aebf0f4267f5b02c2bf | 67.432834 | cc-by-nc-4.0 | 1 | 8 | true | false | true | true | 2024-05-12T14:43:55Z | false | 62.030717 | 79.675363 | 67.643051 | 51.835792 | 76.164167 | 67.247915 | false |
| saishf_SOVL-Mega-Mash-V2-L3-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saishf/SOVL-Mega-Mash-V2-L3-8B](https://huggingface.co/saishf/SOVL-Mega-Mash-V2-L3-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__SOVL-Mega-Mash-V2-L3-8B) | saishf/SOVL-Mega-Mash-V2-L3-8B | 1c97dfa1c4a7cdce65fa38dc98031fdc621d6784 | 67.990679 | cc-by-nc-4.0 | 1 | 8 | true | false | true | true | 2024-06-03T09:44:31Z | false | 63.225256 | 80.352519 | 68.274605 | 53.21343 | 76.085241 | 66.793025 | false |
| saishf_SOVLish-Devil-8B-L3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saishf/SOVLish-Devil-8B-L3](https://huggingface.co/saishf/SOVLish-Devil-8B-L3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__SOVLish-Devil-8B-L3) | saishf/SOVLish-Devil-8B-L3 | 39d7a970963f2acf8e28a7758674e9c54b4940b0 | 71.862345 | cc-by-nc-4.0 | 7 | 8 | true | false | true | true | 2024-05-28T13:01:41Z | false | 69.197952 | 84.435371 | 68.972049 | 57.952217 | 78.137332 | 72.479151 | false |
| saishf_SOVLish-Maid-L3-8B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saishf/SOVLish-Maid-L3-8B](https://huggingface.co/saishf/SOVLish-Maid-L3-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__SOVLish-Maid-L3-8B) | saishf/SOVLish-Maid-L3-8B | 16f2c0201677f974fbf6b4b097c44a59433cdc96 | 66.243467 | cc-by-nc-4.0 | 8 | 8 | true | false | true | true | 2024-05-09T13:40:39Z | false | 61.348123 | 79.097789 | 67.23946 | 49.878844 | 75.453828 | 64.44276 | false |
| saishf_Top-Western-Maid-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [saishf/Top-Western-Maid-7B](https://huggingface.co/saishf/Top-Western-Maid-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Top-Western-Maid-7B) | saishf/Top-Western-Maid-7B | 2973b0902468b765a9d6452ae3ba116a3e1ceba0 | 71.570101 | cc-by-nc-4.0 | 0 | 7 | true | false | true | true | 2024-02-13T09:17:25Z | false | 69.368601 | 87.402908 | 64.629632 | 58.792846 | 83.267561 | 65.95906 | false |
| saishf_West-Hermes-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [saishf/West-Hermes-7B](https://huggingface.co/saishf/West-Hermes-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__West-Hermes-7B) | saishf/West-Hermes-7B | 9cd172b853949228761dfa65dfec57746475d703 | 73.597802 | apache-2.0 | 4 | 7 | true | false | true | true | 2024-02-09T01:00:18Z | false | 71.672355 | 87.602071 | 64.830611 | 64.256763 | 84.68824 | 68.53677 | false |
| saishf_West-Maid-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [saishf/West-Maid-7B](https://huggingface.co/saishf/West-Maid-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__West-Maid-7B) | saishf/West-Maid-7B | a271497bda998eed0acd3e68165133e7f3d196a1 | 69.092874 | cc-by-nc-4.0 | 0 | 7 | true | false | true | true | 2024-02-13T09:16:32Z | false | 67.235495 | 86.436965 | 64.845746 | 51.004022 | 82.715075 | 62.319939 | false |
| saltlux_luxia-21.4b-alignment-v0.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saltlux/luxia-21.4b-alignment-v0.1](https://huggingface.co/saltlux/luxia-21.4b-alignment-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v0.1) | saltlux/luxia-21.4b-alignment-v0.1 | 88a47c498102132f5262581803fe1ed9252a16bc | 77.320748 | | 0 | 21 | false | true | true | true | 2024-03-11T17:09:10Z | false | 76.791809 | 91.794463 | 68.180118 | 76.702039 | 87.529597 | 62.926459 | false |
| saltlux_luxia-21.4b-alignment-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saltlux/luxia-21.4b-alignment-v0.2](https://huggingface.co/saltlux/luxia-21.4b-alignment-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v0.2) | saltlux/luxia-21.4b-alignment-v0.2 | 59243de958296a4516f72ebfb1b597188dd59229 | 77.511965 | | 0 | 21 | false | true | true | true | 2024-03-11T17:08:55Z | false | 76.706485 | 91.605258 | 68.270921 | 79.795482 | 87.056038 | 61.637604 | false |
| saltlux_luxia-21.4b-alignment-v0.3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saltlux/luxia-21.4b-alignment-v0.3](https://huggingface.co/saltlux/luxia-21.4b-alignment-v0.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v0.3) | saltlux/luxia-21.4b-alignment-v0.3 | 89d77a1219490fc423615f3ca28c1888bb4845a5 | 75.914239 | | 0 | 21 | false | true | true | true | 2024-03-11T17:09:41Z | false | 76.279863 | 91.525593 | 68.098218 | 69.435189 | 87.371744 | 62.774829 | false |
| saltlux_luxia-21.4b-alignment-v0.4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saltlux/luxia-21.4b-alignment-v0.4](https://huggingface.co/saltlux/luxia-21.4b-alignment-v0.4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v0.4) | saltlux/luxia-21.4b-alignment-v0.4 | 4c4342a9c3e8e793a0969b74222d887d53cb294e | 77.233404 | | 0 | 21 | false | true | true | true | 2024-03-11T17:10:06Z | false | 76.877133 | 91.834296 | 68.05694 | 76.719153 | 87.213891 | 62.699014 | false |
| saltlux_luxia-21.4b-alignment-v1.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saltlux/luxia-21.4b-alignment-v1.0](https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.0) | saltlux/luxia-21.4b-alignment-v1.0 | 910c73192c30fb51dc94f69777b2ec7cc3a4465b | 77.736585 | apache-2.0 | 32 | 21 | true | true | true | true | 2024-03-09T08:58:53Z | false | 77.730375 | 91.824338 | 68.047082 | 79.201843 | 87.371744 | 62.244124 | false |
| saltlux_luxia-21.4b-alignment-v1.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saltlux/luxia-21.4b-alignment-v1.0](https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.0) | saltlux/luxia-21.4b-alignment-v1.0 | ba3403eaafc6d1f6e3a73245314ee96025c08d96 | 77.744307 | apache-2.0 | 32 | 21 | true | true | true | true | 2024-03-11T03:09:26Z | false | 77.474403 | 91.884087 | 68.0953 | 79.165625 | 87.450671 | 62.395754 | false |
| saltlux_luxia-21.4b-alignment-v1.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saltlux/luxia-21.4b-alignment-v1.1](https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.1) | saltlux/luxia-21.4b-alignment-v1.1 | d1dda8b111024dc06eb3a7072100e74d5039a782 | 75.028244 | | 0 | 21 | false | true | true | true | 2024-03-17T12:37:53Z | false | 78.242321 | 89.68333 | 68.078268 | 80.88413 | 86.503552 | 46.777862 | false |
| saltlux_luxia-21.4b-alignment-v1.1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saltlux/luxia-21.4b-alignment-v1.1](https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.1) | saltlux/luxia-21.4b-alignment-v1.1 | d1dda8b111024dc06eb3a7072100e74d5039a782 | 74.95703 | | 0 | 21 | false | true | true | true | 2024-03-17T12:37:14Z | false | 78.242321 | 89.693288 | 68.216023 | 80.909431 | 86.661405 | 46.019712 | false |
| saltlux_luxia-21.4b-alignment-v1.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [saltlux/luxia-21.4b-alignment-v1.2](https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.2) | saltlux/luxia-21.4b-alignment-v1.2 | e318e0a864db847b4020cbc8d23035dae08522ab | 78.135781 | apache-2.0 | 4 | 21 | true | true | false | true | 2024-05-27T14:08:06Z | false | 77.730375 | 90.858395 | 67.85938 | 79.155112 | 86.266772 | 66.944655 | false |
| sambanovasystems_SambaLingo-Thai-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [sambanovasystems/SambaLingo-Thai-Chat](https://huggingface.co/sambanovasystems/SambaLingo-Thai-Chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sambanovasystems__SambaLingo-Thai-Chat) | sambanovasystems/SambaLingo-Thai-Chat | fbe817bea4967720268af0e5793000b109147bde | 49.454219 | llama2 | 35 | 6 | true | true | true | true | 2024-03-03T19:39:53Z | false | 52.730375 | 78.420633 | 43.953758 | 40.835614 | 72.217837 | 8.567096 | false |
| samir-fama_FernandoGPT-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [samir-fama/FernandoGPT-v1](https://huggingface.co/samir-fama/FernandoGPT-v1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_samir-fama__FernandoGPT-v1) | samir-fama/FernandoGPT-v1 | a26fbae35874a6aafb02e39fd8a623022b9e2a95 | 72.869283 | apache-2.0 | 2 | 7 | true | false | true | true | 2023-12-31T09:50:37Z | false | 69.453925 | 86.944832 | 65.186254 | 61.181027 | 81.136543 | 73.313116 | false |
| samir-fama_SamirGPT-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [samir-fama/SamirGPT-v1](https://huggingface.co/samir-fama/SamirGPT-v1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_samir-fama__SamirGPT-v1) | samir-fama/SamirGPT-v1 | 8e8abca2d9703dff2d60de78b013360a9a3f4d5e | 73.109078 | apache-2.0 | 2 | 7 | true | false | true | true | 2023-12-31T07:47:41Z | false | 69.539249 | 87.044413 | 65.295108 | 63.365668 | 81.689029 | 71.721001 | false |
| sarahlintang_mistral-indo-7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [sarahlintang/mistral-indo-7b](https://huggingface.co/sarahlintang/mistral-indo-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sarahlintang__mistral-indo-7b) | sarahlintang/mistral-indo-7b | eb5051623b2057c2af3d69247a649d4e8ec5b111 | 59.675403 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-02-03T20:48:42Z | false | 61.09215 | 81.189006 | 62.991421 | 42.335977 | 78.374112 | 32.06975 | false |
| sartifyllc_dociproLLM-7B_float16 | float16 | 🟢 pretrained | 🟢 | Original | FalconForCausalLM | [sartifyllc/dociproLLM-7B](https://huggingface.co/sartifyllc/dociproLLM-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sartifyllc__dociproLLM-7B) | sartifyllc/dociproLLM-7B | 5ab56efa2c334e9d6ed5a986ebffce8c4bf83bd1 | 44.195168 | | 0 | 6 | false | true | true | true | 2024-05-07T05:45:07Z | false | 47.866894 | 78.11193 | 27.77573 | 34.258192 | 72.533544 | 4.624716 | false |
| sartmis1_starcoder-finetune-selfinstruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Unknown | [sartmis1/starcoder-finetune-selfinstruct](https://huggingface.co/sartmis1/starcoder-finetune-selfinstruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sartmis1__starcoder-finetune-selfinstruct) | sartmis1/starcoder-finetune-selfinstruct | b21bd307ea7417185e7dc59557c399a3e4e0092b | 35.64574 | | 0 | 0 | false | true | true | true | 2023-10-16T12:48:18Z | false | 31.228669 | 47.659829 | 29.518811 | 41.627662 | 57.77427 | 6.065201 | false |
| sarvamai_OpenHathi-7B-Hi-v0.1-Base_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [sarvamai/OpenHathi-7B-Hi-v0.1-Base](https://huggingface.co/sarvamai/OpenHathi-7B-Hi-v0.1-Base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_sarvamai__OpenHathi-7B-Hi-v0.1-Base) | sarvamai/OpenHathi-7B-Hi-v0.1-Base | 2cbb156ab4426113115bc3387b06d1940015119a | 46.642358 | llama2 | 93 | 6 | true | true | true | true | 2023-12-15T14:44:49Z | false | 49.488055 | 74.337781 | 41.381804 | 37.462221 | 71.270718 | 5.913571 | false |
| saucam_Arithmo-Wizard-2-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [saucam/Arithmo-Wizard-2-7B](https://huggingface.co/saucam/Arithmo-Wizard-2-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saucam__Arithmo-Wizard-2-7B) | saucam/Arithmo-Wizard-2-7B | f3dfc103652959db096440a32be1b3a6d7d5a13f | 65.441075 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-04-16T18:43:38Z | false | 62.201365 | 83.170683 | 61.640613 | 46.904704 | 78.531965 | 60.197119 | false |
| saucam_Athena-8B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saucam/Athena-8B](https://huggingface.co/saucam/Athena-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saucam__Athena-8B) | saucam/Athena-8B | 612bce9dc5da1bc32c813dcd4d6fe104e1921ca6 | 57.641794 | apache-2.0 | 0 | 8 | true | false | true | true | 2024-05-05T04:26:11Z | false | 62.116041 | 83.290181 | 65.123332 | 57.515139 | 77.426993 | 0.379075 | false |
| saucam_Nereus-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [saucam/Nereus-7B](https://huggingface.co/saucam/Nereus-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saucam__Nereus-7B) | saucam/Nereus-7B | a4442b4bb9029466ae52ea6ba79dc6492678ff21 | 63.753128 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-04-04T13:36:41Z | false | 62.713311 | 83.310098 | 61.185818 | 55.29007 | 77.03236 | 42.987111 | false |
| saucam_Proteus-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [saucam/Proteus-8B](https://huggingface.co/saucam/Proteus-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saucam__Proteus-8B) | saucam/Proteus-8B | 3f17b02d69b86c9d612e373b1346cedff3d9a699 | 69.548331 | apache-2.0 | 0 | 8 | true | false | true | true | 2024-05-18T08:01:50Z | false | 61.860068 | 82.852022 | 65.917441 | 56.529976 | 77.348066 | 72.782411 | false |
| saucam_Skyro-4X8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [saucam/Skyro-4X8B](https://huggingface.co/saucam/Skyro-4X8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_saucam__Skyro-4X8B) | | | | | | | | | | | | | | | | | | | |
saucam/Skyro-4X8B
406bfaffea19098fae15a489731009f4f4a5c384
66.388048
apache-2.0
1
24
true
false
false
true
2024-04-26T19:26:08Z
false
61.262799
82.383987
66.666682
50.153931
77.663773
60.197119
false
saucam_aqua-smaug-0.3-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/saucam/aqua-smaug-0.3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saucam/aqua-smaug-0.3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_saucam__aqua-smaug-0.3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saucam/aqua-smaug-0.3-8B
03d31e37a387212be72867d7c90ddb387a257c91
69.066072
apache-2.0
0
8
true
false
true
true
2024-04-22T11:51:59Z
false
63.139932
82.891854
67.277025
53.84434
77.190213
70.053071
false
saucam_aqua-smaug-hermes-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/saucam/aqua-smaug-hermes-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saucam/aqua-smaug-hermes-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_saucam__aqua-smaug-hermes-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saucam/aqua-smaug-hermes-8B
63b8e25d55ae4ffe960122e6f06576814fd91086
57.10685
apache-2.0
1
8
true
false
true
true
2024-05-09T16:42:34Z
false
62.030717
82.31428
66.184482
55.559042
76.400947
0.15163
false
saucam_mistral-orpo-beta-NeuralBeagle14-7B-dare-ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/saucam/mistral-orpo-beta-NeuralBeagle14-7B-dare-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saucam/mistral-orpo-beta-NeuralBeagle14-7B-dare-ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_saucam__mistral-orpo-beta-NeuralBeagle14-7B-dare-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saucam/mistral-orpo-beta-NeuralBeagle14-7B-dare-ties
3fb5752c0b99378f10e5a9ad1ccdd236a4214479
69.299355
apache-2.0
0
7
true
false
true
true
2024-03-17T04:28:32Z
false
66.723549
85.978889
64.630821
53.866049
81.21547
63.38135
false
sauce1337_AppleSauce-L2-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sauce1337/AppleSauce-L2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sauce1337/AppleSauce-L2-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sauce1337__AppleSauce-L2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sauce1337/AppleSauce-L2-13b
ba253c52eb85e24987c81e5d36b5a9a00e276ce7
55.905101
cc-by-nc-4.0
1
13
true
true
true
true
2023-10-16T12:46:18Z
false
61.006826
83.608843
57.065734
47.814235
75.927388
10.007582
false
sauce1337_BerrySauce-L2-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sauce1337/BerrySauce-L2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sauce1337/BerrySauce-L2-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sauce1337__BerrySauce-L2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sauce1337/BerrySauce-L2-13b
c8788874b78c84bc5593586d16fbd8ae7b5b2991
56.550862
cc-by-nc-4.0
0
13
true
true
true
true
2023-11-06T10:31:15Z
false
62.286689
83.778132
57.103634
48.300147
76.085241
11.751327
false
saurav1199_adisesha-phi1.5-7-3-10000_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiMultipleHeadsForCasualLM
<a target="_blank" href="https://huggingface.co/saurav1199/adisesha-phi1.5-7-3-10000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saurav1199/adisesha-phi1.5-7-3-10000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_saurav1199__adisesha-phi1.5-7-3-10000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saurav1199/adisesha-phi1.5-7-3-10000
9df4ebf72ced772c6e163123192bf6f4a2d302d2
38.019791
bigscience-openrail-m
0
1
false
true
true
true
2024-04-20T01:11:44Z
false
38.90785
50.497909
32.741532
41.172721
64.798737
0
false
saurav1199_adisesha-phi1.5-7-3-15000_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiMultipleHeadsForCasualLM
<a target="_blank" href="https://huggingface.co/saurav1199/adisesha-phi1.5-7-3-15000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saurav1199/adisesha-phi1.5-7-3-15000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_saurav1199__adisesha-phi1.5-7-3-15000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saurav1199/adisesha-phi1.5-7-3-15000
670bd2a0cbd81b1d03b330a5035642fa18de9847
38.375273
bigscience-openrail-m
0
1
false
true
true
true
2024-04-20T01:12:24Z
false
40.017065
51.971719
35.265753
39.145482
63.851618
0
false
saurav1199_adisesha-phi1.5-7-3-20000_float16
float16
🟩 continuously pretrained
🟩
Original
PhiMultipleHeadsForCasualLM
<a target="_blank" href="https://huggingface.co/saurav1199/adisesha-phi1.5-7-3-20000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saurav1199/adisesha-phi1.5-7-3-20000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_saurav1199__adisesha-phi1.5-7-3-20000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saurav1199/adisesha-phi1.5-7-3-20000
193b6b8f38245d8c1acedbcd9bb342b220e86f0a
37.36726
bigscience-openrail-m
0
1
false
true
true
true
2024-04-19T16:58:30Z
false
37.713311
50.069707
35.15555
37.808005
63.456985
0
false
saurav1199_adisesha-phi1.5-7-3-25000_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiMultipleHeadsForCasualLM
<a target="_blank" href="https://huggingface.co/saurav1199/adisesha-phi1.5-7-3-25000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saurav1199/adisesha-phi1.5-7-3-25000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_saurav1199__adisesha-phi1.5-7-3-25000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saurav1199/adisesha-phi1.5-7-3-25000
e53969ca4f9e6ee87908972eec6067a5f54f207f
37.059977
bigscience-openrail-m
0
1
false
true
true
true
2024-04-22T01:06:55Z
false
36.860068
50.019916
34.074193
38.42226
62.983425
0
false
saurav1199_adisesha-phi1.5-7-3-5000_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiMultipleHeadsForCasualLM
<a target="_blank" href="https://huggingface.co/saurav1199/adisesha-phi1.5-7-3-5000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saurav1199/adisesha-phi1.5-7-3-5000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_saurav1199__adisesha-phi1.5-7-3-5000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
saurav1199/adisesha-phi1.5-7-3-5000
26416b8e20a452805bf388bce4626f1aede3c776
38.316444
bigscience-openrail-m
0
1
false
true
true
true
2024-04-20T01:17:59Z
false
39.675768
50.906194
34.425858
39.855325
65.035517
0
false
sbawa_elysa_model_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sbawa/elysa_model" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sbawa/elysa_model</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sbawa__elysa_model" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sbawa/elysa_model
f57eba56111fcea5f1438d31d05bc84ccb4fc51c
36.997191
0
1
false
true
true
true
2024-03-29T18:31:50Z
false
37.542662
60.366461
25.57781
37.36544
60.220994
0.90978
false
scaledown_ScaleDown-7B-slerp-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/scaledown/ScaleDown-7B-slerp-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">scaledown/ScaleDown-7B-slerp-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_scaledown__ScaleDown-7B-slerp-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
scaledown/ScaleDown-7B-slerp-v0.1
9bddd33f58ddbbaa9ecf8c5a4b79dfd8e49155e5
71.568954
apache-2.0
0
7
true
false
true
true
2024-01-01T08:40:59Z
false
68.003413
85.70006
65.261695
61.903134
81.373323
67.1721
false
scb10x_llama-3-typhoon-v1.5-8b_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/scb10x/llama-3-typhoon-v1.5-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">scb10x/llama-3-typhoon-v1.5-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__llama-3-typhoon-v1.5-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
scb10x/llama-3-typhoon-v1.5-8b
021a9ec74183e45cdc0f56aac1c1e398a98a8d01
63.122586
llama3
2
8
true
true
true
true
2024-05-12T09:02:05Z
false
56.399317
80.661223
65.650504
44.749016
76.006314
55.269143
false
scb10x_llama-3-typhoon-v1.5-8b_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/scb10x/llama-3-typhoon-v1.5-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">scb10x/llama-3-typhoon-v1.5-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__llama-3-typhoon-v1.5-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
scb10x/llama-3-typhoon-v1.5-8b
021a9ec74183e45cdc0f56aac1c1e398a98a8d01
63.09991
llama3
2
8
true
true
true
true
2024-05-29T08:53:59Z
false
56.399317
80.581557
65.447613
44.677404
75.769534
55.724033
false
scb10x_llama-3-typhoon-v1.5-8b-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/scb10x/llama-3-typhoon-v1.5-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">scb10x/llama-3-typhoon-v1.5-8b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__llama-3-typhoon-v1.5-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
scb10x/llama-3-typhoon-v1.5-8b-instruct
8d6210bf56b8f43d6ff282f1f580323b15fcc56e
65.62238
llama3
16
8
true
true
true
true
2024-05-26T04:25:26Z
false
60.409556
80.790679
64.457399
53.248352
77.663773
57.164519
false
scb10x_typhoon-7b_float16
float16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/scb10x/typhoon-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">scb10x/typhoon-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__typhoon-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
scb10x/typhoon-7b
35fb2f9cee5dbac35109effc816ca206962dad43
58.053128
apache-2.0
93
7
true
true
true
true
2023-12-23T08:15:07Z
false
58.532423
81.5475
59.543199
40.521983
76.5588
31.61486
false
scb10x_typhoon-7b-instruct-01-30-2024_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/scb10x/typhoon-7b-instruct-01-30-2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">scb10x/typhoon-7b-instruct-01-30-2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__typhoon-7b-instruct-01-30-2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
scb10x/typhoon-7b-instruct-01-30-2024
9903e1fa761c31f6fb27f048483b3b1cc04e090c
66.214435
apache-2.0
0
7
true
true
true
true
2024-05-30T01:32:48Z
false
61.860068
81.298546
60.715952
52.600589
77.505919
63.305534
false
scb10x_typhoon-7b-instruct-02-19-2024_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/scb10x/typhoon-7b-instruct-02-19-2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">scb10x/typhoon-7b-instruct-02-19-2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__typhoon-7b-instruct-02-19-2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
scb10x/typhoon-7b-instruct-02-19-2024
a478b0a508ca9c33fa5f97dab18775473041720a
65.385723
apache-2.0
0
7
true
true
true
true
2024-05-30T01:32:21Z
false
61.945392
81.507668
61.856507
49.940504
78.610892
58.453374
false
sci-m-wang_deepseek-llm-7b-chat-sa-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/sci-m-wang/deepseek-llm-7b-chat-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sci-m-wang/deepseek-llm-7b-chat-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sci-m-wang__deepseek-llm-7b-chat-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sci-m-wang/deepseek-llm-7b-chat-sa-v0.1
740432a2ace4d15c2a2b834970714b2928b56dea
60.04979
other
0
7
true
true
true
true
2024-05-31T02:20:41Z
false
55.546075
79.36666
52.443814
46.860627
76.953433
49.128127
false
sci-m-wang_gemma-7b-it-sa-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/sci-m-wang/gemma-7b-it-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sci-m-wang/gemma-7b-it-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sci-m-wang__gemma-7b-it-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sci-m-wang/gemma-7b-it-sa-v0.1
50633283ea7ff25425b02f57bd4d0dc9988d1326
53.078884
0
7
false
true
true
true
2024-05-31T01:59:54Z
false
50.767918
72.296355
54.98395
47.750475
68.034728
24.639879
false
seb-c_Psydestroyer-20B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/seb-c/Psydestroyer-20B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">seb-c/Psydestroyer-20B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_seb-c__Psydestroyer-20B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
seb-c/Psydestroyer-20B
6a8e7636f7546c0aae531e2c3b76a0653ea6858d
55.039248
llama2
0
19
true
false
true
true
2024-03-05T03:20:01Z
false
60.324232
85.172276
55.559308
54.833929
74.269929
0.075815
false
selfrag_selfrag_llama2_7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/selfrag/selfrag_llama2_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">selfrag/selfrag_llama2_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_selfrag__selfrag_llama2_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
selfrag/selfrag_llama2_7b
190261383b0779ff66d2f95a73c7ad267d94b820
51.304545
mit
69
7
true
true
true
true
2024-03-04T01:51:17Z
false
51.450512
78.480382
52.004773
41.73347
73.164957
10.993177
false
senseable_33x-coder_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/senseable/33x-coder" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">senseable/33x-coder</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__33x-coder" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
senseable/33x-coder
352e5249cd84f34ea9265b4218ddfdd1e9b73cc6
49.664508
apache-2.0
2
33
true
true
true
true
2024-01-07T20:04:41Z
false
45.904437
62.636925
42.022026
45.604281
63.456985
38.362396
false
senseable_Trillama-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/senseable/Trillama-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">senseable/Trillama-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__Trillama-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
senseable/Trillama-8B
0d7741ed9fa05ea0471c2c3d8845bf3ccd5a7f86
66.547842
llama2
3
8
true
true
true
true
2024-04-18T21:47:43Z
false
60.580205
78.689504
66.351908
51.285623
74.980268
67.399545
false
senseable_WestLake-7B-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/senseable/WestLake-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">senseable/WestLake-7B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__WestLake-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
senseable/WestLake-7B-v2
6df7bb2069432bcab0971ab105284a66b3ec1ce0
74.677116
apache-2.0
102
7
true
true
true
true
2024-01-22T07:36:11Z
false
73.037543
88.64768
64.711347
67.062024
86.977111
67.62699
false
senseable_Westlake-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/senseable/Westlake-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">senseable/Westlake-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__Westlake-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
senseable/Westlake-7B
645fa936256811f53f0c33f1e5298f6ad1095dce
74.484373
apache-2.0
6
7
true
true
true
true
2024-01-21T09:54:52Z
false
73.208191
88.488349
64.644851
67.362752
86.029992
67.1721
false
senseable_Wilbur-30B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/senseable/Wilbur-30B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">senseable/Wilbur-30B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__Wilbur-30B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
senseable/Wilbur-30B
eab679f95e078efb71fbaa7b1aa0be05bb4e46ca
77.178613
0
34
false
true
true
true
2024-01-27T04:03:58Z
false
74.061433
86.675961
76.695571
69.961591
83.425414
72.251706
false
senseable_garten2-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/senseable/garten2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">senseable/garten2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__garten2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
senseable/garten2-7b
96e7c78544d7eca96e3ae60ff80c728f3109e8ba
72.650594
apache-2.0
2
7
true
true
true
true
2024-01-11T05:24:10Z
false
69.368601
87.542322
65.435574
59.498093
84.68824
69.370735
false
senseable_moe-x33_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/senseable/moe-x33" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">senseable/moe-x33</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__moe-x33" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
senseable/moe-x33
2ce4ba7ce76392721be10c3c05b63853be98b686
29.949642
apache-2.0
0
58
true
true
false
true
2024-01-15T17:30:24Z
false
26.194539
26.438956
24.934582
51.14319
50.986582
0
false
sequelbox_DiamondForce_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sequelbox/DiamondForce" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/DiamondForce</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__DiamondForce" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sequelbox/DiamondForce
e92bbb8e6373408235e30cebcf4a71cc319b0ae3
59.627161
0
13
false
true
true
true
2024-01-12T06:31:12Z
false
62.116041
83.429596
58.095893
46.457836
79.005525
28.658074
false
sequelbox_Llama2-13B-DaringFortitude_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sequelbox/Llama2-13B-DaringFortitude" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama2-13B-DaringFortitude</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__Llama2-13B-DaringFortitude" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sequelbox/Llama2-13B-DaringFortitude
49878338360a7884407ee8a81bb6ddced6f3120a
60.040464
llama2
13
13
true
true
true
true
2024-05-15T16:07:44Z
false
63.481229
83.559052
59.839186
55.964725
76.322021
21.076573
false
sequelbox_SpellBlade_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sequelbox/SpellBlade" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/SpellBlade</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__SpellBlade" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sequelbox/SpellBlade
258211a0cceaa08f7c8df3660ff8cd7cb6bee5e8
68.53519
0
68
false
true
true
true
2023-12-30T17:44:52Z
false
69.283276
87.313284
70.497435
47.09985
83.188635
53.828658
false
sethuiyer_Aika-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Aika-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Aika-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Aika-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Aika-7B
00589aa6b5081b35c38103071c3901d191d5ecf2
59.249964
cc
0
7
true
false
true
true
2024-02-16T16:15:24Z
false
65.358362
81.487751
53.913996
51.21987
77.742699
25.777104
false
sethuiyer_Chikuma_10.7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Chikuma_10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Chikuma_10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Chikuma_10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Chikuma_10.7B
3c99ba83d1b6cdee68696fc8443dbd4c71cf9cfe
68.1671
apache-2.0
4
10
true
false
true
true
2024-01-11T05:45:33Z
false
65.699659
84.305915
64.808588
57.011017
79.558011
57.619409
false
sethuiyer_CodeCalc-Mistral-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/CodeCalc-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/CodeCalc-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__CodeCalc-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/CodeCalc-Mistral-7B
e03e7b8e6ea737f565848caaf3467b75b646c878
66.329793
apache-2.0
0
7
true
false
true
true
2024-02-19T13:25:26Z
false
61.945392
83.638717
62.781357
47.785124
78.295185
63.53298
false
sethuiyer_Diana-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Diana-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Diana-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Diana-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Diana-7B
09f1c9e78c1e73a00278ce864470c4ffb35f626d
70.604445
cc
0
7
true
false
true
true
2024-02-18T07:27:37Z
false
68.34471
86.725752
64.583714
60.553351
80.189424
63.229719
false
sethuiyer_Dr_Samantha-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Dr_Samantha-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Dr_Samantha-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Dr_Samantha-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Dr_Samantha-7b
b1a643e32e467d8dd722186d6c36d16ea4281003
52.945885
llama2
22
6
true
false
true
true
2024-01-04T11:37:18Z
false
53.83959
77.952599
47.937069
45.584336
73.55959
18.802123
false
sethuiyer_Dr_Samantha_7b_mistral_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Dr_Samantha_7b_mistral" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Dr_Samantha_7b_mistral</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Dr_Samantha_7b_mistral" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Dr_Samantha_7b_mistral
e0201aa9423f082a4182cbf910d75ba438528ddb
59.247538
apache-2.0
4
7
true
false
true
true
2024-01-06T10:51:45Z
false
60.409556
83.648676
63.138174
41.371761
75.453828
31.46323
false
sethuiyer_Eida_10.7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Eida_10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Eida_10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Eida_10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Eida_10.7B
9cc692ef0d0821ef113ad175141632d2efad4b33
70.54353
0
10
false
true
true
true
2024-02-04T12:59:58Z
false
70.904437
87.363075
64.304607
71.33106
81.21547
48.142532
false
sethuiyer_Herculoid-2.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Herculoid-2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Herculoid-2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Herculoid-2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Herculoid-2.0
fd39739fa6569e7020bba9cb49c2920bbdcb7aba
64.076721
0
7
false
true
true
true
2024-02-09T14:37:05Z
false
62.883959
83.927504
64.034987
49.609599
80.031571
43.972707
false
sethuiyer_Medichat-Llama3-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Medichat-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Medichat-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Medichat-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Medichat-Llama3-8B
5215ae6cef808022af0047f700ca3d902dfe7e78
66.025157
other
10
8
true
false
true
true
2024-04-22T11:10:37Z
false
59.129693
82.901812
65.192063
49.652029
78.926598
60.348749
false
sethuiyer_MedleyMD_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/MedleyMD" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/MedleyMD</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__MedleyMD" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/MedleyMD
ce34d7174f0522f91723bc47419d60fbaec659cd
69.891776
cc-by-nc-nd-4.0
0
12
true
false
false
true
2024-01-15T09:04:41Z
false
66.467577
86.058554
65.101503
52.463012
80.26835
68.99166
false
sethuiyer_Nandine-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/Nandine-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Nandine-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Nandine-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/Nandine-7b
6fe9ea49efd6024e45e352c63815efdb7d0fe35d
71.469029
apache-2.0
3
7
true
false
true
true
2024-01-25T12:34:45Z
false
69.283276
87.014539
64.827469
62.104503
83.188635
62.395754
false
sethuiyer_OpenDolphinHermes_Llama2_7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/OpenDolphinHermes_Llama2_7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/OpenDolphinHermes_Llama2_7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__OpenDolphinHermes_Llama2_7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/OpenDolphinHermes_Llama2_7B
3b6713b4ab2e2ea79535802f126287dd9d7036ba
54.24234
llama2
1
6
true
false
true
true
2024-01-28T13:16:26Z
false
55.03413
78.739295
52.249708
46.09916
73.164957
20.166793
false
sethuiyer_SynthIQ-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/SynthIQ-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/SynthIQ-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__SynthIQ-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/SynthIQ-7b
32612e89aa87a23f6b1c5c5a9165896e599ca9ca
69.365407
llama2
1
7
true
false
true
true
2023-12-29T11:13:43Z
false
65.870307
85.819558
64.748715
57.00036
78.689818
64.063685
false
sethuiyer_distilabled_Chikuma_10.7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sethuiyer/distilabled_Chikuma_10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/distilabled_Chikuma_10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__distilabled_Chikuma_10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sethuiyer/distilabled_Chikuma_10.7B
a5a6ba84916b025cdce898d17387e4b4bc31104f
68.867814
0
10
false
true
true
true
2024-01-13T03:16:24Z
false
66.382253
85.142402
64.704669
59.199847
79.400158
58.377559
false
seungduk_KoSOLAR-10.7B-v0.1_float16
float16
🤝 base merges and moerges
🤝
Original
Unknown
<a target="_blank" href="https://huggingface.co/seungduk/KoSOLAR-10.7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">seungduk/KoSOLAR-10.7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_seungduk__KoSOLAR-10.7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
seungduk/KoSOLAR-10.7B-v0.1
a4ddde9b0d06f340ff9c29777b4bfd883700c6cd
66.039672
0
10
false
true
true
true
2023-12-29T18:16:27Z
false
62.030717
84.544911
65.556617
45.02593
83.583268
55.496588
false
seyf1elislam_KuTrix-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/seyf1elislam/KuTrix-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">seyf1elislam/KuTrix-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__KuTrix-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
seyf1elislam/KuTrix-7b
37995fab81810aacdf8fa7db73c41c4673dd4794
74.4212
cc-by-nc-4.0
2
7
true
false
true
true
2024-03-16T05:27:27Z
false
70.477816
87.940649
65.282805
70.847053
81.925809
70.053071
false
seyf1elislam_WestKunai-Hermes-10.7b-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/seyf1elislam/WestKunai-Hermes-10.7b-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">seyf1elislam/WestKunai-Hermes-10.7b-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-10.7b-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
seyf1elislam/WestKunai-Hermes-10.7b-test
76887e42e7d48d55de29561b1306e1fe0d308466
69.745357
cc-by-nc-4.0
0
10
true
false
true
true
2024-03-20T06:18:31Z
false
68.088737
87.104163
64.42621
64.280487
82.715075
51.857468
false
seyf1elislam_WestKunai-Hermes-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/seyf1elislam/WestKunai-Hermes-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">seyf1elislam/WestKunai-Hermes-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
seyf1elislam/WestKunai-Hermes-7b
5f348a5ad4c996e22f0fcbdbb2a5326ffc069cc5
73.507203
cc-by-nc-4.0
3
7
true
false
true
true
2024-03-16T06:10:39Z
false
71.16041
87.761402
64.771324
65.251829
83.030781
69.067475
false
seyf1elislam_WestKunai-X-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/seyf1elislam/WestKunai-X-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">seyf1elislam/WestKunai-X-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-X-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
seyf1elislam/WestKunai-X-7b
ca07b7bea2f28538d4112c989b1e4402c96c17ef
74.176175
0
7
false
true
true
true
2024-03-07T15:12:32Z
false
71.075085
87.860984
65.41636
68.006065
82.872928
69.825625
false
seyf1elislam_WestKunai-XD-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/seyf1elislam/WestKunai-XD-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">seyf1elislam/WestKunai-XD-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-XD-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
seyf1elislam/WestKunai-XD-7b
824e0c22a5f06a17d38251fa36be1d9ee7888d66
73.271372
cc-by-nc-4.0
0
7
true
false
true
true
2024-03-16T02:27:10Z
false
71.245734
87.592113
64.689696
67.293591
82.241515
66.56558
false
sfairXC_FsfairX-Zephyr-Chat-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/sfairXC/FsfairX-Zephyr-Chat-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sfairXC/FsfairX-Zephyr-Chat-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_sfairXC__FsfairX-Zephyr-Chat-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sfairXC/FsfairX-Zephyr-Chat-v0.1
e585ddb6076d907e84ea832e94b210703248ff8d
61.203125
cc-by-sa-4.0
7
7
true
true
true
true
2024-04-27T21:09:23Z
false
63.31058
84.415455
61.205222
53.563985
77.505919
27.217589
false
shadowml_BeagSake-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/shadowml/BeagSake-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">shadowml/BeagSake-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__BeagSake-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
shadowml/BeagSake-7B
e1ae2c1e9bea8b54f6b8bff41a4f50895625a6ed
75.382122
cc-by-nc-4.0
1
7
true
false
true
true
2024-02-01T13:48:16Z
false
72.440273
88.388767
65.233058
72.271232
82.162589
71.796816
false