Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown below.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    TypeError
Message:      Couldn't cast array of type
struct<MMLU: struct<metric_name: double>, Truthful_qa: struct<metric_name: double>, ARC: struct<metric_name: double>, HellaSwag: struct<metric_name: double>, GSM8K: struct<metric_name: double>, GSM1K: struct<metric_name: double>, Winogrande: struct<metric_name: double>>
to
{'MMLU': {'metric_name': Value(dtype='float64', id=None)}, 'Truthful_qa': {'metric_name': Value(dtype='float64', id=None)}, 'ARC': {'metric_name': Value(dtype='float64', id=None)}, 'HellaSwag': {'metric_name': Value(dtype='float64', id=None)}, 'GSM8K': {'metric_name': Value(dtype='float64', id=None)}, 'Winogrande': {'metric_name': Value(dtype='float64', id=None)}}
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2013, in _prepare_split_single
                  writer.write_table(table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 585, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2302, in table_cast
                  return cast_table_to_schema(table, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2261, in cast_table_to_schema
                  arrays = [cast_array_to_feature(table[name], feature) for name, feature in features.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2261, in <listcomp>
                  arrays = [cast_array_to_feature(table[name], feature) for name, feature in features.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1802, in wrapper
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1802, in <listcomp>
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2122, in cast_array_to_feature
                  raise TypeError(f"Couldn't cast array of type\n{_short_str(array.type)}\nto\n{_short_str(feature)}")
              TypeError: Couldn't cast array of type
              struct<MMLU: struct<metric_name: double>, Truthful_qa: struct<metric_name: double>, ARC: struct<metric_name: double>, HellaSwag: struct<metric_name: double>, GSM8K: struct<metric_name: double>, GSM1K: struct<metric_name: double>, Winogrande: struct<metric_name: double>>
              to
              {'MMLU': {'metric_name': Value(dtype='float64', id=None)}, 'Truthful_qa': {'metric_name': Value(dtype='float64', id=None)}, 'ARC': {'metric_name': Value(dtype='float64', id=None)}, 'HellaSwag': {'metric_name': Value(dtype='float64', id=None)}, 'GSM8K': {'metric_name': Value(dtype='float64', id=None)}, 'Winogrande': {'metric_name': Value(dtype='float64', id=None)}}
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1396, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1045, in convert_to_parquet
                  builder.download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1029, in download_and_prepare
                  self._download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1124, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1884, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2040, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset

Need help to make the dataset viewer work? Make sure to review how to configure the dataset viewer, and open a discussion for direct support.
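In this case the failure is a schema mismatch across rows: some results dicts carry a GSM1K entry (for example CohereForAI/aya-23-35B below) while others omit it, so the Arrow schema inferred from the first rows has no GSM1K field, and the rows that do include it cannot be cast. A minimal sketch of one way to repair the data before uploading, assuming the rows live in a JSON Lines file (the file names here are hypothetical), is to pad every results dict to the full benchmark key set with a null metric so a single consistent schema can be inferred:

import json

SRC = "results.jsonl"             # hypothetical input file
DST = "results_normalized.jsonl"  # hypothetical output file

# Union of the benchmark keys observed across the preview rows.
ALL_BENCHMARKS = [
    "MMLU", "Truthful_qa", "ARC", "HellaSwag",
    "GSM8K", "GSM1K", "Winogrande",
]

with open(SRC) as fin, open(DST, "w") as fout:
    for line in fin:
        row = json.loads(line)
        # Fill any missing benchmark with a null metric so every row
        # carries an identical struct and Arrow can infer one schema.
        for bench in ALL_BENCHMARKS:
            row["results"].setdefault(bench, {"metric_name": None})
        fout.write(json.dumps(row) + "\n")

Declaring explicit features that include GSM1K for every row (for example via datasets.Features, or the dataset_info features in the README YAML) may also let the loader fill the missing field with nulls instead of failing.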

Preview rows — columns: config (dict) and results (dict). Each dataset row appears below as a config line followed by its results line.
{ "model_dtype": "torch.float16", "model_name": "CerebrumTech/cere-llama-3-8b-tr" }
{ "MMLU": { "metric_name": 0.46180581231975154 }, "Truthful_qa": { "metric_name": 0.48134093096145175 }, "ARC": { "metric_name": 0.4308873720136519 }, "HellaSwag": { "metric_name": 0.47871739866772045 }, "GSM8K": { "metric_name": 0.0554290053151101 }, "Winogrande": { "metric_name": 0.5624012638230648 } }
{ "model_dtype": "torch.float16", "model_name": "CohereForAI/aya-23-35B" }
{ "MMLU": { "metric_name": 0.5536493381646085 }, "Truthful_qa": { "metric_name": 0.44159595450971734 }, "ARC": { "metric_name": 0.5162116040955631 }, "HellaSwag": { "metric_name": 0.6028000451620188 }, "GSM8K": { "metric_name": 0.49810174639331817 }, "GSM1K": { "metric_name": 0.49810174639331817 }, "Winogrande": { "metric_name": 0.6129541864139021 } }
{ "model_dtype": "torch.float16", "model_name": "CohereForAI/aya-23-8B" }
{ "MMLU": { "metric_name": 0.4660208533609406 }, "Truthful_qa": { "metric_name": 0.44053150342459574 }, "ARC": { "metric_name": 0.4138225255972696 }, "HellaSwag": { "metric_name": 0.5224116518008355 }, "GSM8K": { "metric_name": 0.3082763857251329 }, "GSM1K": { "metric_name": 0.3082763857251329 }, "Winogrande": { "metric_name": 0.5695102685624013 } }
{ "model_dtype": "torch.float16", "model_name": "Commencis/Commencis-LLM" }
{ "MMLU": { "metric_name": 0.3582 }, "Truthful_qa": { "metric_name": 0.4888 }, "ARC": { "metric_name": 0.3361 }, "HellaSwag": { "metric_name": 0.3319 }, "GSM8K": { "metric_name": 0.0022 }, "Winogrande": { "metric_name": 0.5165 } }
{ "model_dtype": "torch.float16", "model_name": "Eurdem/Defne-llama3.1-8B" }
{ "MMLU": { "metric_name": 0.5295031055900621 }, "Truthful_qa": { "metric_name": 0.5115308205648597 }, "ARC": { "metric_name": 0.46928327645051193 }, "HellaSwag": { "metric_name": 0.5162018742237778 }, "GSM8K": { "metric_name": 0.6013667425968109 }, "GSM1K": { "metric_name": 0.6013667425968109 }, "Winogrande": { "metric_name": 0.5710900473933649 } }
{ "model_dtype": "torch.float16", "model_name": "Eurdem/Defne_llama3_2x8B" }
{ "MMLU": { "metric_name": 0.5048 }, "Truthful_qa": { "metric_name": 0.5349 }, "ARC": { "metric_name": 0.4718 }, "HellaSwag": { "metric_name": 0.4806 }, "GSM8K": { "metric_name": 0.5747 }, "Winogrande": { "metric_name": 0.5703 } }
{ "model_dtype": "torch.float16", "model_name": "Eurdem/Megatron_llama3_2x8B" }
{ "MMLU": { "metric_name": 0.5011461953708497 }, "Truthful_qa": { "metric_name": 0.526653718203988 }, "ARC": { "metric_name": 0.4718430034129693 }, "HellaSwag": { "metric_name": 0.4773625381054533 }, "GSM8K": { "metric_name": 0.5679574791192104 }, "Winogrande": { "metric_name": 0.5766192733017378 } }
{ "model_dtype": "torch.float16", "model_name": "Eurdem/Pinokio_v1.0" }
{ "MMLU": { "metric_name": 0.41292612585964655 }, "Truthful_qa": { "metric_name": 0.4887693221380527 }, "ARC": { "metric_name": 0.38054607508532423 }, "HellaSwag": { "metric_name": 0.4242971660833239 }, "GSM8K": { "metric_name": 0.040242976461655276 }, "Winogrande": { "metric_name": 0.5268562401263823 } }
{ "model_dtype": "torch.float16", "model_name": "KOCDIGITAL/Kocdigital-LLM-8b-v0.1" }
{ "MMLU": { "metric_name": 0.47348 }, "Truthful_qa": { "metric_name": 0.477 }, "ARC": { "metric_name": 0.4488 }, "HellaSwag": { "metric_name": 0.4861 }, "GSM8K": { "metric_name": 0.4365 }, "Winogrande": { "metric_name": 0.556 } }
{ "model_dtype": "torch.float16", "model_name": "Metin/LLaMA-3-8B-Instruct-TR-DPO" }
{ "MMLU": { "metric_name": 0.4971 }, "Truthful_qa": { "metric_name": 0.5235 }, "ARC": { "metric_name": 0.4453 }, "HellaSwag": { "metric_name": 0.4592 }, "GSM8K": { "metric_name": 0.5322 }, "Winogrande": { "metric_name": 0.5545 } }
{ "model_dtype": "torch.float16", "model_name": "Morfoz-Aigap/Morfoz-LLM-8b-v1.0" }
{ "MMLU": { "metric_name": 0.4372 }, "Truthful_qa": { "metric_name": 0.495 }, "ARC": { "metric_name": 0.3865 }, "HellaSwag": { "metric_name": 0.4813 }, "GSM8K": { "metric_name": 0.31283 }, "Winogrande": { "metric_name": 0.5797 } }
{ "model_dtype": "torch.float16", "model_name": "NousResearch/Hermes-2-Pro-Mistral-7B" }
{ "MMLU": { "metric_name": 0.39162907638837535 }, "Truthful_qa": { "metric_name": 0.4723526410552315 }, "ARC": { "metric_name": 0.32081911262798635 }, "HellaSwag": { "metric_name": 0.3638929660155809 }, "GSM8K": { "metric_name": 0.2999240698557327 }, "Winogrande": { "metric_name": 0.5323854660347551 } }
{ "model_dtype": "torch.float16", "model_name": "NousResearch/Meta-Llama-3-8B" }
{ "MMLU": { "metric_name": 0.4928640094653553 }, "Truthful_qa": { "metric_name": 0.47354505859506446 }, "ARC": { "metric_name": 0.4402730375426621 }, "HellaSwag": { "metric_name": 0.4879756125098792 }, "GSM8K": { "metric_name": 0.31738800303720577 }, "Winogrande": { "metric_name": 0.5560821484992101 } }
{ "model_dtype": "torch.float16", "model_name": "NovusResearch/Novus-7b-tr_v1" }
{ "MMLU": { "metric_name": 0.4308 }, "Truthful_qa": { "metric_name": 0.4885 }, "ARC": { "metric_name": 0.3669 }, "HellaSwag": { "metric_name": 0.3974 }, "GSM8K": { "metric_name": 0.2969 }, "Winogrande": { "metric_name": 0.5355 } }
{ "model_dtype": "torch.float16", "model_name": "NovusResearch/Thestral-0.1-tr-chat-7B" }
{ "MMLU": { "metric_name": 0.22790800857797827 }, "Truthful_qa": { "metric_name": null }, "ARC": { "metric_name": 0.22696245733788395 }, "HellaSwag": { "metric_name": 0.25166534944112 }, "GSM8K": { "metric_name": 0 }, "Winogrande": { "metric_name": 0.4960505529225908 } }
{ "model_dtype": "torch.float16", "model_name": "Orbina/Orbita-v0.1" }
{ "MMLU": { "metric_name": 0.4951 }, "Truthful_qa": { "metric_name": 0.5078 }, "ARC": { "metric_name": 0.4197 }, "HellaSwag": { "metric_name": 0.48 }, "GSM8K": { "metric_name": 0.5041 }, "Winogrande": { "metric_name": 0.5616 } }
{ "model_dtype": "torch.float16", "model_name": "zgrgr/Meta-Llama-3.1-8B-Instruct" }
{ "MMLU": { "metric_name": 0.4966353619758929 }, "Truthful_qa": { "metric_name": 0.5030797117568518 }, "ARC": { "metric_name": 0.43686006825938567 }, "HellaSwag": { "metric_name": 0.4490233713446991 }, "GSM8K": { "metric_name": 0.55125284738041 }, "GSM1K": { "metric_name": 0.55125284738041 }, "Winogrande": { "metric_name": 0.5647709320695102 } }
{ "model_dtype": "torch.float16", "model_name": "Qwen/Qwen2-7B-Instruct" }
{ "MMLU": { "metric_name": 0.5202248021888635 }, "Truthful_qa": { "metric_name": 0.5513880529020762 }, "ARC": { "metric_name": 0.3779863481228669 }, "HellaSwag": { "metric_name": 0.42802303262955854 }, "GSM8K": { "metric_name": 0.5558086560364465 }, "Winogrande": { "metric_name": 0.5434439178515008 } }
{ "model_dtype": "torch.float16", "model_name": "TURKCELL/Turkcell-LLM-7b-v1" }
{ "MMLU": { "metric_name": 0.3903 }, "Truthful_qa": { "metric_name": 0.4162 }, "ARC": { "metric_name": 0.4343 }, "HellaSwag": { "metric_name": 0.4918 }, "GSM8K": { "metric_name": 0.2353 }, "Winogrande": { "metric_name": 0.5687 } }
{ "model_dtype": "torch.float16", "model_name": "Trendyol/Trendyol-LLM-7b-base-v1.0" }
{ "MMLU": { "metric_name": 0.37158914442061675 }, "Truthful_qa": { "metric_name": 0.41060309496645 }, "ARC": { "metric_name": 0.37372013651877134 }, "HellaSwag": { "metric_name": 0.4388619171276956 }, "GSM8K": { "metric_name": 0.03416856492027335 }, "Winogrande": { "metric_name": 0.5837282780410743 } }
{ "model_dtype": "torch.float16", "model_name": "Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0" }
{ "MMLU": { "metric_name": 0.3961 }, "Truthful_qa": { "metric_name": 0.4618 }, "ARC": { "metric_name": 0.413 }, "HellaSwag": { "metric_name": 0.4629 }, "GSM8K": { "metric_name": 0.0714 }, "Winogrande": { "metric_name": 0.5766 } }
{ "model_dtype": "torch.float16", "model_name": "Trendyol/Trendyol-LLM-7b-chat-v0.1" }
{ "MMLU": { "metric_name": 0.3449 }, "Truthful_qa": { "metric_name": 0.4219 }, "ARC": { "metric_name": 0.3404 }, "HellaSwag": { "metric_name": 0.4165 }, "GSM8K": { "metric_name": 0.0167 }, "Winogrande": { "metric_name": 0.5442 } }
{ "model_dtype": "torch.float16", "model_name": "Trendyol/Trendyol-LLM-7b-chat-v1.0" }
{ "MMLU": { "metric_name": 0.3948 }, "Truthful_qa": { "metric_name": 0.4312 }, "ARC": { "metric_name": 0.3813 }, "HellaSwag": { "metric_name": 0.4294 }, "GSM8K": { "metric_name": 0.0523 }, "Winogrande": { "metric_name": 0.5639 } }
{ "model_dtype": "torch.float16", "model_name": "Trendyol/Trendyol-LLM-7b-chat-v1.8" }
{ "MMLU": { "metric_name": 0.4191377652887673 }, "Truthful_qa": { "metric_name": 0.44459550818588606 }, "ARC": { "metric_name": 0.39505119453924914 }, "HellaSwag": { "metric_name": 0.4349102404877498 }, "GSM8K": { "metric_name": 0.30372057706909644 }, "Winogrande": { "metric_name": 0.580568720379147 } }
{ "model_dtype": "torch.float16", "model_name": "Trendyol/Trendyol-LLM-8b-chat-v2.0" }
{ "MMLU": { "metric_name": 0.48543330375628513 }, "Truthful_qa": { "metric_name": 0.4803474262879919 }, "ARC": { "metric_name": 0.45307167235494883 }, "HellaSwag": { "metric_name": 0.5525572993112792 }, "GSM8K": { "metric_name": 0.5026575550493546 }, "GSM1K": { "metric_name": 0.5026575550493546 }, "Winogrande": { "metric_name": 0.5947867298578199 } }
{ "model_dtype": "torch.float16", "model_name": "VeriUS/VeriUS-LLM-8b-v0.2" }
{ "MMLU": { "metric_name": 0.4881 }, "Truthful_qa": { "metric_name": 0.4621 }, "ARC": { "metric_name": 0.4642 }, "HellaSwag": { "metric_name": 0.4783 }, "GSM8K": { "metric_name": 0.4343 }, "Winogrande": { "metric_name": 0.556 } }
{ "model_dtype": "torch.float16", "model_name": "WiroAI/llama3.1-tr-v0.1" }
{ "MMLU": { "metric_name": 0.5240701027878429 }, "Truthful_qa": { "metric_name": 0.49451032738707124 }, "ARC": { "metric_name": 0.5008532423208191 }, "HellaSwag": { "metric_name": 0.5400248391103082 }, "GSM8K": { "metric_name": 0.554290053151101 }, "GSM1K": { "metric_name": 0.554290053151101 }, "Winogrande": { "metric_name": 0.5750394944707741 } }
{ "model_dtype": "torch.float16", "model_name": "aerdincdal/CBDDO-LLM-8B-Instruct-v1" }
{ "MMLU": { "metric_name": 0.4380684759298972 }, "Truthful_qa": { "metric_name": 0.45898189506718756 }, "ARC": { "metric_name": 0.41638225255972694 }, "HellaSwag": { "metric_name": 0.44507169470475333 }, "GSM8K": { "metric_name": 0.008352315869400152 }, "Winogrande": { "metric_name": 0.5315955766192733 } }
{ "model_dtype": "torch.float16", "model_name": "aerdincdal/CBDDO-LLM-8B-Instruct-v1" }
{ "MMLU": { "metric_name": 0.437 }, "Truthful_qa": { "metric_name": 0.4656 }, "ARC": { "metric_name": 0.4215 }, "HellaSwag": { "metric_name": 0.4526 }, "GSM8K": { "metric_name": 0.0091 }, "Winogrande": { "metric_name": 0.5379 } }
{ "model_dtype": "torch.float16", "model_name": "asafaya/kanarya-2b" }
{ "MMLU": { "metric_name": 0.24136656067440657 }, "Truthful_qa": { "metric_name": 0.43056237173575296 }, "ARC": { "metric_name": 0.2935153583617747 }, "HellaSwag": { "metric_name": 0.42610364683301344 }, "GSM8K": { "metric_name": 0.015945330296127564 }, "Winogrande": { "metric_name": 0.5 } }
{ "model_dtype": "torch.float16", "model_name": "asafaya/kanarya-2b" }
{ "MMLU": { "metric_name": 0.24136656067440657 }, "Truthful_qa": { "metric_name": 0.43056237173575296 }, "ARC": { "metric_name": 0.2935153583617747 }, "HellaSwag": { "metric_name": 0.42610364683301344 }, "GSM8K": { "metric_name": 0.015945330296127564 }, "Winogrande": { "metric_name": 0.5 } }
{ "model_dtype": "torch.float16", "model_name": "asafaya/kanarya-750m" }
{ "MMLU": { "metric_name": 0.2426236781779191 }, "Truthful_qa": { "metric_name": 0.43467431643538545 }, "ARC": { "metric_name": 0.29692832764505117 }, "HellaSwag": { "metric_name": 0.3844416845432991 }, "GSM8K": { "metric_name": 0.01442672741078208 }, "Winogrande": { "metric_name": 0.5063191153238547 }, "GSM1K": { "metric_name": 0.0058997050147492625 } }
{ "model_dtype": "torch.float16", "model_name": "berkecr/tr-dare-merge-7B" }
{ "MMLU": { "metric_name": 0.3112475042520151 }, "Truthful_qa": { "metric_name": 0.382583721812538 }, "ARC": { "metric_name": 0.28498293515358364 }, "HellaSwag": { "metric_name": 0.3378119001919386 }, "GSM8K": { "metric_name": 0.08731966590736523 }, "Winogrande": { "metric_name": 0.5165876777251185 } }
{ "model_dtype": "torch.float16", "model_name": "burak/Trendyol-Turkcell-7b-mixture" }
{ "MMLU": { "metric_name": 0.31797678030022924 }, "Truthful_qa": { "metric_name": 0.4090851410328414 }, "ARC": { "metric_name": 0.27559726962457337 }, "HellaSwag": { "metric_name": 0.30032742463588125 }, "GSM8K": { "metric_name": 0 }, "Winogrande": { "metric_name": 0.504739336492891 } }
{ "model_dtype": "torch.float16", "model_name": "curiositytech/MARS-v0.2" }
{ "MMLU": { "metric_name": 0.4613621237891001 }, "Truthful_qa": { "metric_name": 0.48665216594195637 }, "ARC": { "metric_name": 0.43856655290102387 }, "HellaSwag": { "metric_name": 0.46573331827932707 }, "GSM8K": { "metric_name": 0.5937737281700836 }, "Winogrande": { "metric_name": 0.5284360189573459 } }
{ "model_dtype": "torch.float16", "model_name": "curiositytech/MARS" }
{ "MMLU": { "metric_name": 0.46727797086445316 }, "Truthful_qa": { "metric_name": 0.48530084653664834 }, "ARC": { "metric_name": 0.4496587030716723 }, "HellaSwag": { "metric_name": 0.4577170599525799 }, "GSM8K": { "metric_name": 0.5125284738041003 }, "Winogrande": { "metric_name": 0.5394944707740916 } }
{ "model_dtype": "torch.float16", "model_name": "cypienai/cymist-2-v02-SFT" }
{ "MMLU": { "metric_name": 0.4331879020927309 }, "Truthful_qa": { "metric_name": 0.4415913411600135 }, "ARC": { "metric_name": 0.3771331058020478 }, "HellaSwag": { "metric_name": 0.44845884611042114 }, "GSM8K": { "metric_name": 0.2201974183750949 }, "Winogrande": { "metric_name": 0.5458135860979463 } }
{ "model_dtype": "torch.float16", "model_name": "cypienai/cymist-2-v03-SFT" }
{ "MMLU": { "metric_name": 0.4349626562153368 }, "Truthful_qa": { "metric_name": 0.44898602402466015 }, "ARC": { "metric_name": 0.3660409556313993 }, "HellaSwag": { "metric_name": 0.4362651010500169 }, "GSM8K": { "metric_name": 0.20273348519362186 }, "Winogrande": { "metric_name": 0.5497630331753555 } }
{ "model_dtype": "torch.float16", "model_name": "cypienai/cymist" }
{ "MMLU": { "metric_name": 0.3208 }, "Truthful_qa": { "metric_name": 0.4692 }, "ARC": { "metric_name": 0.3498 }, "HellaSwag": { "metric_name": 0.4022 }, "GSM8K": { "metric_name": 0.006 }, "Winogrande": { "metric_name": 0.5371 } }
{ "model_dtype": "torch.float16", "model_name": "cypienai/cymist2-v01-SFT" }
{ "MMLU": { "metric_name": 0.3761 }, "Truthful_qa": { "metric_name": 0.4392 }, "ARC": { "metric_name": 0.3651 }, "HellaSwag": { "metric_name": 0.45 }, "GSM8K": { "metric_name": 0.066 }, "Winogrande": { "metric_name": 0.5205 } }
{ "model_dtype": "torch.float16", "model_name": "google/gemma-2-2b-it" }
{ "MMLU": { "metric_name": 0.41721511498927755 }, "Truthful_qa": { "metric_name": 0.484041166553975 }, "ARC": { "metric_name": 0.371160409556314 }, "HellaSwag": { "metric_name": 0.398103195212826 }, "GSM8K": { "metric_name": 0.08200455580865604 }, "Winogrande": { "metric_name": 0.5244865718799369 } }
{ "model_dtype": "torch.float16", "model_name": "huggyllama/llama-7b" }
{ "MMLU": { "metric_name": 0.2588922576351401 }, "Truthful_qa": { "metric_name": 0.43169172402480593 }, "ARC": { "metric_name": 0.2508532423208191 }, "HellaSwag": { "metric_name": 0.2931015016371232 }, "GSM8K": { "metric_name": 0.016704631738800303 }, "Winogrande": { "metric_name": 0.48973143759873616 } }
{ "model_dtype": "torch.float16", "model_name": "malhajar/Mistral-7B-Instruct-v0.2-turkish" }
{ "MMLU": { "metric_name": 0.3851216446054869 }, "Truthful_qa": { "metric_name": 0.4604028855292625 }, "ARC": { "metric_name": 0.3506825938566553 }, "HellaSwag": { "metric_name": 0.38737721576154455 }, "GSM8K": { "metric_name": 0.14123006833712984 }, "Winogrande": { "metric_name": 0.5229067930489731 } }
{ "model_dtype": "torch.float16", "model_name": "malhajar/Mistral-7B-v0.2-meditron-turkish" }
{ "MMLU": { "metric_name": 0.37336389854322266 }, "Truthful_qa": { "metric_name": 0.4698152667870169 }, "ARC": { "metric_name": 0.3822525597269625 }, "HellaSwag": { "metric_name": 0.3844416845432991 }, "GSM8K": { "metric_name": 0.08959757023538345 }, "Winogrande": { "metric_name": 0.5197472353870458 } }
{ "model_dtype": "torch.float16", "model_name": "meta-llama/Meta-Llama-3-8B-Instruct" }
{ "MMLU": { "metric_name": 0.49404717888042593 }, "Truthful_qa": { "metric_name": 0.4970343050531305 }, "ARC": { "metric_name": 0.439419795221843 }, "HellaSwag": { "metric_name": 0.4456362199390313 }, "GSM8K": { "metric_name": 0.5398633257403189 }, "Winogrande": { "metric_name": 0.5560821484992101 } }
{ "model_dtype": "torch.float16", "model_name": "meta-llama/Meta-Llama-3-8B" }
{ "MMLU": { "metric_name": 0.4928640094653553 }, "Truthful_qa": { "metric_name": 0.47354505859506446 }, "ARC": { "metric_name": 0.4402730375426621 }, "HellaSwag": { "metric_name": 0.4879756125098792 }, "GSM8K": { "metric_name": 0.31738800303720577 }, "Winogrande": { "metric_name": 0.5560821484992101 } }
{ "model_dtype": "torch.float16", "model_name": "meta-llama/Meta-Llama-3.1-8B-Instruct" }
{ "MMLU": { "metric_name": 0.5232179828453121 }, "Truthful_qa": { "metric_name": 0.49226540480192577 }, "ARC": { "metric_name": 0.4445392491467577 }, "HellaSwag": { "metric_name": 0.4915885740092582 }, "GSM8K": { "metric_name": 0.5611237661351557 }, "GSM1K": { "metric_name": 0.5611237661351557 }, "Winogrande": { "metric_name": 0.5671406003159558 } }
{ "model_dtype": "torch.float16", "model_name": "microsoft/Phi-3-mini-4k-instruct" }
{ "MMLU": { "metric_name": 0.342157805220735 }, "Truthful_qa": { "metric_name": 0.45704161970077833 }, "ARC": { "metric_name": 0.2593856655290102 }, "HellaSwag": { "metric_name": 0.29987580444845885 }, "GSM8K": { "metric_name": 0.12148823082763857 }, "GSM1K": { "metric_name": 0.12148823082763857 }, "Winogrande": { "metric_name": 0.5165876777251185 } }
{ "model_dtype": "torch.float16", "model_name": "notbdq/mistral-turkish-v2" }
{ "MMLU": { "metric_name": 0.3269984470901427 }, "Truthful_qa": { "metric_name": 0.4520176493869486 }, "ARC": { "metric_name": 0.3302047781569966 }, "HellaSwag": { "metric_name": 0.37281246471717283 }, "GSM8K": { "metric_name": 0.0007593014426727411 }, "Winogrande": { "metric_name": 0.5221169036334913 } }
{ "model_dtype": "torch.float16", "model_name": "notlober/llama3-8b-tr" }
{ "MMLU": { "metric_name": 0.4597352658433779 }, "Truthful_qa": { "metric_name": 0.4523010631732335 }, "ARC": { "metric_name": 0.4112627986348123 }, "HellaSwag": { "metric_name": 0.47081404538782884 }, "GSM8K": { "metric_name": 0.38724373576309795 }, "Winogrande": { "metric_name": 0.5339652448657188 } }
{ "model_dtype": "torch.float16", "model_name": "nvidia/Llama3-ChatQA-1.5-8B" }
{ "MMLU": { "metric_name": 0.4737854026473416 }, "Truthful_qa": { "metric_name": 0.5004162765382687 }, "ARC": { "metric_name": 0.42406143344709896 }, "HellaSwag": { "metric_name": 0.48266907530766623 }, "GSM8K": { "metric_name": 0.02885345482156416 }, "Winogrande": { "metric_name": 0.5434439178515008 } }
{ "model_dtype": "torch.float16", "model_name": "sambanovasystems/SambaLingo-Turkish-Chat" }
{ "MMLU": { "metric_name": 0.3835 }, "Truthful_qa": { "metric_name": 0.4411 }, "ARC": { "metric_name": 0.4471 }, "HellaSwag": { "metric_name": 0.553 }, "GSM8K": { "metric_name": 0.0615 }, "Winogrande": { "metric_name": 0.571 } }
{ "model_dtype": "torch.float16", "model_name": "teknium/OpenHermes-2.5-Mistral-7B" }
{ "MMLU": { "metric_name": 0.40560526510389705 }, "Truthful_qa": { "metric_name": 0.46019623944667454 }, "ARC": { "metric_name": 0.3412969283276451 }, "HellaSwag": { "metric_name": 0.3915547024952015 }, "GSM8K": { "metric_name": 0.28170083523158695 }, "Winogrande": { "metric_name": 0.5213270142180095 } }
{ "model_dtype": "torch.float16", "model_name": "tolgadev/llama-2-7b-ruyallm" }
{ "MMLU": { "metric_name": 0.3403830510981291 }, "Truthful_qa": { "metric_name": 0.42003020789928697 }, "ARC": { "metric_name": 0.33361774744027306 }, "HellaSwag": { "metric_name": 0.41910353392796657 }, "GSM8K": { "metric_name": 0.01442672741078208 }, "Winogrande": { "metric_name": 0.5410742496050553 } }
{ "model_dtype": "torch.float16", "model_name": "umarigan/Hermes-7B-TR" }
{ "MMLU": { "metric_name": 0.40730607113806105 }, "Truthful_qa": { "metric_name": 0.44394252827078573 }, "ARC": { "metric_name": 0.40273037542662116 }, "HellaSwag": { "metric_name": 0.48086259455797675 }, "GSM8K": { "metric_name": 0.05466970387243736 }, "Winogrande": { "metric_name": 0.5703001579778831 } }
{ "model_dtype": "torch.float16", "model_name": "umarigan/LLama-3-8B-Instruction-tr" }
{ "MMLU": { "metric_name": 0.4736 }, "Truthful_qa": { "metric_name": 0.5027 }, "ARC": { "metric_name": 0.4274 }, "HellaSwag": { "metric_name": 0.4888 }, "GSM8K": { "metric_name": 0.2938 }, "Winogrande": { "metric_name": 0.5489 } }
{ "model_dtype": "torch.float16", "model_name": "umarigan/TURKCELL-LLM-7B-openhermes" }
{ "MMLU": { "metric_name": 0.39000221844265326 }, "Truthful_qa": { "metric_name": 0.4219778245893802 }, "ARC": { "metric_name": 0.42406143344709896 }, "HellaSwag": { "metric_name": 0.49170147905611383 }, "GSM8K": { "metric_name": 0.30447987851176916 }, "Winogrande": { "metric_name": 0.5679304897314376 } }
{ "model_dtype": "torch.float16", "model_name": "umarigan/Trendyol-LLM-7b-chat-v0.1-DPO" }
{ "MMLU": { "metric_name": 0.3378 }, "Truthful_qa": { "metric_name": 0.4128 }, "ARC": { "metric_name": 0.3447 }, "HellaSwag": { "metric_name": 0.4214 }, "GSM8K": { "metric_name": 0.0227 }, "Winogrande": { "metric_name": 0.5418 } }
{ "model_dtype": "torch.float16", "model_name": "umarigan/Trendyol-LLM-7b-chat-v1.0-RLHF" }
{ "MMLU": { "metric_name": 0.3783184204688309 }, "Truthful_qa": { "metric_name": 0.4287000745636075 }, "ARC": { "metric_name": 0.3626279863481229 }, "HellaSwag": { "metric_name": 0.4213616348650785 }, "GSM8K": { "metric_name": 0.04859529233105543 }, "Winogrande": { "metric_name": 0.566350710900474 } }
{ "model_dtype": "torch.float16", "model_name": "umarigan/llama-3-openhermes-tr" }
{ "MMLU": { "metric_name": 0.4778525475116468 }, "Truthful_qa": { "metric_name": 0.4960030507948868 }, "ARC": { "metric_name": 0.439419795221843 }, "HellaSwag": { "metric_name": 0.48458846110421133 }, "GSM8K": { "metric_name": 0.32270311313591493 }, "Winogrande": { "metric_name": 0.5560821484992101 } }
{ "model_dtype": "torch.float16", "model_name": "umarigan/llama-3.1-openhermes-tr" }
{ "MMLU": { "metric_name": 0.490571618723656 }, "Truthful_qa": { "metric_name": 0.4878379865771679 }, "ARC": { "metric_name": 0.4539249146757679 }, "HellaSwag": { "metric_name": 0.4956531556960596 }, "GSM8K": { "metric_name": 0.3470007593014427 }, "GSM1K": { "metric_name": 0.3470007593014427 }, "Winogrande": { "metric_name": 0.5521327014218009 } }
{ "model_dtype": "torch.float16", "model_name": "ytu-ce-cosmos/Turkish-Llama-8b-DPO-v0.1" }
{ "MMLU": { "metric_name": 0.5198550617466539 }, "Truthful_qa": { "metric_name": 0.5749863469255445 }, "ARC": { "metric_name": 0.5102389078498294 }, "HellaSwag": { "metric_name": 0.52896014451846 }, "GSM8K": { "metric_name": 0.5770690964312832 }, "GSM1K": { "metric_name": 0.5770690964312832 }, "Winogrande": { "metric_name": 0.5774091627172195 } }
{ "model_dtype": "torch.float16", "model_name": "ytu-ce-cosmos/Turkish-Llama-8b-Instruct-v0.1" }
{ "MMLU": { "metric_name": 0.5174887229165126 }, "Truthful_qa": { "metric_name": 0.496780989197894 }, "ARC": { "metric_name": 0.48976109215017066 }, "HellaSwag": { "metric_name": 0.5123631026306876 }, "GSM8K": { "metric_name": 0.5732725892179195 }, "Winogrande": { "metric_name": 0.5695102685624013 } }
{ "model_dtype": "torch.float16", "model_name": "ytu-ce-cosmos/Turkish-Llama-8b-v0.1" }
{ "MMLU": { "metric_name": 0.5086 }, "Truthful_qa": { "metric_name": 0.4988 }, "ARC": { "metric_name": 0.4872 }, "HellaSwag": { "metric_name": 0.5045 }, "GSM8K": { "metric_name": 0.4844 }, "Winogrande": { "metric_name": 0.5837 } }
{ "model_dtype": "torch.float16", "model_name": "ytu-ce-cosmos/turkish-gpt2-large-750m-instruct-v0.1" }
{ "MMLU": { "metric_name": 0.2639946757376322 }, "Truthful_qa": { "metric_name": 0.4628814668834825 }, "ARC": { "metric_name": 0.2738907849829352 }, "HellaSwag": { "metric_name": 0.3612961499379022 }, "GSM8K": { "metric_name": 0.0015186028853454822 }, "Winogrande": { "metric_name": 0.4928909952606635 } }
{ "model_dtype": "torch.float16", "model_name": "ytu-ce-cosmos/turkish-gpt2-large" }
{ "MMLU": { "metric_name": 0.2676920801597279 }, "Truthful_qa": { "metric_name": 0.44090866135348844 }, "ARC": { "metric_name": 0.2738907849829352 }, "HellaSwag": { "metric_name": 0.3593767641413571 }, "GSM8K": { "metric_name": 0.007593014426727411 }, "Winogrande": { "metric_name": 0.48578199052132703 } }
{ "model_dtype": "torch.float16", "model_name": "zgrgr/Meta-Llama-3.1-8B-Instruct" }
{ "MMLU": { "metric_name": 0.4966353619758929 }, "Truthful_qa": { "metric_name": 0.5030797117568518 }, "ARC": { "metric_name": 0.43686006825938567 }, "HellaSwag": { "metric_name": 0.4490233713446991 }, "GSM8K": { "metric_name": 0.55125284738041 }, "GSM1K": { "metric_name": 0.55125284738041 }, "Winogrande": { "metric_name": 0.5647709320695102 } }

No dataset card yet

Downloads last month: 2