---
license: mit
language:
- ro
---
## Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
[RoWiki](https://github.com/dumitrescustefan/wiki-ro) is a cleaned dump of the June 2020 Romanian Wikipedia. It is intended as a reference corpus on which to measure language model capacity and/or perplexity.

This dataset is used as a benchmark and is part of the evaluation protocol for Romanian LLMs proposed in *"Vorbeşti Româneşte?" A Recipe to Train Powerful Romanian LLMs with English Instructions* ([Masala et al., 2024](https://arxiv.org/abs/2406.18266)).
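Below is a minimal sketch of how a perplexity evaluation over this corpus might look, using the Hugging Face `datasets` and `transformers` libraries. The dataset ID, split name, `text` column, model name, and sample size are illustrative assumptions, not part of the official evaluation protocol.

```python
# Sketch: token-weighted perplexity of a causal LM over a sample of RoWiki.
# Dataset ID, split, column name, and model are placeholders (assumptions).
import math
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

dataset = load_dataset("dumitrescustefan/wiki-ro", split="test")  # hypothetical ID/split
tokenizer = AutoTokenizer.from_pretrained("gpt2")                 # placeholder model
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

total_nll, total_tokens = 0.0, 0
for example in dataset.select(range(100)):  # small sample, for illustration only
    enc = tokenizer(example["text"], return_tensors="pt",
                    truncation=True, max_length=1024)
    with torch.no_grad():
        out = model(**enc, labels=enc["input_ids"])
    # out.loss is the mean NLL over the predicted (shifted) tokens,
    # so weight it by the number of predicted positions.
    n_predicted = enc["input_ids"].size(1) - 1
    total_nll += out.loss.item() * n_predicted
    total_tokens += n_predicted

print(f"Perplexity: {math.exp(total_nll / total_tokens):.2f}")
```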
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```bibtex
@article{masala2024vorbecstiromanecsterecipetrain,
  title={"Vorbe\c{s}ti Rom\^ane\c{s}te?" A Recipe to Train Powerful Romanian LLMs with English Instructions},
  author={Mihai Masala and Denis C. Ilie-Ablachim and Alexandru Dima and Dragos Corlatescu and Miruna Zavelca and Ovio Olaru and Simina Terian and Andrei Terian and Marius Leordeanu and Horia Velicu and Marius Popescu and Mihai Dascalu and Traian Rebedea},
  year={2024},
  eprint={2406.18266},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```