Update README.md
README.md (CHANGED)
@@ -49,17 +49,15 @@ AraT5 Pytorch and TensorFlow checkpoints are available on the Huggingface websit
 
 If you use our models (Arat5-base, Arat5-msa-base, Arat5-tweet-base, Arat5-msa-small, or Arat5-tweet-small) for your scientific publication, or if you find the resources in this repository useful, please cite our paper as follows (to be updated):
 ```bibtex
-@inproceedings{
-
-
-
-
-
-
-
-
-publisher = "Association for Computational Linguistics",
-}
+@inproceedings{nagoudi2022_arat5,
+  title = {AraT5: Text-to-Text Transformers for Arabic Language Generation},
+  author = {Nagoudi, El Moatez Billah and Elmadany, AbdelRahim and Abdul-Mageed, Muhammad},
+  booktitle = {Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics},
+  month = {May},
+  address = {Online},
+  year = {2022},
+  publisher = {Association for Computational Linguistics}
+}
 ```
 
 ## Acknowledgments
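The hunk context above notes that AraT5 PyTorch and TensorFlow checkpoints are available on the Huggingface website. As a minimal sketch that is not part of this commit, the snippet below loads one checkpoint with the `transformers` library; the Hub id `UBC-NLP/AraT5-base` and the use of `AutoModelForSeq2SeqLM` are assumptions here, so substitute the id of whichever AraT5 variant you need.

```python
# Minimal sketch, not part of this commit: loading an AraT5 checkpoint from the
# Huggingface Hub with the transformers library. The Hub id below is an
# assumption; swap in the AraT5 variant you actually use (base / msa / tweet).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "UBC-NLP/AraT5-base"  # assumed Hub id, not stated in this diff

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Quick smoke test: tokenize a short Arabic prompt and run generation.
# The base checkpoint is pretrained but not task-fine-tuned, so the output is
# only meaningful after fine-tuning; this just confirms the checkpoint runs.
inputs = tokenizer("القدس مدينة تاريخية", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```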