Update README.md
README.md CHANGED

@@ -204,11 +204,16 @@ Falcon-Mamba-7B was trained an internal distributed training codebase, Gigatron.
 
 # Citation
 
-
+You can use the following bibtex citation:
 ```
-@
-
-
-
+@misc{zuo2024falconmambacompetitiveattentionfree,
+      title={Falcon Mamba: The First Competitive Attention-free 7B Language Model},
+      author={Jingwei Zuo and Maksim Velikanov and Dhia Eddine Rhaiem and Ilyas Chahed and Younes Belkada and Guillaume Kunsch and Hakim Hacid},
+      year={2024},
+      eprint={2410.05355},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL},
+      url={https://arxiv.org/abs/2410.05355},
 }
 ```
+```
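For readers pulling this entry into a LaTeX document, a minimal usage sketch; the `references.bib` filename is an assumption, and the entry key comes from the bibtex block added above:

```latex
% Minimal sketch: cite the Falcon Mamba @misc entry from the diff above.
% Assumes the bibtex block has been saved as references.bib alongside this file.
\documentclass{article}
\begin{document}
Falcon Mamba~\cite{zuo2024falconmambacompetitiveattentionfree} is an
attention-free 7B language model.
\bibliographystyle{plain}
\bibliography{references}
\end{document}
```

Running `pdflatex`, then `bibtex`, then `pdflatex` twice resolves the citation.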