antoinelouis committed
Commit 9719d3c • 1 Parent(s): 0bb8295
Update README.md
README.md CHANGED
@@ -4,14 +4,18 @@ language:
 license:
 - mit
 widget:
-- text:
-- text:
-- text:
+- text: Hier, Elon Musk a
+- text: Pourquoi a-t-il
+- text: Tout à coup, elle
+metrics:
+- perplexity
+library_name: transformers
+pipeline_tag: text-generation
 ---
 
-#
+# BelGPT-2
 
-**
+**The 1st GPT-2 model pre-trained on a very large and heterogeneous French corpus (~60Gb).**
 
 ## Usage
 
@@ -60,7 +64,7 @@ Below is the list of all French copora used to pre-trained the model:
 
 ## Documentation
 
-Detailed documentation on the pre-trained model, its implementation, and the data can be found [here](https://github.com/
+Detailed documentation on the pre-trained model, its implementation, and the data can be found [here](https://github.com/ant-louis/belgpt2/blob/master/docs/index.md).
 
 ## Citation
 
@@ -69,8 +73,8 @@ For attribution in academic contexts, please cite this work as:
 ```
 @misc{louis2020belgpt2,
   author = {Louis, Antoine},
-  title = {{BelGPT-2:
+  title = {{BelGPT-2: A GPT-2 Model Pre-trained on French Corpora}},
   year = {2020},
-  howpublished = {\url{https://github.com/
+  howpublished = {\url{https://github.com/ant-louis/belgpt2}},
 }
 ```
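The new front matter declares `library_name: transformers` and `pipeline_tag: text-generation`, so the widget prompts added in this commit map onto the standard `transformers` text-generation pipeline. A minimal sketch, assuming the model id is `antoinelouis/belgpt2` (inferred from the repository owner and model name, not stated in the diff):

```python
# Minimal sketch: load the model declared by the new front matter
# (library_name: transformers, pipeline_tag: text-generation) and
# complete one of the widget prompts.
# The model id "antoinelouis/belgpt2" is an assumption.
from transformers import pipeline

generator = pipeline("text-generation", model="antoinelouis/belgpt2")

# One of the widget prompts added in this commit.
outputs = generator(
    "Hier, Elon Musk a",
    max_new_tokens=30,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```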