Update README.md

---
license: apache-2.0
language:
- it
---
--------------------------------------------------------------------------------------------------

<body>
<span class="vertical-text" style="background-color:lightgreen;border-radius: 3px;padding: 3px;"> </span>
<br>
<span class="vertical-text" style="background-color:orange;border-radius: 3px;padding: 3px;"> </span>
<br>
<span class="vertical-text" style="background-color:lightblue;border-radius: 3px;padding: 3px;"> Model: DistilBERT</span>
<br>
<span class="vertical-text" style="background-color:tomato;border-radius: 3px;padding: 3px;"> Lang: IT</span>
<br>
<span class="vertical-text" style="background-color:lightgrey;border-radius: 3px;padding: 3px;"> </span>
<br>
<span class="vertical-text" style="background-color:#CF9FFF;border-radius: 3px;padding: 3px;"> </span>
</body>

--------------------------------------------------------------------------------------------------

<h3>Model description</h3>

This is a <b>DistilBERT</b> <b>[1]</b> model for the <b>Italian</b> language, obtained from the multilingual <b>DistilBERT</b> ([distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased)) by focusing it on Italian through a modified embedding layer (as in <b>[2]</b>, computing document-level frequencies over the <b>Wikipedia</b> dataset).

The resulting model has 67M parameters, a vocabulary of 30,785 tokens, and a size of ~270 MB.
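
The embedding-layer adaptation can be pictured as follows: document-level token frequencies are computed over an Italian corpus with the multilingual tokenizer, the most frequent subwords are kept, and the corresponding rows of the original embedding matrix are extracted. The minimal sketch below illustrates this idea from <b>[2]</b>; the corpus, the cutoff, and the variable names are illustrative assumptions, not the exact procedure used to build this model.

```python
# Illustrative sketch of the vocabulary-reduction idea from [2].
# The corpus and the cutoff are placeholders, not the actual setup.
from collections import Counter

import torch
from transformers import DistilBertModel, DistilBertTokenizerFast

tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-multilingual-cased")
model = DistilBertModel.from_pretrained("distilbert-base-multilingual-cased")

# Placeholder corpus: in practice, Italian Wikipedia documents
italian_docs = ["Roma è la capitale d'Italia.", "La Divina Commedia è un poema di Dante Alighieri."]

# Document-level frequency: each token counted at most once per document
doc_freq = Counter()
for doc in italian_docs:
    doc_freq.update(set(tokenizer(doc, add_special_tokens=False)["input_ids"]))

# Keep the special tokens plus the most frequent subword ids (cutoff is arbitrary here)
keep_ids = sorted(set(tokenizer.all_special_ids) | {i for i, _ in doc_freq.most_common(30000)})

# Slice the multilingual embedding matrix down to the kept ids;
# the tokenizer vocabulary would have to be remapped to the new ids accordingly
old_embeddings = model.get_input_embeddings().weight.data
new_embeddings = torch.nn.Embedding(len(keep_ids), old_embeddings.shape[1])
new_embeddings.weight.data.copy_(old_embeddings[keep_ids])
model.set_input_embeddings(new_embeddings)
model.config.vocab_size = len(keep_ids)
```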

<h3>Quick usage</h3>

```python
from transformers import DistilBertTokenizerFast, DistilBertModel

tokenizer = DistilBertTokenizerFast.from_pretrained("osiria/distilbert-base-italian-cased")
model = DistilBertModel.from_pretrained("osiria/distilbert-base-italian-cased")
```
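
Once loaded, the model can be used as a plain text encoder. A minimal example, reusing the tokenizer and model from above (the Italian sentence is just an illustration):

```python
import torch

inputs = tokenizer("Il gatto dorme sul divano.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings, shape: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```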

<h3>References</h3>

[1] https://arxiv.org/abs/1910.01108

[2] https://arxiv.org/abs/2010.05609

<h3>License</h3>

The model is released under the <b>Apache-2.0</b> license.