---
license: apache-2.0
language:
  - it
widget:
  - text: milano è una [MASK] dell'italia
    example_title: Example 1
  - text: >-
      giacomo leopardi è stato uno dei più grandi [MASK] del classicismo
      italiano
    example_title: Example 2
  - text: la pizza è un piatto tipico della [MASK] gastronomica italiana
    example_title: Example 3
---


  
- **Model**: BERT
- **Lang**: IT
- **Type**: Uncased

## Model description

This is an uncased BERT [1] model for the Italian language, obtained by using the uncased mBERT (bert-base-multilingual-uncased) as a starting point and specializing it for Italian through a modification of the embedding layer (as in [2], computing document-level frequencies over the Wikipedia dataset).
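
The embedding-layer reduction relies on document-level token frequencies, i.e. counting in how many documents a token occurs rather than how many times it occurs overall. A minimal sketch of that counting step is shown below; this is an illustration under assumed names (`document_frequencies`, `select_vocabulary` are hypothetical helpers), not the authors' exact procedure, which additionally re-maps mBERT's embedding matrix onto the reduced vocabulary.

```python
from collections import Counter

def document_frequencies(documents, tokenize=str.split):
    # Count, for each token, the number of documents it appears in
    # (document-level frequency, not raw corpus-level counts).
    df = Counter()
    for doc in documents:
        df.update(set(tokenize(doc.lower())))
    return df

def select_vocabulary(documents, vocab_size):
    # Keep the vocab_size tokens with the highest document frequency.
    df = document_frequencies(documents)
    return [token for token, _ in df.most_common(vocab_size)]

# Toy corpus for illustration only.
docs = [
    "milano è una città dell'italia",
    "la pizza è un piatto tipico della tradizione italiana",
    "milano è in italia",
]
vocab = select_vocabulary(docs, vocab_size=10)
```

In the full procedure, running this over a Wikipedia dump yields the language-specific vocabulary (30,154 tokens here), and only the corresponding rows of mBERT's embedding matrix are retained.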

The resulting model has 110M parameters, a vocabulary of 30,154 tokens, and a size of ~430 MB.

## Quick usage

```python
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("osiria/bert-base-italian-uncased")
model = BertModel.from_pretrained("osiria/bert-base-italian-uncased")
```
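
Since the widget examples above demonstrate masked-token prediction, a fill-mask pipeline is a natural way to try the model end to end. The sketch below assumes the checkpoint can be downloaded from the Hugging Face Hub; the output format (a list of candidate completions with scores) follows the standard `transformers` fill-mask pipeline.

```python
from transformers import pipeline

# Fill-mask inference sketch (downloads the checkpoint on first use).
fill_mask = pipeline("fill-mask", model="osiria/bert-base-italian-uncased")
predictions = fill_mask("milano è una [MASK] dell'italia")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```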

## References

[1] https://arxiv.org/abs/1810.04805

[2] https://arxiv.org/abs/2010.05609

## License

The model is released under the Apache-2.0 license.