BERT base uncased model pre-trained on 5 NER datasets
The model was trained by SberIDP. The pretraining process and technical details are described in this article.
- Task: Named Entity Recognition
- Base model: bert-base-uncased
- Training data (5 datasets): CoNLL-2003, WNUT17, JNLPBA, CoNLL-2012 (OntoNotes), BTC
- Testing was performed in a few-shot scenario on the Few-NERD dataset, using the model as a backbone for StructShot (see the loading sketch after this list)
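A minimal sketch of loading the checkpoint with 🤗 Transformers and extracting contextual token embeddings, the way a nearest-neighbor few-shot tagger such as StructShot would consume the encoder. Only the model id comes from this page; the example sentence and everything else are illustrative.

```python
# Minimal sketch: load the checkpoint as a plain BERT encoder and extract
# token embeddings, as a StructShot-style few-shot tagger would consume them.
# The model id is the Hub repository; the sentence is an illustrative placeholder.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "ai-forever/bert-base-NER-reptile-5-datasets"
tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = AutoModel.from_pretrained(model_id)
encoder.eval()

sentence = "Barack Obama visited Berlin in 2013."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = encoder(**inputs)

# One contextual embedding per word piece (batch, seq_len, hidden=768);
# a nearest-neighbor tagger compares these against support-set token embeddings.
token_embeddings = outputs.last_hidden_state
print(token_embeddings.shape)
```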
The model is pretrained for the NER task using Reptile and can be fine-tuned for new entity types with only a small number of samples (see the fine-tuning sketch below).
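A minimal fine-tuning sketch for adapting the checkpoint to a new entity type from a handful of examples. The label scheme (B-PRODUCT/I-PRODUCT), the two support sentences, and the hyperparameters are hypothetical placeholders, not the authors' training recipe.

```python
# Minimal fine-tuning sketch for a new entity type with only a few samples.
# Labels, example sentences, and hyperparameters are illustrative placeholders.
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "ai-forever/bert-base-NER-reptile-5-datasets"
labels = ["O", "B-PRODUCT", "I-PRODUCT"]  # hypothetical new entity type

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(
    model_id,
    num_labels=len(labels),
    ignore_mismatched_sizes=True,  # swap the pretrained head for a fresh one sized to the new labels
)

# A handful of word-level annotated samples (the few-shot support set).
support_set = [
    (["I", "bought", "an", "iPhone", "yesterday"], ["O", "O", "O", "B-PRODUCT", "O"]),
    (["The", "new", "Galaxy", "Tab", "is", "out"], ["O", "O", "B-PRODUCT", "I-PRODUCT", "O", "O"]),
]

def encode(words, tags):
    """Tokenize pre-split words; every word piece inherits its word's tag (simplified alignment)."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    aligned = [
        -100 if word_id is None else labels.index(tags[word_id])
        for word_id in enc.word_ids()
    ]
    enc["labels"] = torch.tensor([aligned])
    return enc

optimizer = AdamW(model.parameters(), lr=3e-5)
model.train()
for epoch in range(10):  # a few epochs are usually enough for a tiny support set
    for words, tags in support_set:
        batch = encode(words, tags)
        loss = model(**batch).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```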
Evaluation results (self-reported, Few-NERD inter split)

| Setting | Score |
| --- | --- |
| 5-way 1~2-shot | 56.120 |
| 5-way 5~10-shot | 62.700 |
| 10-way 1~2-shot | 50.300 |
| 10-way 5~10-shot | 58.820 |