ONNX version of dslim/bert-large-NER
This model is a conversion of dslim/bert-large-NER to ONNX format using the 🤗 Optimum library.
bert-large-NER is a fine-tuned BERT model that is ready to use for Named Entity Recognition (NER) and achieves state-of-the-art performance on this task. It has been trained to recognize four types of entities: location (LOC), organization (ORG), person (PER), and miscellaneous (MISC).
Specifically, this model is a bert-large-cased model that was fine-tuned on the English version of the standard CoNLL-2003 Named Entity Recognition dataset.
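The conversion from the original PyTorch checkpoint can be reproduced with Optimum's ONNX export. The sketch below uses the `export=True` flag of `ORTModelForTokenClassification.from_pretrained`; the output directory name is illustrative.

```python
from optimum.onnxruntime import ORTModelForTokenClassification
from transformers import AutoTokenizer

# Export the original PyTorch checkpoint to ONNX on the fly
model = ORTModelForTokenClassification.from_pretrained("dslim/bert-large-NER", export=True)
tokenizer = AutoTokenizer.from_pretrained("dslim/bert-large-NER")

# Save the exported ONNX model and tokenizer (directory name is illustrative)
model.save_pretrained("bert-large-NER-onnx")
tokenizer.save_pretrained("bert-large-NER-onnx")
```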
Usage
Loading the model requires the 🤗 Optimum library to be installed (for example, `pip install optimum[onnxruntime]`).
```python
from optimum.onnxruntime import ORTModelForTokenClassification
from transformers import AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained("laiyer/bert-large-NER-onnx")
model = ORTModelForTokenClassification.from_pretrained("laiyer/bert-large-NER-onnx")

ner = pipeline(
    task="ner",
    model=model,
    tokenizer=tokenizer,
)

ner_output = ner("My name is John Doe.")
print(ner_output)
```
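If you prefer whole entity spans instead of per-token predictions, the standard `aggregation_strategy` parameter of the token-classification pipeline can be used. This is an optional refinement rather than a requirement of the model:

```python
# Optional: merge sub-word tokens into whole entity spans
# (continues from the snippet above; "simple" is one of several strategies)
ner_grouped = pipeline(
    task="ner",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(ner_grouped("My name is John Doe."))
```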
LLM Guard
This model is used by LLM Guard, ProtectAI's open-source toolkit for securing interactions with large language models, for example to detect named entities in prompts as part of anonymization.
Community
Join our Slack to give us feedback, connect with the maintainers and fellow users, ask questions, or engage in discussions about LLM security!
Evaluation results
All metrics are self-reported on the CoNLL-2003 test set:
- Accuracy: 0.903
- Precision: 0.920
- Recall: 0.919
- F1: 0.920
- Loss: 0.509