---
language:
  - en
tags:
  - biomedical
  - bionlp
  - entity linking
  - embedding
  - bert
---

GEBERT: a biomedical entity representation model pre-trained with a GAT graph encoder.

The model was published at the CLEF 2023 conference. The source code is available on GitHub.

Pretraining data: biomedical concept graph and concept names from the UMLS (2020AB release).

Base model: microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext.
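Since GEBERT produces concept-name embeddings for entity linking, a typical use is to encode mention strings and rank candidate concept names by cosine similarity. The sketch below assumes the standard Hugging Face `AutoModel`/`AutoTokenizer` interface and uses a [CLS]-pooled embedding; the model identifier passed to `embed` is a placeholder, not confirmed by this card — check this repository's page for the exact id and pooling strategy.

```python
# Hedged usage sketch for a BERT-style concept-embedding model.
# The model id argument below is hypothetical; substitute this repo's id.
from math import sqrt


def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)


def embed(names, model_id="<this-repo-id>"):
    """Return [CLS] embeddings (one row per input string).

    Lazy imports keep the pure-Python helper above dependency-free.
    """
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModel.from_pretrained(model_id)
    model.eval()
    with torch.no_grad():
        batch = tokenizer(names, padding=True, return_tensors="pt")
        out = model(**batch)
    # [CLS] token embedding as the concept-name representation
    # (an assumption -- mean pooling is another common choice).
    return out.last_hidden_state[:, 0, :].tolist()
```

A candidate ranking step would then compute `cosine(embed([mention])[0], e)` against each pre-computed concept embedding `e` and pick the argmax.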

```bibtex
@inproceedings{sakhovskiy2023gebert,
  author    = "Sakhovskiy, Andrey and Semenova, Natalia and Kadurin, Artur and Tutubalina, Elena",
  title     = "Graph-Enriched Biomedical Entity Representation Transformer",
  booktitle = "Experimental IR Meets Multilinguality, Multimodality, and Interaction",
  year      = "2023",
  publisher = "Springer Nature Switzerland",
  address   = "Cham",
  pages     = "109--120",
  isbn      = "978-3-031-42448-9"
}
```