ngwlh committed on
Commit ce7f068
1 Parent(s): ceb5e6a

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -12,7 +12,7 @@ tags:
 # KBioXLM
 
 The aligned corpus constructed using the knowledge-anchored method is combined with a multi-task training strategy to continue training XLM-R, thus obtaining KBioXLM. To our knowledge, it is the first multilingual biomedical pre-trained language model with cross-lingual understanding capabilities in the medical domain. It was introduced in the paper [KBioXLM: A Knowledge-anchored Biomedical
-Multilingual Pretrained Language Model]() by Geng et al. and first released in [this repository](https://github.com/ngwlh-gl/KBioXLM/tree/main).
+Multilingual Pretrained Language Model](http://arxiv.org/abs/2311.11564) by Geng et al. and first released in [this repository](https://github.com/ngwlh-gl/KBioXLM/tree/main).
 
 ## Model description
 The KBioXLM model can be fine-tuned on downstream tasks. Here, downstream tasks refer to biomedical cross-lingual understanding tasks, such as biomedical named entity recognition, biomedical relation extraction, and biomedical text classification.
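For reference, a checkpoint like the one described in this README can be fine-tuned as an ordinary XLM-RoBERTa encoder with the `transformers` library. The sketch below is illustrative only: the Hub identifier `ngwlh/KBioXLM` and the three-label NER scheme are assumptions, not details taken from this model card.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed Hub identifier; replace with the actual ID of this model card.
model_id = "ngwlh/KBioXLM"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# KBioXLM continues pre-training from XLM-R, so it loads as an XLM-RoBERTa
# encoder; here a token-classification head is attached for biomedical NER
# with a hypothetical 3-label tagging scheme (O, B-Entity, I-Entity).
model = AutoModelForTokenClassification.from_pretrained(model_id, num_labels=3)

inputs = tokenizer("Aspirin inhibits platelet aggregation.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, num_labels)
```

The same checkpoint could instead be wrapped with `AutoModelForSequenceClassification` for the relation-extraction or text-classification tasks mentioned above; only the task head and label count change.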