---
language:
- lzh
tags:
- classical chinese
- literary chinese
- ancient chinese
- token-classification
- pos
- dependency-parsing
base_model: KoichiYasuoka/Xunzi-Qwen2-1.5B-upos
datasets:
- universal_dependencies
license: apache-2.0
pipeline_tag: token-classification
widget:
- text: 子曰學而時習之不亦説乎有朋自遠方來不亦樂乎人不知而不慍不亦君子乎
---
# Xunzi-Qwen2-1.5B-ud-causal
## Model Description
This is a Qwen2 model pre-trained on Classical Chinese texts for POS-tagging and dependency-parsing, derived from [Xunzi-Qwen2-1.5B-upos](https://huggingface.co/KoichiYasuoka/Xunzi-Qwen2-1.5B-upos) and [UD_Classical_Chinese-Kyoto](https://github.com/UniversalDependencies/UD_Classical_Chinese-Kyoto).
## How to Use
```python
from transformers import pipeline
nlp=pipeline("universal-dependencies","KoichiYasuoka/Xunzi-Qwen2-1.5B-ud-causal",trust_remote_code=True)
print(nlp("不入虎穴不得虎子"))
```
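The exact return format of the pipeline is not shown above; assuming it emits standard CoNLL-U text (ten tab-separated columns per token, comment lines starting with `#`), a minimal sketch for turning that text into `(id, form, upos, head, deprel)` tuples follows. The sample string is illustrative data, not actual model output:

```python
# Minimal CoNLL-U reader (assumption: the pipeline returns CoNLL-U text).
# Columns: ID FORM LEMMA UPOS XPOS FEATS HEAD DEPREL DEPS MISC.
def read_conllu(text):
    tokens = []
    for line in text.splitlines():
        if not line or line.startswith("#"):
            continue  # skip blank and comment lines
        cols = line.split("\t")
        if len(cols) != 10 or not cols[0].isdigit():
            continue  # skip malformed, multiword-token, and empty-node lines
        tokens.append((int(cols[0]), cols[1], cols[3], int(cols[6]), cols[7]))
    return tokens

# Illustrative CoNLL-U fragment for 不入虎穴 (hand-written example):
sample = (
    "# text = 不入虎穴\n"
    "1\t不\t不\tADV\t_\t_\t2\tadvmod\t_\t_\n"
    "2\t入\t入\tVERB\t_\t_\t0\troot\t_\t_\n"
    "3\t虎\t虎\tNOUN\t_\t_\t4\tnmod\t_\t_\n"
    "4\t穴\t穴\tNOUN\t_\t_\t2\tobj\t_\t_\n"
)
for tok in read_conllu(sample):
    print(tok)
```

Each tuple links a token to its syntactic head by ID (`0` marks the root), so the output can be walked as a dependency tree.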