KoichiYasuoka committed on
Commit
f59d332
1 Parent(s): c049273

initial release

README.md ADDED
@@ -0,0 +1,33 @@
+ ---
+ language:
+ - lzh
+ tags:
+ - classical chinese
+ - literary chinese
+ - ancient chinese
+ - token-classification
+ - pos
+ - dependency-parsing
+ base_model: KoichiYasuoka/Xunzi-Qwen2-1.5B-upos
+ datasets:
+ - universal_dependencies
+ license: apache-2.0
+ pipeline_tag: token-classification
+ widget:
+ - text: 子曰學而時習之不亦説乎有朋自遠方來不亦樂乎人不知而不慍不亦君子乎
+ ---
+
+ # Xunzi-Qwen2-1.5B-ud-causal
+
+ ## Model Description
+
+ This is a Qwen2 model pretrained for POS-tagging and dependency-parsing, derived from [Xunzi-Qwen2-1.5B-upos](https://huggingface.co/KoichiYasuoka/Xunzi-Qwen2-1.5B-upos) and [UD_Classical_Chinese-Kyoto](https://github.com/UniversalDependencies/UD_Classical_Chinese-Kyoto).
+
+ ## How to Use
+
+ ```
+ from transformers import pipeline
+ nlp = pipeline("universal-dependencies", "KoichiYasuoka/Xunzi-Qwen2-1.5B-ud-causal", trust_remote_code=True)
+ print(nlp("不入虎穴不得虎子"))
+ ```
+
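The `universal-dependencies` pipeline above is implemented by the repository's custom `ud.py` (registered under `custom_pipelines` in the `config.json` below), which decodes token-classification outputs into a CoNLL-U dependency parse. As a minimal sketch of the underlying token classifier only — assuming a `transformers` version recent enough to support `Qwen2ForTokenClassification` — the raw per-token labels (UPOS plus features plus an `l-`/`r-` arc hint such as `VERB|root`, drawn from the `id2label` map below) can be inspected directly:

```
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

mdl_id = "KoichiYasuoka/Xunzi-Qwen2-1.5B-ud-causal"
tkz = AutoTokenizer.from_pretrained(mdl_id)
mdl = AutoModelForTokenClassification.from_pretrained(mdl_id)
enc = tkz("不入虎穴不得虎子", return_tensors="pt")
with torch.no_grad():
    logits = mdl(**enc).logits  # shape: (1, sequence_length, num_labels)
# print each token with its highest-scoring label from config.json's id2label
for tok, i in zip(tkz.convert_ids_to_tokens(enc["input_ids"][0]),
                  logits[0].argmax(dim=-1).tolist()):
    print(tok, mdl.config.id2label[i])
```

Note this only shows per-token argmax labels; the full pipeline additionally runs the Bellman-Ford-style decoding from `ud.py` to pick consistent heads and relations, so its output will generally differ from this raw view.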
config.json ADDED
@@ -0,0 +1,1676 @@
+ {
+ "architectures": [
+ "Qwen2ForTokenClassification"
+ ],
+ "attention_dropout": 0.0,
+ "bos_token_id": 151643,
+ "custom_pipelines": {
+ "upos": {
+ "impl": "ud.BellmanFordTokenClassificationPipeline",
+ "pt": "AutoModelForTokenClassification"
+ },
+ "universal-dependencies": {
+ "impl": "ud.UniversalDependenciesCausalPipeline",
+ "pt": "AutoModelForTokenClassification"
+ }
+ },
+ "eos_token_id": 151643,
+ "hidden_act": "silu",
+ "hidden_size": 1536,
+ "id2label": {
+ "0": "ADP",
+ "1": "ADP|Degree=Equ",
+ "2": "ADP|Degree=Equ|l-cc",
+ "3": "ADP|l-acl",
+ "4": "ADP|l-advcl",
+ "5": "ADP|l-amod",
+ "6": "ADP|l-case",
+ "7": "ADP|l-cc",
+ "8": "ADP|l-mark",
+ "9": "ADP|l-nsubj",
+ "10": "ADP|l-obl",
+ "11": "ADP|r-case",
+ "12": "ADP|r-conj",
+ "13": "ADP|r-fixed",
+ "14": "ADP|r-mark",
+ "15": "ADP|r-obj",
+ "16": "ADP|root",
+ "17": "ADV",
+ "18": "ADV|AdvType=Cau",
+ "19": "ADV|AdvType=Cau|l-advmod",
+ "20": "ADV|AdvType=Cau|l-amod",
+ "21": "ADV|AdvType=Cau|l-nsubj",
+ "22": "ADV|AdvType=Cau|l-obj",
+ "23": "ADV|AdvType=Deg|Degree=Cmp",
+ "24": "ADV|AdvType=Deg|Degree=Cmp|l-advmod",
+ "25": "ADV|AdvType=Deg|Degree=Cmp|l-amod",
+ "26": "ADV|AdvType=Deg|Degree=Cmp|r-conj",
+ "27": "ADV|AdvType=Deg|Degree=Cmp|r-obj",
+ "28": "ADV|AdvType=Deg|Degree=Pos",
+ "29": "ADV|AdvType=Deg|Degree=Pos|l-advmod",
+ "30": "ADV|AdvType=Deg|Degree=Pos|l-amod",
+ "31": "ADV|AdvType=Deg|Degree=Pos|r-ccomp",
+ "32": "ADV|AdvType=Deg|Degree=Pos|r-conj",
+ "33": "ADV|AdvType=Deg|Degree=Pos|r-flat:vv",
+ "34": "ADV|AdvType=Deg|Degree=Pos|r-parataxis",
+ "35": "ADV|AdvType=Deg|Degree=Pos|root",
+ "36": "ADV|AdvType=Deg|Degree=Sup",
+ "37": "ADV|AdvType=Deg|Degree=Sup|l-advmod",
+ "38": "ADV|AdvType=Deg|Degree=Sup|l-amod",
+ "39": "ADV|AdvType=Deg|Degree=Sup|l-nsubj",
+ "40": "ADV|AdvType=Deg|Degree=Sup|r-conj",
+ "41": "ADV|AdvType=Deg|Degree=Sup|r-parataxis",
+ "42": "ADV|AdvType=Deg|Degree=Sup|root",
+ "43": "ADV|AdvType=Tim",
+ "44": "ADV|AdvType=Tim|Aspect=Perf",
+ "45": "ADV|AdvType=Tim|Aspect=Perf|l-advmod",
+ "46": "ADV|AdvType=Tim|Aspect=Perf|l-amod",
+ "47": "ADV|AdvType=Tim|Aspect=Perf|l-obl:lmod",
+ "48": "ADV|AdvType=Tim|Aspect=Perf|r-parataxis",
+ "49": "ADV|AdvType=Tim|Aspect=Perf|root",
+ "50": "ADV|AdvType=Tim|Tense=Fut",
+ "51": "ADV|AdvType=Tim|Tense=Fut|l-advmod",
+ "52": "ADV|AdvType=Tim|Tense=Fut|l-amod",
+ "53": "ADV|AdvType=Tim|Tense=Fut|l-nsubj",
+ "54": "ADV|AdvType=Tim|Tense=Fut|l-nsubj:outer",
+ "55": "ADV|AdvType=Tim|Tense=Fut|root",
+ "56": "ADV|AdvType=Tim|Tense=Past",
+ "57": "ADV|AdvType=Tim|Tense=Past|l-advmod",
+ "58": "ADV|AdvType=Tim|Tense=Past|l-amod",
+ "59": "ADV|AdvType=Tim|Tense=Pres",
+ "60": "ADV|AdvType=Tim|Tense=Pres|l-advmod",
+ "61": "ADV|AdvType=Tim|Tense=Pres|l-amod",
+ "62": "ADV|AdvType=Tim|Tense=Pres|root",
+ "63": "ADV|AdvType=Tim|l-advcl",
+ "64": "ADV|AdvType=Tim|l-advmod",
+ "65": "ADV|AdvType=Tim|l-amod",
+ "66": "ADV|AdvType=Tim|l-nsubj",
+ "67": "ADV|AdvType=Tim|r-advmod",
+ "68": "ADV|AdvType=Tim|r-ccomp",
+ "69": "ADV|AdvType=Tim|r-compound:redup",
+ "70": "ADV|AdvType=Tim|r-conj",
+ "71": "ADV|AdvType=Tim|r-flat:vv",
+ "72": "ADV|AdvType=Tim|r-parataxis",
+ "73": "ADV|AdvType=Tim|root",
+ "74": "ADV|Degree=Equ|VerbForm=Conv",
+ "75": "ADV|Degree=Equ|VerbForm=Conv|l-advmod",
+ "76": "ADV|Degree=Pos|VerbForm=Conv",
+ "77": "ADV|Degree=Pos|VerbForm=Conv|l-advmod",
+ "78": "ADV|Degree=Pos|VerbForm=Conv|r-advmod",
+ "79": "ADV|Polarity=Neg",
+ "80": "ADV|Polarity=Neg|VerbForm=Conv",
+ "81": "ADV|Polarity=Neg|VerbForm=Conv|l-advmod",
+ "82": "ADV|Polarity=Neg|l-advmod",
+ "83": "ADV|Polarity=Neg|l-amod",
+ "84": "ADV|Polarity=Neg|l-nsubj",
+ "85": "ADV|Polarity=Neg|l-parataxis",
+ "86": "ADV|Polarity=Neg|r-advmod",
+ "87": "ADV|Polarity=Neg|r-conj",
+ "88": "ADV|Polarity=Neg|r-obj",
+ "89": "ADV|Polarity=Neg|r-parataxis",
+ "90": "ADV|Polarity=Neg|root",
+ "91": "ADV|VerbForm=Conv",
+ "92": "ADV|VerbForm=Conv|l-advmod",
+ "93": "ADV|VerbForm=Conv|r-advmod",
+ "94": "ADV|l-acl",
+ "95": "ADV|l-advcl",
+ "96": "ADV|l-advmod",
+ "97": "ADV|l-amod",
+ "98": "ADV|l-cc",
+ "99": "ADV|l-nsubj",
+ "100": "ADV|r-advmod",
+ "101": "ADV|r-ccomp",
+ "102": "ADV|r-conj",
+ "103": "ADV|r-flat:vv",
+ "104": "ADV|r-obj",
+ "105": "ADV|root",
+ "106": "AUX|Mood=Des",
+ "107": "AUX|Mood=Des|l-aux",
+ "108": "AUX|Mood=Des|l-csubj",
+ "109": "AUX|Mood=Des|l-parataxis",
+ "110": "AUX|Mood=Des|r-ccomp",
+ "111": "AUX|Mood=Des|r-conj",
+ "112": "AUX|Mood=Des|r-flat:vv",
+ "113": "AUX|Mood=Des|root",
+ "114": "AUX|Mood=Nec",
+ "115": "AUX|Mood=Nec|l-acl",
+ "116": "AUX|Mood=Nec|l-amod",
+ "117": "AUX|Mood=Nec|l-aux",
+ "118": "AUX|Mood=Nec|r-aux",
+ "119": "AUX|Mood=Nec|root",
+ "120": "AUX|Mood=Pot",
+ "121": "AUX|Mood=Pot|l-acl",
+ "122": "AUX|Mood=Pot|l-advcl",
+ "123": "AUX|Mood=Pot|l-amod",
+ "124": "AUX|Mood=Pot|l-aux",
+ "125": "AUX|Mood=Pot|l-csubj",
+ "126": "AUX|Mood=Pot|l-nsubj",
+ "127": "AUX|Mood=Pot|r-ccomp",
+ "128": "AUX|Mood=Pot|r-conj",
+ "129": "AUX|Mood=Pot|r-obj",
+ "130": "AUX|Mood=Pot|r-parataxis",
+ "131": "AUX|Mood=Pot|r-xcomp",
+ "132": "AUX|Mood=Pot|root",
+ "133": "AUX|VerbType=Cop",
+ "134": "AUX|VerbType=Cop|l-cop",
+ "135": "AUX|Voice=Pass",
+ "136": "AUX|Voice=Pass|l-aux",
+ "137": "AUX|Voice=Pass|r-conj",
+ "138": "AUX|Voice=Pass|root",
+ "139": "B-ADP",
+ "140": "B-ADP|Degree=Equ",
+ "141": "B-ADV",
+ "142": "B-ADV|AdvType=Cau",
+ "143": "B-ADV|AdvType=Deg|Degree=Cmp",
+ "144": "B-ADV|AdvType=Deg|Degree=Pos",
+ "145": "B-ADV|AdvType=Deg|Degree=Sup",
+ "146": "B-ADV|AdvType=Tim",
+ "147": "B-ADV|AdvType=Tim|Aspect=Perf",
+ "148": "B-ADV|AdvType=Tim|Tense=Fut",
+ "149": "B-ADV|AdvType=Tim|Tense=Past",
+ "150": "B-ADV|AdvType=Tim|Tense=Pres",
+ "151": "B-ADV|Degree=Equ|VerbForm=Conv",
+ "152": "B-ADV|Degree=Pos|VerbForm=Conv",
+ "153": "B-ADV|Polarity=Neg",
+ "154": "B-ADV|Polarity=Neg|VerbForm=Conv",
+ "155": "B-ADV|VerbForm=Conv",
+ "156": "B-AUX|Mood=Des",
+ "157": "B-AUX|Mood=Nec",
+ "158": "B-AUX|Mood=Pot",
+ "159": "B-AUX|VerbType=Cop",
+ "160": "B-AUX|Voice=Pass",
+ "161": "B-CCONJ",
+ "162": "B-INTJ",
+ "163": "B-NOUN",
+ "164": "B-NOUN|Case=Loc",
+ "165": "B-NOUN|Case=Tem",
+ "166": "B-NOUN|Degree=Pos",
+ "167": "B-NOUN|NounType=Clf",
+ "168": "B-NUM",
+ "169": "B-NUM|NumType=Ord",
+ "170": "B-PART",
+ "171": "B-PRON|Person=1|PronType=Prs",
+ "172": "B-PRON|Person=2|PronType=Prs",
+ "173": "B-PRON|Person=3|PronType=Prs",
+ "174": "B-PRON|PronType=Dem",
+ "175": "B-PRON|PronType=Int",
+ "176": "B-PRON|PronType=Prs",
+ "177": "B-PRON|PronType=Prs|Reflex=Yes",
+ "178": "B-PROPN",
+ "179": "B-PROPN|Case=Loc|NameType=Geo",
+ "180": "B-PROPN|Case=Loc|NameType=Nat",
+ "181": "B-PROPN|NameType=Giv",
+ "182": "B-PROPN|NameType=Prs",
+ "183": "B-PROPN|NameType=Sur",
+ "184": "B-PUNCT",
+ "185": "B-SCONJ",
+ "186": "B-SYM",
+ "187": "B-VERB",
+ "188": "B-VERB|Degree=Equ",
+ "189": "B-VERB|Degree=Equ|VerbForm=Part",
+ "190": "B-VERB|Degree=Pos",
+ "191": "B-VERB|Degree=Pos|VerbForm=Part",
+ "192": "B-VERB|Polarity=Neg",
+ "193": "B-VERB|Polarity=Neg|VerbForm=Part",
+ "194": "B-VERB|VerbForm=Part",
+ "195": "CCONJ",
+ "196": "CCONJ|l-advmod",
+ "197": "CCONJ|l-amod",
+ "198": "CCONJ|l-cc",
+ "199": "CCONJ|l-obj",
+ "200": "CCONJ|r-fixed",
+ "201": "CCONJ|r-orphan",
+ "202": "I-ADP",
+ "203": "I-ADP|Degree=Equ",
+ "204": "I-ADV",
+ "205": "I-ADV|AdvType=Cau",
+ "206": "I-ADV|AdvType=Deg|Degree=Cmp",
+ "207": "I-ADV|AdvType=Deg|Degree=Pos",
+ "208": "I-ADV|AdvType=Deg|Degree=Sup",
+ "209": "I-ADV|AdvType=Tim",
+ "210": "I-ADV|AdvType=Tim|Aspect=Perf",
+ "211": "I-ADV|AdvType=Tim|Tense=Fut",
+ "212": "I-ADV|AdvType=Tim|Tense=Past",
+ "213": "I-ADV|AdvType=Tim|Tense=Pres",
+ "214": "I-ADV|Degree=Equ|VerbForm=Conv",
+ "215": "I-ADV|Degree=Pos|VerbForm=Conv",
+ "216": "I-ADV|Polarity=Neg",
+ "217": "I-ADV|Polarity=Neg|VerbForm=Conv",
+ "218": "I-ADV|VerbForm=Conv",
+ "219": "I-AUX|Mood=Des",
+ "220": "I-AUX|Mood=Nec",
+ "221": "I-AUX|Mood=Pot",
+ "222": "I-AUX|VerbType=Cop",
+ "223": "I-AUX|Voice=Pass",
+ "224": "I-CCONJ",
+ "225": "I-INTJ",
+ "226": "I-NOUN",
+ "227": "I-NOUN|Case=Loc",
+ "228": "I-NOUN|Case=Tem",
+ "229": "I-NOUN|Degree=Pos",
+ "230": "I-NOUN|NounType=Clf",
+ "231": "I-NUM",
+ "232": "I-NUM|NumType=Ord",
+ "233": "I-PART",
+ "234": "I-PRON|Person=1|PronType=Prs",
+ "235": "I-PRON|Person=2|PronType=Prs",
+ "236": "I-PRON|Person=3|PronType=Prs",
+ "237": "I-PRON|PronType=Dem",
+ "238": "I-PRON|PronType=Int",
+ "239": "I-PRON|PronType=Prs",
+ "240": "I-PRON|PronType=Prs|Reflex=Yes",
+ "241": "I-PROPN",
+ "242": "I-PROPN|Case=Loc|NameType=Geo",
+ "243": "I-PROPN|Case=Loc|NameType=Nat",
+ "244": "I-PROPN|NameType=Giv",
+ "245": "I-PROPN|NameType=Prs",
+ "246": "I-PROPN|NameType=Sur",
+ "247": "I-PUNCT",
+ "248": "I-SCONJ",
+ "249": "I-SYM",
+ "250": "I-VERB",
+ "251": "I-VERB|Degree=Equ",
+ "252": "I-VERB|Degree=Equ|VerbForm=Part",
+ "253": "I-VERB|Degree=Pos",
+ "254": "I-VERB|Degree=Pos|VerbForm=Part",
+ "255": "I-VERB|Polarity=Neg",
+ "256": "I-VERB|Polarity=Neg|VerbForm=Part",
+ "257": "I-VERB|VerbForm=Part",
+ "258": "INTJ",
+ "259": "INTJ|l-advcl",
+ "260": "INTJ|l-csubj",
+ "261": "INTJ|l-discourse",
+ "262": "INTJ|l-discourse:sp",
+ "263": "INTJ|l-dislocated",
+ "264": "INTJ|l-nsubj",
+ "265": "INTJ|l-vocative",
+ "266": "INTJ|r-compound:redup",
+ "267": "INTJ|r-conj",
+ "268": "INTJ|r-discourse:sp",
+ "269": "INTJ|r-dislocated",
+ "270": "INTJ|r-fixed",
+ "271": "INTJ|r-obj",
+ "272": "INTJ|r-parataxis",
+ "273": "INTJ|root",
+ "274": "NOUN",
+ "275": "NOUN|Case=Loc",
+ "276": "NOUN|Case=Loc|l-acl",
+ "277": "NOUN|Case=Loc|l-advcl",
+ "278": "NOUN|Case=Loc|l-amod",
+ "279": "NOUN|Case=Loc|l-clf",
+ "280": "NOUN|Case=Loc|l-compound",
+ "281": "NOUN|Case=Loc|l-csubj",
+ "282": "NOUN|Case=Loc|l-dislocated",
+ "283": "NOUN|Case=Loc|l-nmod",
+ "284": "NOUN|Case=Loc|l-nsubj",
+ "285": "NOUN|Case=Loc|l-nsubj:outer",
+ "286": "NOUN|Case=Loc|l-obj",
+ "287": "NOUN|Case=Loc|l-obl",
+ "288": "NOUN|Case=Loc|l-obl:lmod",
+ "289": "NOUN|Case=Loc|l-obl:tmod",
+ "290": "NOUN|Case=Loc|l-parataxis",
+ "291": "NOUN|Case=Loc|r-ccomp",
+ "292": "NOUN|Case=Loc|r-clf",
+ "293": "NOUN|Case=Loc|r-compound:redup",
+ "294": "NOUN|Case=Loc|r-conj",
+ "295": "NOUN|Case=Loc|r-dislocated",
+ "296": "NOUN|Case=Loc|r-flat",
+ "297": "NOUN|Case=Loc|r-iobj",
+ "298": "NOUN|Case=Loc|r-list",
+ "299": "NOUN|Case=Loc|r-nmod",
+ "300": "NOUN|Case=Loc|r-nsubj",
+ "301": "NOUN|Case=Loc|r-obj",
+ "302": "NOUN|Case=Loc|r-obl",
+ "303": "NOUN|Case=Loc|r-obl:lmod",
+ "304": "NOUN|Case=Loc|r-parataxis",
+ "305": "NOUN|Case=Loc|r-xcomp",
+ "306": "NOUN|Case=Loc|root",
+ "307": "NOUN|Case=Tem",
+ "308": "NOUN|Case=Tem|l-acl",
+ "309": "NOUN|Case=Tem|l-advcl",
+ "310": "NOUN|Case=Tem|l-amod",
+ "311": "NOUN|Case=Tem|l-compound",
+ "312": "NOUN|Case=Tem|l-csubj",
+ "313": "NOUN|Case=Tem|l-nmod",
+ "314": "NOUN|Case=Tem|l-nsubj",
+ "315": "NOUN|Case=Tem|l-nsubj:outer",
+ "316": "NOUN|Case=Tem|l-obj",
+ "317": "NOUN|Case=Tem|l-obl:tmod",
+ "318": "NOUN|Case=Tem|r-amod",
+ "319": "NOUN|Case=Tem|r-ccomp",
+ "320": "NOUN|Case=Tem|r-clf",
+ "321": "NOUN|Case=Tem|r-compound:redup",
+ "322": "NOUN|Case=Tem|r-conj",
+ "323": "NOUN|Case=Tem|r-flat",
+ "324": "NOUN|Case=Tem|r-iobj",
+ "325": "NOUN|Case=Tem|r-list",
+ "326": "NOUN|Case=Tem|r-nsubj",
+ "327": "NOUN|Case=Tem|r-obj",
+ "328": "NOUN|Case=Tem|r-obl:tmod",
+ "329": "NOUN|Case=Tem|r-parataxis",
+ "330": "NOUN|Case=Tem|r-xcomp",
+ "331": "NOUN|Case=Tem|root",
+ "332": "NOUN|Degree=Pos",
+ "333": "NOUN|Degree=Pos|root",
+ "334": "NOUN|NounType=Clf",
+ "335": "NOUN|NounType=Clf|l-clf",
+ "336": "NOUN|NounType=Clf|l-nmod",
+ "337": "NOUN|NounType=Clf|l-nsubj",
+ "338": "NOUN|NounType=Clf|l-obl",
+ "339": "NOUN|NounType=Clf|r-ccomp",
+ "340": "NOUN|NounType=Clf|r-clf",
+ "341": "NOUN|NounType=Clf|r-compound:redup",
+ "342": "NOUN|NounType=Clf|r-conj",
+ "343": "NOUN|NounType=Clf|r-flat",
+ "344": "NOUN|NounType=Clf|r-obj",
+ "345": "NOUN|NounType=Clf|r-parataxis",
+ "346": "NOUN|NounType=Clf|root",
+ "347": "NOUN|l-acl",
+ "348": "NOUN|l-advcl",
+ "349": "NOUN|l-amod",
+ "350": "NOUN|l-ccomp",
+ "351": "NOUN|l-clf",
+ "352": "NOUN|l-compound",
+ "353": "NOUN|l-csubj",
+ "354": "NOUN|l-csubj:outer",
+ "355": "NOUN|l-dislocated",
+ "356": "NOUN|l-iobj",
+ "357": "NOUN|l-list",
+ "358": "NOUN|l-nmod",
+ "359": "NOUN|l-nsubj",
+ "360": "NOUN|l-nsubj:outer",
+ "361": "NOUN|l-nsubj:pass",
+ "362": "NOUN|l-obj",
+ "363": "NOUN|l-obl",
+ "364": "NOUN|l-obl:lmod",
+ "365": "NOUN|l-obl:tmod",
+ "366": "NOUN|l-vocative",
+ "367": "NOUN|r-acl",
+ "368": "NOUN|r-advcl",
+ "369": "NOUN|r-amod",
+ "370": "NOUN|r-ccomp",
+ "371": "NOUN|r-clf",
+ "372": "NOUN|r-compound:redup",
+ "373": "NOUN|r-conj",
+ "374": "NOUN|r-csubj",
+ "375": "NOUN|r-dislocated",
+ "376": "NOUN|r-flat",
+ "377": "NOUN|r-flat:foreign",
+ "378": "NOUN|r-iobj",
+ "379": "NOUN|r-list",
+ "380": "NOUN|r-nmod",
+ "381": "NOUN|r-nsubj",
+ "382": "NOUN|r-obj",
+ "383": "NOUN|r-obl",
+ "384": "NOUN|r-obl:lmod",
+ "385": "NOUN|r-parataxis",
+ "386": "NOUN|r-vocative",
+ "387": "NOUN|r-xcomp",
+ "388": "NOUN|root",
+ "389": "NUM",
+ "390": "NUM|NumType=Ord",
+ "391": "NUM|NumType=Ord|l-nsubj",
+ "392": "NUM|NumType=Ord|l-nummod",
+ "393": "NUM|NumType=Ord|l-obl",
+ "394": "NUM|NumType=Ord|l-obl:lmod",
+ "395": "NUM|NumType=Ord|l-obl:tmod",
+ "396": "NUM|NumType=Ord|r-conj",
+ "397": "NUM|NumType=Ord|r-flat",
+ "398": "NUM|NumType=Ord|r-obj",
+ "399": "NUM|NumType=Ord|root",
+ "400": "NUM|l-acl",
+ "401": "NUM|l-advcl",
+ "402": "NUM|l-compound",
+ "403": "NUM|l-csubj",
+ "404": "NUM|l-dislocated",
+ "405": "NUM|l-nsubj",
+ "406": "NUM|l-nsubj:outer",
+ "407": "NUM|l-nummod",
+ "408": "NUM|l-obj",
+ "409": "NUM|l-obl",
+ "410": "NUM|l-obl:lmod",
+ "411": "NUM|l-obl:tmod",
+ "412": "NUM|r-ccomp",
+ "413": "NUM|r-clf",
+ "414": "NUM|r-compound",
+ "415": "NUM|r-compound:redup",
+ "416": "NUM|r-conj",
+ "417": "NUM|r-flat",
+ "418": "NUM|r-iobj",
+ "419": "NUM|r-list",
+ "420": "NUM|r-nummod",
+ "421": "NUM|r-obj",
+ "422": "NUM|r-obl",
+ "423": "NUM|r-obl:tmod",
+ "424": "NUM|r-parataxis",
+ "425": "NUM|r-xcomp",
+ "426": "NUM|root",
+ "427": "PART",
+ "428": "PART|l-acl",
+ "429": "PART|l-advcl",
+ "430": "PART|l-advmod",
+ "431": "PART|l-amod",
+ "432": "PART|l-case",
+ "433": "PART|l-cc",
+ "434": "PART|l-csubj",
+ "435": "PART|l-csubj:outer",
+ "436": "PART|l-discourse",
+ "437": "PART|l-discourse:sp",
+ "438": "PART|l-dislocated",
+ "439": "PART|l-mark",
+ "440": "PART|l-nmod",
+ "441": "PART|l-nsubj",
+ "442": "PART|l-nsubj:outer",
+ "443": "PART|l-nsubj:pass",
+ "444": "PART|l-obj",
+ "445": "PART|l-obl",
+ "446": "PART|l-obl:lmod",
+ "447": "PART|r-advmod",
+ "448": "PART|r-case",
+ "449": "PART|r-ccomp",
+ "450": "PART|r-clf",
+ "451": "PART|r-conj",
+ "452": "PART|r-discourse",
+ "453": "PART|r-discourse:sp",
+ "454": "PART|r-dislocated",
+ "455": "PART|r-fixed",
+ "456": "PART|r-flat",
+ "457": "PART|r-iobj",
+ "458": "PART|r-list",
+ "459": "PART|r-mark",
+ "460": "PART|r-nsubj",
+ "461": "PART|r-obj",
+ "462": "PART|r-obl",
+ "463": "PART|r-parataxis",
+ "464": "PART|r-xcomp",
+ "465": "PART|root",
+ "466": "PRON|Person=1|PronType=Prs",
+ "467": "PRON|Person=1|PronType=Prs|l-acl",
+ "468": "PRON|Person=1|PronType=Prs|l-advcl",
+ "469": "PRON|Person=1|PronType=Prs|l-det",
+ "470": "PRON|Person=1|PronType=Prs|l-iobj",
+ "471": "PRON|Person=1|PronType=Prs|l-nsubj",
+ "472": "PRON|Person=1|PronType=Prs|l-nsubj:outer",
+ "473": "PRON|Person=1|PronType=Prs|l-obj",
+ "474": "PRON|Person=1|PronType=Prs|l-obl",
+ "475": "PRON|Person=1|PronType=Prs|l-vocative",
+ "476": "PRON|Person=1|PronType=Prs|r-ccomp",
+ "477": "PRON|Person=1|PronType=Prs|r-conj",
+ "478": "PRON|Person=1|PronType=Prs|r-iobj",
+ "479": "PRON|Person=1|PronType=Prs|r-nsubj",
+ "480": "PRON|Person=1|PronType=Prs|r-obj",
+ "481": "PRON|Person=1|PronType=Prs|r-obl",
+ "482": "PRON|Person=1|PronType=Prs|r-obl:lmod",
+ "483": "PRON|Person=1|PronType=Prs|root",
+ "484": "PRON|Person=2|PronType=Prs",
+ "485": "PRON|Person=2|PronType=Prs|l-advcl",
+ "486": "PRON|Person=2|PronType=Prs|l-amod",
+ "487": "PRON|Person=2|PronType=Prs|l-det",
+ "488": "PRON|Person=2|PronType=Prs|l-nsubj",
+ "489": "PRON|Person=2|PronType=Prs|l-nsubj:outer",
+ "490": "PRON|Person=2|PronType=Prs|l-obj",
+ "491": "PRON|Person=2|PronType=Prs|l-obl",
+ "492": "PRON|Person=2|PronType=Prs|l-vocative",
+ "493": "PRON|Person=2|PronType=Prs|r-conj",
+ "494": "PRON|Person=2|PronType=Prs|r-flat",
+ "495": "PRON|Person=2|PronType=Prs|r-iobj",
+ "496": "PRON|Person=2|PronType=Prs|r-obj",
+ "497": "PRON|Person=2|PronType=Prs|r-obl",
+ "498": "PRON|Person=2|PronType=Prs|root",
+ "499": "PRON|Person=3|PronType=Prs",
+ "500": "PRON|Person=3|PronType=Prs|l-advcl",
+ "501": "PRON|Person=3|PronType=Prs|l-amod",
+ "502": "PRON|Person=3|PronType=Prs|l-det",
+ "503": "PRON|Person=3|PronType=Prs|l-dislocated",
+ "504": "PRON|Person=3|PronType=Prs|l-expl",
+ "505": "PRON|Person=3|PronType=Prs|l-iobj",
+ "506": "PRON|Person=3|PronType=Prs|l-nsubj",
+ "507": "PRON|Person=3|PronType=Prs|l-nsubj:outer",
+ "508": "PRON|Person=3|PronType=Prs|l-nsubj:pass",
+ "509": "PRON|Person=3|PronType=Prs|l-obj",
+ "510": "PRON|Person=3|PronType=Prs|l-obl",
+ "511": "PRON|Person=3|PronType=Prs|r-ccomp",
+ "512": "PRON|Person=3|PronType=Prs|r-conj",
+ "513": "PRON|Person=3|PronType=Prs|r-expl",
+ "514": "PRON|Person=3|PronType=Prs|r-iobj",
+ "515": "PRON|Person=3|PronType=Prs|r-nsubj",
+ "516": "PRON|Person=3|PronType=Prs|r-obj",
+ "517": "PRON|Person=3|PronType=Prs|r-obl",
+ "518": "PRON|Person=3|PronType=Prs|root",
+ "519": "PRON|PronType=Dem",
+ "520": "PRON|PronType=Dem|l-acl",
+ "521": "PRON|PronType=Dem|l-advcl",
+ "522": "PRON|PronType=Dem|l-amod",
+ "523": "PRON|PronType=Dem|l-compound",
+ "524": "PRON|PronType=Dem|l-det",
+ "525": "PRON|PronType=Dem|l-dislocated",
+ "526": "PRON|PronType=Dem|l-expl",
+ "527": "PRON|PronType=Dem|l-nsubj",
+ "528": "PRON|PronType=Dem|l-nsubj:outer",
+ "529": "PRON|PronType=Dem|l-obj",
+ "530": "PRON|PronType=Dem|l-obl",
+ "531": "PRON|PronType=Dem|l-obl:lmod",
+ "532": "PRON|PronType=Dem|r-conj",
+ "533": "PRON|PronType=Dem|r-det",
+ "534": "PRON|PronType=Dem|r-expl",
+ "535": "PRON|PronType=Dem|r-flat",
+ "536": "PRON|PronType=Dem|r-iobj",
+ "537": "PRON|PronType=Dem|r-obj",
+ "538": "PRON|PronType=Dem|r-obl",
+ "539": "PRON|PronType=Dem|r-obl:lmod",
+ "540": "PRON|PronType=Dem|root",
+ "541": "PRON|PronType=Int",
+ "542": "PRON|PronType=Int|l-advcl",
+ "543": "PRON|PronType=Int|l-amod",
+ "544": "PRON|PronType=Int|l-det",
+ "545": "PRON|PronType=Int|l-dislocated",
+ "546": "PRON|PronType=Int|l-nsubj",
+ "547": "PRON|PronType=Int|l-nsubj:outer",
+ "548": "PRON|PronType=Int|l-obj",
+ "549": "PRON|PronType=Int|l-obl",
+ "550": "PRON|PronType=Int|l-vocative",
+ "551": "PRON|PronType=Int|r-ccomp",
+ "552": "PRON|PronType=Int|r-conj",
+ "553": "PRON|PronType=Int|r-flat",
+ "554": "PRON|PronType=Int|r-obj",
+ "555": "PRON|PronType=Int|r-parataxis",
+ "556": "PRON|PronType=Int|r-xcomp",
+ "557": "PRON|PronType=Int|root",
+ "558": "PRON|PronType=Prs",
+ "559": "PRON|PronType=Prs|Reflex=Yes",
+ "560": "PRON|PronType=Prs|Reflex=Yes|l-acl",
+ "561": "PRON|PronType=Prs|Reflex=Yes|l-det",
+ "562": "PRON|PronType=Prs|Reflex=Yes|l-nsubj",
+ "563": "PRON|PronType=Prs|Reflex=Yes|l-obj",
+ "564": "PRON|PronType=Prs|Reflex=Yes|l-obl",
+ "565": "PRON|PronType=Prs|Reflex=Yes|r-dislocated",
+ "566": "PRON|PronType=Prs|Reflex=Yes|r-obj",
+ "567": "PRON|PronType=Prs|Reflex=Yes|r-obl",
+ "568": "PRON|PronType=Prs|Reflex=Yes|root",
+ "569": "PRON|PronType=Prs|l-det",
+ "570": "PRON|PronType=Prs|l-nsubj",
+ "571": "PRON|PronType=Prs|l-nsubj:outer",
+ "572": "PRON|PronType=Prs|l-obj",
+ "573": "PRON|PronType=Prs|r-conj",
+ "574": "PRON|PronType=Prs|r-iobj",
+ "575": "PRON|PronType=Prs|r-obj",
+ "576": "PROPN",
+ "577": "PROPN|Case=Loc|NameType=Geo",
+ "578": "PROPN|Case=Loc|NameType=Geo|l-acl",
+ "579": "PROPN|Case=Loc|NameType=Geo|l-advcl",
+ "580": "PROPN|Case=Loc|NameType=Geo|l-amod",
+ "581": "PROPN|Case=Loc|NameType=Geo|l-compound",
+ "582": "PROPN|Case=Loc|NameType=Geo|l-csubj",
+ "583": "PROPN|Case=Loc|NameType=Geo|l-dislocated",
+ "584": "PROPN|Case=Loc|NameType=Geo|l-nmod",
+ "585": "PROPN|Case=Loc|NameType=Geo|l-nsubj",
+ "586": "PROPN|Case=Loc|NameType=Geo|l-nsubj:outer",
+ "587": "PROPN|Case=Loc|NameType=Geo|l-obl",
+ "588": "PROPN|Case=Loc|NameType=Geo|l-obl:lmod",
+ "589": "PROPN|Case=Loc|NameType=Geo|r-conj",
+ "590": "PROPN|Case=Loc|NameType=Geo|r-flat",
+ "591": "PROPN|Case=Loc|NameType=Geo|r-iobj",
+ "592": "PROPN|Case=Loc|NameType=Geo|r-obj",
+ "593": "PROPN|Case=Loc|NameType=Geo|r-obl",
+ "594": "PROPN|Case=Loc|NameType=Geo|r-obl:lmod",
+ "595": "PROPN|Case=Loc|NameType=Geo|r-parataxis",
+ "596": "PROPN|Case=Loc|NameType=Geo|r-xcomp",
+ "597": "PROPN|Case=Loc|NameType=Geo|root",
+ "598": "PROPN|Case=Loc|NameType=Nat",
+ "599": "PROPN|Case=Loc|NameType=Nat|l-acl",
+ "600": "PROPN|Case=Loc|NameType=Nat|l-advcl",
+ "601": "PROPN|Case=Loc|NameType=Nat|l-amod",
+ "602": "PROPN|Case=Loc|NameType=Nat|l-clf",
+ "603": "PROPN|Case=Loc|NameType=Nat|l-compound",
+ "604": "PROPN|Case=Loc|NameType=Nat|l-nmod",
+ "605": "PROPN|Case=Loc|NameType=Nat|l-nsubj",
+ "606": "PROPN|Case=Loc|NameType=Nat|l-nsubj:outer",
+ "607": "PROPN|Case=Loc|NameType=Nat|l-nsubj:pass",
+ "608": "PROPN|Case=Loc|NameType=Nat|l-obj",
+ "609": "PROPN|Case=Loc|NameType=Nat|l-obl",
+ "610": "PROPN|Case=Loc|NameType=Nat|l-obl:lmod",
+ "611": "PROPN|Case=Loc|NameType=Nat|r-ccomp",
+ "612": "PROPN|Case=Loc|NameType=Nat|r-conj",
+ "613": "PROPN|Case=Loc|NameType=Nat|r-flat",
+ "614": "PROPN|Case=Loc|NameType=Nat|r-iobj",
+ "615": "PROPN|Case=Loc|NameType=Nat|r-nmod",
+ "616": "PROPN|Case=Loc|NameType=Nat|r-obj",
+ "617": "PROPN|Case=Loc|NameType=Nat|r-obl",
+ "618": "PROPN|Case=Loc|NameType=Nat|r-obl:lmod",
+ "619": "PROPN|Case=Loc|NameType=Nat|r-parataxis",
+ "620": "PROPN|Case=Loc|NameType=Nat|r-xcomp",
+ "621": "PROPN|Case=Loc|NameType=Nat|root",
+ "622": "PROPN|NameType=Giv",
+ "623": "PROPN|NameType=Giv|l-acl",
+ "624": "PROPN|NameType=Giv|l-advcl",
+ "625": "PROPN|NameType=Giv|l-amod",
+ "626": "PROPN|NameType=Giv|l-compound",
+ "627": "PROPN|NameType=Giv|l-dislocated",
+ "628": "PROPN|NameType=Giv|l-nmod",
+ "629": "PROPN|NameType=Giv|l-nsubj",
+ "630": "PROPN|NameType=Giv|l-nsubj:outer",
+ "631": "PROPN|NameType=Giv|l-nsubj:pass",
+ "632": "PROPN|NameType=Giv|l-obj",
+ "633": "PROPN|NameType=Giv|l-obl",
+ "634": "PROPN|NameType=Giv|l-obl:lmod",
+ "635": "PROPN|NameType=Giv|l-parataxis",
+ "636": "PROPN|NameType=Giv|l-vocative",
+ "637": "PROPN|NameType=Giv|r-ccomp",
+ "638": "PROPN|NameType=Giv|r-conj",
+ "639": "PROPN|NameType=Giv|r-dislocated",
+ "640": "PROPN|NameType=Giv|r-flat",
+ "641": "PROPN|NameType=Giv|r-iobj",
+ "642": "PROPN|NameType=Giv|r-list",
+ "643": "PROPN|NameType=Giv|r-nmod",
+ "644": "PROPN|NameType=Giv|r-obj",
+ "645": "PROPN|NameType=Giv|r-obl",
+ "646": "PROPN|NameType=Giv|r-obl:lmod",
+ "647": "PROPN|NameType=Giv|r-parataxis",
+ "648": "PROPN|NameType=Giv|r-xcomp",
+ "649": "PROPN|NameType=Giv|root",
+ "650": "PROPN|NameType=Prs",
+ "651": "PROPN|NameType=Prs|l-acl",
+ "652": "PROPN|NameType=Prs|l-advcl",
+ "653": "PROPN|NameType=Prs|l-amod",
+ "654": "PROPN|NameType=Prs|l-compound",
+ "655": "PROPN|NameType=Prs|l-dislocated",
+ "656": "PROPN|NameType=Prs|l-nmod",
+ "657": "PROPN|NameType=Prs|l-nsubj",
+ "658": "PROPN|NameType=Prs|l-nsubj:outer",
+ "659": "PROPN|NameType=Prs|l-obj",
+ "660": "PROPN|NameType=Prs|l-obl",
+ "661": "PROPN|NameType=Prs|r-conj",
+ "662": "PROPN|NameType=Prs|r-dislocated",
+ "663": "PROPN|NameType=Prs|r-flat",
+ "664": "PROPN|NameType=Prs|r-iobj",
+ "665": "PROPN|NameType=Prs|r-obj",
+ "666": "PROPN|NameType=Prs|r-obl",
+ "667": "PROPN|NameType=Prs|r-parataxis",
+ "668": "PROPN|NameType=Prs|root",
+ "669": "PROPN|NameType=Sur",
+ "670": "PROPN|NameType=Sur|l-acl",
+ "671": "PROPN|NameType=Sur|l-advcl",
+ "672": "PROPN|NameType=Sur|l-amod",
+ "673": "PROPN|NameType=Sur|l-compound",
+ "674": "PROPN|NameType=Sur|l-csubj",
+ "675": "PROPN|NameType=Sur|l-dislocated",
+ "676": "PROPN|NameType=Sur|l-nmod",
+ "677": "PROPN|NameType=Sur|l-nsubj",
+ "678": "PROPN|NameType=Sur|l-nsubj:outer",
+ "679": "PROPN|NameType=Sur|l-nsubj:pass",
+ "680": "PROPN|NameType=Sur|l-obl",
+ "681": "PROPN|NameType=Sur|l-obl:lmod",
+ "682": "PROPN|NameType=Sur|l-vocative",
+ "683": "PROPN|NameType=Sur|r-ccomp",
+ "684": "PROPN|NameType=Sur|r-conj",
+ "685": "PROPN|NameType=Sur|r-dislocated",
+ "686": "PROPN|NameType=Sur|r-flat",
+ "687": "PROPN|NameType=Sur|r-iobj",
+ "688": "PROPN|NameType=Sur|r-list",
+ "689": "PROPN|NameType=Sur|r-nmod",
+ "690": "PROPN|NameType=Sur|r-nsubj",
+ "691": "PROPN|NameType=Sur|r-obj",
+ "692": "PROPN|NameType=Sur|r-obl",
+ "693": "PROPN|NameType=Sur|r-obl:lmod",
+ "694": "PROPN|NameType=Sur|r-parataxis",
+ "695": "PROPN|NameType=Sur|r-xcomp",
+ "696": "PROPN|NameType=Sur|root",
+ "697": "PROPN|l-nmod",
+ "698": "PUNCT",
+ "699": "PUNCT|root",
+ "700": "SCONJ",
+ "701": "SCONJ|l-case",
+ "702": "SCONJ|l-cc",
+ "703": "SCONJ|l-mark",
+ "704": "SCONJ|l-nsubj",
+ "705": "SCONJ|l-obl",
+ "706": "SCONJ|r-case",
+ "707": "SCONJ|r-iobj",
+ "708": "SCONJ|r-mark",
+ "709": "SCONJ|r-nsubj",
+ "710": "SCONJ|r-nsubj:pass",
+ "711": "SCONJ|r-obj",
+ "712": "SCONJ|root",
+ "713": "SYM",
+ "714": "SYM|l-nmod",
+ "715": "SYM|l-nsubj",
+ "716": "SYM|r-conj",
+ "717": "SYM|r-nmod",
+ "718": "SYM|r-xcomp",
+ "719": "SYM|root",
+ "720": "VERB",
+ "721": "VERB|Degree=Equ",
+ "722": "VERB|Degree=Equ|VerbForm=Part",
+ "723": "VERB|Degree=Equ|VerbForm=Part|l-amod",
+ "724": "VERB|Degree=Equ|l-acl",
+ "725": "VERB|Degree=Equ|l-advcl",
+ "726": "VERB|Degree=Equ|l-ccomp",
+ "727": "VERB|Degree=Equ|l-csubj",
+ "728": "VERB|Degree=Equ|l-nsubj",
+ "729": "VERB|Degree=Equ|l-obj",
+ "730": "VERB|Degree=Equ|r-ccomp",
+ "731": "VERB|Degree=Equ|r-compound:redup",
+ "732": "VERB|Degree=Equ|r-conj",
+ "733": "VERB|Degree=Equ|r-obj",
+ "734": "VERB|Degree=Equ|r-parataxis",
+ "735": "VERB|Degree=Equ|r-xcomp",
+ "736": "VERB|Degree=Equ|root",
+ "737": "VERB|Degree=Pos",
+ "738": "VERB|Degree=Pos|VerbForm=Part",
+ "739": "VERB|Degree=Pos|VerbForm=Part|l-amod",
+ "740": "VERB|Degree=Pos|VerbForm=Part|r-amod",
+ "741": "VERB|Degree=Pos|l-acl",
+ "742": "VERB|Degree=Pos|l-advcl",
+ "743": "VERB|Degree=Pos|l-ccomp",
+ "744": "VERB|Degree=Pos|l-csubj",
+ "745": "VERB|Degree=Pos|l-csubj:outer",
+ "746": "VERB|Degree=Pos|l-dislocated",
+ "747": "VERB|Degree=Pos|l-nsubj",
+ "748": "VERB|Degree=Pos|l-nsubj:outer",
+ "749": "VERB|Degree=Pos|l-obj",
+ "750": "VERB|Degree=Pos|l-obl",
+ "751": "VERB|Degree=Pos|l-vocative",
+ "752": "VERB|Degree=Pos|r-advcl",
+ "753": "VERB|Degree=Pos|r-ccomp",
+ "754": "VERB|Degree=Pos|r-compound:redup",
+ "755": "VERB|Degree=Pos|r-conj",
+ "756": "VERB|Degree=Pos|r-dislocated",
+ "757": "VERB|Degree=Pos|r-fixed",
+ "758": "VERB|Degree=Pos|r-flat:vv",
+ "759": "VERB|Degree=Pos|r-iobj",
+ "760": "VERB|Degree=Pos|r-obj",
+ "761": "VERB|Degree=Pos|r-obl",
+ "762": "VERB|Degree=Pos|r-parataxis",
+ "763": "VERB|Degree=Pos|r-xcomp",
+ "764": "VERB|Degree=Pos|root",
+ "765": "VERB|Polarity=Neg",
+ "766": "VERB|Polarity=Neg|VerbForm=Part",
+ "767": "VERB|Polarity=Neg|VerbForm=Part|l-amod",
+ "768": "VERB|Polarity=Neg|l-acl",
+ "769": "VERB|Polarity=Neg|l-advcl",
+ "770": "VERB|Polarity=Neg|l-ccomp",
+ "771": "VERB|Polarity=Neg|l-csubj",
+ "772": "VERB|Polarity=Neg|l-csubj:outer",
+ "773": "VERB|Polarity=Neg|l-nsubj",
+ "774": "VERB|Polarity=Neg|l-obl",
+ "775": "VERB|Polarity=Neg|r-advcl",
+ "776": "VERB|Polarity=Neg|r-ccomp",
+ "777": "VERB|Polarity=Neg|r-conj",
+ "778": "VERB|Polarity=Neg|r-flat:vv",
+ "779": "VERB|Polarity=Neg|r-obj",
+ "780": "VERB|Polarity=Neg|r-obl",
+ "781": "VERB|Polarity=Neg|r-parataxis",
+ "782": "VERB|Polarity=Neg|r-xcomp",
+ "783": "VERB|Polarity=Neg|root",
+ "784": "VERB|VerbForm=Part",
+ "785": "VERB|VerbForm=Part|l-amod",
+ "786": "VERB|VerbForm=Part|r-amod",
+ "787": "VERB|l-acl",
+ "788": "VERB|l-advcl",
+ "789": "VERB|l-ccomp",
+ "790": "VERB|l-csubj",
+ "791": "VERB|l-csubj:outer",
+ "792": "VERB|l-csubj:pass",
+ "793": "VERB|l-dislocated",
+ "794": "VERB|l-nsubj",
+ "795": "VERB|l-nsubj:outer",
+ "796": "VERB|l-obj",
+ "797": "VERB|l-obl",
+ "798": "VERB|l-obl:lmod",
+ "799": "VERB|l-parataxis",
+ "800": "VERB|r-acl",
+ "801": "VERB|r-advcl",
+ "802": "VERB|r-ccomp",
+ "803": "VERB|r-compound:redup",
+ "804": "VERB|r-conj",
+ "805": "VERB|r-dislocated",
+ "806": "VERB|r-fixed",
+ "807": "VERB|r-flat:vv",
+ "808": "VERB|r-iobj",
+ "809": "VERB|r-list",
+ "810": "VERB|r-obj",
+ "811": "VERB|r-obl",
+ "812": "VERB|r-obl:lmod",
+ "813": "VERB|r-parataxis",
+ "814": "VERB|r-vocative",
+ "815": "VERB|r-xcomp",
+ "816": "VERB|root"
+ },
+ "initializer_range": 0.02,
+ "intermediate_size": 8960,
+ "label2id": {
+ "ADP": 0,
+ "ADP|Degree=Equ": 1,
+ "ADP|Degree=Equ|l-cc": 2,
+ "ADP|l-acl": 3,
+ "ADP|l-advcl": 4,
+ "ADP|l-amod": 5,
+ "ADP|l-case": 6,
+ "ADP|l-cc": 7,
+ "ADP|l-mark": 8,
+ "ADP|l-nsubj": 9,
+ "ADP|l-obl": 10,
+ "ADP|r-case": 11,
+ "ADP|r-conj": 12,
+ "ADP|r-fixed": 13,
+ "ADP|r-mark": 14,
+ "ADP|r-obj": 15,
+ "ADP|root": 16,
+ "ADV": 17,
+ "ADV|AdvType=Cau": 18,
+ "ADV|AdvType=Cau|l-advmod": 19,
+ "ADV|AdvType=Cau|l-amod": 20,
+ "ADV|AdvType=Cau|l-nsubj": 21,
+ "ADV|AdvType=Cau|l-obj": 22,
+ "ADV|AdvType=Deg|Degree=Cmp": 23,
+ "ADV|AdvType=Deg|Degree=Cmp|l-advmod": 24,
+ "ADV|AdvType=Deg|Degree=Cmp|l-amod": 25,
+ "ADV|AdvType=Deg|Degree=Cmp|r-conj": 26,
+ "ADV|AdvType=Deg|Degree=Cmp|r-obj": 27,
+ "ADV|AdvType=Deg|Degree=Pos": 28,
+ "ADV|AdvType=Deg|Degree=Pos|l-advmod": 29,
+ "ADV|AdvType=Deg|Degree=Pos|l-amod": 30,
+ "ADV|AdvType=Deg|Degree=Pos|r-ccomp": 31,
+ "ADV|AdvType=Deg|Degree=Pos|r-conj": 32,
+ "ADV|AdvType=Deg|Degree=Pos|r-flat:vv": 33,
+ "ADV|AdvType=Deg|Degree=Pos|r-parataxis": 34,
+ "ADV|AdvType=Deg|Degree=Pos|root": 35,
+ "ADV|AdvType=Deg|Degree=Sup": 36,
+ "ADV|AdvType=Deg|Degree=Sup|l-advmod": 37,
+ "ADV|AdvType=Deg|Degree=Sup|l-amod": 38,
+ "ADV|AdvType=Deg|Degree=Sup|l-nsubj": 39,
+ "ADV|AdvType=Deg|Degree=Sup|r-conj": 40,
+ "ADV|AdvType=Deg|Degree=Sup|r-parataxis": 41,
+ "ADV|AdvType=Deg|Degree=Sup|root": 42,
+ "ADV|AdvType=Tim": 43,
+ "ADV|AdvType=Tim|Aspect=Perf": 44,
+ "ADV|AdvType=Tim|Aspect=Perf|l-advmod": 45,
+ "ADV|AdvType=Tim|Aspect=Perf|l-amod": 46,
+ "ADV|AdvType=Tim|Aspect=Perf|l-obl:lmod": 47,
+ "ADV|AdvType=Tim|Aspect=Perf|r-parataxis": 48,
+ "ADV|AdvType=Tim|Aspect=Perf|root": 49,
+ "ADV|AdvType=Tim|Tense=Fut": 50,
+ "ADV|AdvType=Tim|Tense=Fut|l-advmod": 51,
+ "ADV|AdvType=Tim|Tense=Fut|l-amod": 52,
+ "ADV|AdvType=Tim|Tense=Fut|l-nsubj": 53,
+ "ADV|AdvType=Tim|Tense=Fut|l-nsubj:outer": 54,
+ "ADV|AdvType=Tim|Tense=Fut|root": 55,
+ "ADV|AdvType=Tim|Tense=Past": 56,
+ "ADV|AdvType=Tim|Tense=Past|l-advmod": 57,
+ "ADV|AdvType=Tim|Tense=Past|l-amod": 58,
+ "ADV|AdvType=Tim|Tense=Pres": 59,
+ "ADV|AdvType=Tim|Tense=Pres|l-advmod": 60,
+ "ADV|AdvType=Tim|Tense=Pres|l-amod": 61,
+ "ADV|AdvType=Tim|Tense=Pres|root": 62,
+ "ADV|AdvType=Tim|l-advcl": 63,
+ "ADV|AdvType=Tim|l-advmod": 64,
+ "ADV|AdvType=Tim|l-amod": 65,
+ "ADV|AdvType=Tim|l-nsubj": 66,
+ "ADV|AdvType=Tim|r-advmod": 67,
+ "ADV|AdvType=Tim|r-ccomp": 68,
+ "ADV|AdvType=Tim|r-compound:redup": 69,
+ "ADV|AdvType=Tim|r-conj": 70,
+ "ADV|AdvType=Tim|r-flat:vv": 71,
+ "ADV|AdvType=Tim|r-parataxis": 72,
+ "ADV|AdvType=Tim|root": 73,
+ "ADV|Degree=Equ|VerbForm=Conv": 74,
+ "ADV|Degree=Equ|VerbForm=Conv|l-advmod": 75,
+ "ADV|Degree=Pos|VerbForm=Conv": 76,
+ "ADV|Degree=Pos|VerbForm=Conv|l-advmod": 77,
+ "ADV|Degree=Pos|VerbForm=Conv|r-advmod": 78,
+ "ADV|Polarity=Neg": 79,
+ "ADV|Polarity=Neg|VerbForm=Conv": 80,
+ "ADV|Polarity=Neg|VerbForm=Conv|l-advmod": 81,
+ "ADV|Polarity=Neg|l-advmod": 82,
+ "ADV|Polarity=Neg|l-amod": 83,
+ "ADV|Polarity=Neg|l-nsubj": 84,
+ "ADV|Polarity=Neg|l-parataxis": 85,
+ "ADV|Polarity=Neg|r-advmod": 86,
+ "ADV|Polarity=Neg|r-conj": 87,
+ "ADV|Polarity=Neg|r-obj": 88,
+ "ADV|Polarity=Neg|r-parataxis": 89,
+ "ADV|Polarity=Neg|root": 90,
+ "ADV|VerbForm=Conv": 91,
+ "ADV|VerbForm=Conv|l-advmod": 92,
+ "ADV|VerbForm=Conv|r-advmod": 93,
+ "ADV|l-acl": 94,
+ "ADV|l-advcl": 95,
+ "ADV|l-advmod": 96,
+ "ADV|l-amod": 97,
+ "ADV|l-cc": 98,
+ "ADV|l-nsubj": 99,
+ "ADV|r-advmod": 100,
+ "ADV|r-ccomp": 101,
+ "ADV|r-conj": 102,
+ "ADV|r-flat:vv": 103,
+ "ADV|r-obj": 104,
+ "ADV|root": 105,
+ "AUX|Mood=Des": 106,
+ "AUX|Mood=Des|l-aux": 107,
+ "AUX|Mood=Des|l-csubj": 108,
+ "AUX|Mood=Des|l-parataxis": 109,
+ "AUX|Mood=Des|r-ccomp": 110,
+ "AUX|Mood=Des|r-conj": 111,
+ "AUX|Mood=Des|r-flat:vv": 112,
+ "AUX|Mood=Des|root": 113,
+ "AUX|Mood=Nec": 114,
+ "AUX|Mood=Nec|l-acl": 115,
+ "AUX|Mood=Nec|l-amod": 116,
+ "AUX|Mood=Nec|l-aux": 117,
+ "AUX|Mood=Nec|r-aux": 118,
+ "AUX|Mood=Nec|root": 119,
+ "AUX|Mood=Pot": 120,
+ "AUX|Mood=Pot|l-acl": 121,
+ "AUX|Mood=Pot|l-advcl": 122,
+ "AUX|Mood=Pot|l-amod": 123,
+ "AUX|Mood=Pot|l-aux": 124,
+ "AUX|Mood=Pot|l-csubj": 125,
+ "AUX|Mood=Pot|l-nsubj": 126,
+ "AUX|Mood=Pot|r-ccomp": 127,
+ "AUX|Mood=Pot|r-conj": 128,
+ "AUX|Mood=Pot|r-obj": 129,
+ "AUX|Mood=Pot|r-parataxis": 130,
+ "AUX|Mood=Pot|r-xcomp": 131,
+ "AUX|Mood=Pot|root": 132,
+ "AUX|VerbType=Cop": 133,
+ "AUX|VerbType=Cop|l-cop": 134,
+ "AUX|Voice=Pass": 135,
+ "AUX|Voice=Pass|l-aux": 136,
+ "AUX|Voice=Pass|r-conj": 137,
+ "AUX|Voice=Pass|root": 138,
+ "B-ADP": 139,
+ "B-ADP|Degree=Equ": 140,
+ "B-ADV": 141,
+ "B-ADV|AdvType=Cau": 142,
+ "B-ADV|AdvType=Deg|Degree=Cmp": 143,
+ "B-ADV|AdvType=Deg|Degree=Pos": 144,
+ "B-ADV|AdvType=Deg|Degree=Sup": 145,
+ "B-ADV|AdvType=Tim": 146,
+ "B-ADV|AdvType=Tim|Aspect=Perf": 147,
+ "B-ADV|AdvType=Tim|Tense=Fut": 148,
+ "B-ADV|AdvType=Tim|Tense=Past": 149,
+ "B-ADV|AdvType=Tim|Tense=Pres": 150,
+ "B-ADV|Degree=Equ|VerbForm=Conv": 151,
+ "B-ADV|Degree=Pos|VerbForm=Conv": 152,
+ "B-ADV|Polarity=Neg": 153,
+ "B-ADV|Polarity=Neg|VerbForm=Conv": 154,
+ "B-ADV|VerbForm=Conv": 155,
+ "B-AUX|Mood=Des": 156,
+ "B-AUX|Mood=Nec": 157,
+ "B-AUX|Mood=Pot": 158,
+ "B-AUX|VerbType=Cop": 159,
+ "B-AUX|Voice=Pass": 160,
+ "B-CCONJ": 161,
+ "B-INTJ": 162,
+ "B-NOUN": 163,
+ "B-NOUN|Case=Loc": 164,
+ "B-NOUN|Case=Tem": 165,
+ "B-NOUN|Degree=Pos": 166,
+ "B-NOUN|NounType=Clf": 167,
+ "B-NUM": 168,
+ "B-NUM|NumType=Ord": 169,
+ "B-PART": 170,
+ "B-PRON|Person=1|PronType=Prs": 171,
+ "B-PRON|Person=2|PronType=Prs": 172,
+ "B-PRON|Person=3|PronType=Prs": 173,
+ "B-PRON|PronType=Dem": 174,
+ "B-PRON|PronType=Int": 175,
+ "B-PRON|PronType=Prs": 176,
+ "B-PRON|PronType=Prs|Reflex=Yes": 177,
+ "B-PROPN": 178,
+ "B-PROPN|Case=Loc|NameType=Geo": 179,
+ "B-PROPN|Case=Loc|NameType=Nat": 180,
+ "B-PROPN|NameType=Giv": 181,
+ "B-PROPN|NameType=Prs": 182,
+ "B-PROPN|NameType=Sur": 183,
+ "B-PUNCT": 184,
+ "B-SCONJ": 185,
+ "B-SYM": 186,
+ "B-VERB": 187,
+ "B-VERB|Degree=Equ": 188,
+ "B-VERB|Degree=Equ|VerbForm=Part": 189,
+ "B-VERB|Degree=Pos": 190,
+ "B-VERB|Degree=Pos|VerbForm=Part": 191,
+ "B-VERB|Polarity=Neg": 192,
+ "B-VERB|Polarity=Neg|VerbForm=Part": 193,
+ "B-VERB|VerbForm=Part": 194,
+ "CCONJ": 195,
+ "CCONJ|l-advmod": 196,
+ "CCONJ|l-amod": 197,
+ "CCONJ|l-cc": 198,
+ "CCONJ|l-obj": 199,
+ "CCONJ|r-fixed": 200,
+ "CCONJ|r-orphan": 201,
+ "I-ADP": 202,
+ "I-ADP|Degree=Equ": 203,
+ "I-ADV": 204,
+ "I-ADV|AdvType=Cau": 205,
+ "I-ADV|AdvType=Deg|Degree=Cmp": 206,
+ "I-ADV|AdvType=Deg|Degree=Pos": 207,
+ "I-ADV|AdvType=Deg|Degree=Sup": 208,
+ "I-ADV|AdvType=Tim": 209,
+ "I-ADV|AdvType=Tim|Aspect=Perf": 210,
+ "I-ADV|AdvType=Tim|Tense=Fut": 211,
+ "I-ADV|AdvType=Tim|Tense=Past": 212,
+ "I-ADV|AdvType=Tim|Tense=Pres": 213,
+ "I-ADV|Degree=Equ|VerbForm=Conv": 214,
+ "I-ADV|Degree=Pos|VerbForm=Conv": 215,
+ "I-ADV|Polarity=Neg": 216,
+ "I-ADV|Polarity=Neg|VerbForm=Conv": 217,
+ "I-ADV|VerbForm=Conv": 218,
+ "I-AUX|Mood=Des": 219,
+ "I-AUX|Mood=Nec": 220,
+ "I-AUX|Mood=Pot": 221,
+ "I-AUX|VerbType=Cop": 222,
+ "I-AUX|Voice=Pass": 223,
+ "I-CCONJ": 224,
+ "I-INTJ": 225,
+ "I-NOUN": 226,
+ "I-NOUN|Case=Loc": 227,
+ "I-NOUN|Case=Tem": 228,
+ "I-NOUN|Degree=Pos": 229,
+ "I-NOUN|NounType=Clf": 230,
+ "I-NUM": 231,
+ "I-NUM|NumType=Ord": 232,
+ "I-PART": 233,
+ "I-PRON|Person=1|PronType=Prs": 234,
+ "I-PRON|Person=2|PronType=Prs": 235,
+ "I-PRON|Person=3|PronType=Prs": 236,
+ "I-PRON|PronType=Dem": 237,
+ "I-PRON|PronType=Int": 238,
+ "I-PRON|PronType=Prs": 239,
+ "I-PRON|PronType=Prs|Reflex=Yes": 240,
+ "I-PROPN": 241,
+ "I-PROPN|Case=Loc|NameType=Geo": 242,
+ "I-PROPN|Case=Loc|NameType=Nat": 243,
+ "I-PROPN|NameType=Giv": 244,
+ "I-PROPN|NameType=Prs": 245,
+ "I-PROPN|NameType=Sur": 246,
+ "I-PUNCT": 247,
+ "I-SCONJ": 248,
+ "I-SYM": 249,
+ "I-VERB": 250,
+ "I-VERB|Degree=Equ": 251,
+ "I-VERB|Degree=Equ|VerbForm=Part": 252,
+ "I-VERB|Degree=Pos": 253,
+ "I-VERB|Degree=Pos|VerbForm=Part": 254,
+ "I-VERB|Polarity=Neg": 255,
+ "I-VERB|Polarity=Neg|VerbForm=Part": 256,
+ "I-VERB|VerbForm=Part": 257,
+ "INTJ": 258,
+ "INTJ|l-advcl": 259,
+ "INTJ|l-csubj": 260,
+ "INTJ|l-discourse": 261,
+ "INTJ|l-discourse:sp": 262,
+ "INTJ|l-dislocated": 263,
+ "INTJ|l-nsubj": 264,
+ "INTJ|l-vocative": 265,
+ "INTJ|r-compound:redup": 266,
+ "INTJ|r-conj": 267,
+ "INTJ|r-discourse:sp": 268,
+ "INTJ|r-dislocated": 269,
+ "INTJ|r-fixed": 270,
+ "INTJ|r-obj": 271,
+ "INTJ|r-parataxis": 272,
+ "INTJ|root": 273,
+ "NOUN": 274,
+ "NOUN|Case=Loc": 275,
+ "NOUN|Case=Loc|l-acl": 276,
+ "NOUN|Case=Loc|l-advcl": 277,
+ "NOUN|Case=Loc|l-amod": 278,
+ "NOUN|Case=Loc|l-clf": 279,
+ "NOUN|Case=Loc|l-compound": 280,
+ "NOUN|Case=Loc|l-csubj": 281,
+ "NOUN|Case=Loc|l-dislocated": 282,
+ "NOUN|Case=Loc|l-nmod": 283,
+ "NOUN|Case=Loc|l-nsubj": 284,
+ "NOUN|Case=Loc|l-nsubj:outer": 285,
+ "NOUN|Case=Loc|l-obj": 286,
+ "NOUN|Case=Loc|l-obl": 287,
+ "NOUN|Case=Loc|l-obl:lmod": 288,
+ "NOUN|Case=Loc|l-obl:tmod": 289,
+ "NOUN|Case=Loc|l-parataxis": 290,
+ "NOUN|Case=Loc|r-ccomp": 291,
+ "NOUN|Case=Loc|r-clf": 292,
+ "NOUN|Case=Loc|r-compound:redup": 293,
+ "NOUN|Case=Loc|r-conj": 294,
+ "NOUN|Case=Loc|r-dislocated": 295,
+ "NOUN|Case=Loc|r-flat": 296,
+ "NOUN|Case=Loc|r-iobj": 297,
+ "NOUN|Case=Loc|r-list": 298,
+ "NOUN|Case=Loc|r-nmod": 299,
+ "NOUN|Case=Loc|r-nsubj": 300,
+ "NOUN|Case=Loc|r-obj": 301,
+ "NOUN|Case=Loc|r-obl": 302,
+ "NOUN|Case=Loc|r-obl:lmod": 303,
+ "NOUN|Case=Loc|r-parataxis": 304,
+ "NOUN|Case=Loc|r-xcomp": 305,
+ "NOUN|Case=Loc|root": 306,
+ "NOUN|Case=Tem": 307,
+ "NOUN|Case=Tem|l-acl": 308,
+ "NOUN|Case=Tem|l-advcl": 309,
+ "NOUN|Case=Tem|l-amod": 310,
+ "NOUN|Case=Tem|l-compound": 311,
+ "NOUN|Case=Tem|l-csubj": 312,
+ "NOUN|Case=Tem|l-nmod": 313,
+ "NOUN|Case=Tem|l-nsubj": 314,
+ "NOUN|Case=Tem|l-nsubj:outer": 315,
+ "NOUN|Case=Tem|l-obj": 316,
+ "NOUN|Case=Tem|l-obl:tmod": 317,
+ "NOUN|Case=Tem|r-amod": 318,
+ "NOUN|Case=Tem|r-ccomp": 319,
+ "NOUN|Case=Tem|r-clf": 320,
+ "NOUN|Case=Tem|r-compound:redup": 321,
+ "NOUN|Case=Tem|r-conj": 322,
+ "NOUN|Case=Tem|r-flat": 323,
+ "NOUN|Case=Tem|r-iobj": 324,
+ "NOUN|Case=Tem|r-list": 325,
+ "NOUN|Case=Tem|r-nsubj": 326,
+ "NOUN|Case=Tem|r-obj": 327,
+ "NOUN|Case=Tem|r-obl:tmod": 328,
+ "NOUN|Case=Tem|r-parataxis": 329,
+ "NOUN|Case=Tem|r-xcomp": 330,
+ "NOUN|Case=Tem|root": 331,
+ "NOUN|Degree=Pos": 332,
+ "NOUN|Degree=Pos|root": 333,
+ "NOUN|NounType=Clf": 334,
+ "NOUN|NounType=Clf|l-clf": 335,
+ "NOUN|NounType=Clf|l-nmod": 336,
+ "NOUN|NounType=Clf|l-nsubj": 337,
+ "NOUN|NounType=Clf|l-obl": 338,
+ "NOUN|NounType=Clf|r-ccomp": 339,
+ "NOUN|NounType=Clf|r-clf": 340,
+ "NOUN|NounType=Clf|r-compound:redup": 341,
+ "NOUN|NounType=Clf|r-conj": 342,
+ "NOUN|NounType=Clf|r-flat": 343,
+ "NOUN|NounType=Clf|r-obj": 344,
+ "NOUN|NounType=Clf|r-parataxis": 345,
+ "NOUN|NounType=Clf|root": 346,
+ "NOUN|l-acl": 347,
+ "NOUN|l-advcl": 348,
+ "NOUN|l-amod": 349,
+ "NOUN|l-ccomp": 350,
+ "NOUN|l-clf": 351,
+ "NOUN|l-compound": 352,
+ "NOUN|l-csubj": 353,
+ "NOUN|l-csubj:outer": 354,
+ "NOUN|l-dislocated": 355,
+ "NOUN|l-iobj": 356,
+ "NOUN|l-list": 357,
+ "NOUN|l-nmod": 358,
+ "NOUN|l-nsubj": 359,
+ "NOUN|l-nsubj:outer": 360,
+ "NOUN|l-nsubj:pass": 361,
+ "NOUN|l-obj": 362,
+ "NOUN|l-obl": 363,
+ "NOUN|l-obl:lmod": 364,
+ "NOUN|l-obl:tmod": 365,
+ "NOUN|l-vocative": 366,
+ "NOUN|r-acl": 367,
+ "NOUN|r-advcl": 368,
+ "NOUN|r-amod": 369,
+ "NOUN|r-ccomp": 370,
+ "NOUN|r-clf": 371,
+ "NOUN|r-compound:redup": 372,
+ "NOUN|r-conj": 373,
+ "NOUN|r-csubj": 374,
+ "NOUN|r-dislocated": 375,
+ "NOUN|r-flat": 376,
+ "NOUN|r-flat:foreign": 377,
+ "NOUN|r-iobj": 378,
+ "NOUN|r-list": 379,
+ "NOUN|r-nmod": 380,
+ "NOUN|r-nsubj": 381,
+ "NOUN|r-obj": 382,
+ "NOUN|r-obl": 383,
+ "NOUN|r-obl:lmod": 384,
+ "NOUN|r-parataxis": 385,
+ "NOUN|r-vocative": 386,
+ "NOUN|r-xcomp": 387,
+ "NOUN|root": 388,
+ "NUM": 389,
+ "NUM|NumType=Ord": 390,
+ "NUM|NumType=Ord|l-nsubj": 391,
+ "NUM|NumType=Ord|l-nummod": 392,
+ "NUM|NumType=Ord|l-obl": 393,
+ "NUM|NumType=Ord|l-obl:lmod": 394,
+ "NUM|NumType=Ord|l-obl:tmod": 395,
+ "NUM|NumType=Ord|r-conj": 396,
+ "NUM|NumType=Ord|r-flat": 397,
+ "NUM|NumType=Ord|r-obj": 398,
+ "NUM|NumType=Ord|root": 399,
+ "NUM|l-acl": 400,
+ "NUM|l-advcl": 401,
+ "NUM|l-compound": 402,
+ "NUM|l-csubj": 403,
+ "NUM|l-dislocated": 404,
+ "NUM|l-nsubj": 405,
+ "NUM|l-nsubj:outer": 406,
+ "NUM|l-nummod": 407,
+ "NUM|l-obj": 408,
+ "NUM|l-obl": 409,
+ "NUM|l-obl:lmod": 410,
+ "NUM|l-obl:tmod": 411,
+ "NUM|r-ccomp": 412,
+ "NUM|r-clf": 413,
+ "NUM|r-compound": 414,
+ "NUM|r-compound:redup": 415,
+ "NUM|r-conj": 416,
+ "NUM|r-flat": 417,
+ "NUM|r-iobj": 418,
+ "NUM|r-list": 419,
+ "NUM|r-nummod": 420,
+ "NUM|r-obj": 421,
+ "NUM|r-obl": 422,
+ "NUM|r-obl:tmod": 423,
+ "NUM|r-parataxis": 424,
+ "NUM|r-xcomp": 425,
+ "NUM|root": 426,
+ "PART": 427,
+ "PART|l-acl": 428,
+ "PART|l-advcl": 429,
+ "PART|l-advmod": 430,
+ "PART|l-amod": 431,
+ "PART|l-case": 432,
+ "PART|l-cc": 433,
+ "PART|l-csubj": 434,
+ "PART|l-csubj:outer": 435,
+ "PART|l-discourse": 436,
+ "PART|l-discourse:sp": 437,
+ "PART|l-dislocated": 438,
+ "PART|l-mark": 439,
+ "PART|l-nmod": 440,
+ "PART|l-nsubj": 441,
+ "PART|l-nsubj:outer": 442,
+ "PART|l-nsubj:pass": 443,
+ "PART|l-obj": 444,
+ "PART|l-obl": 445,
+ "PART|l-obl:lmod": 446,
+ "PART|r-advmod": 447,
+ "PART|r-case": 448,
+ "PART|r-ccomp": 449,
+ "PART|r-clf": 450,
+ "PART|r-conj": 451,
+ "PART|r-discourse": 452,
+ "PART|r-discourse:sp": 453,
+ "PART|r-dislocated": 454,
+ "PART|r-fixed": 455,
+ "PART|r-flat": 456,
+ "PART|r-iobj": 457,
+ "PART|r-list": 458,
+ "PART|r-mark": 459,
+ "PART|r-nsubj": 460,
+ "PART|r-obj": 461,
+ "PART|r-obl": 462,
+ "PART|r-parataxis": 463,
+ "PART|r-xcomp": 464,
+ "PART|root": 465,
+ "PRON|Person=1|PronType=Prs": 466,
+ "PRON|Person=1|PronType=Prs|l-acl": 467,
+ "PRON|Person=1|PronType=Prs|l-advcl": 468,
+ "PRON|Person=1|PronType=Prs|l-det": 469,
+ "PRON|Person=1|PronType=Prs|l-iobj": 470,
+ "PRON|Person=1|PronType=Prs|l-nsubj": 471,
+ "PRON|Person=1|PronType=Prs|l-nsubj:outer": 472,
+ "PRON|Person=1|PronType=Prs|l-obj": 473,
+ "PRON|Person=1|PronType=Prs|l-obl": 474,
+ "PRON|Person=1|PronType=Prs|l-vocative": 475,
+ "PRON|Person=1|PronType=Prs|r-ccomp": 476,
+ "PRON|Person=1|PronType=Prs|r-conj": 477,
+ "PRON|Person=1|PronType=Prs|r-iobj": 478,
+ "PRON|Person=1|PronType=Prs|r-nsubj": 479,
+ "PRON|Person=1|PronType=Prs|r-obj": 480,
+ "PRON|Person=1|PronType=Prs|r-obl": 481,
+ "PRON|Person=1|PronType=Prs|r-obl:lmod": 482,
+ "PRON|Person=1|PronType=Prs|root": 483,
+ "PRON|Person=2|PronType=Prs": 484,
+ "PRON|Person=2|PronType=Prs|l-advcl": 485,
+ "PRON|Person=2|PronType=Prs|l-amod": 486,
+ "PRON|Person=2|PronType=Prs|l-det": 487,
+ "PRON|Person=2|PronType=Prs|l-nsubj": 488,
+ "PRON|Person=2|PronType=Prs|l-nsubj:outer": 489,
+ "PRON|Person=2|PronType=Prs|l-obj": 490,
+ "PRON|Person=2|PronType=Prs|l-obl": 491,
+ "PRON|Person=2|PronType=Prs|l-vocative": 492,
+ "PRON|Person=2|PronType=Prs|r-conj": 493,
+ "PRON|Person=2|PronType=Prs|r-flat": 494,
+ "PRON|Person=2|PronType=Prs|r-iobj": 495,
+ "PRON|Person=2|PronType=Prs|r-obj": 496,
+ "PRON|Person=2|PronType=Prs|r-obl": 497,
+ "PRON|Person=2|PronType=Prs|root": 498,
+ "PRON|Person=3|PronType=Prs": 499,
+ "PRON|Person=3|PronType=Prs|l-advcl": 500,
+ "PRON|Person=3|PronType=Prs|l-amod": 501,
+ "PRON|Person=3|PronType=Prs|l-det": 502,
+ "PRON|Person=3|PronType=Prs|l-dislocated": 503,
+ "PRON|Person=3|PronType=Prs|l-expl": 504,
+ "PRON|Person=3|PronType=Prs|l-iobj": 505,
+ "PRON|Person=3|PronType=Prs|l-nsubj": 506,
+ "PRON|Person=3|PronType=Prs|l-nsubj:outer": 507,
+ "PRON|Person=3|PronType=Prs|l-nsubj:pass": 508,
+ "PRON|Person=3|PronType=Prs|l-obj": 509,
+ "PRON|Person=3|PronType=Prs|l-obl": 510,
+ "PRON|Person=3|PronType=Prs|r-ccomp": 511,
+ "PRON|Person=3|PronType=Prs|r-conj": 512,
+ "PRON|Person=3|PronType=Prs|r-expl": 513,
+ "PRON|Person=3|PronType=Prs|r-iobj": 514,
+ "PRON|Person=3|PronType=Prs|r-nsubj": 515,
+ "PRON|Person=3|PronType=Prs|r-obj": 516,
+ "PRON|Person=3|PronType=Prs|r-obl": 517,
+ "PRON|Person=3|PronType=Prs|root": 518,
+ "PRON|PronType=Dem": 519,
+ "PRON|PronType=Dem|l-acl": 520,
+ "PRON|PronType=Dem|l-advcl": 521,
+ "PRON|PronType=Dem|l-amod": 522,
+ "PRON|PronType=Dem|l-compound": 523,
+ "PRON|PronType=Dem|l-det": 524,
+ "PRON|PronType=Dem|l-dislocated": 525,
+ "PRON|PronType=Dem|l-expl": 526,
+ "PRON|PronType=Dem|l-nsubj": 527,
+ "PRON|PronType=Dem|l-nsubj:outer": 528,
+ "PRON|PronType=Dem|l-obj": 529,
+ "PRON|PronType=Dem|l-obl": 530,
+ "PRON|PronType=Dem|l-obl:lmod": 531,
+ "PRON|PronType=Dem|r-conj": 532,
+ "PRON|PronType=Dem|r-det": 533,
+ "PRON|PronType=Dem|r-expl": 534,
+ "PRON|PronType=Dem|r-flat": 535,
+ "PRON|PronType=Dem|r-iobj": 536,
+ "PRON|PronType=Dem|r-obj": 537,
+ "PRON|PronType=Dem|r-obl": 538,
+ "PRON|PronType=Dem|r-obl:lmod": 539,
+ "PRON|PronType=Dem|root": 540,
+ "PRON|PronType=Int": 541,
+ "PRON|PronType=Int|l-advcl": 542,
+ "PRON|PronType=Int|l-amod": 543,
+ "PRON|PronType=Int|l-det": 544,
+ "PRON|PronType=Int|l-dislocated": 545,
+ "PRON|PronType=Int|l-nsubj": 546,
+ "PRON|PronType=Int|l-nsubj:outer": 547,
+ "PRON|PronType=Int|l-obj": 548,
+ "PRON|PronType=Int|l-obl": 549,
+ "PRON|PronType=Int|l-vocative": 550,
+ "PRON|PronType=Int|r-ccomp": 551,
+ "PRON|PronType=Int|r-conj": 552,
+ "PRON|PronType=Int|r-flat": 553,
+ "PRON|PronType=Int|r-obj": 554,
+ "PRON|PronType=Int|r-parataxis": 555,
+ "PRON|PronType=Int|r-xcomp": 556,
+ "PRON|PronType=Int|root": 557,
+ "PRON|PronType=Prs": 558,
+ "PRON|PronType=Prs|Reflex=Yes": 559,
+ "PRON|PronType=Prs|Reflex=Yes|l-acl": 560,
+ "PRON|PronType=Prs|Reflex=Yes|l-det": 561,
+ "PRON|PronType=Prs|Reflex=Yes|l-nsubj": 562,
+ "PRON|PronType=Prs|Reflex=Yes|l-obj": 563,
+ "PRON|PronType=Prs|Reflex=Yes|l-obl": 564,
+ "PRON|PronType=Prs|Reflex=Yes|r-dislocated": 565,
+ "PRON|PronType=Prs|Reflex=Yes|r-obj": 566,
+ "PRON|PronType=Prs|Reflex=Yes|r-obl": 567,
+ "PRON|PronType=Prs|Reflex=Yes|root": 568,
+ "PRON|PronType=Prs|l-det": 569,
+ "PRON|PronType=Prs|l-nsubj": 570,
+ "PRON|PronType=Prs|l-nsubj:outer": 571,
+ "PRON|PronType=Prs|l-obj": 572,
+ "PRON|PronType=Prs|r-conj": 573,
+ "PRON|PronType=Prs|r-iobj": 574,
+ "PRON|PronType=Prs|r-obj": 575,
+ "PROPN": 576,
+ "PROPN|Case=Loc|NameType=Geo": 577,
+ "PROPN|Case=Loc|NameType=Geo|l-acl": 578,
+ "PROPN|Case=Loc|NameType=Geo|l-advcl": 579,
+ "PROPN|Case=Loc|NameType=Geo|l-amod": 580,
+ "PROPN|Case=Loc|NameType=Geo|l-compound": 581,
+ "PROPN|Case=Loc|NameType=Geo|l-csubj": 582,
+ "PROPN|Case=Loc|NameType=Geo|l-dislocated": 583,
+ "PROPN|Case=Loc|NameType=Geo|l-nmod": 584,
+ "PROPN|Case=Loc|NameType=Geo|l-nsubj": 585,
+ "PROPN|Case=Loc|NameType=Geo|l-nsubj:outer": 586,
+ "PROPN|Case=Loc|NameType=Geo|l-obl": 587,
+ "PROPN|Case=Loc|NameType=Geo|l-obl:lmod": 588,
+ "PROPN|Case=Loc|NameType=Geo|r-conj": 589,
+ "PROPN|Case=Loc|NameType=Geo|r-flat": 590,
+ "PROPN|Case=Loc|NameType=Geo|r-iobj": 591,
+ "PROPN|Case=Loc|NameType=Geo|r-obj": 592,
+ "PROPN|Case=Loc|NameType=Geo|r-obl": 593,
+ "PROPN|Case=Loc|NameType=Geo|r-obl:lmod": 594,
+ "PROPN|Case=Loc|NameType=Geo|r-parataxis": 595,
+ "PROPN|Case=Loc|NameType=Geo|r-xcomp": 596,
+ "PROPN|Case=Loc|NameType=Geo|root": 597,
+ "PROPN|Case=Loc|NameType=Nat": 598,
+ "PROPN|Case=Loc|NameType=Nat|l-acl": 599,
+ "PROPN|Case=Loc|NameType=Nat|l-advcl": 600,
+ "PROPN|Case=Loc|NameType=Nat|l-amod": 601,
+ "PROPN|Case=Loc|NameType=Nat|l-clf": 602,
+ "PROPN|Case=Loc|NameType=Nat|l-compound": 603,
+ "PROPN|Case=Loc|NameType=Nat|l-nmod": 604,
+ "PROPN|Case=Loc|NameType=Nat|l-nsubj": 605,
+ "PROPN|Case=Loc|NameType=Nat|l-nsubj:outer": 606,
+ "PROPN|Case=Loc|NameType=Nat|l-nsubj:pass": 607,
+ "PROPN|Case=Loc|NameType=Nat|l-obj": 608,
+ "PROPN|Case=Loc|NameType=Nat|l-obl": 609,
+ "PROPN|Case=Loc|NameType=Nat|l-obl:lmod": 610,
+ "PROPN|Case=Loc|NameType=Nat|r-ccomp": 611,
+ "PROPN|Case=Loc|NameType=Nat|r-conj": 612,
+ "PROPN|Case=Loc|NameType=Nat|r-flat": 613,
+ "PROPN|Case=Loc|NameType=Nat|r-iobj": 614,
+ "PROPN|Case=Loc|NameType=Nat|r-nmod": 615,
+ "PROPN|Case=Loc|NameType=Nat|r-obj": 616,
+ "PROPN|Case=Loc|NameType=Nat|r-obl": 617,
+ "PROPN|Case=Loc|NameType=Nat|r-obl:lmod": 618,
+ "PROPN|Case=Loc|NameType=Nat|r-parataxis": 619,
+ "PROPN|Case=Loc|NameType=Nat|r-xcomp": 620,
+ "PROPN|Case=Loc|NameType=Nat|root": 621,
+ "PROPN|NameType=Giv": 622,
+ "PROPN|NameType=Giv|l-acl": 623,
+ "PROPN|NameType=Giv|l-advcl": 624,
+ "PROPN|NameType=Giv|l-amod": 625,
+ "PROPN|NameType=Giv|l-compound": 626,
+ "PROPN|NameType=Giv|l-dislocated": 627,
+ "PROPN|NameType=Giv|l-nmod": 628,
+ "PROPN|NameType=Giv|l-nsubj": 629,
+ "PROPN|NameType=Giv|l-nsubj:outer": 630,
+ "PROPN|NameType=Giv|l-nsubj:pass": 631,
+ "PROPN|NameType=Giv|l-obj": 632,
+ "PROPN|NameType=Giv|l-obl": 633,
+ "PROPN|NameType=Giv|l-obl:lmod": 634,
+ "PROPN|NameType=Giv|l-parataxis": 635,
+ "PROPN|NameType=Giv|l-vocative": 636,
+ "PROPN|NameType=Giv|r-ccomp": 637,
1480
+ "PROPN|NameType=Giv|r-conj": 638,
1481
+ "PROPN|NameType=Giv|r-dislocated": 639,
1482
+ "PROPN|NameType=Giv|r-flat": 640,
1483
+ "PROPN|NameType=Giv|r-iobj": 641,
1484
+ "PROPN|NameType=Giv|r-list": 642,
1485
+ "PROPN|NameType=Giv|r-nmod": 643,
1486
+ "PROPN|NameType=Giv|r-obj": 644,
1487
+ "PROPN|NameType=Giv|r-obl": 645,
1488
+ "PROPN|NameType=Giv|r-obl:lmod": 646,
1489
+ "PROPN|NameType=Giv|r-parataxis": 647,
1490
+ "PROPN|NameType=Giv|r-xcomp": 648,
1491
+ "PROPN|NameType=Giv|root": 649,
1492
+ "PROPN|NameType=Prs": 650,
1493
+ "PROPN|NameType=Prs|l-acl": 651,
1494
+ "PROPN|NameType=Prs|l-advcl": 652,
1495
+ "PROPN|NameType=Prs|l-amod": 653,
1496
+ "PROPN|NameType=Prs|l-compound": 654,
1497
+ "PROPN|NameType=Prs|l-dislocated": 655,
1498
+ "PROPN|NameType=Prs|l-nmod": 656,
1499
+ "PROPN|NameType=Prs|l-nsubj": 657,
1500
+ "PROPN|NameType=Prs|l-nsubj:outer": 658,
1501
+ "PROPN|NameType=Prs|l-obj": 659,
1502
+ "PROPN|NameType=Prs|l-obl": 660,
1503
+ "PROPN|NameType=Prs|r-conj": 661,
1504
+ "PROPN|NameType=Prs|r-dislocated": 662,
1505
+ "PROPN|NameType=Prs|r-flat": 663,
1506
+ "PROPN|NameType=Prs|r-iobj": 664,
1507
+ "PROPN|NameType=Prs|r-obj": 665,
1508
+ "PROPN|NameType=Prs|r-obl": 666,
1509
+ "PROPN|NameType=Prs|r-parataxis": 667,
1510
+ "PROPN|NameType=Prs|root": 668,
1511
+ "PROPN|NameType=Sur": 669,
1512
+ "PROPN|NameType=Sur|l-acl": 670,
1513
+ "PROPN|NameType=Sur|l-advcl": 671,
1514
+ "PROPN|NameType=Sur|l-amod": 672,
1515
+ "PROPN|NameType=Sur|l-compound": 673,
1516
+ "PROPN|NameType=Sur|l-csubj": 674,
1517
+ "PROPN|NameType=Sur|l-dislocated": 675,
1518
+ "PROPN|NameType=Sur|l-nmod": 676,
1519
+ "PROPN|NameType=Sur|l-nsubj": 677,
1520
+ "PROPN|NameType=Sur|l-nsubj:outer": 678,
1521
+ "PROPN|NameType=Sur|l-nsubj:pass": 679,
1522
+ "PROPN|NameType=Sur|l-obl": 680,
1523
+ "PROPN|NameType=Sur|l-obl:lmod": 681,
1524
+ "PROPN|NameType=Sur|l-vocative": 682,
1525
+ "PROPN|NameType=Sur|r-ccomp": 683,
1526
+ "PROPN|NameType=Sur|r-conj": 684,
1527
+ "PROPN|NameType=Sur|r-dislocated": 685,
1528
+ "PROPN|NameType=Sur|r-flat": 686,
1529
+ "PROPN|NameType=Sur|r-iobj": 687,
1530
+ "PROPN|NameType=Sur|r-list": 688,
1531
+ "PROPN|NameType=Sur|r-nmod": 689,
1532
+ "PROPN|NameType=Sur|r-nsubj": 690,
1533
+ "PROPN|NameType=Sur|r-obj": 691,
1534
+ "PROPN|NameType=Sur|r-obl": 692,
1535
+ "PROPN|NameType=Sur|r-obl:lmod": 693,
1536
+ "PROPN|NameType=Sur|r-parataxis": 694,
1537
+ "PROPN|NameType=Sur|r-xcomp": 695,
1538
+ "PROPN|NameType=Sur|root": 696,
1539
+ "PROPN|l-nmod": 697,
1540
+ "PUNCT": 698,
1541
+ "PUNCT|root": 699,
1542
+ "SCONJ": 700,
1543
+ "SCONJ|l-case": 701,
1544
+ "SCONJ|l-cc": 702,
1545
+ "SCONJ|l-mark": 703,
1546
+ "SCONJ|l-nsubj": 704,
1547
+ "SCONJ|l-obl": 705,
1548
+ "SCONJ|r-case": 706,
1549
+ "SCONJ|r-iobj": 707,
1550
+ "SCONJ|r-mark": 708,
1551
+ "SCONJ|r-nsubj": 709,
1552
+ "SCONJ|r-nsubj:pass": 710,
1553
+ "SCONJ|r-obj": 711,
1554
+ "SCONJ|root": 712,
1555
+ "SYM": 713,
1556
+ "SYM|l-nmod": 714,
1557
+ "SYM|l-nsubj": 715,
1558
+ "SYM|r-conj": 716,
1559
+ "SYM|r-nmod": 717,
1560
+ "SYM|r-xcomp": 718,
1561
+ "SYM|root": 719,
1562
+ "VERB": 720,
1563
+ "VERB|Degree=Equ": 721,
1564
+ "VERB|Degree=Equ|VerbForm=Part": 722,
1565
+ "VERB|Degree=Equ|VerbForm=Part|l-amod": 723,
1566
+ "VERB|Degree=Equ|l-acl": 724,
1567
+ "VERB|Degree=Equ|l-advcl": 725,
1568
+ "VERB|Degree=Equ|l-ccomp": 726,
1569
+ "VERB|Degree=Equ|l-csubj": 727,
1570
+ "VERB|Degree=Equ|l-nsubj": 728,
1571
+ "VERB|Degree=Equ|l-obj": 729,
1572
+ "VERB|Degree=Equ|r-ccomp": 730,
1573
+ "VERB|Degree=Equ|r-compound:redup": 731,
1574
+ "VERB|Degree=Equ|r-conj": 732,
1575
+ "VERB|Degree=Equ|r-obj": 733,
1576
+ "VERB|Degree=Equ|r-parataxis": 734,
1577
+ "VERB|Degree=Equ|r-xcomp": 735,
1578
+ "VERB|Degree=Equ|root": 736,
1579
+ "VERB|Degree=Pos": 737,
1580
+ "VERB|Degree=Pos|VerbForm=Part": 738,
1581
+ "VERB|Degree=Pos|VerbForm=Part|l-amod": 739,
1582
+ "VERB|Degree=Pos|VerbForm=Part|r-amod": 740,
1583
+ "VERB|Degree=Pos|l-acl": 741,
1584
+ "VERB|Degree=Pos|l-advcl": 742,
1585
+ "VERB|Degree=Pos|l-ccomp": 743,
1586
+ "VERB|Degree=Pos|l-csubj": 744,
1587
+ "VERB|Degree=Pos|l-csubj:outer": 745,
1588
+ "VERB|Degree=Pos|l-dislocated": 746,
1589
+ "VERB|Degree=Pos|l-nsubj": 747,
1590
+ "VERB|Degree=Pos|l-nsubj:outer": 748,
1591
+ "VERB|Degree=Pos|l-obj": 749,
1592
+ "VERB|Degree=Pos|l-obl": 750,
1593
+ "VERB|Degree=Pos|l-vocative": 751,
1594
+ "VERB|Degree=Pos|r-advcl": 752,
1595
+ "VERB|Degree=Pos|r-ccomp": 753,
1596
+ "VERB|Degree=Pos|r-compound:redup": 754,
1597
+ "VERB|Degree=Pos|r-conj": 755,
1598
+ "VERB|Degree=Pos|r-dislocated": 756,
1599
+ "VERB|Degree=Pos|r-fixed": 757,
1600
+ "VERB|Degree=Pos|r-flat:vv": 758,
1601
+ "VERB|Degree=Pos|r-iobj": 759,
1602
+ "VERB|Degree=Pos|r-obj": 760,
1603
+ "VERB|Degree=Pos|r-obl": 761,
1604
+ "VERB|Degree=Pos|r-parataxis": 762,
1605
+ "VERB|Degree=Pos|r-xcomp": 763,
1606
+ "VERB|Degree=Pos|root": 764,
1607
+ "VERB|Polarity=Neg": 765,
1608
+ "VERB|Polarity=Neg|VerbForm=Part": 766,
1609
+ "VERB|Polarity=Neg|VerbForm=Part|l-amod": 767,
1610
+ "VERB|Polarity=Neg|l-acl": 768,
1611
+ "VERB|Polarity=Neg|l-advcl": 769,
1612
+ "VERB|Polarity=Neg|l-ccomp": 770,
1613
+ "VERB|Polarity=Neg|l-csubj": 771,
1614
+ "VERB|Polarity=Neg|l-csubj:outer": 772,
1615
+ "VERB|Polarity=Neg|l-nsubj": 773,
1616
+ "VERB|Polarity=Neg|l-obl": 774,
1617
+ "VERB|Polarity=Neg|r-advcl": 775,
1618
+ "VERB|Polarity=Neg|r-ccomp": 776,
1619
+ "VERB|Polarity=Neg|r-conj": 777,
1620
+ "VERB|Polarity=Neg|r-flat:vv": 778,
1621
+ "VERB|Polarity=Neg|r-obj": 779,
1622
+ "VERB|Polarity=Neg|r-obl": 780,
1623
+ "VERB|Polarity=Neg|r-parataxis": 781,
1624
+ "VERB|Polarity=Neg|r-xcomp": 782,
1625
+ "VERB|Polarity=Neg|root": 783,
1626
+ "VERB|VerbForm=Part": 784,
1627
+ "VERB|VerbForm=Part|l-amod": 785,
1628
+ "VERB|VerbForm=Part|r-amod": 786,
1629
+ "VERB|l-acl": 787,
1630
+ "VERB|l-advcl": 788,
1631
+ "VERB|l-ccomp": 789,
1632
+ "VERB|l-csubj": 790,
1633
+ "VERB|l-csubj:outer": 791,
1634
+ "VERB|l-csubj:pass": 792,
1635
+ "VERB|l-dislocated": 793,
1636
+ "VERB|l-nsubj": 794,
1637
+ "VERB|l-nsubj:outer": 795,
1638
+ "VERB|l-obj": 796,
1639
+ "VERB|l-obl": 797,
1640
+ "VERB|l-obl:lmod": 798,
1641
+ "VERB|l-parataxis": 799,
1642
+ "VERB|r-acl": 800,
1643
+ "VERB|r-advcl": 801,
1644
+ "VERB|r-ccomp": 802,
1645
+ "VERB|r-compound:redup": 803,
1646
+ "VERB|r-conj": 804,
1647
+ "VERB|r-dislocated": 805,
1648
+ "VERB|r-fixed": 806,
1649
+ "VERB|r-flat:vv": 807,
1650
+ "VERB|r-iobj": 808,
1651
+ "VERB|r-list": 809,
1652
+ "VERB|r-obj": 810,
1653
+ "VERB|r-obl": 811,
1654
+ "VERB|r-obl:lmod": 812,
1655
+ "VERB|r-parataxis": 813,
1656
+ "VERB|r-vocative": 814,
1657
+ "VERB|r-xcomp": 815,
1658
+ "VERB|root": 816
1659
+ },
1660
+ "max_position_embeddings": 131072,
1661
+ "max_window_layers": 28,
1662
+ "model_type": "qwen2",
1663
+ "num_attention_heads": 12,
1664
+ "num_hidden_layers": 28,
1665
+ "num_key_value_heads": 2,
1666
+ "rms_norm_eps": 1e-06,
1667
+ "rope_theta": 1000000.0,
1668
+ "sliding_window": 131072,
1669
+ "tie_word_embeddings": true,
1670
+ "tokenizer_class": "Qwen2Tokenizer",
1671
+ "torch_dtype": "float32",
1672
+ "transformers_version": "4.42.4",
1673
+ "use_cache": false,
1674
+ "use_sliding_window": false,
1675
+ "vocab_size": 151936
1676
+ }
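
The `label2id` entries above pack up to three things into one tag: a UPOS tag, optional FEATS, and an optional arc part (`root`, `l-DEPREL` for a token whose head follows it, `r-DEPREL` for one whose head precedes it). A minimal sketch — not part of the repository — of how such a label decomposes:

```
def split_label(label):
    parts=label.split("|")
    arc=None
    if parts[-1]=="root" or parts[-1][:2] in ("l-","r-"):
        arc=parts.pop()                      # root / l-DEPREL / r-DEPREL
    return parts[0],"|".join(parts[1:]),arc  # UPOS, FEATS, arc

print(split_label("VERB|Polarity=Neg|r-obj"))  # ('VERB', 'Polarity=Neg', 'r-obj')
print(split_label("PRON|PronType=Dem|root"))   # ('PRON', 'PronType=Dem', 'root')
print(split_label("PROPN|l-nmod"))             # ('PROPN', '', 'l-nmod')
```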
maker.py ADDED
@@ -0,0 +1,108 @@
1
+ #! /usr/bin/python3
2
+ src="KoichiYasuoka/Xunzi-Qwen2-1.5B-upos"
3
+ tgt="KoichiYasuoka/Xunzi-Qwen2-1.5B-ud-causal"
4
+ url="https://github.com/UniversalDependencies/UD_Classical_Chinese-Kyoto"
5
+ import os
6
+ d=os.path.basename(url)
7
+ os.system("test -d "+d+" || git clone --depth=1 "+url)
8
+ os.system("for F in train dev test ; do cp "+d+"/*-$F.conllu $F.conllu ; done")
9
+ class UDCausalDataset(object):
10
+ def __init__(self,conllu,tokenizer,embeddings=None):
11
+ self.conllu=open(conllu,"r",encoding="utf-8")
12
+ self.tokenizer=tokenizer
13
+ self.embeddings=embeddings
14
+ self.max_tokens=3
15
+ self.seeks=[(0,0)]
16
+ label=set(["SYM"])
17
+ dep=set()
18
+ s=self.conllu.readline()
19
+ while s!="":
20
+ if s=="\n":
21
+ self.seeks.append((self.conllu.tell(),0))
22
+ else:
23
+ w=s.split("\t")
24
+ if len(w)==10:
25
+ if w[0].isdecimal():
26
+ p=w[3] if w[5]=="_" else w[3]+"|"+w[5]  # UPOS alone, or UPOS|FEATS when features are present
27
+ label.add(p)
28
+ dep.add(p+("|" if w[6]=="0" else "|l-" if int(w[0])<int(w[6]) else "|r-")+w[7])  # adds "|root", "|l-DEPREL" (head follows) or "|r-DEPREL" (head precedes)
29
+ self.seeks.append((self.seeks[-1][0],int(w[0])))
30
+ self.max_tokens=max(self.max_tokens,int(w[0])*2+1)  # sentence, separator and repeated tail must all fit
31
+ s=self.conllu.readline()
32
+ lid={}
33
+ for i,l in enumerate(sorted(label)):
34
+ lid[l],lid["B-"+l],lid["I-"+l]=i*3,i*3+1,i*3+2
35
+ for i,d in enumerate(sorted(dep),len(lid)):
36
+ lid[d]=i
37
+ self.label2id=lid
38
+ def __call__(*args):  # invoked as trainDS(devDS,testDS); args then holds all three datasets
39
+ lid={l:i for i,l in enumerate(sorted(set(sum([list(t.label2id) for t in args],[]))))}
40
+ for t in args:
41
+ t.label2id=lid
42
+ return lid
43
+ def __del__(self):
44
+ self.conllu.close()
45
+ __len__=lambda self:len(self.seeks)-1
46
+ def __getitem__(self,i):
47
+ s,t=self.seeks[i]
48
+ self.conllu.seek(s)
49
+ form,upos,deps,w=[],[],[],[""]
50
+ while w[0]!="\n":
51
+ w=self.conllu.readline().split("\t")
52
+ if len(w)==10:
53
+ form.append(w[1])
54
+ if w[0].isdecimal():
55
+ upos.append(w[3] if w[5]=="_" else w[3]+"|"+w[5])
56
+ deps.append((int(w[6]),w[7]))
57
+ v=self.tokenizer(form,add_special_tokens=False)
58
+ if t==0:  # offset 0: plain subword POS-tagging example (B-/I- scheme)
59
+ i,u=[],[]
60
+ for j,(x,y) in enumerate(zip(v["input_ids"],upos)):
61
+ if x!=[]:
62
+ i+=x
63
+ u+=[y] if len(x)==1 else ["B-"+y]+["I-"+y]*(len(x)-1)
64
+ emb=self.embeddings
65
+ pad=self.tokenizer.pad_token_id
66
+ else:  # offset t>0: arc-labelling example focused on token t
67
+ import torch
68
+ m=[]
69
+ for x in v["input_ids"]:
70
+ if x==[]:
71
+ m.append(self.embeddings[self.tokenizer.unk_token_id,:])
72
+ else:
73
+ m.append(self.embeddings[x,:].sum(axis=0))
74
+ m.append(self.embeddings[self.tokenizer.sep_token_id,:])
75
+ m.append(self.embeddings[self.tokenizer.pad_token_id,:])
76
+ emb=torch.stack(m)
77
+ i,u=list(range(len(upos)+1)),upos+["SYM"]
78
+ i.append(t-1)
79
+ k,d=deps[t-1]
80
+ u.append(upos[t-1]+"|"+d if k==0 else upos[t-1])
81
+ for j in range(t,len(upos)):
82
+ i.append(j)
83
+ a,b=deps[j]
84
+ u.append(upos[j]+"|r-"+b if a==t else upos[t-1]+"|l-"+d if j+1==k else upos[j])
85
+ pad=-1
86
+ j=self.max_tokens-len(i)
87
+ if j>0:
88
+ ids=i+[pad]*j
89
+ upos=u+["SYM"]*j
90
+ else:
91
+ ids=i[0:self.max_tokens]
92
+ upos=u[0:self.max_tokens]
93
+ return {"inputs_embeds":emb[ids,:],"labels":[self.label2id[p] for p in upos]}
94
+ from transformers import AutoTokenizer,AutoConfig,Qwen2ForTokenClassification,DefaultDataCollator,TrainingArguments,Trainer
95
+ tkz=AutoTokenizer.from_pretrained(src,cls_token="<|im_start|>",sep_token="<|im_end|>",mask_token="<unk>")
96
+ trainDS=UDCausalDataset("train.conllu",tkz)
97
+ devDS=UDCausalDataset("dev.conllu",tkz)
98
+ testDS=UDCausalDataset("test.conllu",tkz)
99
+ lid=trainDS(devDS,testDS)
100
+ cfg=AutoConfig.from_pretrained(src,num_labels=len(lid),label2id=lid,id2label={i:l for l,i in lid.items()},ignore_mismatched_sizes=True)
101
+ mdl=Qwen2ForTokenClassification.from_pretrained(src,config=cfg,ignore_mismatched_sizes=True)
102
+ trainDS.embeddings=mdl.get_input_embeddings().weight
103
+ trainDS.max_tokens=min(trainDS.max_tokens,cfg.max_position_embeddings)
104
+ arg=TrainingArguments(num_train_epochs=3,per_device_train_batch_size=12,dataloader_pin_memory=False,output_dir=tgt,overwrite_output_dir=True,save_total_limit=2,learning_rate=5e-05,warmup_ratio=0.1,save_safetensors=False)
105
+ trn=Trainer(args=arg,data_collator=DefaultDataCollator(),model=mdl,train_dataset=trainDS)
106
+ trn.train()
107
+ trn.save_model(tgt)
108
+ tkz.save_pretrained(tgt)
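
For each sentence, `UDCausalDataset.__getitem__` yields one plain POS-tagging example (offset 0) plus one arc-labelling example per token `t`: the sentence, a separator labelled `SYM`, the repeated token `t` (tagged `…|root` if its head is 0), then the remaining tokens, where a token headed by `t` is tagged `…|r-DEPREL` and the position of `t`'s own head is tagged with `t`'s UPOS plus `|l-DEPREL`. A toy re-run of that labelling loop, with hand-made heads instead of a real treebank:

```
upos=["NOUN","VERB","NOUN"]             # toy sentence: 1 -nsubj-> 2 (root), 3 -obj-> 2
deps=[(2,"nsubj"),(0,"root"),(2,"obj")]
for t in range(1,len(upos)+1):
    u=upos+["SYM"]                      # sentence labels plus the separator
    k,d=deps[t-1]
    u.append(upos[t-1]+"|"+d if k==0 else upos[t-1])
    for j in range(t,len(upos)):
        a,b=deps[j]
        u.append(upos[j]+"|r-"+b if a==t else upos[t-1]+"|l-"+d if j+1==k else upos[j])
    print(t,u)
# t=1 -> [..., 'SYM', 'NOUN', 'NOUN|l-nsubj', 'NOUN']  (head of token 1 found at token 2)
# t=2 -> [..., 'SYM', 'VERB|root', 'NOUN|r-obj']       (token 2 is root; token 3 depends on it)
```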
merges.txt ADDED
The diff for this file is too large to render. See raw diff
pytorch_model-00001-of-00002.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1a9a8fe436930b4571843166b905238634592c070ab319d332dcdc35eba281bc
3
+ size 4996733492
pytorch_model-00002-of-00002.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:8cf1a3220364c5045d865cf6ed5521751cbb1fd0f0a1db1424585be7df3f5220
3
+ size 1183266918
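
The two `.bin` entries above are git-lfs pointer files rather than the weights themselves: three `key value` lines giving the spec version, the SHA-256 of the real payload, and its size in bytes. A small sketch — not from the repository — of reading one:

```
def parse_lfs_pointer(text):
    return dict(line.split(" ",1) for line in text.strip().splitlines())

p=parse_lfs_pointer("""version https://git-lfs.github.com/spec/v1
oid sha256:1a9a8fe436930b4571843166b905238634592c070ab319d332dcdc35eba281bc
size 4996733492""")
print(p["oid"],int(p["size"]))
```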
pytorch_model.bin.index.json ADDED
@@ -0,0 +1,347 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 6179880132
4
+ },
5
+ "weight_map": {
6
+ "model.embed_tokens.weight": "pytorch_model-00001-of-00002.bin",
7
+ "model.layers.0.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
8
+ "model.layers.0.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
9
+ "model.layers.0.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
10
+ "model.layers.0.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
11
+ "model.layers.0.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
12
+ "model.layers.0.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
13
+ "model.layers.0.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
14
+ "model.layers.0.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
15
+ "model.layers.0.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
16
+ "model.layers.0.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
17
+ "model.layers.0.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
18
+ "model.layers.0.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
19
+ "model.layers.1.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
20
+ "model.layers.1.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
21
+ "model.layers.1.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
22
+ "model.layers.1.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
23
+ "model.layers.1.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
24
+ "model.layers.1.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
25
+ "model.layers.1.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
26
+ "model.layers.1.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
27
+ "model.layers.1.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
28
+ "model.layers.1.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
29
+ "model.layers.1.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
30
+ "model.layers.1.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
31
+ "model.layers.10.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
32
+ "model.layers.10.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
33
+ "model.layers.10.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
34
+ "model.layers.10.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
35
+ "model.layers.10.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
36
+ "model.layers.10.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
37
+ "model.layers.10.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
38
+ "model.layers.10.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
39
+ "model.layers.10.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
40
+ "model.layers.10.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
41
+ "model.layers.10.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
42
+ "model.layers.10.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
43
+ "model.layers.11.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
44
+ "model.layers.11.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
45
+ "model.layers.11.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
46
+ "model.layers.11.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
47
+ "model.layers.11.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
48
+ "model.layers.11.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
49
+ "model.layers.11.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
50
+ "model.layers.11.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
51
+ "model.layers.11.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
52
+ "model.layers.11.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
53
+ "model.layers.11.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
54
+ "model.layers.11.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
55
+ "model.layers.12.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
56
+ "model.layers.12.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
57
+ "model.layers.12.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
58
+ "model.layers.12.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
59
+ "model.layers.12.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
60
+ "model.layers.12.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
61
+ "model.layers.12.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
62
+ "model.layers.12.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
63
+ "model.layers.12.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
64
+ "model.layers.12.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
65
+ "model.layers.12.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
66
+ "model.layers.12.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
67
+ "model.layers.13.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
68
+ "model.layers.13.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
69
+ "model.layers.13.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
70
+ "model.layers.13.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
71
+ "model.layers.13.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
72
+ "model.layers.13.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
73
+ "model.layers.13.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
74
+ "model.layers.13.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
75
+ "model.layers.13.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
76
+ "model.layers.13.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
77
+ "model.layers.13.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
78
+ "model.layers.13.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
79
+ "model.layers.14.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
80
+ "model.layers.14.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
81
+ "model.layers.14.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
82
+ "model.layers.14.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
83
+ "model.layers.14.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
84
+ "model.layers.14.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
85
+ "model.layers.14.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
86
+ "model.layers.14.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
87
+ "model.layers.14.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
88
+ "model.layers.14.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
89
+ "model.layers.14.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
90
+ "model.layers.14.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
91
+ "model.layers.15.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
92
+ "model.layers.15.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
93
+ "model.layers.15.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
94
+ "model.layers.15.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
95
+ "model.layers.15.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
96
+ "model.layers.15.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
97
+ "model.layers.15.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
98
+ "model.layers.15.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
99
+ "model.layers.15.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
100
+ "model.layers.15.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
101
+ "model.layers.15.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
102
+ "model.layers.15.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
103
+ "model.layers.16.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
104
+ "model.layers.16.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
105
+ "model.layers.16.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
106
+ "model.layers.16.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
107
+ "model.layers.16.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
108
+ "model.layers.16.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
109
+ "model.layers.16.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
110
+ "model.layers.16.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
111
+ "model.layers.16.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
112
+ "model.layers.16.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
113
+ "model.layers.16.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
114
+ "model.layers.16.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
115
+ "model.layers.17.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
116
+ "model.layers.17.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
117
+ "model.layers.17.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
118
+ "model.layers.17.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
119
+ "model.layers.17.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
120
+ "model.layers.17.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
121
+ "model.layers.17.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
122
+ "model.layers.17.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
123
+ "model.layers.17.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
124
+ "model.layers.17.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
125
+ "model.layers.17.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
126
+ "model.layers.17.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
127
+ "model.layers.18.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
128
+ "model.layers.18.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
129
+ "model.layers.18.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
130
+ "model.layers.18.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
131
+ "model.layers.18.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
132
+ "model.layers.18.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
133
+ "model.layers.18.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
134
+ "model.layers.18.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
135
+ "model.layers.18.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
136
+ "model.layers.18.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
137
+ "model.layers.18.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
138
+ "model.layers.18.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
139
+ "model.layers.19.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
140
+ "model.layers.19.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
141
+ "model.layers.19.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
142
+ "model.layers.19.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
143
+ "model.layers.19.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
144
+ "model.layers.19.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
145
+ "model.layers.19.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
146
+ "model.layers.19.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
147
+ "model.layers.19.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
148
+ "model.layers.19.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
149
+ "model.layers.19.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
150
+ "model.layers.19.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
151
+ "model.layers.2.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
152
+ "model.layers.2.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
153
+ "model.layers.2.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
154
+ "model.layers.2.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
155
+ "model.layers.2.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
156
+ "model.layers.2.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
157
+ "model.layers.2.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
158
+ "model.layers.2.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
159
+ "model.layers.2.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
160
+ "model.layers.2.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
161
+ "model.layers.2.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
162
+ "model.layers.2.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
163
+ "model.layers.20.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
164
+ "model.layers.20.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
165
+ "model.layers.20.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
166
+ "model.layers.20.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
167
+ "model.layers.20.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
168
+ "model.layers.20.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
169
+ "model.layers.20.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
170
+ "model.layers.20.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
171
+ "model.layers.20.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
172
+ "model.layers.20.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
173
+ "model.layers.20.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
174
+ "model.layers.20.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
175
+ "model.layers.21.input_layernorm.weight": "pytorch_model-00002-of-00002.bin",
176
+ "model.layers.21.mlp.down_proj.weight": "pytorch_model-00002-of-00002.bin",
177
+ "model.layers.21.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
178
+ "model.layers.21.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
179
+ "model.layers.21.post_attention_layernorm.weight": "pytorch_model-00002-of-00002.bin",
180
+ "model.layers.21.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
181
+ "model.layers.21.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
182
+ "model.layers.21.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
183
+ "model.layers.21.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
184
+ "model.layers.21.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
185
+ "model.layers.21.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
186
+ "model.layers.21.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
187
+ "model.layers.22.input_layernorm.weight": "pytorch_model-00002-of-00002.bin",
188
+ "model.layers.22.mlp.down_proj.weight": "pytorch_model-00002-of-00002.bin",
189
+ "model.layers.22.mlp.gate_proj.weight": "pytorch_model-00002-of-00002.bin",
190
+ "model.layers.22.mlp.up_proj.weight": "pytorch_model-00002-of-00002.bin",
191
+ "model.layers.22.post_attention_layernorm.weight": "pytorch_model-00002-of-00002.bin",
192
+ "model.layers.22.self_attn.k_proj.bias": "pytorch_model-00002-of-00002.bin",
193
+ "model.layers.22.self_attn.k_proj.weight": "pytorch_model-00002-of-00002.bin",
194
+ "model.layers.22.self_attn.o_proj.weight": "pytorch_model-00002-of-00002.bin",
195
+ "model.layers.22.self_attn.q_proj.bias": "pytorch_model-00002-of-00002.bin",
196
+ "model.layers.22.self_attn.q_proj.weight": "pytorch_model-00002-of-00002.bin",
197
+ "model.layers.22.self_attn.v_proj.bias": "pytorch_model-00002-of-00002.bin",
198
+ "model.layers.22.self_attn.v_proj.weight": "pytorch_model-00002-of-00002.bin",
199
+ "model.layers.23.input_layernorm.weight": "pytorch_model-00002-of-00002.bin",
200
+ "model.layers.23.mlp.down_proj.weight": "pytorch_model-00002-of-00002.bin",
201
+ "model.layers.23.mlp.gate_proj.weight": "pytorch_model-00002-of-00002.bin",
202
+ "model.layers.23.mlp.up_proj.weight": "pytorch_model-00002-of-00002.bin",
203
+ "model.layers.23.post_attention_layernorm.weight": "pytorch_model-00002-of-00002.bin",
204
+ "model.layers.23.self_attn.k_proj.bias": "pytorch_model-00002-of-00002.bin",
205
+ "model.layers.23.self_attn.k_proj.weight": "pytorch_model-00002-of-00002.bin",
206
+ "model.layers.23.self_attn.o_proj.weight": "pytorch_model-00002-of-00002.bin",
207
+ "model.layers.23.self_attn.q_proj.bias": "pytorch_model-00002-of-00002.bin",
208
+ "model.layers.23.self_attn.q_proj.weight": "pytorch_model-00002-of-00002.bin",
209
+ "model.layers.23.self_attn.v_proj.bias": "pytorch_model-00002-of-00002.bin",
210
+ "model.layers.23.self_attn.v_proj.weight": "pytorch_model-00002-of-00002.bin",
211
+ "model.layers.24.input_layernorm.weight": "pytorch_model-00002-of-00002.bin",
212
+ "model.layers.24.mlp.down_proj.weight": "pytorch_model-00002-of-00002.bin",
213
+ "model.layers.24.mlp.gate_proj.weight": "pytorch_model-00002-of-00002.bin",
214
+ "model.layers.24.mlp.up_proj.weight": "pytorch_model-00002-of-00002.bin",
215
+ "model.layers.24.post_attention_layernorm.weight": "pytorch_model-00002-of-00002.bin",
216
+ "model.layers.24.self_attn.k_proj.bias": "pytorch_model-00002-of-00002.bin",
217
+ "model.layers.24.self_attn.k_proj.weight": "pytorch_model-00002-of-00002.bin",
218
+ "model.layers.24.self_attn.o_proj.weight": "pytorch_model-00002-of-00002.bin",
219
+ "model.layers.24.self_attn.q_proj.bias": "pytorch_model-00002-of-00002.bin",
220
+ "model.layers.24.self_attn.q_proj.weight": "pytorch_model-00002-of-00002.bin",
221
+ "model.layers.24.self_attn.v_proj.bias": "pytorch_model-00002-of-00002.bin",
222
+ "model.layers.24.self_attn.v_proj.weight": "pytorch_model-00002-of-00002.bin",
223
+ "model.layers.25.input_layernorm.weight": "pytorch_model-00002-of-00002.bin",
224
+ "model.layers.25.mlp.down_proj.weight": "pytorch_model-00002-of-00002.bin",
225
+ "model.layers.25.mlp.gate_proj.weight": "pytorch_model-00002-of-00002.bin",
226
+ "model.layers.25.mlp.up_proj.weight": "pytorch_model-00002-of-00002.bin",
227
+ "model.layers.25.post_attention_layernorm.weight": "pytorch_model-00002-of-00002.bin",
228
+ "model.layers.25.self_attn.k_proj.bias": "pytorch_model-00002-of-00002.bin",
229
+ "model.layers.25.self_attn.k_proj.weight": "pytorch_model-00002-of-00002.bin",
230
+ "model.layers.25.self_attn.o_proj.weight": "pytorch_model-00002-of-00002.bin",
231
+ "model.layers.25.self_attn.q_proj.bias": "pytorch_model-00002-of-00002.bin",
232
+ "model.layers.25.self_attn.q_proj.weight": "pytorch_model-00002-of-00002.bin",
233
+ "model.layers.25.self_attn.v_proj.bias": "pytorch_model-00002-of-00002.bin",
234
+ "model.layers.25.self_attn.v_proj.weight": "pytorch_model-00002-of-00002.bin",
235
+ "model.layers.26.input_layernorm.weight": "pytorch_model-00002-of-00002.bin",
236
+ "model.layers.26.mlp.down_proj.weight": "pytorch_model-00002-of-00002.bin",
237
+ "model.layers.26.mlp.gate_proj.weight": "pytorch_model-00002-of-00002.bin",
238
+ "model.layers.26.mlp.up_proj.weight": "pytorch_model-00002-of-00002.bin",
239
+ "model.layers.26.post_attention_layernorm.weight": "pytorch_model-00002-of-00002.bin",
240
+ "model.layers.26.self_attn.k_proj.bias": "pytorch_model-00002-of-00002.bin",
241
+ "model.layers.26.self_attn.k_proj.weight": "pytorch_model-00002-of-00002.bin",
242
+ "model.layers.26.self_attn.o_proj.weight": "pytorch_model-00002-of-00002.bin",
243
+ "model.layers.26.self_attn.q_proj.bias": "pytorch_model-00002-of-00002.bin",
244
+ "model.layers.26.self_attn.q_proj.weight": "pytorch_model-00002-of-00002.bin",
245
+ "model.layers.26.self_attn.v_proj.bias": "pytorch_model-00002-of-00002.bin",
246
+ "model.layers.26.self_attn.v_proj.weight": "pytorch_model-00002-of-00002.bin",
247
+ "model.layers.27.input_layernorm.weight": "pytorch_model-00002-of-00002.bin",
248
+ "model.layers.27.mlp.down_proj.weight": "pytorch_model-00002-of-00002.bin",
249
+ "model.layers.27.mlp.gate_proj.weight": "pytorch_model-00002-of-00002.bin",
250
+ "model.layers.27.mlp.up_proj.weight": "pytorch_model-00002-of-00002.bin",
251
+ "model.layers.27.post_attention_layernorm.weight": "pytorch_model-00002-of-00002.bin",
252
+ "model.layers.27.self_attn.k_proj.bias": "pytorch_model-00002-of-00002.bin",
253
+ "model.layers.27.self_attn.k_proj.weight": "pytorch_model-00002-of-00002.bin",
254
+ "model.layers.27.self_attn.o_proj.weight": "pytorch_model-00002-of-00002.bin",
255
+ "model.layers.27.self_attn.q_proj.bias": "pytorch_model-00002-of-00002.bin",
256
+ "model.layers.27.self_attn.q_proj.weight": "pytorch_model-00002-of-00002.bin",
257
+ "model.layers.27.self_attn.v_proj.bias": "pytorch_model-00002-of-00002.bin",
258
+ "model.layers.27.self_attn.v_proj.weight": "pytorch_model-00002-of-00002.bin",
259
+ "model.layers.3.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
260
+ "model.layers.3.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
261
+ "model.layers.3.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
262
+ "model.layers.3.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
263
+ "model.layers.3.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
264
+ "model.layers.3.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
265
+ "model.layers.3.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
266
+ "model.layers.3.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
267
+ "model.layers.3.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
268
+ "model.layers.3.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
269
+ "model.layers.3.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
270
+ "model.layers.3.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
271
+ "model.layers.4.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
272
+ "model.layers.4.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
273
+ "model.layers.4.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
274
+ "model.layers.4.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
275
+ "model.layers.4.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
276
+ "model.layers.4.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
277
+ "model.layers.4.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
278
+ "model.layers.4.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
279
+ "model.layers.4.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
280
+ "model.layers.4.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
281
+ "model.layers.4.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
282
+ "model.layers.4.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
283
+ "model.layers.5.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
284
+ "model.layers.5.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
285
+ "model.layers.5.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
286
+ "model.layers.5.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
287
+ "model.layers.5.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
288
+ "model.layers.5.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
289
+ "model.layers.5.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
290
+ "model.layers.5.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
291
+ "model.layers.5.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
292
+ "model.layers.5.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
293
+ "model.layers.5.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
294
+ "model.layers.5.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
295
+ "model.layers.6.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
296
+ "model.layers.6.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
297
+ "model.layers.6.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
298
+ "model.layers.6.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
299
+ "model.layers.6.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
300
+ "model.layers.6.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
301
+ "model.layers.6.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
302
+ "model.layers.6.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
303
+ "model.layers.6.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
304
+ "model.layers.6.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
305
+ "model.layers.6.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
306
+ "model.layers.6.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
307
+ "model.layers.7.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
308
+ "model.layers.7.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
309
+ "model.layers.7.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
310
+ "model.layers.7.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
311
+ "model.layers.7.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
312
+ "model.layers.7.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
313
+ "model.layers.7.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
314
+ "model.layers.7.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
315
+ "model.layers.7.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
316
+ "model.layers.7.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
317
+ "model.layers.7.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
318
+ "model.layers.7.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
319
+ "model.layers.8.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
320
+ "model.layers.8.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
321
+ "model.layers.8.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
322
+ "model.layers.8.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
323
+ "model.layers.8.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
324
+ "model.layers.8.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
325
+ "model.layers.8.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
326
+ "model.layers.8.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
327
+ "model.layers.8.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
328
+ "model.layers.8.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
329
+ "model.layers.8.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
330
+ "model.layers.8.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
331
+ "model.layers.9.input_layernorm.weight": "pytorch_model-00001-of-00002.bin",
332
+ "model.layers.9.mlp.down_proj.weight": "pytorch_model-00001-of-00002.bin",
333
+ "model.layers.9.mlp.gate_proj.weight": "pytorch_model-00001-of-00002.bin",
334
+ "model.layers.9.mlp.up_proj.weight": "pytorch_model-00001-of-00002.bin",
335
+ "model.layers.9.post_attention_layernorm.weight": "pytorch_model-00001-of-00002.bin",
336
+ "model.layers.9.self_attn.k_proj.bias": "pytorch_model-00001-of-00002.bin",
337
+ "model.layers.9.self_attn.k_proj.weight": "pytorch_model-00001-of-00002.bin",
338
+ "model.layers.9.self_attn.o_proj.weight": "pytorch_model-00001-of-00002.bin",
339
+ "model.layers.9.self_attn.q_proj.bias": "pytorch_model-00001-of-00002.bin",
340
+ "model.layers.9.self_attn.q_proj.weight": "pytorch_model-00001-of-00002.bin",
341
+ "model.layers.9.self_attn.v_proj.bias": "pytorch_model-00001-of-00002.bin",
342
+ "model.layers.9.self_attn.v_proj.weight": "pytorch_model-00001-of-00002.bin",
343
+ "model.norm.weight": "pytorch_model-00002-of-00002.bin",
344
+ "score.bias": "pytorch_model-00002-of-00002.bin",
345
+ "score.weight": "pytorch_model-00002-of-00002.bin"
346
+ }
347
+ }
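
`weight_map` above tells `transformers` which of the two shards holds each tensor; layers 22-27, parts of layer 21, the final norm, and the `score` head live in the second shard. A sketch — assuming it is run inside a local clone of this repository — that tallies the map:

```
import json,collections
with open("pytorch_model.bin.index.json") as f:
    index=json.load(f)
shards=collections.defaultdict(list)
for name,shard in index["weight_map"].items():
    shards[shard].append(name)
for shard,names in sorted(shards.items()):
    print(shard,len(names),"tensors")
print("declared total_size:",index["metadata"]["total_size"],"bytes")
```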
special_tokens_map.json ADDED
@@ -0,0 +1,42 @@
1
+ {
2
+ "additional_special_tokens": [
3
+ "<|im_start|>",
4
+ "<|im_end|>"
5
+ ],
6
+ "cls_token": {
7
+ "content": "<|im_start|>",
8
+ "lstrip": false,
9
+ "normalized": false,
10
+ "rstrip": false,
11
+ "single_word": false
12
+ },
13
+ "eos_token": {
14
+ "content": "<|endoftext|>",
15
+ "lstrip": false,
16
+ "normalized": false,
17
+ "rstrip": false,
18
+ "single_word": false
19
+ },
20
+ "mask_token": "<unk>",
21
+ "pad_token": {
22
+ "content": "<|endoftext|>",
23
+ "lstrip": false,
24
+ "normalized": false,
25
+ "rstrip": false,
26
+ "single_word": false
27
+ },
28
+ "sep_token": {
29
+ "content": "<|im_end|>",
30
+ "lstrip": false,
31
+ "normalized": false,
32
+ "rstrip": false,
33
+ "single_word": false
34
+ },
35
+ "unk_token": {
36
+ "content": "<unk>",
37
+ "lstrip": false,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false
41
+ }
42
+ }
tokenizer_config.json ADDED
@@ -0,0 +1,54 @@
1
+ {
2
+ "add_prefix_space": false,
3
+ "added_tokens_decoder": {
4
+ "128244": {
5
+ "content": "<unk>",
6
+ "lstrip": false,
7
+ "normalized": false,
8
+ "rstrip": false,
9
+ "single_word": false,
10
+ "special": true
11
+ },
12
+ "151643": {
13
+ "content": "<|endoftext|>",
14
+ "lstrip": false,
15
+ "normalized": false,
16
+ "rstrip": false,
17
+ "single_word": false,
18
+ "special": true
19
+ },
20
+ "151644": {
21
+ "content": "<|im_start|>",
22
+ "lstrip": false,
23
+ "normalized": false,
24
+ "rstrip": false,
25
+ "single_word": false,
26
+ "special": true
27
+ },
28
+ "151645": {
29
+ "content": "<|im_end|>",
30
+ "lstrip": false,
31
+ "normalized": false,
32
+ "rstrip": false,
33
+ "single_word": false,
34
+ "special": true
35
+ }
36
+ },
37
+ "additional_special_tokens": [
38
+ "<|im_start|>",
39
+ "<|im_end|>"
40
+ ],
41
+ "bos_token": null,
42
+ "clean_up_tokenization_spaces": false,
43
+ "cls_token": "<|im_start|>",
44
+ "eos_token": "<|endoftext|>",
45
+ "errors": "replace",
46
+ "mask_token": "<unk>",
47
+ "model_max_length": 32768,
48
+ "pad_token": "<|endoftext|>",
49
+ "padding_side": "right",
50
+ "sep_token": "<|im_end|>",
51
+ "split_special_tokens": false,
52
+ "tokenizer_class": "Qwen2Tokenizer",
53
+ "unk_token": "<unk>"
54
+ }
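
Both JSON files above graft `cls_token`, `sep_token`, and `mask_token` — which a causal Qwen2 tokenizer normally lacks — onto `<|im_start|>`, `<|im_end|>`, and `<unk>`, matching the arguments passed to `AutoTokenizer.from_pretrained` in `maker.py`. A quick check:

```
from transformers import AutoTokenizer
tkz=AutoTokenizer.from_pretrained("KoichiYasuoka/Xunzi-Qwen2-1.5B-ud-causal")
print(tkz.cls_token,tkz.sep_token,tkz.mask_token,tkz.pad_token)  # <|im_start|> <|im_end|> <unk> <|endoftext|>
```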
ud.py ADDED
@@ -0,0 +1,121 @@
1
+ import numpy
2
+ from transformers import TokenClassificationPipeline
3
+
4
+ class BellmanFordTokenClassificationPipeline(TokenClassificationPipeline):
5
+ def __init__(self,**kwargs):
6
+ super().__init__(**kwargs)
7
+ x=self.model.config.label2id
8
+ y=[k for k in x if k.startswith("B-") or not (k.startswith("I-") or k.endswith("|root") or k.find("|l-")>0 or k.find("|r-")>0)]  # labels that may start the next word
9
+ self.transition=numpy.full((len(x),len(x)),numpy.nan)
10
+ for k,v in x.items():
11
+ for j in ["I-"+k[2:]] if k.startswith("B-") else [k]+y if k.startswith("I-") else y:
12
+ self.transition[v,x[j]]=0
13
+ def check_model_type(self,supported_models):
14
+ pass
15
+ def postprocess(self,model_outputs,**kwargs):
16
+ if "logits" not in model_outputs:
17
+ return self.postprocess(model_outputs[0],**kwargs)
18
+ m=model_outputs["logits"][0].numpy()
19
+ e=numpy.exp(m-numpy.max(m,axis=-1,keepdims=True))
20
+ z=e/e.sum(axis=-1,keepdims=True)
21
+ for i in range(m.shape[0]-1,0,-1):
22
+ m[i-1]+=numpy.nanmax(m[i]+self.transition,axis=1)  # backward max-sum; nan marks forbidden transitions
23
+ k=[numpy.nanargmax(m[0]+self.transition[0])]
24
+ for i in range(1,m.shape[0]):
25
+ k.append(numpy.nanargmax(m[i]+self.transition[k[-1]]))
26
+ w=[{"entity":self.model.config.id2label[j],"start":s,"end":e,"score":z[i,j]} for i,((s,e),j) in enumerate(zip(model_outputs["offset_mapping"][0].tolist(),k)) if s<e]
27
+ if "aggregation_strategy" in kwargs and kwargs["aggregation_strategy"]!="none":
28
+ for i,t in reversed(list(enumerate(w))):
29
+ p=t.pop("entity")
30
+ if p.startswith("I-"):
31
+ w[i-1]["score"]=min(w[i-1]["score"],t["score"])
32
+ w[i-1]["end"]=w.pop(i)["end"]
33
+ elif p.startswith("B-"):
34
+ t["entity_group"]=p[2:]
35
+ else:
36
+ t["entity_group"]=p
37
+ for t in w:
38
+ t["text"]=model_outputs["sentence"][t["start"]:t["end"]]
39
+ return w
40
+
41
+ class UniversalDependenciesCausalPipeline(BellmanFordTokenClassificationPipeline):
42
+ def __init__(self,**kwargs):
43
+ kwargs["aggregation_strategy"]="simple"
44
+ super().__init__(**kwargs)
45
+ x=self.model.config.label2id
46
+ self.root=numpy.full((len(x)),numpy.nan)
47
+ self.left_arc=numpy.full((len(x)),numpy.nan)
48
+ self.right_arc=numpy.full((len(x)),numpy.nan)
49
+ for k,v in x.items():
50
+ if k.endswith("|root"):
51
+ self.root[v]=0
52
+ elif k.find("|l-")>0:
53
+ self.left_arc[v]=0
54
+ elif k.find("|r-")>0:
55
+ self.right_arc[v]=0
56
+ def postprocess(self,model_outputs,**kwargs):
57
+ import torch
58
+ if "logits" not in model_outputs:
59
+ return self.postprocess(model_outputs[0],**kwargs)
60
+ m=model_outputs["logits"][0].numpy()
61
+ for i in range(m.shape[0]-1,0,-1):
62
+ m[i-1]+=numpy.nanmax(m[i]+self.transition,axis=1)
63
+ k=[numpy.nanargmax(m[0]+self.transition[0])]
64
+ for i in range(1,m.shape[0]):
65
+ k.append(numpy.nanargmax(m[i]+self.transition[k[-1]]))
66
+ w=[{"entity":self.model.config.id2label[j],"start":s,"end":e} for i,((s,e),j) in enumerate(zip(model_outputs["offset_mapping"][0].tolist(),k)) if s<e]
67
+ for i,t in reversed(list(enumerate(w))):
68
+ p=t.pop("entity")
69
+ if p.startswith("I-"):
70
+ w[i-1]["end"]=w.pop(i)["end"]
71
+ elif p.startswith("B-"):
72
+ t["entity_group"]=p[2:]
73
+ else:
74
+ t["entity_group"]=p
75
+ d=[model_outputs["sentence"][t["start"]:t["end"]] for t in w]
76
+ v=self.tokenizer(d,add_special_tokens=False)
77
+ e=self.model.get_input_embeddings().weight
78
+ m=[]
79
+ for x in v["input_ids"]:
80
+ if x==[]:
81
+ x=[self.tokenizer.unk_token_id]
82
+ m.append(e[x,:].sum(axis=0))
83
+ m.append(e[self.tokenizer.sep_token_id,:])
84
+ m.append(e[self.tokenizer.pad_token_id,:])
85
+ m=torch.stack(m)
86
+ k=list(range(len(d)+1))
87
+ with torch.no_grad():
88
+ e=self.model(inputs_embeds=torch.stack([m[k+list(range(i,len(d)))+[-1]*i,:] for i in range(len(d))])).logits[:,-len(d):,:].numpy()  # one rotated copy per token, scored in a single batch
89
+ for i in range(len(d)):
90
+ for j in range(i):
91
+ e[-j-1,-i-1],e[-i-1,-j-1]=e[-i-1,i-j]+self.left_arc,e[-i-1,i-j]+self.right_arc
92
+ e[-i-1,-i-1]=e[-i-1,0]+self.root
93
+ m,p=numpy.nanmax(e,axis=2),numpy.nanargmax(e,axis=2)
94
+ h=self.chu_liu_edmonds(m)  # maximum-spanning-tree heads
95
+ z=[i for i,j in enumerate(h) if i==j]
96
+ if len(z)>1:  # several self-headed tokens: keep the best-scoring root and re-run
97
+ k,h=z[numpy.nanargmax(m[z,z])],numpy.nanmin(m)-numpy.nanmax(m)
98
+ m[:,z]+=[[0 if j in z and (i!=j or i==k) else h for i in z] for j in range(m.shape[0])]
99
+ h=self.chu_liu_edmonds(m)
100
+ q=[self.model.config.id2label[p[j,i]].split("|") for i,j in enumerate(h)]
101
+ t=model_outputs["sentence"].replace("\n"," ")
102
+ u="# text = "+t+"\n"
103
+ for i,j in enumerate(d):
104
+ u+="\t".join([str(i+1),j,"_",q[i][0],"_","_" if len(q[i])<3 else "|".join(q[i][1:-1]),str(0 if h[i]==i else h[i]+1),"root" if q[i][-1]=="root" else q[i][-1][2:],"_","_" if i+1<len(d) and w[i]["end"]<w[i+1]["start"] else "SpaceAfter=No"])+"\n"
105
+ return u+"\n"
106
+ def chu_liu_edmonds(self,matrix):
107
+ h=numpy.nanargmax(matrix,axis=0)
108
+ x=[-1 if i==j else j for i,j in enumerate(h)]
109
+ for b in [lambda x,i,j:-1 if i not in x else x[i],lambda x,i,j:-1 if j<0 else x[j]]:
110
+ y=[]
111
+ while x!=y:
112
+ y=list(x)
113
+ for i,j in enumerate(x):
114
+ x[i]=b(x,i,j)
115
+ if max(x)<0:
116
+ return h
117
+ y,x=[i for i,j in enumerate(x) if j==max(x)],[i for i,j in enumerate(x) if j<max(x)]
118
+ z=matrix-numpy.nanmax(matrix,axis=0)
119
+ m=numpy.block([[z[x,:][:,x],numpy.nanmax(z[x,:][:,y],axis=1).reshape(len(x),1)],[numpy.nanmax(z[y,:][:,x],axis=0),numpy.nanmax(z[y,y])]])
120
+ k=[j if i==len(x) else x[j] if j<len(x) else y[numpy.nanargmax(z[y,x[i]])] for i,j in enumerate(self.chu_liu_edmonds(m))]
121
+
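
In `postprocess`, the score matrix handed to `chu_liu_edmonds` follows the convention `m[h,d]` = score of head `h` for dependent `d`, with the diagonal carrying root scores; `numpy.nanargmax(matrix,axis=0)` is the greedy first step, and the rest of the routine repairs cycles and multiple roots. A toy illustration of that convention, with hand-made scores rather than model output:

```
import numpy
m=numpy.array([[numpy.nan,2.0,0.5],
               [3.0,5.0,4.0],
               [0.1,0.2,numpy.nan]])  # token 2 scores best as head of 1 and 3, and as root
h=numpy.nanargmax(m,axis=0)           # greedy head per dependent (column)
print(h)                              # [1 1 1]: h[1]==1 marks token 2 as root
```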
vocab.json ADDED
The diff for this file is too large to render. See raw diff