Please refer to "https://huggingface.co/transformers/_modules/transformers/pipelines/token_classification.html".
* accuracy: 0.9933935699477056
* f1: 0.9592969472710453
* precision: 0.9543530277931161
* recall: 0.9642923563325274
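As a quick sanity check (an illustration only, not part of the original evaluation code), the reported f1 can be reproduced as the harmonic mean of the reported precision and recall:

```python
# F1 is the harmonic mean of precision and recall: F1 = 2*P*R / (P + R).
# The values below are the ones reported above.
precision = 0.9543530277931161
recall = 0.9642923563325274
f1 = 2 * precision * recall / (precision + recall)
print(f1)  # matches the reported f1 up to floating-point rounding
```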
# Evaluation results with the test sets proposed in [Küçük, D., Küçük, D., Arıcı, N. 2016. "Türkçe Varlık İsmi Tanıma için bir Veri Kümesi" ("A Named Entity Recognition Dataset for Turkish"). IEEE Signal Processing and Communications Applications Conference (SIU). Zonguldak, Turkey.](https://ieeexplore.ieee.org/document/7495744)

| Test Set | Accuracy | Precision | Recall | F1-Score |
|----------|----------|-----------|--------|----------|
| 20010000 | 0.9946 | 0.9871 | 0.9463 | 0.9662 |
| 20020000 | 0.9928 | 0.9134 | 0.9206 | 0.9170 |
| 20030000 | 0.9942 | 0.9814 | 0.9186 | 0.9489 |
| 20040000 | 0.9943 | 0.9660 | 0.9522 | 0.9590 |
| 20050000 | 0.9971 | 0.9539 | 0.9932 | 0.9732 |
| 20060000 | 0.9993 | 0.9942 | 0.9942 | 0.9942 |
| 20070000 | 0.9970 | 0.9806 | 0.9439 | 0.9619 |
| 20080000 | 0.9988 | 0.9821 | 0.9649 | 0.9735 |
| 20090000 | 0.9977 | 0.9891 | 0.9479 | 0.9681 |
| 20100000 | 0.9961 | 0.9684 | 0.9293 | 0.9485 |
| Overall  | 0.9961 | 0.9720 | 0.9516 | 0.9617 |
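For context, here is a minimal sketch of how entity-level precision, recall and F1 scores like those above are typically computed for NER. The function name, the span representation, and the exact-match scoring rule are assumptions for illustration; the evaluation code actually used for the table is not shown here.

```python
# Entity-level scoring sketch: an entity counts as correct only if its
# label and its exact span both match (the usual strict NER criterion).
def prf(gold, pred):
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)  # exact (label, start, end) matches
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy example: one of two predicted entities has the wrong label.
gold = {("PER", 0, 3), ("LOC", 5, 6)}
pred = {("PER", 0, 3), ("ORG", 5, 6)}
print(prf(gold, pred))  # (0.5, 0.5, 0.5)
```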