Training completed!

- README.md +14 -21
- pytorch_model.bin +1 -1
- training_args.bin +1 -1
README.md
CHANGED
@@ -3,16 +3,11 @@ license: mit
 base_model: xlm-roberta-base
 tags:
 - generated_from_trainer
-- NERz
-- crypto
 metrics:
 - f1
 model-index:
-- name: xlm-roberta-base-finetuned-
+- name: xlm-roberta-base-finetuned-NER-crypto
   results: []
-widget:
-- text: "Didn't I tell you that that was a decent entry point on $PROPHET? If you are in - congrats, Prophet is up 90% in the last 2 weeks and 50% up in the last week alone"
-
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -22,23 +17,21 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- F1: 0.
+- Loss: 0.0058
+- F1: 0.9970
 
 ## Model description
 
-
-
-## Intended uses
-Designed primarily for NER tasks in the cryptocurrency sector, this model excels in identifying and categorizing ticker symbols, cryptocurrency names, and addresses in textual content.
+More information needed
 
+## Intended uses & limitations
 
-
+More information needed
 
-Performance may be subpar when the model encounters entities outside its training data or infrequently occurring entities within the cryptocurrency domain. The model might also be susceptible to variations in entity presentation and context.
 ## Training and evaluation data
 
-
+More information needed
+
 ## Training procedure
 
 ### Training hyperparameters
@@ -56,12 +49,12 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | F1     |
 |:-------------:|:-----:|:----:|:---------------:|:------:|
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
+| 0.0269        | 1.0   | 750  | 0.0080          | 0.9957 |
+| 0.0049        | 2.0   | 1500 | 0.0074          | 0.9960 |
+| 0.0042        | 3.0   | 2250 | 0.0074          | 0.9965 |
+| 0.0034        | 4.0   | 3000 | 0.0058          | 0.9971 |
+| 0.0028        | 5.0   | 3750 | 0.0059          | 0.9971 |
+| 0.0024        | 6.0   | 4500 | 0.0058          | 0.9970 |
 
 
 ### Framework versions
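The headline metrics added to the card ("Loss: 0.0058", "F1: 0.9970") correspond to the last row of the per-epoch table. A small self-contained check of that correspondence, with the values transcribed from the diff (not re-measured):

```python
# Per-epoch evaluation rows transcribed from the README table:
# (epoch, validation_loss, f1)
rows = [
    (1.0, 0.0080, 0.9957),
    (2.0, 0.0074, 0.9960),
    (3.0, 0.0074, 0.9965),
    (4.0, 0.0058, 0.9971),
    (5.0, 0.0059, 0.9971),
    (6.0, 0.0058, 0.9970),
]

# The card reports the final-epoch checkpoint, not the best-F1 one.
final_epoch, final_loss, final_f1 = rows[-1]

# max() returns the first row with the highest F1 (epoch 4 here).
best_f1_row = max(rows, key=lambda r: r[2])

print(final_loss, final_f1, best_f1_row[0])
```

Note that the best validation F1 (0.9971) occurs at epochs 4-5, slightly above the reported final-epoch value.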
pytorch_model.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:eb0199204864bfbf2afb23b12d566328fcff0b3ca07b369d101f15bd12d91c6d
 size 1109908646
training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:7f97e9408fb2064ad880c190e6b552ec42e7b3bf8ef5ac91969fe41997c5eb5b
 size 4536
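The `.bin` entries in these diffs are Git LFS pointer files: three `key value` lines recording the blob's SHA-256 and byte size instead of the weights themselves. A minimal sketch of parsing such a pointer and verifying a downloaded blob against it (the helper names are mine, not part of any LFS tooling; the pointer text is the `training_args.bin` entry above):

```python
import hashlib

# Git LFS pointer as committed for training_args.bin (from the diff above).
POINTER = """\
version https://git-lfs.github.com/spec/v1
oid sha256:7f97e9408fb2064ad880c190e6b552ec42e7b3bf8ef5ac91969fe41997c5eb5b
size 4536
"""

def parse_pointer(text: str) -> dict:
    """Split the 'key value' lines of an LFS pointer into a dict."""
    fields = dict(line.split(" ", 1) for line in text.splitlines() if line)
    algo, _, digest = fields["oid"].partition(":")
    return {
        "version": fields["version"],
        "algo": algo,
        "digest": digest,
        "size": int(fields["size"]),
    }

def verify_blob(blob: bytes, pointer: dict) -> bool:
    """Check a downloaded blob against the pointer's size and digest."""
    return (len(blob) == pointer["size"]
            and hashlib.sha256(blob).hexdigest() == pointer["digest"])

info = parse_pointer(POINTER)
print(info["algo"], info["size"])  # sha256 4536
```

This is why the commit shows `+1 -1` for each binary: only the `oid` line of the pointer changed, while the actual 1.1 GB / 4.5 KB blobs live in LFS storage.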