---
license: apache-2.0
base_model: t5-3b
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: t5-3b_cola_dense_epochs-8_without_distillation_50
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: glue
      type: glue
      config: cola
      split: validation
      args: cola
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8561840843720039
---

# t5-3b_cola_dense_epochs-8_without_distillation_50
This model is a fine-tuned version of [t5-3b](https://huggingface.co/t5-3b) on the GLUE CoLA dataset. It achieves the following results on the evaluation set:
- Loss: 5.6314
- Accuracy: 0.8562
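
For quick experimentation, here is a minimal inference sketch. It assumes the checkpoint was saved with a sequence-classification head and is reachable under the hypothetical path below; substitute the real repo id or local directory.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical path; substitute the actual location of this checkpoint.
model_id = "t5-3b_cola_dense_epochs-8_without_distillation_50"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

# CoLA is binary acceptability judgment: 0 = unacceptable, 1 = acceptable.
inputs = tokenizer("The book was written by John.", return_tensors="pt")
with torch.no_grad():
    label = model(**inputs).logits.argmax(dim=-1).item()
print(label)
```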
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
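
The card metadata does identify the evaluation data: the CoLA configuration of GLUE, with the validation split used as the evaluation set. A minimal sketch of loading it with the `datasets` library:

```python
from datasets import load_dataset

# GLUE CoLA, as declared in the card metadata; the validation split
# is the evaluation set the accuracy above was measured on.
cola = load_dataset("glue", "cola")
print(cola)                   # train / validation / test splits
print(cola["validation"][0])  # {'sentence': ..., 'label': ..., 'idx': ...}
```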
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 1
- distributed_type: multi-GPU
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 8
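
As a reading aid, the list above maps onto `transformers.TrainingArguments` roughly as follows. This is a sketch, not the original training script; `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="t5-3b_cola_dense_epochs-8_without_distillation_50",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,  # 16 * 2 = total train batch size of 32
    seed=1,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=8,
    # The Adam betas/epsilon listed above match the transformers defaults:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```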
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 0.508 | 0.19 | 50 | 0.4814 | 0.8054 |
| 0.4158 | 0.37 | 100 | 0.3697 | 0.8399 |
| 0.4471 | 0.56 | 150 | 0.3512 | 0.8543 |
| 0.3381 | 0.75 | 200 | 0.3653 | 0.8399 |
| 0.428 | 0.93 | 250 | 0.3373 | 0.8591 |
| 0.2148 | 1.12 | 300 | 1.6354 | 0.8533 |
| 0.1962 | 1.31 | 350 | 1.9031 | 0.8610 |
| 0.2383 | 1.5 | 400 | 0.6977 | 0.8600 |
| 0.2276 | 1.68 | 450 | 0.7896 | 0.8543 |
| 0.2574 | 1.87 | 500 | 0.5960 | 0.8571 |
| 0.0955 | 2.06 | 550 | 6.3365 | 0.8543 |
| 0.1537 | 2.24 | 600 | 0.7912 | 0.8667 |
| 0.0846 | 2.43 | 650 | 0.8280 | 0.8658 |
| 0.1852 | 2.62 | 700 | 0.4582 | 0.8581 |
| 0.1836 | 2.8 | 750 | 5.0320 | 0.8485 |
| 0.7772 | 2.99 | 800 | 1.2307 | 0.8600 |
| 0.0544 | 3.18 | 850 | 6.9846 | 0.8466 |
| 0.1017 | 3.36 | 900 | 1.1242 | 0.8552 |
| 0.0783 | 3.55 | 950 | 0.6369 | 0.8667 |
| 0.0627 | 3.74 | 1000 | 3.8335 | 0.8600 |
| 0.7314 | 3.93 | 1050 | 2.0148 | 0.8706 |
| 0.024 | 4.11 | 1100 | 5.1811 | 0.8648 |
| 0.0627 | 4.3 | 1150 | 4.7943 | 0.8773 |
| 0.069 | 4.49 | 1200 | 4.1017 | 0.8639 |
| 0.0443 | 4.67 | 1250 | 2.4810 | 0.8648 |
| 0.0295 | 4.86 | 1300 | 2.5363 | 0.8485 |
| 0.0411 | 5.05 | 1350 | 3.3954 | 0.8581 |
| 1.2558 | 5.23 | 1400 | 5.3373 | 0.8495 |
| 0.064 | 5.42 | 1450 | 6.3714 | 0.8658 |
| 0.0259 | 5.61 | 1500 | 7.3145 | 0.8639 |
| 0.0413 | 5.79 | 1550 | 6.4314 | 0.8667 |
| 0.0568 | 5.98 | 1600 | 4.7175 | 0.8648 |
| 0.049 | 6.17 | 1650 | 6.4853 | 0.8523 |
| 0.0689 | 6.36 | 1700 | 3.8090 | 0.8677 |
| 0.6785 | 6.54 | 1750 | 4.8987 | 0.8600 |
| 0.6287 | 6.73 | 1800 | 3.7412 | 0.8658 |
| 0.1197 | 6.92 | 1850 | 5.6841 | 0.8629 |
| 0.0528 | 7.1 | 1900 | 4.6580 | 0.8591 |
| 0.6495 | 7.29 | 1950 | 5.2935 | 0.8619 |
| 0.0764 | 7.48 | 2000 | 4.2176 | 0.8466 |
| 0.0438 | 7.66 | 2050 | 6.9325 | 0.8533 |
| 0.0583 | 7.85 | 2100 | 4.7150 | 0.8591 |
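
Although the validation loss is noisy in later epochs, accuracy stays in a roughly 0.85–0.88 band. The final reported accuracy can be re-checked with a sketch like the following, hedged on the same hypothetical repo id as above:

```python
import evaluate
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "t5-3b_cola_dense_epochs-8_without_distillation_50"  # hypothetical path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

metric = evaluate.load("accuracy")
validation = load_dataset("glue", "cola", split="validation")

for batch in validation.iter(batch_size=32):
    enc = tokenizer(batch["sentence"], padding=True, truncation=True,
                    return_tensors="pt")
    with torch.no_grad():
        preds = model(**enc).logits.argmax(dim=-1)
    metric.add_batch(predictions=preds, references=batch["label"])

print(metric.compute())  # expected to land near {'accuracy': 0.8562}
```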
### Framework versions
- Transformers 4.34.1
- Pytorch 2.0.1+cu117
- Datasets 2.9.0
- Tokenizers 0.14.1