
bert_adaptation_martin_fierro

This model is a fine-tuned version of dccuchile/bert-base-spanish-wwm-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.3589
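Since the base model is a BERT masked-language model, the fine-tuned checkpoint can be exercised with the fill-mask pipeline. The snippet below is a minimal sketch, assuming the weights are published on the Hub under the repo id shown at the bottom of this card; the example sentence is only an illustration.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint for masked-token prediction.
# Assumes the repo id below (taken from this card's model tree) is where the
# weights are hosted; swap in a local path if you cloned the repository.
fill_mask = pipeline("fill-mask", model="javier-rooster/bert_adaptation_martin_fierro")

# The base model is uncased Spanish BERT, so lowercase input is safest.
for pred in fill_mask("aquí me pongo a [MASK] al compás de la vigüela"):
    print(f'{pred["token_str"]:>12}  {pred["score"]:.4f}')
```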

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
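For reference, here is a minimal sketch of how these settings map onto transformers.TrainingArguments. The dataset, tokenizer, data collator, and Trainer wiring are omitted because the card does not document the training data, and the listed Adam betas/epsilon are the Trainer defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
# betas=(0.9, 0.999) and epsilon=1e-08 match the Trainer defaults.
# evaluation_strategy="epoch" is an assumption based on the per-epoch
# validation losses reported in the results table below.
training_args = TrainingArguments(
    output_dir="bert_adaptation_martin_fierro",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",
)
```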

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 5.7247 | 1.0 | 29 | 5.0114 |
| 4.7766 | 2.0 | 58 | 4.5598 |
| 4.3216 | 3.0 | 87 | 4.3656 |
| 4.0132 | 4.0 | 116 | 4.5464 |
| 3.8366 | 5.0 | 145 | 4.0157 |
| 3.7597 | 6.0 | 174 | 3.8641 |
| 3.6244 | 7.0 | 203 | 3.7634 |
| 3.6043 | 8.0 | 232 | 3.9174 |
| 3.4767 | 9.0 | 261 | 3.7874 |
| 3.2991 | 10.0 | 290 | 3.8620 |
| 3.3156 | 11.0 | 319 | 3.6171 |
| 3.1485 | 12.0 | 348 | 3.7967 |
| 3.0732 | 13.0 | 377 | 3.3494 |
| 3.0403 | 14.0 | 406 | 3.6496 |
| 2.9722 | 15.0 | 435 | 3.9632 |
| 2.9535 | 16.0 | 464 | 3.4333 |
| 2.9057 | 17.0 | 493 | 3.9740 |
| 2.724 | 18.0 | 522 | 3.8314 |
| 2.885 | 19.0 | 551 | 3.7820 |
| 2.861 | 20.0 | 580 | 3.6939 |
| 2.8194 | 21.0 | 609 | 3.9656 |
| 2.7182 | 22.0 | 638 | 3.8121 |
| 2.6446 | 23.0 | 667 | 3.8670 |
| 2.6841 | 24.0 | 696 | 3.6597 |
| 2.793 | 25.0 | 725 | 3.8053 |
| 2.7631 | 26.0 | 754 | 4.1065 |
| 2.5897 | 27.0 | 783 | 3.7100 |
| 2.6229 | 28.0 | 812 | 3.7801 |
| 2.6084 | 29.0 | 841 | 3.7222 |
| 2.634 | 30.0 | 870 | 3.6621 |
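If the values above are mean token-level cross-entropies in nats (the usual Trainer convention for masked-language-model fine-tuning), exponentiating them gives a more interpretable perplexity. This conversion is an assumption, not something stated on the card.

```python
import math

# Assumption: the table reports mean token-level cross-entropy in nats,
# so perplexity is simply exp(loss).
def perplexity(loss: float) -> float:
    return math.exp(loss)

print(round(perplexity(3.3589), 2))  # reported evaluation loss       -> ~28.76
print(round(perplexity(3.3494), 2))  # best validation loss, epoch 13 -> ~28.49
```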

Framework versions

  • Transformers 4.33.1
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3

Model tree for javier-rooster/bert_adaptation_martin_fierro

Fine-tuned from dccuchile/bert-base-spanish-wwm-uncased