# hausa-4-ha-wa2vec-data-aug-xls-r-300m
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.3071
- Wer: 0.3304
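
A minimal inference sketch with the `transformers` Wav2Vec2 classes follows; the repository id, the audio file, and the 16 kHz mono input are assumptions (the base model expects 16 kHz speech), not details confirmed by this card.

```python
# Minimal inference sketch. Assumes the checkpoint is a standard Wav2Vec2ForCTC
# fine-tune and that input audio is 16 kHz mono (the wav2vec2-xls-r-300m default).
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

MODEL_ID = "hausa-4-ha-wa2vec-data-aug-xls-r-300m"  # assumed Hub repo id or local path

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# "sample.wav" is a placeholder; librosa resamples to 16 kHz on load.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values).logits

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```

Greedy argmax decoding is shown for simplicity; a language-model-backed decoder would typically lower the WER further.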
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list for how they map onto `TrainingArguments`):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 60
- num_epochs: 30
- mixed_precision_training: Native AMP
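
The sketch below is a hypothetical reconstruction of these settings as `transformers.TrainingArguments` (the output directory and the 30-step evaluation/logging cadence are assumptions inferred from the results table; the Adam betas and epsilon listed above are the library defaults):

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the training configuration above.
# adam_beta1 / adam_beta2 / adam_epsilon already default to 0.9 / 0.999 / 1e-8.
training_args = TrainingArguments(
    output_dir="hausa-4-ha-wa2vec-data-aug-xls-r-300m",  # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=60,
    seed=42,
    fp16=True,                       # "Native AMP" mixed-precision training
    evaluation_strategy="steps",     # assumption: the table evaluates every 30 steps
    eval_steps=30,
    logging_steps=30,
    save_steps=30,
)
```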
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
14.9837 | 0.46 | 30 | 10.7164 | 1.0 |
7.0027 | 0.92 | 60 | 3.9322 | 1.0 |
3.668 | 1.38 | 90 | 3.0115 | 1.0 |
2.9374 | 1.84 | 120 | 2.8464 | 1.0 |
2.8864 | 2.31 | 150 | 2.8234 | 1.0 |
2.8143 | 2.76 | 180 | 2.8158 | 1.0 |
2.8412 | 3.23 | 210 | 2.7971 | 1.0 |
2.7953 | 3.69 | 240 | 2.7910 | 1.0 |
2.835 | 4.15 | 270 | 2.7845 | 1.0 |
2.7802 | 4.61 | 300 | 2.7814 | 1.0 |
2.8292 | 5.08 | 330 | 2.7621 | 1.0 |
2.7618 | 5.53 | 360 | 2.7534 | 1.0 |
2.753 | 5.99 | 390 | 2.7468 | 1.0 |
2.7898 | 6.46 | 420 | 2.7431 | 1.0 |
2.7279 | 6.92 | 450 | 2.7243 | 1.0 |
2.7701 | 7.38 | 480 | 2.6845 | 1.0 |
2.6309 | 7.84 | 510 | 2.4668 | 1.0 |
2.3744 | 8.31 | 540 | 1.9042 | 1.0 |
1.6864 | 8.76 | 570 | 1.1582 | 0.9979 |
1.2278 | 9.23 | 600 | 0.8350 | 0.7765 |
0.987 | 9.69 | 630 | 0.7210 | 0.7456 |
0.8785 | 10.15 | 660 | 0.5951 | 0.6531 |
0.7311 | 10.61 | 690 | 0.5486 | 0.6141 |
0.7005 | 11.08 | 720 | 0.4986 | 0.5617 |
0.6442 | 11.53 | 750 | 0.4720 | 0.5658 |
0.5662 | 11.99 | 780 | 0.4476 | 0.5195 |
0.5385 | 12.46 | 810 | 0.4283 | 0.4938 |
0.5376 | 12.92 | 840 | 0.4029 | 0.4723 |
0.48 | 13.38 | 870 | 0.4047 | 0.4599 |
0.4786 | 13.84 | 900 | 0.3855 | 0.4378 |
0.4734 | 14.31 | 930 | 0.3843 | 0.4594 |
0.4572 | 14.76 | 960 | 0.3777 | 0.4188 |
0.406 | 15.23 | 990 | 0.3564 | 0.4060 |
0.4264 | 15.69 | 1020 | 0.3419 | 0.3983 |
0.3785 | 16.15 | 1050 | 0.3583 | 0.4013 |
0.3686 | 16.61 | 1080 | 0.3445 | 0.3844 |
0.3797 | 17.08 | 1110 | 0.3318 | 0.3839 |
0.3492 | 17.53 | 1140 | 0.3350 | 0.3808 |
0.3472 | 17.99 | 1170 | 0.3305 | 0.3772 |
0.3442 | 18.46 | 1200 | 0.3280 | 0.3684 |
0.3283 | 18.92 | 1230 | 0.3414 | 0.3762 |
0.3378 | 19.38 | 1260 | 0.3224 | 0.3607 |
0.3296 | 19.84 | 1290 | 0.3127 | 0.3669 |
0.3206 | 20.31 | 1320 | 0.3183 | 0.3546 |
0.3157 | 20.76 | 1350 | 0.3223 | 0.3402 |
0.3165 | 21.23 | 1380 | 0.3203 | 0.3371 |
0.3062 | 21.69 | 1410 | 0.3198 | 0.3499 |
0.2961 | 22.15 | 1440 | 0.3221 | 0.3438 |
0.2895 | 22.61 | 1470 | 0.3238 | 0.3469 |
0.2919 | 23.08 | 1500 | 0.3123 | 0.3397 |
0.2719 | 23.53 | 1530 | 0.3172 | 0.3412 |
0.2646 | 23.99 | 1560 | 0.3128 | 0.3345 |
0.2857 | 24.46 | 1590 | 0.3113 | 0.3366 |
0.2704 | 24.92 | 1620 | 0.3126 | 0.3433 |
0.2868 | 25.38 | 1650 | 0.3126 | 0.3402 |
0.2571 | 25.84 | 1680 | 0.3080 | 0.3397 |
0.2682 | 26.31 | 1710 | 0.3076 | 0.3371 |
0.2881 | 26.76 | 1740 | 0.3051 | 0.3330 |
0.2847 | 27.23 | 1770 | 0.3025 | 0.3381 |
0.2586 | 27.69 | 1800 | 0.3032 | 0.3350 |
0.2494 | 28.15 | 1830 | 0.3092 | 0.3345 |
0.2521 | 28.61 | 1860 | 0.3087 | 0.3340 |
0.2605 | 29.08 | 1890 | 0.3077 | 0.3320 |
0.2479 | 29.53 | 1920 | 0.3070 | 0.3304 |
0.2398 | 29.99 | 1950 | 0.3071 | 0.3304 |
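
The Wer column above is the standard word error rate (substitutions, deletions, and insertions divided by the number of reference words). A minimal sketch of computing it with the `datasets` metric loader that matches the versions listed below (the example strings are purely illustrative, and the metric script requires `jiwer` to be installed):

```python
# WER computation sketch with datasets 1.x (the "wer" metric script uses jiwer).
from datasets import load_metric

wer_metric = load_metric("wer")
score = wer_metric.compute(
    predictions=["ina son ruwa"],   # illustrative hypothesis
    references=["ina son ruwan"],   # illustrative reference
)
print(score)
```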
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3