---
library_name: transformers
license: cc-by-nc-4.0
base_model: facebook/mms-1b-all
tags:
  - automatic-speech-recognition
  - genbed
  - mms
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: mms-1b-all-bem-genbed-combined-adapter-test
    results: []
---

# mms-1b-all-bem-genbed-combined-adapter-test

This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on the GENBED - BEM dataset.
It achieves the following results on the evaluation set:

- Loss: 0.2293
- Wer: 0.3752
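
To try the model, a minimal inference sketch using the `transformers` ASR pipeline is shown below; the Hub repo id is inferred from this card, and `sample.wav` stands in for a 16 kHz mono recording.

```python
# A minimal inference sketch; the repo id below is inferred from this model
# card and "sample.wav" is a placeholder for a 16 kHz mono audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="csikasote/mms-1b-all-bem-genbed-combined-adapter-test",
)
print(asr("sample.wav")["text"])
```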

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 30.0
- mixed_precision_training: Native AMP
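
As a rough guide, these settings map onto `transformers.TrainingArguments` as sketched below; `output_dir` is an assumption, and the actual training script may have differed.

```python
# A sketch of the configuration above expressed as TrainingArguments.
# The Adam betas/epsilon listed above match the library defaults, so they
# are not set explicitly here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-all-bem-genbed-combined-adapter-test",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```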

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 6.4783        | 0.1374 | 100  | 0.6403          | 0.69   |
| 0.5413        | 0.2747 | 200  | 0.3260          | 0.4931 |
| 0.437         | 0.4121 | 300  | 0.3014          | 0.4678 |
| 0.4402        | 0.5495 | 400  | 0.2982          | 0.4818 |
| 0.4153        | 0.6868 | 500  | 0.2936          | 0.4702 |
| 0.4154        | 0.8242 | 600  | 0.2884          | 0.4493 |
| 0.3789        | 0.9615 | 700  | 0.2806          | 0.4521 |
| 0.3667        | 1.0989 | 800  | 0.2764          | 0.4352 |
| 0.3929        | 1.2363 | 900  | 0.2763          | 0.4654 |
| 0.3847        | 1.3736 | 1000 | 0.2705          | 0.4406 |
| 0.3833        | 1.5110 | 1100 | 0.2697          | 0.4246 |
| 0.3742        | 1.6484 | 1200 | 0.2668          | 0.4250 |
| 0.3694        | 1.7857 | 1300 | 0.2690          | 0.4189 |
| 0.3494        | 1.9231 | 1400 | 0.2635          | 0.416  |
| 0.3724        | 2.0604 | 1500 | 0.2626          | 0.4323 |
| 0.3723        | 2.1978 | 1600 | 0.2598          | 0.4247 |
| 0.3505        | 2.3352 | 1700 | 0.2583          | 0.412  |
| 0.3393        | 2.4725 | 1800 | 0.2563          | 0.4128 |
| 0.3352        | 2.6099 | 1900 | 0.2545          | 0.4156 |
| 0.3516        | 2.7473 | 2000 | 0.2551          | 0.4315 |
| 0.3489        | 2.8846 | 2100 | 0.2560          | 0.4270 |
| 0.3512        | 3.0220 | 2200 | 0.2536          | 0.4039 |
| 0.339         | 3.1593 | 2300 | 0.2490          | 0.3989 |
| 0.3374        | 3.2967 | 2400 | 0.2495          | 0.3964 |
| 0.3295        | 3.4341 | 2500 | 0.2518          | 0.4037 |
| 0.3391        | 3.5714 | 2600 | 0.2491          | 0.4077 |
| 0.3373        | 3.7088 | 2700 | 0.2445          | 0.3989 |
| 0.3097        | 3.8462 | 2800 | 0.2462          | 0.4118 |
| 0.3458        | 3.9835 | 2900 | 0.2443          | 0.4034 |
| 0.313         | 4.1209 | 3000 | 0.2433          | 0.3882 |
| 0.3171        | 4.2582 | 3100 | 0.2426          | 0.3968 |
| 0.3122        | 4.3956 | 3200 | 0.2430          | 0.3936 |
| 0.3255        | 4.5330 | 3300 | 0.2404          | 0.3822 |
| 0.3253        | 4.6703 | 3400 | 0.2356          | 0.3946 |
| 0.3341        | 4.8077 | 3500 | 0.2369          | 0.3872 |
| 0.3183        | 4.9451 | 3600 | 0.2345          | 0.3854 |
| 0.3461        | 5.0824 | 3700 | 0.2395          | 0.3828 |
| 0.3147        | 5.2198 | 3800 | 0.2359          | 0.3775 |
| 0.317         | 5.3571 | 3900 | 0.2320          | 0.3808 |
| 0.3094        | 5.4945 | 4000 | 0.2366          | 0.3797 |
| 0.2913        | 5.6319 | 4100 | 0.2357          | 0.3749 |
| 0.3195        | 5.7692 | 4200 | 0.2332          | 0.3694 |
| 0.3189        | 5.9066 | 4300 | 0.2313          | 0.3870 |
| 0.3105        | 6.0440 | 4400 | 0.2326          | 0.3806 |
| 0.2937        | 6.1813 | 4500 | 0.2346          | 0.3784 |
| 0.3088        | 6.3187 | 4600 | 0.2313          | 0.3726 |
| 0.2852        | 6.4560 | 4700 | 0.2307          | 0.3709 |
| 0.3083        | 6.5934 | 4800 | 0.2293          | 0.3752 |
| 0.3194        | 6.7308 | 4900 | 0.2292          | 0.3711 |
| 0.297         | 6.8681 | 5000 | 0.2303          | 0.3715 |
| 0.3086        | 7.0055 | 5100 | 0.2340          | 0.4027 |
| 0.3058        | 7.1429 | 5200 | 0.2294          | 0.3665 |
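
The WER column above can be reproduced with the `evaluate` library; the strings in the sketch below are illustrative placeholders, not actual model outputs or dataset transcripts.

```python
# A minimal sketch of a word error rate (WER) computation via the
# `evaluate` library (requires jiwer). Strings are hypothetical examples.
import evaluate

wer_metric = evaluate.load("wer")
score = wer_metric.compute(
    predictions=["the transcribed hypothesis"],   # hypothetical model output
    references=["the reference transcription"],   # hypothetical ground truth
)
print(f"WER: {score:.4f}")
```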

### Framework versions

- Transformers 4.46.0.dev0
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0