
wav2vec2-large-mms-1b-mos-V2

This model is a fine-tuned version of facebook/mms-1b-all on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3582
  • Wer: 0.4330
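For context, word error rate (WER) is the word-level edit distance between the predicted and reference transcripts divided by the number of reference words, so a WER of 0.4330 means roughly 43 of every 100 reference words are substituted, inserted, or deleted. A minimal self-contained sketch of the metric (the `wer` function below is our own illustration, not part of this repository):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / number of reference words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming edit-distance table over word sequences
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution
            )
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, `wer("a b c d", "a x c")` is 0.5: one substitution (`b` → `x`) and one deletion (`d`) over four reference words.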

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 5
  • mixed_precision_training: Native AMP
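The linear scheduler with warmup ramps the learning rate from 0 to the base rate over the first 100 steps, then decays it linearly toward 0 for the rest of training. A plain-Python sketch of that shape (the function name and `total_steps=3200`, taken from the last row of the results table below, are our own approximations, not exact training code):

```python
def linear_lr_with_warmup(step, base_lr=0.001, warmup_steps=100, total_steps=3200):
    """Linear warmup to base_lr over warmup_steps, then linear decay to 0.

    Mirrors the shape of a linear schedule with warmup; total_steps is an
    approximation based on the final logged step of this run.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

At step 0 the rate is 0, at step 100 it peaks at 0.001, and by step 3200 it has decayed back to 0.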

Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|---------------|--------|------|-----------------|--------|
| 36.2909       | 0.1552 | 100  | 1.0946          | 0.9531 |
| 4.3897        | 0.3103 | 200  | 0.7441          | 0.7478 |
| 3.8096        | 0.4655 | 300  | 0.5952          | 0.5489 |
| 3.2801        | 0.6206 | 400  | 0.5921          | 0.6137 |
| 3.241         | 0.7758 | 500  | 0.5730          | 0.6903 |
| 3.2619        | 0.9310 | 600  | 0.5520          | 0.7171 |
| 3.0675        | 1.0861 | 700  | 0.5009          | 0.5014 |
| 2.8962        | 1.2413 | 800  | 0.4948          | 0.4978 |
| 2.9737        | 1.3964 | 900  | 0.4830          | 0.5002 |
| 2.871         | 1.5516 | 1000 | 0.4692          | 0.4823 |
| 2.8204        | 1.7067 | 1100 | 0.4531          | 0.4705 |
| 2.8753        | 1.8619 | 1200 | 0.4439          | 0.4750 |
| 2.7596        | 2.0171 | 1300 | 0.4533          | 0.4951 |
| 2.6663        | 2.1722 | 1400 | 0.4556          | 0.4711 |
| 2.5645        | 2.3274 | 1500 | 0.4173          | 0.4708 |
| 2.691         | 2.4825 | 1600 | 0.4225          | 0.4596 |
| 2.5496        | 2.6377 | 1700 | 0.4254          | 0.4634 |
| 2.5257        | 2.7929 | 1800 | 0.4058          | 0.4479 |
| 2.6506        | 2.9480 | 1900 | 0.4065          | 0.4623 |
| 2.5693        | 3.1032 | 2000 | 0.4015          | 0.4560 |
| 2.3695        | 3.2583 | 2100 | 0.3984          | 0.4462 |
| 2.4252        | 3.4135 | 2200 | 0.3942          | 0.4496 |
| 2.3974        | 3.5687 | 2300 | 0.3841          | 0.4477 |
| 2.344         | 3.7238 | 2400 | 0.3789          | 0.4385 |
| 2.4957        | 3.8790 | 2500 | 0.3721          | 0.4392 |
| 2.3832        | 4.0341 | 2600 | 0.3693          | 0.4376 |
| 2.2369        | 4.1893 | 2700 | 0.3684          | 0.4357 |
| 2.2326        | 4.3445 | 2800 | 0.3666          | 0.4368 |
| 2.3816        | 4.4996 | 2900 | 0.3634          | 0.4379 |
| 2.2641        | 4.6548 | 3000 | 0.3619          | 0.4345 |
| 2.279         | 4.8099 | 3100 | 0.3595          | 0.4329 |
| 2.179         | 4.9651 | 3200 | 0.3582          | 0.4330 |
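As a sanity check on the logs above, the Epoch column is consistent with the stated effective batch size: 100 steps correspond to 0.1552 epochs, i.e. roughly 644 optimizer steps per epoch, which at a total train batch size of 8 × 4 = 32 implies on the order of 20k training examples (an approximation inferred from the table, not a figure reported by the authors):

```python
steps_per_epoch = 100 / 0.1552        # from the first row of the results table
effective_batch = 8 * 4               # train_batch_size * gradient_accumulation_steps
approx_examples = steps_per_epoch * effective_batch
print(round(steps_per_epoch), effective_batch, round(approx_examples))
```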

Framework versions

  • Transformers 4.46.0.dev0
  • Pytorch 2.4.1+cu121
  • Datasets 3.0.1
  • Tokenizers 0.20.1
Model size: 1.0B parameters (F32, Safetensors)
