wav2vec2-xls-r-ewe-100-hours

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7355
  • Wer: 0.3374
  • Cer: 0.0950
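As a reference for the two metrics reported above: WER (word error rate) is the word-level Levenshtein distance between the reference transcript and the model's hypothesis, divided by the number of reference words, and CER (character error rate) is the same ratio at the character level. A minimal self-contained sketch (the example strings are illustrative, not from the evaluation set):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (rolling 1-D DP row)."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            # deletion, insertion, substitution/match
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
    return d[-1]

def wer(reference, hypothesis):
    """Word error rate: word-level edits / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate: character-level edits / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)

print(wer("a b c d", "a x c"))  # 1 substitution + 1 deletion over 4 words = 0.5
```

In practice the `evaluate` or `jiwer` libraries are commonly used for these metrics; the hand-rolled version above is only meant to make the definitions concrete.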

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
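The schedule quantities above fit together as follows. Note that steps-per-epoch is not stated in the card; it is inferred from the training log (step 500 falls at epoch 1.7007), so treat it as an approximation:

```python
# Derive the training-schedule quantities from the listed hyperparameters.
train_batch_size = 32
gradient_accumulation_steps = 2
num_epochs = 50
warmup_ratio = 0.1

# Effective (total) train batch size per optimizer step.
effective_batch = train_batch_size * gradient_accumulation_steps  # 64

# Inferred from the log: step 500 ~ epoch 1.7007.
steps_per_epoch = round(500 / 1.7007)            # ~294 optimizer steps
total_steps = steps_per_epoch * num_epochs       # ~14700
warmup_steps = int(total_steps * warmup_ratio)   # ~1470 linear warmup steps

print(effective_batch, steps_per_epoch, total_steps, warmup_steps)
```

This is consistent with the table below ending near step 14500 at epoch 49.32.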

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER    | CER    |
|---------------|---------|-------|-----------------|--------|--------|
| 13.6452       | 1.7007  | 500   | 2.9602          | 1.0    | 1.0    |
| 3.0135        | 3.4014  | 1000  | 0.4361          | 0.3881 | 0.1111 |
| 1.0094        | 5.1020  | 1500  | 0.3739          | 0.3388 | 0.0976 |
| 0.8932        | 6.8027  | 2000  | 0.3530          | 0.3250 | 0.0933 |
| 0.8274        | 8.5034  | 2500  | 0.3506          | 0.3237 | 0.0933 |
| 0.7824        | 10.2041 | 3000  | 0.3480          | 0.3183 | 0.0921 |
| 0.7387        | 11.9048 | 3500  | 0.3651          | 0.3338 | 0.1028 |
| 0.6923        | 13.6054 | 4000  | 0.3551          | 0.3128 | 0.0918 |
| 0.6584        | 15.3061 | 4500  | 0.3511          | 0.3171 | 0.0916 |
| 0.6181        | 17.0068 | 5000  | 0.3572          | 0.3206 | 0.0924 |
| 0.5805        | 18.7075 | 5500  | 0.3604          | 0.3265 | 0.0958 |
| 0.5508        | 20.4082 | 6000  | 0.3734          | 0.3172 | 0.0929 |
| 0.5195        | 22.1088 | 6500  | 0.3913          | 0.3182 | 0.0932 |
| 0.4902        | 23.8095 | 7000  | 0.3914          | 0.3291 | 0.0945 |
| 0.4585        | 25.5102 | 7500  | 0.4277          | 0.3279 | 0.0934 |
| 0.4222        | 27.2109 | 8000  | 0.4336          | 0.3346 | 0.0964 |
| 0.3911        | 28.9116 | 8500  | 0.4419          | 0.3252 | 0.0938 |
| 0.3627        | 30.6122 | 9000  | 0.4687          | 0.3284 | 0.0951 |
| 0.3441        | 32.3129 | 9500  | 0.5102          | 0.3378 | 0.0962 |
| 0.3176        | 34.0136 | 10000 | 0.5415          | 0.3352 | 0.0950 |
| 0.2875        | 35.7143 | 10500 | 0.5497          | 0.3472 | 0.0978 |
| 0.2665        | 37.4150 | 11000 | 0.5724          | 0.3425 | 0.0975 |
| 0.2507        | 39.1156 | 11500 | 0.6267          | 0.3415 | 0.0968 |
| 0.2312        | 40.8163 | 12000 | 0.6423          | 0.3369 | 0.0960 |
| 0.2118        | 42.5170 | 12500 | 0.6688          | 0.3413 | 0.0964 |
| 0.2015        | 44.2177 | 13000 | 0.6806          | 0.3472 | 0.0970 |
| 0.1917        | 45.9184 | 13500 | 0.6875          | 0.3437 | 0.0970 |
| 0.1833        | 47.6190 | 14000 | 0.7128          | 0.3368 | 0.0951 |
| 0.1756        | 49.3197 | 14500 | 0.7355          | 0.3374 | 0.0950 |

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.20.3