Chenxi-Chelsea-Liu/whisper-small-yue-fold-2

This model is a fine-tuned version of openai/whisper-small; the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.4821
  • Wer: 94.3203 (word error rate, %)

Note that these are the metrics from the final evaluation step; validation WER was substantially lower (as low as 67.1322) at intermediate checkpoints — see Training results.
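WER here is reported as a percentage of reference words. The evaluation itself presumably used a standard metric library (an assumption; this card does not say which), but the computation can be sketched in plain Python:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate as a percentage: (substitutions + deletions + insertions) / reference words * 100."""
    ref, hyp = reference.split(), hypothesis.split()
    # Word-level Levenshtein distance via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return 100.0 * d[-1][-1] / len(ref)

# Insertions count against the reference, so WER can exceed 100%,
# as it does early in the training run recorded below.
print(wer("a b", "a b c d e"))  # 150.0
```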

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 4000
  • mixed_precision_training: Native AMP
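Assuming the usual Transformers semantics for a linear scheduler with warmup (ramp from 0 to the base rate over the warmup steps, then linear decay to 0 at the final step), the schedule implied by these hyperparameters can be sketched as:

```python
def linear_lr(step: int,
              base_lr: float = 1e-05,
              warmup_steps: int = 500,
              total_steps: int = 4000) -> float:
    """Linear warmup to base_lr, then linear decay to 0 by total_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0, total_steps - step) / (total_steps - warmup_steps)

print(linear_lr(500))   # peak, ≈ 1e-05
print(linear_lr(2250))  # halfway through decay, ≈ 5e-06
print(linear_lr(4000))  # 0.0
```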

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer      |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7204        | 0.38  | 50   | 2.3209          | 128.8641 |
| 0.7581        | 0.75  | 100  | 0.4446          | 191.2477 |
| 0.2924        | 1.13  | 150  | 0.3421          | 127.1881 |
| 0.2626        | 1.5   | 200  | 0.3177          | 109.4041 |
| 0.2592        | 1.88  | 250  | 0.2957          | 93.0168  |
| 0.1538        | 2.26  | 300  | 0.2956          | 90.5028  |
| 0.151         | 2.63  | 350  | 0.2930          | 72.6257  |
| 0.1577        | 3.01  | 400  | 0.2837          | 69.7393  |
| 0.0751        | 3.38  | 450  | 0.2959          | 74.7672  |
| 0.0747        | 3.76  | 500  | 0.2924          | 71.6946  |
| 0.0527        | 4.14  | 550  | 0.3009          | 70.8566  |
| 0.0465        | 4.51  | 600  | 0.3071          | 69.7393  |
| 0.047         | 4.89  | 650  | 0.2967          | 70.7635  |
| 0.0181        | 5.26  | 700  | 0.3134          | 67.1322  |
| 0.0227        | 5.64  | 750  | 0.3190          | 73.3706  |
| 0.0254        | 6.02  | 800  | 0.3155          | 84.3575  |
| 0.0103        | 6.39  | 850  | 0.3334          | 71.4153  |
| 0.0124        | 6.77  | 900  | 0.3390          | 81.8436  |
| 0.0085        | 7.14  | 950  | 0.3322          | 75.1397  |
| 0.0094        | 7.52  | 1000 | 0.3422          | 72.3464  |
| 0.0099        | 7.89  | 1050 | 0.3450          | 74.4879  |
| 0.0044        | 8.27  | 1100 | 0.3585          | 72.2533  |
| 0.0054        | 8.65  | 1150 | 0.3760          | 74.5810  |
| 0.0066        | 9.02  | 1200 | 0.3639          | 72.4395  |
| 0.0058        | 9.4   | 1250 | 0.3664          | 72.2533  |
| 0.0035        | 9.77  | 1300 | 0.3702          | 75.9777  |
| 0.0068        | 10.15 | 1350 | 0.3756          | 72.2533  |
| 0.0037        | 10.53 | 1400 | 0.3746          | 80.8194  |
| 0.0046        | 10.9  | 1450 | 0.3866          | 72.8119  |
| 0.0021        | 11.28 | 1500 | 0.4072          | 70.3911  |
| 0.0018        | 11.65 | 1550 | 0.4021          | 72.9050  |
| 0.003         | 12.03 | 1600 | 0.4056          | 76.3501  |
| 0.002         | 12.41 | 1650 | 0.4037          | 71.7877  |
| 0.0037        | 12.78 | 1700 | 0.4119          | 71.3222  |
| 0.0004        | 13.16 | 1750 | 0.4214          | 71.4153  |
| 0.0017        | 13.53 | 1800 | 0.4159          | 71.0428  |
| 0.0013        | 13.91 | 1850 | 0.4408          | 70.7635  |
| 0.0005        | 14.29 | 1900 | 0.4486          | 67.3184  |
| 0.0002        | 14.66 | 1950 | 0.4426          | 69.5531  |
| 0.0004        | 15.04 | 2000 | 0.4597          | 69.6462  |
| 0.0003        | 15.41 | 2050 | 0.4699          | 71.5084  |
| 0.0001        | 15.79 | 2100 | 0.4753          | 72.9981  |
| 0.0001        | 16.17 | 2150 | 0.4797          | 91.1546  |
| 0.0004        | 16.54 | 2200 | 0.4821          | 94.3203  |

Framework versions

  • Transformers 4.37.0.dev0
  • PyTorch 1.12.1
  • Datasets 2.15.0
  • Tokenizers 0.15.0