# whisper-large-kurmanji-v1
This model is a fine-tuned version of [openai/whisper-large-v3](https://huggingface.co/openai/whisper-large-v3) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.4280
- Wer: 10.8064
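WER here is word error rate, reported as a percentage (10.8064 ≈ 10.8 words wrong per 100 reference words). A minimal self-contained sketch of the metric follows; the example sentences are hypothetical and not taken from the evaluation set:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words.
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,        # deletion
                dp[i][j - 1] + 1,        # insertion
                dp[i - 1][j - 1] + cost, # substitution (or match)
            )
    return dp[-1][-1] / len(ref)

# Hypothetical Kurmanji example: one substituted word out of three.
print(wer("ez diçim malê", "ez diçim male"))  # → 0.333…
```

Multiply by 100 to get the percentage form used in the table below.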
## Model description
More information needed
## Intended uses & limitations
More information needed
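For transcription, the checkpoint can be loaded with the standard `transformers` ASR pipeline. This is a sketch, not an official usage snippet from the model authors; the audio path is a placeholder and the input is assumed to be 16 kHz speech:

```python
from transformers import pipeline

# Load the fine-tuned Whisper checkpoint for automatic speech recognition.
asr = pipeline(
    "automatic-speech-recognition",
    model="samil24/whisper-large-kurmanji-v1",
)

# "audio.wav" is a placeholder path to a local 16 kHz recording.
result = asr("audio.wav")
print(result["text"])
```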
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 15
- mixed_precision_training: Native AMP
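The hyperparameters above correspond roughly to a `Seq2SeqTrainingArguments` configuration like the following sketch (argument names from recent `transformers` releases; `output_dir` is a placeholder, not taken from the original run):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-kurmanji-v1",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=15,
    fp16=True,  # "Native AMP" mixed-precision training
)
```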
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER |
|---|---|---|---|---|
| 0.3459 | 0.4941 | 500 | 0.4129 | 23.7122 |
| 0.2412 | 0.9881 | 1000 | 0.3338 | 18.6864 |
| 0.1751 | 1.4822 | 1500 | 0.2937 | 15.9180 |
| 0.1799 | 1.9763 | 2000 | 0.2648 | 14.3789 |
| 0.1063 | 2.4704 | 2500 | 0.2722 | 13.9077 |
| 0.1035 | 2.9644 | 3000 | 0.2520 | 13.0673 |
| 0.0616 | 3.4585 | 3500 | 0.2739 | 12.4015 |
| 0.0686 | 3.9526 | 4000 | 0.2686 | 12.6026 |
| 0.0371 | 4.4466 | 4500 | 0.2964 | 12.7476 |
| 0.038 | 4.9407 | 5000 | 0.2906 | 12.5960 |
| 0.0182 | 5.4348 | 5500 | 0.3137 | 12.0324 |
| 0.0224 | 5.9289 | 6000 | 0.3099 | 12.7740 |
| 0.0083 | 6.4229 | 6500 | 0.3286 | 11.6930 |
| 0.0115 | 6.9170 | 7000 | 0.3322 | 11.8578 |
| 0.007 | 7.4111 | 7500 | 0.3465 | 12.2038 |
| 0.0087 | 7.9051 | 8000 | 0.3464 | 11.7259 |
| 0.0035 | 8.3992 | 8500 | 0.3756 | 11.4788 |
| 0.0053 | 8.8933 | 9000 | 0.3535 | 11.3832 |
| 0.0033 | 9.3874 | 9500 | 0.3805 | 11.2085 |
| 0.0037 | 9.8814 | 10000 | 0.3831 | 11.2645 |
| 0.0017 | 10.3755 | 10500 | 0.3754 | 11.1822 |
| 0.0024 | 10.8696 | 11000 | 0.3883 | 11.0899 |
| 0.001 | 11.3636 | 11500 | 0.3855 | 11.0075 |
| 0.0008 | 11.8577 | 12000 | 0.3890 | 11.2052 |
| 0.0004 | 12.3518 | 12500 | 0.3960 | 11.0338 |
| 0.0008 | 12.8458 | 13000 | 0.4007 | 10.7438 |
| 0.0001 | 13.3399 | 13500 | 0.4193 | 10.8526 |
| 0.0002 | 13.8340 | 14000 | 0.4219 | 10.7801 |
| 0.0001 | 14.3281 | 14500 | 0.4246 | 10.7636 |
| 0.0001 | 14.8221 | 15000 | 0.4280 | 10.8064 |
### Framework versions
- Transformers 4.51.3
- PyTorch 2.4.1+cu124
- Datasets 3.6.0
- Tokenizers 0.21.2
## Model tree for samil24/whisper-large-kurmanji-v1
Base model: [openai/whisper-large-v3](https://huggingface.co/openai/whisper-large-v3)