# whisper-tiny-mongolian-ver_0.4
This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.2186
- Wer: 0.7197
- Cer: 0.3174
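For transcription, the checkpoint can be loaded through the 🤗 `pipeline` API. This is a minimal, hedged sketch (the audio path is illustrative, and the model is downloaded on first use):

```python
def transcribe(audio_path, model_id="Ganaa0614/whisper-tiny-mongolian-ver_0.4"):
    """Transcribe a Mongolian audio file with this fine-tuned checkpoint."""
    # Import lazily so the function can be defined without transformers installed.
    from transformers import pipeline
    asr = pipeline("automatic-speech-recognition", model=model_id)
    return asr(audio_path)["text"]

# Example (downloads the model weights on first call):
# text = transcribe("sample_mn.wav")
```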
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3.5e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 60
- mixed_precision_training: Native AMP
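The hyperparameters above map onto `Seq2SeqTrainingArguments` roughly as follows. This is a hypothetical reconstruction, not the original training script; `output_dir` is a placeholder:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the arguments implied by the hyperparameter list above.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny-mongolian-ver_0.4",  # placeholder
    learning_rate=3.5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=60,
    fp16=True,  # "Native AMP" mixed-precision training
)
```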
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|---|---|---|---|---|---|
| 4.8371 | 0.3914 | 200 | 2.8192 | 1.4535 | 2.1342 |
| 2.5062 | 0.7828 | 400 | 1.8720 | 1.0734 | 0.6824 |
| 2.0403 | 1.1742 | 600 | 1.4946 | 0.9449 | 0.5009 |
| 1.7612 | 1.5656 | 800 | 1.3044 | 0.8948 | 0.4486 |
| 1.6561 | 1.9569 | 1000 | 1.1837 | 0.8693 | 0.3876 |
| 1.5081 | 2.3483 | 1200 | 1.1053 | 0.8484 | 0.3905 |
| 1.4746 | 2.7397 | 1400 | 1.0471 | 0.8246 | 0.3658 |
| 1.4252 | 3.1311 | 1600 | 1.0011 | 0.8285 | 0.3604 |
| 1.3403 | 3.5225 | 1800 | 0.9659 | 0.7957 | 0.3442 |
| 1.3305 | 3.9139 | 2000 | 0.9380 | 0.7909 | 0.3336 |
| 1.2562 | 4.3053 | 2200 | 0.9085 | 0.7811 | 0.3521 |
| 1.2316 | 4.6967 | 2400 | 0.8907 | 0.7902 | 0.3472 |
| 1.2032 | 5.0881 | 2600 | 0.8755 | 0.7733 | 0.3290 |
| 1.1524 | 5.4795 | 2800 | 0.8579 | 0.7638 | 0.3313 |
| 1.1564 | 5.8708 | 3000 | 0.8423 | 0.7519 | 0.3254 |
| 1.0937 | 6.2622 | 3200 | 0.8351 | 0.7495 | 0.3189 |
| 1.0801 | 6.6536 | 3400 | 0.8236 | 0.7407 | 0.3212 |
| 1.0562 | 7.0450 | 3600 | 0.8176 | 0.7326 | 0.3103 |
| 1.0198 | 7.4364 | 3800 | 0.8133 | 0.7226 | 0.3080 |
| 1.0191 | 7.8278 | 4000 | 0.8041 | 0.7254 | 0.3165 |
| 0.9625 | 8.2192 | 4200 | 0.8038 | 0.7200 | 0.2990 |
| 0.9475 | 8.6106 | 4400 | 0.7985 | 0.7167 | 0.3083 |
| 0.9789 | 9.0020 | 4600 | 0.7912 | 0.7090 | 0.3063 |
| 0.9008 | 9.3933 | 4800 | 0.7943 | 0.7065 | 0.2987 |
| 0.9077 | 9.7847 | 5000 | 0.7911 | 0.7094 | 0.3106 |
| 0.8888 | 10.1761 | 5200 | 0.7948 | 0.7051 | 0.3013 |
| 0.8615 | 10.5675 | 5400 | 0.7914 | 0.6983 | 0.2968 |
| 0.8720 | 10.9589 | 5600 | 0.7910 | 0.6980 | 0.2932 |
| 0.8315 | 11.3503 | 5800 | 0.7932 | 0.7053 | 0.3065 |
| 0.8060 | 11.7417 | 6000 | 0.7918 | 0.6909 | 0.3128 |
| 0.7898 | 12.1331 | 6200 | 0.7929 | 0.6962 | 0.3027 |
| 0.7662 | 12.5245 | 6400 | 0.7988 | 0.6953 | 0.3027 |
| 0.8050 | 12.9159 | 6600 | 0.7921 | 0.6948 | 0.2952 |
| 0.7373 | 13.3072 | 6800 | 0.7988 | 0.6938 | 0.2963 |
| 0.7363 | 13.6986 | 7000 | 0.8036 | 0.6912 | 0.2985 |
| 0.7271 | 14.0900 | 7200 | 0.8058 | 0.6885 | 0.3058 |
| 0.6968 | 14.4814 | 7400 | 0.8124 | 0.6959 | 0.2979 |
| 0.7177 | 14.8728 | 7600 | 0.8107 | 0.6922 | 0.3030 |
| 0.6778 | 15.2642 | 7800 | 0.8175 | 0.6917 | 0.3019 |
| 0.6630 | 15.6556 | 8000 | 0.8170 | 0.6875 | 0.2984 |
| 0.6608 | 16.0470 | 8200 | 0.8236 | 0.6846 | 0.2874 |
| 0.6240 | 16.4384 | 8400 | 0.8273 | 0.6889 | 0.2977 |
| 0.6362 | 16.8297 | 8600 | 0.8316 | 0.6945 | 0.2972 |
| 0.6123 | 17.2211 | 8800 | 0.8350 | 0.6879 | 0.2986 |
| 0.6093 | 17.6125 | 9000 | 0.8431 | 0.6897 | 0.2918 |
| 0.5941 | 18.0039 | 9200 | 0.8463 | 0.6909 | 0.2936 |
| 0.5517 | 18.3953 | 9400 | 0.8530 | 0.6936 | 0.2969 |
| 0.5874 | 18.7867 | 9600 | 0.8549 | 0.6922 | 0.2932 |
| 0.5540 | 19.1781 | 9800 | 0.8599 | 0.6931 | 0.2973 |
| 0.5494 | 19.5695 | 10000 | 0.8643 | 0.6934 | 0.2914 |
| 0.5468 | 19.9609 | 10200 | 0.8704 | 0.6931 | 0.2959 |
| 0.5036 | 20.3523 | 10400 | 0.8726 | 0.6905 | 0.2917 |
| 0.5092 | 20.7436 | 10600 | 0.8795 | 0.6901 | 0.2969 |
| 0.5206 | 21.1350 | 10800 | 0.8864 | 0.6961 | 0.2952 |
| 0.4607 | 21.5264 | 11000 | 0.8921 | 0.6882 | 0.2931 |
| 0.5045 | 21.9178 | 11200 | 0.8933 | 0.6861 | 0.2964 |
| 0.4640 | 22.3092 | 11400 | 0.9028 | 0.6943 | 0.2979 |
| 0.4564 | 22.7006 | 11600 | 0.9056 | 0.6909 | 0.2989 |
| 0.4696 | 23.0920 | 11800 | 0.9098 | 0.6903 | 0.2962 |
| 0.4310 | 23.4834 | 12000 | 0.9169 | 0.6892 | 0.2958 |
| 0.4418 | 23.8748 | 12200 | 0.9203 | 0.6958 | 0.2927 |
| 0.4165 | 24.2661 | 12400 | 0.9301 | 0.6963 | 0.2946 |
| 0.4102 | 24.6575 | 12600 | 0.9339 | 0.6949 | 0.2951 |
| 0.4240 | 25.0489 | 12800 | 0.9410 | 0.6945 | 0.2985 |
| 0.3811 | 25.4403 | 13000 | 0.9438 | 0.7021 | 0.2967 |
| 0.4038 | 25.8317 | 13200 | 0.9509 | 0.7104 | 0.3112 |
| 0.3897 | 26.2231 | 13400 | 0.9589 | 0.7009 | 0.3081 |
| 0.3697 | 26.6145 | 13600 | 0.9573 | 0.7106 | 0.3070 |
| 0.3685 | 27.0059 | 13800 | 0.9649 | 0.7032 | 0.3028 |
| 0.3524 | 27.3973 | 14000 | 0.9661 | 0.7023 | 0.3015 |
| 0.3493 | 27.7886 | 14200 | 0.9752 | 0.7124 | 0.3098 |
| 0.3395 | 28.1800 | 14400 | 0.9831 | 0.6992 | 0.3019 |
| 0.3350 | 28.5714 | 14600 | 0.9830 | 0.7004 | 0.2989 |
| 0.3342 | 28.9628 | 14800 | 0.9889 | 0.7176 | 0.3129 |
| 0.2998 | 29.3542 | 15000 | 0.9939 | 0.7100 | 0.3111 |
| 0.3283 | 29.7456 | 15200 | 1.0028 | 0.7024 | 0.3003 |
| 0.2989 | 30.1370 | 15400 | 1.0067 | 0.7193 | 0.3162 |
| 0.2975 | 30.5284 | 15600 | 1.0112 | 0.7049 | 0.3069 |
| 0.2962 | 30.9198 | 15800 | 1.0139 | 0.7052 | 0.3061 |
| 0.2789 | 31.3112 | 16000 | 1.0209 | 0.7141 | 0.3116 |
| 0.2888 | 31.7025 | 16200 | 1.0227 | 0.7089 | 0.3038 |
| 0.2702 | 32.0939 | 16400 | 1.0308 | 0.7051 | 0.3071 |
| 0.2643 | 32.4853 | 16600 | 1.0341 | 0.7053 | 0.3074 |
| 0.2603 | 32.8767 | 16800 | 1.0367 | 0.7061 | 0.3070 |
| 0.2560 | 33.2681 | 17000 | 1.0446 | 0.7068 | 0.3039 |
| 0.2546 | 33.6595 | 17200 | 1.0480 | 0.7050 | 0.3100 |
| 0.2461 | 34.0509 | 17400 | 1.0540 | 0.7039 | 0.3052 |
| 0.2364 | 34.4423 | 17600 | 1.0591 | 0.7053 | 0.3083 |
| 0.2370 | 34.8337 | 17800 | 1.0655 | 0.7130 | 0.3086 |
| 0.2207 | 35.2250 | 18000 | 1.0673 | 0.7053 | 0.3061 |
| 0.2163 | 35.6164 | 18200 | 1.0690 | 0.7059 | 0.3065 |
| 0.2275 | 36.0078 | 18400 | 1.0721 | 0.7094 | 0.3096 |
| 0.2090 | 36.3992 | 18600 | 1.0817 | 0.7068 | 0.3065 |
| 0.2061 | 36.7906 | 18800 | 1.0827 | 0.7107 | 0.3114 |
| 0.2094 | 37.1820 | 19000 | 1.0882 | 0.7080 | 0.3097 |
| 0.1921 | 37.5734 | 19200 | 1.0920 | 0.7095 | 0.3121 |
| 0.1995 | 37.9648 | 19400 | 1.0944 | 0.7090 | 0.3092 |
| 0.1881 | 38.3562 | 19600 | 1.1006 | 0.7101 | 0.3103 |
| 0.1831 | 38.7476 | 19800 | 1.1014 | 0.7083 | 0.3072 |
| 0.1840 | 39.1389 | 20000 | 1.1108 | 0.7135 | 0.3115 |
| 0.1838 | 39.5303 | 20200 | 1.1122 | 0.7107 | 0.3117 |
| 0.1756 | 39.9217 | 20400 | 1.1139 | 0.7103 | 0.3099 |
| 0.1671 | 40.3131 | 20600 | 1.1199 | 0.7127 | 0.3079 |
| 0.1640 | 40.7045 | 20800 | 1.1239 | 0.7126 | 0.3077 |
| 0.1721 | 41.0959 | 21000 | 1.1248 | 0.7171 | 0.3116 |
| 0.1588 | 41.4873 | 21200 | 1.1303 | 0.7122 | 0.3087 |
| 0.1631 | 41.8787 | 21400 | 1.1334 | 0.7139 | 0.3119 |
| 0.1560 | 42.2701 | 21600 | 1.1383 | 0.7169 | 0.3174 |
| 0.1491 | 42.6614 | 21800 | 1.1404 | 0.7139 | 0.3178 |
| 0.1500 | 43.0528 | 22000 | 1.1434 | 0.7148 | 0.3120 |
| 0.1419 | 43.4442 | 22200 | 1.1459 | 0.7138 | 0.3137 |
| 0.1470 | 43.8356 | 22400 | 1.1505 | 0.7120 | 0.3132 |
| 0.1444 | 44.2270 | 22600 | 1.1541 | 0.7143 | 0.3146 |
| 0.1369 | 44.6184 | 22800 | 1.1550 | 0.7137 | 0.3126 |
| 0.1367 | 45.0098 | 23000 | 1.1582 | 0.7159 | 0.3124 |
| 0.1308 | 45.4012 | 23200 | 1.1626 | 0.7199 | 0.3166 |
| 0.1303 | 45.7926 | 23400 | 1.1618 | 0.7211 | 0.3153 |
| 0.1332 | 46.1840 | 23600 | 1.1676 | 0.7164 | 0.3153 |
| 0.1255 | 46.5753 | 23800 | 1.1701 | 0.7161 | 0.3136 |
| 0.1246 | 46.9667 | 24000 | 1.1711 | 0.7154 | 0.3121 |
| 0.1180 | 47.3581 | 24200 | 1.1760 | 0.7198 | 0.3175 |
| 0.1225 | 47.7495 | 24400 | 1.1767 | 0.7158 | 0.3164 |
| 0.1218 | 48.1409 | 24600 | 1.1810 | 0.7180 | 0.3175 |
| 0.1140 | 48.5323 | 24800 | 1.1813 | 0.7179 | 0.3170 |
| 0.1206 | 48.9237 | 25000 | 1.1860 | 0.7174 | 0.3150 |
| 0.1106 | 49.3151 | 25200 | 1.1873 | 0.7184 | 0.3154 |
| 0.1118 | 49.7065 | 25400 | 1.1894 | 0.7197 | 0.3179 |
| 0.1131 | 50.0978 | 25600 | 1.1919 | 0.7189 | 0.3184 |
| 0.1085 | 50.4892 | 25800 | 1.1916 | 0.7173 | 0.3159 |
| 0.1093 | 50.8806 | 26000 | 1.1948 | 0.7204 | 0.3183 |
| 0.1064 | 51.2720 | 26200 | 1.1974 | 0.7203 | 0.3183 |
| 0.1024 | 51.6634 | 26400 | 1.1994 | 0.7185 | 0.3153 |
| 0.1070 | 52.0548 | 26600 | 1.2017 | 0.7207 | 0.3178 |
| 0.1005 | 52.4462 | 26800 | 1.2023 | 0.7207 | 0.3174 |
| 0.1036 | 52.8376 | 27000 | 1.2021 | 0.7195 | 0.3192 |
| 0.1013 | 53.2290 | 27200 | 1.2037 | 0.7193 | 0.3184 |
| 0.0993 | 53.6204 | 27400 | 1.2078 | 0.7193 | 0.3158 |
| 0.1011 | 54.0117 | 27600 | 1.2089 | 0.7178 | 0.3166 |
| 0.0987 | 54.4031 | 27800 | 1.2096 | 0.7216 | 0.3167 |
| 0.0945 | 54.7945 | 28000 | 1.2112 | 0.7210 | 0.3194 |
| 0.0989 | 55.1859 | 28200 | 1.2127 | 0.7217 | 0.3184 |
| 0.0938 | 55.5773 | 28400 | 1.2122 | 0.7213 | 0.3225 |
| 0.0967 | 55.9687 | 28600 | 1.2142 | 0.7197 | 0.3196 |
| 0.0940 | 56.3601 | 28800 | 1.2141 | 0.7175 | 0.3186 |
| 0.0922 | 56.7515 | 29000 | 1.2144 | 0.7200 | 0.3191 |
| 0.0960 | 57.1429 | 29200 | 1.2161 | 0.7210 | 0.3180 |
| 0.0917 | 57.5342 | 29400 | 1.2170 | 0.7205 | 0.3171 |
| 0.0941 | 57.9256 | 29600 | 1.2168 | 0.7195 | 0.3171 |
| 0.0932 | 58.3170 | 29800 | 1.2180 | 0.7203 | 0.3187 |
| 0.0901 | 58.7084 | 30000 | 1.2177 | 0.7204 | 0.3181 |
| 0.0918 | 59.0998 | 30200 | 1.2181 | 0.7202 | 0.3163 |
| 0.0914 | 59.4912 | 30400 | 1.2185 | 0.7201 | 0.3175 |
| 0.0924 | 59.8826 | 30600 | 1.2186 | 0.7197 | 0.3174 |
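The Wer and Cer columns above are word and character error rates: Levenshtein edit distance between reference and hypothesis, normalized by reference length (words for WER, characters for CER). A minimal self-contained sketch of the metric (the evaluation itself likely used the `evaluate`/`jiwer` implementations):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]

def wer(reference, hypothesis):
    """Word error rate: word-level edits / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate: character-level edits / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

Note that WER can exceed 1.0 (as in the first rows of the table) when the hypothesis contains more errors than the reference has words.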
### Framework versions
- Transformers 5.3.0.dev0
- Pytorch 2.7.1+cu128
- Datasets 3.6.0
- Tokenizers 0.22.2