Whisper Large v3 - Japanese Zatoichi ASR

This model is a fine-tuned version of openai/whisper-large-v3 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 0.4106
  • WER: 71.7972
  • CER: 19.1421

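Both metrics are edit-distance rates: WER counts word-level edits against the reference word count, CER counts character-level edits against the reference character count, each reported here as a percentage. A minimal pure-Python sketch of the computation (the actual training run would typically use a library such as `evaluate` or `jiwer`; stripping spaces before CER is an assumption, common for Japanese text):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (dynamic programming)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # delete a reference token
                           cur[j - 1] + 1,           # insert a hypothesis token
                           prev[j - 1] + (r != h)))  # substitute (free if equal)
        prev = cur
    return prev[-1]

def wer(ref, hyp):
    """Word error rate in percent, as reported in the results above."""
    ref_words, hyp_words = ref.split(), hyp.split()
    return 100.0 * edit_distance(ref_words, hyp_words) / len(ref_words)

def cer(ref, hyp):
    """Character error rate in percent (spaces removed before comparison)."""
    ref_chars = list(ref.replace(" ", ""))
    hyp_chars = list(hyp.replace(" ", ""))
    return 100.0 * edit_distance(ref_chars, hyp_chars) / len(ref_chars)

print(wer("the quick brown fox", "the quick brown dog"))  # 25.0
print(cer("kitten", "sitting"))                           # 50.0
```

Note that CER can exceed 100% when the hypothesis contains many insertions, which is why early-training values above can sit near 40 while WER is near 98.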
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 8e-06
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
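With `lr_scheduler_type: cosine` and `lr_scheduler_warmup_ratio: 0.1`, the learning rate ramps linearly from 0 to 8e-06 over the first 10% of optimizer steps, then follows a half-cosine down to 0. A sketch of the schedule shape (mirroring `transformers`' `get_cosine_schedule_with_warmup`; the 308 total steps are taken from the results table below):

```python
import math

PEAK_LR = 8e-06
TOTAL_STEPS = 308                       # optimizer steps over 2 epochs, per the results table
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)   # warmup_ratio 0.1 -> 30 steps

def lr_at(step):
    """Learning rate at a given optimizer step: linear warmup, then cosine decay to 0."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / max(1, WARMUP_STEPS)
    progress = (step - WARMUP_STEPS) / max(1, TOTAL_STEPS - WARMUP_STEPS)
    return PEAK_LR * 0.5 * (1.0 + math.cos(math.pi * progress))

print(lr_at(0))             # 0.0
print(lr_at(WARMUP_STEPS))  # peak: 8e-06
print(lr_at(TOTAL_STEPS))   # back to ~0.0
```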

Training results

Training Loss | Epoch | Step | Validation Loss | WER | CER
1.2387 0.0065 1 1.2545 98.1567 40.8419
1.2196 0.0130 2 0.9960 96.3134 38.5507
1.1002 0.0195 3 0.9822 96.3134 38.2443
0.9277 0.0260 4 0.9680 94.8387 37.6848
1.0004 0.0325 5 0.9518 93.6406 36.6991
0.9985 0.0390 6 0.9402 92.4424 35.8465
1.1458 0.0455 7 0.9111 88.7558 35.1272
1.043 0.0519 8 0.8960 87.1889 33.8084
0.9909 0.0584 9 0.8500 88.5714 32.7961
0.8984 0.0649 10 0.8274 83.6866 29.4259
0.845 0.0714 11 0.8075 93.0876 31.6904
0.9553 0.0779 12 0.7798 88.0184 29.3593
0.8064 0.0844 13 0.7566 91.6129 30.6114
0.7837 0.0909 14 0.7390 87.5576 28.9197
0.7576 0.0974 15 0.7172 90.6912 33.5021
0.8125 0.1039 16 0.6992 94.9309 32.4231
0.7267 0.1104 17 0.6825 98.5253 34.3812
0.5745 0.1169 18 0.6628 93.6406 33.1824
0.8306 0.1234 19 0.6482 92.9032 33.2889
0.6258 0.1299 20 0.6363 91.1521 35.7000
0.5959 0.1364 21 0.6221 88.7558 30.9045
0.6223 0.1429 22 0.6106 95.4839 31.8236
0.7606 0.1494 23 0.5975 91.9816 29.4125
0.6333 0.1558 24 0.5884 87.5576 27.4144
0.6841 0.1623 25 0.5830 77.9724 24.9101
0.5854 0.1688 26 0.5805 80.2765 26.1489
0.6213 0.1753 27 0.5740 80.5530 26.0157
0.589 0.1818 28 0.5714 89.7696 28.8664
0.6858 0.1883 29 0.5656 87.3733 27.3878
0.585 0.1948 30 0.5581 81.5668 24.8701
0.5453 0.2013 31 0.5580 80.4608 24.5771
0.6899 0.2078 32 0.5562 75.3917 22.6189
0.5337 0.2143 33 0.5456 74.1935 22.7921
0.5274 0.2208 34 0.5374 79.3548 24.2440
0.5111 0.2273 35 0.5345 79.3548 23.8444
0.6652 0.2338 36 0.5289 78.2488 23.5380
0.5203 0.2403 37 0.5259 83.1336 27.1347
0.4773 0.2468 38 0.5260 71.5207 21.8862
0.6092 0.2532 39 0.5263 72.3502 21.7930
0.5334 0.2597 40 0.5226 71.5207 22.0194
0.6202 0.2662 41 0.5160 72.0737 22.1260
0.6401 0.2727 42 0.5167 73.8249 22.6722
0.5465 0.2792 43 0.5169 74.6544 23.5647
0.5615 0.2857 44 0.5151 77.0507 27.2546
0.4892 0.2922 45 0.5145 74.9309 22.2592
0.4878 0.2987 46 0.5162 81.0138 25.1765
0.6339 0.3052 47 0.5145 79.7235 24.0442
0.5845 0.3117 48 0.5102 74.3779 22.2193
0.5806 0.3182 49 0.5058 74.4700 22.2459
0.5627 0.3247 50 0.5027 72.3502 22.0861
0.4449 0.3312 51 0.5028 77.1429 24.0842
0.4957 0.3377 52 0.5018 77.3272 24.3506
0.4837 0.3442 53 0.4976 77.7880 24.0842
0.597 0.3506 54 0.4944 70.7834 21.5932
0.4839 0.3571 55 0.4923 69.7696 21.0870
0.4551 0.3636 56 0.4908 70.2304 21.0071
0.4712 0.3701 57 0.4881 77.9724 24.2973
0.5517 0.3766 58 0.4867 77.8802 24.3240
0.4606 0.3831 59 0.4846 77.9724 24.2707
0.4775 0.3896 60 0.4838 75.2995 22.8853
0.403 0.3961 61 0.4836 74.3779 22.8187
0.492 0.4026 62 0.4813 74.6544 22.7921
0.468 0.4091 63 0.4781 78.7097 24.2973
0.557 0.4156 64 0.4761 73.8249 22.1793
0.4159 0.4221 65 0.4763 74.2857 22.0727
0.4792 0.4286 66 0.4786 76.8664 22.0994
0.5464 0.4351 67 0.4756 76.1290 22.3125
0.5694 0.4416 68 0.4719 72.3502 21.8463
0.472 0.4481 69 0.4676 69.9539 20.4209
0.5779 0.4545 70 0.4673 66.7281 20.1812
0.4255 0.4610 71 0.4677 67.5576 20.7939
0.4844 0.4675 72 0.4649 67.3733 20.9405
0.4341 0.4740 73 0.4608 67.5576 21.4999
0.4744 0.4805 74 0.4588 67.6498 20.2478
0.4628 0.4870 75 0.4607 68.9401 20.6208
0.4551 0.4935 76 0.4611 71.2442 21.1269
0.5278 0.5 77 0.4619 72.4424 21.5133
0.4654 0.5065 78 0.4605 73.2719 21.4333
0.397 0.5130 79 0.4603 73.6406 22.3924
0.4586 0.5195 80 0.4593 72.5346 21.3268
0.4289 0.5260 81 0.4569 71.1521 21.2468
0.4339 0.5325 82 0.4534 70.3226 20.9005
0.4794 0.5390 83 0.4527 69.0323 20.8872
0.486 0.5455 84 0.4546 69.4931 21.2202
0.4454 0.5519 85 0.4562 69.4009 21.1536
0.4738 0.5584 86 0.4564 70.5069 21.3401
0.4643 0.5649 87 0.4551 70.5069 21.3934
0.5539 0.5714 88 0.4531 70.0461 20.8206
0.4709 0.5779 89 0.4533 70.2304 20.9937
0.5182 0.5844 90 0.4541 70.1382 21.0603
0.5398 0.5909 91 0.4525 71.3364 21.0337
0.5722 0.5974 92 0.4482 75.2995 22.6722
0.5727 0.6039 93 0.4460 77.9724 23.8711
0.4026 0.6104 94 0.4478 73.0876 21.8596
0.4729 0.6169 95 0.4526 73.7327 22.1393
0.4761 0.6234 96 0.4554 73.0876 22.1393
0.5285 0.6299 97 0.4496 72.7189 21.6864
0.5247 0.6364 98 0.4472 74.2857 21.9395
0.4149 0.6429 99 0.4484 74.8387 21.9528
0.3178 0.6494 100 0.4504 76.3134 22.2725
0.5711 0.6558 101 0.4502 76.4055 23.0052
0.4755 0.6623 102 0.4466 76.3134 24.1375
0.45 0.6688 103 0.4438 74.3779 22.7921
0.4218 0.6753 104 0.4428 69.0323 19.8215
0.5156 0.6818 105 0.4434 69.0323 20.1279
0.516 0.6883 106 0.4445 68.5714 19.8481
0.4713 0.6948 107 0.4455 68.9401 19.9947
0.4231 0.7013 108 0.4432 68.7558 19.9147
0.4869 0.7078 109 0.4400 68.7558 19.7149
0.4826 0.7143 110 0.4390 73.7327 21.4200
0.4115 0.7208 111 0.4412 77.4194 21.6864
0.4802 0.7273 112 0.4451 82.9493 22.6056
0.4769 0.7338 113 0.4451 84.7926 22.8054
0.65 0.7403 114 0.4427 78.9862 21.2735
0.4185 0.7468 115 0.4381 78.7097 22.2859
0.3991 0.7532 116 0.4332 69.6774 20.0879
0.4593 0.7597 117 0.4305 68.5714 21.1136
0.3772 0.7662 118 0.4293 70.9677 21.2335
0.5449 0.7727 119 0.4293 75.8525 23.2849
0.4755 0.7792 120 0.4283 70.1382 21.1403
0.4492 0.7857 121 0.4264 69.8618 20.9538
0.4479 0.7922 122 0.4253 71.4286 21.3534
0.3756 0.7987 123 0.4261 72.6267 21.3800
0.4517 0.8052 124 0.4283 75.8525 21.6465
0.4056 0.8117 125 0.4309 77.9724 21.7397
0.4233 0.8182 126 0.4341 79.6313 21.4466
0.4364 0.8247 127 0.4345 75.8525 20.0613
0.6056 0.8312 128 0.4335 74.4700 20.0480
0.4723 0.8377 129 0.4308 73.2719 19.9147
0.4416 0.8442 130 0.4282 75.6682 20.9937
0.4451 0.8506 131 0.4260 72.7189 20.8872
0.4164 0.8571 132 0.4250 70.6912 20.6874
0.4485 0.8636 133 0.4254 69.7696 21.7530
0.4653 0.8701 134 0.4246 68.3871 21.4866
0.4766 0.8766 135 0.4234 69.4931 21.9928
0.5131 0.8831 136 0.4214 69.4009 20.8739
0.3936 0.8896 137 0.4195 69.8618 21.5532
0.4418 0.8961 138 0.4189 69.9539 21.4200
0.4406 0.9026 139 0.4188 70.5991 21.4067
0.3582 0.9091 140 0.4188 72.5346 21.4999
0.4034 0.9156 141 0.4186 72.1659 21.3667
0.4097 0.9221 142 0.4186 72.2581 21.3134
0.3894 0.9286 143 0.4187 72.3502 20.3543
0.49 0.9351 144 0.4197 72.1659 20.3543
0.4507 0.9416 145 0.4201 72.0737 20.2877
0.3817 0.9481 146 0.4202 71.4286 20.3277
0.5105 0.9545 147 0.4204 70.5991 20.3410
0.4041 0.9610 148 0.4198 69.0323 20.0480
0.3739 0.9675 149 0.4193 69.3088 20.0480
0.4634 0.9740 150 0.4177 69.2166 19.8748
0.3764 0.9805 151 0.4171 70.1382 20.2078
0.3917 0.9870 152 0.4167 67.0968 18.8890
0.4969 0.9935 153 0.4172 68.5714 19.2620
0.4567 1.0 154 0.4172 69.2166 19.4885
0.273 1.0065 155 0.4170 69.4931 19.7016
0.416 1.0130 156 0.4160 68.9401 19.5551
0.3148 1.0195 157 0.4155 71.7972 21.0470
0.3024 1.0260 158 0.4149 71.6129 21.1403
0.3591 1.0325 159 0.4156 70.5069 20.7007
0.3166 1.0390 160 0.4164 70.6912 20.9138
0.3121 1.0455 161 0.4182 71.9816 21.0071
0.2978 1.0519 162 0.4211 69.2166 19.5684
0.3342 1.0584 163 0.4240 70.6912 19.7682
0.3177 1.0649 164 0.4248 70.8756 19.7016
0.3586 1.0714 165 0.4238 69.6774 19.4485
0.3208 1.0779 166 0.4215 69.4009 19.3553
0.3007 1.0844 167 0.4192 68.2949 18.9690
0.2494 1.0909 168 0.4172 67.2811 18.8224
0.2998 1.0974 169 0.4169 67.2811 18.7558
0.2697 1.1039 170 0.4173 68.0184 18.8624
0.3422 1.1104 171 0.4179 68.4793 19.0889
0.2707 1.1169 172 0.4194 69.5853 19.3686
0.3064 1.1234 173 0.4200 69.8618 19.1954
0.3519 1.1299 174 0.4203 69.8618 19.3419
0.3234 1.1364 175 0.4201 69.5853 19.2487
0.3176 1.1429 176 0.4211 69.3088 19.3020
0.3265 1.1494 177 0.4227 69.4009 19.2753
0.2943 1.1558 178 0.4237 71.0599 19.5551
0.3052 1.1623 179 0.4243 71.9816 19.7016
0.3163 1.1688 180 0.4243 71.7051 19.7149
0.2923 1.1753 181 0.4247 71.9816 19.4752
0.2832 1.1818 182 0.4242 72.8111 19.6883
0.307 1.1883 183 0.4224 71.9816 19.4485
0.3098 1.1948 184 0.4212 71.5207 19.1555
0.3558 1.2013 185 0.4186 69.8618 18.9556
0.3017 1.2078 186 0.4171 69.2166 18.7292
0.2834 1.2143 187 0.4164 67.8341 18.6359
0.2438 1.2208 188 0.4159 67.0968 18.5827
0.2405 1.2273 189 0.4167 68.1106 18.8091
0.2683 1.2338 190 0.4175 67.6498 18.7558
0.2909 1.2403 191 0.4185 68.1106 18.7292
0.2822 1.2468 192 0.4193 68.0184 18.8491
0.3259 1.2532 193 0.4203 68.6636 19.1022
0.2995 1.2597 194 0.4209 69.3088 19.2354
0.3331 1.2662 195 0.4213 69.7696 19.1421
0.2877 1.2727 196 0.4218 69.3088 19.0089
0.3275 1.2792 197 0.4221 69.8618 19.2487
0.2956 1.2857 198 0.4224 69.5853 19.1288
0.2977 1.2922 199 0.4224 70.3226 18.9956
0.2808 1.2987 200 0.4228 71.5207 19.2753
0.3239 1.3052 201 0.4226 71.9816 19.3419
0.2837 1.3117 202 0.4217 72.0737 19.4086
0.3101 1.3182 203 0.4210 71.8894 19.4086
0.3464 1.3247 204 0.4196 70.9677 19.3686
0.301 1.3312 205 0.4192 70.7834 19.2487
0.3273 1.3377 206 0.4188 70.6912 19.3419
0.4202 1.3442 207 0.4182 71.1521 19.4219
0.3252 1.3506 208 0.4179 74.8387 20.8339
0.2569 1.3571 209 0.4191 75.3917 21.0071
0.3286 1.3636 210 0.4203 76.3134 20.8872
0.2648 1.3701 211 0.4215 77.1429 20.8872
0.3037 1.3766 212 0.4209 76.5899 20.8739
0.3362 1.3831 213 0.4201 76.4977 20.7806
0.299 1.3896 214 0.4193 76.6820 20.9671
0.2946 1.3961 215 0.4184 75.8525 20.8872
0.3 1.4026 216 0.4174 75.6682 20.8339
0.2586 1.4091 217 0.4169 74.9309 20.8072
0.3186 1.4156 218 0.4162 74.7465 20.7540
0.3404 1.4221 219 0.4154 74.1935 20.8739
0.2611 1.4286 220 0.4157 74.2857 20.7939
0.3337 1.4351 221 0.4160 74.3779 20.7540
0.3216 1.4416 222 0.4163 74.4700 20.7406
0.2883 1.4481 223 0.4160 74.8387 20.8739
0.3457 1.4545 224 0.4158 74.6544 20.9271
0.3438 1.4610 225 0.4155 74.7465 20.8472
0.3488 1.4675 226 0.4156 71.2442 19.3286
0.255 1.4740 227 0.4153 75.1152 20.9538
0.3896 1.4805 228 0.4155 75.8525 20.9671
0.288 1.4870 229 0.4156 76.3134 21.1269
0.2658 1.4935 230 0.4156 76.4977 21.0204
0.2739 1.5 231 0.4157 76.4977 20.9937
0.3124 1.5065 232 0.4153 72.6267 19.4752
0.2487 1.5130 233 0.4148 71.7972 19.3286
0.3273 1.5195 234 0.4147 71.6129 19.4752
0.227 1.5260 235 0.4143 71.2442 19.4885
0.2803 1.5325 236 0.4143 71.6129 19.5684
0.3627 1.5390 237 0.4145 71.5207 19.4752
0.276 1.5455 238 0.4142 71.7051 19.4352
0.342 1.5519 239 0.4139 72.0737 19.4219
0.3129 1.5584 240 0.4141 71.9816 19.3952
0.3007 1.5649 241 0.4137 71.9816 19.4219
0.2956 1.5714 242 0.4134 71.8894 19.3686
0.2229 1.5779 243 0.4132 71.6129 19.3553
0.2811 1.5844 244 0.4130 71.3364 19.3419
0.2827 1.5909 245 0.4122 70.9677 19.3153
0.3395 1.5974 246 0.4119 70.3226 19.1155
0.3588 1.6039 247 0.4115 70.2304 19.1954
0.3191 1.6104 248 0.4112 70.0461 19.1421
0.3598 1.6169 249 0.4110 69.7696 19.1555
0.3696 1.6234 250 0.4106 69.5853 19.1288
0.3208 1.6299 251 0.4103 69.3088 19.0889
0.2768 1.6364 252 0.4100 69.0323 18.9690
0.3166 1.6429 253 0.4099 69.7696 19.0489
0.2571 1.6494 254 0.4098 69.8618 19.0755
0.2733 1.6558 255 0.4096 69.3088 18.9956
0.2891 1.6623 256 0.4102 69.8618 19.1022
0.339 1.6688 257 0.4096 69.8618 19.2487
0.3159 1.6753 258 0.4101 70.5069 19.2087
0.2661 1.6818 259 0.4102 70.5991 19.2753
0.3232 1.6883 260 0.4108 71.0599 19.3286
0.3397 1.6948 261 0.4105 71.3364 19.2221
0.2708 1.7013 262 0.4109 71.4286 19.2620
0.2449 1.7078 263 0.4105 71.1521 19.3286
0.3193 1.7143 264 0.4107 71.0599 19.1954
0.2917 1.7208 265 0.4106 70.8756 19.0356
0.2583 1.7273 266 0.4103 71.3364 19.1421
0.3747 1.7338 267 0.4104 71.6129 19.1688
0.2289 1.7403 268 0.4104 71.3364 19.2753
0.2726 1.7468 269 0.4102 71.1521 19.2487
0.2622 1.7532 270 0.4102 70.5991 19.0222
0.2843 1.7597 271 0.4101 70.7834 19.2354
0.3014 1.7662 272 0.4101 70.5991 19.2620
0.2834 1.7727 273 0.4103 70.9677 19.2221
0.3183 1.7792 274 0.4103 70.5069 19.1555
0.3497 1.7857 275 0.4103 70.4147 19.0222
0.3025 1.7922 276 0.4103 70.4147 18.7825
0.2744 1.7987 277 0.4102 71.0599 18.9423
0.3358 1.8052 278 0.4102 70.6912 18.8358
0.3587 1.8117 279 0.4103 70.4147 18.7159
0.3066 1.8182 280 0.4103 70.5991 18.7825
0.3491 1.8247 281 0.4105 70.5991 18.8091
0.236 1.8312 282 0.4102 70.7834 18.7292
0.2849 1.8377 283 0.4103 70.9677 18.8624
0.3358 1.8442 284 0.4106 71.0599 18.7825
0.2355 1.8506 285 0.4104 70.1382 18.7691
0.2658 1.8571 286 0.4104 70.3226 18.7025
0.2698 1.8636 287 0.4101 71.0599 18.8091
0.2966 1.8701 288 0.4105 70.2304 18.7025
0.3557 1.8766 289 0.4104 70.9677 18.7558
0.2948 1.8831 290 0.4104 70.7834 18.8757
0.3971 1.8896 291 0.4109 70.8756 18.9290
0.3305 1.8961 292 0.4107 70.5991 18.7958
0.3375 1.9026 293 0.4106 71.1521 19.0356
0.2514 1.9091 294 0.4107 71.1521 18.9556
0.3308 1.9156 295 0.4104 70.8756 18.7425
0.2738 1.9221 296 0.4107 71.3364 19.0089
0.2567 1.9286 297 0.4103 71.3364 18.9956
0.3462 1.9351 298 0.4106 71.0599 18.9024
0.326 1.9416 299 0.4104 71.3364 19.0356
0.3063 1.9481 300 0.4107 71.4286 18.9823
0.2901 1.9545 301 0.4104 70.7834 18.9290
0.3204 1.9610 302 0.4107 70.9677 18.9290
0.353 1.9675 303 0.4102 71.5207 18.9823
0.366 1.9740 304 0.4105 71.4286 18.9956
0.2828 1.9805 305 0.4107 71.0599 19.0089
0.2655 1.9870 306 0.4105 71.0599 18.9956
0.3037 1.9935 307 0.4107 70.5991 18.9157
0.3579 2.0 308 0.4106 71.7972 19.1421
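The reported evaluation loss (0.4106) corresponds to the final step, but the log's minimum validation loss (0.4096, step 255) is slightly lower. A small sketch of scanning the log to pick a best checkpoint, using a few rows copied verbatim from the table above:

```python
# (step, validation_loss, wer, cer) — rows copied from the training results table
rows = [
    (250, 0.4106, 69.5853, 19.1288),
    (255, 0.4096, 69.3088, 18.9956),
    (282, 0.4102, 70.7834, 18.7292),
    (308, 0.4106, 71.7972, 19.1421),  # final step; matches the reported metrics
]

best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
best_by_wer = min(rows, key=lambda r: r[2])   # lowest word error rate

print(best_by_loss)  # step 255 has the lowest validation loss in this subset
print(best_by_wer)
```

For ASR, selecting by WER (or CER) rather than loss is often the more meaningful criterion, since loss and error rate do not always move together, as the log above shows.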

Framework versions

  • Transformers 4.57.3
  • PyTorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1

Model tree for nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_8e-6
