Whisper Large v3 - Japanese Zatoichi ASR

This model is a fine-tuned version of openai/whisper-large-v3 (the training dataset is not named in this auto-generated card). It achieves the following results on the evaluation set:

  • Loss: 0.3894
  • WER: 69.0323
  • CER: 18.4095
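WER and CER are both normalized edit-distance metrics, computed over words and characters respectively, and reported here as percentages. The card does not publish its evaluation code; the following is a minimal pure-Python illustration of how the two metrics relate. Note that WER can exceed 100% when the hypothesis contains more insertions than the reference has words, which explains rows such as the 103.04 WER early in the training table.

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (single-row DP)."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,                          # deletion
                        dp[j - 1] + 1,                      # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]

def wer(ref, hyp):
    """Word error rate in percent (tokens split on whitespace)."""
    r = ref.split()
    return 100.0 * edit_distance(r, hyp.split()) / len(r)

def cer(ref, hyp):
    """Character error rate in percent."""
    return 100.0 * edit_distance(ref, hyp) / len(ref)
```

For Japanese ASR, CER is usually the more informative of the two, since word boundaries depend on the tokenizer used before scoring.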

Model description

More information needed

Intended uses & limitations

More information needed
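The card leaves this section blank. As a sketch, the checkpoint should load like any Whisper fine-tune through the transformers ASR pipeline; the repo id is taken from the model tree at the bottom of this card, while the audio filename and the language/task generation kwargs are illustrative assumptions for Japanese transcription.

```python
# Hypothetical usage sketch -- not taken from the model card itself.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_2.4e-5",
)

# Whisper checkpoints accept language/task hints via generate_kwargs.
result = asr(
    "sample.wav",  # placeholder path
    generate_kwargs={"language": "japanese", "task": "transcribe"},
)
print(result["text"])
```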

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2.4e-05
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused (betas=(0.9, 0.999), epsilon=1e-08; no additional optimizer arguments)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
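The cosine schedule with warmup_ratio 0.1 means the learning rate ramps linearly to 2.4e-05 over the first 10% of the 308 total steps (2 epochs × 154 steps, per the table below) and then decays to zero along a half-cosine. A minimal sketch of that shape (the shape of the Hugging Face scheduler, not its exact implementation):

```python
import math

def cosine_lr(step, total_steps, base_lr=2.4e-5, warmup_ratio=0.1):
    """Linear warmup to base_lr, then cosine decay to zero."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

With total_steps=308 this peaks at step 30 and reaches zero at step 308, matching the steady loss plateau visible at the end of the results table.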

Training results

Training Loss | Epoch | Step | Validation Loss | WER (%) | CER (%)
1.2387 0.0065 1 1.2545 98.1567 40.8419
1.2196 0.0130 2 0.9949 96.2212 38.3242
1.0999 0.0195 3 0.9722 94.2857 37.2985
0.917 0.0260 4 0.9326 91.6129 36.0330
0.9664 0.0325 5 0.8996 87.0968 33.8218
0.9458 0.0390 6 0.8372 85.3456 29.6390
1.0439 0.0455 7 0.7956 85.4378 28.5867
0.9201 0.0519 8 0.7551 98.8018 32.7161
0.8404 0.0584 9 0.7302 83.4101 30.7313
0.7847 0.0649 10 0.7001 86.3594 32.0767
0.7349 0.0714 11 0.6688 103.0415 36.3927
0.8178 0.0779 12 0.6441 98.1567 31.4906
0.6819 0.0844 13 0.6250 84.6083 27.9739
0.6407 0.0909 14 0.6106 75.3917 24.8035
0.6267 0.0974 15 0.5967 79.6313 26.4420
0.6877 0.1039 16 0.5887 89.5853 28.7465
0.5978 0.1104 17 0.5800 91.9816 29.5324
0.4972 0.1169 18 0.5704 84.8848 26.2955
0.6828 0.1234 19 0.5726 78.2488 24.9900
0.5684 0.1299 20 0.5643 75.4839 24.6303
0.5249 0.1364 21 0.5525 70.1382 22.1793
0.5908 0.1429 22 0.5486 80.9217 27.0148
0.6832 0.1494 23 0.5519 82.2120 28.1604
0.5725 0.1558 24 0.5478 89.0323 30.7180
0.6327 0.1623 25 0.5463 87.2811 26.8416
0.5418 0.1688 26 0.5464 86.9124 27.2146
0.5762 0.1753 27 0.5423 80.8295 25.1898
0.5671 0.1818 28 0.5384 82.1198 25.7893
0.6287 0.1883 29 0.5312 75.9447 23.6979
0.5341 0.1948 30 0.5309 84.7005 25.2298
0.5044 0.2013 31 0.5362 81.0138 25.7759
0.6614 0.2078 32 0.5325 76.6820 25.1765
0.5255 0.2143 33 0.5258 76.9585 24.6170
0.5149 0.2208 34 0.5204 74.9309 22.2459
0.4872 0.2273 35 0.5219 77.6037 22.7121
0.6584 0.2338 36 0.5193 74.0092 22.5523
0.5073 0.2403 37 0.5167 72.3502 22.1926
0.4776 0.2468 38 0.5179 71.2442 22.0994
0.5958 0.2532 39 0.5136 71.1521 21.9928
0.5184 0.2597 40 0.5107 71.4286 22.0861
0.5817 0.2662 41 0.5114 74.1014 22.4324
0.6451 0.2727 42 0.5116 76.0369 22.2193
0.5374 0.2792 43 0.5037 72.4424 21.7530
0.563 0.2857 44 0.5067 73.0876 22.1660
0.4671 0.2922 45 0.5164 72.8111 22.1660
0.5028 0.2987 46 0.5049 73.2719 22.1926
0.6303 0.3052 47 0.4997 73.2719 22.3525
0.5647 0.3117 48 0.5008 74.8387 22.5256
0.5822 0.3182 49 0.4978 75.2995 22.7787
0.5548 0.3247 50 0.4951 85.9908 26.6684
0.4075 0.3312 51 0.4930 83.1336 25.9891
0.4862 0.3377 52 0.4892 80.9217 25.7493
0.4765 0.3442 53 0.4918 75.4839 23.7112
0.6034 0.3506 54 0.4908 69.9539 22.2592
0.5139 0.3571 55 0.4849 70.1382 21.9662
0.4526 0.3636 56 0.4822 78.1567 23.9510
0.4726 0.3701 57 0.4823 78.9862 24.8701
0.5949 0.3766 58 0.4842 75.8525 23.7645
0.4697 0.3831 59 0.4884 72.4424 21.7131
0.4943 0.3896 60 0.4870 76.2212 24.3240
0.4068 0.3961 61 0.4852 69.8618 20.3011
0.4813 0.4026 62 0.4826 68.2028 20.2478
0.4826 0.4091 63 0.4799 68.3871 20.2211
0.5617 0.4156 64 0.4731 68.0184 20.1545
0.4195 0.4221 65 0.4729 74.6544 21.4067
0.4733 0.4286 66 0.4855 88.5714 25.1632
0.5419 0.4351 67 0.4833 78.2488 22.4324
0.5919 0.4416 68 0.4715 82.7650 25.9358
0.484 0.4481 69 0.4675 73.9171 23.0052
0.5842 0.4545 70 0.4706 74.2857 23.3249
0.4245 0.4610 71 0.4707 77.6959 24.7369
0.4932 0.4675 72 0.4701 73.7327 22.1527
0.4514 0.4740 73 0.4708 76.2212 22.2592
0.492 0.4805 74 0.4660 73.9171 20.8072
0.4631 0.4870 75 0.4602 72.2581 20.9005
0.4525 0.4935 76 0.4601 75.1152 22.6988
0.5744 0.5 77 0.4590 72.9954 22.1127
0.4787 0.5065 78 0.4533 72.6267 21.8196
0.3931 0.5130 79 0.4550 74.5622 21.5665
0.4718 0.5195 80 0.4583 76.9585 21.8196
0.4215 0.5260 81 0.4559 73.6406 22.4191
0.4475 0.5325 82 0.4505 70.6912 21.9662
0.4644 0.5390 83 0.4526 66.4516 19.7416
0.4653 0.5455 84 0.4573 68.8479 20.5009
0.4613 0.5519 85 0.4614 78.3410 23.6979
0.5046 0.5584 86 0.4616 80.7373 23.8577
0.4898 0.5649 87 0.4606 86.3594 25.2165
0.5739 0.5714 88 0.4554 75.5760 21.4466
0.4596 0.5779 89 0.4495 69.7696 20.7673
0.5215 0.5844 90 0.4484 68.3871 21.0470
0.5359 0.5909 91 0.4477 68.0184 20.6474
0.5924 0.5974 92 0.4443 68.2028 20.3677
0.5832 0.6039 93 0.4403 68.0184 20.0346
0.4084 0.6104 94 0.4437 70.3226 20.3011
0.4915 0.6169 95 0.4527 75.3917 20.9271
0.4793 0.6234 96 0.4557 76.4055 21.2468
0.5271 0.6299 97 0.4475 74.1014 21.0870
0.5148 0.6364 98 0.4469 71.6129 20.4875
0.4026 0.6429 99 0.4500 71.4286 21.9928
0.3352 0.6494 100 0.4476 71.3364 21.3134
0.5475 0.6558 101 0.4450 72.2581 24.1242
0.4744 0.6623 102 0.4416 70.7834 21.9662
0.4517 0.6688 103 0.4420 76.5899 23.4315
0.4324 0.6753 104 0.4425 75.3917 22.8453
0.5335 0.6818 105 0.4428 74.6544 21.4600
0.522 0.6883 106 0.4424 73.5484 22.1793
0.4851 0.6948 107 0.4394 69.0323 20.5541
0.4433 0.7013 108 0.4363 68.5714 20.6074
0.4838 0.7078 109 0.4347 67.8341 20.3410
0.4939 0.7143 110 0.4326 69.0323 20.5408
0.4069 0.7208 111 0.4332 72.5346 20.5142
0.4762 0.7273 112 0.4363 75.8525 21.1003
0.4844 0.7338 113 0.4377 76.4977 20.5541
0.6469 0.7403 114 0.4379 76.4977 20.4742
0.4218 0.7468 115 0.4345 74.0092 20.4742
0.4011 0.7532 116 0.4300 73.9171 22.0727
0.4562 0.7597 117 0.4306 67.3733 19.9414
0.3829 0.7662 118 0.4325 67.2811 20.0346
0.5716 0.7727 119 0.4311 67.5576 19.9814
0.4747 0.7792 120 0.4292 68.9401 19.9814
0.4593 0.7857 121 0.4299 69.2166 19.6084
0.4635 0.7922 122 0.4321 70.5069 19.7815
0.378 0.7987 123 0.4306 69.4009 19.1421
0.4589 0.8052 124 0.4276 68.2949 19.1688
0.4257 0.8117 125 0.4264 73.8249 21.0071
0.4149 0.8182 126 0.4298 74.4700 19.7949
0.4435 0.8247 127 0.4264 77.8802 21.3134
0.6128 0.8312 128 0.4219 75.9447 21.3667
0.4661 0.8377 129 0.4168 72.4424 20.4609
0.4396 0.8442 130 0.4168 72.9954 20.7540
0.4563 0.8506 131 0.4160 72.6267 20.7673
0.4169 0.8571 132 0.4155 70.5069 20.3943
0.4689 0.8636 133 0.4175 65.0691 18.7691
0.4474 0.8701 134 0.4212 66.8203 19.1555
0.4877 0.8766 135 0.4228 68.4793 19.4086
0.5343 0.8831 136 0.4229 73.2719 19.8481
0.4016 0.8896 137 0.4218 73.1797 19.9680
0.4334 0.8961 138 0.4198 67.5576 19.4219
0.4422 0.9026 139 0.4186 66.1751 18.7958
0.334 0.9091 140 0.4193 66.6359 18.9690
0.4015 0.9156 141 0.4193 65.4378 18.5427
0.4146 0.9221 142 0.4193 65.7143 18.4894
0.3879 0.9286 143 0.4208 69.3088 18.8624
0.5038 0.9351 144 0.4249 70.6912 18.8890
0.4647 0.9416 145 0.4272 71.6129 19.0356
0.3795 0.9481 146 0.4257 68.5714 18.4628
0.5047 0.9545 147 0.4218 71.6129 20.2478
0.4029 0.9610 148 0.4181 64.0553 18.0232
0.3595 0.9675 149 0.4165 69.0323 20.0080
0.4575 0.9740 150 0.4158 69.4931 20.3543
0.3729 0.9805 151 0.4174 71.6129 21.0337
0.3784 0.9870 152 0.4178 71.6129 20.8072
0.5115 0.9935 153 0.4175 72.1659 21.0603
0.4417 1.0 154 0.4177 69.8618 19.8082
0.2339 1.0065 155 0.4158 69.0323 19.8748
0.3468 1.0130 156 0.4140 67.9263 19.5684
0.2693 1.0195 157 0.4134 67.1889 19.1555
0.2382 1.0260 158 0.4135 66.1751 18.8091
0.3079 1.0325 159 0.4148 66.7281 18.6359
0.2571 1.0390 160 0.4157 66.8203 18.6093
0.2553 1.0455 161 0.4171 69.9539 18.9556
0.2562 1.0519 162 0.4228 71.8894 19.0356
0.2901 1.0584 163 0.4245 73.0876 19.1155
0.2768 1.0649 164 0.4208 70.5991 18.9823
0.3112 1.0714 165 0.4156 66.8203 18.7292
0.2671 1.0779 166 0.4151 64.6083 18.7292
0.2475 1.0844 167 0.4135 64.9770 18.9690
0.2045 1.0909 168 0.4108 64.7005 18.8091
0.2462 1.0974 169 0.4100 67.4654 18.7825
0.2224 1.1039 170 0.4162 69.6774 18.8624
0.2923 1.1104 171 0.4236 76.6820 20.5275
0.2349 1.1169 172 0.4230 78.8940 22.0328
0.2644 1.1234 173 0.4189 77.7880 24.2707
0.2912 1.1299 174 0.4160 68.0184 19.0622
0.2695 1.1364 175 0.4161 65.5300 19.0489
0.2617 1.1429 176 0.4155 64.7926 19.1155
0.2664 1.1494 177 0.4122 63.6866 18.5693
0.295 1.1558 178 0.4092 65.5300 18.8091
0.2517 1.1623 179 0.4093 68.1106 18.8358
0.2524 1.1688 180 0.4146 72.9032 19.1022
0.2438 1.1753 181 0.4194 74.9309 19.7016
0.2314 1.1818 182 0.4173 75.0230 19.7949
0.2659 1.1883 183 0.4109 71.7972 19.5418
0.252 1.1948 184 0.4065 69.6774 19.3553
0.301 1.2013 185 0.4039 66.4516 19.1555
0.2549 1.2078 186 0.4035 65.9908 18.9823
0.231 1.2143 187 0.4041 66.7281 19.0489
0.1975 1.2208 188 0.4056 67.4654 19.0222
0.1987 1.2273 189 0.4100 71.2442 19.1821
0.2181 1.2338 190 0.4144 75.1152 19.8215
0.2375 1.2403 191 0.4161 76.2212 19.7149
0.2347 1.2468 192 0.4151 75.2074 19.5284
0.2722 1.2532 193 0.4121 71.6129 19.3153
0.2308 1.2597 194 0.4101 68.2949 18.7425
0.2759 1.2662 195 0.4095 71.7051 20.4875
0.2403 1.2727 196 0.4095 71.0599 20.3677
0.2694 1.2792 197 0.4087 71.5207 20.2478
0.2471 1.2857 198 0.4095 66.8203 18.2896
0.2368 1.2922 199 0.4123 74.4700 20.7406
0.2345 1.2987 200 0.4151 78.2488 21.1669
0.2647 1.3052 201 0.4173 80.2765 21.2735
0.2274 1.3117 202 0.4138 76.6820 20.8339
0.2464 1.3182 203 0.4094 73.2719 20.6607
0.2787 1.3247 204 0.4074 71.5207 20.5941
0.249 1.3312 205 0.4085 70.0461 20.5675
0.2703 1.3377 206 0.4085 70.3226 20.4742
0.3381 1.3442 207 0.4071 71.2442 20.5541
0.2611 1.3506 208 0.4063 73.1797 20.6341
0.2123 1.3571 209 0.4081 76.6820 20.9538
0.2653 1.3636 210 0.4098 79.5392 21.2602
0.2206 1.3701 211 0.4097 81.4747 21.3001
0.2473 1.3766 212 0.4067 79.1705 21.1269
0.2713 1.3831 213 0.4038 78.2488 21.1269
0.2417 1.3896 214 0.3999 69.7696 18.8358
0.2303 1.3961 215 0.3975 67.4654 18.6892
0.2206 1.4026 216 0.3968 64.7926 18.2496
0.1942 1.4091 217 0.3967 64.6083 18.0631
0.2484 1.4156 218 0.3971 64.8848 18.0898
0.2646 1.4221 219 0.3979 64.7926 18.1964
0.2198 1.4286 220 0.3995 65.7143 18.3162
0.2733 1.4351 221 0.4006 66.0829 18.0765
0.2666 1.4416 222 0.4011 67.2811 18.1697
0.2307 1.4481 223 0.4017 68.2028 18.2630
0.2565 1.4545 224 0.4018 69.0323 18.3029
0.2753 1.4610 225 0.4012 69.1244 18.3828
0.2845 1.4675 226 0.4013 69.5853 18.4095
0.1945 1.4740 227 0.4014 69.8618 18.3562
0.3045 1.4805 228 0.4013 70.3226 18.3429
0.2356 1.4870 229 0.3994 69.4009 18.1697
0.2188 1.4935 230 0.3974 70.0461 18.1431
0.2203 1.5 231 0.3956 69.2166 18.3296
0.2329 1.5065 232 0.3947 67.7419 18.1830
0.2015 1.5130 233 0.3944 67.4654 18.1164
0.2595 1.5195 234 0.3946 66.7281 18.0232
0.1935 1.5260 235 0.3941 67.6498 18.4494
0.2267 1.5325 236 0.3937 66.5438 17.9166
0.2818 1.5390 237 0.3933 66.5438 17.9299
0.2207 1.5455 238 0.3920 66.8203 17.8766
0.2591 1.5519 239 0.3920 66.3594 17.7035
0.2438 1.5584 240 0.3914 67.1889 17.8234
0.2242 1.5649 241 0.3915 66.9124 17.7834
0.2363 1.5714 242 0.3911 67.2811 17.7967
0.1735 1.5779 243 0.3906 67.0968 17.8900
0.2217 1.5844 244 0.3901 67.0046 17.8234
0.2197 1.5909 245 0.3892 66.1751 17.7434
0.2624 1.5974 246 0.3890 66.1751 17.8234
0.3023 1.6039 247 0.3883 66.1751 17.9433
0.2467 1.6104 248 0.3886 66.4516 17.9299
0.2801 1.6169 249 0.3883 66.4516 17.9433
0.2998 1.6234 250 0.3886 66.8203 18.0232
0.2552 1.6299 251 0.3890 66.9124 18.0232
0.2101 1.6364 252 0.3891 67.8341 18.3828
0.2514 1.6429 253 0.3892 67.5576 18.2763
0.1999 1.6494 254 0.3898 68.0184 18.1431
0.2184 1.6558 255 0.3903 67.7419 18.1697
0.2258 1.6623 256 0.3910 68.2028 18.2363
0.259 1.6688 257 0.3916 68.8479 18.1297
0.2553 1.6753 258 0.3924 70.1382 18.1031
0.2074 1.6818 259 0.3936 71.3364 18.3162
0.2533 1.6883 260 0.3938 71.6129 18.1830
0.2702 1.6948 261 0.3935 71.4286 18.1564
0.1993 1.7013 262 0.3933 70.5991 18.0765
0.1765 1.7078 263 0.3926 69.5853 18.0099
0.251 1.7143 264 0.3920 69.1244 18.0765
0.229 1.7208 265 0.3910 67.5576 17.7967
0.2004 1.7273 266 0.3903 67.3733 17.8100
0.2975 1.7338 267 0.3900 67.0046 17.8367
0.1764 1.7403 268 0.3896 66.1751 17.8633
0.2036 1.7468 269 0.3889 65.9908 17.8900
0.2055 1.7532 270 0.3891 65.9908 17.8633
0.2273 1.7597 271 0.3888 66.0829 17.9166
0.2209 1.7662 272 0.3887 66.4516 18.0365
0.2217 1.7727 273 0.3892 66.4516 17.9433
0.2509 1.7792 274 0.3891 66.1751 17.9965
0.2821 1.7857 275 0.3894 66.5438 17.8766
0.2238 1.7922 276 0.3895 65.9908 17.7434
0.2044 1.7987 277 0.3894 67.0968 17.9699
0.2589 1.8052 278 0.3896 67.0046 18.1031
0.2888 1.8117 279 0.3895 66.9124 17.9832
0.2328 1.8182 280 0.3898 67.8341 18.1697
0.2875 1.8247 281 0.3900 67.8341 18.2630
0.1767 1.8312 282 0.3900 68.1106 18.2896
0.222 1.8377 283 0.3901 68.1106 18.3296
0.2523 1.8442 284 0.3901 68.8479 18.4228
0.1884 1.8506 285 0.3897 69.0323 18.4761
0.193 1.8571 286 0.3901 69.3088 18.4095
0.2038 1.8636 287 0.3900 68.6636 18.3162
0.2271 1.8701 288 0.3898 69.6774 18.4494
0.2782 1.8766 289 0.3896 68.9401 18.3562
0.2216 1.8831 290 0.3898 69.4931 18.3562
0.308 1.8896 291 0.3898 69.0323 18.3429
0.249 1.8961 292 0.3896 69.4009 18.3162
0.258 1.9026 293 0.3896 69.0323 18.3296
0.182 1.9091 294 0.3895 69.1244 18.4228
0.254 1.9156 295 0.3896 69.2166 18.3429
0.2042 1.9221 296 0.3897 68.7558 18.3162
0.1904 1.9286 297 0.3897 69.0323 18.4095
0.2764 1.9351 298 0.3895 69.0323 18.4228
0.2526 1.9416 299 0.3891 68.9401 18.3828
0.2262 1.9481 300 0.3893 68.9401 18.3828
0.2182 1.9545 301 0.3894 68.9401 18.3962
0.2599 1.9610 302 0.3894 68.6636 18.4095
0.2851 1.9675 303 0.3894 68.9401 18.4095
0.2883 1.9740 304 0.3894 69.0323 18.3962
0.2093 1.9805 305 0.3892 68.9401 18.4361
0.2116 1.9870 306 0.3892 68.8479 18.3962
0.2259 1.9935 307 0.3896 68.8479 18.3429
0.2872 2.0 308 0.3894 69.0323 18.4095

Framework versions

  • Transformers 4.57.3
  • Pytorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1
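To reproduce this environment, the pinned versions above can be installed along these lines (the PyTorch index URL for the cu128 build is an assumption based on the standard wheel layout):

```shell
pip install "transformers==4.57.3" "datasets==4.4.1" "tokenizers==0.22.1"
pip install "torch==2.9.1" --index-url https://download.pytorch.org/whl/cu128
```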
Model size: 2B params (F32, Safetensors)

Model tree for nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_2.4e-5

Finetuned from openai/whisper-large-v3