Whisper Large v3 - Japanese Zatoichi ASR

This model is a fine-tuned version of openai/whisper-large-v3 on a Japanese Zatoichi speech dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3953
  • WER: 69.9539%
  • CER: 18.0631%

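WER and CER above are percentages: character (or word) edit distance divided by the reference length. In practice these are usually computed with a library such as `evaluate` or `jiwer`; the sketch below is an illustrative pure-Python version (the `cer` function name is ours, not part of this model's code):

```python
# Minimal character error rate (CER) sketch: Levenshtein distance over
# characters, divided by reference length, reported as a percentage.
def cer(reference: str, hypothesis: str) -> float:
    r, h = list(reference), list(hypothesis)
    # prev[j] holds the edit distance between r[:i-1] and h[:j]
    prev = list(range(len(h) + 1))
    for i, rc in enumerate(r, 1):
        curr = [i] + [0] * len(h)
        for j, hc in enumerate(h, 1):
            cost = 0 if rc == hc else 1
            curr[j] = min(prev[j] + 1,        # deletion
                          curr[j - 1] + 1,    # insertion
                          prev[j - 1] + cost) # substitution
        prev = curr
    return 100.0 * prev[-1] / max(len(r), 1)
```

WER is the same computation with the strings split into words instead of characters; for Japanese, CER is usually the more meaningful metric since word segmentation is ambiguous.
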
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
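
The schedule implied by these hyperparameters is linear warmup over the first 10% of steps, then cosine decay to zero. The sketch below is an illustrative re-implementation of that shape (not the actual Transformers scheduler); the 308-step total is read off the training log, and the names are ours:

```python
import math

LR_MAX = 2e-5                           # learning_rate
TOTAL_STEPS = 308                       # final step in the training log
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)   # lr_scheduler_warmup_ratio = 0.1 -> 30

def lr_at(step: int) -> float:
    """Learning rate at a given optimizer step: linear warmup, cosine decay."""
    if step < WARMUP_STEPS:
        return LR_MAX * step / max(WARMUP_STEPS, 1)
    progress = (step - WARMUP_STEPS) / max(TOTAL_STEPS - WARMUP_STEPS, 1)
    return 0.5 * LR_MAX * (1.0 + math.cos(math.pi * progress))
```

So the peak rate of 2e-05 is reached around step 30 and decays smoothly to zero by step 308.
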

Training results

Training Loss  Epoch  Step  Validation Loss  WER (%)  CER (%)
1.2387 0.0065 1 1.2545 98.1567 40.8419
1.2196 0.0130 2 0.9953 96.4055 38.4974
1.1009 0.0195 3 0.9771 95.5760 37.8314
0.9222 0.0260 4 0.9509 93.0876 36.1929
0.9843 0.0325 5 0.9110 88.0184 34.7143
0.9567 0.0390 6 0.8508 85.3456 30.5315
1.0579 0.0455 7 0.8196 84.7005 29.1328
0.9463 0.0519 8 0.7744 89.0323 29.6523
0.8603 0.0584 9 0.7454 84.6083 28.1471
0.7985 0.0649 10 0.7195 87.0046 31.7304
0.7489 0.0714 11 0.6887 93.8249 34.8741
0.8338 0.0779 12 0.6629 104.3318 36.0863
0.6975 0.0844 13 0.6408 86.1751 29.5857
0.6521 0.0909 14 0.6226 84.0553 28.5067
0.6355 0.0974 15 0.6071 86.5438 29.6923
0.7003 0.1039 16 0.5948 80.3687 25.6161
0.6051 0.1104 17 0.5848 80.8295 25.5495
0.4999 0.1169 18 0.5766 79.1705 24.5504
0.6925 0.1234 19 0.5756 78.8018 24.8701
0.5695 0.1299 20 0.5697 76.3134 24.6703
0.5311 0.1364 21 0.5584 74.6544 24.1508
0.591 0.1429 22 0.5528 79.9078 25.8825
0.6893 0.1494 23 0.5536 87.6498 28.8131
0.5734 0.1558 24 0.5480 88.1106 30.1852
0.6371 0.1623 25 0.5461 89.5853 29.7589
0.5515 0.1688 26 0.5455 84.6083 26.5086
0.5823 0.1753 27 0.5421 81.0138 25.1499
0.5627 0.1818 28 0.5409 78.9862 27.4677
0.6379 0.1883 29 0.5363 77.8802 25.1898
0.5452 0.1948 30 0.5281 76.5899 22.7654
0.5031 0.2013 31 0.5302 74.9309 22.3258
0.6543 0.2078 32 0.5278 75.1152 23.3382
0.5209 0.2143 33 0.5211 75.5760 25.1765
0.5141 0.2208 34 0.5143 77.0507 24.3772
0.4857 0.2273 35 0.5139 75.9447 22.5523
0.6512 0.2338 36 0.5141 77.4194 23.0585
0.504 0.2403 37 0.5123 72.5346 22.2992
0.4688 0.2468 38 0.5134 70.3226 21.9129
0.5879 0.2532 39 0.5105 70.2304 21.7264
0.5161 0.2597 40 0.5062 70.7834 22.1260
0.5797 0.2662 41 0.5055 72.3502 21.6598
0.6364 0.2727 42 0.5084 86.2673 26.3088
0.5402 0.2792 43 0.5014 76.6820 22.2059
0.557 0.2857 44 0.5039 75.5760 22.0727
0.469 0.2922 45 0.5161 73.0876 21.7397
0.4999 0.2987 46 0.5093 72.8111 21.9928
0.6358 0.3052 47 0.5014 71.7972 21.7930
0.5594 0.3117 48 0.4986 72.1659 21.9129
0.5698 0.3182 49 0.4993 74.3779 22.2725
0.5489 0.3247 50 0.5009 77.6037 22.5656
0.4195 0.3312 51 0.4949 78.2488 23.2317
0.4914 0.3377 52 0.4896 72.3502 21.6065
0.4754 0.3442 53 0.4910 75.4839 25.3630
0.598 0.3506 54 0.4904 73.7327 25.0167
0.4985 0.3571 55 0.4890 76.2212 25.3097
0.4503 0.3636 56 0.4838 76.4977 25.1499
0.4732 0.3701 57 0.4820 86.2673 28.4002
0.5868 0.3766 58 0.4793 85.5300 27.1880
0.4594 0.3831 59 0.4793 85.4378 27.1080
0.4759 0.3896 60 0.4792 81.6590 24.2174
0.3971 0.3961 61 0.4783 69.3088 19.9014
0.4702 0.4026 62 0.4753 67.1889 20.0080
0.468 0.4091 63 0.4726 67.9263 19.9547
0.5522 0.4156 64 0.4689 69.3088 20.1412
0.4129 0.4221 65 0.4697 71.2442 20.7007
0.4669 0.4286 66 0.4781 78.7097 21.3134
0.5413 0.4351 67 0.4759 79.3548 21.5133
0.5776 0.4416 68 0.4664 74.1935 21.0337
0.4765 0.4481 69 0.4612 70.4147 20.5009
0.5798 0.4545 70 0.4627 73.0876 23.2183
0.4214 0.4610 71 0.4625 68.5714 21.4733
0.4872 0.4675 72 0.4612 73.8249 22.9519
0.437 0.4740 73 0.4636 77.4194 23.6446
0.4775 0.4805 74 0.4622 77.8802 23.4448
0.4568 0.4870 75 0.4574 77.6959 23.3116
0.4524 0.4935 76 0.4556 74.3779 21.8996
0.5608 0.5 77 0.4563 74.0092 21.6997
0.4643 0.5065 78 0.4524 74.4700 21.7664
0.3893 0.5130 79 0.4504 72.6267 20.0879
0.4727 0.5195 80 0.4491 71.2442 20.1012
0.4159 0.5260 81 0.4507 70.9677 20.4875
0.4328 0.5325 82 0.4500 69.0323 20.2877
0.4691 0.5390 83 0.4494 65.9908 19.9547
0.4659 0.5455 84 0.4518 66.5438 20.1812
0.4581 0.5519 85 0.4545 69.6774 20.6208
0.4928 0.5584 86 0.4549 75.2074 23.8311
0.4833 0.5649 87 0.4546 81.1060 25.5228
0.5622 0.5714 88 0.4525 82.2120 25.4696
0.4607 0.5779 89 0.4492 73.7327 22.4857
0.5183 0.5844 90 0.4470 68.4793 20.2478
0.5285 0.5909 91 0.4461 68.1106 20.4476
0.5822 0.5974 92 0.4436 67.0968 19.9147
0.5832 0.6039 93 0.4415 67.9263 20.1012
0.4046 0.6104 94 0.4409 68.7558 19.9147
0.4904 0.6169 95 0.4467 70.3226 20.2211
0.4778 0.6234 96 0.4515 72.4424 20.4609
0.5281 0.6299 97 0.4479 76.7742 21.0603
0.5173 0.6364 98 0.4477 76.9585 21.5399
0.4006 0.6429 99 0.4492 75.9447 21.0337
0.3334 0.6494 100 0.4458 71.8894 20.4875
0.5472 0.6558 101 0.4434 67.9263 20.9138
0.4744 0.6623 102 0.4419 67.5576 21.1403
0.4501 0.6688 103 0.4415 69.1244 21.9262
0.4172 0.6753 104 0.4411 72.7189 22.5922
0.5283 0.6818 105 0.4404 70.2304 20.8739
0.5155 0.6883 106 0.4408 72.5346 21.6065
0.475 0.6948 107 0.4393 72.7189 21.5799
0.4383 0.7013 108 0.4361 69.0323 20.3410
0.4856 0.7078 109 0.4333 68.8479 19.8215
0.485 0.7143 110 0.4301 67.8341 19.3020
0.4066 0.7208 111 0.4290 68.8479 19.6883
0.4786 0.7273 112 0.4296 71.0599 20.6607
0.4717 0.7338 113 0.4304 72.2581 19.8748
0.6404 0.7403 114 0.4309 71.7051 21.3534
0.4155 0.7468 115 0.4284 70.5991 21.3134
0.3982 0.7532 116 0.4234 67.2811 20.9138
0.4518 0.7597 117 0.4238 66.6359 19.4885
0.3797 0.7662 118 0.4256 66.6359 19.7283
0.5587 0.7727 119 0.4245 65.5300 19.6084
0.47 0.7792 120 0.4225 66.3594 19.7815
0.4518 0.7857 121 0.4243 68.5714 19.5684
0.453 0.7922 122 0.4276 70.4147 19.3952
0.3714 0.7987 123 0.4284 69.8618 19.4885
0.4549 0.8052 124 0.4258 69.5853 19.2887
0.4097 0.8117 125 0.4248 68.9401 20.0346
0.411 0.8182 126 0.4271 76.5899 20.6874
0.4345 0.8247 127 0.4250 74.4700 20.3144
0.608 0.8312 128 0.4199 73.1797 20.1678
0.4558 0.8377 129 0.4150 73.2719 21.0737
0.4363 0.8442 130 0.4134 73.1797 20.3943
0.4492 0.8506 131 0.4139 73.1797 20.3144
0.4021 0.8571 132 0.4141 75.1152 21.5532
0.4587 0.8636 133 0.4142 73.2719 21.4466
0.4513 0.8701 134 0.4163 66.6359 18.8624
0.4815 0.8766 135 0.4157 71.7972 21.0204
0.5297 0.8831 136 0.4152 74.2857 21.2735
0.3949 0.8896 137 0.4148 73.9171 21.3800
0.4294 0.8961 138 0.4131 74.3779 21.1802
0.4428 0.9026 139 0.4121 74.0092 20.8472
0.3372 0.9091 140 0.4117 70.1382 19.0755
0.3985 0.9156 141 0.4111 69.0323 18.8624
0.4079 0.9221 142 0.4099 72.2581 20.0613
0.3832 0.9286 143 0.4101 69.8618 19.6883
0.5044 0.9351 144 0.4117 68.2028 18.5427
0.444 0.9416 145 0.4142 67.7419 18.4095
0.3672 0.9481 146 0.4157 67.8341 18.7825
0.5024 0.9545 147 0.4156 66.0829 18.6359
0.3961 0.9610 148 0.4151 65.8986 18.5827
0.3598 0.9675 149 0.4154 71.0599 20.0346
0.4558 0.9740 150 0.4156 71.9816 20.4076
0.3757 0.9805 151 0.4173 75.2074 21.1669
0.3809 0.9870 152 0.4168 71.0599 19.4885
0.4956 0.9935 153 0.4165 69.8618 19.6483
0.4399 1.0 154 0.4163 69.9539 20.0613
0.2401 1.0065 155 0.4147 69.9539 19.9547
0.3563 1.0130 156 0.4132 69.4009 19.7416
0.2789 1.0195 157 0.4120 68.8479 19.4485
0.2485 1.0260 158 0.4115 68.7558 19.1421
0.3088 1.0325 159 0.4128 67.3733 18.8358
0.2639 1.0390 160 0.4144 67.9263 18.6626
0.2542 1.0455 161 0.4171 69.4009 18.6093
0.2607 1.0519 162 0.4228 71.6129 18.7425
0.2982 1.0584 163 0.4261 72.5346 18.9690
0.2819 1.0649 164 0.4240 72.0737 19.1022
0.3122 1.0714 165 0.4188 68.7558 18.9157
0.2726 1.0779 166 0.4158 65.5300 18.5560
0.2497 1.0844 167 0.4141 64.4240 18.2896
0.207 1.0909 168 0.4120 62.4885 17.8766
0.2601 1.0974 169 0.4108 65.1613 18.1564
0.2321 1.1039 170 0.4140 68.0184 18.6759
0.293 1.1104 171 0.4197 70.6912 18.9556
0.2343 1.1169 172 0.4213 73.0876 20.2078
0.2839 1.1234 173 0.4203 72.8111 20.2877
0.3017 1.1299 174 0.4168 70.4147 20.1678
0.2698 1.1364 175 0.4158 67.0968 19.0489
0.2637 1.1429 176 0.4158 65.8065 18.9423
0.2647 1.1494 177 0.4147 65.1613 18.5827
0.2904 1.1558 178 0.4127 70.2304 20.1812
0.2567 1.1623 179 0.4119 73.2719 20.3543
0.2649 1.1688 180 0.4140 75.5760 20.5142
0.2449 1.1753 181 0.4195 78.5253 21.3134
0.2441 1.1818 182 0.4214 73.2719 19.2487
0.2639 1.1883 183 0.4186 70.5069 18.9290
0.2589 1.1948 184 0.4155 75.0230 21.0204
0.3154 1.2013 185 0.4116 71.9816 20.5275
0.2604 1.2078 186 0.4098 71.1521 20.4609
0.2405 1.2143 187 0.4100 70.5991 20.1012
0.208 1.2208 188 0.4110 70.9677 20.1012
0.2048 1.2273 189 0.4135 72.9954 20.4343
0.2235 1.2338 190 0.4164 70.6912 18.9556
0.2477 1.2403 191 0.4203 72.8111 19.3286
0.2297 1.2468 192 0.4223 73.5484 19.4086
0.2726 1.2532 193 0.4208 72.2581 19.4352
0.2377 1.2597 194 0.4176 74.0092 21.1136
0.2906 1.2662 195 0.4153 68.3871 19.2221
0.2408 1.2727 196 0.4151 67.1889 18.9024
0.2729 1.2792 197 0.4140 66.3594 18.6359
0.2558 1.2857 198 0.4143 66.9124 18.6093
0.241 1.2922 199 0.4146 69.6774 18.9024
0.2361 1.2987 200 0.4183 72.6267 19.2221
0.2678 1.3052 201 0.4208 74.5622 19.3419
0.2292 1.3117 202 0.4180 71.5207 18.8358
0.2507 1.3182 203 0.4146 68.4793 18.5960
0.2898 1.3247 204 0.4121 65.8986 18.2496
0.2598 1.3312 205 0.4125 65.0691 18.2896
0.2741 1.3377 206 0.4132 65.5300 18.4095
0.3536 1.3442 207 0.4131 67.1889 18.5560
0.2695 1.3506 208 0.4137 68.3871 18.4761
0.213 1.3571 209 0.4167 71.7972 18.7292
0.2728 1.3636 210 0.4188 73.8249 18.9690
0.2204 1.3701 211 0.4187 74.8387 19.0222
0.2505 1.3766 212 0.4152 73.7327 18.9556
0.2754 1.3831 213 0.4120 72.7189 18.8757
0.2556 1.3896 214 0.4084 68.7558 18.4361
0.2366 1.3961 215 0.4060 66.2673 18.1964
0.2333 1.4026 216 0.4044 65.7143 18.0898
0.2096 1.4091 217 0.4039 65.4378 18.1297
0.2566 1.4156 218 0.4046 65.6221 18.3429
0.2763 1.4221 219 0.4052 65.6221 18.3296
0.2184 1.4286 220 0.4063 67.0968 18.5161
0.2819 1.4351 221 0.4079 68.8479 18.5560
0.2715 1.4416 222 0.4085 69.0323 18.6493
0.2387 1.4481 223 0.4094 69.7696 18.6226
0.2676 1.4545 224 0.4093 69.0323 18.4361
0.2836 1.4610 225 0.4083 69.0323 18.3695
0.293 1.4675 226 0.4080 68.9401 18.4228
0.2113 1.4740 227 0.4080 69.3088 18.3962
0.314 1.4805 228 0.4080 68.9401 18.3162
0.2459 1.4870 229 0.4064 68.6636 18.5560
0.2227 1.4935 230 0.4049 68.4793 18.4628
0.2212 1.5 231 0.4033 68.2028 18.4494
0.2437 1.5065 232 0.4025 69.6774 18.7425
0.2093 1.5130 233 0.4017 68.7558 18.5161
0.2623 1.5195 234 0.4018 68.4793 18.4095
0.1991 1.5260 235 0.4014 68.0184 18.4494
0.2294 1.5325 236 0.4014 68.2028 18.4494
0.2946 1.5390 237 0.4007 68.2949 18.3562
0.2253 1.5455 238 0.4001 68.6636 18.3162
0.266 1.5519 239 0.3993 68.4793 18.1697
0.2511 1.5584 240 0.3985 68.6636 18.1564
0.2428 1.5649 241 0.3985 68.2949 17.9832
0.2469 1.5714 242 0.3983 68.0184 17.9299
0.1764 1.5779 243 0.3975 67.8341 17.9566
0.2309 1.5844 244 0.3964 67.5576 17.9166
0.2269 1.5909 245 0.3954 67.3733 17.7168
0.2759 1.5974 246 0.3948 66.8203 17.7168
0.3133 1.6039 247 0.3940 66.2673 17.6635
0.2534 1.6104 248 0.3941 65.9908 17.6635
0.2966 1.6169 249 0.3937 66.0829 17.6768
0.306 1.6234 250 0.3939 66.4516 17.8633
0.2621 1.6299 251 0.3940 66.9124 17.8633
0.2236 1.6364 252 0.3939 67.0968 17.8900
0.2613 1.6429 253 0.3941 67.1889 17.8900
0.2032 1.6494 254 0.3947 67.4654 17.8766
0.22 1.6558 255 0.3950 67.8341 17.9166
0.2264 1.6623 256 0.3953 68.2949 17.8900
0.2731 1.6688 257 0.3960 69.0323 17.7967
0.259 1.6753 258 0.3970 69.7696 17.8500
0.2114 1.6818 259 0.3982 71.3364 18.0365
0.2572 1.6883 260 0.3987 70.7834 17.8100
0.2803 1.6948 261 0.3992 71.0599 17.8100
0.2065 1.7013 262 0.3991 70.9677 17.8100
0.1825 1.7078 263 0.3982 70.8756 17.8367
0.2595 1.7143 264 0.3976 69.9539 17.8100
0.2384 1.7208 265 0.3968 69.1244 17.7568
0.2037 1.7273 266 0.3961 67.5576 17.6369
0.309 1.7338 267 0.3954 67.0968 17.6768
0.1839 1.7403 268 0.3950 66.7281 17.5836
0.2162 1.7468 269 0.3946 66.8203 17.6502
0.2109 1.7532 270 0.3943 65.7143 17.6635
0.2315 1.7597 271 0.3941 65.5300 17.6902
0.2299 1.7662 272 0.3939 66.0829 17.8367
0.2292 1.7727 273 0.3943 66.3594 17.8766
0.2563 1.7792 274 0.3941 66.2673 17.8500
0.2897 1.7857 275 0.3943 66.9124 17.8900
0.2318 1.7922 276 0.3945 66.8203 17.9699
0.2099 1.7987 277 0.3947 67.0046 17.8900
0.2701 1.8052 278 0.3947 67.1889 17.8633
0.2932 1.8117 279 0.3945 67.3733 18.0099
0.243 1.8182 280 0.3950 67.4654 17.9033
0.2923 1.8247 281 0.3953 67.9263 17.9433
0.1814 1.8312 282 0.3950 67.9263 17.9033
0.2298 1.8377 283 0.3950 68.9401 17.9699
0.2611 1.8442 284 0.3951 68.2028 17.9299
0.1918 1.8506 285 0.3948 69.3088 18.1297
0.2086 1.8571 286 0.3952 69.2166 18.0232
0.2136 1.8636 287 0.3951 69.0323 18.0365
0.234 1.8701 288 0.3952 69.4009 18.1431
0.2916 1.8766 289 0.3953 69.8618 18.1031
0.2316 1.8831 290 0.3954 69.7696 18.1164
0.3238 1.8896 291 0.3953 69.8618 18.1164
0.2607 1.8961 292 0.3954 69.8618 18.0631
0.2684 1.9026 293 0.3953 69.9539 18.0498
0.1912 1.9091 294 0.3954 69.9539 18.1297
0.2604 1.9156 295 0.3955 69.6774 18.1564
0.2131 1.9221 296 0.3955 69.7696 18.1564
0.2024 1.9286 297 0.3956 69.4931 18.1431
0.2936 1.9351 298 0.3952 69.9539 18.1431
0.2664 1.9416 299 0.3951 69.4931 18.0765
0.2316 1.9481 300 0.3954 69.9539 18.1031
0.2273 1.9545 301 0.3953 69.5853 18.1564
0.2628 1.9610 302 0.3951 69.9539 18.1297
0.2873 1.9675 303 0.3951 69.5853 18.1164
0.2954 1.9740 304 0.3953 69.8618 18.1830
0.2161 1.9805 305 0.3952 69.6774 18.1297
0.214 1.9870 306 0.3950 69.9539 18.2097
0.2352 1.9935 307 0.3951 69.8618 18.0765
0.2953 2.0 308 0.3953 69.9539 18.0631
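
Some sanity arithmetic on the log above (assuming no gradient accumulation, which the hyperparameters do not mention): 308 optimizer steps over 2 epochs gives 154 steps per epoch, and at train_batch_size=32 that implies roughly 4,900 training examples.

```python
# Back-of-the-envelope dataset size from the training log.
total_steps, epochs, batch_size = 308, 2, 32
steps_per_epoch = total_steps // epochs          # 154
approx_examples = steps_per_epoch * batch_size   # ~4928, upper bound
```

This also matches the per-step epoch increments in the log (1 / 154 ≈ 0.0065).
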

Framework versions

  • Transformers 4.57.3
  • Pytorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1
Model size: 2B params (Safetensors, F32 tensors)

Model tree for nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_2.0e-5

Finetuned from openai/whisper-large-v3