Whisper Large v3 - Japanese Zatoichi ASR

This model is a fine-tuned version of openai/whisper-large-v3 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4197
  • WER: 74.3779
  • CER: 20.8739

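Both error rates above are percentages. As context for interpreting them, the following is a minimal sketch of how WER and CER are derived from Levenshtein edit distance (function names are illustrative; training pipelines typically use a library such as jiwer or evaluate instead):

```python
def edit_distance(ref, hyp):
    # classic dynamic-programming Levenshtein distance over two sequences
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # prev holds the old dp[j-1]; dp[j-1] is already the new row
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[len(hyp)]

def wer(reference, hypothesis):
    # word error rate: word-level edit distance over reference word count, in %
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    # character error rate over reference characters, in %
    # (spaces removed, since Japanese text is not space-delimited)
    ref_chars = list(reference.replace(" ", ""))
    hyp_chars = list(hypothesis.replace(" ", ""))
    return 100.0 * edit_distance(ref_chars, hyp_chars) / len(ref_chars)
```

A WER far above the CER, as seen here, is typical for Japanese: word segmentation is ambiguous and tokenizer-dependent, so CER is usually the more meaningful metric for this language.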
Model description

More information needed

Intended uses & limitations

More information needed
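
The card provides no usage notes, but as a hedged sketch, transcription with the transformers ASR pipeline might look like the following (the audio path is hypothetical; loading the checkpoint downloads several GB of weights):

```python
# a minimal inference sketch; "sample.wav" is a placeholder for a
# 16 kHz mono audio file of your own
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_6e-6",
    generate_kwargs={"language": "japanese", "task": "transcribe"},
)
result = asr("sample.wav")
print(result["text"])
```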

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-06
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
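
The learning-rate schedule implied by the settings above (cosine decay with a 10% linear warmup from a 6e-06 peak) can be sketched as a pure function of the step; the function name is illustrative, and in practice transformers builds the equivalent schedule internally:

```python
import math

def lr_at(step, total_steps, base_lr=6e-6, warmup_ratio=0.1):
    # linear warmup for the first warmup_ratio of training, then
    # cosine decay to zero (mirrors lr_scheduler_type=cosine with
    # lr_scheduler_warmup_ratio=0.1)
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```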

Training results

Training Loss  Epoch  Step  Validation Loss  WER  CER
1.2387 0.0065 1 1.2545 98.1567 40.8419
1.2196 0.0130 2 0.9958 99.0783 38.8571
1.101 0.0195 3 0.9825 96.3134 38.3908
0.9273 0.0260 4 0.9683 95.0230 37.7781
1.0017 0.0325 5 0.9595 94.8387 37.3118
1.0056 0.0390 6 0.9460 93.0876 36.3794
1.152 0.0455 7 0.9310 91.2442 35.5268
1.065 0.0519 8 0.9087 88.2949 34.8874
1.0046 0.0584 9 0.8955 87.1889 34.0749
0.9436 0.0649 10 0.8586 88.7558 33.1024
0.8726 0.0714 11 0.8330 86.5438 30.7180
0.9813 0.0779 12 0.8154 84.5161 29.3326
0.8432 0.0844 13 0.8004 83.6866 28.8797
0.8361 0.0909 14 0.7746 89.4009 29.9454
0.7989 0.0974 15 0.7525 93.8249 31.9169
0.847 0.1039 16 0.7328 91.0599 30.7846
0.7609 0.1104 17 0.7168 88.2949 30.2251
0.6033 0.1169 18 0.7012 92.6267 31.7037
0.8912 0.1234 19 0.6853 93.2719 35.1139
0.6535 0.1299 20 0.6683 98.0645 39.5498
0.6348 0.1364 21 0.6553 80.3687 31.8103
0.6482 0.1429 22 0.6480 91.5207 31.9968
0.8049 0.1494 23 0.6304 103.2258 36.9522
0.6607 0.1558 24 0.6153 92.1659 32.2366
0.7077 0.1623 25 0.6067 100.4608 36.9522
0.6049 0.1688 26 0.6015 97.6037 34.0482
0.6452 0.1753 27 0.5957 93.1797 31.2375
0.6203 0.1818 28 0.5901 88.2028 28.5867
0.6987 0.1883 29 0.5849 88.0184 28.1337
0.6022 0.1948 30 0.5766 84.0553 26.7350
0.5597 0.2013 31 0.5700 80.9217 25.2564
0.6972 0.2078 32 0.5658 86.9124 27.3745
0.5436 0.2143 33 0.5641 85.4378 26.8150
0.5406 0.2208 34 0.5613 80.0922 24.7636
0.5313 0.2273 35 0.5533 80.1843 24.5105
0.6729 0.2338 36 0.5441 78.6175 24.3106
0.5448 0.2403 37 0.5404 78.0645 24.1641
0.4951 0.2468 38 0.5368 77.6959 24.1375
0.6275 0.2532 39 0.5342 77.3272 24.1908
0.545 0.2597 40 0.5339 72.2581 22.2059
0.6382 0.2662 41 0.5307 72.6267 22.3392
0.6493 0.2727 42 0.5284 72.7189 22.1127
0.5608 0.2792 43 0.5267 78.3410 24.0575
0.5668 0.2857 44 0.5272 79.9078 24.3506
0.5007 0.2922 45 0.5260 80.2765 24.3106
0.4978 0.2987 46 0.5251 85.8986 26.1356
0.6391 0.3052 47 0.5233 85.8986 25.9491
0.592 0.3117 48 0.5214 81.4747 24.4439
0.5902 0.3182 49 0.5189 76.2212 22.6322
0.5749 0.3247 50 0.5144 74.6544 22.3924
0.4618 0.3312 51 0.5126 75.0230 22.6322
0.5099 0.3377 52 0.5098 76.8664 25.0966
0.4941 0.3442 53 0.5069 77.0507 24.1375
0.6066 0.3506 54 0.5045 71.7972 22.1793
0.4908 0.3571 55 0.5028 71.2442 21.8463
0.4619 0.3636 56 0.5009 70.7834 21.5799
0.4767 0.3701 57 0.4980 71.1521 21.5399
0.5575 0.3766 58 0.4960 75.6682 23.3782
0.4679 0.3831 59 0.4941 79.8157 24.6037
0.4772 0.3896 60 0.4924 76.5899 23.2050
0.4086 0.3961 61 0.4926 76.7742 23.2583
0.5052 0.4026 62 0.4915 76.4055 23.1251
0.476 0.4091 63 0.4884 75.5760 23.0052
0.5671 0.4156 64 0.4852 88.2949 27.5743
0.4262 0.4221 65 0.4841 82.2120 25.3364
0.4937 0.4286 66 0.4842 83.3180 25.4296
0.5557 0.4351 67 0.4830 82.9493 25.3896
0.5741 0.4416 68 0.4807 71.7051 20.9271
0.4845 0.4481 69 0.4780 75.2995 22.3525
0.5882 0.4545 70 0.4762 73.3641 22.5523
0.431 0.4610 71 0.4761 70.3226 21.1802
0.4824 0.4675 72 0.4741 71.0599 22.3258
0.4339 0.4740 73 0.4710 70.1382 21.0470
0.473 0.4805 74 0.4683 72.6267 22.5123
0.4728 0.4870 75 0.4682 71.3364 21.2868
0.4666 0.4935 76 0.4676 71.4286 21.3534
0.5335 0.5 77 0.4678 73.1797 21.7797
0.473 0.5065 78 0.4668 74.3779 21.7797
0.4051 0.5130 79 0.4669 71.8894 20.7939
0.4635 0.5195 80 0.4683 71.4286 21.6065
0.4392 0.5260 81 0.4688 72.4424 21.9928
0.4417 0.5325 82 0.4662 71.2442 21.1136
0.4965 0.5390 83 0.4629 70.3226 20.9804
0.4947 0.5455 84 0.4622 69.5853 20.9138
0.4497 0.5519 85 0.4624 69.3088 20.9804
0.4779 0.5584 86 0.4622 70.0461 21.2868
0.4683 0.5649 87 0.4603 70.6912 21.6065
0.562 0.5714 88 0.4576 70.5991 21.4600
0.4821 0.5779 89 0.4573 69.8618 21.1936
0.523 0.5844 90 0.4576 70.2304 21.1269
0.5452 0.5909 91 0.4582 70.0461 20.7673
0.5843 0.5974 92 0.4555 69.5853 20.7540
0.5863 0.6039 93 0.4536 73.6406 22.3392
0.4045 0.6104 94 0.4550 79.7235 24.4172
0.4774 0.6169 95 0.4621 78.8018 24.3639
0.4852 0.6234 96 0.4656 78.7097 24.3240
0.5385 0.6299 97 0.4585 75.8525 23.1384
0.5306 0.6364 98 0.4547 72.9954 21.5932
0.424 0.6429 99 0.4553 70.0461 20.5808
0.3255 0.6494 100 0.4577 74.4700 22.1926
0.5697 0.6558 101 0.4580 72.1659 21.6465
0.4833 0.6623 102 0.4556 71.3364 20.8072
0.4565 0.6688 103 0.4526 70.9677 20.6740
0.4294 0.6753 104 0.4505 69.7696 20.5541
0.5184 0.6818 105 0.4510 73.0876 21.8996
0.5259 0.6883 106 0.4524 72.6267 21.6331
0.4768 0.6948 107 0.4536 72.0737 21.6198
0.4298 0.7013 108 0.4518 72.1659 21.6731
0.4975 0.7078 109 0.4473 72.2581 21.7930
0.4899 0.7143 110 0.4454 72.5346 21.7131
0.4175 0.7208 111 0.4465 74.3779 21.7397
0.4837 0.7273 112 0.4500 79.0783 22.5256
0.4806 0.7338 113 0.4526 81.2903 22.7388
0.6579 0.7403 114 0.4526 83.6866 22.8720
0.4249 0.7468 115 0.4494 78.3410 21.1269
0.407 0.7532 116 0.4439 73.7327 21.3134
0.4658 0.7597 117 0.4398 73.7327 22.4857
0.3802 0.7662 118 0.4375 73.8249 22.5922
0.5529 0.7727 119 0.4368 72.4424 21.5665
0.481 0.7792 120 0.4370 67.0046 19.7549
0.457 0.7857 121 0.4364 66.6359 19.7416
0.4569 0.7922 122 0.4348 66.2673 19.6483
0.382 0.7987 123 0.4333 71.0599 21.3268
0.4606 0.8052 124 0.4325 72.3502 21.6864
0.4096 0.8117 125 0.4332 75.4839 21.8063
0.4313 0.8182 126 0.4373 77.6037 21.9395
0.4375 0.8247 127 0.4397 78.7097 21.7397
0.6068 0.8312 128 0.4411 80.3687 21.8862
0.4769 0.8377 129 0.4402 79.7235 21.6198
0.4475 0.8442 130 0.4382 73.4562 19.7682
0.4535 0.8506 131 0.4363 72.8111 19.9680
0.4213 0.8571 132 0.4356 69.8618 19.6883
0.4573 0.8636 133 0.4360 67.6498 19.5551
0.4647 0.8701 134 0.4350 67.0968 20.6208
0.4878 0.8766 135 0.4336 69.9539 20.5541
0.5186 0.8831 136 0.4319 69.8618 20.6474
0.4021 0.8896 137 0.4297 69.5853 20.5941
0.4516 0.8961 138 0.4278 69.6774 20.5808
0.4462 0.9026 139 0.4272 70.0461 20.7140
0.3675 0.9091 140 0.4272 67.0046 19.1688
0.4138 0.9156 141 0.4270 69.1244 19.5817
0.4218 0.9221 142 0.4269 68.9401 20.5675
0.3967 0.9286 143 0.4269 73.1797 22.1793
0.4986 0.9351 144 0.4272 72.4424 22.0328
0.4555 0.9416 145 0.4275 73.0876 22.1660
0.3958 0.9481 146 0.4273 67.5576 20.3677
0.516 0.9545 147 0.4274 67.0046 20.4343
0.4145 0.9610 148 0.4278 71.2442 22.3258
0.3813 0.9675 149 0.4278 71.0599 22.1660
0.4715 0.9740 150 0.4272 71.5207 21.0870
0.3909 0.9805 151 0.4271 70.9677 21.1136
0.3989 0.9870 152 0.4261 71.6129 21.2468
0.5099 0.9935 153 0.4260 68.2028 19.6483
0.4622 1.0 154 0.4260 68.2028 19.7815
0.2969 1.0065 155 0.4257 68.6636 19.9680
0.4431 1.0130 156 0.4253 72.8111 22.2725
0.3373 1.0195 157 0.4248 71.9816 22.0461
0.3285 1.0260 158 0.4243 72.0737 22.1393
0.3821 1.0325 159 0.4243 71.4286 21.9129
0.3423 1.0390 160 0.4244 70.6912 21.8862
0.3342 1.0455 161 0.4248 71.2442 21.9795
0.3146 1.0519 162 0.4267 68.2949 19.6350
0.3567 1.0584 163 0.4286 68.9401 19.7283
0.3361 1.0649 164 0.4297 69.4931 19.7416
0.3823 1.0714 165 0.4300 69.3088 19.6616
0.3429 1.0779 166 0.4298 69.1244 19.6084
0.3255 1.0844 167 0.4283 68.4793 19.5151
0.2702 1.0909 168 0.4271 68.5714 19.6483
0.3192 1.0974 169 0.4267 68.3871 19.5551
0.2914 1.1039 170 0.4259 68.8479 19.7416
0.3629 1.1104 171 0.4264 69.5853 19.7416
0.2905 1.1169 172 0.4270 69.4931 20.5808
0.324 1.1234 173 0.4276 73.3641 22.1793
0.3802 1.1299 174 0.4273 68.6636 19.5817
0.3474 1.1364 175 0.4275 69.4931 19.9680
0.3412 1.1429 176 0.4276 69.2166 19.7549
0.3526 1.1494 177 0.4279 68.8479 19.7149
0.3026 1.1558 178 0.4292 69.6774 19.9814
0.3249 1.1623 179 0.4296 70.1382 19.9814
0.3455 1.1688 180 0.4312 71.1521 20.1146
0.317 1.1753 181 0.4322 72.0737 20.3810
0.3063 1.1818 182 0.4326 72.7189 20.5408
0.3325 1.1883 183 0.4327 73.2719 20.5009
0.3343 1.1948 184 0.4321 72.9032 20.4343
0.3768 1.2013 185 0.4304 72.8111 20.2877
0.3262 1.2078 186 0.4282 71.4286 19.9814
0.3031 1.2143 187 0.4261 70.9677 19.7416
0.2616 1.2208 188 0.4245 69.2166 19.5950
0.2599 1.2273 189 0.4243 68.2949 19.2487
0.2905 1.2338 190 0.4242 67.8341 19.3153
0.3132 1.2403 191 0.4237 68.1106 19.3553
0.3023 1.2468 192 0.4241 70.6912 20.6074
0.3467 1.2532 193 0.4242 70.6912 20.6874
0.3272 1.2597 194 0.4242 70.5069 20.7673
0.3528 1.2662 195 0.4250 70.7834 20.7406
0.3106 1.2727 196 0.4261 70.9677 20.9671
0.3523 1.2792 197 0.4270 71.9816 20.9138
0.3193 1.2857 198 0.4280 72.0737 20.9005
0.3201 1.2922 199 0.4284 73.1797 20.9804
0.3054 1.2987 200 0.4292 73.7327 20.8872
0.3434 1.3052 201 0.4298 75.4839 21.3134
0.3029 1.3117 202 0.4295 75.2074 21.4333
0.3327 1.3182 203 0.4288 71.7051 19.8615
0.3676 1.3247 204 0.4276 71.2442 19.7682
0.3227 1.3312 205 0.4275 71.4286 19.7949
0.3549 1.3377 206 0.4268 70.7834 19.7016
0.4426 1.3442 207 0.4261 70.2304 19.5151
0.3529 1.3506 208 0.4254 70.5069 19.4219
0.2773 1.3571 209 0.4253 70.8756 19.4219
0.3524 1.3636 210 0.4255 71.5207 19.6883
0.2855 1.3701 211 0.4265 75.1152 21.2468
0.3251 1.3766 212 0.4266 75.3917 21.4067
0.3624 1.3831 213 0.4264 75.0230 21.3401
0.3231 1.3896 214 0.4264 74.8387 21.2735
0.3193 1.3961 215 0.4260 74.4700 20.9937
0.3209 1.4026 216 0.4259 74.2857 21.0204
0.2793 1.4091 217 0.4263 74.3779 21.0470
0.3425 1.4156 218 0.4259 74.0092 20.9937
0.3709 1.4221 219 0.4258 74.3779 21.1136
0.2791 1.4286 220 0.4259 74.5622 21.0737
0.3543 1.4351 221 0.4256 74.3779 21.0337
0.3456 1.4416 222 0.4254 74.5622 21.1003
0.31 1.4481 223 0.4253 74.0092 20.9005
0.3729 1.4545 224 0.4243 73.8249 20.8339
0.3618 1.4610 225 0.4235 73.5484 20.7273
0.3687 1.4675 226 0.4235 74.5622 20.9538
0.2717 1.4740 227 0.4233 73.8249 20.7806
0.4188 1.4805 228 0.4231 73.5484 20.6874
0.3101 1.4870 229 0.4237 73.5484 20.6740
0.2885 1.4935 230 0.4231 73.9171 20.7273
0.295 1.5 231 0.4231 75.0230 20.9138
0.3388 1.5065 232 0.4232 75.0230 20.9405
0.2641 1.5130 233 0.4232 74.5622 21.0737
0.3566 1.5195 234 0.4231 74.5622 20.9405
0.2412 1.5260 235 0.4230 74.9309 21.0870
0.3016 1.5325 236 0.4231 74.3779 20.9804
0.3867 1.5390 237 0.4236 71.4286 19.8481
0.2931 1.5455 238 0.4234 71.9816 19.7416
0.37 1.5519 239 0.4232 72.0737 19.7815
0.3389 1.5584 240 0.4235 71.8894 19.8748
0.3265 1.5649 241 0.4232 71.8894 19.8748
0.3155 1.5714 242 0.4235 71.6129 19.7815
0.2443 1.5779 243 0.4230 71.5207 19.7682
0.2988 1.5844 244 0.4229 71.8894 19.9414
0.3093 1.5909 245 0.4225 71.2442 19.6750
0.3629 1.5974 246 0.4219 71.0599 19.7416
0.3861 1.6039 247 0.4216 71.1521 19.7949
0.3468 1.6104 248 0.4213 70.5991 19.6217
0.3866 1.6169 249 0.4211 69.9539 19.5151
0.3944 1.6234 250 0.4206 69.5853 19.3686
0.3439 1.6299 251 0.4202 70.0461 19.4352
0.2999 1.6364 252 0.4199 69.4009 19.3952
0.3452 1.6429 253 0.4200 69.2166 19.3553
0.283 1.6494 254 0.4198 73.0876 20.8739
0.2927 1.6558 255 0.4199 72.8111 20.8339
0.3132 1.6623 256 0.4198 72.6267 20.8206
0.3673 1.6688 257 0.4195 72.9954 20.8472
0.3349 1.6753 258 0.4198 73.5484 20.8605
0.2937 1.6818 259 0.4194 73.7327 20.9271
0.3488 1.6883 260 0.4195 74.1014 21.0603
0.3635 1.6948 261 0.4196 74.1935 20.9804
0.2944 1.7013 262 0.4196 74.1014 20.8872
0.2671 1.7078 263 0.4194 74.5622 20.9671
0.3458 1.7143 264 0.4194 74.4700 20.9671
0.3147 1.7208 265 0.4196 74.2857 21.0204
0.2836 1.7273 266 0.4199 74.1014 20.9138
0.401 1.7338 267 0.4195 74.1935 20.9671
0.2499 1.7403 268 0.4196 74.1935 20.9937
0.2954 1.7468 269 0.4196 74.1935 21.0603
0.2826 1.7532 270 0.4195 73.7327 20.8872
0.3059 1.7597 271 0.4195 73.7327 20.8472
0.3276 1.7662 272 0.4196 73.9171 20.9804
0.3028 1.7727 273 0.4196 73.8249 20.8872
0.3421 1.7792 274 0.4191 74.1935 20.9937
0.3724 1.7857 275 0.4196 73.5484 20.8339
0.335 1.7922 276 0.4196 74.1014 20.8206
0.2983 1.7987 277 0.4194 73.6406 20.8872
0.362 1.8052 278 0.4193 73.5484 20.7540
0.3849 1.8117 279 0.4195 73.8249 20.9138
0.332 1.8182 280 0.4196 74.0092 20.8472
0.3755 1.8247 281 0.4194 73.7327 20.8339
0.253 1.8312 282 0.4199 74.1014 20.8339
0.3104 1.8377 283 0.4196 74.1014 20.7540
0.3621 1.8442 284 0.4194 74.5622 20.8472
0.2522 1.8506 285 0.4194 74.3779 20.8872
0.2867 1.8571 286 0.4194 74.3779 20.7939
0.292 1.8636 287 0.4196 74.5622 20.8472
0.3201 1.8701 288 0.4196 74.1935 20.7939
0.3838 1.8766 289 0.4196 74.1935 20.9005
0.3221 1.8831 290 0.4199 74.3779 20.9271
0.4258 1.8896 291 0.4198 74.5622 20.8472
0.3603 1.8961 292 0.4197 74.2857 20.8339
0.3593 1.9026 293 0.4197 74.6544 20.9005
0.2741 1.9091 294 0.4196 74.5622 21.0870
0.3534 1.9156 295 0.4197 74.4700 20.8739
0.2989 1.9221 296 0.4194 74.5622 20.8605
0.2798 1.9286 297 0.4195 74.6544 20.8739
0.3727 1.9351 298 0.4199 74.3779 20.8872
0.353 1.9416 299 0.4197 74.5622 20.8072
0.3373 1.9481 300 0.4199 74.4700 20.8072
0.3113 1.9545 301 0.4200 74.4700 20.9937
0.3437 1.9610 302 0.4196 74.6544 20.8206
0.382 1.9675 303 0.4199 74.7465 20.8872
0.3955 1.9740 304 0.4199 74.4700 20.8605
0.3081 1.9805 305 0.4197 74.7465 20.9138
0.2885 1.9870 306 0.4198 74.6544 20.9138
0.3315 1.9935 307 0.4197 74.6544 20.8605
0.3833 2.0 308 0.4197 74.3779 20.8739

Framework versions

  • Transformers 4.57.3
  • PyTorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1
Model size: 2B params (F32, safetensors)

Model tree

nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_6e-6, fine-tuned from openai/whisper-large-v3.