Whisper Large v3 - Japanese Zatoichi ASR

This model is a fine-tuned version of openai/whisper-large-v3 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3983
  • WER: 70.0461
  • CER: 18.1964

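For Japanese ASR, CER is usually the more informative of the two metrics, since WER depends on how text is segmented into words and Japanese has no whitespace word boundaries. Both metrics reduce to a normalized Levenshtein edit distance; a minimal sketch (illustrative only, not the evaluation script actually used here):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (words or characters)."""
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[n]

def wer(ref, hyp):
    """Word error rate in percent, using naive whitespace tokenization."""
    r = ref.split()
    return 100.0 * edit_distance(r, hyp.split()) / len(r)

def cer(ref, hyp):
    """Character error rate in percent."""
    return 100.0 * edit_distance(ref, hyp) / len(ref)
```

Because word segmentation for Japanese is tokenizer-dependent, the WER around 70 above reflects segmentation choices as much as recognition quality; the CER of roughly 18 is the more meaningful number.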
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1.8e-05
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
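
With a cosine scheduler and warmup_ratio 0.1, the learning rate ramps linearly from 0 to the peak of 1.8e-05 over roughly the first 31 of the 308 total steps (2 epochs × 154 steps per epoch, per the results table), then follows a half-cosine decay toward 0. A sketch of that schedule, mirroring the shape of transformers' cosine-with-warmup scheduler (the exact warmup-step rounding is an assumption):

```python
import math

BASE_LR = 1.8e-05
TOTAL_STEPS = 308                             # 2 epochs x 154 steps, from the results table
WARMUP_STEPS = math.ceil(0.1 * TOTAL_STEPS)   # warmup_ratio = 0.1 -> 31 steps (assumed rounding)

def lr_at(step):
    """Learning rate at a given optimizer step: linear warmup, then cosine decay."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS  # linear ramp from 0 to BASE_LR
    progress = (step - WARMUP_STEPS) / (TOTAL_STEPS - WARMUP_STEPS)
    return BASE_LR * 0.5 * (1.0 + math.cos(math.pi * progress))  # half-cosine decay to 0
```

So lr_at(0) is 0, the peak 1.8e-05 is reached at the end of warmup, and the rate decays smoothly to 0 by the final step.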

Training results

Training Loss Epoch Step Validation Loss WER CER
1.2387 0.0065 1 1.2545 98.1567 40.8419
1.2196 0.0130 2 0.9955 96.3134 38.3509
1.1009 0.0195 3 0.9792 96.0369 38.0045
0.9239 0.0260 4 0.9553 93.2719 36.3661
0.9888 0.0325 5 0.9164 88.7558 35.0073
0.9625 0.0390 6 0.8644 89.4009 32.3964
1.0697 0.0455 7 0.8282 85.0691 29.5591
0.9558 0.0519 8 0.7945 82.9493 27.9872
0.8815 0.0584 9 0.7558 87.8341 29.4259
0.8071 0.0649 10 0.7297 88.2949 31.7171
0.7573 0.0714 11 0.7028 88.8479 32.3698
0.8486 0.0779 12 0.6765 82.8571 29.8255
0.7095 0.0844 13 0.6537 86.8203 33.3555
0.6638 0.0909 14 0.6336 91.3364 33.2623
0.6452 0.0974 15 0.6167 94.2857 33.2090
0.7106 0.1039 16 0.6038 79.8157 26.5885
0.6142 0.1104 17 0.5918 80.5530 26.1489
0.5032 0.1169 18 0.5819 78.3410 24.3639
0.7027 0.1234 19 0.5795 78.5253 24.6836
0.5748 0.1299 20 0.5739 76.6820 24.4971
0.5353 0.1364 21 0.5650 76.3134 24.6836
0.594 0.1429 22 0.5568 76.0369 24.1774
0.6924 0.1494 23 0.5537 84.4240 27.5343
0.5706 0.1558 24 0.5500 85.2535 27.3745
0.6413 0.1623 25 0.5461 84.9770 26.4553
0.5587 0.1688 26 0.5456 86.6359 28.6666
0.5833 0.1753 27 0.5407 84.7926 26.5619
0.5607 0.1818 28 0.5397 79.9078 24.7103
0.6409 0.1883 29 0.5402 82.0276 25.0167
0.5543 0.1948 30 0.5301 76.7742 22.7921
0.5053 0.2013 31 0.5279 74.2857 22.3791
0.6525 0.2078 32 0.5257 71.9816 22.9119
0.519 0.2143 33 0.5191 75.2995 24.7502
0.5125 0.2208 34 0.5115 76.4055 24.2574
0.4875 0.2273 35 0.5104 79.4470 23.9909
0.645 0.2338 36 0.5094 75.4839 22.2326
0.4995 0.2403 37 0.5109 72.1659 22.0061
0.4678 0.2468 38 0.5122 70.2304 21.7397
0.5865 0.2532 39 0.5069 70.4147 21.6065
0.5158 0.2597 40 0.5031 69.9539 21.8463
0.5788 0.2662 41 0.5031 72.5346 21.7131
0.6349 0.2727 42 0.5063 76.4055 22.4857
0.538 0.2792 43 0.5014 76.8664 22.2992
0.555 0.2857 44 0.5023 77.4194 22.4457
0.4672 0.2922 45 0.5120 74.0092 21.9395
0.4928 0.2987 46 0.5070 71.6129 21.9129
0.6274 0.3052 47 0.5016 69.6774 21.4733
0.5623 0.3117 48 0.4977 71.7051 21.8729
0.5685 0.3182 49 0.4959 73.9171 22.2592
0.5436 0.3247 50 0.4993 80.7373 23.8577
0.4211 0.3312 51 0.4968 78.9862 23.3782
0.4938 0.3377 52 0.4900 78.2488 25.4829
0.4733 0.3442 53 0.4896 74.5622 24.1375
0.6003 0.3506 54 0.4886 72.7189 23.7645
0.4854 0.3571 55 0.4870 76.8664 25.5628
0.4438 0.3636 56 0.4829 76.5899 25.0167
0.4712 0.3701 57 0.4801 78.2488 25.0433
0.5744 0.3766 58 0.4767 83.2258 26.6285
0.4535 0.3831 59 0.4780 84.7926 26.8283
0.4707 0.3896 60 0.4779 84.2396 26.5752
0.3958 0.3961 61 0.4773 70.4147 20.1945
0.4707 0.4026 62 0.4715 68.1106 19.6483
0.4653 0.4091 63 0.4665 67.2811 20.4076
0.5445 0.4156 64 0.4649 66.8203 20.1279
0.4072 0.4221 65 0.4654 68.7558 19.7549
0.4684 0.4286 66 0.4704 75.2995 21.4600
0.5302 0.4351 67 0.4709 76.5899 21.3134
0.5751 0.4416 68 0.4648 73.6406 20.8605
0.4728 0.4481 69 0.4594 71.0599 20.4609
0.5762 0.4545 70 0.4606 69.4931 21.7264
0.4199 0.4610 71 0.4611 68.7558 21.3667
0.4893 0.4675 72 0.4581 68.1106 20.1545
0.4346 0.4740 73 0.4585 70.1382 20.3543
0.4761 0.4805 74 0.4589 76.8664 22.5523
0.4519 0.4870 75 0.4565 72.4424 20.4476
0.4474 0.4935 76 0.4535 72.1659 20.8872
0.5484 0.5 77 0.4525 77.6959 23.6313
0.4569 0.5065 78 0.4505 73.2719 21.3667
0.3878 0.5130 79 0.4503 71.7972 20.2078
0.4656 0.5195 80 0.4484 71.9816 20.4742
0.4126 0.5260 81 0.4502 70.9677 20.3810
0.4357 0.5325 82 0.4516 69.4931 20.5009
0.472 0.5390 83 0.4526 69.1244 20.7273
0.4685 0.5455 84 0.4544 66.6359 20.2344
0.4649 0.5519 85 0.4551 68.5714 21.1269
0.4886 0.5584 86 0.4539 74.0092 23.8711
0.4771 0.5649 87 0.4526 75.8525 23.5514
0.5629 0.5714 88 0.4513 73.4562 22.2592
0.4605 0.5779 89 0.4488 71.2442 20.9138
0.5155 0.5844 90 0.4469 70.4147 20.8739
0.5252 0.5909 91 0.4454 71.2442 20.7406
0.579 0.5974 92 0.4430 69.9539 20.4476
0.5703 0.6039 93 0.4416 69.6774 20.2478
0.4007 0.6104 94 0.4413 68.6636 20.1812
0.4791 0.6169 95 0.4459 69.9539 20.0879
0.4786 0.6234 96 0.4477 70.8756 20.2877
0.5217 0.6299 97 0.4431 73.9171 20.7673
0.5215 0.6364 98 0.4450 75.8525 20.7806
0.396 0.6429 99 0.4486 77.0507 21.2335
0.3275 0.6494 100 0.4442 73.5484 20.7406
0.5484 0.6558 101 0.4407 68.2028 21.2335
0.4672 0.6623 102 0.4395 68.5714 21.6731
0.4439 0.6688 103 0.4406 69.2166 21.5932
0.4173 0.6753 104 0.4412 71.0599 21.9262
0.5234 0.6818 105 0.4399 70.0461 21.3001
0.5172 0.6883 106 0.4391 70.8756 20.9804
0.4715 0.6948 107 0.4365 70.5991 21.0870
0.4268 0.7013 108 0.4332 69.9539 20.9538
0.4759 0.7078 109 0.4298 68.6636 20.6740
0.4763 0.7143 110 0.4283 69.4931 20.5808
0.3971 0.7208 111 0.4274 70.6912 19.9414
0.4786 0.7273 112 0.4280 71.7051 20.0879
0.4729 0.7338 113 0.4273 73.2719 20.1812
0.634 0.7403 114 0.4266 72.6267 20.1545
0.4164 0.7468 115 0.4237 70.2304 19.6217
0.4021 0.7532 116 0.4195 68.2028 20.0213
0.4549 0.7597 117 0.4199 73.3641 21.9528
0.3797 0.7662 118 0.4223 67.0046 19.3419
0.5491 0.7727 119 0.4225 67.2811 19.4219
0.4675 0.7792 120 0.4210 67.1889 19.3952
0.4475 0.7857 121 0.4219 70.0461 19.6084
0.449 0.7922 122 0.4262 71.0599 19.8348
0.3707 0.7987 123 0.4275 70.5991 19.4352
0.4504 0.8052 124 0.4263 70.0461 19.4219
0.4069 0.8117 125 0.4256 69.3088 19.3553
0.4096 0.8182 126 0.4280 71.9816 19.5418
0.4349 0.8247 127 0.4259 76.1290 21.3268
0.6083 0.8312 128 0.4220 74.5622 21.1669
0.4557 0.8377 129 0.4170 71.7972 20.1412
0.4353 0.8442 130 0.4155 71.3364 20.1945
0.4454 0.8506 131 0.4165 72.4424 20.3543
0.4069 0.8571 132 0.4168 72.3502 20.3677
0.4593 0.8636 133 0.4167 68.8479 20.1279
0.4516 0.8701 134 0.4181 70.5991 20.4343
0.4776 0.8766 135 0.4174 72.5346 20.8605
0.5247 0.8831 136 0.4176 72.9954 20.9271
0.3926 0.8896 137 0.4184 73.8249 21.0737
0.4326 0.8961 138 0.4168 74.1935 21.0204
0.4367 0.9026 139 0.4156 71.0599 19.3419
0.3393 0.9091 140 0.4155 71.1521 19.3153
0.3951 0.9156 141 0.4149 69.8618 19.1688
0.4072 0.9221 142 0.4136 68.4793 18.7958
0.3785 0.9286 143 0.4135 67.8341 18.4494
0.4984 0.9351 144 0.4145 72.2581 20.0480
0.4445 0.9416 145 0.4152 68.1106 18.6226
0.3699 0.9481 146 0.4156 71.5207 20.3810
0.5052 0.9545 147 0.4153 65.5300 18.7159
0.3976 0.9610 148 0.4144 65.3456 18.5827
0.3596 0.9675 149 0.4141 67.7419 18.9290
0.4472 0.9740 150 0.4145 68.3871 18.7558
0.3702 0.9805 151 0.4163 70.9677 19.1555
0.3835 0.9870 152 0.4157 71.3364 19.3952
0.4931 0.9935 153 0.4152 71.4286 19.6217
0.4445 1.0 154 0.4148 74.1935 20.9138
0.2374 1.0065 155 0.4125 70.3226 19.5151
0.3594 1.0130 156 0.4100 68.4793 19.3020
0.2825 1.0195 157 0.4080 68.3871 19.0222
0.2507 1.0260 158 0.4080 67.3733 18.7558
0.3135 1.0325 159 0.4092 67.5576 18.8224
0.2659 1.0390 160 0.4107 68.2028 18.7958
0.2572 1.0455 161 0.4135 69.8618 19.1954
0.2553 1.0519 162 0.4195 72.6267 19.3686
0.2931 1.0584 163 0.4223 73.0876 19.3419
0.2792 1.0649 164 0.4201 70.9677 19.0755
0.313 1.0714 165 0.4147 67.6498 18.9423
0.2752 1.0779 166 0.4117 64.9770 18.5427
0.2542 1.0844 167 0.4104 64.9770 18.3695
0.2106 1.0909 168 0.4086 64.5161 18.2230
0.263 1.0974 169 0.4078 66.0829 18.3562
0.2285 1.1039 170 0.4110 68.7558 18.7825
0.2883 1.1104 171 0.4171 72.2581 19.0489
0.233 1.1169 172 0.4190 73.8249 20.1812
0.2745 1.1234 173 0.4179 72.9032 19.4485
0.3065 1.1299 174 0.4142 70.4147 19.3153
0.2788 1.1364 175 0.4127 69.6774 20.2478
0.2658 1.1429 176 0.4124 66.9124 18.8491
0.2707 1.1494 177 0.4117 66.1751 18.8224
0.2678 1.1558 178 0.4106 66.5438 18.6759
0.2562 1.1623 179 0.4096 68.4793 18.9556
0.2619 1.1688 180 0.4125 70.9677 18.9823
0.2473 1.1753 181 0.4173 74.1014 19.3686
0.242 1.1818 182 0.4192 74.6544 19.4752
0.2663 1.1883 183 0.4164 77.0507 20.9671
0.2611 1.1948 184 0.4131 69.4931 18.8358
0.3118 1.2013 185 0.4096 68.2028 18.7425
0.257 1.2078 186 0.4082 67.0968 18.5960
0.2417 1.2143 187 0.4085 66.1751 18.4761
0.2081 1.2208 188 0.4094 67.3733 18.4095
0.2106 1.2273 189 0.4121 68.1106 18.5161
0.2268 1.2338 190 0.4158 70.0461 18.5827
0.2435 1.2403 191 0.4187 72.9954 18.9024
0.2316 1.2468 192 0.4214 72.9032 18.9956
0.2946 1.2532 193 0.4197 72.6267 19.1022
0.2379 1.2597 194 0.4172 71.6129 19.6483
0.2915 1.2662 195 0.4148 67.9263 19.1155
0.243 1.2727 196 0.4142 67.8341 19.2620
0.2729 1.2792 197 0.4140 68.0184 19.1288
0.2534 1.2857 198 0.4143 68.3871 19.0889
0.2424 1.2922 199 0.4153 69.7696 18.9823
0.2351 1.2987 200 0.4185 73.4562 19.3819
0.2724 1.3052 201 0.4213 75.1152 19.4352
0.2318 1.3117 202 0.4199 74.3779 19.3819
0.2556 1.3182 203 0.4171 71.3364 19.1022
0.2941 1.3247 204 0.4146 69.2166 18.7292
0.2623 1.3312 205 0.4145 67.9263 18.5960
0.2717 1.3377 206 0.4149 68.2028 18.7425
0.3605 1.3442 207 0.4143 68.1106 18.6493
0.2687 1.3506 208 0.4153 70.3226 18.8491
0.2149 1.3571 209 0.4181 72.1659 18.9024
0.2744 1.3636 210 0.4195 74.6544 19.3020
0.2271 1.3701 211 0.4193 75.6682 19.2087
0.2511 1.3766 212 0.4161 74.3779 19.1022
0.2849 1.3831 213 0.4129 72.4424 18.8624
0.2595 1.3896 214 0.4096 69.7696 18.5827
0.2447 1.3961 215 0.4071 68.2949 18.3296
0.241 1.4026 216 0.4059 66.6359 17.9965
0.2172 1.4091 217 0.4054 66.3594 18.1697
0.2588 1.4156 218 0.4056 66.2673 18.0631
0.2827 1.4221 219 0.4063 66.3594 18.1830
0.2273 1.4286 220 0.4076 67.0968 18.2763
0.2845 1.4351 221 0.4089 68.1106 18.2363
0.2785 1.4416 222 0.4096 68.4793 18.4894
0.2451 1.4481 223 0.4099 69.7696 18.4894
0.2743 1.4545 224 0.4099 69.0323 18.4095
0.2867 1.4610 225 0.4087 69.4931 18.4628
0.2952 1.4675 226 0.4085 69.3088 18.3296
0.2125 1.4740 227 0.4087 69.1244 18.1964
0.3202 1.4805 228 0.4086 69.3088 18.2630
0.2463 1.4870 229 0.4076 69.7696 18.3562
0.2248 1.4935 230 0.4061 70.1382 18.5027
0.2257 1.5 231 0.4048 70.2304 18.5161
0.2482 1.5065 232 0.4040 70.3226 18.5827
0.2088 1.5130 233 0.4028 69.7696 18.6493
0.2675 1.5195 234 0.4033 69.5853 18.5827
0.1978 1.5260 235 0.4031 69.6774 18.7292
0.2323 1.5325 236 0.4027 70.2304 18.8091
0.3032 1.5390 237 0.4025 70.2304 18.6359
0.2307 1.5455 238 0.4023 70.0461 18.5960
0.2656 1.5519 239 0.4014 70.3226 18.5294
0.2518 1.5584 240 0.4009 69.9539 18.4361
0.2484 1.5649 241 0.4003 69.6774 18.3429
0.2481 1.5714 242 0.3999 69.4009 18.3562
0.1823 1.5779 243 0.3999 69.2166 18.2896
0.2387 1.5844 244 0.3989 68.9401 18.2363
0.2315 1.5909 245 0.3980 68.1106 18.1697
0.2852 1.5974 246 0.3973 67.5576 17.9832
0.3159 1.6039 247 0.3971 66.5438 17.7967
0.255 1.6104 248 0.3967 66.9124 17.7035
0.3021 1.6169 249 0.3967 66.7281 18.1697
0.3121 1.6234 250 0.3967 66.7281 18.1431
0.2645 1.6299 251 0.3969 66.9124 18.1697
0.2277 1.6364 252 0.3969 67.4654 18.2896
0.2667 1.6429 253 0.3972 67.4654 18.1697
0.2088 1.6494 254 0.3976 68.7558 18.3296
0.2262 1.6558 255 0.3978 68.3871 18.1697
0.2342 1.6623 256 0.3985 68.7558 18.1164
0.2769 1.6688 257 0.3997 70.3226 18.1564
0.2618 1.6753 258 0.4001 71.1521 18.1297
0.2136 1.6818 259 0.4012 72.5346 18.3562
0.2632 1.6883 260 0.4020 72.4424 18.3296
0.2889 1.6948 261 0.4021 72.6267 18.4228
0.2104 1.7013 262 0.4017 71.9816 18.3562
0.1867 1.7078 263 0.4010 72.8111 18.4361
0.2649 1.7143 264 0.4007 71.4286 18.2896
0.2398 1.7208 265 0.3996 70.3226 18.2097
0.2085 1.7273 266 0.3990 69.0323 18.1297
0.315 1.7338 267 0.3981 68.2949 18.0099
0.1846 1.7403 268 0.3976 68.2949 18.0365
0.2191 1.7468 269 0.3975 67.9263 18.0631
0.2116 1.7532 270 0.3971 67.6498 17.9566
0.2348 1.7597 271 0.3971 67.5576 17.9965
0.2362 1.7662 272 0.3969 67.1889 17.9433
0.2336 1.7727 273 0.3971 67.7419 18.0631
0.2616 1.7792 274 0.3969 67.6498 17.9965
0.2972 1.7857 275 0.3971 68.0184 18.1697
0.2366 1.7922 276 0.3973 67.8341 18.1297
0.2156 1.7987 277 0.3972 68.2028 18.1031
0.269 1.8052 278 0.3976 68.5714 18.2630
0.295 1.8117 279 0.3978 69.6774 18.3029
0.2436 1.8182 280 0.3976 69.0323 18.1830
0.2969 1.8247 281 0.3975 69.9539 18.5427
0.1841 1.8312 282 0.3979 69.6774 18.3828
0.2327 1.8377 283 0.3981 69.8618 18.3162
0.2736 1.8442 284 0.3983 69.9539 18.4095
0.1951 1.8506 285 0.3984 70.3226 18.4361
0.2151 1.8571 286 0.3984 70.2304 18.3562
0.2149 1.8636 287 0.3985 70.3226 18.4361
0.2435 1.8701 288 0.3986 69.9539 18.3429
0.2961 1.8766 289 0.3983 70.3226 18.3562
0.2376 1.8831 290 0.3982 70.3226 18.3828
0.3311 1.8896 291 0.3984 69.9539 18.2763
0.2687 1.8961 292 0.3986 70.2304 18.2763
0.2669 1.9026 293 0.3983 70.3226 18.3695
0.1972 1.9091 294 0.3984 71.0599 18.3828
0.264 1.9156 295 0.3985 70.2304 18.2763
0.2202 1.9221 296 0.3986 70.3226 18.3828
0.2047 1.9286 297 0.3985 70.3226 18.4095
0.2993 1.9351 298 0.3981 70.0461 18.2496
0.263 1.9416 299 0.3984 70.1382 18.3029
0.2374 1.9481 300 0.3982 70.0461 18.2630
0.2328 1.9545 301 0.3980 70.1382 18.3429
0.2677 1.9610 302 0.3980 70.8756 18.4228
0.2927 1.9675 303 0.3982 70.1382 18.2896
0.2999 1.9740 304 0.3982 70.1382 18.2496
0.2203 1.9805 305 0.3980 70.4147 18.2763
0.2215 1.9870 306 0.3982 70.4147 18.3562
0.2392 1.9935 307 0.3981 70.2304 18.3029
0.2952 2.0 308 0.3983 70.0461 18.1964

Framework versions

  • Transformers 4.57.3
  • Pytorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1
Safetensors

  • Model size: 2B params
  • Tensor type: F32

Model tree for nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_1.8e-5

  • Finetuned from openai/whisper-large-v3