Whisper Large v3 - Japanese Zatoichi ASR

This model is a fine-tuned version of openai/whisper-large-v3 for Japanese speech recognition (the training dataset is not documented in this card). It achieves the following results on the evaluation set:

  • Loss: 0.4023
  • WER: 70.2304
  • CER: 18.8091
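WER (word error rate) and CER (character error rate) are both reported in percent. For Japanese, WER depends heavily on how the text is split into words, so CER is usually the more informative of the two, which is why it is far lower here. A minimal sketch of how these metrics are typically computed (an illustrative implementation, not the evaluation code behind this card):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (one-row DP)."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            prev, d[j] = d[j], min(d[j] + 1,         # deletion
                                   d[j - 1] + 1,     # insertion
                                   prev + (r != h))  # substitution
    return d[len(hyp)]

def wer(ref_text, hyp_text):
    """Word error rate: edit distance over whitespace tokens / reference length."""
    ref, hyp = ref_text.split(), hyp_text.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(ref_text, hyp_text):
    """Character error rate: edit distance over characters / reference length."""
    return edit_distance(list(ref_text), list(hyp_text)) / len(ref_text)
```

Multiply by 100 to match the percentage scale used in this card.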

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1.2e-05
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused (betas=(0.9, 0.999), epsilon=1e-08; no additional optimizer arguments)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
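The warmup ratio and cosine schedule above combine into a linear ramp followed by a cosine decay. A sketch of the schedule, assuming 308 total optimizer steps (the results table reaches epoch 2.0 at step 308) and the standard cosine-with-warmup shape used by common trainers:

```python
import math

PEAK_LR = 1.2e-5                        # learning_rate above
TOTAL_STEPS = 308                       # step count at epoch 2.0 in the results table
WARMUP_STEPS = int(TOTAL_STEPS * 0.1)   # lr_scheduler_warmup_ratio 0.1 -> 30 steps

def lr_at(step):
    """Linear warmup from 0 to PEAK_LR, then cosine decay back to 0."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / max(1, WARMUP_STEPS)
    progress = (step - WARMUP_STEPS) / max(1, TOTAL_STEPS - WARMUP_STEPS)
    return PEAK_LR * 0.5 * (1.0 + math.cos(math.pi * progress))
```

So the peak learning rate of 1.2e-05 is only reached around step 30, roughly where the evaluation metrics in the table below start improving steadily.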

Training results

Training Loss  Epoch  Step  Validation Loss  WER  CER
1.2387 0.0065 1 1.2545 98.1567 40.8419
1.2196 0.0130 2 0.9960 98.8940 38.7638
1.1013 0.0195 3 0.9804 96.2212 38.2576
0.9256 0.0260 4 0.9607 94.3779 37.0987
0.9936 0.0325 5 0.9443 92.7189 36.0730
0.9906 0.0390 6 0.9115 88.2028 34.8342
1.1167 0.0455 7 0.8777 85.7143 30.7713
1.0074 0.0519 8 0.8355 88.3871 31.9968
0.9264 0.0584 9 0.8127 83.1336 28.6533
0.8625 0.0649 10 0.7764 88.9401 29.9587
0.7997 0.0714 11 0.7491 94.3779 30.6248
0.8926 0.0779 12 0.7277 88.2949 31.7037
0.7548 0.0844 13 0.7070 89.1244 32.8094
0.7272 0.0909 14 0.6851 86.3594 32.0634
0.6973 0.0974 15 0.6627 85.8065 28.6399
0.7584 0.1039 16 0.6477 87.0046 29.8388
0.6677 0.1104 17 0.6330 89.2166 30.6514
0.532 0.1169 18 0.6143 89.3088 31.7970
0.753 0.1234 19 0.6094 91.3364 30.1452
0.6006 0.1299 20 0.5956 77.3272 24.9767
0.5539 0.1364 21 0.5850 77.5115 24.8035
0.6033 0.1429 22 0.5780 80.2765 25.7759
0.716 0.1494 23 0.5722 80.3687 25.5362
0.6032 0.1558 24 0.5647 82.7650 26.0956
0.6596 0.1623 25 0.5590 85.8065 28.1071
0.5677 0.1688 26 0.5558 82.3963 25.9358
0.597 0.1753 27 0.5516 79.3548 24.6570
0.5681 0.1818 28 0.5464 79.0783 24.4439
0.6594 0.1883 29 0.5423 80.2765 24.3772
0.561 0.1948 30 0.5384 82.3041 24.9500
0.5226 0.2013 31 0.5383 80.1843 24.6303
0.6637 0.2078 32 0.5326 73.7327 22.5123
0.5193 0.2143 33 0.5248 75.9447 23.6180
0.5131 0.2208 34 0.5205 78.6175 24.2574
0.4955 0.2273 35 0.5170 77.8802 23.3382
0.6512 0.2338 36 0.5130 77.7880 24.1508
0.4991 0.2403 37 0.5142 72.9954 21.5932
0.4603 0.2468 38 0.5122 71.1521 21.7264
0.5906 0.2532 39 0.5097 70.1382 21.5799
0.5234 0.2597 40 0.5075 70.5991 21.8196
0.5965 0.2662 41 0.5052 70.4147 22.4324
0.6343 0.2727 42 0.5056 73.6406 22.9253
0.5366 0.2792 43 0.5026 74.0092 23.7112
0.5434 0.2857 44 0.5033 78.8940 24.1375
0.4729 0.2922 45 0.5084 80.0922 23.4315
0.4871 0.2987 46 0.5065 78.9862 23.1384
0.6239 0.3052 47 0.5000 73.3641 21.0603
0.5698 0.3117 48 0.4952 71.2442 21.2202
0.5706 0.3182 49 0.4924 71.7972 21.3534
0.548 0.3247 50 0.4929 72.5346 21.5399
0.4298 0.3312 51 0.4910 77.0507 22.9919
0.4842 0.3377 52 0.4872 75.5760 23.6446
0.4742 0.3442 53 0.4870 71.5207 21.2335
0.5962 0.3506 54 0.4870 69.9539 20.9538
0.4742 0.3571 55 0.4843 69.8618 20.6208
0.4435 0.3636 56 0.4809 69.6774 20.5142
0.4674 0.3701 57 0.4792 75.0230 22.4058
0.5585 0.3766 58 0.4780 75.0230 22.2859
0.4493 0.3831 59 0.4773 74.4700 21.7930
0.4704 0.3896 60 0.4770 74.5622 22.0461
0.3957 0.3961 61 0.4755 72.8111 21.2468
0.472 0.4026 62 0.4727 72.6267 21.3001
0.4625 0.4091 63 0.4685 72.4424 21.3534
0.5472 0.4156 64 0.4656 73.4562 22.7521
0.4124 0.4221 65 0.4651 73.4562 22.6855
0.4677 0.4286 66 0.4667 76.1290 22.8853
0.5311 0.4351 67 0.4651 76.2212 24.1908
0.5727 0.4416 68 0.4615 72.6267 21.5799
0.4681 0.4481 69 0.4592 70.2304 21.2335
0.5708 0.4545 70 0.4591 68.3871 19.9281
0.4197 0.4610 71 0.4579 67.5576 19.7283
0.4804 0.4675 72 0.4550 66.2673 19.7682
0.4314 0.4740 73 0.4531 66.8203 19.7815
0.4771 0.4805 74 0.4522 67.3733 19.5551
0.4502 0.4870 75 0.4522 69.9539 20.0480
0.4435 0.4935 76 0.4509 72.4424 20.7806
0.5332 0.5 77 0.4510 72.2581 20.5675
0.4571 0.5065 78 0.4506 71.8894 21.9129
0.3862 0.5130 79 0.4508 72.1659 22.0061
0.4579 0.5195 80 0.4468 70.4147 20.4875
0.4187 0.5260 81 0.4449 68.9401 20.2744
0.427 0.5325 82 0.4456 68.2028 20.4875
0.4666 0.5390 83 0.4489 68.6636 20.5009
0.4862 0.5455 84 0.4517 68.3871 20.9937
0.4543 0.5519 85 0.4515 68.9401 21.0337
0.4785 0.5584 86 0.4503 68.7558 20.9005
0.4731 0.5649 87 0.4496 69.7696 20.8872
0.5544 0.5714 88 0.4505 69.3088 21.6465
0.4622 0.5779 89 0.4505 70.5069 21.9129
0.5192 0.5844 90 0.4468 69.5853 21.5799
0.5293 0.5909 91 0.4431 69.8618 20.6208
0.5632 0.5974 92 0.4405 68.1106 20.2611
0.5592 0.6039 93 0.4416 68.2028 20.3144
0.3998 0.6104 94 0.4454 72.3502 21.6331
0.47 0.6169 95 0.4512 72.4424 21.7131
0.4799 0.6234 96 0.4492 72.6267 21.9129
0.5258 0.6299 97 0.4404 69.6774 21.1269
0.5195 0.6364 98 0.4422 71.1521 20.3011
0.4049 0.6429 99 0.4477 75.0230 20.6341
0.3254 0.6494 100 0.4464 75.4839 20.6074
0.5597 0.6558 101 0.4425 72.2581 20.3677
0.4612 0.6623 102 0.4394 70.5069 20.9271
0.4414 0.6688 103 0.4388 69.9539 19.6350
0.418 0.6753 104 0.4395 68.5714 20.0480
0.5207 0.6818 105 0.4397 68.4793 19.8615
0.5119 0.6883 106 0.4385 68.6636 19.9814
0.4665 0.6948 107 0.4366 68.5714 20.8339
0.4176 0.7013 108 0.4333 68.9401 20.5808
0.4731 0.7078 109 0.4330 70.9677 20.9671
0.4721 0.7143 110 0.4357 73.7327 21.3934
0.4045 0.7208 111 0.4365 74.0092 21.2868
0.4759 0.7273 112 0.4353 74.2857 20.9538
0.4691 0.7338 113 0.4304 72.5346 20.7273
0.6365 0.7403 114 0.4270 71.3364 20.5009
0.4084 0.7468 115 0.4250 73.1797 20.9538
0.3928 0.7532 116 0.4234 73.2719 22.9652
0.4567 0.7597 117 0.4214 72.4424 22.3658
0.3792 0.7662 118 0.4193 72.0737 22.3658
0.5385 0.7727 119 0.4178 70.4147 21.7131
0.464 0.7792 120 0.4168 67.4654 19.5284
0.4421 0.7857 121 0.4176 68.5714 19.6217
0.4436 0.7922 122 0.4203 69.3088 19.5684
0.3691 0.7987 123 0.4220 70.4147 19.6350
0.4508 0.8052 124 0.4218 71.7051 19.7283
0.3994 0.8117 125 0.4229 72.4424 19.8082
0.4129 0.8182 126 0.4264 73.8249 20.1545
0.4259 0.8247 127 0.4277 73.0876 20.4209
0.6061 0.8312 128 0.4261 72.9032 20.0879
0.4646 0.8377 129 0.4220 71.1521 19.6616
0.4371 0.8442 130 0.4192 72.7189 20.2744
0.4365 0.8506 131 0.4177 72.5346 20.3011
0.4095 0.8571 132 0.4179 71.4286 20.6208
0.4453 0.8636 133 0.4194 68.8479 20.1812
0.4615 0.8701 134 0.4195 70.1382 21.6331
0.4765 0.8766 135 0.4185 69.5853 20.5941
0.5094 0.8831 136 0.4162 69.7696 20.6074
0.3878 0.8896 137 0.4160 69.1244 20.2344
0.4323 0.8961 138 0.4159 70.5991 20.6740
0.4405 0.9026 139 0.4168 72.3502 21.0470
0.3511 0.9091 140 0.4170 73.9171 20.9538
0.3945 0.9156 141 0.4166 70.5069 19.0622
0.4038 0.9221 142 0.4159 74.0092 20.5408
0.3858 0.9286 143 0.4154 77.6959 22.8986
0.4924 0.9351 144 0.4156 77.6037 22.6722
0.448 0.9416 145 0.4160 76.6820 22.7521
0.3748 0.9481 146 0.4154 75.3917 22.3525
0.5069 0.9545 147 0.4139 69.8618 19.9547
0.3942 0.9610 148 0.4129 68.0184 19.8348
0.3658 0.9675 149 0.4112 64.4240 18.5960
0.4506 0.9740 150 0.4100 64.7926 18.8358
0.3643 0.9805 151 0.4103 66.2673 18.8757
0.3833 0.9870 152 0.4106 71.2442 20.3543
0.4862 0.9935 153 0.4113 73.2719 20.7673
0.4471 1.0 154 0.4126 71.3364 19.8615
0.2557 1.0065 155 0.4131 71.7972 20.0080
0.3845 1.0130 156 0.4122 69.8618 19.9014
0.2921 1.0195 157 0.4116 68.1106 19.6084
0.2706 1.0260 158 0.4110 70.5991 20.7007
0.3347 1.0325 159 0.4114 70.7834 20.8872
0.2923 1.0390 160 0.4125 67.3733 19.2620
0.2839 1.0455 161 0.4137 71.4286 20.4875
0.2757 1.0519 162 0.4182 70.7834 19.2087
0.3129 1.0584 163 0.4220 72.7189 19.1155
0.2967 1.0649 164 0.4229 72.2581 18.9556
0.336 1.0714 165 0.4196 70.1382 18.7825
0.2923 1.0779 166 0.4164 68.9401 18.6626
0.2708 1.0844 167 0.4139 67.9263 18.5560
0.2213 1.0909 168 0.4117 65.3456 18.2097
0.2752 1.0974 169 0.4113 70.0461 19.8881
0.2436 1.1039 170 0.4131 71.0599 19.9414
0.319 1.1104 171 0.4158 73.4562 20.3410
0.2463 1.1169 172 0.4176 70.1382 18.7159
0.2937 1.1234 173 0.4189 71.1521 19.0755
0.3241 1.1299 174 0.4181 70.7834 19.3020
0.295 1.1364 175 0.4170 69.9539 19.0089
0.2885 1.1429 176 0.4169 69.2166 19.0755
0.2987 1.1494 177 0.4169 69.5853 19.2887
0.2859 1.1558 178 0.4174 69.4009 18.8757
0.2773 1.1623 179 0.4166 69.5853 18.8757
0.2937 1.1688 180 0.4175 70.5069 18.9157
0.2614 1.1753 181 0.4189 70.8756 18.7292
0.2594 1.1818 182 0.4198 71.4286 19.1022
0.2838 1.1883 183 0.4190 70.6912 18.9690
0.2784 1.1948 184 0.4178 69.8618 18.7958
0.3273 1.2013 185 0.4155 68.5714 18.7425
0.2767 1.2078 186 0.4142 68.0184 18.7425
0.2593 1.2143 187 0.4143 68.0184 18.7425
0.2228 1.2208 188 0.4149 67.9263 18.6626
0.2207 1.2273 189 0.4172 68.5714 18.8358
0.2452 1.2338 190 0.4191 69.7696 19.0489
0.2682 1.2403 191 0.4221 71.3364 19.1821
0.2552 1.2468 192 0.4239 72.1659 19.4086
0.2921 1.2532 193 0.4242 72.5346 19.3952
0.2647 1.2597 194 0.4228 72.2581 19.5284
0.3187 1.2662 195 0.4201 70.2304 19.2487
0.262 1.2727 196 0.4190 69.4931 19.2354
0.295 1.2792 197 0.4184 69.7696 19.1954
0.2767 1.2857 198 0.4177 68.9401 19.0622
0.2671 1.2922 199 0.4179 70.1382 18.9956
0.2547 1.2987 200 0.4194 72.3502 19.4752
0.2936 1.3052 201 0.4212 73.1797 19.5551
0.257 1.3117 202 0.4205 72.3502 19.5418
0.2798 1.3182 203 0.4185 72.2581 19.3286
0.3145 1.3247 204 0.4166 71.0599 19.2487
0.2746 1.3312 205 0.4160 73.7327 20.5275
0.2863 1.3377 206 0.4160 74.0092 20.6074
0.3857 1.3442 207 0.4155 73.8249 20.3677
0.292 1.3506 208 0.4165 74.3779 20.4875
0.2313 1.3571 209 0.4185 76.4055 20.4875
0.3006 1.3636 210 0.4203 77.5115 20.6074
0.2402 1.3701 211 0.4194 74.5622 19.3419
0.2722 1.3766 212 0.4178 73.3641 19.1022
0.3094 1.3831 213 0.4154 76.8664 20.5808
0.2752 1.3896 214 0.4134 72.0737 19.0356
0.268 1.3961 215 0.4113 74.4700 20.4076
0.2845 1.4026 216 0.4097 74.1014 20.2611
0.2452 1.4091 217 0.4086 72.2581 20.2611
0.2888 1.4156 218 0.4084 68.6636 18.8757
0.3091 1.4221 219 0.4081 68.6636 18.9690
0.2407 1.4286 220 0.4085 69.3088 18.9690
0.3073 1.4351 221 0.4098 70.7834 19.1688
0.3 1.4416 222 0.4105 71.7972 19.2887
0.2637 1.4481 223 0.4113 72.3502 19.4352
0.307 1.4545 224 0.4113 71.9816 19.2354
0.316 1.4610 225 0.4098 71.8894 19.2354
0.3163 1.4675 226 0.4099 71.7051 19.0489
0.2301 1.4740 227 0.4098 71.1521 18.9423
0.3494 1.4805 228 0.4100 72.0737 19.1288
0.2636 1.4870 229 0.4097 72.0737 19.1155
0.2405 1.4935 230 0.4088 71.4286 19.1555
0.2492 1.5 231 0.4083 70.9677 19.0222
0.2775 1.5065 232 0.4079 70.5991 18.9423
0.2236 1.5130 233 0.4071 71.4286 18.9956
0.2945 1.5195 234 0.4064 71.3364 19.1421
0.2092 1.5260 235 0.4061 71.4286 19.2087
0.2573 1.5325 236 0.4065 71.7972 19.2354
0.331 1.5390 237 0.4060 71.7051 19.3153
0.2534 1.5455 238 0.4053 71.5207 18.9423
0.3 1.5519 239 0.4056 71.7972 18.9157
0.2816 1.5584 240 0.4051 71.1521 18.6626
0.2697 1.5649 241 0.4044 71.6129 18.8091
0.2721 1.5714 242 0.4046 70.9677 18.7159
0.1971 1.5779 243 0.4043 71.0599 18.6759
0.2545 1.5844 244 0.4037 71.4286 18.7425
0.2556 1.5909 245 0.4032 71.3364 18.8358
0.3141 1.5974 246 0.4022 70.8756 18.8624
0.335 1.6039 247 0.4014 69.4931 18.7159
0.2848 1.6104 248 0.4014 69.1244 18.6759
0.3292 1.6169 249 0.4013 68.7558 18.5827
0.3436 1.6234 250 0.4006 68.3871 18.4628
0.2923 1.6299 251 0.4006 68.1106 18.5161
0.2482 1.6364 252 0.4008 67.9263 18.5827
0.2834 1.6429 253 0.4005 68.1106 18.5427
0.225 1.6494 254 0.4007 68.0184 18.4761
0.2502 1.6558 255 0.4008 68.2028 18.5560
0.2626 1.6623 256 0.4013 69.1244 18.6493
0.3039 1.6688 257 0.4020 69.2166 18.6093
0.2886 1.6753 258 0.4019 69.0323 18.5560
0.2352 1.6818 259 0.4031 69.9539 18.4494
0.2875 1.6883 260 0.4033 71.2442 18.7691
0.307 1.6948 261 0.4036 71.5207 18.8358
0.2336 1.7013 262 0.4037 71.3364 18.7025
0.2183 1.7078 263 0.4037 71.3364 18.7159
0.2902 1.7143 264 0.4034 71.5207 18.6759
0.2614 1.7208 265 0.4032 71.0599 18.5560
0.2297 1.7273 266 0.4030 71.3364 18.6892
0.3411 1.7338 267 0.4029 70.7834 18.6093
0.2022 1.7403 268 0.4021 70.0461 18.6493
0.2409 1.7468 269 0.4020 69.9539 18.8757
0.2423 1.7532 270 0.4019 69.6774 18.6759
0.2538 1.7597 271 0.4016 69.5853 18.7025
0.2672 1.7662 272 0.4014 68.6636 18.5827
0.257 1.7727 273 0.4013 68.5714 18.5294
0.285 1.7792 274 0.4014 68.2949 18.5161
0.325 1.7857 275 0.4015 69.1244 18.5693
0.2646 1.7922 276 0.4016 68.6636 18.5427
0.2407 1.7987 277 0.4016 68.6636 18.4628
0.2996 1.8052 278 0.4014 68.9401 18.4361
0.322 1.8117 279 0.4014 68.7558 18.5294
0.274 1.8182 280 0.4019 69.2166 18.6093
0.3232 1.8247 281 0.4019 69.1244 18.6093
0.2068 1.8312 282 0.4017 69.4931 18.6093
0.2543 1.8377 283 0.4016 69.2166 18.4494
0.299 1.8442 284 0.4019 69.2166 18.5560
0.2164 1.8506 285 0.4020 69.2166 18.5027
0.2444 1.8571 286 0.4021 69.3088 18.5960
0.2397 1.8636 287 0.4020 69.4009 18.5560
0.2662 1.8701 288 0.4023 69.3088 18.5560
0.3251 1.8766 289 0.4021 70.0461 18.6892
0.2616 1.8831 290 0.4022 70.3226 18.7292
0.3575 1.8896 291 0.4021 70.1382 18.6892
0.2968 1.8961 292 0.4019 70.3226 18.6759
0.2989 1.9026 293 0.4022 69.8618 18.7025
0.22 1.9091 294 0.4021 70.4147 18.7425
0.3004 1.9156 295 0.4020 70.4147 18.7825
0.2407 1.9221 296 0.4024 70.5069 18.7958
0.228 1.9286 297 0.4020 69.8618 18.7025
0.3132 1.9351 298 0.4023 70.7834 18.8224
0.3002 1.9416 299 0.4023 70.2304 18.7025
0.2712 1.9481 300 0.4022 70.4147 18.7558
0.2544 1.9545 301 0.4028 70.5991 18.8890
0.2933 1.9610 302 0.4024 70.4147 18.7558
0.3194 1.9675 303 0.4023 70.1382 18.7825
0.334 1.9740 304 0.4021 70.3226 18.8358
0.2522 1.9805 305 0.4022 70.2304 18.7292
0.2414 1.9870 306 0.4023 70.5069 18.8358
0.2648 1.9935 307 0.4023 70.2304 18.7825
0.3236 2.0 308 0.4023 70.2304 18.8091
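Over the full run, validation loss falls from 1.2545 to 0.4023, WER from 98.1567 to 70.2304, and CER from 40.8419 to 18.8091. As a quick check of the relative error reduction between the first and last evaluation steps (a small illustrative calculation, not part of the training code):

```python
def relative_reduction(start, end):
    """Fractional reduction from start to end (0.25 means 25% lower)."""
    return (start - end) / start

wer_drop = relative_reduction(98.1567, 70.2304)  # roughly 28% lower WER
cer_drop = relative_reduction(40.8419, 18.8091)  # roughly 54% lower CER
```

The larger relative gain on CER than on WER is consistent with CER being the more stable metric for Japanese.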

Framework versions

  • Transformers 4.57.3
  • PyTorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1
Model details

  • Repository: nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_1.2e-5
  • Model size: 2B parameters
  • Tensor type: F32 (Safetensors)
  • Fine-tuned from: openai/whisper-large-v3