Whisper Large v3 - Japanese Zatoichi ASR

This model is a fine-tuned version of openai/whisper-large-v3 for Japanese ASR (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 0.3916
  • WER: 67.4654
  • CER: 18.0898
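WER and CER are percentages. Note that the reported WER is likely inflated relative to the CER: Japanese text is not whitespace-delimited, so word-level scoring depends heavily on tokenization, and CER is usually the more meaningful metric for Japanese. A minimal sketch of how these metrics are conventionally computed (word- vs character-level edit distance); this is an illustration, not the exact evaluation code used for this card:

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over sequences.
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,        # deletion
                        dp[j - 1] + 1,    # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]

def wer(ref, hyp):
    """Word error rate (%): edit distance over whitespace-split tokens."""
    r, h = ref.split(), hyp.split()
    return 100.0 * edit_distance(r, h) / len(r)

def cer(ref, hyp):
    """Character error rate (%): edit distance over characters."""
    return 100.0 * edit_distance(ref, hyp) / len(ref)
```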

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2.6e-05
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused (betas=(0.9, 0.999), epsilon=1e-08, no additional optimizer arguments)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
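The schedule implied by these hyperparameters is a linear warmup over the first 10% of steps followed by cosine decay to zero. A sketch of the learning rate at a given step, assuming total_steps = 308 (read off the results table, where epoch 2.0 lands at step 308; this is an inference, not a logged value):

```python
import math

def lr_at(step, base_lr=2.6e-05, total_steps=308, warmup_ratio=0.1):
    """Cosine schedule with linear warmup, as in common Trainer setups."""
    warmup_steps = int(total_steps * warmup_ratio)  # ~30 steps here
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

Under these assumptions the peak learning rate of 2.6e-05 is reached around step 30 and decays smoothly to zero by the final step.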

Training results

Training Loss | Epoch | Step | Validation Loss | WER | CER
1.2387 0.0065 1 1.2545 98.1567 40.8419
1.2196 0.0130 2 0.9948 96.3134 38.3509
1.1004 0.0195 3 0.9713 93.9171 37.2452
0.9157 0.0260 4 0.9293 90.6912 35.6734
0.9635 0.0325 5 0.8749 85.8065 32.7827
0.9217 0.0390 6 0.8290 84.0553 28.9330
1.0356 0.0455 7 0.7788 89.2166 29.8655
0.9003 0.0519 8 0.7476 90.4147 30.2651
0.8303 0.0584 9 0.7224 81.3825 30.5981
0.7767 0.0649 10 0.6909 88.7558 34.8342
0.7277 0.0714 11 0.6621 103.7788 33.3822
0.8113 0.0779 12 0.6374 93.9171 30.2518
0.6771 0.0844 13 0.6178 84.8848 27.5476
0.6352 0.0909 14 0.6092 75.4839 24.7636
0.6253 0.0974 15 0.5936 78.9862 25.3896
0.684 0.1039 16 0.5883 89.8618 28.7598
0.5975 0.1104 17 0.5795 88.5714 28.0538
0.5001 0.1169 18 0.5676 78.6175 24.2707
0.6798 0.1234 19 0.5720 78.2488 25.6427
0.5697 0.1299 20 0.5650 79.6313 27.1613
0.524 0.1364 21 0.5502 73.8249 23.5913
0.5876 0.1429 22 0.5479 75.2995 24.6303
0.6807 0.1494 23 0.5536 82.3041 26.6418
0.5778 0.1558 24 0.5471 88.6636 28.5600
0.6341 0.1623 25 0.5458 85.2535 25.8958
0.5366 0.1688 26 0.5446 87.3733 27.5876
0.5737 0.1753 27 0.5408 83.8710 27.2279
0.5666 0.1818 28 0.5356 78.0645 24.7769
0.6234 0.1883 29 0.5296 83.5023 26.9082
0.5303 0.1948 30 0.5324 87.6498 27.2945
0.5078 0.2013 31 0.5380 77.5115 24.6170
0.6635 0.2078 32 0.5340 76.8664 24.6570
0.5251 0.2143 33 0.5272 72.9032 22.9519
0.5147 0.2208 34 0.5243 76.7742 22.9253
0.4906 0.2273 35 0.5246 78.1567 22.5123
0.6572 0.2338 36 0.5207 74.0092 22.3525
0.507 0.2403 37 0.5204 71.1521 22.1127
0.485 0.2468 38 0.5219 71.5207 22.2459
0.6019 0.2532 39 0.5166 71.6129 22.0861
0.5178 0.2597 40 0.5129 73.9171 23.0718
0.5859 0.2662 41 0.5135 78.7097 26.5086
0.647 0.2727 42 0.5141 88.0184 29.6257
0.5417 0.2792 43 0.5049 72.6267 21.9528
0.5698 0.2857 44 0.5075 71.7051 22.1527
0.4717 0.2922 45 0.5155 71.4286 22.1660
0.5015 0.2987 46 0.5068 74.2857 22.3125
0.6319 0.3052 47 0.5011 75.8525 22.5123
0.5732 0.3117 48 0.5021 76.0369 22.4191
0.5854 0.3182 49 0.4975 78.8018 24.2041
0.5549 0.3247 50 0.4946 78.9862 24.2041
0.4074 0.3312 51 0.4933 74.5622 22.8853
0.4913 0.3377 52 0.4907 71.8894 23.0185
0.4794 0.3442 53 0.4935 76.2212 25.6028
0.6085 0.3506 54 0.4904 75.1152 24.2174
0.5182 0.3571 55 0.4852 74.5622 23.8311
0.46 0.3636 56 0.4844 78.5253 24.0842
0.477 0.3701 57 0.4874 75.8525 24.1508
0.6016 0.3766 58 0.4911 79.0783 25.4962
0.4762 0.3831 59 0.4936 75.0230 23.4981
0.4966 0.3896 60 0.4918 76.5899 23.6846
0.4068 0.3961 61 0.4925 69.4009 20.8339
0.4862 0.4026 62 0.4906 67.5576 21.5133
0.4921 0.4091 63 0.4832 69.1244 21.6465
0.5687 0.4156 64 0.4750 69.9539 20.3943
0.4286 0.4221 65 0.4758 74.0092 21.5133
0.4775 0.4286 66 0.4887 81.8433 22.7255
0.5473 0.4351 67 0.4829 88.8479 27.2679
0.5975 0.4416 68 0.4712 83.5023 26.2821
0.4962 0.4481 69 0.4699 72.7189 24.2174
0.5839 0.4545 70 0.4713 72.9954 24.4039
0.4231 0.4610 71 0.4697 74.7465 24.0442
0.4893 0.4675 72 0.4699 85.0691 27.6275
0.4525 0.4740 73 0.4723 89.9539 29.2261
0.4997 0.4805 74 0.4665 81.1982 24.9101
0.4723 0.4870 75 0.4605 71.7051 20.7806
0.4521 0.4935 76 0.4624 74.3779 23.6712
0.5768 0.5 77 0.4623 77.3272 23.8444
0.4842 0.5065 78 0.4570 69.5853 20.6607
0.3986 0.5130 79 0.4595 75.2995 21.8330
0.4806 0.5195 80 0.4618 75.0230 21.0870
0.4213 0.5260 81 0.4560 71.9816 21.1669
0.4451 0.5325 82 0.4508 67.0046 20.0346
0.467 0.5390 83 0.4544 68.0184 20.4076
0.4654 0.5455 84 0.4609 69.3088 21.6864
0.4724 0.5519 85 0.4649 77.8802 24.9367
0.5144 0.5584 86 0.4645 80.5530 24.4705
0.5054 0.5649 87 0.4631 80.6452 23.1251
0.573 0.5714 88 0.4570 74.3779 20.8339
0.4593 0.5779 89 0.4527 70.3226 20.8206
0.5275 0.5844 90 0.4530 68.6636 20.9005
0.5507 0.5909 91 0.4513 68.2028 20.4875
0.6028 0.5974 92 0.4469 75.6682 23.7512
0.5819 0.6039 93 0.4442 71.1521 20.8472
0.4107 0.6104 94 0.4488 74.7465 20.8739
0.4948 0.6169 95 0.4567 77.6037 21.2335
0.4802 0.6234 96 0.4584 73.9171 20.9005
0.5285 0.6299 97 0.4522 70.4147 20.7540
0.5163 0.6364 98 0.4536 71.3364 21.7930
0.4123 0.6429 99 0.4561 77.2350 24.1375
0.3428 0.6494 100 0.4536 73.4562 23.2450
0.5492 0.6558 101 0.4505 73.3641 22.9253
0.4752 0.6623 102 0.4483 77.4194 23.3915
0.4538 0.6688 103 0.4487 75.7604 22.6189
0.4348 0.6753 104 0.4482 70.0461 20.2877
0.5313 0.6818 105 0.4472 70.6912 20.4343
0.5227 0.6883 106 0.4464 70.5991 21.1003
0.4887 0.6948 107 0.4450 70.4147 21.1136
0.4475 0.7013 108 0.4418 69.7696 20.5941
0.4893 0.7078 109 0.4381 70.2304 20.7273
0.4939 0.7143 110 0.4347 70.5991 20.5541
0.4113 0.7208 111 0.4362 73.5484 19.8215
0.4868 0.7273 112 0.4402 76.6820 20.4875
0.4863 0.7338 113 0.4408 75.6682 20.4609
0.6439 0.7403 114 0.4395 73.8249 20.2344
0.4334 0.7468 115 0.4361 70.7834 20.0346
0.4035 0.7532 116 0.4322 70.3226 22.0061
0.4579 0.7597 117 0.4341 70.3226 22.2725
0.3899 0.7662 118 0.4346 67.0046 20.4742
0.5728 0.7727 119 0.4320 68.1106 20.8872
0.4827 0.7792 120 0.4304 67.9263 19.9414
0.4649 0.7857 121 0.4313 69.2166 19.6616
0.4632 0.7922 122 0.4334 69.0323 19.3153
0.3845 0.7987 123 0.4336 70.4147 20.3011
0.462 0.8052 124 0.4311 68.5714 19.4618
0.4227 0.8117 125 0.4315 68.9401 19.3286
0.4123 0.8182 126 0.4347 73.9171 19.9947
0.441 0.8247 127 0.4315 73.1797 20.0346
0.6278 0.8312 128 0.4268 77.2350 22.5656
0.4613 0.8377 129 0.4230 67.1889 19.2354
0.435 0.8442 130 0.4236 67.8341 18.9024
0.4593 0.8506 131 0.4230 67.1889 18.6892
0.414 0.8571 132 0.4225 69.1244 20.2478
0.466 0.8636 133 0.4253 65.1613 19.1954
0.4587 0.8701 134 0.4293 67.5576 19.6750
0.4973 0.8766 135 0.4297 69.8618 20.3011
0.5384 0.8831 136 0.4290 71.3364 20.1545
0.4018 0.8896 137 0.4281 71.3364 21.0737
0.4317 0.8961 138 0.4252 68.8479 19.8348
0.4488 0.9026 139 0.4241 67.6498 19.6217
0.332 0.9091 140 0.4246 66.2673 18.8358
0.4075 0.9156 141 0.4242 66.3594 18.6626
0.4108 0.9221 142 0.4254 72.0737 21.9395
0.3875 0.9286 143 0.4276 68.7558 19.2487
0.5051 0.9351 144 0.4323 69.5853 19.1821
0.4702 0.9416 145 0.4345 75.1152 21.6065
0.3697 0.9481 146 0.4332 76.1290 22.5390
0.5082 0.9545 147 0.4297 74.3779 22.2859
0.4097 0.9610 148 0.4276 74.1935 22.6988
0.3611 0.9675 149 0.4262 68.1106 20.3943
0.4552 0.9740 150 0.4259 69.0323 20.7673
0.3704 0.9805 151 0.4276 71.0599 21.4067
0.3836 0.9870 152 0.4279 71.9816 21.7131
0.5089 0.9935 153 0.4262 69.1244 20.4343
0.4392 1.0 154 0.4240 68.8479 20.3810
0.2294 1.0065 155 0.4204 67.6498 19.7549
0.3462 1.0130 156 0.4188 67.7419 19.8215
0.261 1.0195 157 0.4186 67.0968 19.3686
0.2355 1.0260 158 0.4195 67.1889 19.1022
0.3061 1.0325 159 0.4213 66.6359 18.7159
0.2569 1.0390 160 0.4237 68.6636 18.9956
0.2493 1.0455 161 0.4260 70.3226 19.1688
0.2591 1.0519 162 0.4307 71.3364 19.4086
0.2971 1.0584 163 0.4322 71.2442 19.2887
0.2727 1.0649 164 0.4293 69.9539 19.4485
0.3089 1.0714 165 0.4248 66.7281 19.1288
0.2642 1.0779 166 0.4235 64.6083 19.3553
0.2434 1.0844 167 0.4205 64.3318 19.3286
0.2019 1.0909 168 0.4161 64.9770 18.7691
0.2492 1.0974 169 0.4139 67.3733 18.6359
0.2201 1.1039 170 0.4184 71.3364 20.1012
0.2896 1.1104 171 0.4247 75.2995 21.1669
0.2341 1.1169 172 0.4225 72.8111 20.0346
0.2613 1.1234 173 0.4181 70.4147 19.9147
0.2915 1.1299 174 0.4152 67.5576 18.9690
0.2607 1.1364 175 0.4163 64.7926 18.8358
0.2609 1.1429 176 0.4172 63.8710 18.6626
0.2681 1.1494 177 0.4145 64.1475 18.6759
0.2154 1.1558 178 0.4121 67.2811 18.8757
0.2509 1.1623 179 0.4137 68.7558 18.9956
0.2478 1.1688 180 0.4212 73.4562 19.3286
0.2417 1.1753 181 0.4246 74.4700 19.3686
0.2384 1.1818 182 0.4201 70.6912 18.8890
0.2564 1.1883 183 0.4122 67.1889 18.7292
0.249 1.1948 184 0.4087 65.5300 18.5161
0.2984 1.2013 185 0.4064 63.3180 18.4095
0.2469 1.2078 186 0.4057 63.5945 18.3828
0.2171 1.2143 187 0.4059 64.7926 18.3562
0.1974 1.2208 188 0.4090 66.8203 18.4361
0.1916 1.2273 189 0.4153 71.0599 18.7691
0.2155 1.2338 190 0.4201 74.1014 19.2087
0.231 1.2403 191 0.4200 75.0230 19.3952
0.2289 1.2468 192 0.4173 73.2719 19.5950
0.268 1.2532 193 0.4137 70.3226 19.3819
0.2255 1.2597 194 0.4128 66.9124 18.8491
0.2726 1.2662 195 0.4125 70.2304 20.5408
0.235 1.2727 196 0.4124 70.7834 20.7673
0.2626 1.2792 197 0.4120 71.7972 20.7406
0.2404 1.2857 198 0.4124 73.1797 20.7939
0.2309 1.2922 199 0.4155 70.9677 19.2087
0.233 1.2987 200 0.4189 74.8387 19.5950
0.2669 1.3052 201 0.4201 76.2212 19.8615
0.2216 1.3117 202 0.4165 75.7604 20.9937
0.2461 1.3182 203 0.4122 73.9171 20.9937
0.276 1.3247 204 0.4110 70.4147 20.5275
0.2493 1.3312 205 0.4124 70.1382 21.1136
0.2635 1.3377 206 0.4128 70.9677 20.3677
0.3373 1.3442 207 0.4117 71.7972 20.3410
0.2595 1.3506 208 0.4114 72.7189 20.3677
0.2089 1.3571 209 0.4123 75.9447 21.3667
0.2618 1.3636 210 0.4144 78.3410 21.7664
0.2183 1.3701 211 0.4148 77.6959 21.3001
0.2365 1.3766 212 0.4120 77.0507 20.6074
0.2665 1.3831 213 0.4091 76.5899 20.8206
0.2352 1.3896 214 0.4059 74.5622 20.5941
0.2363 1.3961 215 0.4038 68.7558 18.4494
0.2109 1.4026 216 0.4029 67.0046 18.4628
0.1937 1.4091 217 0.4029 64.3318 18.2896
0.2505 1.4156 218 0.4031 62.8571 17.8766
0.2695 1.4221 219 0.4032 63.6866 17.7701
0.2159 1.4286 220 0.4044 64.9770 17.9299
0.2786 1.4351 221 0.4061 65.5300 17.7834
0.2654 1.4416 222 0.4066 66.5438 17.9965
0.2208 1.4481 223 0.4066 67.3733 18.0765
0.2551 1.4545 224 0.4064 67.9263 18.0765
0.2697 1.4610 225 0.4049 67.4654 18.0898
0.288 1.4675 226 0.4049 67.4654 18.1431
0.1903 1.4740 227 0.4055 68.2028 18.1697
0.3011 1.4805 228 0.4051 68.2028 17.9965
0.2276 1.4870 229 0.4038 67.6498 18.0631
0.2144 1.4935 230 0.4022 67.9263 18.0232
0.2189 1.5 231 0.4010 68.2028 18.4894
0.2282 1.5065 232 0.4002 68.3871 18.4228
0.1982 1.5130 233 0.4005 68.2949 18.5960
0.2542 1.5195 234 0.4003 68.1106 18.4761
0.1909 1.5260 235 0.4006 67.2811 18.2230
0.2198 1.5325 236 0.4003 67.0046 18.1564
0.2818 1.5390 237 0.3996 67.2811 18.0099
0.2151 1.5455 238 0.3985 67.3733 18.2097
0.2493 1.5519 239 0.3980 67.7419 18.1564
0.239 1.5584 240 0.3977 68.6636 18.2363
0.224 1.5649 241 0.3976 68.0184 18.2896
0.2383 1.5714 242 0.3970 68.5714 18.4761
0.1725 1.5779 243 0.3965 68.6636 18.5693
0.2208 1.5844 244 0.3953 67.9263 18.6226
0.2158 1.5909 245 0.3948 67.0046 18.5161
0.2612 1.5974 246 0.3943 67.0046 18.4761
0.3016 1.6039 247 0.3941 66.9124 18.5693
0.2438 1.6104 248 0.3940 66.5438 18.5294
0.2778 1.6169 249 0.3943 67.1889 18.6892
0.3009 1.6234 250 0.3942 66.9124 18.7825
0.2535 1.6299 251 0.3942 66.7281 18.4894
0.2148 1.6364 252 0.3940 66.5438 18.4894
0.2469 1.6429 253 0.3943 67.0046 18.3962
0.1953 1.6494 254 0.3944 67.6498 18.4494
0.2116 1.6558 255 0.3949 68.0184 18.4761
0.2218 1.6623 256 0.3952 68.6636 18.4228
0.2606 1.6688 257 0.3956 68.4793 18.0631
0.2546 1.6753 258 0.3963 68.6636 17.9965
0.2079 1.6818 259 0.3970 69.3088 18.0099
0.2547 1.6883 260 0.3972 69.2166 17.8234
0.2679 1.6948 261 0.3973 69.4931 17.9166
0.1978 1.7013 262 0.3964 69.4009 17.8766
0.1711 1.7078 263 0.3955 68.4793 17.6635
0.2443 1.7143 264 0.3946 68.2028 17.5969
0.224 1.7208 265 0.3937 67.0046 17.7035
0.1986 1.7273 266 0.3934 66.7281 17.7834
0.2985 1.7338 267 0.3927 65.8065 17.7035
0.1725 1.7403 268 0.3924 65.6221 17.7434
0.2004 1.7468 269 0.3921 65.7143 17.8900
0.2046 1.7532 270 0.3920 64.4240 17.7568
0.2287 1.7597 271 0.3920 65.0691 17.6635
0.2195 1.7662 272 0.3919 65.1613 17.8234
0.2275 1.7727 273 0.3917 65.1613 17.7301
0.2527 1.7792 274 0.3918 65.4378 17.7834
0.2797 1.7857 275 0.3920 65.8986 17.9299
0.218 1.7922 276 0.3921 66.4516 18.1031
0.1996 1.7987 277 0.3921 65.8986 17.8367
0.2537 1.8052 278 0.3923 66.2673 17.9166
0.2878 1.8117 279 0.3922 66.2673 17.9166
0.2298 1.8182 280 0.3921 66.5438 18.0099
0.2759 1.8247 281 0.3923 66.4516 17.9566
0.1731 1.8312 282 0.3922 66.8203 17.9965
0.2199 1.8377 283 0.3924 67.5576 18.1031
0.2467 1.8442 284 0.3924 67.5576 18.1297
0.1854 1.8506 285 0.3922 67.7419 18.1031
0.1897 1.8571 286 0.3923 67.9263 18.1830
0.1998 1.8636 287 0.3924 67.4654 18.1031
0.2244 1.8701 288 0.3925 67.6498 18.0765
0.2853 1.8766 289 0.3924 67.6498 18.1164
0.2224 1.8831 290 0.3922 67.6498 18.1431
0.307 1.8896 291 0.3922 67.6498 18.0898
0.2441 1.8961 292 0.3922 67.5576 18.1164
0.2512 1.9026 293 0.3922 67.4654 18.1164
0.1865 1.9091 294 0.3920 67.5576 18.1164
0.2484 1.9156 295 0.3920 67.6498 18.2230
0.2059 1.9221 296 0.3919 67.9263 18.1031
0.1908 1.9286 297 0.3923 67.7419 18.2230
0.2753 1.9351 298 0.3919 67.5576 18.1031
0.2505 1.9416 299 0.3917 67.6498 18.0365
0.2178 1.9481 300 0.3920 67.6498 18.1031
0.214 1.9545 301 0.3919 67.7419 18.1031
0.2541 1.9610 302 0.3918 67.6498 18.1564
0.2804 1.9675 303 0.3917 67.4654 18.1697
0.2789 1.9740 304 0.3915 67.9263 18.1830
0.2039 1.9805 305 0.3916 67.9263 18.2630
0.2073 1.9870 306 0.3921 67.4654 18.0631
0.2216 1.9935 307 0.3919 67.6498 18.0365
0.2754 2.0 308 0.3916 67.4654 18.0898

Framework versions

  • Transformers 4.57.3
  • PyTorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1
Model size: 2B params (F32, safetensors)

Model: nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_2.6e-5 (fine-tuned from openai/whisper-large-v3)