Whisper Large v3 - Japanese Zatoichi ASR

This model is a fine-tuned version of openai/whisper-large-v3 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4060
  • WER: 70.2304
  • CER: 18.4628
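
The WER and CER figures above are edit-distance metrics reported as percentages. As a reference for how they are defined, here is a minimal pure-Python sketch (production evaluations typically use a library such as `jiwer` or `evaluate` rather than hand-rolled code):

```python
def edit_distance(ref, hyp):
    # Classic single-row Levenshtein distance over two token sequences.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i  # prev holds the diagonal cell
        for j, h in enumerate(hyp, 1):
            prev, d[j] = d[j], min(d[j] + 1,        # deletion
                                   d[j - 1] + 1,    # insertion
                                   prev + (r != h)) # substitution / match
    return d[len(hyp)]

def wer(reference: str, hypothesis: str) -> float:
    # Word Error Rate: word-level edit distance / reference word count, in %.
    ref, hyp = reference.split(), hypothesis.split()
    return 100.0 * edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    # Character Error Rate: character-level edit distance / reference length, in %.
    return 100.0 * edit_distance(list(reference), list(hypothesis)) / len(reference)
```

For Japanese, CER is usually the more informative of the two, since WER depends heavily on how the text is tokenized into "words" — which is consistent with the large gap between the WER and CER values reported here.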

Model description

More information needed

Intended uses & limitations

More information needed
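
In the absence of documented usage notes, transcription should work through the standard Transformers ASR pipeline. The sketch below assumes the repo id from this card; `audio.wav` is a hypothetical input file you would replace with your own:

```python
# Hedged sketch: running the fine-tuned checkpoint for Japanese transcription.
MODEL_ID = "nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_1.0e-5"

# Whisper generation settings: force Japanese output in transcription mode.
GENERATE_KWARGS = {"language": "japanese", "task": "transcribe"}

def build_asr_pipeline():
    # ~2B parameters in F32 (~8 GB of weights); a GPU is strongly recommended.
    from transformers import pipeline
    return pipeline(
        "automatic-speech-recognition",
        model=MODEL_ID,
        chunk_length_s=30,  # chunk long audio into 30 s windows
    )

if __name__ == "__main__":
    asr = build_asr_pipeline()
    print(asr("audio.wav", generate_kwargs=GENERATE_KWARGS)["text"])
```

Given the evaluation WER/CER above, outputs will likely need post-editing for anything beyond rough transcription.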

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
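
The warmup ratio and cosine schedule translate into concrete step counts as sketched below. This assumes the usual Transformers convention of `ceil(warmup_ratio * total_steps)` for warmup, and reads the total of 308 optimizer steps (154 per epoch × 2 epochs) off the training table that follows:

```python
import math

MAX_STEPS = 308      # from the training log: 2 epochs, 154 steps each
WARMUP_RATIO = 0.1   # lr_scheduler_warmup_ratio above
BASE_LR = 1e-5       # learning_rate above

# Linear-warmup step count derived from the ratio.
warmup_steps = math.ceil(WARMUP_RATIO * MAX_STEPS)

def cosine_lr(step: int) -> float:
    # Linear warmup to BASE_LR, then cosine decay to 0 over the remaining steps
    # (the default shape of the "cosine" scheduler).
    if step < warmup_steps:
        return BASE_LR * step / warmup_steps
    progress = (step - warmup_steps) / (MAX_STEPS - warmup_steps)
    return BASE_LR * 0.5 * (1.0 + math.cos(math.pi * progress))
```

So the learning rate peaks at 1e-05 around step 31 and decays to roughly zero by step 308, which matches the flattening of the validation loss near the end of the table.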

Training results

Training Loss  Epoch  Step  Validation Loss  WER  CER
1.2387 0.0065 1 1.2545 98.1567 40.8419
1.2196 0.0130 2 0.9956 96.2212 38.5773
1.1012 0.0195 3 0.9816 96.2212 38.3908
0.927 0.0260 4 0.9659 95.0230 37.6715
0.9989 0.0325 5 0.9486 92.7189 36.1130
0.9951 0.0390 6 0.9192 88.8479 34.9674
1.1244 0.0455 7 0.9006 87.5576 34.0749
1.032 0.0519 8 0.8547 89.3088 33.0492
0.9465 0.0584 9 0.8286 84.4240 29.8255
0.8775 0.0649 10 0.8071 93.5484 31.6105
0.8273 0.0714 11 0.7723 93.1797 31.3840
0.917 0.0779 12 0.7493 94.1935 31.3974
0.7742 0.0844 13 0.7297 92.7189 33.5420
0.7525 0.0909 14 0.7098 88.7558 32.8493
0.7236 0.0974 15 0.6881 86.7281 32.3831
0.7826 0.1039 16 0.6686 89.7696 30.4249
0.6916 0.1104 17 0.6529 88.5714 30.3850
0.5472 0.1169 18 0.6340 92.6267 33.1957
0.7826 0.1234 19 0.6260 89.7696 33.9417
0.6127 0.1299 20 0.6122 89.7696 29.6124
0.5714 0.1364 21 0.5977 90.5991 28.9596
0.6087 0.1429 22 0.5894 81.0138 26.2821
0.7335 0.1494 23 0.5815 77.2350 24.8035
0.6164 0.1558 24 0.5740 80.7373 26.0690
0.6704 0.1623 25 0.5688 80.5530 25.6960
0.5722 0.1688 26 0.5659 84.3318 27.2412
0.6052 0.1753 27 0.5596 87.0046 27.7608
0.5731 0.1818 28 0.5567 80.2765 24.5904
0.6719 0.1883 29 0.5502 81.6590 24.4838
0.5693 0.1948 30 0.5456 81.8433 24.7769
0.5305 0.2013 31 0.5471 75.2995 22.6722
0.6789 0.2078 32 0.5386 74.0092 22.4990
0.5221 0.2143 33 0.5293 77.6959 23.8577
0.5166 0.2208 34 0.5251 79.3548 25.3896
0.5011 0.2273 35 0.5219 79.7235 25.2831
0.6547 0.2338 36 0.5190 73.0876 21.6598
0.5039 0.2403 37 0.5195 73.2719 21.6731
0.4667 0.2468 38 0.5167 71.7051 21.6465
0.5978 0.2532 39 0.5133 70.2304 21.5399
0.5252 0.2597 40 0.5102 70.0461 21.6065
0.6047 0.2662 41 0.5080 70.1382 22.4457
0.6358 0.2727 42 0.5088 72.5346 24.1375
0.5356 0.2792 43 0.5066 73.4562 23.9243
0.5468 0.2857 44 0.5060 74.1014 24.6037
0.4777 0.2922 45 0.5096 79.6313 24.3639
0.4845 0.2987 46 0.5099 79.7235 24.2174
0.6275 0.3052 47 0.5048 78.0645 23.9243
0.5762 0.3117 48 0.5000 76.5899 23.0851
0.5713 0.3182 49 0.4966 75.6682 23.2983
0.553 0.3247 50 0.4970 72.9032 21.9928
0.437 0.3312 51 0.4961 72.7189 21.9129
0.489 0.3377 52 0.4915 78.6175 24.0842
0.4744 0.3442 53 0.4903 72.9032 22.0994
0.5908 0.3506 54 0.4891 70.7834 21.3401
0.4778 0.3571 55 0.4875 70.4147 21.2335
0.448 0.3636 56 0.4843 69.1244 20.6874
0.4674 0.3701 57 0.4816 74.1014 22.7921
0.5514 0.3766 58 0.4813 74.0092 22.7388
0.454 0.3831 59 0.4816 74.1014 22.5390
0.4726 0.3896 60 0.4802 74.3779 22.3525
0.4018 0.3961 61 0.4783 74.2857 22.2592
0.4788 0.4026 62 0.4757 68.5714 20.0080
0.4642 0.4091 63 0.4726 72.3502 21.4999
0.5506 0.4156 64 0.4704 72.9954 21.8063
0.4109 0.4221 65 0.4696 73.5484 21.6598
0.4708 0.4286 66 0.4706 74.8387 21.9528
0.5352 0.4351 67 0.4671 74.4700 22.0594
0.5696 0.4416 68 0.4632 73.2719 21.6331
0.4644 0.4481 69 0.4607 70.1382 21.0470
0.5704 0.4545 70 0.4612 68.5714 21.9262
0.4216 0.4610 71 0.4599 67.4654 21.1536
0.4833 0.4675 72 0.4565 67.1889 21.0071
0.4289 0.4740 73 0.4551 72.0737 23.4181
0.4735 0.4805 74 0.4562 71.8894 23.3249
0.4565 0.4870 75 0.4566 69.5853 21.1269
0.4511 0.4935 76 0.4544 72.2581 21.1136
0.5266 0.5 77 0.4534 72.8111 21.3134
0.4574 0.5065 78 0.4533 71.8894 21.9395
0.3887 0.5130 79 0.4546 71.8894 22.0194
0.4571 0.5195 80 0.4519 70.8756 20.5941
0.4165 0.5260 81 0.4492 70.4147 20.5808
0.4289 0.5325 82 0.4489 69.1244 20.5675
0.4696 0.5390 83 0.4511 69.3088 20.8739
0.4808 0.5455 84 0.4544 68.5714 20.6874
0.4512 0.5519 85 0.4546 69.4931 21.1536
0.4756 0.5584 86 0.4530 69.2166 21.0470
0.4695 0.5649 87 0.4507 69.4009 20.8605
0.5501 0.5714 88 0.4512 68.7558 20.8206
0.4615 0.5779 89 0.4534 68.7558 20.7806
0.5171 0.5844 90 0.4522 68.5714 20.7406
0.5357 0.5909 91 0.4477 70.5069 21.0870
0.5665 0.5974 92 0.4427 74.2857 22.7121
0.5576 0.6039 93 0.4427 77.2350 24.0709
0.4047 0.6104 94 0.4481 72.5346 22.1926
0.472 0.6169 95 0.4560 72.6267 22.0461
0.48 0.6234 96 0.4549 72.6267 21.9928
0.5341 0.6299 97 0.4442 72.9954 22.0594
0.523 0.6364 98 0.4431 74.4700 21.7131
0.4111 0.6429 99 0.4482 80.0922 23.1784
0.322 0.6494 100 0.4489 81.1982 24.1108
0.5676 0.6558 101 0.4454 77.6037 23.1517
0.4684 0.6623 102 0.4416 75.1152 22.7255
0.4482 0.6688 103 0.4396 74.6544 21.6731
0.4196 0.6753 104 0.4388 77.3272 22.9519
0.5154 0.6818 105 0.4397 72.1659 20.9271
0.5125 0.6883 106 0.4404 68.2028 19.4352
0.4668 0.6948 107 0.4395 68.1106 19.5950
0.419 0.7013 108 0.4363 71.2442 20.9671
0.4765 0.7078 109 0.4348 73.3641 21.2602
0.4795 0.7143 110 0.4361 76.4055 22.6189
0.4074 0.7208 111 0.4390 76.4977 21.4600
0.472 0.7273 112 0.4405 76.9585 21.4067
0.4686 0.7338 113 0.4378 80.3687 22.7654
0.6422 0.7403 114 0.4337 77.2350 21.4999
0.4122 0.7468 115 0.4300 74.9309 21.3001
0.3986 0.7532 116 0.4269 71.8894 22.8187
0.4592 0.7597 117 0.4256 77.2350 23.8577
0.3768 0.7662 118 0.4242 75.9447 23.1784
0.5402 0.7727 119 0.4233 69.9539 20.9138
0.4685 0.7792 120 0.4216 66.0829 19.2753
0.4475 0.7857 121 0.4207 66.8203 19.3020
0.446 0.7922 122 0.4223 68.2028 19.4485
0.3721 0.7987 123 0.4249 69.4931 19.6217
0.452 0.8052 124 0.4254 72.9032 19.9014
0.4032 0.8117 125 0.4260 73.1797 19.6217
0.4149 0.8182 126 0.4284 78.0645 21.3001
0.4293 0.8247 127 0.4282 82.3041 23.1517
0.6045 0.8312 128 0.4271 81.8433 23.2583
0.4711 0.8377 129 0.4245 73.4562 20.9138
0.4434 0.8442 130 0.4211 72.0737 20.4742
0.4394 0.8506 131 0.4192 70.5069 20.1812
0.4133 0.8571 132 0.4182 69.4009 20.1412
0.4439 0.8636 133 0.4193 69.4931 21.5932
0.4596 0.8701 134 0.4198 69.8618 20.7939
0.4739 0.8766 135 0.4190 69.5853 20.8072
0.5118 0.8831 136 0.4173 70.5069 20.5408
0.3891 0.8896 137 0.4161 69.3088 20.2744
0.4362 0.8961 138 0.4150 69.6774 20.5275
0.4397 0.9026 139 0.4151 70.4147 21.3401
0.3514 0.9091 140 0.4158 71.7051 21.3934
0.3963 0.9156 141 0.4156 72.3502 20.3410
0.4065 0.9221 142 0.4156 72.2581 20.2078
0.3866 0.9286 143 0.4158 72.4424 20.3277
0.4894 0.9351 144 0.4164 73.0876 20.3144
0.4508 0.9416 145 0.4171 74.0092 20.5142
0.3761 0.9481 146 0.4175 72.8111 20.5408
0.5027 0.9545 147 0.4167 71.5207 20.2744
0.397 0.9610 148 0.4168 71.0599 20.2877
0.3688 0.9675 149 0.4161 70.9677 20.1146
0.4576 0.9740 150 0.4145 69.4931 19.9414
0.3695 0.9805 151 0.4134 70.4147 20.4343
0.3859 0.9870 152 0.4132 69.9539 20.2877
0.4897 0.9935 153 0.4132 70.3226 20.3543
0.4517 1.0 154 0.4132 71.1521 20.7806
0.2596 1.0065 155 0.4134 71.7051 21.0470
0.4003 1.0130 156 0.4137 71.7972 20.9138
0.3047 1.0195 157 0.4137 71.7972 20.8472
0.2925 1.0260 158 0.4136 70.5069 20.8872
0.3475 1.0325 159 0.4143 69.7696 20.7673
0.3128 1.0390 160 0.4154 67.0968 19.3819
0.2973 1.0455 161 0.4168 67.9263 19.5551
0.2816 1.0519 162 0.4195 70.4147 19.7149
0.3236 1.0584 163 0.4221 71.7051 19.7815
0.3012 1.0649 164 0.4218 70.6912 19.4485
0.345 1.0714 165 0.4199 69.1244 19.3153
0.3065 1.0779 166 0.4175 67.7419 18.8890
0.2878 1.0844 167 0.4160 67.5576 18.6759
0.2323 1.0909 168 0.4147 67.0046 18.5827
0.2852 1.0974 169 0.4147 66.8203 18.4894
0.2601 1.1039 170 0.4162 68.2028 19.6483
0.3307 1.1104 171 0.4181 72.7189 20.5808
0.2583 1.1169 172 0.4194 74.1935 20.8339
0.2922 1.1234 173 0.4204 70.8756 19.5551
0.342 1.1299 174 0.4204 70.4147 19.4885
0.3027 1.1364 175 0.4208 69.7696 19.4219
0.2992 1.1429 176 0.4209 69.5853 19.4485
0.3099 1.1494 177 0.4221 69.4931 19.2887
0.2818 1.1558 178 0.4223 70.5991 19.3819
0.294 1.1623 179 0.4218 71.2442 19.3153
0.2992 1.1688 180 0.4212 71.2442 19.3686
0.2722 1.1753 181 0.4218 72.2581 19.4352
0.2717 1.1818 182 0.4225 72.7189 19.2887
0.2916 1.1883 183 0.4209 72.3502 19.3819
0.2903 1.1948 184 0.4194 71.5207 19.1821
0.3371 1.2013 185 0.4171 69.7696 18.9290
0.2886 1.2078 186 0.4156 69.4009 18.8091
0.2692 1.2143 187 0.4143 68.2949 18.5827
0.2305 1.2208 188 0.4146 68.1106 18.5827
0.2249 1.2273 189 0.4160 68.8479 18.5693
0.256 1.2338 190 0.4173 68.7558 18.7425
0.2794 1.2403 191 0.4188 69.8618 18.9157
0.2623 1.2468 192 0.4210 70.5991 19.0222
0.3132 1.2532 193 0.4223 70.8756 19.1688
0.2827 1.2597 194 0.4216 71.2442 19.3153
0.3222 1.2662 195 0.4216 70.7834 19.1688
0.2733 1.2727 196 0.4209 70.4147 18.9823
0.3099 1.2792 197 0.4204 70.2304 19.0222
0.2828 1.2857 198 0.4197 70.1382 18.8890
0.2846 1.2922 199 0.4193 70.1382 18.9690
0.2613 1.2987 200 0.4201 71.2442 19.0356
0.3057 1.3052 201 0.4209 71.9816 19.1954
0.2651 1.3117 202 0.4199 71.8894 18.9423
0.291 1.3182 203 0.4186 71.3364 18.9556
0.3278 1.3247 204 0.4169 70.0461 18.9556
0.288 1.3312 205 0.4160 70.3226 18.9823
0.3066 1.3377 206 0.4159 70.1382 18.9024
0.3988 1.3442 207 0.4158 73.4562 20.2744
0.3067 1.3506 208 0.4166 74.7465 20.3410
0.2415 1.3571 209 0.4188 79.5392 21.9795
0.3142 1.3636 210 0.4210 74.0092 19.4219
0.2567 1.3701 211 0.4214 74.5622 19.5284
0.2883 1.3766 212 0.4200 74.5622 19.7549
0.3206 1.3831 213 0.4186 74.0092 19.6350
0.2862 1.3896 214 0.4170 74.0092 19.7815
0.2803 1.3961 215 0.4151 73.9171 19.6750
0.2832 1.4026 216 0.4136 72.9954 19.5151
0.246 1.4091 217 0.4123 71.6129 19.4086
0.3061 1.4156 218 0.4113 70.5069 19.2753
0.3204 1.4221 219 0.4108 73.4562 20.5408
0.2519 1.4286 220 0.4109 73.5484 21.3134
0.3175 1.4351 221 0.4115 74.4700 21.6864
0.3036 1.4416 222 0.4121 74.8387 21.7264
0.2743 1.4481 223 0.4124 75.6682 21.7930
0.3251 1.4545 224 0.4120 75.5760 21.8196
0.3236 1.4610 225 0.4118 75.2995 20.9538
0.3332 1.4675 226 0.4121 76.2212 20.6341
0.2409 1.4740 227 0.4117 75.9447 20.6740
0.3664 1.4805 228 0.4122 75.9447 20.5808
0.2744 1.4870 229 0.4122 76.3134 20.6607
0.2563 1.4935 230 0.4118 75.7604 20.6607
0.2574 1.5 231 0.4114 76.3134 20.5675
0.2927 1.5065 232 0.4114 76.4055 20.7273
0.2332 1.5130 233 0.4102 76.0369 20.8339
0.3094 1.5195 234 0.4099 71.8894 19.1288
0.2222 1.5260 235 0.4096 72.0737 19.2221
0.2614 1.5325 236 0.4094 71.4286 19.1421
0.3485 1.5390 237 0.4093 71.7051 18.9157
0.264 1.5455 238 0.4094 71.6129 18.8624
0.3214 1.5519 239 0.4093 71.9816 18.9157
0.2926 1.5584 240 0.4092 72.6267 18.9423
0.2838 1.5649 241 0.4090 72.5346 18.9423
0.2814 1.5714 242 0.4088 72.1659 18.7958
0.2101 1.5779 243 0.4083 72.3502 18.8890
0.2666 1.5844 244 0.4081 71.8894 18.8624
0.2682 1.5909 245 0.4075 71.3364 18.7292
0.3245 1.5974 246 0.4068 70.5069 18.6626
0.3416 1.6039 247 0.4060 70.5069 18.7159
0.2974 1.6104 248 0.4058 70.3226 18.7292
0.3397 1.6169 249 0.4054 70.5069 18.7825
0.3509 1.6234 250 0.4055 69.4009 18.6226
0.3031 1.6299 251 0.4051 69.1244 18.5560
0.2647 1.6364 252 0.4049 68.7558 18.4361
0.3019 1.6429 253 0.4046 68.4793 18.3562
0.2463 1.6494 254 0.4048 68.2949 18.3695
0.2569 1.6558 255 0.4048 68.7558 18.4095
0.2723 1.6623 256 0.4052 69.7696 18.6493
0.3194 1.6688 257 0.4051 69.5853 18.6093
0.2987 1.6753 258 0.4054 68.8479 18.4761
0.2527 1.6818 259 0.4058 69.4931 18.6759
0.3065 1.6883 260 0.4067 69.0323 18.6226
0.3212 1.6948 261 0.4066 70.3226 18.7159
0.2502 1.7013 262 0.4066 71.0599 18.7025
0.2237 1.7078 263 0.4067 70.5991 18.6892
0.3028 1.7143 264 0.4067 70.5991 18.6226
0.2785 1.7208 265 0.4065 73.9171 20.0080
0.2418 1.7273 266 0.4063 74.2857 20.0480
0.3551 1.7338 267 0.4062 70.6912 18.6626
0.2122 1.7403 268 0.4062 70.6912 18.6359
0.2562 1.7468 269 0.4057 70.6912 18.6626
0.2449 1.7532 270 0.4057 73.7327 20.0213
0.2664 1.7597 271 0.4056 73.6406 19.9147
0.2811 1.7662 272 0.4054 73.2719 19.9147
0.2727 1.7727 273 0.4053 72.9032 19.8615
0.3064 1.7792 274 0.4052 72.7189 19.8348
0.3374 1.7857 275 0.4055 69.2166 18.4361
0.2797 1.7922 276 0.4056 69.1244 18.2763
0.2574 1.7987 277 0.4054 69.3088 18.3429
0.3193 1.8052 278 0.4056 69.0323 18.3029
0.3415 1.8117 279 0.4055 69.3088 18.3695
0.284 1.8182 280 0.4059 69.2166 18.3296
0.3329 1.8247 281 0.4054 69.2166 18.3828
0.2219 1.8312 282 0.4058 69.5853 18.4894
0.2651 1.8377 283 0.4060 70.2304 18.4894
0.3128 1.8442 284 0.4059 69.8618 18.4894
0.223 1.8506 285 0.4056 70.1382 18.4361
0.2572 1.8571 286 0.4059 70.1382 18.4361
0.2546 1.8636 287 0.4061 69.7696 18.3962
0.2788 1.8701 288 0.4058 70.0461 18.4228
0.3391 1.8766 289 0.4054 70.0461 18.4628
0.277 1.8831 290 0.4061 70.2304 18.4628
0.3715 1.8896 291 0.4058 70.2304 18.4894
0.3115 1.8961 292 0.4060 70.1382 18.4628
0.3194 1.9026 293 0.4064 69.6774 18.4095
0.231 1.9091 294 0.4061 70.0461 18.4494
0.3145 1.9156 295 0.4060 69.8618 18.3962
0.2536 1.9221 296 0.4060 69.5853 18.3562
0.2439 1.9286 297 0.4059 70.0461 18.4494
0.3291 1.9351 298 0.4058 70.1382 18.4628
0.3075 1.9416 299 0.4062 70.0461 18.4761
0.2836 1.9481 300 0.4058 70.0461 18.4894
0.2671 1.9545 301 0.4064 70.4147 18.5027
0.3035 1.9610 302 0.4061 70.0461 18.4494
0.3343 1.9675 303 0.4060 70.3226 18.4494
0.3471 1.9740 304 0.4060 70.0461 18.4228
0.2665 1.9805 305 0.4060 70.2304 18.4628
0.2519 1.9870 306 0.4060 69.4931 18.4095
0.2796 1.9935 307 0.4057 69.6774 18.3828
0.3363 2.0 308 0.4060 70.2304 18.4628

Framework versions

  • Transformers 4.57.3
  • PyTorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1

Model details

  • Model size: 2B params
  • Tensor type: F32
  • Format: Safetensors

Model: nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_1.0e-5, fine-tuned from openai/whisper-large-v3.