Whisper Large v3 - Japanese Zatoichi ASR

This model is a fine-tuned version of openai/whisper-large-v3 on a Japanese speech dataset that is not named in this card. It achieves the following results on the evaluation set:

  • Loss: 0.3915
  • WER: 68.2949
  • CER: 17.9832
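
The exact normalization used during evaluation is not documented here, but WER and CER are standard edit-distance metrics (over words and characters, respectively, reported as percentages). A minimal, self-contained sketch:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences, using a rolling DP row."""
    n = len(hyp)
    dp = list(range(n + 1))
    for i in range(1, len(ref) + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,                           # deletion
                        dp[j - 1] + 1,                       # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))   # substitution
            prev = cur
    return dp[n]

def wer(ref, hyp):
    """Word error rate as a percentage."""
    ref_words = ref.split()
    return 100.0 * edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    """Character error rate as a percentage."""
    return 100.0 * edit_distance(list(ref), list(hyp)) / len(ref)
```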

Model description

More information needed

Intended uses & limitations

More information needed
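
Pending details from the authors, the checkpoint can presumably be used like any other fine-tuned Whisper model for Japanese transcription. A hypothetical usage sketch (assumes the transformers library is installed; this is not the authors' documented usage):

```python
def transcribe(audio_path):
    """Hypothetical sketch: transcribe a Japanese audio file with this checkpoint.

    Assumes `transformers` (and an audio backend) is installed; the model id is
    taken from this card's model tree. Not the authors' own inference script.
    """
    from transformers import pipeline

    asr = pipeline(
        "automatic-speech-recognition",
        model="nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_2.8e-5",
        generate_kwargs={"language": "japanese", "task": "transcribe"},
    )
    return asr(audio_path)["text"]
```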

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2.8e-05
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
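
For reference, the settings above map roughly onto transformers' Seq2SeqTrainingArguments as sketched below. Argument names assume a recent transformers release; the authors' actual launcher script is not published.

```python
# Hypothetical mapping of the card's hyperparameters to keyword arguments for
# transformers' Seq2SeqTrainingArguments (a sketch, not the exact setup used).
training_kwargs = dict(
    learning_rate=2.8e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=2,
)

# e.g.:
# from transformers import Seq2SeqTrainingArguments
# args = Seq2SeqTrainingArguments(output_dir="out", **training_kwargs)
```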

Training results

Training Loss | Epoch | Step | Validation Loss | WER | CER
1.2387 0.0065 1 1.2545 98.1567 40.8419
1.2196 0.0130 2 0.9950 96.0369 38.2843
1.0994 0.0195 3 0.9709 93.9171 37.3918
0.9151 0.0260 4 0.9273 89.4931 35.1405
0.9616 0.0325 5 0.8628 85.3456 31.0777
0.9094 0.0390 6 0.8236 84.2396 28.8930
1.0301 0.0455 7 0.7693 88.8479 29.2127
0.8894 0.0519 8 0.7403 90.4147 30.1052
0.8214 0.0584 9 0.7175 82.0276 30.6914
0.7715 0.0649 10 0.6833 82.6728 32.1833
0.7213 0.0714 11 0.6582 98.6175 31.8769
0.8096 0.0779 12 0.6338 99.7235 32.0634
0.6733 0.0844 13 0.6132 74.4700 24.4172
0.6323 0.0909 14 0.6049 75.9447 24.3373
0.6216 0.0974 15 0.5913 80.7373 25.8292
0.6799 0.1039 16 0.5872 84.7005 26.4553
0.5972 0.1104 17 0.5781 88.2028 27.6009
0.499 0.1169 18 0.5657 78.8018 24.2041
0.6764 0.1234 19 0.5716 83.0415 28.4268
0.5708 0.1299 20 0.5663 81.2903 28.6799
0.5236 0.1364 21 0.5500 73.9171 23.5247
0.5857 0.1429 22 0.5486 76.9585 24.0043
0.6832 0.1494 23 0.5538 88.5714 28.7998
0.5784 0.1558 24 0.5472 88.9401 28.9996
0.627 0.1623 25 0.5477 84.5161 25.7626
0.5338 0.1688 26 0.5462 80.0 25.0566
0.5717 0.1753 27 0.5426 77.7880 25.5228
0.568 0.1818 28 0.5363 78.4332 25.6294
0.6202 0.1883 29 0.5305 81.2903 24.4705
0.5301 0.1948 30 0.5355 84.0553 24.9900
0.5108 0.2013 31 0.5400 77.7880 25.2031
0.6654 0.2078 32 0.5350 73.2719 22.8187
0.5239 0.2143 33 0.5303 73.8249 24.0842
0.5149 0.2208 34 0.5294 78.0645 22.4191
0.4943 0.2273 35 0.5262 77.0507 22.4457
0.662 0.2338 36 0.5243 72.4424 22.1393
0.5113 0.2403 37 0.5255 70.2304 22.2059
0.4925 0.2468 38 0.5248 71.5207 22.3924
0.6095 0.2532 39 0.5208 71.6129 22.4990
0.5213 0.2597 40 0.5201 73.0876 22.6056
0.5916 0.2662 41 0.5187 73.9171 22.1793
0.6564 0.2727 42 0.5174 87.0046 30.2518
0.5432 0.2792 43 0.5101 71.1521 21.9129
0.5898 0.2857 44 0.5093 70.6912 22.2193
0.4761 0.2922 45 0.5168 71.4286 22.3791
0.5058 0.2987 46 0.5123 76.9585 23.3915
0.635 0.3052 47 0.5098 77.8802 22.9119
0.5744 0.3117 48 0.5094 75.2074 22.3924
0.5873 0.3182 49 0.5043 77.6037 23.8711
0.5578 0.3247 50 0.5007 84.9770 26.5352
0.4117 0.3312 51 0.4989 81.9355 26.1756
0.4953 0.3377 52 0.4968 77.4194 25.3230
0.4848 0.3442 53 0.5010 71.9816 22.8320
0.6095 0.3506 54 0.4969 70.5069 22.4590
0.5234 0.3571 55 0.4895 72.7189 23.7245
0.458 0.3636 56 0.4898 73.0876 23.6979
0.4765 0.3701 57 0.4950 76.4977 24.2041
0.6049 0.3766 58 0.4999 88.1106 28.7865
0.4809 0.3831 59 0.5009 76.9585 24.0709
0.5072 0.3896 60 0.4976 79.3548 24.5238
0.4124 0.3961 61 0.4989 68.7558 20.8605
0.4952 0.4026 62 0.4981 68.4793 22.0461
0.4902 0.4091 63 0.4880 67.7419 21.5932
0.5739 0.4156 64 0.4773 72.9032 22.3658
0.4293 0.4221 65 0.4779 76.4977 22.0328
0.4895 0.4286 66 0.4900 88.5714 25.8292
0.5475 0.4351 67 0.4834 87.8341 26.6152
0.5982 0.4416 68 0.4725 82.6728 26.7883
0.4988 0.4481 69 0.4718 72.9032 24.4838
0.5932 0.4545 70 0.4722 73.6406 25.2298
0.4253 0.4610 71 0.4698 75.8525 24.8701
0.4924 0.4675 72 0.4718 86.5438 28.9996
0.4595 0.4740 73 0.4741 78.5253 24.9900
0.4973 0.4805 74 0.4679 77.7880 24.1508
0.4828 0.4870 75 0.4641 73.4562 21.7264
0.4565 0.4935 76 0.4658 83.5023 25.8559
0.5927 0.5 77 0.4637 76.4055 23.1517
0.4861 0.5065 78 0.4576 71.3364 21.7264
0.4022 0.5130 79 0.4615 78.8018 23.3915
0.4886 0.5195 80 0.4619 78.6175 23.1650
0.4294 0.5260 81 0.4557 74.4700 23.0851
0.447 0.5325 82 0.4525 67.4654 20.5941
0.4683 0.5390 83 0.4577 72.8111 22.2859
0.471 0.5455 84 0.4645 75.3917 23.4581
0.4818 0.5519 85 0.4676 83.5023 27.6009
0.5147 0.5584 86 0.4669 80.6452 23.4448
0.5021 0.5649 87 0.4659 79.0783 22.7921
0.5743 0.5714 88 0.4595 75.5760 21.4466
0.4631 0.5779 89 0.4552 70.3226 20.9271
0.5281 0.5844 90 0.4553 69.3088 20.4875
0.5449 0.5909 91 0.4528 69.9539 20.7140
0.5998 0.5974 92 0.4492 70.5991 20.7007
0.5759 0.6039 93 0.4484 74.8387 21.2335
0.418 0.6104 94 0.4539 78.8018 22.6056
0.4954 0.6169 95 0.4606 76.4055 21.8996
0.4815 0.6234 96 0.4626 78.9862 24.4172
0.5316 0.6299 97 0.4568 70.5069 21.6731
0.5166 0.6364 98 0.4569 73.1797 23.6579
0.4144 0.6429 99 0.4573 86.8203 27.8673
0.339 0.6494 100 0.4564 73.8249 21.5532
0.5584 0.6558 101 0.4532 79.0783 23.2317
0.4781 0.6623 102 0.4511 75.5760 22.7521
0.4578 0.6688 103 0.4501 79.5392 25.5894
0.4315 0.6753 104 0.4499 75.2995 23.7512
0.5418 0.6818 105 0.4495 69.5853 21.1136
0.518 0.6883 106 0.4514 69.0323 20.2344
0.4931 0.6948 107 0.4511 68.5714 20.4742
0.4548 0.7013 108 0.4480 68.8479 20.1412
0.5005 0.7078 109 0.4446 70.5069 21.2335
0.4966 0.7143 110 0.4427 73.5484 21.6997
0.4136 0.7208 111 0.4454 79.4470 21.5266
0.4879 0.7273 112 0.4492 79.4470 21.6864
0.4875 0.7338 113 0.4477 75.8525 21.4733
0.6519 0.7403 114 0.4468 76.4977 23.0984
0.4455 0.7468 115 0.4440 75.8525 24.3906
0.4069 0.7532 116 0.4415 73.9171 23.3249
0.4589 0.7597 117 0.4427 72.7189 23.4847
0.3933 0.7662 118 0.4420 67.5576 20.2478
0.5701 0.7727 119 0.4382 67.8341 20.2611
0.4864 0.7792 120 0.4366 68.6636 20.0346
0.4703 0.7857 121 0.4385 69.9539 20.0080
0.4698 0.7922 122 0.4394 70.8756 19.8481
0.3891 0.7987 123 0.4384 73.3641 22.5123
0.4654 0.8052 124 0.4352 78.5253 24.9234
0.4226 0.8117 125 0.4356 85.0691 27.0547
0.4171 0.8182 126 0.4372 79.1705 22.5656
0.4444 0.8247 127 0.4313 72.4424 20.1146
0.6233 0.8312 128 0.4263 75.2995 21.0204
0.4644 0.8377 129 0.4223 67.0046 18.9157
0.4393 0.8442 130 0.4222 70.6912 20.4076
0.4547 0.8506 131 0.4214 71.9816 20.7140
0.4041 0.8571 132 0.4225 70.3226 20.3277
0.468 0.8636 133 0.4273 66.8203 19.4219
0.4605 0.8701 134 0.4309 68.9401 19.9547
0.4952 0.8766 135 0.4310 69.2166 19.8348
0.5459 0.8831 136 0.4306 70.9677 20.0480
0.3959 0.8896 137 0.4310 70.6912 20.3410
0.433 0.8961 138 0.4287 68.7558 20.0746
0.447 0.9026 139 0.4274 68.3871 19.8082
0.3285 0.9091 140 0.4273 67.2811 19.4885
0.403 0.9156 141 0.4272 67.0046 19.3686
0.4146 0.9221 142 0.4276 67.0968 19.2753
0.3898 0.9286 143 0.4296 73.4562 21.3934
0.5144 0.9351 144 0.4341 76.7742 21.4600
0.4821 0.9416 145 0.4350 73.7327 20.2877
0.3887 0.9481 146 0.4321 75.0230 21.3800
0.4997 0.9545 147 0.4282 74.2857 22.4058
0.4111 0.9610 148 0.4253 67.6498 20.5408
0.3709 0.9675 149 0.4242 67.2811 19.6883
0.4578 0.9740 150 0.4241 68.3871 20.3943
0.3669 0.9805 151 0.4262 74.4700 21.9662
0.3849 0.9870 152 0.4252 74.3779 21.9662
0.5165 0.9935 153 0.4223 74.5622 22.9253
0.4305 1.0 154 0.4202 73.7327 22.4191
0.2323 1.0065 155 0.4172 68.2949 20.3943
0.3509 1.0130 156 0.4172 66.6359 19.4885
0.261 1.0195 157 0.4181 66.1751 19.3153
0.2353 1.0260 158 0.4199 66.5438 19.5018
0.3009 1.0325 159 0.4214 66.9124 19.2887
0.2598 1.0390 160 0.4235 66.9124 19.4086
0.2452 1.0455 161 0.4261 69.6774 19.6217
0.2554 1.0519 162 0.4297 70.7834 19.6350
0.2854 1.0584 163 0.4311 70.7834 19.5817
0.2677 1.0649 164 0.4289 69.4009 19.5950
0.3105 1.0714 165 0.4262 66.1751 19.3819
0.2594 1.0779 166 0.4253 65.8986 19.7283
0.2413 1.0844 167 0.4213 65.6221 19.2887
0.2033 1.0909 168 0.4171 65.6221 18.9823
0.2409 1.0974 169 0.4167 68.4793 19.4086
0.215 1.1039 170 0.4223 73.2719 20.8472
0.2899 1.1104 171 0.4263 75.2074 20.6074
0.2339 1.1169 172 0.4225 73.8249 20.1678
0.2622 1.1234 173 0.4188 69.0323 18.8491
0.2824 1.1299 174 0.4167 67.1889 18.9024
0.2596 1.1364 175 0.4177 67.2811 20.1812
0.2569 1.1429 176 0.4168 68.4793 20.4875
0.2604 1.1494 177 0.4127 69.2166 20.4343
0.2662 1.1558 178 0.4109 71.2442 20.4875
0.2545 1.1623 179 0.4132 74.1935 20.5675
0.2516 1.1688 180 0.4185 73.5484 19.0755
0.2431 1.1753 181 0.4196 74.0092 19.5551
0.2283 1.1818 182 0.4126 70.5069 19.3686
0.2557 1.1883 183 0.4053 65.9908 18.6759
0.2485 1.1948 184 0.4032 64.0553 18.5027
0.2974 1.2013 185 0.4030 69.1244 20.3543
0.2483 1.2078 186 0.4030 63.8710 18.1031
0.2245 1.2143 187 0.4044 64.2396 17.8900
0.1945 1.2208 188 0.4092 67.7419 18.3029
0.1918 1.2273 189 0.4168 71.9816 18.5693
0.2167 1.2338 190 0.4215 73.9171 18.7558
0.2328 1.2403 191 0.4199 75.0230 19.0622
0.2279 1.2468 192 0.4159 71.2442 18.9690
0.2764 1.2532 193 0.4112 66.7281 18.4761
0.2256 1.2597 194 0.4117 66.2673 18.7159
0.2654 1.2662 195 0.4139 66.3594 18.9290
0.2432 1.2727 196 0.4134 71.0599 20.3943
0.2668 1.2792 197 0.4122 67.3733 18.5294
0.2381 1.2857 198 0.4139 69.5853 18.7691
0.2358 1.2922 199 0.4184 72.4424 18.9823
0.2322 1.2987 200 0.4213 74.8387 19.2221
0.26 1.3052 201 0.4215 78.9862 21.0204
0.2235 1.3117 202 0.4159 76.5899 20.7806
0.2391 1.3182 203 0.4113 72.0737 20.3144
0.2753 1.3247 204 0.4109 70.5069 20.0879
0.2512 1.3312 205 0.4129 70.5069 20.2877
0.271 1.3377 206 0.4132 71.2442 20.0879
0.3285 1.3442 207 0.4112 70.0461 19.8881
0.2569 1.3506 208 0.4103 79.7235 22.2859
0.2053 1.3571 209 0.4127 82.0276 22.6322
0.2618 1.3636 210 0.4153 86.3594 23.1251
0.2155 1.3701 211 0.4161 87.3733 23.0718
0.2401 1.3766 212 0.4130 87.3733 22.9919
0.2591 1.3831 213 0.4100 85.8986 23.0452
0.2363 1.3896 214 0.4063 83.1336 22.6855
0.2348 1.3961 215 0.4035 80.0922 22.2992
0.2096 1.4026 216 0.4020 65.6221 18.1830
0.1941 1.4091 217 0.4022 65.0691 18.2230
0.246 1.4156 218 0.4030 64.3318 17.9033
0.2595 1.4221 219 0.4034 64.2396 17.9566
0.2106 1.4286 220 0.4044 64.5161 17.7967
0.275 1.4351 221 0.4052 64.8848 17.7834
0.2685 1.4416 222 0.4055 65.3456 17.7301
0.2187 1.4481 223 0.4065 67.0046 17.9299
0.2525 1.4545 224 0.4062 67.0046 17.8633
0.268 1.4610 225 0.4054 67.6498 17.9566
0.2854 1.4675 226 0.4053 67.9263 18.0631
0.1853 1.4740 227 0.4052 67.0968 17.8900
0.298 1.4805 228 0.4048 66.6359 17.6635
0.2264 1.4870 229 0.4028 65.8986 17.8100
0.2072 1.4935 230 0.4009 66.5438 18.0631
0.217 1.5 231 0.3995 65.8065 17.9832
0.2223 1.5065 232 0.3996 66.0829 18.2363
0.2005 1.5130 233 0.3993 65.9908 18.1697
0.2499 1.5195 234 0.3993 67.0046 18.4361
0.1862 1.5260 235 0.3994 66.4516 18.2496
0.2183 1.5325 236 0.3988 66.8203 18.4095
0.279 1.5390 237 0.3982 66.9124 18.2763
0.219 1.5455 238 0.3976 66.0829 18.0232
0.2487 1.5519 239 0.3971 66.2673 18.0232
0.237 1.5584 240 0.3969 67.0046 18.1431
0.2197 1.5649 241 0.3967 67.1889 18.0631
0.24 1.5714 242 0.3963 67.5576 18.0099
0.1693 1.5779 243 0.3954 67.7419 18.1297
0.2169 1.5844 244 0.3949 67.3733 18.2363
0.2102 1.5909 245 0.3941 66.6359 18.0898
0.2543 1.5974 246 0.3936 66.3594 18.0232
0.2924 1.6039 247 0.3934 66.1751 17.9832
0.2382 1.6104 248 0.3933 66.6359 18.0099
0.2712 1.6169 249 0.3933 67.0046 18.1830
0.299 1.6234 250 0.3935 66.2673 18.1031
0.2538 1.6299 251 0.3933 66.2673 18.0365
0.2038 1.6364 252 0.3934 65.9908 18.0365
0.2405 1.6429 253 0.3932 65.4378 17.7967
0.1907 1.6494 254 0.3935 65.8986 17.7834
0.2152 1.6558 255 0.3940 65.4378 17.7434
0.2224 1.6623 256 0.3953 66.2673 17.8367
0.2622 1.6688 257 0.3957 68.2949 17.9033
0.2514 1.6753 258 0.3961 68.4793 17.8500
0.2027 1.6818 259 0.3969 69.6774 17.9299
0.254 1.6883 260 0.3969 70.1382 17.9299
0.2693 1.6948 261 0.3966 69.4931 17.9299
0.1984 1.7013 262 0.3962 69.4009 17.9832
0.1701 1.7078 263 0.3956 68.7558 17.8234
0.2382 1.7143 264 0.3945 68.4793 17.9299
0.2201 1.7208 265 0.3938 68.2028 18.1031
0.1963 1.7273 266 0.3934 67.3733 18.3296
0.3011 1.7338 267 0.3931 66.9124 18.1697
0.1705 1.7403 268 0.3921 66.0829 18.1830
0.1972 1.7468 269 0.3918 65.9908 18.1164
0.2094 1.7532 270 0.3919 65.4378 17.9832
0.224 1.7597 271 0.3917 65.4378 17.9566
0.2235 1.7662 272 0.3916 65.1613 17.8766
0.2234 1.7727 273 0.3912 65.4378 17.9433
0.2479 1.7792 274 0.3917 65.2535 17.7701
0.2726 1.7857 275 0.3918 65.5300 17.7568
0.2151 1.7922 276 0.3915 65.6221 17.7568
0.1956 1.7987 277 0.3917 66.1751 17.8633
0.2515 1.8052 278 0.3918 65.8986 17.7701
0.278 1.8117 279 0.3920 67.2811 17.9566
0.2266 1.8182 280 0.3919 67.2811 17.9699
0.2753 1.8247 281 0.3921 67.0046 17.9566
0.1702 1.8312 282 0.3922 67.2811 17.9299
0.2145 1.8377 283 0.3916 67.3733 17.9566
0.2417 1.8442 284 0.3918 68.1106 18.0365
0.1823 1.8506 285 0.3915 68.2028 18.0765
0.184 1.8571 286 0.3915 67.8341 17.9965
0.1968 1.8636 287 0.3916 68.6636 18.1697
0.2292 1.8701 288 0.3917 68.1106 17.9965
0.277 1.8766 289 0.3914 68.4793 18.0898
0.2191 1.8831 290 0.3914 67.8341 18.0365
0.2994 1.8896 291 0.3912 68.8479 18.1697
0.2391 1.8961 292 0.3918 68.1106 17.9832
0.256 1.9026 293 0.3913 68.0184 17.9832
0.1817 1.9091 294 0.3914 68.5714 18.0631
0.2465 1.9156 295 0.3913 68.2028 17.9965
0.2004 1.9221 296 0.3912 68.2028 18.0232
0.1845 1.9286 297 0.3912 67.9263 17.9166
0.2721 1.9351 298 0.3910 67.8341 17.9299
0.2402 1.9416 299 0.3912 67.8341 17.9299
0.2178 1.9481 300 0.3915 68.0184 17.9566
0.2083 1.9545 301 0.3912 68.2028 18.0232
0.2508 1.9610 302 0.3914 68.1106 17.9699
0.2782 1.9675 303 0.3909 67.7419 17.9299
0.2763 1.9740 304 0.3911 67.9263 17.9033
0.2028 1.9805 305 0.3913 68.1106 18.0232
0.2054 1.9870 306 0.3913 67.9263 17.9166
0.2165 1.9935 307 0.3912 67.9263 17.9566
0.2734 2.0 308 0.3915 68.2949 17.9832

Framework versions

  • Transformers 4.57.3
  • Pytorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1
Model size: 2B params (Safetensors, F32 tensors)

Model: nkkbr/whisper-large-v3-zatoichi-ja-JDG_ver_20260217_lr_2.8e-5, fine-tuned from openai/whisper-large-v3.