# vit-base-kidney-stone-4-Jonathan_El-Beze_-w256_1k_v1-_MIX

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.5049
- Accuracy: 0.9046
- Precision: 0.9119
- Recall: 0.9046
- F1: 0.9032
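Note that Recall and Accuracy coincide (0.9046) above: this is exactly what happens when the per-class metrics are weighted-averaged by class support, since weighted recall reduces algebraically to overall accuracy. A minimal sketch with a hypothetical 3-class confusion matrix (the card does not list the actual class names or counts):

```python
def weighted_metrics(cm):
    """Weighted-averaged accuracy/precision/recall from a confusion matrix,
    where cm[i][j] = count of true class i predicted as class j."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    accuracy = sum(cm[i][i] for i in range(n)) / total
    precision = recall = 0.0
    for c in range(n):
        support = sum(cm[c])                      # true examples of class c
        predicted = sum(cm[i][c] for i in range(n))
        tp = cm[c][c]
        weight = support / total
        recall += weight * (tp / support if support else 0.0)
        precision += weight * (tp / predicted if predicted else 0.0)
    return accuracy, precision, recall

# Toy example only; the real per-class counts are not given on this card.
cm = [[50, 3, 2],
      [4, 40, 6],
      [1, 5, 44]]
acc, prec, rec = weighted_metrics(cm)
assert abs(acc - rec) < 1e-12  # weighted recall is exactly accuracy
```

The weighting cancels: weighted recall is the sum over classes of (support/total) x (TP/support), which collapses to the diagonal sum over the total, i.e. accuracy.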
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
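The fractional epochs in the log below follow from the evaluation interval: metrics were logged every 100 steps, and epoch 1.0 lands on step 600, so one epoch is 600 optimizer steps. A quick sanity check of what that implies (assuming no gradient accumulation, which the hyperparameters above do not mention):

```python
train_batch_size = 16
steps_per_epoch = 600        # epoch 1.0 is reached at step 600 in the log
eval_every = 100             # evaluation interval visible in the Step column
num_epochs = 15

total_steps = steps_per_epoch * num_epochs
approx_train_images = steps_per_epoch * train_batch_size
epoch_per_eval = eval_every / steps_per_epoch

print(total_steps)               # 9000, matching the final row of the log
print(approx_train_images)       # roughly 9600 training images
print(round(epoch_per_eval, 4))  # 0.1667, the epoch increment in the table
```

The 9600-image estimate is approximate (the last batch of an epoch may be smaller), but it is consistent with every epoch/step pair in the table.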
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|---|---|---|
| 0.3582 | 0.1667 | 100 | 0.6579 | 0.7746 | 0.8010 | 0.7746 | 0.7645 |
| 0.152 | 0.3333 | 200 | 0.8315 | 0.7492 | 0.8132 | 0.7492 | 0.7457 |
| 0.1642 | 0.5 | 300 | 0.6003 | 0.8383 | 0.8506 | 0.8383 | 0.8390 |
| 0.088 | 0.6667 | 400 | 0.6790 | 0.81 | 0.8451 | 0.81 | 0.8064 |
| 0.0268 | 0.8333 | 500 | 0.5720 | 0.8596 | 0.8815 | 0.8596 | 0.8560 |
| 0.0503 | 1.0 | 600 | 0.5348 | 0.8671 | 0.8820 | 0.8671 | 0.8661 |
| 0.1888 | 1.1667 | 700 | 0.7472 | 0.8225 | 0.8405 | 0.8225 | 0.8233 |
| 0.0983 | 1.3333 | 800 | 0.9774 | 0.7875 | 0.8528 | 0.7875 | 0.7892 |
| 0.1343 | 1.5 | 900 | 0.9097 | 0.7983 | 0.8273 | 0.7983 | 0.7919 |
| 0.0681 | 1.6667 | 1000 | 0.6611 | 0.845 | 0.8639 | 0.845 | 0.8432 |
| 0.0768 | 1.8333 | 1100 | 0.8916 | 0.8133 | 0.8677 | 0.8133 | 0.8163 |
| 0.0447 | 2.0 | 1200 | 0.7102 | 0.8462 | 0.8541 | 0.8462 | 0.8450 |
| 0.0417 | 2.1667 | 1300 | 0.7364 | 0.8438 | 0.8549 | 0.8438 | 0.8404 |
| 0.0049 | 2.3333 | 1400 | 1.1942 | 0.7567 | 0.8037 | 0.7567 | 0.7570 |
| 0.1265 | 2.5 | 1500 | 0.5920 | 0.8812 | 0.8828 | 0.8812 | 0.8793 |
| 0.0117 | 2.6667 | 1600 | 0.7807 | 0.8421 | 0.8723 | 0.8421 | 0.8394 |
| 0.0256 | 2.8333 | 1700 | 0.5049 | 0.9046 | 0.9119 | 0.9046 | 0.9032 |
| 0.0776 | 3.0 | 1800 | 0.7417 | 0.8558 | 0.8685 | 0.8558 | 0.8564 |
| 0.0535 | 3.1667 | 1900 | 0.6490 | 0.8717 | 0.8771 | 0.8717 | 0.8711 |
| 0.1292 | 3.3333 | 2000 | 0.7179 | 0.87 | 0.8759 | 0.87 | 0.8681 |
| 0.0013 | 3.5 | 2100 | 0.6103 | 0.8921 | 0.8946 | 0.8921 | 0.8918 |
| 0.0015 | 3.6667 | 2200 | 0.8573 | 0.8558 | 0.8668 | 0.8558 | 0.8523 |
| 0.0006 | 3.8333 | 2300 | 0.6061 | 0.8896 | 0.8993 | 0.8896 | 0.8891 |
| 0.0015 | 4.0 | 2400 | 0.7029 | 0.8658 | 0.8758 | 0.8658 | 0.8638 |
| 0.0005 | 4.1667 | 2500 | 0.7734 | 0.8804 | 0.8928 | 0.8804 | 0.8808 |
| 0.0019 | 4.3333 | 2600 | 0.7360 | 0.8742 | 0.8911 | 0.8742 | 0.8746 |
| 0.001 | 4.5 | 2700 | 0.8893 | 0.8358 | 0.8531 | 0.8358 | 0.8346 |
| 0.0267 | 4.6667 | 2800 | 0.8946 | 0.8612 | 0.8830 | 0.8612 | 0.8545 |
| 0.0004 | 4.8333 | 2900 | 0.6665 | 0.8983 | 0.9081 | 0.8983 | 0.8981 |
| 0.0015 | 5.0 | 3000 | 0.7736 | 0.8788 | 0.8931 | 0.8788 | 0.8774 |
| 0.0005 | 5.1667 | 3100 | 0.7346 | 0.8846 | 0.8936 | 0.8846 | 0.8854 |
| 0.0005 | 5.3333 | 3200 | 1.0391 | 0.8512 | 0.8657 | 0.8512 | 0.8506 |
| 0.1055 | 5.5 | 3300 | 1.8161 | 0.73 | 0.7998 | 0.73 | 0.7148 |
| 0.0007 | 5.6667 | 3400 | 1.1328 | 0.8392 | 0.8677 | 0.8392 | 0.8361 |
| 0.0108 | 5.8333 | 3500 | 0.7424 | 0.8788 | 0.8821 | 0.8788 | 0.8782 |
| 0.0021 | 6.0 | 3600 | 1.0478 | 0.8271 | 0.8424 | 0.8271 | 0.8239 |
| 0.01 | 6.1667 | 3700 | 1.0144 | 0.8475 | 0.8719 | 0.8475 | 0.8478 |
| 0.0014 | 6.3333 | 3800 | 0.7536 | 0.8708 | 0.8837 | 0.8708 | 0.8697 |
| 0.0005 | 6.5 | 3900 | 0.9003 | 0.8567 | 0.8758 | 0.8567 | 0.8544 |
| 0.0003 | 6.6667 | 4000 | 0.8318 | 0.8667 | 0.8816 | 0.8667 | 0.8660 |
| 0.0003 | 6.8333 | 4100 | 0.8213 | 0.8679 | 0.8817 | 0.8679 | 0.8673 |
| 0.0003 | 7.0 | 4200 | 0.8114 | 0.8721 | 0.8849 | 0.8721 | 0.8716 |
| 0.0003 | 7.1667 | 4300 | 0.8461 | 0.8683 | 0.8825 | 0.8683 | 0.8681 |
| 0.0002 | 7.3333 | 4400 | 0.8416 | 0.8692 | 0.8820 | 0.8692 | 0.8690 |
| 0.048 | 7.5 | 4500 | 1.1867 | 0.8163 | 0.8539 | 0.8163 | 0.8168 |
| 0.0373 | 7.6667 | 4600 | 0.8870 | 0.8596 | 0.8829 | 0.8596 | 0.8587 |
| 0.0004 | 7.8333 | 4700 | 1.1816 | 0.7913 | 0.8061 | 0.7913 | 0.7769 |
| 0.0013 | 8.0 | 4800 | 1.2743 | 0.8087 | 0.8456 | 0.8087 | 0.7974 |
| 0.0002 | 8.1667 | 4900 | 0.8387 | 0.8712 | 0.8773 | 0.8712 | 0.8692 |
| 0.0002 | 8.3333 | 5000 | 0.8463 | 0.8688 | 0.8732 | 0.8688 | 0.8673 |
| 0.0002 | 8.5 | 5100 | 0.8732 | 0.8721 | 0.8751 | 0.8721 | 0.8713 |
| 0.0002 | 8.6667 | 5200 | 0.9575 | 0.8546 | 0.8654 | 0.8546 | 0.8539 |
| 0.0002 | 8.8333 | 5300 | 0.9553 | 0.8654 | 0.8651 | 0.8654 | 0.8646 |
| 0.0005 | 9.0 | 5400 | 0.9674 | 0.8583 | 0.8681 | 0.8583 | 0.8586 |
| 0.0002 | 9.1667 | 5500 | 0.7823 | 0.885 | 0.8842 | 0.885 | 0.8842 |
| 0.0002 | 9.3333 | 5600 | 0.9682 | 0.8621 | 0.8837 | 0.8621 | 0.8600 |
| 0.0002 | 9.5 | 5700 | 0.8930 | 0.8629 | 0.8739 | 0.8629 | 0.8616 |
| 0.0002 | 9.6667 | 5800 | 1.1100 | 0.8475 | 0.8764 | 0.8475 | 0.8417 |
| 0.0001 | 9.8333 | 5900 | 0.9290 | 0.8646 | 0.8646 | 0.8646 | 0.8634 |
| 0.0001 | 10.0 | 6000 | 0.9349 | 0.8629 | 0.8633 | 0.8629 | 0.8617 |
| 0.0001 | 10.1667 | 6100 | 0.9423 | 0.8629 | 0.8635 | 0.8629 | 0.8617 |
| 0.0001 | 10.3333 | 6200 | 0.9459 | 0.8633 | 0.8639 | 0.8633 | 0.8622 |
| 0.0001 | 10.5 | 6300 | 0.9522 | 0.8625 | 0.8631 | 0.8625 | 0.8613 |
| 0.0001 | 10.6667 | 6400 | 0.9575 | 0.8629 | 0.8634 | 0.8629 | 0.8617 |
| 0.0001 | 10.8333 | 6500 | 0.9637 | 0.8629 | 0.8638 | 0.8629 | 0.8618 |
| 0.0001 | 11.0 | 6600 | 0.9643 | 0.8642 | 0.8649 | 0.8642 | 0.8631 |
| 0.0001 | 11.1667 | 6700 | 0.9678 | 0.8646 | 0.8653 | 0.8646 | 0.8635 |
| 0.0001 | 11.3333 | 6800 | 0.9722 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 11.5 | 6900 | 0.9772 | 0.8633 | 0.8642 | 0.8633 | 0.8623 |
| 0.0001 | 11.6667 | 7000 | 0.9795 | 0.8646 | 0.8653 | 0.8646 | 0.8635 |
| 0.0001 | 11.8333 | 7100 | 0.9828 | 0.8642 | 0.8650 | 0.8642 | 0.8631 |
| 0.0001 | 12.0 | 7200 | 0.9851 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 12.1667 | 7300 | 0.9879 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 12.3333 | 7400 | 0.9903 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 12.5 | 7500 | 0.9937 | 0.865 | 0.8658 | 0.865 | 0.8639 |
| 0.0001 | 12.6667 | 7600 | 0.9963 | 0.865 | 0.8658 | 0.865 | 0.8639 |
| 0.0001 | 12.8333 | 7700 | 0.9989 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 13.0 | 7800 | 1.0018 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 13.1667 | 7900 | 1.0047 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 13.3333 | 8000 | 1.0069 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 13.5 | 8100 | 1.0088 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 13.6667 | 8200 | 1.0108 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 13.8333 | 8300 | 1.0124 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 14.0 | 8400 | 1.0135 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 14.1667 | 8500 | 1.0150 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 14.3333 | 8600 | 1.0160 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 14.5 | 8700 | 1.0172 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 14.6667 | 8800 | 1.0178 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 14.8333 | 8900 | 1.0183 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
| 0.0001 | 15.0 | 9000 | 1.0184 | 0.8646 | 0.8654 | 0.8646 | 0.8635 |
### Framework versions

- Transformers 4.48.2
- PyTorch 2.6.0+cu126
- Datasets 3.1.0
- Tokenizers 0.21.0