# timesformer-base-finetuned-k400-finetuned-yt_short_classification-sample_rate16
This model is a fine-tuned version of facebook/timesformer-base-finetuned-k400 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2786
- Accuracy: 0.9220

|              | Precision | Recall | F1-score | Support |
|--------------|-----------|--------|----------|---------|
| Class 0      | 0.8978    | 0.9432 | 0.9200   | 5796    |
| Class 1      | 0.9460    | 0.9026 | 0.9238   | 6389    |
| Macro avg    | 0.9219    | 0.9229 | 0.9219   | 12185   |
| Weighted avg | 0.9231    | 0.9220 | 0.9220   | 12185   |
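For reference, the macro and weighted averages reported above follow directly from the per-class numbers. A minimal sketch in plain Python (values copied from the final evaluation results; the small rounding drift in the last decimal place comes from averaging already-rounded inputs):

```python
# Per-class evaluation metrics for the two labels (from the final eval run).
metrics = {
    0: {"precision": 0.8978, "recall": 0.9432, "f1": 0.9200, "support": 5796},
    1: {"precision": 0.9460, "recall": 0.9026, "f1": 0.9238, "support": 6389},
}

total_support = sum(m["support"] for m in metrics.values())  # 12185

def macro(key: str) -> float:
    """Macro average: unweighted mean over classes."""
    return sum(m[key] for m in metrics.values()) / len(metrics)

def weighted(key: str) -> float:
    """Weighted average: mean over classes weighted by class support."""
    return sum(m[key] * m["support"] for m in metrics.values()) / total_support

print(f"macro precision:    {macro('precision'):.4f}")     # ≈ 0.9219
print(f"macro f1:           {macro('f1'):.4f}")            # ≈ 0.9219
print(f"weighted precision: {weighted('precision'):.4f}")  # ≈ 0.9231
print(f"weighted f1:        {weighted('f1'):.4f}")         # ≈ 0.9220
```

The weighted recall equals the overall accuracy by construction, which is why both report 0.9220.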
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 19800
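With a linear scheduler, a warmup ratio of 0.1, and 19800 training steps, the learning rate ramps up over the first 1980 steps and then decays linearly to zero. A minimal sketch in plain Python (this mirrors, as an assumption, the behavior of the `linear` schedule in `transformers`, i.e. `get_linear_schedule_with_warmup`):

```python
# Linear schedule with warmup implied by the hyperparameters above.
PEAK_LR = 5e-5
TOTAL_STEPS = 19800
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # lr_scheduler_warmup_ratio: 0.1 -> 1980 steps

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer steps."""
    if step < WARMUP_STEPS:
        # Linear ramp from 0 up to the peak learning rate.
        return PEAK_LR * step / WARMUP_STEPS
    # Linear decay from the peak down to 0 at the final step.
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

print(lr_at(0), lr_at(WARMUP_STEPS), lr_at(TOTAL_STEPS))  # 0.0, 5e-05, 0.0
```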
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Class 0 Precision | Class 0 Recall | Class 0 F1-score | Class 0 Support | Class 1 Precision | Class 1 Recall | Class 1 F1-score | Class 1 Support | Micro avg F1-score | Macro avg Precision | Macro avg Recall | Macro avg F1-score | Macro avg Support | Weighted avg Precision | Weighted avg Recall | Weighted avg F1-score | Weighted avg Support |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.6099 | 0.0501 | 991 | 0.4427 | 0.7965 | 0.8486 | 0.6963 | 0.7650 | 5796.0 | 0.7631 | 0.8873 | 0.8205 | 6389.0 | 0.7965 | 0.8059 | 0.7918 | 0.7927 | 12185.0 | 0.8038 | 0.7965 | 0.7941 | 12185.0 |
| 0.462 | 1.0501 | 1982 | 0.4645 | 0.7883 | 0.7250 | 0.8941 | 0.8007 | 5796.0 | 0.8781 | 0.6923 | 0.7742 | 6389.0 | 0.7883 | 0.8015 | 0.7932 | 0.7874 | 12185.0 | 0.8053 | 0.7883 | 0.7868 | 12185.0 |
| 0.3425 | 2.0501 | 2973 | 0.4453 | 0.8008 | 0.7376 | 0.9023 | 0.8117 | 5796.0 | 0.8889 | 0.7087 | 0.7886 | 6389.0 | 0.8008 | 0.8132 | 0.8055 | 0.8002 | 12185.0 | 0.8169 | 0.8008 | 0.7996 | 12185.0 |
| 0.4597 | 3.0501 | 3964 | 0.4212 | 0.8135 | 0.8965 | 0.6872 | 0.7780 | 5796.0 | 0.7658 | 0.9280 | 0.8391 | 6389.0 | 0.8135 | 0.8311 | 0.8076 | 0.8086 | 12185.0 | 0.8280 | 0.8135 | 0.8101 | 12185.0 |
| 0.34 | 4.0501 | 4955 | 0.3782 | 0.8435 | 0.8672 | 0.7923 | 0.8281 | 5796.0 | 0.8253 | 0.8900 | 0.8564 | 6389.0 | 0.8435 | 0.8462 | 0.8411 | 0.8422 | 12185.0 | 0.8452 | 0.8435 | 0.8429 | 12185.0 |
| 0.2322 | 5.0501 | 5946 | 0.3786 | 0.8507 | 0.8105 | 0.8956 | 0.8509 | 5796.0 | 0.8953 | 0.8100 | 0.8505 | 6389.0 | 0.8507 | 0.8529 | 0.8528 | 0.8507 | 12185.0 | 0.8550 | 0.8507 | 0.8507 | 12185.0 |
| 0.3278 | 6.0501 | 6937 | 0.5580 | 0.7881 | 0.6988 | 0.9746 | 0.8140 | 5796.0 | 0.9642 | 0.6189 | 0.7539 | 6389.0 | 0.7881 | 0.8315 | 0.7968 | 0.7839 | 12185.0 | 0.8379 | 0.7881 | 0.7825 | 12185.0 |
| 0.3531 | 7.0501 | 7928 | 0.4057 | 0.8516 | 0.7948 | 0.9275 | 0.8561 | 5796.0 | 0.9225 | 0.7828 | 0.8469 | 6389.0 | 0.8516 | 0.8587 | 0.8551 | 0.8515 | 12185.0 | 0.8618 | 0.8516 | 0.8513 | 12185.0 |
| 0.2513 | 8.0501 | 8919 | 0.3643 | 0.8574 | 0.8021 | 0.9296 | 0.8612 | 5796.0 | 0.9254 | 0.7920 | 0.8535 | 6389.0 | 0.8574 | 0.8638 | 0.8608 | 0.8573 | 12185.0 | 0.8668 | 0.8574 | 0.8572 | 12185.0 |
| 0.2592 | 9.0501 | 9910 | 0.2819 | 0.8968 | 0.8775 | 0.9099 | 0.8934 | 5796.0 | 0.9155 | 0.8848 | 0.8999 | 6389.0 | 0.8968 | 0.8965 | 0.8974 | 0.8967 | 12185.0 | 0.8974 | 0.8968 | 0.8968 | 12185.0 |
| 0.405 | 10.0501 | 10901 | 0.3755 | 0.8693 | 0.8229 | 0.9241 | 0.8705 | 5796.0 | 0.9225 | 0.8195 | 0.8680 | 6389.0 | 0.8693 | 0.8727 | 0.8718 | 0.8693 | 12185.0 | 0.8751 | 0.8693 | 0.8692 | 12185.0 |
| 0.1971 | 11.0501 | 11892 | 0.3913 | 0.8762 | 0.8102 | 0.9662 | 0.8813 | 5796.0 | 0.9628 | 0.7946 | 0.8707 | 6389.0 | 0.8762 | 0.8865 | 0.8804 | 0.8760 | 12185.0 | 0.8902 | 0.8762 | 0.8758 | 12185.0 |
| 0.3499 | 12.0501 | 12883 | 0.2850 | 0.8971 | 0.8939 | 0.8892 | 0.8915 | 5796.0 | 0.9000 | 0.9042 | 0.9021 | 6389.0 | 0.8971 | 0.8969 | 0.8967 | 0.8968 | 12185.0 | 0.8971 | 0.8971 | 0.8971 | 12185.0 |
| 0.1551 | 13.0501 | 13874 | 0.2965 | 0.9004 | 0.8876 | 0.9051 | 0.8963 | 5796.0 | 0.9124 | 0.8961 | 0.9041 | 6389.0 | 0.9004 | 0.9000 | 0.9006 | 0.9002 | 12185.0 | 0.9006 | 0.9004 | 0.9004 | 12185.0 |
| 0.1379 | 14.0501 | 14865 | 0.3175 | 0.9028 | 0.8661 | 0.9412 | 0.9021 | 5796.0 | 0.9421 | 0.8681 | 0.9036 | 6389.0 | 0.9028 | 0.9041 | 0.9046 | 0.9028 | 12185.0 | 0.9060 | 0.9028 | 0.9029 | 12185.0 |
| 0.1979 | 15.0501 | 15856 | 0.5168 | 0.8652 | 0.7884 | 0.9793 | 0.8736 | 5796.0 | 0.9759 | 0.7616 | 0.8556 | 6389.0 | 0.8652 | 0.8822 | 0.8705 | 0.8646 | 12185.0 | 0.8867 | 0.8652 | 0.8641 | 12185.0 |
| 0.043 | 16.0501 | 16847 | 0.3269 | 0.9093 | 0.8676 | 0.9551 | 0.9093 | 5796.0 | 0.9552 | 0.8677 | 0.9094 | 6389.0 | 0.9093 | 0.9114 | 0.9114 | 0.9093 | 12185.0 | 0.9135 | 0.9093 | 0.9093 | 12185.0 |
| 0.079 | 17.0501 | 17838 | 0.2941 | 0.9156 | 0.8929 | 0.9346 | 0.9133 | 5796.0 | 0.9381 | 0.8983 | 0.9177 | 6389.0 | 0.9156 | 0.9155 | 0.9164 | 0.9155 | 12185.0 | 0.9166 | 0.9156 | 0.9156 | 12185.0 |
| 0.2818 | 18.0501 | 18829 | 0.3127 | 0.9137 | 0.8751 | 0.9550 | 0.9133 | 5796.0 | 0.9555 | 0.8763 | 0.9142 | 6389.0 | 0.9137 | 0.9153 | 0.9157 | 0.9137 | 12185.0 | 0.9172 | 0.9137 | 0.9138 | 12185.0 |
| 0.0789 | 19.0490 | 19800 | 0.2786 | 0.9220 | 0.8978 | 0.9432 | 0.9200 | 5796.0 | 0.9460 | 0.9026 | 0.9238 | 6389.0 | 0.9220 | 0.9219 | 0.9229 | 0.9219 | 12185.0 | 0.9231 | 0.9220 | 0.9220 | 12185.0 |
## Framework versions

- Transformers 4.46.3
- PyTorch 2.0.0+cu117
- Datasets 3.1.0
- Tokenizers 0.20.3