# emotion-english-distilroberta-base
This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:
- eval_loss: 0.4031
- eval_accuracy: 0.8595
- eval_runtime: 1.8443
- eval_samples_per_second: 455.467
- eval_steps_per_second: 28.738
- step: 0

Per-class evaluation metrics (class indices 0–6, rounded to four decimals):

| Class | Precision | Recall | F1 |
|-------|-----------|--------|------|
| 0 | 0.7213 | 0.7788 | 0.7489 |
| 1 | 0.8487 | 0.8211 | 0.8347 |
| 2 | 0.8455 | 0.8525 | 0.8490 |
| 3 | 0.9231 | 0.9160 | 0.9195 |
| 4 | 0.8772 | 0.7813 | 0.8264 |
| 5 | 0.8803 | 0.9196 | 0.8996 |
| 6 | 0.9217 | 0.9550 | 0.9381 |
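As a quick sanity check on the numbers above, the unweighted (macro) average of the seven per-class F1 scores can be computed directly; the values below are the reported scores rounded to four decimals:

```python
# Macro-average the per-class F1 scores reported in the evaluation results.
per_class_f1 = [0.7489, 0.8347, 0.8490, 0.9195, 0.8264, 0.8996, 0.9381]

macro_f1 = sum(per_class_f1) / len(per_class_f1)
print(round(macro_f1, 4))  # → 0.8595
```

The macro-F1 comes out at roughly 0.8595, in line with the reported eval_accuracy.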
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
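With no warmup listed among the hyperparameters, a linear scheduler simply decays the learning rate from the base value of 2e-05 down to 0 over the course of training. A minimal sketch of that schedule (the total step count is a placeholder, since the training set size is not documented):

```python
# Sketch of the linear LR schedule listed above: the learning rate ramps
# linearly from the base value (2e-05) down to 0, with no warmup phase.
BASE_LR = 2e-05
TOTAL_STEPS = 1000  # hypothetical; the real value depends on dataset size

def linear_lr(step: int) -> float:
    """Learning rate at a given optimizer step under linear decay."""
    remaining = max(0, TOTAL_STEPS - step) / TOTAL_STEPS
    return BASE_LR * remaining

print(linear_lr(0))            # 2e-05 at the first step
print(linear_lr(TOTAL_STEPS))  # 0.0 at the final step
```

Halfway through training the rate has dropped to exactly half the base value, 1e-05.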
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0