---
library_name: transformers
license: mit
base_model: FacebookAI/roberta-large
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: roberta-Validation-goodareas-eval_FeedbackESConv5pp_CARE10pp-sweeps-current
  results: []
---

# roberta-Validation-goodareas-eval_FeedbackESConv5pp_CARE10pp-sweeps-current

This model is a fine-tuned version of [FacebookAI/roberta-large](https://huggingface.co/FacebookAI/roberta-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7517
- Accuracy: 0.8331
- Precision: 0.4610
- Recall: 0.6017
- F1: 0.5221

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3.263554762497362e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.533         | 1.0   | 486  | 0.3998          | 0.8485   | 0.0       | 0.0    | 0.0    |
| 0.4106        | 2.0   | 972  | 0.3184          | 0.8601   | 0.5714    | 0.3051 | 0.3978 |
| 0.3506        | 3.0   | 1458 | 0.3458          | 0.8665   | 0.5778    | 0.4407 | 0.5    |
| 0.3103        | 4.0   | 1944 | 0.3449          | 0.8460   | 0.4926    | 0.5678 | 0.5276 |
| 0.2684        | 5.0   | 2430 | 0.3504          | 0.8472   | 0.4959    | 0.5085 | 0.5021 |
| 0.2295        | 6.0   | 2916 | 0.4242          | 0.8588   | 0.5299    | 0.6017 | 0.5635 |
| 0.1905        | 7.0   | 3402 | 0.5236          | 0.8549   | 0.5191    | 0.5763 | 0.5462 |
| 0.1614        | 8.0   | 3888 | 0.6659          | 0.8434   | 0.4859    | 0.5847 | 0.5308 |
| 0.1499        | 9.0   | 4374 | 0.7185          | 0.8421   | 0.4825    | 0.5847 | 0.5287 |
| 0.1347        | 10.0  | 4860 | 0.7517          | 0.8331   | 0.4610    | 0.6017 | 0.5221 |

### Framework versions

- Transformers 4.49.0
- Pytorch 2.5.1+cu124
- Datasets 2.21.0
- Tokenizers 0.21.0
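
### Training configuration sketch

The hyperparameters listed above map directly onto `TrainingArguments`. The sketch below is a minimal, hypothetical reconstruction of that configuration only; the dataset, model head, metric functions, and `output_dir` are not documented in this card and are placeholders.

```python
# Minimal sketch of the training configuration implied by the hyperparameter list.
# Assumptions: output_dir is a placeholder, and per-epoch evaluation is inferred
# from the results table; the remaining values mirror the card as stated.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-goodareas-sweeps-current",  # placeholder path
    learning_rate=3.263554762497362e-06,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
    eval_strategy="epoch",  # inferred: validation metrics are reported once per epoch
)
```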
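
## Example usage

A minimal inference sketch, assuming the checkpoint carries a sequence-classification head on top of `roberta-large` (the accuracy/precision/recall/F1 metrics above suggest a classification task). The repository id and label mapping are assumptions; replace them with the actual Hub id or a local path.

```python
# Minimal inference sketch. Assumptions: the checkpoint is loadable with a
# sequence-classification head under the repo id below; adjust model_id
# to the actual Hub id or local checkpoint directory.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "roberta-Validation-goodareas-eval_FeedbackESConv5pp_CARE10pp-sweeps-current"  # assumed id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Example utterance to classify.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print(pred, model.config.id2label.get(pred, str(pred)))
```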