# BERTimbau finetuning - Portuguese Metaphors
This model is a fine-tuned version of neuralmind/bert-base-portuguese-cased on an unspecified dataset of Portuguese metaphors. It achieves the results shown in the training table below on the evaluation set.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

More information needed

### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.4531 | 1.0 | 23 | 0.2322 | 0.0 | 0.0 | 0.0 | 0.9291 |
| 0.2086 | 2.0 | 46 | 0.1396 | 0.5 | 0.2381 | 0.3226 | 0.9550 |
| 0.1221 | 3.0 | 69 | 0.1000 | 0.6429 | 0.4286 | 0.5143 | 0.9686 |
| 0.0637 | 4.0 | 92 | 0.0829 | 0.8 | 0.5714 | 0.6667 | 0.9741 |
| 0.0336 | 5.0 | 115 | 0.0734 | 0.75 | 0.5714 | 0.6486 | 0.9741 |
| 0.0245 | 6.0 | 138 | 0.0758 | 0.8667 | 0.6190 | 0.7222 | 0.9768 |
| 0.0141 | 7.0 | 161 | 0.0755 | 0.9375 | 0.7143 | 0.8108 | 0.9809 |
| 0.0108 | 8.0 | 184 | 0.0950 | 0.8125 | 0.6190 | 0.7027 | 0.9768 |
| 0.0088 | 9.0 | 207 | 0.0837 | 0.8333 | 0.7143 | 0.7692 | 0.9795 |
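The F1 column is the harmonic mean of the precision and recall columns, so each row can be checked for internal consistency. A minimal sketch in plain Python (the raw true/false positive counts below are assumptions chosen to be consistent with the epoch-2 row, e.g. recall 0.2381 ≈ 5/21):

```python
def prf1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Precision, recall and F1 computed from raw prediction counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical counts matching the epoch-2 row:
# precision 0.5 = 5/10 predicted positives, recall 0.2381 ≈ 5/21 gold positives.
p, r, f1 = prf1(tp=5, fp=5, fn=16)
print(round(p, 4), round(r, 4), round(f1, 4))  # 0.5 0.2381 0.3226
```

The same check applied to later epochs (e.g. epoch 7: 15/16 precision, 15/21 recall) reproduces the reported F1 of 0.8108, suggesting the metrics were computed per span or token rather than averaged across classes.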