# BERTimbau finetuning - Portuguese Metaphors
This model is a fine-tuned version of [neuralmind/bert-base-portuguese-cased](https://huggingface.co/neuralmind/bert-base-portuguese-cased) for metaphor detection in Portuguese (the training dataset is not specified). It achieves the following results on the evaluation set (final epoch):
- Loss: 0.1405
- Precision: 0.6250
- Recall: 0.4348
- F1: 0.5128
- Accuracy: 0.9724
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
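The card does not document intended usage, but the span-level precision/recall/F1 reported in the training results suggest a token-classification (sequence-labeling) head over BERTimbau. A minimal inference sketch follows; the model id, the BIO-style labels (`B-MET`/`I-MET`), and the subword-merging helper are all assumptions, not part of this card:

```python
from typing import Dict, List


def merge_subword_predictions(tokens: List[Dict]) -> List[str]:
    """Collapse subword-level predictions (as returned by a Hugging Face
    token-classification pipeline without aggregation) to word level:
    WordPiece continuations ("##...") inherit their word's first label."""
    labels = []
    for t in tokens:
        if t["word"].startswith("##") and labels:
            continue  # keep the label predicted for the word's first subword
        labels.append(t["entity"])
    return labels


if __name__ == "__main__":
    # Hypothetical model id -- the card does not say where the fine-tuned
    # checkpoint is published; substitute the real one.
    from transformers import pipeline

    tagger = pipeline("token-classification", model="your-user/bertimbau-metaphors")
    print(merge_subword_predictions(tagger("O tempo voa.")))
```

The helper assumes a cased WordPiece tokenizer (as used by BERTimbau); a sentencepiece-based model would need a different continuation test.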
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

More information needed

### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.4006 | 1.0 | 23 | 0.2516 | 0.0000 | 0.0000 | 0.0000 | 0.9240 |
| 0.2078 | 2.0 | 46 | 0.1660 | 0.3750 | 0.1304 | 0.1935 | 0.9475 |
| 0.1226 | 3.0 | 69 | 0.1353 | 0.7273 | 0.3478 | 0.4706 | 0.9641 |
| 0.0678 | 4.0 | 92 | 0.1082 | 0.5882 | 0.4348 | 0.5000 | 0.9682 |
| 0.0383 | 5.0 | 115 | 0.1125 | 0.6667 | 0.4348 | 0.5263 | 0.9710 |
| 0.0231 | 6.0 | 138 | 0.1205 | 0.6250 | 0.4348 | 0.5128 | 0.9724 |
| 0.0138 | 7.0 | 161 | 0.1405 | 0.6250 | 0.4348 | 0.5128 | 0.9724 |
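As a sanity check, the F1 column in the table above is consistent with the harmonic mean of the reported precision and recall, which can be verified directly from the rows:

```python
# (precision, recall, f1) per epoch, copied from the training results table.
rows = [
    (0.0, 0.0, 0.0),
    (0.375, 0.1304, 0.1935),
    (0.7273, 0.3478, 0.4706),
    (0.5882, 0.4348, 0.5),
    (0.6667, 0.4348, 0.5263),
    (0.625, 0.4348, 0.5128),
    (0.625, 0.4348, 0.5128),
]


def f1(p: float, r: float) -> float:
    """Harmonic mean of precision and recall (0 when both are 0)."""
    return 0.0 if p + r == 0 else 2 * p * r / (p + r)


for p, r, f in rows:
    # 5e-4 tolerance absorbs the rounding of p and r to four decimals.
    assert abs(f1(p, r) - f) < 5e-4, (p, r, f)
```

The epoch-1 row (all zeros at 0.9240 accuracy) indicates the model initially predicts no metaphor spans at all, which is expected with a heavily imbalanced tag distribution.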