======================================================================
MODEL 4.1: mBERT Fine-tuning Summary
======================================================================
Member: 4
Date: 2025-12-25 10:36:58

SPECIFICATION COMPLIANCE:
Model: bert-base-multilingual-cased
Max Length: 128
Batch Size: 16
Learning Rate: 2e-05
Epochs: 3/5
Seed: 42 (reproducibility)

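The hyperparameters above can be collected into one configuration object. This is an illustrative sketch only — the names `CONFIG` and `set_seed` are not from the actual training script — showing how the listed values, including the fixed seed, would be pinned down for a reproducible run.

```python
import random

# Hyperparameters copied from the spec block above.
# (Dict keys and structure are illustrative, not the real script's names.)
CONFIG = {
    "model_name": "bert-base-multilingual-cased",  # mBERT, cased variant
    "max_length": 128,        # tokens per input after truncation/padding
    "batch_size": 16,
    "learning_rate": 2e-5,    # common BERT fine-tuning learning rate
    "max_epochs": 5,          # budget; the log reports epoch 3 of 5
    "seed": 42,               # fixed for reproducibility
}

def set_seed(seed: int) -> None:
    """Seed Python's RNG; a real PyTorch run would also call
    torch.manual_seed(seed) and seed NumPy."""
    random.seed(seed)

set_seed(CONFIG["seed"])
```

With the seed fixed, repeated runs under the same config should reproduce the reported metrics up to hardware nondeterminism.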
TEST RESULTS (FINAL):
Macro F1: 0.6448
Macro Precision: 0.6496
Macro Recall: 0.6441
Weighted F1: 0.7570
Accuracy: 0.7542

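The gap between macro F1 (0.6448) and weighted F1 (0.7570) indicates class imbalance: macro averages per-class F1 equally, while weighted scales each class by its support, so frequent classes dominate. A minimal sketch on toy labels (the data and function name are illustrative, not from this evaluation):

```python
from collections import Counter

def f1_per_class(y_true, y_pred, labels):
    """One-vs-rest F1 = 2PR/(P+R) for each class label."""
    scores = {}
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores[c] = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return scores

# Imbalanced toy set: class 0 has support 4, class 1 has support 2.
y_true = [0, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 1, 1, 0]

f1 = f1_per_class(y_true, y_pred, [0, 1])   # {0: 0.75, 1: 0.5}
support = Counter(y_true)

macro_f1 = sum(f1.values()) / len(f1)                           # 0.625
weighted_f1 = sum(f1[c] * support[c] for c in f1) / len(y_true)  # ~0.667
```

As in the results above, weighted F1 exceeds macro F1 because the majority class is the better-predicted one.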
TRAINING TIME:
Total: 23.11 minutes
Per Epoch: 277.29s
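A quick consistency check on the timing figures: 23.11 minutes divided by 277.29 s per epoch works out to about 5 epochs, which suggests the per-epoch average covers the full 5-epoch budget, with "3/5" presumably marking the selected checkpoint (the log itself does not say this). The arithmetic:

```python
# Figures as reported in the summary above.
total_min = 23.11
per_epoch_s = 277.29

total_s = total_min * 60          # 1386.6 s
epochs_run = total_s / per_epoch_s  # ~5.0
```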