======================================================================
MODEL 4.2: XLM-RoBERTa with Data Augmentation
======================================================================
Date: 2025-12-25 16:18:05

SPECIFICATION:
  Model: xlm-roberta-large
  Batch Size: 8 (Grad Accum: 2)
  Learning Rate: 1e-05
  Max Length: 128
  Epochs: 3/5
  Seed: 42
  Augmentation: Yes (target minority 5%)

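A minimal sketch of the specification above as a plain config dict (the key names are illustrative; the actual training script is not part of this log). Note that a per-device batch of 8 with 2 gradient-accumulation steps yields an effective batch size of 16 per optimizer step:

```python
# Hyperparameters transcribed from the specification above.
# Key names are illustrative, not taken from the original training code.
config = {
    "model_name": "xlm-roberta-large",
    "per_device_batch_size": 8,
    "gradient_accumulation_steps": 2,
    "learning_rate": 1e-5,
    "max_length": 128,
    "num_epochs": 5,
    "seed": 42,
}

# Effective batch size: examples contributing to each optimizer step.
effective_batch = (
    config["per_device_batch_size"] * config["gradient_accumulation_steps"]
)
print(effective_batch)  # 16
```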
DATA AUGMENTATION:
  Original train size: 15699
  Augmented train size: 16156
  Samples added: 457

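One way a 5% minority target can translate into sample counts, sketched with hypothetical class counts (the log does not report the per-class distribution, so the numbers below do not reproduce the 457 added samples):

```python
import math

# Hypothetical class counts -- the log does not report the actual distribution.
# They are chosen only so the total matches the original train size of 15699.
counts = {"A": 12000, "B": 3000, "C": 500, "D": 199}
n_total = sum(counts.values())  # 15699

# Simple variant of a 5% target: bring every class up to at least 5% of the
# ORIGINAL size (this ignores that added samples also grow the denominator).
target = math.ceil(0.05 * n_total)  # 785

to_add = {c: max(0, target - n) for c, n in counts.items()}
print(to_add, sum(to_add.values()))  # {'A': 0, 'B': 0, 'C': 285, 'D': 586} 871
```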
TEST RESULTS:
  Macro F1: 0.6822
  Weighted F1: 0.7843
  Accuracy: 0.7832

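The gap between macro F1 (0.6822) and weighted F1 (0.7843) indicates weaker performance on low-support classes: macro F1 averages per-class F1 scores equally, while weighted F1 weights each class by its support. A self-contained toy illustration of the two averages (the labels and predictions below are invented, not from the test set):

```python
from collections import Counter

def per_class_f1(y_true, y_pred, label):
    """F1 for one class, treated as the positive label."""
    tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
    fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
    fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    prec = tp / (tp + fp)
    rec = tp / (tp + fn)
    return 2 * prec * rec / (prec + rec)

# Toy imbalanced set: 8 majority examples, 2 minority examples,
# with one minority example misclassified as the majority class.
y_true = ["maj"] * 8 + ["min"] * 2
y_pred = ["maj"] * 8 + ["maj", "min"]

labels = sorted(set(y_true))
f1s = {l: per_class_f1(y_true, y_pred, l) for l in labels}
support = Counter(y_true)

macro = sum(f1s.values()) / len(labels)                       # equal weight per class
weighted = sum(f1s[l] * support[l] for l in labels) / len(y_true)  # weight by support

print(round(macro, 4), round(weighted, 4))
```

Because the minority-class error drags macro F1 down more than weighted F1, weighted comes out higher, mirroring the pattern in the results above.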
TRAINING TIME:
  Total: 81.88 minutes
  Per epoch: 982.5 s
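A quick arithmetic check on the reported figures, assuming the per-epoch number is the total divided by the 5 scheduled epochs:

```python
# Figures reported in the log above.
per_epoch_s = 982.5
epochs = 5  # assumption: per-epoch figure reflects all 5 scheduled epochs

# 982.5 s * 5 = 4912.5 s = 81.875 min, matching the reported 81.88 minutes.
total_minutes = per_epoch_s * epochs / 60
print(round(total_minutes, 2))
```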