# router-qwen3-0.6b-text-only-v4

This model is a fine-tuned version of [Qwen/Qwen3-0.6B](https://huggingface.co/Qwen/Qwen3-0.6B) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5576
- Accuracy: 0.7273
- Precision: 0.7241
- Recall: 0.7273
- F1: 0.7067
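
For reference, here is a minimal inference sketch. It assumes the checkpoint was saved with a sequence-classification head on top of Qwen3-0.6B and is hosted on the Hub; the repo id below is a placeholder, not the actual path of this model.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder repo id -- substitute the actual Hub path of this checkpoint.
model_id = "your-username/router-qwen3-0.6b-text-only-v4"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Route this query, please.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# id2label comes from the checkpoint's config if it was set at training time;
# otherwise the default LABEL_0, LABEL_1, ... names are returned.
predicted = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted, predicted))
```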
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch_fused (AdamW with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments)
- lr_scheduler_type: cosine
- num_epochs: 2
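
A sketch of a `TrainingArguments` configuration matching the hyperparameters above. The output directory and the evaluation/logging cadence are assumptions (the results table reports eval metrics every 10 steps), not values stated in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="router-qwen3-0.6b-text-only-v4",  # assumed output dir
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch_fused",     # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    num_train_epochs=2,
    eval_strategy="steps",         # assumed: the table logs eval every 10 steps
    eval_steps=10,
    logging_steps=10,
)
```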
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.1855 | 0.5909 | 0.5603 | 0.5909 | 0.5667 |
| 0.8449 | 0.1136 | 10 | 0.7082 | 0.7102 | 0.7023 | 0.7102 | 0.6896 |
| 0.8398 | 0.2273 | 20 | 0.5703 | 0.7216 | 0.7262 | 0.7216 | 0.6911 |
| 0.6942 | 0.3409 | 30 | 0.6016 | 0.6989 | 0.6881 | 0.6989 | 0.6864 |
| 0.7332 | 0.4545 | 40 | 0.6146 | 0.6307 | 0.6924 | 0.6307 | 0.6361 |
| 0.636 | 0.5682 | 50 | 0.5402 | 0.7557 | 0.8095 | 0.7557 | 0.7170 |
| 0.5157 | 0.6818 | 60 | 0.5080 | 0.7159 | 0.7088 | 0.7159 | 0.7099 |
| 0.6613 | 0.7955 | 70 | 0.5060 | 0.7330 | 0.7274 | 0.7330 | 0.7182 |
| 0.6174 | 0.9091 | 80 | 0.5203 | 0.7443 | 0.7538 | 0.7443 | 0.7190 |
| 0.5021 | 1.0227 | 90 | 0.5441 | 0.7216 | 0.7134 | 0.7216 | 0.7101 |
| 0.4124 | 1.1364 | 100 | 0.5618 | 0.7330 | 0.7286 | 0.7330 | 0.7161 |
| 0.3623 | 1.25 | 110 | 0.5811 | 0.7159 | 0.7080 | 0.7159 | 0.7084 |
| 0.3416 | 1.3636 | 120 | 0.5897 | 0.6932 | 0.6912 | 0.6932 | 0.6921 |
| 0.2837 | 1.4773 | 130 | 0.5869 | 0.7443 | 0.7388 | 0.7443 | 0.7338 |
| 0.3153 | 1.5909 | 140 | 0.5608 | 0.7330 | 0.7323 | 0.7330 | 0.7116 |
| 0.272 | 1.7045 | 150 | 0.5676 | 0.7386 | 0.7409 | 0.7386 | 0.7166 |
| 0.3459 | 1.8182 | 160 | 0.5586 | 0.7273 | 0.7241 | 0.7273 | 0.7067 |
| 0.368 | 1.9318 | 170 | 0.5576 | 0.7273 | 0.7241 | 0.7273 | 0.7067 |
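
The averaging scheme behind the precision/recall/F1 columns is not stated; weighted averaging is a reasonable guess, since weighted recall equals accuracy exactly, as in the final rows above. A `compute_metrics` sketch under that assumption:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging is an assumption, consistent with recall == accuracy
    # in the results table above.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```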
### Framework versions
- Transformers 4.57.1
- Pytorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1