Sfaya-W-Ary-Arz-MSA

This model is a fine-tuned version of CAMeL-Lab/bert-base-arabic-camelbert-da on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0158
  • Accuracy: 0.9978
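The checkpoint can be loaded for sequence classification with the standard `transformers` Auto classes. The sketch below is an assumption-laden illustration, not documented usage: the repository id `atlasia/Sfaya-W-Ary-Arz-MSA` is taken from this card, but the task (classifying text among Moroccan Arabic, Egyptian Arabic, and MSA, as the model name suggests) and the label names are inferred — check the checkpoint's `config.id2label` for the real labels.

```python
# Minimal inference sketch for the fine-tuned checkpoint.
# Assumption: the model is a text classifier over Arabic varieties;
# verify the actual labels via model.config.id2label.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "atlasia/Sfaya-W-Ary-Arz-MSA"

def classify(text: str, model_id: str = MODEL_ID) -> str:
    """Return the predicted label name for a single input text."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    pred = logits.argmax(dim=-1).item()
    return model.config.id2label[pred]

if __name__ == "__main__":
    print(classify("كيف داير؟"))  # prediction depends on the checkpoint's labels
```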

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • num_epochs: 10
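The hyperparameters above map directly onto `transformers.TrainingArguments`. This is a reconstruction for reference only — `output_dir` is a placeholder, and the original run may have set additional options (logging, evaluation strategy) not listed on this card.

```python
# Sketch: TrainingArguments matching the hyperparameters listed above.
# output_dir is a placeholder; other unlisted options are unknown.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="sfaya-w-ary-arz-msa",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=10,
)
```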

Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.4027        | 0.1862 | 100  | 0.0415          | 0.9923   |
| 0.0701        | 0.3724 | 200  | 0.0637          | 0.9923   |
| 0.0567        | 0.5587 | 300  | 0.0686          | 0.9945   |
| 0.0365        | 0.7449 | 400  | 0.0605          | 0.9934   |
| 0.036         | 0.9311 | 500  | 0.0351          | 0.9934   |
| 0.0154        | 1.1173 | 600  | 0.0158          | 0.9978   |
| 0.0182        | 1.3035 | 700  | 0.0219          | 0.9967   |
| 0.0235        | 1.4898 | 800  | 0.0956          | 0.9934   |
| 0.0237        | 1.6760 | 900  | 0.0360          | 0.9956   |
| 0.0105        | 1.8622 | 1000 | 0.0240          | 0.9956   |
| 0.0093        | 2.0484 | 1100 | 0.0738          | 0.9934   |
| 0.009         | 2.2346 | 1200 | 0.0278          | 0.9945   |
| 0.0096        | 2.4209 | 1300 | 0.0333          | 0.9956   |
| 0.0164        | 2.6071 | 1400 | 0.0998          | 0.9945   |
| 0.0206        | 2.7933 | 1500 | 0.0747          | 0.9945   |
| 0.0024        | 2.9795 | 1600 | 0.0524          | 0.9912   |
| 0.0002        | 3.1657 | 1700 | 0.1051          | 0.9945   |
| 0.0068        | 3.3520 | 1800 | 0.0264          | 0.9967   |
| 0.0034        | 3.5382 | 1900 | 0.0293          | 0.9945   |
| 0.0087        | 3.7244 | 2000 | 0.0222          | 0.9956   |
| 0.0069        | 3.9106 | 2100 | 0.0305          | 0.9967   |
| 0.0036        | 4.0968 | 2200 | 0.0108          | 0.9967   |
| 0.0025        | 4.2831 | 2300 | 0.0367          | 0.9967   |
| 0.0           | 4.4693 | 2400 | 0.0269          | 0.9967   |
| 0.0044        | 4.6555 | 2500 | 0.0240          | 0.9967   |
| 0.0           | 4.8417 | 2600 | 0.0297          | 0.9967   |
| 0.0           | 5.0279 | 2700 | 0.0608          | 0.9956   |
| 0.0           | 5.2142 | 2800 | 0.0592          | 0.9956   |
| 0.0           | 5.4004 | 2900 | 0.0284          | 0.9967   |
| 0.0           | 5.5866 | 3000 | 0.0285          | 0.9967   |
| 0.0           | 5.7728 | 3100 | 0.0285          | 0.9967   |
| 0.0           | 5.9590 | 3200 | 0.0287          | 0.9967   |
| 0.0016        | 6.1453 | 3300 | 0.0294          | 0.9967   |
| 0.0           | 6.3315 | 3400 | 0.0226          | 0.9978   |
| 0.0           | 6.5177 | 3500 | 0.0227          | 0.9978   |
| 0.0           | 6.7039 | 3600 | 0.0228          | 0.9978   |
| 0.0           | 6.8901 | 3700 | 0.0232          | 0.9978   |
| 0.0           | 7.0764 | 3800 | 0.0233          | 0.9978   |
| 0.0           | 7.2626 | 3900 | 0.0235          | 0.9978   |
| 0.0           | 7.4488 | 4000 | 0.0235          | 0.9978   |
| 0.0           | 7.6350 | 4100 | 0.0235          | 0.9978   |
| 0.0           | 7.8212 | 4200 | 0.0236          | 0.9978   |
| 0.0           | 8.0074 | 4300 | 0.0239          | 0.9978   |
| 0.0           | 8.1937 | 4400 | 0.0240          | 0.9978   |
| 0.0           | 8.3799 | 4500 | 0.0240          | 0.9978   |
| 0.0           | 8.5661 | 4600 | 0.0241          | 0.9978   |
| 0.0           | 8.7523 | 4700 | 0.0241          | 0.9978   |
| 0.0025        | 8.9385 | 4800 | 0.0355          | 0.9934   |
| 0.0           | 9.1248 | 4900 | 0.0310          | 0.9945   |
| 0.0           | 9.3110 | 5000 | 0.0291          | 0.9945   |
| 0.0           | 9.4972 | 5100 | 0.0290          | 0.9956   |
| 0.0           | 9.6834 | 5200 | 0.0290          | 0.9956   |
| 0.0           | 9.8696 | 5300 | 0.0290          | 0.9956   |
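The learning rate follows the linear scheduler listed in the hyperparameters: a 200-step linear warmup to 5e-05, then linear decay to zero. A minimal sketch of that schedule, assuming roughly 5370 total optimizer steps (10 epochs at about 537 steps each, inferred from the Epoch/Step columns in the table):

```python
def linear_schedule_lr(step, base_lr=5e-5, warmup=200, total=5370):
    """Linear warmup for `warmup` steps, then linear decay to 0 at `total`.

    `total` is an estimate inferred from the training table, not a
    value stated on this card.
    """
    if step < warmup:
        return base_lr * step / warmup
    return base_lr * max(0.0, (total - step) / (total - warmup))
```

This mirrors the shape of the schedule `lr_scheduler_type: linear` produces in `transformers`; the exact step counts depend on the real dataset size.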

Framework versions

  • Transformers 4.45.2
  • Pytorch 2.5.1+cu124
  • Datasets 3.1.0
  • Tokenizers 0.20.3

Model size

  • 0.1B parameters
  • Tensor type: F32 (Safetensors)