Update README.md
README.md CHANGED

@@ -30,7 +30,6 @@ A 3.14B parameter multilingual language model trained from scratch for **Hebrew,
 | Max Seq Length | 2,048 |
 | Pretraining Data | 4.48B tokens (HE 40%, AR 20%, FA 20%, EN 20%) |
 | SFT Data | 36,980 samples (sentiment + translation) |
-| Training Cost | ~$1,500 total (pretraining + SFT) |

 ## Key Results

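As a quick sanity check on the pretraining-data row above, the per-language token counts implied by the listed percentages can be worked out directly (this assumes the shares are exact fractions of the 4.48B total, which the table does not guarantee):

```python
# Per-language token counts implied by the pretraining mix
# (assumption: the percentages are exact shares of the 4.48B total).
TOTAL_TOKENS = 4.48e9
SHARES = {"HE": 0.40, "AR": 0.20, "FA": 0.20, "EN": 0.20}

tokens_per_lang = {lang: TOTAL_TOKENS * share for lang, share in SHARES.items()}
for lang, n in tokens_per_lang.items():
    print(f"{lang}: {n / 1e9:.3f}B tokens")
# HE: 1.792B, AR/FA/EN: 0.896B each
```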