I trained Llama-2-7B after extending its tokenizer with 21,455 new tokens, continuing pretraining on about 15B tokens of Persian (Farsi) text drawn from Common Crawl, social media, and academic papers.
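
The tokenizer-extension step can be sketched roughly as below: train a small BPE vocabulary on Persian text, then merge the learned tokens into the base tokenizer and resize the model embeddings. This is a minimal illustrative sketch, not the exact training pipeline; the corpus and `vocab_size` are toy placeholders (the real run added 21,455 tokens).

```python
from tokenizers import Tokenizer, models, trainers, pre_tokenizers

# Toy Persian corpus; the real corpus was ~15B tokens of Persian text.
corpus = ["سلام دنیا", "این یک متن فارسی است", "کتاب و مقاله"]

# Train a small BPE model to derive candidate Persian subword tokens.
bpe = Tokenizer(models.BPE(unk_token="[UNK]"))
bpe.pre_tokenizer = pre_tokenizers.Whitespace()
trainer = trainers.BpeTrainer(vocab_size=64, special_tokens=["[UNK]"])
bpe.train_from_iterator(corpus, trainer)

# These learned tokens would then be merged into the Llama-2 tokenizer via
# tokenizer.add_tokens(...) followed by model.resize_token_embeddings(...).
new_tokens = [t for t in bpe.get_vocab() if t != "[UNK]"]
print(len(new_tokens))
```

After merging, the model's input and output embedding matrices must be resized to the new vocabulary size before continued pretraining, otherwise the added token IDs would index out of range.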