Description
Paramanu-Bangla is a 108-million-parameter, open-source, monolingual generative pretrained language model for Bangla/Bengali.
The model was pretrained from scratch with a context size of 1024 tokens.
It is neither chat-tuned nor fine-tuned.
We recommend fine-tuning/chat-tuning this pretrained model on Bangla/Bengali chat or Bangla NLP datasets before downstream use; a minimal loading sketch is shown below.
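The sketch below uses the Hugging Face transformers library and assumes the checkpoint is published in a transformers-compatible format; the repository id is a placeholder, so substitute the actual id from this page (if the checkpoint defines a custom architecture, trust_remote_code=True may also be required).

```python
# Minimal inference sketch (assumptions: transformers-compatible checkpoint;
# MODEL_ID is a placeholder -- replace with the actual repository id).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ORG/paramanu-bangla"  # placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
model.eval()

prompt = "বাংলাদেশের রাজধানী"  # "The capital of Bangladesh"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,   # stay well within the 1024-token context
        do_sample=True,
        top_p=0.9,
        temperature=0.8,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```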
Use of this model for commercial purposes is strictly prohibited.
If you use our model, please cite our paper (Niyogi et al., 2026).
Model Architecture
Autoregressive Transformer decoder model.
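The exact hyperparameters (vocabulary size, depth, width) are not stated in this card, so the sketch below is illustrative only: a GPT-2-style decoder configuration built with transformers' GPT2Config, with assumed values chosen to land near the stated 108M parameters and the stated 1024-token context.

```python
# Illustrative decoder-only config; every value except n_positions is an
# assumption chosen to land near the stated ~108M parameters, NOT the
# published Paramanu-Bangla configuration.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    vocab_size=32_000,  # assumption: plausible size for a Bangla tokenizer
    n_positions=1024,   # context size stated in this card
    n_embd=768,         # assumption
    n_layer=12,         # assumption
    n_head=12,          # assumption
)

model = GPT2LMHeadModel(config)
print(f"{sum(p.numel() for p in model.parameters()):,} parameters")
# ~110M with these assumed values, in the ballpark of the stated 108M
```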
Limitations
The model was trained on data crawled from the internet that contains toxic language, unsafe content, and societal biases. The model may therefore amplify those biases and return toxic responses, especially when given toxic prompts. It may also generate answers that are inaccurate, omit key information, or include irrelevant or redundant text, producing socially unacceptable or undesirable output even when the prompt itself contains nothing explicitly offensive.
Citations
@misc{niyogi2026paramanucompactcompetitivemonolingual,
title={Paramanu: Compact and Competitive Monolingual Language Models for Low-Resource Morphologically Rich Indian Languages},
author={Mitodru Niyogi and Eric Gaussier and Arnab Bhattacharya},
year={2026},
eprint={2401.18034},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2401.18034},
}