mesolitica/malaysian-mistral-191M-MLM-512

Feature Extraction · Transformers · Safetensors · Malay · mistral · custom_code · text-embeddings-inference

Malaysian Mistral 191M on MLM task using 512 context length

Replication of https://github.com/McGill-NLP/llm2vec using https://huggingface.co/mesolitica/malaysian-mistral-191M-4096, done by https://github.com/aisyahrzk (https://twitter.com/aisyahhhrzk).

Source code: https://github.com/mesolitica/malaya/tree/master/session/llm2vec

WandB run: https://wandb.ai/aisyahrazak/mistral-191M-mlm?nw=nwuseraisyahrazak
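As context, llm2vec-style models are typically used for feature extraction: token embeddings from the backbone are pooled (commonly mean-pooled over non-padding positions) into a single sentence vector. The sketch below illustrates that pooling step on dummy arrays; the `mean_pool` helper and the dummy shapes are illustrative assumptions, not code from this repository. In practice the hidden states would come from loading the model with `trust_remote_code=True` (the card carries the `custom_code` tag).

```python
import numpy as np

def mean_pool(last_hidden_state: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over real (non-padding) tokens only.

    Illustrative helper, not part of the mesolitica codebase. In real use,
    `last_hidden_state` would be produced by something like:
        AutoModel.from_pretrained(
            "mesolitica/malaysian-mistral-191M-MLM-512", trust_remote_code=True
        )
    """
    mask = attention_mask[..., None].astype(float)   # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(axis=1)  # (batch, hidden)
    counts = mask.sum(axis=1).clip(min=1e-9)         # (batch, 1), avoid /0
    return summed / counts

# Dummy batch: 2 sequences of 4 tokens, hidden size 8; second sequence has one pad.
hidden = np.ones((2, 4, 8))
mask = np.array([[1, 1, 1, 1],
                 [1, 1, 1, 0]])
emb = mean_pool(hidden, mask)
print(emb.shape)  # (2, 8)
```

Because padding positions are masked out before averaging, the padded and unpadded sequences yield the same embedding here; without the mask, the second row would be scaled down by the pad token.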

Downloads last month: 33
Model size: 0.2B params (Safetensors)
Tensor type: F32

Spaces using mesolitica/malaysian-mistral-191M-MLM-512:
  • eduagarcia/multilingual-tokenizer-leaderboard
  • HalalFoodNLP/halalnlp
  • HalalFoodNLP/flask-halalnlp

Collections including mesolitica/malaysian-mistral-191M-MLM-512:
  • Malaysian MaskLM: trained on 17B tokens (81GB of cleaned text); able to understand standard Malay, local Malay, local Mandarin, Manglish, and local Tamil. 7 items, updated Jun 24, 2025.
  • Malaysian LLM2Vec: extending Malaysian CausalLM with non-causal masking training, https://arxiv.org/abs/2404.05961. 5 items, updated Jun 24, 2025.