mesolitica/malaysian-mistral-64M-MLM-512

Tags: Feature Extraction · Transformers · Safetensors · Malay · mistral · custom_code · text-embeddings-inference

Malaysian Mistral 64M on MLM task using 512 context length

A replication of LLM2Vec (https://github.com/McGill-NLP/llm2vec) using https://huggingface.co/mesolitica/malaysian-mistral-64M-4096 as the base model, done by https://github.com/aisyahrzk (https://twitter.com/aisyahhhrzk).

Source code: https://github.com/mesolitica/malaya/tree/master/session/llm2vec

WandB run: https://wandb.ai/aisyahrazak/mistral-64M-mlm?nw=nwuseraisyahrazak
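
As a rough usage sketch (not part of the original card): the checkpoint can presumably be loaded for feature extraction with Transformers. `trust_remote_code=True` follows from the repo's custom_code tag; the mean pooling below is an assumed, common choice rather than a documented recommendation.

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_id = 'mesolitica/malaysian-mistral-64M-MLM-512'

# trust_remote_code=True is needed because the repo ships custom modeling
# code (the custom_code tag above).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
model.eval()

inputs = tokenizer(
    'Saya suka makan nasi lemak.',  # "I like to eat nasi lemak."
    return_tensors='pt',
    truncation=True,
    max_length=512,  # matches the 512-token training context
)
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_dim)

# Mean-pool over non-padding tokens to get one vector per input; this
# pooling choice is an assumption, not documented by the card.
mask = inputs['attention_mask'].unsqueeze(-1).float()
embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # (1, hidden_dim)
```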

Model size: 64.2M params · Tensor type: F32 · Format: Safetensors

Collections including mesolitica/malaysian-mistral-64M-MLM-512

• Malaysian MaskLM: trained on 17B tokens (81GB of cleaned text), covering standard Malay, local Malay, local Mandarin, Manglish, and local Tamil. 7 items, updated Jun 24, 2025.

• Malaysian LLM2Vec: extends Malaysian causal LMs with non-causal masked training, following https://arxiv.org/abs/2404.05961 (see the masking sketch below). 5 items, updated Jun 24, 2025.
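
For context (a sketch, not the authors' code): LLM2Vec-style masked training applies MLM masking on top of a decoder-only tokenizer, which normally ships without a mask token. A minimal illustration with Transformers' `DataCollatorForLanguageModeling`, where the added `<mask>` token, the pad-token workaround, and the 15% masking probability are all assumptions:

```python
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

# The base model's tokenizer; decoder-only tokenizers usually ship without
# mask/pad tokens, so both additions below are assumed workarounds.
tokenizer = AutoTokenizer.from_pretrained('mesolitica/malaysian-mistral-64M-4096')
if tokenizer.mask_token is None:
    tokenizer.add_special_tokens({'mask_token': '<mask>'})
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# The 15% masking probability is the common MLM default, not a value taken
# from this model's training config.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)
batch = collator([tokenizer('Saya suka makan nasi lemak.')])
print(batch['input_ids'])  # ~15% of tokens masked (or randomly replaced)
print(batch['labels'])     # -100 everywhere except the masked positions
```

In an actual training run, the model's token embeddings would also need resizing (`model.resize_token_embeddings`) after adding a new mask token.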