# Nemotron-3 Nano 30B — Arabic Multitask SFT
## Model Details
- Developed by: Mushari Alothman
- Model type: Causal Language Model
- Language(s): Arabic, English
- License: Apache 2.0
- Finetuned from: NVIDIA Nemotron-3 Nano 30B
This model is a supervised fine-tuned (SFT) version of NVIDIA Nemotron-3 Nano 30B, tuned for Arabic-first understanding while retaining strong multitask performance in English.
It is trained on ChatML-style conversations and is designed to produce clean, well-structured, and properly terminated responses.
## Intended Uses

### Direct Use
- Arabic & English MCQ answering (A–H format)
- Context-based QA / RAG
- Instruction following
- Saudi-domain knowledge understanding
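For the MCQ use case, a prompt can enumerate the answer choices with letters A–H. A minimal sketch of such a prompt builder (the `build_mcq_prompt` helper and its exact layout are illustrative assumptions, not the card's official format):

```python
from string import ascii_uppercase


def build_mcq_prompt(question: str, options: list[str]) -> str:
    """Format a question with lettered options (A-H), one per line.

    This layout is a hypothetical example, not the documented training format.
    """
    assert len(options) <= 8, "the A-H format supports at most 8 options"
    lines = [question]
    lines += [f"{letter}. {opt}" for letter, opt in zip(ascii_uppercase, options)]
    # "Answer with the letter of the correct option only."
    lines.append("أجب بحرف الخيار الصحيح فقط.")
    return "\n".join(lines)


prompt = build_mcq_prompt("ما عاصمة السعودية؟", ["جدة", "الرياض", "الدمام"])
```

The resulting string can then be placed inside the user turn of the ChatML prompt shown below.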
## How to Use

### Transformers
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Mushari440/Nemotron-3-Nano-30B-sft"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# ChatML-style prompt: a user turn ("What is the capital of Saudi Arabia?")
# followed by an open assistant turn for the model to complete.
prompt = "<|im_start|>user\nما عاصمة السعودية؟\n<|im_end|>\n<|im_start|>assistant\n"

# Move inputs to the model's device (device_map="auto" may place it on GPU).
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
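Since the model is aligned with ChatML-style training, the raw prompt above can equivalently be built from a message list. A minimal pure-Python sketch that reproduces the marker layout shown in the example (with `transformers`, `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)` would normally produce the template-rendered equivalent, assuming the tokenizer ships a chat template):

```python
def to_chatml(messages: list[dict]) -> str:
    """Render a message list as a ChatML prompt ending in an open assistant turn.

    Uses the <|im_start|>/<|im_end|> marker layout from the example above.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}\n<|im_end|>\n" for m in messages
    ]
    return "".join(parts) + "<|im_start|>assistant\n"


# "What is the capital of Saudi Arabia?"
messages = [{"role": "user", "content": "ما عاصمة السعودية؟"}]
prompt = to_chatml(messages)
```

Building prompts from a message list keeps multi-turn conversations consistent with the training format.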