# Mistral-Small-24B-NF4
This repository contains a 4-bit quantized version of Mistral-Small-24B, quantized with NF4 (NormalFloat 4) via bitsandbytes. It is designed to run efficiently on consumer hardware (GPUs with limited VRAM) with minimal loss of quality.
## Model Description
Mistral-Small-24B is a powerful language model that balances reasoning capability and speed. NF4 quantization reduces the model size to approximately 14.2 GB, making it usable on setups with 16 GB or 24 GB of VRAM.
- Architecture: Mistral
- Precision: 4-bit (NF4)
- Quantization method: bitsandbytes
- Format: Safetensors
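As a rough sanity check on the quoted size, the raw weight storage at different precisions can be estimated with simple arithmetic. This is a sketch only: real checkpoints also store per-block quantization scales, and some tensors (embeddings, norms) are often kept in higher precision, which is why the repository weighs in at ~14.2 GB rather than exactly 12 GB.

```python
# Back-of-the-envelope size of a 24B-parameter model at different precisions.
PARAMS = 24e9  # parameter count

def size_gb(bits_per_param):
    """Raw weight storage in GB (decimal), ignoring scales and metadata."""
    return PARAMS * bits_per_param / 8 / 1e9

for name, bits in [("fp16/bf16", 16), ("int8", 8), ("NF4", 4)]:
    print(f"{name:>9}: ~{size_gb(bits):.1f} GB")
```

Going from bf16 (~48 GB) to NF4 (~12 GB of raw weights) is what brings the model within reach of a single consumer GPU.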
## Installation and usage
To use this model, you need `transformers`, `bitsandbytes`, and `accelerate`:

```bash
pip install -U transformers bitsandbytes accelerate
```
### Example usage (Python)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "ikarius/Mistral-Small-24B-NF4"

# Load the model; the NF4 quantization settings are picked up from the
# repository's config.json, so no explicit BitsAndBytesConfig is needed.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Simple chat test
messages = [
    {"role": "user", "content": "Hi! Can you explain the benefit of NF4 quantization?"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
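The prompt in the example above asks about the benefit of NF4, and the core idea fits in a small self-contained sketch: NF4 places its 16 quantization levels at quantiles of a normal distribution (a good match for the statistics of neural-network weights) rather than spacing them uniformly, and each block of weights is scaled by its absolute maximum. The codebook construction below is illustrative, not the exact bitsandbytes codebook, and the helper names and block size are made up for the example.

```python
from statistics import NormalDist

def nf4_levels(k=16):
    """Illustrative NF4-style codebook: k levels at evenly spaced quantiles
    of a standard normal, rescaled to [-1, 1]. (bitsandbytes uses a similar
    but not identical construction.)"""
    nd = NormalDist()
    qs = [nd.inv_cdf((i + 0.5) / k) for i in range(k)]
    m = max(abs(q) for q in qs)
    return [q / m for q in qs]

def quantize_block(weights, levels):
    """Absmax-scale a block to [-1, 1], then store the index of the nearest
    codebook level for each weight (4 bits per weight + one scale per block)."""
    absmax = max(abs(w) for w in weights) or 1.0
    idx = [min(range(len(levels)), key=lambda i: abs(levels[i] - w / absmax))
           for w in weights]
    return idx, absmax

def dequantize_block(idx, absmax, levels):
    """Reconstruct approximate weights from indices plus the per-block scale."""
    return [levels[i] * absmax for i in idx]

levels = nf4_levels()
weights = [0.31, -0.07, 0.52, -0.44, 0.02, 0.18, -0.29, 0.61]
idx, absmax = quantize_block(weights, levels)
restored = dequantize_block(idx, absmax, levels)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(f"indices: {idx}")
print(f"max reconstruction error: {max_err:.3f}")
```

Because the levels cluster near zero where most weights live, the typical reconstruction error is far smaller than uniform 4-bit quantization would give for the same bit budget.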
## Files in this repository
- `model.safetensors`: the model weights in 4-bit format.
- `config.json` & `generation_config.json`: configuration files for the model architecture and generation.
- `tokenizer.json` & `tokenizer_config.json`: tokenizer settings for correct text processing.
- `chat_template.jinja`: template for formatting chats (Instruct format).
## License
Please refer to the original Mistral-Small guidelines for the terms of use. This model is distributed on the assumption that users comply with the Mistral AI license agreement.
**Base model:** mistralai/Mistral-Small-24B-Base-2501