LFM2.5-1.2B-Distilled-Claude-4.6 (Liquid Claude)

LFM2.5-1.2B-Distilled-Claude-4.6 (Liquid Claude) is a LoRA-based distillation of Claude into LFM2.5-1.2B-Thinking.
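To illustrate what "via LoRA" means here, the sketch below shows the core LoRA idea in plain Python: the base weight matrix is frozen, and training only learns a low-rank correction `B @ A` that is added on top. This is a minimal illustrative example with toy matrices, not the actual training code for this model.

```python
# Minimal LoRA sketch (illustrative toy example, not this model's training code).
# LoRA freezes the base weight W and learns a low-rank update B @ A,
# so the effective weight becomes W + (alpha / r) * (B @ A).

def matmul(X, Y):
    # Naive matrix multiply for small illustrative matrices.
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def add(X, Y, scale=1.0):
    # Elementwise X + scale * Y.
    return [[X[i][j] + scale * Y[i][j] for j in range(len(X[0]))]
            for i in range(len(X))]

d, r, alpha = 4, 2, 4                     # hidden size, LoRA rank, scaling factor
W = [[1.0 if i == j else 0.0 for j in range(d)] for i in range(d)]  # frozen base weight
A = [[0.1] * d for _ in range(r)]         # trainable down-projection (r x d)
B = [[0.0] * r for _ in range(d)]         # trainable up-projection (d x r), init to zero

# Effective adapted weight: W + (alpha / r) * B @ A.
W_eff = add(W, matmul(B, A), scale=alpha / r)

# With B initialized to zero, the adapter starts as a no-op: W_eff == W.
assert W_eff == W
```

Because `B` starts at zero, the adapted model is initially identical to the base model; training then moves only `A` and `B`, which is why a LoRA distillation can run on far less memory than full fine-tuning.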

Usage

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="FlameF0X/LFM2.5-1.2B-Distilled-Claude-4.6")
messages = [
    # Keeping this system prompt is recommended for stability, but you can change it.
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

Sample chat:

[Sample chat screenshot. Reasoning took about a minute; the hardware was an i3-6006U with 12 GB of RAM, running the F16 quantization.]

Benchmark

Results are in progress.

Model size: 1B params · Tensor type: BF16
