alchemai-LFM2.5-1.2B-Instruct-ONNX
A quantized ONNX version of alpaim/alchemai-LFM2.5-1.2B-Instruct, exported with Liquid AI's onnx-export tooling.
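The card does not include a usage snippet. A minimal sketch of loading this ONNX export through Hugging Face Optimum's ONNX Runtime backend might look like the following; the repo id comes from this card, while the prompt and generation arguments are illustrative assumptions, and the snippet presumes `optimum[onnxruntime]` and `transformers` are installed:

```python
# Sketch: run the quantized ONNX export via Optimum's ONNX Runtime backend.
# The repo id is taken from this card; everything else is illustrative.
from optimum.onnxruntime import ORTModelForCausalLM
from transformers import AutoTokenizer

repo_id = "alpaim/alchemai-LFM2.5-1.2B-Instruct-ONNX"

# Downloads the tokenizer and ONNX weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = ORTModelForCausalLM.from_pretrained(repo_id)

prompt = "Explain ONNX quantization in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding with a small budget; tune max_new_tokens as needed.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Plain `onnxruntime` with a manual decoding loop would also work, but the Optimum wrapper keeps the familiar `generate` API over the ONNX Runtime session.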
Model tree for alpaim/alchemai-LFM2.5-1.2B-Instruct-ONNX

- Base model: LiquidAI/LFM2.5-1.2B-Base
- Finetuned: LiquidAI/LFM2.5-1.2B-Instruct
- Finetuned: unsloth/LFM2.5-1.2B-Instruct
- Finetuned: alpaim/alchemai-LFM2.5-1.2B-Instruct (source of this ONNX export)