meta-llama/Meta-Llama-3-8B W4A4 (lf=1, seed=1) - ce

This model is a post-training quantized (PTQ) version of meta-llama/Meta-Llama-3-8B at W4A4 (4-bit weights, 4-bit activations), quantized with lambda factor 1 and seed 1.
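As an illustration of what 4-bit quantization does to a tensor, here is a minimal symmetric round-to-nearest sketch in NumPy. This is generic RTN quantization shown for intuition only; it is not necessarily the PTQ method used to produce this checkpoint.

```python
import numpy as np

def quantize_4bit_symmetric(x: np.ndarray):
    """Symmetric per-tensor 4-bit quantization (illustrative only).

    Maps floats to integers in [-8, 7] via a single scale factor.
    """
    qmax = 7  # use +/-7 of the int4 range [-8, 7] for a symmetric grid
    scale = np.abs(x).max() / qmax
    q = np.clip(np.round(x / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float values from 4-bit integers."""
    return q.astype(np.float32) * scale

# Toy weight vector: quantize, then reconstruct
w = np.array([0.5, -1.2, 0.03, 0.9], dtype=np.float32)
q, s = quantize_4bit_symmetric(w)
w_hat = dequantize(q, s)  # each entry within scale/2 of the original
```

At W4A4 this kind of mapping is applied to both the weights and the activations, which is why calibration objectives (such as the one indicated in this repo's name) matter for preserving accuracy.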

Model tree for RyanLucas3/ptq-meta-llama-Meta-Llama-3-8B-W4A4-lf1-seed1-ce
