Note: Hugging Face shows a configuration parsing warning for config.json ("quantization_config.bits" must be an integer). This is expected for an EXL2 quant with a fractional bit width (5.5 bpw).

Fits into 24 GB of VRAM at 24576 context with Q4 cache.

Set rope_alpha to 3.75 for this extended context.
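The context length, cache quantization, and rope_alpha above could be applied in, for example, a TabbyAPI config.yml. This is a minimal sketch; the key names follow TabbyAPI's sample config and the local model directory name is an assumption, not something stated on this card:

```yaml
# Hypothetical TabbyAPI config.yml fragment (key names assumed from TabbyAPI's sample config)
model:
  model_name: gemma-2-27b-it-SimPO-37K-5.5bpw-h6-exl2  # local folder under the models directory
  max_seq_len: 24576   # context length that fits in 24 GB with Q4 cache
  rope_alpha: 3.75     # NTK RoPE alpha recommended above
  cache_mode: Q4       # 4-bit KV cache quantization
```

Other EXL2 loaders expose the same three knobs (context length, cache quantization, RoPE alpha) under their own option names.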

Model: waldie/gemma-2-27b-it-SimPO-37K-5.5bpw-h6-exl2