Flagstone8878/Qwen3.5-18B-REAP-A3B-Coding (Quantized)

Description

This model is a 4-bit (bitsandbytes NF4) quantized version of Flagstone8878/Qwen3.5-18B-REAP-A3B-Coding.

Quantization Details

  • Quantization Type: int4
  • bnb_4bit_quant_type: nf4
  • bnb_4bit_use_double_quant: True
  • bnb_4bit_compute_dtype: bfloat16
  • bnb_4bit_quant_storage: int8
Safetensors

  • Model size: 17B params
  • Tensor types: F32, F16, I8, U8
Inference Providers

This model isn't deployed by any Inference Provider.

Model tree for lainlives/Qwen3.5-18B-REAP-A3B-Coding-bnb-4bit

Quantized (1): this model

Dataset used to train lainlives/Qwen3.5-18B-REAP-A3B-Coding-bnb-4bit