Flagstone8878/Qwen3.5-18B-REAP-A3B-Coding (Quantized)
Description
This is an int4 (bitsandbytes NF4) quantized version of the original model Flagstone8878/Qwen3.5-18B-REAP-A3B-Coding.
Quantization Details
- Quantization Type: int4
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
- bnb_4bit_quant_storage: int8
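The settings above can be expressed as a `BitsAndBytesConfig` when loading with 🤗 Transformers. This is a minimal sketch assuming `transformers`, `bitsandbytes`, and a compatible GPU are available; if the repository already ships pre-quantized weights, `from_pretrained` can typically load them without an explicit config.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Quantization config mirroring the settings listed above:
# NF4 quant type, double quantization, bfloat16 compute, int8 weight storage.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_storage=torch.int8,
)

model_id = "lainlives/Qwen3.5-18B-REAP-A3B-Coding-bnb-4bit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers across available devices
)
```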
Model tree for lainlives/Qwen3.5-18B-REAP-A3B-Coding-bnb-4bit
- Base model: Qwen/Qwen3.5-35B-A3B-Base
- Finetuned: Qwen/Qwen3.5-35B-A3B