llmfan46 / Qwen3.5-27B-ultra-uncensored-heretic-v2
Information
2.06bpw exl3 quantization of Qwen3.5-27B-ultra-uncensored-heretic-v2 via exllamav3.
repo generated automatically with ezexl3.
Repo Data
CLI Download
hf download UnstableLlama/Qwen3.5-27B-ultra-uncensored-heretic-v2-exl3-2.06bpw --local-dir ./Qwen3.5-27B-ultra-uncensored-heretic-v2-exl3-2.06bpw
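If you prefer doing the download from Python rather than the `hf` CLI, a minimal sketch using `huggingface_hub.snapshot_download` follows. The `local_dir_for` helper is hypothetical, introduced here only to mirror the `--local-dir` convention from the command above; it assumes `huggingface_hub` is installed.

```python
REPO_ID = "UnstableLlama/Qwen3.5-27B-ultra-uncensored-heretic-v2-exl3-2.06bpw"

def local_dir_for(repo_id: str) -> str:
    """Mirror the CLI's --local-dir convention: a folder named after the repo."""
    return "./" + repo_id.split("/")[-1]

if __name__ == "__main__":
    # Requires: pip install huggingface_hub
    from huggingface_hub import snapshot_download
    snapshot_download(repo_id=REPO_ID, local_dir=local_dir_for(REPO_ID))
```

This fetches the same files as the CLI command into the same target directory.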
Downloads last month: 29
Inference Providers
This model isn't deployed by any Inference Provider.
Model tree for UnstableLlama/Qwen3.5-27B-ultra-uncensored-heretic-v2-exl3-2.06bpw
Base model
Qwen/Qwen3.5-27B