EXL2 Quants of ddh0/Cassiopeia-70B
EXL2 quants of ddh0/Cassiopeia-70B using exllamav2 for quantization.
Quants
Downloading quants with huggingface-cli
Install huggingface-cli:
pip install -U "huggingface_hub[cli]"
Download quant by targeting the specific quant revision (branch):
huggingface-cli download ArtusDev/ddh0_Cassiopeia-70B-EXL2 --revision "5.0bpw_H6" --local-dir ./
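The same download can be done from Python. A minimal sketch using `snapshot_download` from the `huggingface_hub` library (the library the CLI wraps); the repo id and revision are taken from the command above:

```python
# Download a specific quant revision (branch) from Python instead of the CLI.
from huggingface_hub import snapshot_download

if __name__ == "__main__":
    # Fetches the 5.0bpw_H6 branch into the current directory,
    # mirroring the huggingface-cli command above.
    path = snapshot_download(
        repo_id="ArtusDev/ddh0_Cassiopeia-70B-EXL2",
        revision="5.0bpw_H6",
        local_dir="./",
    )
    print(path)  # local directory containing the downloaded files
```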
Model tree for ArtusDev/ddh0_Cassiopeia-70B-EXL2
Base model
ddh0/Cassiopeia-70B