# EXL2 Quants of ddh0/Cassiopeia-70B

EXL2 quants of ddh0/Cassiopeia-70B using exllamav2 for quantization.

## Quants

| Quant (Revision) | Bits per Weight | Head Bits |
|------------------|-----------------|-----------|
| 4.0_H6           | 4.0             | 6         |
| 4.45_H6          | 4.45            | 6         |
| 5.35_H6          | 5.35            | 6         |
| 6.7_H6           | 6.7             | 6         |
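
When choosing a quant, the bits-per-weight figures above translate roughly into on-disk size. A minimal sketch, assuming ~70 billion weights and ignoring the small overhead from the 6-bit head layer and auxiliary files (these numbers are estimates, not from the model card):

```python
# Rough on-disk size estimate for each quant of a ~70B-parameter model.
# Assumption (not stated in the card): ~70e9 weights total; head-layer
# overhead and non-weight files (tokenizer, config) are ignored.
PARAMS = 70e9

def approx_size_gib(bits_per_weight: float) -> float:
    """Approximate quantized weight size in GiB."""
    return PARAMS * bits_per_weight / 8 / 2**30

for bpw in (4.0, 4.45, 5.35, 6.7):
    print(f"{bpw:>4} bpw ≈ {approx_size_gib(bpw):.1f} GiB")
```

This gives roughly 33 GiB for the 4.0 bpw quant up to roughly 55 GiB for 6.7 bpw, which is a useful first filter for matching a quant to available VRAM.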

## Downloading quants with huggingface-cli


Install `huggingface-cli`:

```shell
pip install -U "huggingface_hub[cli]"
```

Download a quant by targeting its specific revision (branch), e.g. for the 6.7 bpw quant:

```shell
huggingface-cli download ArtusDev/ddh0_Cassiopeia-70B-EXL2 --revision "6.7_H6" --local-dir ./
```
