Sharded GGUF version of bartowski/internlm2_5-1_8b-chat-GGUF (Q4_K_S quantization of internlm/internlm2_5-1_8b-chat), split into smaller files for easier downloading and hosting.
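Sharded GGUF files follow the naming convention used by llama.cpp's `gguf-split` tool: a 1-based shard index and total count, each zero-padded to five digits. A minimal sketch of that convention; the prefix and shard count below are illustrative assumptions, not this repository's actual filenames:

```python
def shard_filenames(prefix: str, n_shards: int) -> list[str]:
    # gguf-split names shards "<prefix>-00001-of-00005.gguf":
    # 1-based index and total count, zero-padded to five digits.
    return [
        f"{prefix}-{i:05d}-of-{n_shards:05d}.gguf"
        for i in range(1, n_shards + 1)
    ]

# Hypothetical prefix and count for demonstration only.
print(shard_filenames("internlm2_5-1_8b-chat-Q4_K_S", 2))
```

When loading, llama.cpp only needs the path to the first shard; it discovers the remaining shards from this naming pattern.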
Model tree for Felladrin/gguf-sharded-Q4_K_S-internlm2_5-1_8b-chat
- Base model: internlm/internlm2_5-1_8b-chat
- Quantized: bartowski/internlm2_5-1_8b-chat-GGUF