Sharded GGUF version of internlm/internlm2_5-1_8b-chat-gguf.
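Sharded GGUF repositories split one model across several files that follow llama.cpp's split-file naming convention, `<prefix>-00001-of-0000N.gguf`; loaders are pointed at the first shard and discover the rest by name. A minimal sketch of how a consumer can enumerate the expected shard filenames (the prefix and shard count here are illustrative assumptions, not the actual filenames in this repository):

```python
def shard_names(prefix: str, total: int) -> list[str]:
    """Build the shard filenames for a split GGUF model.

    llama.cpp's gguf-split tool names shards
    <prefix>-00001-of-0000N.gguf with zero-padded 5-digit indices.
    """
    return [f"{prefix}-{i:05d}-of-{total:05d}.gguf" for i in range(1, total + 1)]

# Hypothetical prefix/count for illustration only:
names = shard_names("internlm2_5-1_8b-chat-q5_k_m", 2)
# The first shard is the one passed to the loader; the remainder
# are resolved automatically by tools that understand the convention.
first_shard = names[0]
```

Tools such as llama.cpp accept the first shard's path and read the remaining shards from the same directory.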
Model tree for Felladrin/gguf-sharded-q5_k_m-internlm2_5-1_8b-chat

Base model: internlm/internlm2_5-1_8b-chat-gguf