120B model?

#21
by jacek2024 - opened

Could you also create a 120B model? That size is much more local-friendly.

This would be my dream; I really want to get a quantized model that can run on 100 GB of RAM or less.
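For context, here is a rough back-of-envelope sketch (my own assumptions, not from any model card) of why 100 GB is attainable for a 120B model once quantized, ignoring KV cache and runtime overhead:

```python
# Approximate weight memory for a 120B-parameter model at various
# quantization levels. Assumption: memory ~ params * bits_per_weight / 8.

def quantized_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB (1 GB = 1e9 bytes)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{quantized_size_gb(120, bits):.0f} GB")
# 16-bit: ~240 GB, 8-bit: ~120 GB, 4-bit: ~60 GB
```

By this estimate, a 4-bit quantization of a 120B model would need roughly 60 GB for the weights alone, which leaves headroom under a 100 GB budget.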
