Very good GGUF build for translating texts offline

#1
by Demanin - opened

Very good GGUF build for translating texts offline. But could you also build a YandexGPT-5-Lite-8B-instruct-Q8-GGUF version? I'd like to check whether it's noticeably better than Q5 or not. Thank you.

Sorry for not being able to provide the necessary version. I made this repo only for my own YandexGPT tests, but recently found that people download this particular GGUF a lot. I expect you can find a Q8-GGUF model elsewhere, for example here https://huggingface.co/NikolayKozloff/YandexGPT-5-Lite-8B-instruct-Q8_0-GGUF, or here https://huggingface.co/mradermacher/YandexGPT-5-Lite-8B-instruct-GGUF. I found this model far too inconsistent for general use, but it could make a good translator.

For future inquiries, I recommend trying this Hugging Face space: https://huggingface.co/spaces/ggml-org/gguf-my-repo. It lets you quantize other Hugging Face models easily and completely free. In most cases quantizations appear shortly after a model's release, but less popular models can take more time, so you can also quantize them yourself.
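As a rough guide to the Q5 vs Q8 trade-off: the quality difference is usually subtle, but the file-size difference is easy to estimate from bits per weight. The sketch below uses approximate bits-per-weight figures for Q5_K_M and Q8_0 (real GGUF files vary slightly, since some tensors are kept at higher precision):

```python
# Rough file-size estimate for an 8B-parameter model at two common
# GGUF quantization levels. Bits-per-weight values are approximations,
# not exact GGUF format figures.
PARAMS = 8e9  # parameter count of YandexGPT-5-Lite-8B

BPW = {
    "Q5_K_M": 5.5,  # approx. bits per weight
    "Q8_0": 8.5,    # approx. bits per weight (includes per-block scales)
}

def size_gib(params: float, bits_per_weight: float) -> float:
    """Estimated file size in GiB for a given precision."""
    return params * bits_per_weight / 8 / 2**30

for name, bpw in BPW.items():
    print(f"{name}: ~{size_gib(PARAMS, bpw):.1f} GiB")
```

So Q8_0 costs roughly 3 GiB more RAM/VRAM for this model; whether the quality gain justifies that is best checked on your own translation samples.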

Thank you for your comment, I hope this helps.

  • Lev

Thank you, you have helped me a lot by releasing YandexGPT-5-Lite-8B-instruct-Q8-GGUF.
