llama.cpp example
Can you add a llama.cpp example with appropriate settings?
How does it compare to Nanonets?
If you want to use it with llama.cpp, treat it like a vision model; it's based on Qwen2.5-VL, so it should work similarly. The example code in the original model card uses these Python imports:
from mineru_vl_utils import MinerUClient
from mineru_vl_utils import MinerULogitsProcessor # if vllm>=0.10.1
...so you will need to look at the source of that code to see what it is doing.
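Since it behaves like a Qwen2.5-VL-style vision model, a minimal sketch with llama.cpp's multimodal CLI (`llama-mtmd-cli`) might look like the following. The GGUF filenames, the image name, and the prompt are placeholders I'm assuming, not taken from this repo:

```shell
#!/usr/bin/env sh
# Sketch: run the model as a vision model with llama.cpp's multimodal CLI.
# Filenames and prompt below are assumptions -- substitute the actual files
# from this repo.
#   -m       : main GGUF language-model weights
#   --mmproj : vision projector GGUF (required for image input)
#   --image  : the document page to process
#   --temp 0 : greedy decoding, usually preferable for OCR-style extraction
llama-mtmd-cli \
  -m MinerU2.5-VL-Q8_0.gguf \
  --mmproj mmproj-MinerU2.5-VL-F16.gguf \
  --image page.png \
  -p "Extract the text content of this page." \
  --temp 0
```

Note the separate `--mmproj` file: without it, llama.cpp loads only the language model and cannot accept image input.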
After loading the model, the .mmproj file does not seem to be attached or shown.
I wrote a memo for using this model with llama.cpp: https://medium.com/@jason.ni.py/how-to-run-mineru2-5-vl-document-ocr-model-with-llama-cpp-714b0bb8cd71
That's really helpful, thank you!