Projector problem.
clip_init: failed to load model '...\mmproj-gemma-3n-E4B-it-absolute-heresy-MPOA-F16.gguf': load_hparams: unknown projector type: gemma3nv
mtmd_init_from_file: error: Failed to load CLIP model from ...\mmproj-gemma-3n-E4B-it-absolute-heresy-MPOA-F16.gguf
srv load_model: failed to load multimodal model, '...\mmproj-gemma-3n-E4B-it-absolute-heresy-MPOA-F16.gguf'
srv operator(): operator(): cleaning up before exit...
main: exiting due to model loading error
mmproj-gemma-3n-E4B-it-absolute-heresy-MPOA-Q8_0.gguf fails the same way.
Were you able to get it to run?
The log line load_hparams: unknown projector type: gemma3nv means the llama.cpp build you're running is outdated and predates support for the Gemma 3n multimodal projector. With a current build, it loads and processes images correctly on my setup.
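One quick way to confirm this is to check which build of llama.cpp you're actually running before retrying. A minimal sketch (binary name and paths are placeholders for your install):

```shell
# Print the commit/version the server binary was built from.
# Builds from before gemma3nv support was merged will reject this mmproj.
./llama-server --version

# For reference, the failing invocation from the logs (paths elided there, kept elided here):
# ./llama-server -m <gemma-3n-model>.gguf --mmproj ...\mmproj-gemma-3n-E4B-it-absolute-heresy-MPOA-F16.gguf
```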
clip_model_loader: model name: Gemma 3n E4B It Absolute Heresy MPOA
clip_model_loader: description:
clip_model_loader: GGUF version: 3
clip_model_loader: alignment: 32
clip_model_loader: n_tensors: 825
clip_model_loader: n_kv: 37
clip_model_loader: has vision encoder
clip_model_loader: has audio encoder
clip_ctx: CLIP using CUDA0 backend
load_hparams: projector: gemma3nv
load_hparams: n_embd: 2048
load_hparams: n_head: 8
load_hparams: n_ff: 8192
load_hparams: n_layer: 128
load_hparams: ffn_op: gelu_quick
load_hparams: projection_dim: 2048
--- vision hparams ---
load_hparams: image_size: 768
load_hparams: patch_size: 3
load_hparams: has_llava_proj: 0
load_hparams: minicpmv_version: 0
load_hparams: n_merge: 1
load_hparams: n_wa_pattern: 0
load_hparams: model size: 1876.59 MiB
load_hparams: metadata size: 0.29 MiB
load_tensors: Stage 0 ended at global block index 2
load_tensors: Stage 1 ended at global block index 7
load_tensors: Stage 2 ended at global block index 44
load_tensors: Stage 3 ended at global block index 83
warmup: warmup with image size = 768 x 768
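For anyone hitting the same error, the fix is just to update and rebuild llama.cpp. A rough sketch, assuming a git checkout and a CUDA build (the CUDA0 backend shows in the log above; directory names are placeholders):

```shell
# Pull the latest llama.cpp sources (assumes an existing clone in ./llama.cpp).
git -C llama.cpp pull

# Reconfigure and rebuild; GGML_CUDA=ON matches the CUDA0 backend in the log.
cmake -B llama.cpp/build -S llama.cpp -DGGML_CUDA=ON
cmake --build llama.cpp/build --config Release -j

# Then relaunch the server with both the text model and the mmproj file, e.g.:
# llama.cpp/build/bin/llama-server -m <gemma-3n-model>.gguf \
#   --mmproj mmproj-gemma-3n-E4B-it-absolute-heresy-MPOA-F16.gguf
```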
Wow fast reply!
And my apologies, it works perfectly 🙏🏻
I updated llama.cpp and away it goes.
Very excellent, great work 😊
Happy to help, lad. Let me know if you run into any other issues.
