yuzaa committed · Commit caf2810 · verified · 1 Parent(s): 1740887

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -26,7 +26,7 @@ A Pocket-Sized MLLM for Ultra-Efficient Image and Video Understanding on Your Ph
 - 📱 **Broad Mobile Platform Coverage.**
 MiniCPM-V 4.6 can be deployed across all three mainstream mobile platforms — iOS, Android, and HarmonyOS. With every edge adaptation code open-sourced, developers can reproduce the on-device experience in [just a few steps](#deploy-minicpm-v-46-on-ios-android-and-harmonyos-platforms).
 - 🛠️ **Developer Friendly.**
-MiniCPM-V 4.6 is adapted to [inference frameworks](#use-minicpm-v-46-in-other-inference-and-training-frameworks) such as vLLM, SGLang, llama.cpp, Ollama, and supports [fine-tuning ecosystems](#use-minicpm-v-46-in-other-inference-and-training-frameworks) such as SWIFT and LLaMA-Factory. Developers can quickly customize models for new domains and tasks on consumer-grade GPUs. We provide multiple quantized variants across GGUF, BNB, AWQ, and GPTQ formats.
+MiniCPM-V 4.6 is adapted to [inference frameworks](https://github.com/yiranyyu/MiniCPM-o-private/tree/main#supported-inference-and-training-frameworks) such as vLLM, SGLang, llama.cpp, Ollama, and supports [fine-tuning ecosystems](#use-minicpm-v-46-in-other-inference-and-training-frameworks) such as SWIFT and LLaMA-Factory. Developers can quickly customize models for new domains and tasks on consumer-grade GPUs. We provide multiple quantized variants across GGUF, BNB, AWQ, and GPTQ formats.
 
 
 ### Evaluation <!-- omit in toc -->