yuzaa committed
Commit d206bf2 · verified · 1 Parent(s): caf2810

Update README.md

Files changed (1): README.md +2 −1
README.md CHANGED

```diff
@@ -26,7 +26,7 @@ A Pocket-Sized MLLM for Ultra-Efficient Image and Video Understanding on Your Ph
 - 📱 **Broad Mobile Platform Coverage.**
   MiniCPM-V 4.6 can be deployed across all three mainstream mobile platforms — iOS, Android, and HarmonyOS. With all edge adaptation code open-sourced, developers can reproduce the on-device experience in [just a few steps](#deploy-minicpm-v-46-on-ios-android-and-harmonyos-platforms).
 - 🛠️ **Developer Friendly.**
-  MiniCPM-V 4.6 is adapted to [inference frameworks](https://github.com/yiranyyu/MiniCPM-o-private/tree/main#supported-inference-and-training-frameworks) such as vLLM, SGLang, llama.cpp, and Ollama, and supports [fine-tuning ecosystems](#use-minicpm-v-46-in-other-inference-and-training-frameworks) such as SWIFT and LLaMA-Factory. Developers can quickly customize models for new domains and tasks on consumer-grade GPUs. We provide multiple quantized variants across GGUF, BNB, AWQ, and GPTQ formats.
+  MiniCPM-V 4.6 is adapted to [inference frameworks](#inference-and-training) such as vLLM, SGLang, llama.cpp, and Ollama, and supports [fine-tuning ecosystems](#inference-and-training) such as SWIFT and LLaMA-Factory. Developers can quickly customize models for new domains and tasks on consumer-grade GPUs. We provide multiple quantized variants across GGUF, BNB, AWQ, and GPTQ formats.
 
 
 ### Evaluation <!-- omit in toc -->
@@ -282,6 +282,7 @@ def normalize_response_text(text: str) -> str:
 
 We have adapted MiniCPM-V 4.6 for deployment on **iOS, Android, and HarmonyOS** platforms, with **all edge adaptation code fully open-sourced**. Developers can reproduce the on-device experience in just a few steps. Visit our [edge deployment repository](https://github.com/OpenBMB/MiniCPM-V-edge-demo) for platform-specific build guides, or go to the [download page](https://github.com/OpenBMB/MiniCPM-V-edge-demo/blob/main/DOWNLOAD.md) to try pre-built apps directly.
 
+<a id="inference-and-training"></a>
 #### Use MiniCPM-V 4.6 in Other Inference and Training Frameworks <!-- omit in toc -->
 
 MiniCPM-V 4.6 supports multiple inference and training frameworks. Below are quick-start commands for each. For full details, see our [Cookbook](https://github.com/OpenSQZ/MiniCPM-V-CookBook).
```
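The commit swaps the dead external links for in-page anchors backed by an explicit HTML `<a id>` element. For reference, the pattern in isolation looks like the sketch below (heading and link text are illustrative), assuming the markdown renderer preserves raw HTML anchors, as this commit evidently relies on:

```markdown
<!-- Explicit anchor placed directly above the heading: links can target
     #inference-and-training regardless of how the renderer slugifies the
     heading text, which can differ between hosting platforms. -->
<a id="inference-and-training"></a>
#### Use MiniCPM-V 4.6 in Other Inference and Training Frameworks

Elsewhere in the document, link to it with
[inference frameworks](#inference-and-training).
```

Pointing both in-text links at the same stable anchor, rather than at an auto-generated heading slug or an external repository URL, keeps them valid even if the heading wording changes later.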