Update README.md
README.md
CHANGED

@@ -26,7 +26,7 @@ A Pocket-Sized MLLM for Ultra-Efficient Image and Video Understanding on Your Ph
 - 📱 **Broad Mobile Platform Coverage.**
 MiniCPM-V 4.6 can be deployed across all three mainstream mobile platforms — iOS, Android, and HarmonyOS. With all edge adaptation code open-sourced, developers can reproduce the on-device experience in [just a few steps](#deploy-minicpm-v-46-on-ios-android-and-harmonyos-platforms).
 - 🛠️ **Developer Friendly.**
-MiniCPM-V 4.6 is adapted to [inference frameworks](
+MiniCPM-V 4.6 is adapted to [inference frameworks](#inference-and-training) such as vLLM, SGLang, llama.cpp, and Ollama, and supports [fine-tuning ecosystems](#inference-and-training) such as SWIFT and LLaMA-Factory. Developers can quickly customize models for new domains and tasks on consumer-grade GPUs. We provide multiple quantized variants across the GGUF, BNB, AWQ, and GPTQ formats.


 ### Evaluation <!-- omit in toc -->

@@ -282,6 +282,7 @@ def normalize_response_text(text: str) -> str:

 We have adapted MiniCPM-V 4.6 for deployment on **iOS, Android, and HarmonyOS** platforms, with **all edge adaptation code fully open-sourced**. Developers can reproduce the on-device experience in just a few steps. Visit our [edge deployment repository](https://github.com/OpenBMB/MiniCPM-V-edge-demo) for platform-specific build guides, or go to the [download page](https://github.com/OpenBMB/MiniCPM-V-edge-demo/blob/main/DOWNLOAD.md) to try pre-built apps directly.

+<a id="inference-and-training"></a>
 #### Use MiniCPM-V 4.6 in Other Inference and Training Frameworks <!-- omit in toc -->

 MiniCPM-V 4.6 supports multiple inference and training frameworks. Below are quick-start commands for each. For full details, see our [Cookbook](https://github.com/OpenSQZ/MiniCPM-V-CookBook).
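The second hunk's context line names a `normalize_response_text` helper from the README's evaluation code. The real implementation lives in the repository and may differ; the following is only an illustrative sketch of what such a normalizer typically does (assumed behavior: lowercasing, stripping ASCII punctuation, collapsing whitespace):

```python
import re
import string


def normalize_response_text(text: str) -> str:
    """Illustrative only: canonicalize a model response before comparison.

    Lowercases, strips ASCII punctuation, and collapses whitespace runs.
    The actual normalizer in the evaluation code may behave differently.
    """
    text = text.strip().lower()
    text = text.translate(str.maketrans("", "", string.punctuation))
    return re.sub(r"\s+", " ", text).strip()


print(normalize_response_text("  The answer is:  B. "))  # → "the answer is b"
```

This kind of canonicalization makes exact-match scoring robust to cosmetic differences in model output, which is the usual reason evaluation harnesses include such a step.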