Image-Text-to-Text
Transformers
Safetensors
Korean
English
qwen3_5
korean
reasoning
darwin
evolutionary-merge
sft
conversational
Instructions to use Warecube/Warecube-KO-27B-v3 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Warecube/Warecube-KO-27B-v3 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-text-to-text", model="Warecube/Warecube-KO-27B-v3")
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/p-blog/candy.JPG"},
            {"type": "text", "text": "What animal is on the candy?"},
        ],
    },
]
pipe(text=messages)
```

```python
# Load model directly
from transformers import AutoProcessor, AutoModelForImageTextToText

processor = AutoProcessor.from_pretrained("Warecube/Warecube-KO-27B-v3")
model = AutoModelForImageTextToText.from_pretrained("Warecube/Warecube-KO-27B-v3")
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/p-blog/candy.JPG"},
            {"type": "text", "text": "What animal is on the candy?"},
        ],
    },
]
inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
print(processor.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```

- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use Warecube/Warecube-KO-27B-v3 with vLLM:
Install from pip and serve the model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "Warecube/Warecube-KO-27B-v3"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Warecube/Warecube-KO-27B-v3",
    "messages": [
      {
        "role": "user",
        "content": [
          {"type": "text", "text": "Describe this image in one sentence."},
          {"type": "image_url", "image_url": {"url": "https://cdn.britannica.com/61/93061-050-99147DCE/Statue-of-Liberty-Island-New-York-Bay.jpg"}}
        ]
      }
    ]
  }'
```

Use Docker

```shell
docker model run hf.co/Warecube/Warecube-KO-27B-v3
```
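The curl request above can also be built and sent from Python. The sketch below constructs the same OpenAI-compatible payload using only the standard library; `build_chat_request` is a hypothetical helper written for this example (not part of vLLM), and the URL/port are the server defaults assumed above.

```python
import json

def build_chat_request(model, text, image_url):
    """Build an OpenAI-compatible chat.completions payload that mixes
    a text prompt and an image URL, mirroring the curl example."""
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": text},
                    {"type": "image_url", "image_url": {"url": image_url}},
                ],
            }
        ],
    }

payload = build_chat_request(
    "Warecube/Warecube-KO-27B-v3",
    "Describe this image in one sentence.",
    "https://cdn.britannica.com/61/93061-050-99147DCE/Statue-of-Liberty-Island-New-York-Bay.jpg",
)
body = json.dumps(payload)
# POST `body` to http://localhost:8000/v1/chat/completions with
# Content-Type: application/json (e.g. via urllib.request or requests).
print(payload["messages"][0]["content"][0]["type"])  # text
```

Sending the request is left out so the snippet runs without a live server.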
- SGLang
How to use Warecube/Warecube-KO-27B-v3 with SGLang:
Install from pip and serve the model
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "Warecube/Warecube-KO-27B-v3" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Warecube/Warecube-KO-27B-v3",
    "messages": [
      {
        "role": "user",
        "content": [
          {"type": "text", "text": "Describe this image in one sentence."},
          {"type": "image_url", "image_url": {"url": "https://cdn.britannica.com/61/93061-050-99147DCE/Statue-of-Liberty-Island-New-York-Bay.jpg"}}
        ]
      }
    ]
  }'
```

Use Docker images
```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "Warecube/Warecube-KO-27B-v3" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Warecube/Warecube-KO-27B-v3",
    "messages": [
      {
        "role": "user",
        "content": [
          {"type": "text", "text": "Describe this image in one sentence."},
          {"type": "image_url", "image_url": {"url": "https://cdn.britannica.com/61/93061-050-99147DCE/Statue-of-Liberty-Island-New-York-Bay.jpg"}}
        ]
      }
    ]
  }'
```

- Docker Model Runner
How to use Warecube/Warecube-KO-27B-v3 with Docker Model Runner:
```shell
docker model run hf.co/Warecube/Warecube-KO-27B-v3
```
README.md
---
license: apache-2.0
language:
- ko
- en
library_name: transformers
tags:
- korean
- reasoning
- darwin
- evolutionary-merge
- sft
base_model:
- ginigen-ai/Rogue-28B-MIX
---

# Warecube-KO-27B-v2

A Korean reasoning model: a Darwin-evolved variant refined with additional SFT.

---

## 🧬 Darwin Evolution Concept

This model is a child model obtained by applying additional **Korean K-AI domain SFT** on top of a parent model built with **Darwin V7 evolutionary model merging**.

```
Natural evolution            Darwin merge + SFT
─────────────────            ──────────────────
Gene crossover           →  per-module ratio combination of weights (parent)
Generational evolution   →  additional SFT refinement of the parent model
Survival of the fittest  →  keep offspring that excel in the K-AI domain
```
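The per-module weight-combination idea described above can be sketched in code. This is a toy illustration, not the actual Darwin V7 recipe: the module names, ratios, and flat-list "weights" are all invented for the example, and a real evolutionary merge would search these ratios across generations.

```python
def merge_state_dicts(parent_a, parent_b, module_ratios, default_ratio=0.5):
    """Linearly interpolate two model state dicts with a per-module ratio.

    A ratio r for a module means: merged = r * parent_a + (1 - r) * parent_b
    for every parameter in that module. Weights are flat lists of floats
    here to keep the sketch dependency-free.
    """
    merged = {}
    for name, wa in parent_a.items():
        wb = parent_b[name]
        # The module is taken to be the first dotted component of the name.
        module = name.split(".")[0]
        r = module_ratios.get(module, default_ratio)
        merged[name] = [r * a + (1 - r) * b for a, b in zip(wa, wb)]
    return merged

# Toy "parents" with two modules each.
parent_a = {"attn.w": [1.0, 1.0], "mlp.w": [0.0, 0.0]}
parent_b = {"attn.w": [0.0, 0.0], "mlp.w": [1.0, 1.0]}
merged = merge_state_dicts(parent_a, parent_b, {"attn": 0.75, "mlp": 0.25})
print(merged)  # {'attn.w': [0.75, 0.75], 'mlp.w': [0.75, 0.75]}
```

In an evolutionary setup, candidate ratio vectors play the role of genomes: offspring merges are evaluated on the target domain (here, K-AI tasks) and the best-scoring ones survive to the next generation.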

---

## 🏛️ Family Lineage

```
┌────────────────────────────────────────┐
│ Base / Parent                          │
│ ginigen-ai/Rogue-28B-MIX               │
│                                        │
│ - K-AI Leaderboard #2 (avg 0.559)      │
│ - Darwin family + Quetta evolutionary  │
│   merge                                │
│ - <think> reasoning trace              │
└────────────────────────────────────────┘
                    │
                    ▼  additional K-AI domain SFT evolution
╔════════════════════════════════════════╗
║ Child (this model)                     ║
║ Warecube/Warecube-KO-27B-v2            ║
║                                        ║
║ ✦ inherits all capabilities of the base║
║ ✦ reinforced Com2-main domain          ║
║ ✦ K-AI Leaderboard Docker-compatible   ║
║   format                               ║
╚════════════════════════════════════════╝
```

---

## 🎓 Training Overview

| Stage | Summary |
|:---|:---|
| **Base** | ginigen-ai/Rogue-28B-MIX (Darwin family × Quetta family evolutionary merge) |
| **SFT** | additional refinement on Korean K-AI domain instruction data |
| **Compatibility** | packaged in a K-AI Leaderboard Docker-compatible format |

---

## 🎯 Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "Warecube/Warecube-KO-27B-v2"
tokenizer = AutoTokenizer.from_pretrained(
    model_id, trust_remote_code=True
)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

prompt = "한국의 추석에 대해 설명해주세요."
messages = [{"role": "user", "content": prompt}]
inputs = tokenizer.apply_chat_template(
    messages, return_tensors="pt", add_generation_prompt=True
)
out = model.generate(
    inputs.to(model.device),
    max_new_tokens=512,
    do_sample=False,
)
print(tokenizer.decode(out[0], skip_special_tokens=False))
```
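Because the model emits a `<think>` reasoning trace (which is why the usage example decodes with `skip_special_tokens=False`), downstream code may want to separate the trace from the final answer. A minimal sketch, assuming the trace is delimited by literal `<think>...</think>` tags; the exact delimiter format should be verified against the model's actual output:

```python
import re

def split_reasoning(text):
    """Split generated text into (reasoning_trace, final_answer).

    Assumes the trace is wrapped in <think>...</think>. If no trace is
    present, the whole text is treated as the answer.
    """
    m = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if m is None:
        return "", text.strip()
    # Everything after the closing tag is the user-facing answer.
    return m.group(1).strip(), text[m.end():].strip()

trace, answer = split_reasoning(
    "<think>추석은 음력 8월 15일의 명절이다.</think> 추석은 한국의 대표적인 명절입니다."
)
print(answer)  # 추석은 한국의 대표적인 명절입니다.
```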

---

## 🛠️ Specifications

- Parameters: 28B (multimodal)
- Precision: bf16
- Context: 8K (extensible)
- Languages: Korean + English
- Reasoning: `<think>` reasoning trace
- License: Apache 2.0

---

## 🤝 Sources

- Base: [ginigen-ai/Rogue-28B-MIX](https://huggingface.co/ginigen-ai/Rogue-28B-MIX) (K-AI Leaderboard #2)
- Family: Darwin family (Darwin V7 evolutionary merge series)