KARAKURI VL 2 8B Thinking 2603

Model Details

Model Description

  • Developed by: KARAKURI Inc.
  • Model type: Vision-language model
  • Languages: Japanese and English
  • License: Apache 2.0
  • Finetuned from model: Qwen/Qwen3-VL-8B-Thinking
  • Contact: For questions and comments about the model, please email karakuri-rd@karakuri.ai

Usage

Use in 🤗 Transformers

First, install the required dependencies:

pip install transformers accelerate qwen-vl-utils[decord]==0.0.8

Then, use the following code to load the model and generate responses:

from transformers import AutoModelForImageTextToText, AutoProcessor

model_name = "karakuri-ai/karakuri-vl-2-8b-thinking-2603"
model = AutoModelForImageTextToText.from_pretrained(
    model_name, torch_dtype="auto", device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_name)

messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "image",
                "image": "https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg",
            },
            {"type": "text", "text": "Describe this image."},
        ],
    }
]

# Preparation for inference
inputs = processor.apply_chat_template(
    messages,
    tokenize=True,
    add_generation_prompt=True,
    return_dict=True,
    return_tensors="pt"
)
inputs = inputs.to(model.device)

# Inference: Generation of the output
generated_ids = model.generate(**inputs, max_new_tokens=128)
generated_ids_trimmed = [
    out_ids[len(in_ids) :] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
output_text = processor.batch_decode(
    generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
print(output_text)
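Since this is a "Thinking" model, the decoded output may contain a reasoning trace before the final answer. Qwen3-style models conventionally close that trace with a `</think>` tag; assuming this model follows the same convention, a small helper can separate the reasoning from the answer. The tag name and helper below are a sketch, not part of the official API:

```python
def split_thinking(text: str, end_tag: str = "</think>"):
    """Split a decoded generation into (reasoning, answer).

    Assumes the model closes its chain of thought with `end_tag`,
    as Qwen3-style thinking models typically do.
    """
    if end_tag in text:
        reasoning, _, answer = text.partition(end_tag)
        return reasoning.strip(), answer.strip()
    # No closing tag found: treat the whole output as the answer.
    return "", text.strip()


# Example with a hypothetical decoded output:
sample = "The photo shows a dog on a beach.</think>A dog playing on a sandy beach."
reasoning, answer = split_thinking(sample)
print(answer)  # → A dog playing on a sandy beach.
```

Applied to the snippet above, you would call `split_thinking(output_text[0])` after `batch_decode`. If the trace is not needed, simply discard the first element of the returned tuple.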

Training Details

Training Infrastructure

  • Hardware: The model was trained on Amazon EC2 trn2.48xlarge instances.
  • Software: We used training code based on neuronx-distributed.

Acknowledgments

This work was supported by the Ministry of Economy, Trade and Industry (METI) and the New Energy and Industrial Technology Development Organization (NEDO) through the Generative AI Accelerator Challenge (GENIAC).

Citation

@misc{karakuri_vl_2_8b_thinking_2603,
    author       = { {KARAKURI} {Inc.} },
    title        = { {KARAKURI} {VL} 2 8{B} {Thinking} 2603 },
    year         = { 2026 },
    url          = { https://huggingface.co/karakuri-ai/karakuri-vl-2-8b-thinking-2603 },
    publisher    = { {Hugging Face} },
    journal      = { {Hugging Face} repository }
}