HIKARI-Antares-8B-SkinCaption-STS-LoRA
Model Type: LoRA Adapter
This is a LoRA adapter (~1.2 GB); it must be loaded on top of the base model
Qwen/Qwen3-VL-8B-Thinking. Advantage: it is lightweight, so you download only ~1.2 GB instead of the full ~17 GB.
⚠️ Requirement: You must first load the base model
Qwen/Qwen3-VL-8B-Thinking (~17 GB) separately. 💾 If you prefer a standalone, ready-to-use model, see the merged version: E27085921/HIKARI-Antares-8B-SkinCaption-STS (~17 GB)
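If you prefer, you can also produce a merged checkpoint yourself by folding the adapter weights into the base model with peft. This is a sketch of the standard peft merge flow, not the recipe used to build the published merged model (the local output path is illustrative):

```python
from peft import PeftModel
from transformers import Qwen3VLForConditionalGeneration
import torch

# Load the base model, then attach the LoRA adapter on top of it.
base = Qwen3VLForConditionalGeneration.from_pretrained(
    "Qwen/Qwen3-VL-8B-Thinking",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base, "E27085921/HIKARI-Antares-8B-SkinCaption-STS-LoRA")

# Fold the LoRA deltas into the base weights and drop the adapter wrapper.
merged = model.merge_and_unload()
merged.save_pretrained("hikari-antares-merged")  # standalone ~17 GB checkpoint
```

Merging removes the peft dependency at inference time, at the cost of storing the full-size weights.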
What is this adapter?
LoRA adapter for HIKARI-Antares-8B-SkinCaption-STS: clinical caption generation with a Selective Token Supervision ablation (research). Metric: BLEU-4 0.61 (collapsed).
⚠️ This is a research artifact documenting STS training collapse. Output quality is poor. See HIKARI-Vega-8B-SkinCaption-Fused for production use.
See the full model card at E27085921/HIKARI-Antares-8B-SkinCaption-STS for complete details, usage examples, and performance comparison.
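For context, selective token supervision is commonly implemented by masking the labels of non-supervised tokens to -100, the index that PyTorch's cross-entropy loss ignores, so only the selected tokens contribute gradient. A minimal sketch of that masking step (the actual HIKARI training code may differ):

```python
import torch

IGNORE_INDEX = -100  # default ignore_index of torch.nn.CrossEntropyLoss


def mask_labels(input_ids: torch.Tensor, supervise_mask: torch.Tensor) -> torch.Tensor:
    """Copy input_ids into labels, hiding unsupervised positions from the loss."""
    labels = input_ids.clone()
    labels[~supervise_mask] = IGNORE_INDEX
    return labels


# Example: supervise only the last two tokens of a 5-token sequence.
ids = torch.tensor([101, 7592, 2088, 999, 102])
mask = torch.tensor([False, False, False, True, True])
print(mask_labels(ids, mask).tolist())  # [-100, -100, -100, 999, 102]
```

Under this scheme, which tokens receive supervision is a training-time choice; the collapse documented above concerns how that choice interacted with caption quality.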
Usage
from peft import PeftModel
from transformers import Qwen3VLForConditionalGeneration, AutoProcessor
import torch
from PIL import Image
# Step 1: Load base model (Qwen3-VL-8B-Thinking, ~17 GB)
base = Qwen3VLForConditionalGeneration.from_pretrained(
"Qwen/Qwen3-VL-8B-Thinking",
torch_dtype=torch.bfloat16,
device_map="auto",
trust_remote_code=True,
)
# Step 2: Apply LoRA adapter (~1.2 GB)
model = PeftModel.from_pretrained(base, "E27085921/HIKARI-Antares-8B-SkinCaption-STS-LoRA")
processor = AutoProcessor.from_pretrained("E27085921/HIKARI-Antares-8B-SkinCaption-STS-LoRA", trust_remote_code=True)
# Step 3: Inference (see full examples at E27085921/HIKARI-Antares-8B-SkinCaption-STS)
image = Image.open("skin_lesion.jpg").convert("RGB")
For complete inference examples including vLLM and SGLang production code, see: E27085921/HIKARI-Antares-8B-SkinCaption-STS
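Continuing from the usage code above (`model`, `processor`, and `image` already defined), a minimal generation call might look like the following. The prompt text and generation settings are illustrative, not the card's official inference recipe:

```python
# Build a chat-format request pairing the image with a text prompt.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "image": image},
            {"type": "text", "text": "Describe this skin lesion."},
        ],
    }
]

# Tokenize the conversation and image into model inputs.
inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

# Generate and decode only the newly produced tokens.
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=256)
caption = processor.batch_decode(
    out[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True
)[0]
print(caption)
```

Remember that this adapter is a documented training-collapse artifact, so generated captions should not be relied on clinically.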
Citation
@misc{hikari2026,
title = {HIKARI: RAG-in-Training for Skin Disease Diagnosis
with Cascaded Vision-Language Models},
author = {Watin Promfiy and Pawitra Boonprasart},
year = {2026},
institution = {King Mongkut's Institute of Technology Ladkrabang,
Department of Information Technology, Bangkok, Thailand}
}
Made with ❤️ at King Mongkut's Institute of Technology Ladkrabang (KMITL)