HIKARI-Altair-8B-SkinDx-LoRA
Model Type: LoRA Adapter
This is a LoRA adapter (~1.1 GB): it must be loaded on top of the base model Qwen/Qwen3-VL-8B-Thinking. Advantage: lightweight, you download only ~1.1 GB instead of ~17 GB.
⚠️ Requirement: you must first load the base model Qwen/Qwen3-VL-8B-Thinking separately.
If you prefer a standalone, ready-to-use model, see the merged version: E27085921/HIKARI-Altair-8B-SkinDx (~17 GB)
What is this adapter?
LoRA adapter for HIKARI-Altair β 10-class skin disease diagnosis (single-image fine-tuning baseline). Accuracy: 74.00% on SkinCAP validation set.
See the full model card at E27085921/HIKARI-Altair-8B-SkinDx for complete details, disease classes, and performance comparison.
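The reported 74.00% is standard top-1 accuracy over the 10 classes, i.e. the fraction of validation images whose predicted class string matches the reference label. A minimal sketch (class names below are illustrative, not the repo's exact label set):

```python
def top1_accuracy(preds, labels):
    """Fraction of predictions that exactly match the reference label."""
    assert len(preds) == len(labels)
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

# Toy example with hypothetical class names:
print(top1_accuracy(["psoriasis", "eczema", "acne", "acne"],
                    ["psoriasis", "acne", "acne", "acne"]))  # 0.75
```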
Usage
```python
from peft import PeftModel
from transformers import Qwen3VLForConditionalGeneration, AutoProcessor
import torch
from PIL import Image

# Step 1: Load the base model
base = Qwen3VLForConditionalGeneration.from_pretrained(
    "Qwen/Qwen3-VL-8B-Thinking",
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

# Step 2: Apply the LoRA adapter
model = PeftModel.from_pretrained(base, "E27085921/HIKARI-Altair-8B-SkinDx-LoRA")
processor = AutoProcessor.from_pretrained(
    "E27085921/HIKARI-Altair-8B-SkinDx-LoRA", trust_remote_code=True
)

# Step 3: Run inference on a skin-lesion image
image = Image.open("skin_lesion.jpg").convert("RGB")
PROMPT = (
    "This skin lesion belongs to the group '{group}'. "
    "Examine the lesion morphology (papules, plaques, macules), "
    "color (red, violet, white, brown), scale/crust, border sharpness, "
    "and distribution pattern. Based on these visual features, what is the specific skin disease?"
)
messages = [{"role": "user", "content": [
    {"type": "image", "image": image},
    {"type": "text", "text": PROMPT.format(group="inflammatory")},
]}]
text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = processor(text=[text], images=[image], return_tensors="pt").to(model.device)

# Greedy decoding (do_sample=False); a temperature setting would be ignored here.
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=64, do_sample=False)

print(processor.batch_decode(out[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True)[0].strip())
```
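"Thinking" variants of Qwen3 typically emit a reasoning trace before the final answer. If the decoded text contains a closing `</think>` tag (an assumption about this chat template, not confirmed by the card), the diagnosis can be isolated with a small helper:

```python
def extract_answer(decoded: str) -> str:
    """Return the text after the last </think> tag, or the full string if absent."""
    marker = "</think>"
    if marker in decoded:
        return decoded.rsplit(marker, 1)[-1].strip()
    return decoded.strip()

# Hypothetical decoded output:
print(extract_answer("<think>violaceous plaques, silvery scale</think>Psoriasis"))  # Psoriasis
```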
Made with ❤️ at King Mongkut's Institute of Technology Ladkrabang (KMITL)