
Qwen3 LoRA Adapters Collection

A collection of LoRA adapters extracted from Qwen/Qwen3-4B-Thinking-2507 at different rank and epsilon settings.

Structure

adapters/
├── rank8_eps0.01/
├── rank8_eps0.1/
β”œβ”€β”€ rank16_eps0.01/
└── ...

Usage

from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model the adapters are applied on top of
base_model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-4B-Base")

# Apply a specific LoRA adapter (any directory under adapters/)
lora_model = PeftModel.from_pretrained(base_model, "adapters/rank16_eps0.01")

Parameter Guide

  • Rank: lower ranks give smaller, more efficient adapters; higher ranks can potentially capture more of the original model's behavior
  • Epsilon: threshold for filtering singular values during extraction; a lower epsilon keeps more components
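The rank/epsilon interaction above can be illustrated with a small SVD sketch. This is a minimal illustration only: the exact extraction procedure used for these adapters is not specified here, and the function name, shapes, and relative-threshold convention are assumptions.

```python
import numpy as np

def truncate_by_epsilon(delta_w, eps, max_rank):
    # Illustrative sketch (assumption): factor a weight delta into low-rank
    # LoRA-style matrices, keeping singular components whose singular value
    # exceeds eps relative to the largest one, capped at max_rank.
    U, S, Vt = np.linalg.svd(delta_w, full_matrices=False)
    keep = int((S / S[0] > eps).sum())  # lower eps -> more components kept
    r = min(keep, max_rank)
    B = U[:, :r] * S[:r]   # (out_dim, r)
    A = Vt[:r, :]          # (r, in_dim)
    return B, A            # delta_w is approximated by B @ A

rng = np.random.default_rng(0)
# A synthetic rank-4 weight delta
delta = rng.standard_normal((64, 4)) @ rng.standard_normal((4, 64))
B, A = truncate_by_epsilon(delta, eps=0.01, max_rank=16)
print(B.shape, A.shape)
```

With a lower epsilon, more (possibly noisy) components survive the cut; with a higher epsilon, the factorization is smaller but may discard meaningful directions.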