# Qwen3-0.6B SEO Fine-tuned
A fine-tuned version of Qwen/Qwen3-0.6B on the Global SEO Knowledge dataset.
## Training Details
- Method: LoRA (r=16, alpha=32)
- Trainable params: 10M / 606M (1.67%)
- Epochs: 3
- Dataset: 2,065 SEO knowledge examples
- Final loss: 1.14
- Token accuracy: 79.8%
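
The training script isn't included in this card; below is a minimal sketch of a matching `peft` LoRA setup using the hyperparameters listed above. The choice of `target_modules` (the standard Qwen3 attention and MLP projections) is an assumption, not taken from the actual run:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# r and lora_alpha come from the card; target_modules is an assumption
# (standard Qwen3 attention/MLP projections), not confirmed by the run.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",  # attention projections
        "gate_proj", "up_proj", "down_proj",     # MLP projections
    ],
    task_type="CAUSAL_LM",
)

base_model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-0.6B")
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # on this config, roughly the ~10M listed above
```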
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("Kelnux/Qwen3-0.6B-seo-finetuned")
tokenizer = AutoTokenizer.from_pretrained("Kelnux/Qwen3-0.6B-seo-finetuned")

# Build a chat prompt using the model's chat template
messages = [{"role": "user", "content": "What is Core Web Vitals in SEO?"}]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Tokenize and generate a response
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
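
Note that `generate` returns the prompt tokens followed by the completion, so the decode above echoes the question. To print only the model's answer, slice off the prompt first:

```python
# Keep only the tokens generated after the prompt
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```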
## Categories Covered
- Technical SEO (robots.txt, sitemaps, Core Web Vitals, canonical URLs, schema markup)
- On-Page SEO (keyword density, meta tags, content optimization)
- Off-Page SEO (backlinking, domain authority, link building)
- Content SEO (E-E-A-T, content quality)