# Qwen3-4B SEO uczciweseo.pl (Experimental)
Domain-specific fine-tuned version of Qwen/Qwen3-4B for the uczciweseo.pl brand.
**Status: Experimental.** Training converged to a low loss, but generation quality degraded on Polish domain-specific questions; English SEO knowledge is preserved.
## Training Details
- Base model: Qwen3-4B (3.09B params)
- Method: LoRA (r=16, alpha=32) on all linear layers
- Trainable params: 33,030,144 (0.81%)
- Dataset: 2,312 examples (925 domain examples, 5×-oversampled, plus 1,387 bilingual SEO examples)
- Epochs: 2
- Learning rate: 2e-5 (cosine scheduler)
- Training loss: 0.4322
- Best eval loss: 1.014
- Hardware: Apple Silicon MPS (24GB), fp16 LoRA
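The trainable-parameter count above can be reproduced from Qwen3-4B's published architecture (hidden size 2560, 36 layers, 32 query / 8 KV heads with head dim 128, MLP intermediate size 9728): LoRA with rank `r` adds `r * (d_in + d_out)` parameters per adapted linear layer. A minimal sketch of that arithmetic:

```python
# Reproduce the 33,030,144 trainable-parameter count for LoRA r=16 applied
# to all linear projection layers of Qwen3-4B (dims from the model config).
r = 16
hidden = 2560          # hidden_size
layers = 36            # num_hidden_layers
q_out = 32 * 128       # num_attention_heads * head_dim
kv_out = 8 * 128       # num_key_value_heads * head_dim
mlp = 9728             # intermediate_size

def lora_params(d_in, d_out, rank=r):
    # LoRA adds two low-rank matrices: A (rank x d_in) and B (d_out x rank)
    return rank * (d_in + d_out)

per_layer = (
    lora_params(hidden, q_out)     # q_proj
    + lora_params(hidden, kv_out)  # k_proj
    + lora_params(hidden, kv_out)  # v_proj
    + lora_params(q_out, hidden)   # o_proj
    + lora_params(hidden, mlp)     # gate_proj
    + lora_params(hidden, mlp)     # up_proj
    + lora_params(mlp, hidden)     # down_proj
)
print(layers * per_layer)  # 33030144
```

The total matches the reported trainable count exactly, confirming that all seven projection matrices in every transformer block were adapted.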
## Brand Knowledge Target
Training data covers uczciweseo.pl (EXELMEDIA sp. z o.o.):
- Company values: no long-term contracts, full transparency
- Services: SEO, Google Ads, Bing Ads, AI SEO, CRO, automation
- Industry experience: construction, legal, industrial, automotive, furniture, e-commerce
## Known Issues
- Polish domain-specific answers show quality degradation (URL-like artifacts in generations)
- English general SEO knowledge remains well preserved
- Recommended for research and experimentation only
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Kelnux/Qwen3-4B-seo-uczciweseo"
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```