# Signal DSL Generator (LoRA Adapter)
A fine-tuned LoRA adapter for generating Signal DSL configurations from natural language descriptions.
## Model Details
| Attribute | Value |
|---|---|
| Base Model | Qwen/Qwen2.5-Coder-7B-Instruct |
| Training Method | 3-Stage (SFT → Preference Tuning → DPO) |
| Final Accuracy | 96.25% |
| DPO Margins | 1.57 |
| Parameters | LoRA rank=16, alpha=32 |
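The rank and alpha values determine how the adapter perturbs each frozen weight matrix: LoRA adds a low-rank product `B @ A` scaled by `alpha / r` (here 32 / 16 = 2). A toy numeric sketch of the idea, not PEFT's actual implementation:

```python
import numpy as np

# Illustrative LoRA update on a toy-sized weight matrix. Real adapters
# apply r=16, alpha=32 to the attention/MLP projections of the 7B base.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 4

W = rng.standard_normal((d_out, d_in))  # frozen base weight
A = rng.standard_normal((r, d_in))      # trainable down-projection
B = np.zeros((d_out, r))                # trainable up-projection (zero-init)

W_adapted = W + (alpha / r) * (B @ A)

# With B zero-initialized, the adapter starts as an exact no-op,
# so fine-tuning begins from the base model's behavior.
assert np.allclose(W_adapted, W)
```

Only `A` and `B` are trained, which is why the adapter download is tiny compared to the 7B base model.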
## What is Signal DSL?
Signal DSL is a domain-specific language for configuring intelligent LLM routing. It allows you to define:
- **Signals**: Domain, modality, complexity, language detection
- **Routes**: Conditional model selection based on signals
- **Plugins**: System prompts, RAG, semantic cache, etc.
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
import torch

# Load base model + LoRA adapter
base_model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-Coder-7B-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, "haowu1234/signal-dsl-generator-lora")
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Coder-7B-Instruct")

# Generate DSL
messages = [
    {"role": "system", "content": "You are a Signal DSL configuration generator. Generate valid Signal DSL configurations."},
    {"role": "user", "content": "Create a route: when the user asks a code question, use the deepseek-coder model"},
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(text, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=512, temperature=0.1, do_sample=True)

# Decode only the newly generated tokens, skipping the prompt
response = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(response)
```
## Example Output
**Input:** "Handle code-related questions using a specialist model"
**Output:**

```
SIGNAL domain code_domain {
  description: "Code and programming related queries"
}

ROUTE code_route (description = "Route code questions to specialist") {
  PRIORITY 100
  WHEN domain("code_domain")
  MODEL "deepseek-coder" (reasoning = true)
}
```
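Because sampling (even at temperature 0.1) can occasionally emit malformed output, it is worth sanity-checking a generation before using it. A minimal sketch; `looks_like_signal_dsl` is a hypothetical helper, not part of this adapter or any Signal DSL tooling:

```python
import re

def looks_like_signal_dsl(text: str) -> bool:
    """Hypothetical post-generation check: accept output only if its
    braces are balanced and it contains at least one SIGNAL or ROUTE
    block, rejecting obviously malformed generations."""
    if text.count("{") == 0 or text.count("{") != text.count("}"):
        return False
    return bool(re.search(r"\b(SIGNAL|ROUTE)\b", text))

good = '''SIGNAL domain code_domain {
  description: "Code and programming related queries"
}'''
bad = "SIGNAL domain code_domain {"  # unbalanced braces

assert looks_like_signal_dsl(good)
assert not looks_like_signal_dsl(bad)
```

A real deployment would parse the DSL with its actual grammar; this check only filters out clearly broken generations cheaply.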
## Training Details

### Stage 1: Supervised Fine-Tuning (SFT)
- Dataset: 10K+ synthetic DSL examples
- Epochs: 3
- Learning rate: 2e-4
### Stage 2: Preference Tuning
- Dataset: Preference pairs (correct vs incorrect DSL)
- Method: Contrastive learning
### Stage 3: Direct Preference Optimization (DPO)
- Beta: 0.1
- Final metrics:
  - Accuracy: 96.25%
  - Margins: 1.57
  - Loss: 0.27
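For context, the DPO margin reported above is the gap between the implicit rewards of the chosen and rejected completions, and the loss is the negative log-sigmoid of that gap. A numeric sketch with illustrative log-probabilities (not values from this training run):

```python
import math

# DPO with beta = 0.1 (as above). The margin is
#   beta * [(logp_chosen - ref_chosen) - (logp_rejected - ref_rejected)],
# i.e. how much more the policy prefers the correct DSL than the
# reference model does, minus the same quantity for the incorrect DSL.
beta = 0.1
logp_chosen, ref_chosen = -12.0, -20.0      # policy favors the correct DSL
logp_rejected, ref_rejected = -30.0, -22.0  # and suppresses the wrong one

margin = beta * ((logp_chosen - ref_chosen) - (logp_rejected - ref_rejected))
loss = -math.log(1.0 / (1.0 + math.exp(-margin)))  # -log(sigmoid(margin))

print(round(margin, 2), round(loss, 2))  # margin ≈ 1.6, loss ≈ 0.18
```

A margin near 1.6 with a loss near 0.18 is in the same regime as the reported 1.57 / 0.27, indicating the model confidently separates correct from incorrect DSL.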
## Limitations
- Generates Signal DSL syntax specifically; not a general-purpose code generator
- Best results with clear, specific natural language descriptions
- May produce verbose configurations for simple requests
## Citation

```bibtex
@misc{signal-dsl-generator,
  author = {Signal Router Team},
  title = {Signal DSL Generator: LoRA Fine-tuned Model for DSL Configuration Generation},
  year = {2025},
  publisher = {Hugging Face},
  url = {https://huggingface.co/haowu1234/signal-dsl-generator-lora}
}
```
## License
Apache 2.0 - See LICENSE for details.