Open Ticket AI – Lite Free (0.6B, BF16)

Model Summary

Open Ticket AI – Lite Free (BF16) is a free ticket tagging model designed for helpdesk and IT support systems, providing automatic classification into up to 100 structured tags.

This BF16 variant is intended for maximum numerical stability and compatibility, making it suitable for environments where INT8 quantization is not desired or supported.
It can be used on CPUs with BF16 support or on GPUs, and serves as the reference-quality Lite Free model.


Model Details

  • Developed by: Softoft (Tobias Bück)
  • Model type: Text Classification / Multi-Label Tagging
  • Base model: Qwen-based architecture
  • Model size: ~0.6B parameters
  • Precision: BF16
  • Languages: English, German, French, Spanish, Portuguese, Italian
  • License: Apache 2.0
  • Intended use: Automatic ticket tagging for support and ITSM systems

Intended Use

Direct Use

  • Automatic tagging of support tickets
  • Ticket routing and prioritization
  • Reporting and analytics enrichment
  • Evaluation and benchmarking of tagging accuracy

Typical integrations include:

  • Zammad
  • OTOBO
  • Znuny / OTRS forks
  • Other helpdesk systems via API or webhook
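A typical integration pushes predicted tags back to the helpdesk over its API or a webhook. The payload below is purely illustrative (the field names and the `build_tag_update` helper are assumptions, not a documented Zammad/OTOBO endpoint):

```python
import json

# Hypothetical webhook payload; field names are illustrative only,
# not taken from any specific helpdesk API.
def build_tag_update(ticket_id, tags):
    """Build a JSON body a helpdesk tag-update webhook might accept."""
    return json.dumps({"ticket_id": ticket_id, "tags": sorted(tags)})

payload = build_tag_update(4711, ["outage", "email"])
print(payload)  # {"ticket_id": 4711, "tags": ["email", "outage"]}
```

The actual endpoint, authentication, and schema depend on the target system; consult its API documentation.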

Out-of-Scope Use

  • Conversational or generative chatbots
  • Legal, medical, or safety-critical decisions
  • Autonomous decision making without human validation

Performance Characteristics

  • Designed for CPU (BF16-capable) or GPU inference
  • Higher memory usage than INT8, but improved numerical precision
  • Suitable for evaluation, benchmarking, and stable production environments
  • Typical real-world accuracy: ~0.8

Compared to the INT8 variant, this model prioritizes precision and reproducibility over minimal resource usage.


Limitations

  • Higher memory footprint than INT8
  • Limited to ~100 predefined tags
  • Accuracy lower than larger or commercial variants
  • Performance depends on ticket text quality and language consistency

For higher accuracy, more tags, or enterprise use cases, consider Lite-Pro or Full variants.


How to Use

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "softoft/otai-tags-lite-free-0.6b-bf16"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
)

inputs = tokenizer(
    "Email service is down since this morning, users cannot log in.",
    return_tensors="pt",
    truncation=True,
)

with torch.no_grad():
    outputs = model(**inputs)

# Multi-label head: apply a sigmoid to obtain independent per-tag probabilities
probabilities = torch.sigmoid(outputs.logits)

The output logits can be mapped to the predefined Open Ticket AI Lite tag set.
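As a minimal, self-contained sketch of that mapping (the `id2label` dictionary and logits below are stand-ins; in practice they come from `model.config.id2label` and `model(**inputs).logits`, and the 0.5 threshold is an illustrative default you may want to tune):

```python
import torch

# Illustrative stand-ins for model.config.id2label and model output logits
id2label = {0: "email", 1: "outage", 2: "login", 3: "hardware"}
logits = torch.tensor([[2.1, 1.3, 0.4, -3.0]])

# Sigmoid turns each logit into an independent per-tag probability
probabilities = torch.sigmoid(logits)[0]

threshold = 0.5  # illustrative; tune per deployment
predicted_tags = [id2label[i] for i, p in enumerate(probabilities) if p > threshold]
print(predicted_tags)  # ['email', 'outage', 'login']
```

Because this is multi-label classification, each tag is thresholded independently rather than selected via softmax/argmax.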


Training Data

The model was fine-tuned on a large, curated dataset of synthetic and semi-synthetic support tickets, designed to reflect realistic helpdesk scenarios across multiple domains and languages.

No personal or customer-identifiable data was used.


Evaluation

  • Internal evaluation on held-out ticket sets
  • Metrics focused on multi-label classification accuracy and consistency
  • Real-world performance depends on domain, language, and ticket quality

Environmental Impact

  • Training hardware: GPU (one-time fine-tuning)
  • Inference target: CPU (BF16) or GPU
  • Designed to minimize inference-time energy usage while maintaining stability

Technical Details

  • Architecture: Transformer-based sequence classification
  • Objective: Multi-label classification
  • Framework: PyTorch / Transformers
  • Deployment: On-premise, virtual machines, containerized environments

Citation

If you use this model in academic or commercial work, please cite:

Open Ticket AI – Lite Free (BF16), Softoft (2025)


Contact

For questions, integrations, or commercial licensing options:
