---
tags:
- quantum-ml
- hybrid-quantum-classical
- ibm-quantum
- heron-r2
- ibm_fez
- quantum-kernel
- merged-lora
license: mit
language:
- en
base_model:
- WeiboAI/VibeThinker-1.5B
pipeline_tag: text-generation
---

# Chronos 1.5B - Quantum-Classical Model

![chronos](https://cdn-uploads.huggingface.co/production/uploads/67329d3f69fded92d56ab41a/mkH0yazAc13v0gvi04RIF.jpeg)

**A hybrid quantum-classical model combining VibeThinker-1.5B with quantum kernel methods**

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Python 3.8+](https://img.shields.io/badge/python-3.8+-blue.svg)](https://www.python.org/downloads/)
[![Transformers](https://img.shields.io/badge/🤗%20Transformers-Compatible-blue)](https://github.com/huggingface/transformers)

## Overview

**Chronos 1.5B** is an experimental quantum-enhanced language model that combines:

- **VibeThinker-1.5B** as the base transformer model for embedding extraction
- **Quantum kernel methods** for similarity computation
- **2-qubit quantum circuits** for an enhanced feature-space representation

This model is a proof of concept for hybrid quantum-classical machine learning applied to sentiment analysis.

## Quantum Component Details

| Feature | Implementation |
|---------|----------------|
| Real quantum training | Quantum rotation angles were optimized on IBM **Heron r2** (`ibm_fez`) in 2025 |
| Saved quantum parameters | `quantum_kernel.pkl` — trained 2-qubit gate angles (pickle) |
| Quantum circuit definition | Available in `k_train_quantum.npy` / `k_test_quantum.npy` (future use) |
| Current inference | Classical simulation using the trained quantum angles (via cosine similarity) |
| True quantum execution (optional) | Possible by loading `quantum_kernel.pkl` plus the circuit files and running on IBM Quantum (example scripts will be added; an illustrative sketch appears after the Model Details section below) |

## Architecture

![chrn](https://cdn-uploads.huggingface.co/production/uploads/67329d3f69fded92d56ab41a/DPIBm4hDJd9ztLz2EE0jU.png)

## Model Details

- **Base Model**: [WeiboAI/VibeThinker-1.5B](https://huggingface.co/WeiboAI/VibeThinker-1.5B)
- **Architecture**: Qwen2ForCausalLM
- **Parameters**: ~1.5B
- **Context Length**: 131,072 tokens
- **Embedding Dimension**: 1536
- **Quantum Component**: 2-qubit kernel
- **Training Data**: 8 sentiment examples (demonstration)
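### Loading the Trained Quantum Parameters (sketch)

The Quantum Component Details table notes that true quantum execution is possible by loading `quantum_kernel.pkl`. The snippet below is a minimal, hedged sketch of what that could look like, not the official example scripts. It assumes the pickle holds a flat array of trained RY rotation angles for the 2-qubit circuit; the actual file layout may differ, and the fallback angles exist only so the snippet runs without the file.

```python
import pickle
from pathlib import Path

import numpy as np
from qiskit import QuantumCircuit

# Assumption: quantum_kernel.pkl stores a flat array of trained RY angles.
# The real file layout may differ -- treat this as an illustrative sketch.
params_path = Path("quantum_kernel.pkl")
if params_path.exists():
    with params_path.open("rb") as f:
        angles = np.asarray(pickle.load(f), dtype=float).ravel()
else:
    angles = np.random.uniform(0, np.pi, size=4)  # demo fallback only

# Rebuild a simple 2-qubit circuit from the trained angles:
# RY rotations on each qubit, a CNOT for entanglement, an optional second layer.
qc = QuantumCircuit(2)
qc.ry(float(angles[0]), 0)
qc.ry(float(angles[1]), 1)
qc.cx(0, 1)
if angles.size >= 4:
    qc.ry(float(angles[2]), 0)
    qc.ry(float(angles[3]), 1)
qc.measure_all()

print(qc.draw())
# To execute on real hardware, submit `qc` through qiskit-ibm-runtime
# (e.g. the Sampler primitive) against an IBM Quantum backend such as ibm_fez.
```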
## Performance

### Base VibeThinker-1.5B Benchmarks

![bench](https://cdn-uploads.huggingface.co/production/uploads/67329d3f69fded92d56ab41a/sdjLC2Oa2JXcwJc-qqSx2.png)
### Benchmark Results

| Model | Accuracy | Type |
|-------|----------|------|
| Classical (Linear SVM) | 100% | Baseline |
| Quantum Hybrid | 75% | Experimental |

![chronos_o1_results_english](https://cdn-uploads.huggingface.co/production/uploads/67329d3f69fded92d56ab41a/LNOXKqlOV96HWJzammq2Y.png)
![chronos_o1_results](https://cdn-uploads.huggingface.co/production/uploads/67329d3f69fded92d56ab41a/wE_sARe9MdeSnwiwe8bq6.png)

**Note**: Performance varies with dataset size and quantum simulation parameters. This is a proof of concept demonstrating quantum-classical integration.

## 🧬 Also take a look at the Hypnos Family

| Model | Parameters | Quantum Sources | Best For | Status |
|-------|------------|-----------------|----------|--------|
| **Hypnos-i2-32B** | 32B | 3 (Matter + Light + Nucleus) | Production, Research | ✅ Available |
| **Hypnos-i1-8B** | 8B | 1 (Matter only) | Edge, Experiments | ✅ 10k+ Downloads |

Start with [Hypnos-i1-8B](https://huggingface.co/squ11z1/hypnos-i1-8b) for lightweight quantum-regularized AI!

## Installation

### Requirements

```bash
pip install torch transformers numpy scikit-learn
```

## Usage

### Python Inference

```python
from transformers import AutoModel, AutoTokenizer
import torch
import numpy as np
from sklearn.preprocessing import normalize
from sklearn.metrics.pairwise import cosine_similarity

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = AutoTokenizer.from_pretrained("squ11z1/Chronos-1.5B")
model = AutoModel.from_pretrained(
    "squ11z1/Chronos-1.5B",
    torch_dtype=torch.float16
).to(device).eval()

def predict_sentiment(text):
    inputs = tokenizer(text, return_tensors="pt", padding=True,
                       truncation=True, max_length=128).to(device)
    with torch.no_grad():
        outputs = model(**inputs)
    # Mean-pool the last hidden state into a single 1536-d embedding
    embedding = outputs.last_hidden_state.mean(dim=1).cpu().numpy()[0]
    embedding = normalize([embedding])[0]
    # Your quantum kernel logic here: compare `embedding` against the stored
    # training-example embeddings with cosine_similarity and apply weighted
    # voting (see the classification sketch before the Citation section).
    sentiment = "POSITIVE"  # placeholder
    return sentiment
```

### Quick Start Script

```bash
python inference.py
```

This starts an interactive session where you can enter text for sentiment analysis.

### Example Output

```
Input text: 'Random text!'

[1/3] VibeThinker embedding: 1536D (normalized)
[2/3] Quantum similarity computed
[3/3] Classification: POSITIVE

Confidence: 87.3%
Positive avg: 0.756, Negative avg: 0.128
Time: 0.42s
```

## Quantum Kernel Details

The quantum component uses a simplified kernel approach:

1. Extract 1536D embeddings from VibeThinker
2. Normalize using L2 normalization
3. Compute cosine similarity against training examples
4. Apply quantum-inspired weighted voting
5. Return sentiment with confidence score

A minimal, illustrative sketch of this procedure is included below, just before the Citation section.

**Note**: This implementation uses classical simulation. For true quantum execution, integration with IBM Quantum or similar platforms is required.

## Training Data

The model uses 8 hand-crafted examples for demonstration:

- 4 positive sentiment examples
- 4 negative sentiment examples

For production use, retrain with larger datasets.

## Limitations

- Small training set (8 examples)
- Quantum kernel inference is classically simulated rather than executed on real quantum hardware
- Performance may vary significantly with different inputs
- Designed for English-text sentiment analysis only

## Future Improvements

1. Expand the training dataset to 100+ examples
2. Implement true quantum kernel execution on IBM Quantum
3. Increase quantum circuit complexity (3-4 qubits)
4. Add error mitigation for quantum noise
5. Support multi-language sentiment analysis
6. Fine-tune on domain-specific sentiment data
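## Classification Sketch (Classical Simulation)

The weighted-voting step described under Quantum Kernel Details is not spelled out in code above, so here is a minimal, self-contained sketch of it. The function name `quantum_inspired_vote`, the random demo embeddings, and the confidence formula are illustrative assumptions rather than the exact shipped implementation; in practice the embeddings come from the `predict_sentiment` pipeline in the Usage section.

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

def quantum_inspired_vote(embedding, example_embeddings, labels):
    """Classify one L2-normalized embedding by similarity-weighted voting.

    `embedding` is a 1-D vector of shape (1536,), `example_embeddings` is an
    (n_examples, 1536) array, and `labels` is a matching list of
    "POSITIVE"/"NEGATIVE" strings (the card ships 8 such examples).
    """
    sims = cosine_similarity(embedding.reshape(1, -1), example_embeddings)[0]
    is_pos = np.array([label == "POSITIVE" for label in labels])
    pos_avg = sims[is_pos].mean()
    neg_avg = sims[~is_pos].mean()
    prediction = "POSITIVE" if pos_avg >= neg_avg else "NEGATIVE"
    # Turn the margin between class averages into a rough confidence score.
    confidence = abs(pos_avg - neg_avg) / (abs(pos_avg) + abs(neg_avg) + 1e-9)
    return prediction, confidence, pos_avg, neg_avg

# Demo with random stand-in embeddings; in practice, compute them with the
# embedding step from the Python Inference snippet above.
rng = np.random.default_rng(0)
examples = rng.normal(size=(8, 1536))
examples /= np.linalg.norm(examples, axis=1, keepdims=True)
labels = ["POSITIVE"] * 4 + ["NEGATIVE"] * 4
query = examples[0] + 0.1 * rng.normal(size=1536)
query /= np.linalg.norm(query)
print(quantum_inspired_vote(query, examples, labels))
```

The returned tuple mirrors the fields in the Example Output above: predicted label, confidence, positive average similarity, and negative average similarity.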
## Citation

If you use this model in your research, please cite:

```bibtex
@misc{chronos-1.5b,
  title={Chronos 1.5B: Quantum-Enhanced Sentiment Analysis},
  author={squ11z1},
  year={2025},
  publisher={Hugging Face},
  howpublished={\url{https://huggingface.co/squ11z1/chronos-1.5b}}
}
```

## Acknowledgments

- Base model: [VibeThinker-1.5B](https://huggingface.co/WeiboAI/VibeThinker-1.5B) by WeiboAI
- Quantum computing framework: Qiskit
- Inspired by quantum machine learning research

## License

MIT License - see the LICENSE file for details.

---

**Disclaimer**: This is an experimental proof-of-concept model. Performance and accuracy are not guaranteed for production use cases. The quantum component does not currently provide a quantum advantage over classical methods.