Upload CITATION.cff
CITATION.cff +23 -0 (ADDED)
@@ -0,0 +1,23 @@
+cff-version: 1.2.0
+title: "Q-TensorFormer: Quantum-Enhanced Tensor Network LLM Compression Engine"
+message: "If you use this software in your research, please cite it using the metadata below."
+authors:
+  - given-names: Premchan369
+    affiliation: ""
+repository-code: "https://huggingface.co/Premchan369/q-tensorformer"
+url: "https://huggingface.co/Premchan369/q-tensorformer"
+abstract: >-
+  Q-TensorFormer is a hybrid transformer architecture that compresses feed-forward
+  layers using Tensor-Train (TT) decomposition and enhances token representations
+  via real quantum circuits, with adaptive TT-rank scheduling guided by attention
+  entropy. It achieves 50-70% parameter reduction at equivalent perplexity.
+keywords:
+  - tensor networks
+  - quantum machine learning
+  - model compression
+  - transformer
+  - language modeling
+  - efficient deep learning
+license: Apache-2.0
+version: 3.0.0
+date-released: 2026-05-06