---
base_model:
- Qwen/Qwen3-1.7B-Base
datasets:
- HuggingFaceFW/fineweb-edu
license: apache-2.0
model_name: Qwen3_1.7B_LoopUS
tags:
- LoopUS
- LoopedTransformers
pipeline_tag: text-generation
---
# LoopUS: Recasting Pretrained LLMs into Looped Latent Refinement Models

Taekhyun Park¹, Yongjae Lee¹, Dohee Kim², Hyerim Bae¹†

¹BAELAB, Pusan National University, Busan, Korea
²DOLAB, Changwon National University, Changwon, Korea

[GitHub](https://github.com/Thrillcrazyer/LoopUS) | Project Page | [Paper](https://arxiv.org/abs/2605.11011)
## Introduction
Looped Depth Up-Scaling (LoopUS) is a post-training framework that converts a standard pretrained LLM into a looped latent refinement model. Instead of extending output traces, LoopUS restructures the model into an encoder, a looped reasoning block, and a decoder, then performs iterative latent refinement in the hidden space. This approach enables test-time compute scaling and improves reasoning-oriented performance without requiring recurrent training from scratch.
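The dataflow can be summarized in a few lines. The sketch below is illustrative only: the module names (`encoder`, `loop_block`, `decoder`) and the `num_loops` argument are hypothetical stand-ins, not the repository's actual implementation.

```python
import torch
import torch.nn as nn

class LoopedLatentModel(nn.Module):
    """Illustrative sketch of the encoder -> looped block -> decoder layout.

    `encoder`, `loop_block`, and `decoder` stand in for contiguous groups of
    the pretrained model's transformer layers; all names are hypothetical.
    """

    def __init__(self, encoder: nn.Module, loop_block: nn.Module, decoder: nn.Module):
        super().__init__()
        self.encoder = encoder        # early layers: map inputs into latent space
        self.loop_block = loop_block  # middle layers: weights reused every iteration
        self.decoder = decoder       # late layers: map refined latents to logits

    def forward(self, hidden: torch.Tensor, num_loops: int = 4) -> torch.Tensor:
        h = self.encoder(hidden)
        # Iterative latent refinement: reapply the shared middle block.
        for _ in range(num_loops):
            h = self.loop_block(h)
        return self.decoder(h)
```

Because the looped block's weights are shared across iterations, the number of refinement steps can be chosen at inference time, which is what makes test-time compute scaling possible.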
## Quick Start
To use this model, clone the official repository and run the provided scripts:

```bash
git clone https://github.com/Thrillcrazyer/LoopUS.git
cd LoopUS

# Install dependencies
uv sync

# Run the chat interface
uv run chat.py --model-name Thrillcrazyer/Qwen3_1.7B_LoopUS
```
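If you prefer loading the checkpoint programmatically, a minimal sketch with Hugging Face `transformers` is shown below. It assumes the checkpoint loads through `AutoModelForCausalLM` with `trust_remote_code=True`; the repository's own scripts remain the supported entry point, and any loop-count option exposed at load or generation time is not covered here.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Thrillcrazyer/Qwen3_1.7B_LoopUS"

# trust_remote_code is an assumption: looped models typically ship custom
# architecture code alongside the checkpoint.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Explain why iterative latent refinement can help reasoning."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```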
## Illustration of LoopUS
## Citation
If you find LoopUS useful in your research, please cite the following paper:
```bibtex
@misc{park2026loopus,
  title={LoopUS: Recasting Pretrained LLMs into Looped Latent Refinement Models},
  author={Taekhyun Park and Yongjae Lee and Dohee Kim and Hyerim Bae},
  year={2026},
  eprint={2605.11011},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2605.11011},
}
```