Hprairie and nielsr (HF Staff) committed
Commit 236739c · 1 parent: d9b42bf

Add model card and metadata (#1)

- Add model card and metadata (1f4e4d3ca117cd11d7fcd618978e91f9bea2e0f1)


Co-authored-by: Niels Rogge <nielsr@users.noreply.huggingface.co>

Files changed (1): README.md (+44 −0)
README.md ADDED
@@ -0,0 +1,44 @@
---
pipeline_tag: text-generation
---

# Parcae: Scaling Laws For Stable Looped Language Models

[**Paper**](https://huggingface.co/papers/2604.12946) | [**Project Page**](https://sandyresearch.github.io/parcae/) | [**GitHub**](https://github.com/sandyresearch/parcae)

Parcae is a stable looped architecture for language models. Whereas traditional fixed-depth architectures scale by increasing parameter count, looped architectures increase compute (FLOPs) by sending activations through a block of layers in a loop. Parcae addresses the training instabilities of prior looped models by recasting looping as a nonlinear time-variant dynamical system and constraining the spectral norm of the injection parameters.
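To make the looping idea concrete, here is a minimal numerical sketch (not the `parcae-lm` API — the weight shapes, `tanh` nonlinearity, and update rule are illustrative assumptions): the same block weights are reused across loop iterations, so compute grows with loop count while the parameter count stays fixed, and the input-injection matrix is rescaled so its spectral norm is bounded.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden dimension

# Shared block weights: reused at every loop step (no new parameters per step).
W = rng.standard_normal((d, d)) / np.sqrt(d)

# Hypothetical injection parameters, rescaled to spectral norm <= 1 in the
# spirit of the constraint described above (exact scheme is in the paper).
W_inj = rng.standard_normal((d, d))
W_inj = W_inj / np.linalg.norm(W_inj, ord=2)

def looped_forward(x, n_loops):
    # Iterate the same block: h_{t+1} = tanh(h_t W + x W_inj).
    # Injecting x at every step makes this a nonlinear time-variant system.
    h = np.zeros_like(x)
    for _ in range(n_loops):
        h = np.tanh(h @ W + x @ W_inj)
    return h

x = rng.standard_normal((1, d))
out4 = looped_forward(x, n_loops=4)  # 4 passes through the shared block
out8 = looped_forward(x, n_loops=8)  # 2x the FLOPs, identical parameters
```

The bounded spectral norm keeps the injected signal from blowing up as the number of loop iterations grows, which is the stability issue the model card attributes to prior looped models.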
This checkpoint is the 140M-parameter version of Parcae, trained on the FineWeb-Edu dataset.

## Installation

To use this model, install the `parcae-lm` package:

```bash
pip install parcae-lm
```
## Usage

You can load the pretrained weights with the following code:

```python
import parcae_lm

# Load a pretrained model from the Hugging Face Hub
model = parcae_lm.from_pretrained("SandyResearch/parcae-small-140m")
```
## Citation

```bibtex
@misc{prairie2026parcaescalinglawsstable,
  title={Parcae: Scaling Laws For Stable Looped Language Models},
  author={Hayden Prairie and Zachary Novack and Taylor Berg-Kirkpatrick and Daniel Y. Fu},
  year={2026},
  eprint={2604.12946},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2604.12946},
}
```