Text Generation
Transformers
PyTorch
Safetensors
English
gpt_neox
causal-lm
pythia
text-generation-inference
stellaathena committed on
Commit e556ace · verified · 1 Parent(s): fb2af1b

Add note about prior deduplicated model at this URL

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -23,6 +23,8 @@ library_name: transformers
 - **License:** Apache 2.0
 - **Contact:** Join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn) and post in `#release-discussion`. For general correspondence: contact@eleuther.ai
 
+> **Note:** Prior to Feb 27th, 2026, a model was hosted at this URL that was trained on the deduplicated Pile. That model is now at [EleutherAI/pythia-31m-deduped](https://huggingface.co/EleutherAI/pythia-31m-deduped) and this model was correctly trained on the standard Pile. We apologize for any confusion this has caused.
+
 > **Note:** Pythia-31M was trained after the original Pythia suite at the request of interpretability researchers who wanted a smaller model with the same training setup. It uses the same tokenizer, hyperparameter conventions, and checkpoint schedule as the rest of the Pythia suite. A deduplicated variant is available at [`EleutherAI/pythia-31m-deduped`](https://huggingface.co/EleutherAI/pythia-31m-deduped).
 
 **Model Configuration:**