---
license: odc-by
task_categories:
  - text-generation
language:
  - en
pretty_name: Parameter Golf Competition Data
size_categories:
  - 1B<n<10B
tags:
  - parameter-golf
  - fineweb
  - language-modeling
  - competition
---

# Parameter Golf Competition Data

Pre-tokenized [FineWeb](https://huggingface.co/datasets/HuggingFaceFW/fineweb) shards for the [OpenAI Parameter Golf](https://github.com/openai/parameter-golf) competition.

**Two tokenizations included:**
- **SP1024** — SentencePiece BPE with a 1024-token vocabulary (competition default)
- **Scylla** — TokenMonster-derived, 998-token vocabulary (community alternative from [PR #1143](https://github.com/openai/parameter-golf/pull/1143))

## Why this exists

Every time you launch a GPU pod, re-downloading 16+ GB of data from the competition repo costs 10-30 minutes of billable GPU time. This dataset provides the same data via `huggingface-cli download` — fast, resumable, and available from any provider (RunPod, Modal, Colab, Vast.ai).

## Quick Start

```bash
# Full SP1024 dataset (~11 GB)
huggingface-cli download LightSpeedUp/parameter-golf-data --include "fineweb_sp1024/*" --local-dir /workspace/data

# Full Scylla dataset (~11 GB)
huggingface-cli download LightSpeedUp/parameter-golf-data --include "fineweb_scylla/*" --local-dir /workspace/data

# Tokenizers only
huggingface-cli download LightSpeedUp/parameter-golf-data --include "tokenizers/*" --local-dir /workspace/data

# Mini subset for smoke tests (~2 GB, 10 shards + val)
# (pass both glob patterns to a single --include flag)
huggingface-cli download LightSpeedUp/parameter-golf-data \
    --include "fineweb_sp1024/fineweb_train_00000?.bin" "fineweb_sp1024/fineweb_val*" \
    --local-dir /workspace/data

# Val only (~200 MB)
huggingface-cli download LightSpeedUp/parameter-golf-data \
    --include "fineweb_sp1024/fineweb_val*" \
    --local-dir /workspace/data
```
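
The same downloads can be scripted from Python. This is a minimal sketch using `huggingface_hub.snapshot_download`, which accepts the same glob patterns as `--include`; adjust `allow_patterns` to whichever subset you need.

```python
# Programmatic equivalent of the CLI commands above.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="LightSpeedUp/parameter-golf-data",
    repo_type="dataset",
    allow_patterns=["fineweb_sp1024/*", "tokenizers/*"],  # same globs as --include
    local_dir="/workspace/data",
)
```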

## Dataset Structure

```
parameter-golf-data/
├── fineweb_sp1024/
│   ├── fineweb_train_000000.bin ... fineweb_train_000079.bin  (80 shards, ~11 GB)
│   └── fineweb_val_000000.bin                                 (1 shard, ~200 MB)
├── fineweb_scylla/
│   ├── fineweb_train_000000.bin ... fineweb_train_000079.bin  (80 shards, ~11 GB)
│   └── fineweb_val_000000.bin                                 (1 shard)
├── tokenizers/
│   ├── fineweb_1024_bpe.model         (SP1024 SentencePiece)
│   ├── scylla/candidate.vocab         (TokenMonster 998)
│   └── scylla/candidate.meta.npz      (byte LUTs)
└── SHA256SUMS.txt                     (integrity manifest)
```
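
The train and val shards are raw binary token streams; their exact on-disk layout (any header, and the integer dtype) is defined by the competition repo's data loader, not by this card. The sketch below is therefore only a way to poke at a downloaded shard and the SP1024 tokenizer: it assumes headerless little-endian uint16 token IDs, a common convention for pre-tokenized `.bin` shards, and it does not cover the Scylla vocabulary, which needs the custom retokenization pipeline.

```python
# Minimal sketch for inspecting a downloaded shard and the SP1024 tokenizer.
# ASSUMPTION: shards are headerless little-endian uint16 token IDs; defer to
# the competition repo's data loader for the authoritative format.
import numpy as np
import sentencepiece as spm

DATA_DIR = "/workspace/data"  # same --local-dir as in the Quick Start

# Memory-map one training shard so nothing is read into RAM up front.
tokens = np.memmap(
    f"{DATA_DIR}/fineweb_sp1024/fineweb_train_000000.bin",
    dtype=np.uint16,
    mode="r",
)
print(f"{len(tokens):,} tokens in shard 0; first ten: {tokens[:10].tolist()}")

# Round-trip a short window through the SP1024 SentencePiece model.
sp = spm.SentencePieceProcessor(model_file=f"{DATA_DIR}/tokenizers/fineweb_1024_bpe.model")
print(sp.decode(tokens[:64].tolist()))
```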

## Data Integrity

Verify your download:
```bash
cd /workspace/data
sha256sum -c SHA256SUMS.txt
```
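
If `sha256sum` is not available on your machine, the manifest can be checked with a short Python script instead. This is a sketch that assumes `SHA256SUMS.txt` uses the standard `sha256sum` line format (`<hex digest>  <relative path>`).

```python
# Check every file listed in SHA256SUMS.txt against its recorded digest.
import hashlib
from pathlib import Path

data_dir = Path("/workspace/data")

def sha256_of(path: Path, chunk: int = 1 << 20) -> str:
    """Stream the file so multi-GB shards are not loaded into RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

for line in (data_dir / "SHA256SUMS.txt").read_text().splitlines():
    if not line.strip():
        continue
    expected, name = line.split(maxsplit=1)
    name = name.lstrip("*")  # sha256sum marks binary-mode entries with a leading '*'
    status = "OK" if sha256_of(data_dir / name) == expected else "MISMATCH"
    print(f"{name}: {status}")
```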

## Provenance

- **Source:** [FineWeb](https://huggingface.co/datasets/HuggingFaceFW/fineweb) (CommonCrawl-derived, by Hugging Face)
- **SP1024 tokenization:** SentencePiece BPE trained on FineWeb, 1024-token vocabulary — from the [openai/parameter-golf](https://github.com/openai/parameter-golf) competition repo
- **Scylla tokenization:** TokenMonster vocabulary (998 tokens) by [@simon-marcus](https://github.com/simon-marcus) ([PR #1143](https://github.com/openai/parameter-golf/pull/1143)). Retokenized using our [retokenize_scylla.py](https://github.com/MatoTeziTanka/parameter-golf-private) pipeline.
- **No modification** to token sequences — these are byte-identical to what you'd get by cloning the competition repo and running the tokenizer yourself.

## Attribution Chain

CommonCrawl (CommonCrawl Terms of Use) → FineWeb by Hugging Face (ODC-By 1.0) → This dataset (ODC-By 1.0)

## License

**Data:** [Open Data Commons Attribution License (ODC-By 1.0)](https://opendatacommons.org/licenses/by/1-0/) — required by the upstream FineWeb license. You may use, share, and adapt this data with attribution.

**Retokenization code:** Apache 2.0 — see [PATENTS.md](PATENTS.md) for patent boundary notice.

## Community

- [The Agora](https://matotezitanka.github.io/parameter-golf) — live leaderboard + compliance tracker
- [Issue #942](https://github.com/openai/parameter-golf/issues/942) — compute resources discussion
- [Issue #140](https://github.com/openai/parameter-golf/issues/140) — competition discussion thread

Built by [Light Speed Up](https://lightspeedup.com) for the Parameter Golf community.