---
license: odc-by
task_categories:
  - text-generation
language:
  - en
pretty_name: Parameter Golf Competition Data
size_categories:
  - 1B<n<10B
tags:
  - parameter-golf
  - fineweb
  - language-modeling
  - competition
---

# Parameter Golf Competition Data

Pre-tokenized FineWeb shards for the OpenAI Parameter Golf competition.

Two tokenizations are included:

- SP1024: SentencePiece BPE with a 1024-token vocabulary (competition default)
- Scylla: TokenMonster-derived, 998-token vocabulary (community alternative from PR #1143)

## Why this exists

Every time you launch a GPU pod, re-downloading 16+ GB of data from the competition repo costs 10-30 minutes of billable GPU time. This dataset provides the same data via `huggingface-cli download`: fast, resumable, and available from any provider (RunPod, Modal, Colab, Vast.ai).

## Quick Start

```bash
# Full SP1024 dataset (~11 GB)
huggingface-cli download LightSpeedUp/parameter-golf-data --repo-type dataset \
    --include "fineweb_sp1024/*" --local-dir /workspace/data

# Full Scylla dataset (~11 GB)
huggingface-cli download LightSpeedUp/parameter-golf-data --repo-type dataset \
    --include "fineweb_scylla/*" --local-dir /workspace/data

# Tokenizers only
huggingface-cli download LightSpeedUp/parameter-golf-data --repo-type dataset \
    --include "tokenizers/*" --local-dir /workspace/data

# Mini subset for smoke tests (~2 GB, 10 shards + val).
# Note: --include takes multiple patterns in one flag; repeating the flag
# would keep only the last pattern.
huggingface-cli download LightSpeedUp/parameter-golf-data --repo-type dataset \
    --include "fineweb_sp1024/fineweb_train_00000?.bin" "fineweb_sp1024/fineweb_val*" \
    --local-dir /workspace/data

# Val only (~200 MB)
huggingface-cli download LightSpeedUp/parameter-golf-data --repo-type dataset \
    --include "fineweb_sp1024/fineweb_val*" \
    --local-dir /workspace/data
```
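
The same filtered downloads work from Python via `huggingface_hub.snapshot_download`, which is handy inside notebooks or pod launch scripts. A minimal sketch; the `allow_patterns` value mirrors the `--include` flags above:

```python
from huggingface_hub import snapshot_download

# Fetch only the SP1024 shards; downloads resume on interruption and
# reuse the local cache, so repeated pod launches stay cheap.
snapshot_download(
    repo_id="LightSpeedUp/parameter-golf-data",
    repo_type="dataset",
    allow_patterns=["fineweb_sp1024/*"],
    local_dir="/workspace/data",
)
```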

## Dataset Structure

```
parameter-golf-data/
├── fineweb_sp1024/
│   ├── fineweb_train_000000.bin ... fineweb_train_000079.bin  (80 shards, ~11 GB)
│   └── fineweb_val_000000.bin                                 (1 shard, ~200 MB)
├── fineweb_scylla/
│   ├── fineweb_train_000000.bin ... fineweb_train_000079.bin  (80 shards, ~11 GB)
│   └── fineweb_val_000000.bin                                 (1 shard)
├── tokenizers/
│   ├── fineweb_1024_bpe.model        (SP1024 SentencePiece model)
│   ├── scylla/candidate.vocab        (TokenMonster 998 vocabulary)
│   └── scylla/candidate.meta.npz     (byte LUTs)
└── SHA256SUMS.txt                    (integrity manifest)
```
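
To sanity-check a downloaded shard, you can memory-map it and decode a few tokens. A minimal sketch, assuming the shards are headerless streams of little-endian `uint16` token IDs (a common layout for pre-tokenized data; check the competition repo for the authoritative format):

```python
import numpy as np
import sentencepiece as spm

# ASSUMPTION: headerless little-endian uint16 token IDs. If the competition
# format prepends a header, skip it with np.memmap's `offset` argument.
tokens = np.memmap(
    "/workspace/data/fineweb_sp1024/fineweb_val_000000.bin",
    dtype="<u2", mode="r",
)
print(f"{len(tokens):,} tokens in shard")

# Decode a small window with the matching SentencePiece model.
sp = spm.SentencePieceProcessor(
    model_file="/workspace/data/tokenizers/fineweb_1024_bpe.model"
)
print(sp.decode(tokens[:64].tolist()))
```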

## Data Integrity

Verify your download:

```bash
cd /workspace/data
sha256sum -c SHA256SUMS.txt
```
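
If `sha256sum` isn't available (macOS and minimal Windows images often lack it), a small Python equivalent works anywhere. A sketch, assuming the manifest uses the standard `<hex>  <path>` lines that `sha256sum` produces:

```python
import hashlib
from pathlib import Path

data_dir = Path("/workspace/data")

# Verify each "<hex>  <path>" manifest line against the file on disk,
# hashing in 1 MiB chunks so multi-GB shards never load fully into memory.
for line in (data_dir / "SHA256SUMS.txt").read_text().splitlines():
    expected, name = line.split(maxsplit=1)
    h = hashlib.sha256()
    with open(data_dir / name, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    print(f"{name}: {'OK' if h.hexdigest() == expected else 'FAILED'}")
```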

## Provenance

- Source: FineWeb (CommonCrawl-derived, by Hugging Face)
- SP1024 tokenization: SentencePiece BPE trained on FineWeb, 1024-token vocabulary, from the openai/parameter-golf competition repo
- Scylla tokenization: TokenMonster vocabulary (998 tokens) by @simon-marcus (PR #1143), retokenized with our `retokenize_scylla.py` pipeline
- No modifications to the token sequences: they are byte-identical to what you'd get by cloning the competition repo and running the tokenizer yourself

## Attribution Chain

CommonCrawl (CC-BY) → FineWeb (Hugging Face, ODC-By 1.0) → This dataset (ODC-By 1.0)

## License

Data: Open Data Commons Attribution License (ODC-By 1.0), as required by the upstream FineWeb license. You may use, share, and adapt this data with attribution.

Retokenization code: Apache 2.0; see PATENTS.md for the patent boundary notice.

## Community

Built by Light Speed Up for the Parameter Golf community.