# GNN Ruby Code Study
Systematic study of Graph Neural Network architectures for Ruby code complexity prediction and generation.
## Dataset
22,452 Ruby methods parsed into AST graphs with 74-dimensional node features.
| Split | Samples | File |
|---|---|---|
| Train | 19,084 | dataset/train.jsonl |
| Validation | 3,368 | dataset/val.jsonl |
Each JSONL record contains:
- `repo_name`: Source repository
- `file_path`: Original file path
- `raw_source`: Raw Ruby source code
- `complexity_score`: McCabe cyclomatic complexity
- `ast_json`: Full AST as nested JSON (node types + literal values)
- `id`: Unique identifier
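Records can be loaded with standard JSON tooling, one object per line. A minimal sketch; the field values below are invented for illustration, not drawn from the dataset:

```python
import json

# Hypothetical JSONL line following the documented schema; every value here
# is made up for this example.
line = json.dumps({
    "id": "example-0001",
    "repo_name": "rails/rails",
    "file_path": "lib/example.rb",
    "raw_source": "def add(a, b)\n  a + b\nend",
    "complexity_score": 1,
    "ast_json": {"type": "def", "children": []},
})

record = json.loads(line)
ast = record["ast_json"]          # nested dict: node types + literal values
label = record["complexity_score"]  # McCabe cyclomatic complexity
```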
## Node Features (74D)
- One-hot encoding of 73 AST node types (def, send, args, lvar, str, ...) + 1 unknown
- Types cover Ruby AST nodes; literal values (identifiers, strings, numbers) map to unknown
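The featurization above can be sketched as follows. The five node types listed here are an illustrative subset of the real 73-type vocabulary; in the actual 74-D encoding the unknown bucket sits at index 73:

```python
# Illustrative subset of the 73 Ruby AST node types; the real vocabulary is larger.
NODE_TYPES = ["def", "send", "args", "lvar", "str"]
UNKNOWN = len(NODE_TYPES)  # last slot: literal values and unseen types

def encode(node_type, dim=len(NODE_TYPES) + 1):
    """One-hot encode an AST node type; literals fall into the unknown bucket."""
    vec = [0.0] * dim
    idx = NODE_TYPES.index(node_type) if node_type in NODE_TYPES else UNKNOWN
    vec[idx] = 1.0
    return vec

send_vec = encode("send")      # one-hot at the 'send' position
literal_vec = encode("my_var")  # identifier literal -> unknown bucket
```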
## Key Findings
- 5-layer GraphSAGE achieves MAE 4.018 (R² = 0.709) for complexity prediction, 16% better than the 3-layer baseline (9.9σ significant)
- GNN autoencoders produce 0% valid Ruby across all 15+ tested configurations
- The literal value bottleneck: Teacher-forced GIN achieves 81% node type accuracy and 99.5% type diversity, but 0% syntax validity because 47% of AST elements are literals with no learnable representation
- Chain decoders collapse: 93% of predictions default to UNKNOWN without structural supervision
- Total cost: ~$4.32 across 51 GPU experiments on Vast.ai RTX 4090 + local RTX 2070 SUPER
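The MAE and R² figures above follow their standard definitions; a minimal sketch with toy values (not dataset results):

```python
def mae(y_true, y_pred):
    """Mean absolute error: average magnitude of prediction errors."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - residual SS / total SS."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Toy complexity targets vs. predictions
score_mae = mae([1, 2, 3, 4], [1, 2, 3, 5])
score_r2 = r2([1, 2, 3, 4], [1, 2, 3, 5])
```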
## Repository Structure
```text
.
├── paper.md                       # Full research paper
├── dataset/
│   ├── train.jsonl                # 19,084 Ruby methods (37 MB)
│   └── val.jsonl                  # 3,368 Ruby methods (6.5 MB)
├── models/
│   ├── encoder_sage_5layer.pt     # Pre-trained SAGE encoder
│   └── decoders/                  # Trained decoder checkpoints
│       ├── tf-gin-256-deep.pt     # Best: teacher-forced GIN, 5 layers
│       ├── tf-gin-{128,256,512}.pt  # Dimension ablation
│       └── chain-gin-256.pt       # Control (no structural supervision)
├── results/
│   ├── fleet_experiments.json     # All Vast.ai experiment metrics
│   ├── autonomous_research.json   # 18 baseline variance replicates
│   └── gin_deep_dive/             # Local deep-dive analysis
│       ├── summary.json           # Ablation summary table
│       └── *_results.json         # Per-config detailed results
├── experiments/                   # Ratiocinator fleet YAML specs
├── specs/                         # Ratiocinator research YAML specs
├── src/                           # Model source code
│   ├── models.py                  # GNN architectures
│   ├── data_processing.py         # AST→graph pipeline
│   ├── loss.py                    # Loss functions
│   ├── train.py                   # Complexity prediction trainer
│   └── train_autoencoder.py       # Autoencoder trainer
└── scripts/                       # Runner and evaluation scripts
```
## Reproducing Results
```bash
# Clone the experiment branch
git clone -b experiment/ratiocinator-gnn-study https://github.com/timlawrenz/jubilant-palm-tree
cd jubilant-palm-tree

# Install dependencies
python -m venv .venv && source .venv/bin/activate
pip install torch torchvision torch_geometric

# Train complexity prediction (Track 1)
python train.py --conv_type SAGE --num_layers 5 --epochs 50

# Train autoencoder with teacher-forced GIN decoder (Track 4)
python train_autoencoder.py --decoder_conv_type GIN --decoder_edge_mode teacher_forced --epochs 30

# Run the full deep-dive ablation
python scripts/gin_deep_dive.py
```
## Source Code
- Model code: jubilant-palm-tree (branch: `experiment/ratiocinator-gnn-study`)
- Orchestrator: ratiocinator
## Citation
If you use this dataset or findings, please cite:
```bibtex
@misc{lawrenz2025gnnruby,
  title={Graph Neural Networks for Ruby Code Complexity Prediction and Generation: A Systematic Architecture Study},
  author={Tim Lawrenz},
  year={2025},
  howpublished={\url{https://huggingface.co/datasets/timlawrenz/gnn-ruby-code-study}}
}
```