---
language:
  - code
license: mit
task_categories:
  - graph-ml
  - text-classification
tags:
  - code
  - ast
  - gnn
  - graph-neural-network
  - ruby
  - complexity-prediction
  - code-generation
  - negative-results
size_categories:
  - 10K<n<100K
---

# GNN Ruby Code Study

A systematic study of Graph Neural Network (GNN) architectures for Ruby code complexity prediction and generation.

**Paper:** *Graph Neural Networks for Ruby Code Complexity Prediction and Generation: A Systematic Architecture Study*

## Dataset

22,452 Ruby methods parsed into AST graphs with 74-dimensional node features.

| Split      | Samples | File                  |
|------------|---------|-----------------------|
| Train      | 19,084  | `dataset/train.jsonl` |
| Validation | 3,368   | `dataset/val.jsonl`   |

Each JSONL record contains:

- `repo_name`: source repository
- `file_path`: original file path
- `raw_source`: raw Ruby source code
- `complexity_score`: McCabe cyclomatic complexity
- `ast_json`: full AST as nested JSON (node types + literal values)
- `id`: unique identifier
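Records can be read line by line with the standard library. The snippet below is a minimal sketch; the sample record is hand-made to match the schema above, and all of its values are invented:

```python
import json

# Invented example record following the schema above (values are illustrative,
# not taken from the dataset).
sample_line = json.dumps({
    "id": "example-0001",
    "repo_name": "rails/rails",
    "file_path": "lib/example.rb",
    "raw_source": 'def greet(name)\n  name ? "hi #{name}" : "hi"\nend',
    "complexity_score": 2,
    "ast_json": {"type": "def", "children": []},
})

# In practice you would iterate over open("dataset/train.jsonl"), one record per line.
record = json.loads(sample_line)
graph_root = record["ast_json"]        # nested AST: node types + literal values
target = record["complexity_score"]    # McCabe cyclomatic complexity label
```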

## Node Features (74D)

- One-hot encoding of 73 AST node types (`def`, `send`, `args`, `lvar`, `str`, ...) plus 1 unknown slot
- The 73 types cover Ruby AST nodes; literal values (identifiers, strings, numbers) all map to the unknown slot
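The featurization above can be sketched as follows. This is an illustrative reconstruction, not the study's `data_processing.py`: the node-type list is a small subset of the real 73-type vocabulary, and `ast_to_graph` assumes the nested `ast_json` layout is `{"type": ..., "children": [...]}`.

```python
# Small illustrative subset of AST node types; the real vocabulary has 73.
NODE_TYPES = ["def", "send", "args", "lvar", "str", "int", "if"]
TYPE_TO_IDX = {t: i for i, t in enumerate(NODE_TYPES)}
UNKNOWN_IDX = len(NODE_TYPES)  # literals and unseen types share the final slot

def one_hot(node_type):
    """74-D in the real setup (73 types + 1 unknown); 8-D in this sketch."""
    vec = [0.0] * (len(NODE_TYPES) + 1)
    vec[TYPE_TO_IDX.get(node_type, UNKNOWN_IDX)] = 1.0
    return vec

def ast_to_graph(root):
    """Walk the nested AST, returning node feature vectors and parent->child edges."""
    feats, edges, stack = [], [], [(root, None)]
    while stack:
        node, parent = stack.pop()
        idx = len(feats)
        feats.append(one_hot(node.get("type", "")))
        if parent is not None:
            edges.append((parent, idx))
        for child in node.get("children", []):
            stack.append((child, idx))
    return feats, edges
```

Note how any literal value falls into the single unknown slot; this is the representation bottleneck the key findings below attribute the 0% syntax validity to.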

## Key Findings

  1. 5-layer GraphSAGE achieves MAE 4.018 (R² = 0.709) for complexity prediction — 16% better than 3-layer baseline (9.9σ significant)
  2. GNN autoencoders produce 0% valid Ruby across all 15+ tested configurations
  3. The literal value bottleneck: Teacher-forced GIN achieves 81% node type accuracy and 99.5% type diversity, but 0% syntax validity because 47% of AST elements are literals with no learnable representation
  4. Chain decoders collapse: 93% of predictions default to UNKNOWN without structural supervision
  5. Total cost: ~$4.32 across 51 GPU experiments on Vast.ai RTX 4090 + local RTX 2070 SUPER

## Repository Structure

```
├── paper.md                           # Full research paper
├── dataset/
│   ├── train.jsonl                    # 19,084 Ruby methods (37 MB)
│   └── val.jsonl                      # 3,368 Ruby methods (6.5 MB)
├── models/
│   ├── encoder_sage_5layer.pt         # Pre-trained SAGE encoder
│   └── decoders/                      # Trained decoder checkpoints
│       ├── tf-gin-256-deep.pt         # Best: teacher-forced GIN, 5 layers
│       ├── tf-gin-{128,256,512}.pt    # Dimension ablation
│       └── chain-gin-256.pt           # Control (no structural supervision)
├── results/
│   ├── fleet_experiments.json         # All Vast.ai experiment metrics
│   ├── autonomous_research.json       # 18 baseline variance replicates
│   └── gin_deep_dive/                 # Local deep-dive analysis
│       ├── summary.json               # Ablation summary table
│       └── *_results.json             # Per-config detailed results
├── experiments/                       # Ratiocinator fleet YAML specs
├── specs/                             # Ratiocinator research YAML specs
├── src/                               # Model source code
│   ├── models.py                      # GNN architectures
│   ├── data_processing.py             # AST→graph pipeline
│   ├── loss.py                        # Loss functions
│   ├── train.py                       # Complexity prediction trainer
│   └── train_autoencoder.py           # Autoencoder trainer
└── scripts/                           # Runner and evaluation scripts
```

## Reproducing Results

```bash
# Clone the experiment branch
git clone -b experiment/ratiocinator-gnn-study https://github.com/timlawrenz/jubilant-palm-tree
cd jubilant-palm-tree

# Install dependencies
python -m venv .venv && source .venv/bin/activate
pip install torch torchvision torch_geometric

# Train complexity prediction (Track 1)
python train.py --conv_type SAGE --num_layers 5 --epochs 50

# Train autoencoder with teacher-forced GIN decoder (Track 4)
python train_autoencoder.py --decoder_conv_type GIN --decoder_edge_mode teacher_forced --epochs 30

# Run the full deep-dive ablation
python scripts/gin_deep_dive.py
```

## Source Code

Model and training code lives on the `experiment/ratiocinator-gnn-study` branch of [timlawrenz/jubilant-palm-tree](https://github.com/timlawrenz/jubilant-palm-tree).

## Citation

If you use this dataset or findings, please cite:

```bibtex
@misc{lawrenz2025gnnruby,
  title={Graph Neural Networks for Ruby Code Complexity Prediction and Generation: A Systematic Architecture Study},
  author={Tim Lawrenz},
  year={2025},
  howpublished={\url{https://huggingface.co/datasets/timlawrenz/gnn-ruby-code-study}}
}
```