SAGE Topology Orchestrator
Base: nvidia/Nemotron-Orchestrator-8B (original weights, tool-calling preserved)
Training: DAPO via verl 0.7.1 on 2x H100 NVL 94GB
Format: JSON tool-calls (<tool_call> native format, NOT YAML)
Framework: YGN-SAGE
Why JSON, not YAML
Nemotron-Orchestrator-8B was GRPO-trained by NVIDIA to emit structured JSON tool calls. An earlier YAML-based training run produced a 91% malformation rate; training on JSON keeps the model in its native output format.
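A minimal sketch of emitting and validating a call in the native <tool_call> wrapper. The inner {"name": ..., "arguments": ...} schema is an assumption based on common JSON tool-calling conventions, not something this card specifies:

```python
import json

def format_tool_call(name, arguments):
    """Wrap a tool invocation in the <tool_call> JSON format.

    NOTE: the {"name": ..., "arguments": ...} payload shape is an
    assumed convention, not confirmed by the model card.
    """
    payload = json.dumps({"name": name, "arguments": arguments})
    return f"<tool_call>{payload}</tool_call>"

def parse_tool_call(text):
    """Strip the wrapper and parse the JSON payload back out."""
    inner = text[len("<tool_call>"):-len("</tool_call>")]
    return json.loads(inner)

call = format_tool_call(
    "add_node",
    {"role": "planner", "model_tier": "small", "prompt": "Plan the task"},
)
print(parse_tool_call(call)["name"])
```

Because the payload is plain JSON, a reward function can check well-formedness with a single json.loads, which is what made the YAML malformation problem easy to measure.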
Training Status
Not yet started: dataset conversion and script preparation in progress (March 30, 2026).
SAGE Modules as Tools
The model learns to orchestrate YGN-SAGE modules:
- add_node(role, model_tier, prompt): create topology node
- add_edge(from_idx, to_idx): connect nodes
- set_reasoning(text): explain topology choice
- checkpoint(node_idx, fallback_tier): mark adaptation point
- upgrade(node_idx, new_tier): upgrade model at checkpoint (Phase C)
- continue(): proceed to next node (Phase C)
- reroute(): abort and restart (Phase C)
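The call sequence above can be sketched as a toy topology builder. The Topology class and the two-node pipeline below are illustrative assumptions; only the tool names and signatures come from this card, and the Phase C tools (upgrade/continue/reroute) are omitted:

```python
class Topology:
    """Hypothetical in-memory target for the orchestration tool calls."""

    def __init__(self):
        self.nodes = []
        self.edges = []
        self.reasoning = ""
        self.checkpoints = {}  # node index -> fallback tier

    def add_node(self, role, model_tier, prompt):
        self.nodes.append({"role": role, "model_tier": model_tier, "prompt": prompt})
        return len(self.nodes) - 1  # index used by add_edge/checkpoint

    def add_edge(self, from_idx, to_idx):
        self.edges.append((from_idx, to_idx))

    def set_reasoning(self, text):
        self.reasoning = text

    def checkpoint(self, node_idx, fallback_tier):
        self.checkpoints[node_idx] = fallback_tier

# A two-node plan-then-solve pipeline with an adaptation point on the solver.
t = Topology()
planner = t.add_node("planner", "small", "Break the task into steps")
solver = t.add_node("solver", "large", "Execute the plan")
t.add_edge(planner, solver)
t.checkpoint(solver, fallback_tier="medium")
t.set_reasoning("Plan-then-solve; solver checkpointed so Phase C can adapt its tier.")
```

The returned node indices are what make add_edge and checkpoint composable: each structural tool call refers back to earlier calls by index rather than by name.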
Previous Model
See yannabadie/sage-topology-policy-v2 for the YAML-based training history (Phase A step 1050, DAPO step 300).