SAGE Topology Orchestrator

Base: nvidia/Nemotron-Orchestrator-8B (original weights, tool-calling preserved)
Training: DAPO via verl 0.7.1 on 2x H100 NVL 94GB
Format: JSON tool calls (native <tool_call> format, NOT YAML)
Framework: YGN-SAGE

Why JSON, not YAML

Nemotron-Orchestrator-8B was GRPO-trained by NVIDIA to emit structured JSON tool calls. Earlier YAML-based training produced a 91% malformation rate; training on JSON matches the model's native output format.
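A minimal sketch of what a native-format tool call looks like and how it parses cleanly as JSON. The exact field names ("name", "arguments") and the argument values are assumptions for illustration, not taken from the card:

```python
import json

# Hypothetical tool call in the model's native <tool_call> wrapper.
# Field names and values are illustrative assumptions.
raw = ('<tool_call>{"name": "add_node", "arguments": '
       '{"role": "planner", "model_tier": "small", '
       '"prompt": "Decompose the task."}}</tool_call>')

# Strip the wrapper tags and parse the JSON payload.
payload = raw.removeprefix("<tool_call>").removesuffix("</tool_call>")
call = json.loads(payload)
print(call["name"])               # add_node
print(call["arguments"]["role"])  # planner
```

Because the payload is plain JSON, malformation is a single `json.loads` failure rather than the ambiguous indentation errors YAML invites.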

Training Status

Not yet started; dataset conversion and script preparation in progress (March 30, 2026).

SAGE Modules as Tools

The model learns to orchestrate YGN-SAGE modules:

  • add_node(role, model_tier, prompt) - create topology node
  • add_edge(from_idx, to_idx) - connect nodes
  • set_reasoning(text) - explain topology choice
  • checkpoint(node_idx, fallback_tier) - mark adaptation point
  • upgrade(node_idx, new_tier) - upgrade model at checkpoint (Phase C)
  • continue() - proceed to next node (Phase C)
  • reroute() - abort and restart (Phase C)
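A sketch of replaying a sequence of such tool calls to build a topology. The tool names and signatures follow the list above; the in-memory representation (node dicts plus an edge list) and the example call sequence are assumptions for illustration:

```python
# Replay a hypothetical sequence of JSON tool calls into a topology.
# The storage scheme (lists of dicts/tuples) is an assumption.
nodes, edges, reasoning = [], [], []

def add_node(role, model_tier, prompt):
    # Create a topology node; its index is its position in `nodes`.
    nodes.append({"role": role, "model_tier": model_tier, "prompt": prompt})

def add_edge(from_idx, to_idx):
    # Connect two existing nodes by index.
    edges.append((from_idx, to_idx))

def set_reasoning(text):
    # Record the model's explanation for this topology choice.
    reasoning.append(text)

TOOLS = {"add_node": add_node, "add_edge": add_edge,
         "set_reasoning": set_reasoning}

# Illustrative call sequence, as already-parsed JSON payloads.
calls = [
    {"name": "add_node", "arguments": {"role": "planner",
     "model_tier": "small", "prompt": "Plan the task."}},
    {"name": "add_node", "arguments": {"role": "solver",
     "model_tier": "large", "prompt": "Execute the plan."}},
    {"name": "add_edge", "arguments": {"from_idx": 0, "to_idx": 1}},
    {"name": "set_reasoning", "arguments":
     {"text": "Two-stage plan-then-solve topology."}},
]
for call in calls:
    TOOLS[call["name"]](**call["arguments"])

print(len(nodes), len(edges))  # 2 1
```

The Phase C tools (upgrade, continue, reroute) would plug into the same dispatch table once checkpoint-time adaptation is trained.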

Previous Model

See yannabadie/sage-topology-policy-v2 for the YAML-based training history (Phase A step 1050, DAPO step 300).

