---
base_model:
  - Qwen/Qwen3.5-2B
license: mit
pipeline_tag: text-generation
library_name: transformers
---

# Mixture-Summarizer-Qwen3.5-2B

🌐 Project Page | 💻 Code | 📄 Paper

We introduce RecursiveMAS, a multi-agent framework from the paper Recursive Multi-Agent Systems that scales agent collaboration through latent-space recursion.

RecursiveMAS treats a multi-agent system as a unified recursive computation, where heterogeneous agents iteratively exchange, refine, and evolve their latent states across recursion rounds. In the Mixture-Style setting, this Summarizer Agent integrates outputs from domain-specialized agents and produces the final response through recursive latent-space collaboration.
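To make the recursion idea concrete, here is a toy, framework-free sketch of agents iteratively exchanging and refining latent states across rounds. The averaging update rule and all names below are illustrative assumptions, not the paper's actual method:

```python
# Toy sketch of recursive latent-state exchange (illustrative only;
# the real RecursiveMAS update rule lives in the official repository).

def mix_round(states):
    """One recursion round: each agent pulls its latent state
    halfway toward the mean of all agents' states (toy rule)."""
    dim = len(states[0])
    mean = [sum(s[d] for s in states) / len(states) for d in range(dim)]
    return [[0.5 * s[d] + 0.5 * mean[d] for d in range(dim)] for s in states]

def recurse(states, rounds):
    """Run several recursion rounds over all agents' latent states."""
    for _ in range(rounds):
        states = mix_round(states)
    return states

# Three "domain agents" with 2-d latent states.
states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
final = recurse(states, rounds=4)
# The states contract toward a shared consensus as rounds increase.
```

Under this toy rule, each round halves every agent's deviation from the group mean, so the states converge geometrically toward consensus; the real system replaces the averaging with learned, heterogeneous agent updates.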

## Model Details

| Item | Description |
|---|---|
| Model | Mixture-Summarizer-Qwen3.5-2B |
| Collaboration Style | Mixture-Style |
| Agent Role | Summarizer Agent |
| Base Model | Qwen3.5-2B |

โš ๏ธ Note: This checkpoint is a role-specific agent in RecursiveMAS, rather than a standalone model intended for plain-text generation.

## Usage

To use this agent as part of the RecursiveMAS pipeline, you can load the system using the provided loader from the official repository:

```python
from system_loader import load_mas_system

# Load the whole Mixture-Style MAS pipeline
mas = load_mas_system(
    style="mixture",
    device="cuda",
    trust_remote_code=True,
)

# Access the Summarizer Agent's underlying model
summarizer = mas.agents["summarizer"].model
```
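In the Mixture-Style flow, domain-specialized agents answer in parallel and this Summarizer Agent fuses their outputs into the final response. A minimal mock of that control flow, with placeholder classes and method names that do not reflect the repository's actual API:

```python
# Mock of the Mixture-Style control flow; in the real system each agent
# wraps an LLM checkpoint and fusion happens in latent space, not text.

class MockDomainAgent:
    def __init__(self, domain):
        self.domain = domain

    def answer(self, question):
        # Placeholder for a domain-specialized agent's draft response.
        return f"[{self.domain}] draft answer to: {question}"

class MockSummarizer:
    def summarize(self, question, drafts):
        # Placeholder fusion step: just joins the text drafts.
        joined = " | ".join(drafts)
        return f"Final answer to '{question}' from {len(drafts)} drafts: {joined}"

domain_agents = [MockDomainAgent(d) for d in ("math", "code", "science")]
summarizer_agent = MockSummarizer()

question = "What is 2 + 2?"
drafts = [agent.answer(question) for agent in domain_agents]
final_answer = summarizer_agent.summarize(question, drafts)
```

The point of the mock is the topology: drafts fan out in parallel and converge on a single summarizer, which is why this checkpoint only makes sense inside the full pipeline.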

## Model Collections for RecursiveMAS

| Style | Model Collection |
|---|---|
| Sequential-Style | 🤗 HuggingFace |
| Mixture-Style | 🤗 HuggingFace |
| Distillation-Style | 🤗 HuggingFace |
| Deliberation-Style | 🤗 HuggingFace |

## Experiment Results

*(Figure: RecursiveMAS experiment results.)*

## Citation

```bibtex
@misc{recursivemas,
      title={Recursive Multi-Agent Systems},
      author={Xiyuan Yang and Jiaru Zou and Rui Pan and Ruizhong Qiu and Pan Lu and Shizhe Diao and Jindong Jiang and Hanghang Tong and Tong Zhang and Markus J. Buehler and Jingrui He and James Zou},
      year={2026},
      eprint={2604.25917},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2604.25917},
}
```