# ToolOmni
ToolOmni is a tool-use language model released with the ACL 2026 Main Conference paper *ToolOmni: Enabling Open-World Tool Use via Agentic Learning with Proactive Retrieval and Grounded Execution*.
This checkpoint is built on top of Qwen/Qwen3-4B-Instruct and is designed for open-world tool use. The model is trained to proactively retrieve relevant tools and generate grounded multi-step tool calls for downstream task completion.
## Model Description
- Model type: Causal language model for tool use
- Base model: Qwen/Qwen3-4B-Instruct
- Paper venue: ACL 2026 Main Conference
- Codebase: training, evaluation, retrieval, and tool execution utilities are available in the public repository
## Intended Uses
This model is intended for:
- research on tool-use agents
- benchmarking open-world tool retrieval and grounded execution
- studying retrieval-augmented and execution-aware training
- reproducing the ToolOmni evaluation pipeline
The model is expected to work best when paired with the ToolOmni codebase, retriever, and tool execution environment.
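As a rough illustration of how a request to the model might be assembled, the sketch below builds a chat-style message list carrying retrieved tool specs plus the user query. The tool name `search_flights` and its schema are hypothetical examples, not part of the released ToolOmni toolset; in practice the candidate tool list comes from the ToolOmni retriever, and the exact prompt layout is defined by the codebase.

```python
import json

def build_tool_prompt(query, tools):
    """Assemble a chat-style message list: a system turn carrying the
    retrieved tool specs (as JSON) followed by the user query."""
    system = (
        "You may call the following tools to answer the user.\nTools:\n"
        + json.dumps(tools, indent=2)
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": query},
    ]

# Illustrative tool spec; the real tool pool is supplied by the retriever.
tools = [{
    "name": "search_flights",
    "parameters": {"origin": "string", "destination": "string"},
}]
messages = build_tool_prompt("Find a flight from SFO to NRT.", tools)
```

The resulting `messages` list can then be rendered with the base model's chat template (e.g. `tokenizer.apply_chat_template(...)` with the Qwen3 tokenizer) before generation.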
## Training
ToolOmni follows an agentic learning framework with:
- proactive tool retrieval
- grounded tool execution
- reinforcement learning for multi-step tool-use behavior
The training and evaluation pipeline is released in the ToolOmni repository.
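The agentic retrieve-call-execute cycle described above can be sketched as a simple loop. All three components here (`retrieve_tools`, `model_step`, `execute`) are stand-in stubs for illustration only; the actual retriever, model interface, and sandboxed tool executor live in the ToolOmni repository.

```python
def retrieve_tools(query):
    # Proactive retrieval stub: return tool specs relevant to the query.
    return [{"name": "calculator"}]

def model_step(query, tools, observations):
    # Model stub: emit a tool call first, then a final answer once an
    # observation is available.
    if not observations:
        return {"tool": "calculator", "args": {"expr": "2+2"}}
    return {"answer": f"The result is {observations[-1]}."}

def execute(call):
    # Grounded execution stub; real tools are sandboxed APIs, not eval().
    return eval(call["args"]["expr"])

def agent(query, max_steps=4):
    tools, observations = retrieve_tools(query), []
    for _ in range(max_steps):
        action = model_step(query, tools, observations)
        if "answer" in action:
            return action["answer"]
        observations.append(execute(action))
    return None
```

Calling `agent("What is 2 + 2?")` walks one tool call and one answer turn; in training, trajectories of this shape are optimized with reinforcement learning.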
## Evaluation
ToolOmni is evaluated on ToolBench-style benchmarks in both:
- with-api-list / golden-tool settings
- open-domain settings without golden tool lists
Please refer to the project repository and paper for the detailed evaluation protocol and benchmark results.
## Repository
- Paper: https://arxiv.org/abs/2604.13787
- Code: https://github.com/Huangsz2021/ToolOmni
- Model: https://huggingface.co/bue0912/ToolOmni-Qwen3-4B
- Dataset: https://huggingface.co/datasets/bue0912/ToolOmni-Data
- Collection: https://huggingface.co/collections/bue0912/toolomni
## Citation
```bibtex
@misc{huang2026toolomnienablingopenworldtool,
  title={ToolOmni: Enabling Open-World Tool Use via Agentic Learning with Proactive Retrieval and Grounded Execution},
  author={Shouzheng Huang and Meishan Zhang and Baotian Hu and Min Zhang},
  year={2026},
  eprint={2604.13787},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2604.13787},
}
```
## License

This release is licensed under Apache-2.0. See the repository-level LICENSE file for details.