Speak or Stay Silent: Context-Aware Turn-Taking in Multi-Party Dialogue
Paper: arXiv:2603.11409
A LoRA fine-tune of Qwen2.5-7B-Instruct for proactive response prediction in multi-party meeting dialogue, trained with chain-of-thought reasoning.
Given the conversational context at a decision point, the model predicts whether a target speaker will SPEAK or remain SILENT. It emits a short reasoning explanation before the decision, which supports interpretability and improves accuracy.
The model generates structured output:
<reasoning>One sentence explaining whether the target is an ACTIVE PARTICIPANT or BYSTANDER, and why they should or should not respond.</reasoning>
<decision>SPEAK</decision>
<confidence>high</confidence>
Trained on the AMI Corpus — meeting recordings and transcripts with explicit addressee annotations. Decision points were extracted at each turn where a speaker change could occur.
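To make the notion of a decision point concrete, here is a minimal sketch (not the authors' pipeline; the turn format and speaker names are invented for illustration) of how a transcript can be turned into labeled decision points: at each turn boundary, the target speaker either takes the next turn (SPEAK) or does not (SILENT).

```python
# Hypothetical transcript: a list of (speaker, utterance) turns
turns = [
    ("A", "Shall we go with the rubber casing?"),
    ("B", "I think so, it matches the design spec."),
    ("C", "What does marketing think?"),
]

def decision_points(turns, target):
    """Yield (context, label) pairs: at each turn boundary, the label
    records whether the target speaker took the next turn."""
    for i in range(1, len(turns)):
        context = turns[:i]
        label = "SPEAK" if turns[i][0] == target else "SILENT"
        yield context, label

for ctx, label in decision_points(turns, target="C"):
    print(len(ctx), label)  # context length, then SPEAK/SILENT
```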
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model = "Qwen/Qwen2.5-7B-Instruct"
adapter = "kraken07/qwen2.5-7b-ami-reasoning"

# Load the base model, then attach the LoRA adapter on top of it
model = AutoModelForCausalLM.from_pretrained(base_model, device_map="auto")
model = PeftModel.from_pretrained(model, adapter)
tokenizer = AutoTokenizer.from_pretrained(base_model)
# Format: provide conversational context and current turn
# The model expects a prompt that includes context turns and asks for
# reasoning + decision for a target speaker
prompt = """<conversation context>
<instruction to predict SPEAK/SILENT with reasoning>
"""
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
response = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
# Parse <reasoning>, <decision>, <confidence> from response
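One way to parse the tagged fields is a small regex helper (a minimal sketch; the tag names come from the output format above, and the example string is illustrative):

```python
import re

def parse_response(text):
    """Extract the <reasoning>, <decision>, and <confidence> fields
    from the model's tagged output. Missing fields come back as None."""
    fields = {}
    for tag in ("reasoning", "decision", "confidence"):
        m = re.search(rf"<{tag}>(.*?)</{tag}>", text, re.DOTALL)
        fields[tag] = m.group(1).strip() if m else None
    return fields

example = ("<reasoning>Target was directly addressed.</reasoning>\n"
           "<decision>SPEAK</decision>\n<confidence>high</confidence>")
print(parse_response(example)["decision"])  # SPEAK
```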
If you use this model, please cite our work:
@misc{bhagtani2026speakstaysilentcontextaware,
      title={Speak or Stay Silent: Context-Aware Turn-Taking in Multi-Party Dialogue},
      author={Bhagtani, Kratika and Anand, Mrinal and Xu, Yu Chen and Yadav, Amit Kumar Singh},
      year={2026},
      eprint={2603.11409},
      archivePrefix={arXiv},
      url={https://arxiv.org/abs/2603.11409}
}