arXiv:2605.07654

Reliable Chain-of-Thought via Prefix Consistency

Published on May 8 · Submitted by Naoto Iwase on May 13
Abstract

Prefix consistency uses answer reproduction rates under trace regeneration to weight candidate responses, achieving high accuracy with significantly fewer tokens than standard majority voting.

AI-generated summary

Large Language Models often improve accuracy on reasoning tasks by sampling multiple Chain-of-Thought (CoT) traces and aggregating them with majority voting (MV), a test-time technique called self-consistency. When we truncate a CoT partway through and regenerate the remainder, traces with correct answers reproduce their original answer more often than traces with wrong answers. We turn this gap into a reliability signal, prefix consistency, which weights each candidate answer by how often it reappears under regeneration; it requires neither token log-probabilities nor self-rating prompts. Across five reasoning models and four math and science benchmarks, prefix consistency is the best correctness predictor in most settings, and reweighting votes by it reaches the plateau accuracy of standard MV with up to 21x fewer tokens (median 4.6x). Our code is available at https://github.com/naoto-iwase/prefix-consistency.
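The mechanism is simple enough to sketch. Below is a minimal, illustrative Python version of prefix-consistency-weighted voting, not the paper's actual implementation: `generate(prompt, prefix="")` (an LLM sampling call that continues from an optional prefix), `extract_answer`, and the truncation fraction are hypothetical stand-ins; see the linked repository for the real interfaces and hyperparameters.

```python
from collections import Counter

def prefix_consistency_vote(prompt, generate, extract_answer,
                            n_traces=8, n_regens=4, truncate_frac=0.5):
    # Step 1: sample full CoT traces, as in standard self-consistency.
    traces = [generate(prompt) for _ in range(n_traces)]

    weighted_votes = Counter()
    for trace in traces:
        answer = extract_answer(trace)
        if answer is None:
            continue  # unparseable trace casts no vote
        # Step 2: truncate the trace partway through and regenerate
        # the remainder several times from the shared prefix.
        prefix = trace[: int(len(trace) * truncate_frac)]
        regens = [generate(prompt, prefix=prefix) for _ in range(n_regens)]
        # Step 3: prefix consistency = fraction of regenerations whose
        # final answer matches this trace's original answer.
        consistency = sum(extract_answer(prefix + cont) == answer
                          for cont in regens) / n_regens
        # Step 4: weight this trace's vote by its consistency score
        # instead of the flat weight of 1 used by plain majority voting.
        weighted_votes[answer] += consistency

    return weighted_votes.most_common(1)[0][0] if weighted_votes else None
```

Standard majority voting is the special case where every trace gets weight 1; here, traces whose answers survive prefix regeneration count for more.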

Community

Paper submitter

TL;DR: Correct Chain-of-Thought traces reproduce their answer under prefix regeneration more often than wrong ones, and weighting majority votes by this prefix consistency reaches plateau accuracy with up to 21x fewer tokens (median 4.6x).

High-ROI. I like it.


