
InfLLM-V2 Long-Context Training Dataset with 5B Tokens

Project Links: [Paper] [InfLLM-V2 Models] [CUDA Kernel Code]


🚀 About InfLLM-V2

InfLLM-V2 is a native sparse-attention framework designed for efficient processing of long-sequence text. Its core advantage is that it matches the performance of dense attention in short-text scenarios, without any extra parameters, while seamlessly switching to a sparse mode for long-text scenarios to achieve significant end-to-end acceleration.

To support community reproduction and further exploration, we are open-sourcing the full suite of resources for the InfLLM-V2 project, including the paper, the InfLLM-V2 models, the CUDA kernel code, and this training dataset.

✨ Dataset Description

This dataset contains 5B tokens of long-text data used for training InfLLM-V2.

We demonstrate that only 5B tokens of high-quality long-text data are needed to successfully unlock the model's powerful sparse attention capabilities, without resorting to the trillion-scale data required by other methods. Using this dataset, researchers can efficiently reproduce our results or explore more advanced training methods for long-context models.

Data Composition and Specifications

1. Data Composition

This dataset is a carefully curated mixture from sources including web data, source code, scientific papers, and Wikipedia, augmented with a selection of high-quality in-house data.

2. Specifications

  • Total Tokens: Approximately 5 Billion (5B).
  • Tokenizer: Processed using the tokenizer from MiniCPM4.
  • Data Format: Sharded Parquet (.parquet).
  • Data Fields:
    • input_ids: (list[int]) The list of encoded Token IDs.
    • text: (string) The original text.

How to Use

Given the large size of the dataset, it is highly recommended to load it in streaming mode using the Hugging Face datasets library to avoid memory exhaustion.

from datasets import load_dataset

# Recommended: Load in streaming mode to save memory
ds = load_dataset("openbmb/InfLLM-V2-data-5B", split="train", streaming=True)

The InfLLM-V2 Training Workflow

The long-context capability of InfLLM-V2 is achieved through continued training on high-quality long-text data.

  • Step 1: Start from the base model.
  • Step 2: Perform continued training on the base model using this dataset (InfLLM-V2-data-5B).
  • Step 3: Obtain the final long-context model, InfLLM-V2-Long-Sparse-Base, equipped with powerful long-context and sparse attention capabilities.
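Continued pretraining on long-text data typically concatenates tokenized documents and slices them into fixed-length training sequences. The helper below is a generic packing sketch under that assumption; it is not the project's actual pipeline, and `seq_len` and `eos_id` are placeholders.

```python
from typing import Iterable, Iterator

def pack_sequences(docs: Iterable[list[int]], seq_len: int,
                   eos_id: int = 2) -> Iterator[list[int]]:
    """Concatenate tokenized documents and yield fixed-length chunks.

    A generic packing sketch: eos_id separates documents, and any
    trailing tokens shorter than seq_len are dropped.
    """
    buf: list[int] = []
    for doc in docs:
        buf.extend(doc)
        buf.append(eos_id)  # mark the document boundary
        while len(buf) >= seq_len:
            yield buf[:seq_len]
            buf = buf[seq_len:]

# Tiny demonstration with toy token IDs.
docs = [[1, 10, 11], [1, 20, 21, 22, 23], [1, 30]]
chunks = list(pack_sequences(docs, seq_len=4))
# chunks == [[1, 10, 11, 2], [1, 20, 21, 22], [23, 2, 1, 30]]
```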

Related Projects

Citation

If you use our work in your research, please cite our paper:

@misc{zhao2025infllmv2densesparseswitchableattention,
      title={InfLLM-V2: Dense-Sparse Switchable Attention for Seamless Short-to-Long Adaptation}, 
      author={Weilin Zhao and Zihan Zhou and Zhou Su and Chaojun Xiao and Yuxuan Li and Yanghao Li and Yudi Zhang and Weilun Zhao and Zhen Li and Yuxiang Huang and Ao Sun and Xu Han and Zhiyuan Liu},
      year={2025},
      eprint={2509.24663},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2509.24663}, 
}