---
license: mit
language:
- en
pretty_name: TactileEval
size_categories:
- 10K<n<100K
---

# TactileEval

## Dataset Structure

```
images/<...>/<...>/<...>/*.{jpg,png}
processed/
  records_full.jsonl
  splits/{train,val,test}.jsonl
  family_splits/F{1..6}/{train,val,test}.jsonl
  dataset_summary.{csv,json}
```

JSONL fields:

| Field | Description |
|-------|-------------|
| `pair_id` | `NaturalRel::TactileRel` identifier |
| `task_family` | Task code (e.g., F1QL) |
| `option_id` | Option identifier (e.g., `too_thick`) |
| `option_description` | Plain-language description |
| `natural_image`, `tactile_image` | Paths relative to `images/` |
| `votes_total`, `positives`, `negatives` | Vote statistics |
| `label` | Majority label (0/1) |
| `vote_fraction` | `positives / votes_total` |
| `status_counts` | Counts of approved/submitted ballots |

## Splits

- `splits/train.jsonl` (11,348 records)
- `splits/val.jsonl` (1,341 records)
- `splits/test.jsonl` (1,406 records)

`family_splits/` mirrors these splits per object family.

## Usage

```python
from datasets import load_dataset

ds = load_dataset("Adnank1998/TactileEval", name="full", split="train")
family = load_dataset("Adnank1998/TactileEval", name="family_f1", split="train")

example = ds[0]
print(example["natural_image"], example["label"])
```

Images are stored under `images/`; join the relative path returned in `natural_image`/`tactile_image` with the local dataset root to load the files.

Available configurations:

- `full`: All families (default).
- `family_f1` through `family_f6`: Per-family subsets matching the paper splits.

Each split lives in `processed/`, so the Hugging Face dataset viewer can load the files directly via the bundled `dataset_infos.json`.
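As a concrete illustration of the vote-derived fields, the sketch below parses one JSONL record and recomputes `vote_fraction` and a majority label. The record values are invented, and the positives-vs-negatives majority rule is an assumed reading of the schema; `records_full.jsonl` carries the authoritative labels.

```python
import json

def recompute_vote_stats(record: dict) -> dict:
    """Recompute vote_fraction and a majority label from raw vote counts.

    Assumes label = 1 when positives outnumber negatives; consult the
    released JSONL files for the official labels.
    """
    total = record["votes_total"]
    fraction = record["positives"] / total if total else 0.0
    label = int(record["positives"] > record["negatives"])
    return {"vote_fraction": fraction, "label": label}

# A hypothetical record in the documented schema (values invented).
line = ('{"pair_id": "NaturalRel::TactileRel", "task_family": "F1QL", '
        '"votes_total": 5, "positives": 4, "negatives": 1}')
record = json.loads(line)
print(recompute_vote_stats(record))  # {'vote_fraction': 0.8, 'label': 1}
```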
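The path-joining step described above can be sketched as follows; the dataset root and record paths shown are hypothetical placeholders, since real values come from the JSONL splits.

```python
from pathlib import Path

def resolve_image_paths(dataset_root: str, record: dict) -> tuple:
    """Join the relative natural/tactile paths onto the local images/ root."""
    images_dir = Path(dataset_root) / "images"
    return (images_dir / record["natural_image"],
            images_dir / record["tactile_image"])

# Hypothetical record paths for illustration only.
record = {"natural_image": "F1/obj01/natural.jpg",
          "tactile_image": "F1/obj01/tactile.png"}
nat, tac = resolve_image_paths("/data/TactileEval", record)
print(nat)  # /data/TactileEval/images/F1/obj01/natural.jpg
```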
## Citation

```
@misc{khan2026tactileevalstepautomatedfinegrained,
  title={TactileEval: A Step Towards Automated Fine-Grained Evaluation and Editing of Tactile Graphics},
  author={Adnan Khan and Abbas Akkasi and Majid Komeili},
  year={2026},
  eprint={2604.19829},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2604.19829}
}
```

## Contact

Questions? Open an issue or email adnankhan5@cmail.carleton.ca.

## Acknowledgements

This work was supported in part by MITACS and the Digital Alliance of Canada. We thank the student volunteers at the Intelligent Machines Lab (iML), Carleton University, for their contributions, and Joshua Olojede and Hoda Vafaeesefat for their help with the AMT annotation environment.