Datasets:

| json (dict) | __key__ (string, lengths 21–81) | __url__ (string, 2 classes) |
|---|---|---|
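Each record's `json` column follows the schema visible in the sample rows below: a clip descriptor (`clip_duration`, `clip_file`, `clip_offset`) plus a nested `qa` dict holding a multiple-choice question. As a minimal sketch of consuming one such record, the snippet below copies the first sample verbatim and resolves the answer letter to its full option text (the helper name `correct_option` is illustrative, not part of the dataset):

```python
# One record's `json` payload, transcribed from the first sample below.
record = {
    "clip_duration": 5.5,
    "clip_file": "153_1_001_qa0000.mp4",
    "clip_offset": 5.5,
    "qa": {
        "answer": "C",
        "answer_text": "Person 0",
        "category": "T1",
        "difficulty": "easy",
        "format": "mcq",
        "options": ["A) Person 2", "B) Person 4", "C) Person 0", "D) No one"],
    },
}

def correct_option(rec):
    """Return the full option string matching the record's answer letter."""
    letter = rec["qa"]["answer"]
    for opt in rec["qa"]["options"]:
        if opt.startswith(letter + ")"):
            return opt
    return None

print(correct_option(record))  # -> C) Person 0
```

Note that `answer_text` mirrors the chosen option without its letter prefix, so either field can serve as the ground-truth label.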
{
"clip_duration": 5.5,
"clip_file": "153_1_001_qa0000.mp4",
"clip_offset": 5.5,
"qa": {
"answer": "C",
"answer_text": "Person 0",
"category": "T1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 2",
"B) Person 4",
"C) Person 0",
"D) No one"
],... | json/153_1_001_qa0000 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 3.5,
"clip_file": "153_1_001_qa0001.mp4",
"clip_offset": 97.5,
"qa": {
"answer": "B",
"answer_text": "Looking at the same thing",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Everyone suddenly shifts gaze",
"B) Looking at the sa... | json/153_1_001_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 78,
"clip_file": "153_1_001_qa0002.mp4",
"clip_offset": 22.5,
"qa": {
"answer": "A",
"answer_text": "Following someone's gaze happens first",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Following someone's gaze happens first",
... | json/153_1_001_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6.5,
"clip_file": "153_1_001_qa0003.mp4",
"clip_offset": 22,
"qa": {
"answer": "A",
"answer_text": "Person 2",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 2",
"B) Person 0",
"C) Person 4",
"D) Person 3"
]... | json/153_1_001_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "153_1_001_qa0005.mp4",
"clip_offset": 22,
"qa": {
"answer": "C",
"answer_text": "Person 3",
"category": "T1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 4",
"C) Person 3",
"D) No one"
],
... | json/153_1_001_qa0005 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 16,
"clip_file": "153_1_001_qa0008.mp4",
"clip_offset": 76,
"qa": {
"answer": "B",
"answer_text": "Person 4",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 4",
"C) Person 2",
"D) No one"
],
... | json/153_1_001_qa0008 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 16,
"clip_file": "153_1_001_qa0009.mp4",
"clip_offset": 74.5,
"qa": {
"answer": "B",
"answer_text": "Pointing",
"category": "G2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Giving",
"B) Pointing",
"C) Reaching",
"D) Showing"
],
... | json/153_1_001_qa0009 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 16,
"clip_file": "153_1_001_qa0010.mp4",
"clip_offset": 75,
"qa": {
"answer": "C",
"answer_text": "Person 1",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 2",
"C) Person 1",
"D) Person 4"
],... | json/153_1_001_qa0010 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7,
"clip_file": "153_1_002_qa0000.mp4",
"clip_offset": 85,
"qa": {
"answer": "D",
"answer_text": "Person 1",
"category": "T1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 2",
"B) Person 3",
"C) Person 0",
"D) Person 1"
],
... | json/153_1_002_qa0000 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "153_1_002_qa0001.mp4",
"clip_offset": 59.5,
"qa": {
"answer": "A",
"answer_text": "eye contact",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) eye contact",
"B) quickly changes gaze direction",
"C) lookin... | json/153_1_002_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 8,
"clip_file": "153_1_002_qa0002.mp4",
"clip_offset": 112,
"qa": {
"answer": "A",
"answer_text": "4.5 seconds",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) 4.5 seconds",
"B) 2.5 seconds",
"C) 3.5 seconds",
"D) 1.... | json/153_1_002_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 18.5,
"clip_file": "153_1_002_qa0003.mp4",
"clip_offset": 59,
"qa": {
"answer": "D",
"answer_text": "eye contact",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) following someone's gaze",
"B) looking at the same thing",
"... | json/153_1_002_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 9,
"clip_file": "153_1_002_qa0004.mp4",
"clip_offset": 94,
"qa": {
"answer": "B",
"answer_text": "Person 0 and Person 1",
"category": "T4",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 0 and Person 4",
"B) Person 0 and Person 1",
"... | json/153_1_002_qa0004 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6,
"clip_file": "153_1_002_qa0005.mp4",
"clip_offset": 72,
"qa": {
"answer": "B",
"answer_text": "Person 4",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 4",
"C) Person 3",
"D) Person 0"
],
... | json/153_1_002_qa0005 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7,
"clip_file": "153_1_002_qa0006.mp4",
"clip_offset": 102,
"qa": {
"answer": "D",
"answer_text": "Person 1",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 3",
"B) Person 4",
"C) Person 0",
"D) Person 1"
],... | json/153_1_002_qa0006 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6,
"clip_file": "153_1_002_qa0007.mp4",
"clip_offset": 56.5,
"qa": {
"answer": "B",
"answer_text": "Person 0",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 0",
"C) Person 3",
"D) Person 4"
]... | json/153_1_002_qa0007 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6,
"clip_file": "153_1_002_qa0008.mp4",
"clip_offset": 77.5,
"qa": {
"answer": "D",
"answer_text": "Pointing",
"category": "G2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Reaching",
"B) Giving",
"C) Showing",
"D) Pointing"
],
... | json/153_1_002_qa0008 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7.5,
"clip_file": "153_1_002_qa0009.mp4",
"clip_offset": 104.5,
"qa": {
"answer": "D",
"answer_text": "Person 1",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 4",
"B) Person 3",
"C) Person 0",
"D) Person 1"
... | json/153_1_002_qa0009 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7.5,
"clip_file": "153_1_002_qa0010.mp4",
"clip_offset": 56,
"qa": {
"answer": "D",
"answer_text": "Person 3",
"category": "C1",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 0",
"C) Person 4",
"D) Person 3"
... | json/153_1_002_qa0010 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 8.5,
"clip_file": "153_1_002_qa0011.mp4",
"clip_offset": 78.5,
"qa": {
"answer": "C",
"answer_text": "Person 1 and Person 4",
"category": "C3",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 0 and Person 2",
"B) Person 0 and Person 4",
... | json/153_1_002_qa0011 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6.5,
"clip_file": "153_1_002_qa0012.mp4",
"clip_offset": 106,
"qa": {
"answer": "C",
"answer_text": "Person 0 follows Person 2's gaze",
"category": "C1",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 2 follows Person 0's gaze",
"B) Person... | json/153_1_002_qa0012 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 9.5,
"clip_file": "153_1_002_qa0013.mp4",
"clip_offset": 56,
"qa": {
"answer": "A",
"answer_text": "Person 4",
"category": "C3",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 4",
"B) Person 1",
"C) Person 3",
"D) Person 2"
]... | json/153_1_002_qa0013 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 18.5,
"clip_file": "153_1_003_qa0000.mp4",
"clip_offset": 0,
"qa": {
"answer": "A",
"answer_text": "Person 0",
"category": "T1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 2",
"C) Person 3",
"D) No one"
],
... | json/153_1_003_qa0000 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4,
"clip_file": "153_1_003_qa0001.mp4",
"clip_offset": 22.5,
"qa": {
"answer": "B",
"answer_text": "Looking at the same thing",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Following someone's gaze",
"B) Looking at the same thin... | json/153_1_003_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7.5,
"clip_file": "153_1_003_qa0002.mp4",
"clip_offset": 37.5,
"qa": {
"answer": "C",
"answer_text": "2.0 seconds",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) 1.0 seconds",
"B) 0.0 seconds",
"C) 2.0 seconds",
"D)... | json/153_1_003_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4,
"clip_file": "153_1_003_qa0003.mp4",
"clip_offset": 37,
"qa": {
"answer": "A",
"answer_text": "Person 0 and Person 4",
"category": "T4",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 0 and Person 4",
"B) Person 2 and Person 3",
"... | json/153_1_003_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5,
"clip_file": "153_1_003_qa0004.mp4",
"clip_offset": 7,
"qa": {
"answer": "A",
"answer_text": "Person 0",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 4",
"C) Person 3",
"D) Person 2"
],
... | json/153_1_003_qa0004 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "153_1_003_qa0005.mp4",
"clip_offset": 23.5,
"qa": {
"answer": "D",
"answer_text": "Person 1",
"category": "T6",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Both Person 2 and Person 3",
"B) Person 3",
"C) Person 2",
... | json/153_1_003_qa0005 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4,
"clip_file": "153_2_001_qa0001.mp4",
"clip_offset": 71.5,
"qa": {
"answer": "B",
"answer_text": "Person 2 follows Person 3's gaze",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Everyone suddenly shifts gaze",
"B) Person 2 fol... | json/153_2_001_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 14,
"clip_file": "153_2_001_qa0002.mp4",
"clip_offset": 29,
"qa": {
"answer": "A",
"answer_text": "11.0 seconds",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) 11.0 seconds",
"B) 7.0 seconds",
"C) 3.5 seconds",
"D) ... | json/153_2_001_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "153_2_001_qa0003.mp4",
"clip_offset": 78,
"qa": {
"answer": "D",
"answer_text": "Person 0 and Person 3",
"category": "T4",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 3 and Person 2",
"B) Person 0 and Person 4",
... | json/153_2_001_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 9,
"clip_file": "153_2_001_qa0005.mp4",
"clip_offset": 16.5,
"qa": {
"answer": "B",
"answer_text": "5.5 seconds",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) 2.0 seconds",
"B) 5.5 seconds",
"C) 7.0 seconds",
"D) 3... | json/153_2_001_qa0005 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6.5,
"clip_file": "153_2_001_qa0006.mp4",
"clip_offset": 53,
"qa": {
"answer": "C",
"answer_text": "Person 4",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 2",
"C) Person 4",
"D) Person 3"
]... | json/153_2_001_qa0006 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "153_2_001_qa0007.mp4",
"clip_offset": 53.5,
"qa": {
"answer": "A",
"answer_text": "Pointing",
"category": "G2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Pointing",
"B) Giving",
"C) Showing",
"D) Reaching"
],... | json/153_2_001_qa0007 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7,
"clip_file": "153_2_002_qa0000.mp4",
"clip_offset": 34,
"qa": {
"answer": "D",
"answer_text": "Person 4",
"category": "T1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 3",
"C) Person 2",
"D) Person 4"
],
... | json/153_2_002_qa0000 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "153_2_002_qa0001.mp4",
"clip_offset": 20.5,
"qa": {
"answer": "D",
"answer_text": "Looking at the same thing",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Following someone's gaze",
"B) Quickly changing gaze ... | json/153_2_002_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 44,
"clip_file": "153_2_002_qa0002.mp4",
"clip_offset": 18.5,
"qa": {
"answer": "B",
"answer_text": "Looking at the same thing",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) They never interact",
"B) Looking at the same thing"... | json/153_2_002_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5,
"clip_file": "153_2_002_qa0004.mp4",
"clip_offset": 34.5,
"qa": {
"answer": "C",
"answer_text": "hey look at exactly the same time",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 3",
"C) They look a... | json/153_2_002_qa0004 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5,
"clip_file": "153_2_002_qa0005.mp4",
"clip_offset": 62,
"qa": {
"answer": "A",
"answer_text": "Person 0",
"category": "T6",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 1",
"C) Person 4",
"D) There is no share... | json/153_2_002_qa0005 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "153_2_002_qa0008.mp4",
"clip_offset": 45.5,
"qa": {
"answer": "B",
"answer_text": "Person 0",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 0",
"C) Person 3",
"D) Person 2"
... | json/153_2_002_qa0008 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4,
"clip_file": "153_2_002_qa0009.mp4",
"clip_offset": 45.5,
"qa": {
"answer": "B",
"answer_text": "Pointing",
"category": "G2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Reaching",
"B) Pointing",
"C) Showing",
"D) Giving"
],
... | json/153_2_002_qa0009 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "153_2_002_qa0010.mp4",
"clip_offset": 45.5,
"qa": {
"answer": "D",
"answer_text": "Person 3",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 4",
"B) Person 2",
"C) Person 1",
"D) Person 3"
... | json/153_2_002_qa0010 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4,
"clip_file": "153_2_003_qa0000.mp4",
"clip_offset": 2.5,
"qa": {
"answer": "C",
"answer_text": "Quickly changes gaze direction",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Looks at the same thing as Person 0",
"B) Makes eye... | json/153_2_003_qa0000 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "153_2_003_qa0001.mp4",
"clip_offset": 10,
"qa": {
"answer": "C",
"answer_text": "Person 4 looks at the same thing as Person 3",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 4 follows Person 1's gaze",
"... | json/153_2_003_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7,
"clip_file": "153_2_003_qa0002.mp4",
"clip_offset": 0,
"qa": {
"answer": "D",
"answer_text": "Person 3 leads Person 4 in following their gaze",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 3 and Person 4 make eye contact",... | json/153_2_003_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6.5,
"clip_file": "153_2_003_qa0003.mp4",
"clip_offset": 12,
"qa": {
"answer": "D",
"answer_text": "Person 3",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Both look at the exact same time",
"B) Person 4",
"C) Person 1",
... | json/153_2_003_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "153_2_003_qa0004.mp4",
"clip_offset": 1,
"qa": {
"answer": "D",
"answer_text": "Person 4 follows Person 3's gaze",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 3 and Person 4 look at each other",
"B) Pe... | json/153_2_003_qa0004 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "153_2_003_qa0005.mp4",
"clip_offset": 2.5,
"qa": {
"answer": "A",
"answer_text": "Person 3",
"category": "T6",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 3",
"B) Person 0",
"C) Person 2",
"D) Person 1"
... | json/153_2_003_qa0005 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 3.5,
"clip_file": "153_2_003_qa0006.mp4",
"clip_offset": 6,
"qa": {
"answer": "C",
"answer_text": "Person 1, Person 3, and Person 4",
"category": "T6",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 2, Person 3, and Person 4",
"B) Person 0, ... | json/153_2_003_qa0006 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4,
"clip_file": "153_2_003_qa0009.mp4",
"clip_offset": 14.5,
"qa": {
"answer": "D",
"answer_text": "Shows an object",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Reaches for an object",
"B) Hands an object to Person 2",
"... | json/153_2_003_qa0009 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "170_1_001_qa0001.mp4",
"clip_offset": 63.5,
"qa": {
"answer": "A",
"answer_text": "Person 2",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 2",
"B) Person 1",
"C) Person 3",
"D) Person 0"
... | json/170_1_001_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 8,
"clip_file": "170_1_001_qa0003.mp4",
"clip_offset": 19,
"qa": {
"answer": "A",
"answer_text": "Person 1",
"category": "T1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 3",
"C) No one",
"D) Person 0"
],
... | json/170_1_001_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6.5,
"clip_file": "170_1_001_qa0006.mp4",
"clip_offset": 98,
"qa": {
"answer": "D",
"answer_text": "No one",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 2",
"C) Person 3",
"D) No one"
],
... | json/170_1_001_qa0006 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5,
"clip_file": "170_1_001_qa0007.mp4",
"clip_offset": 95,
"qa": {
"answer": "C",
"answer_text": "Reaching",
"category": "G2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Pointing",
"B) Giving",
"C) Reaching",
"D) Showing"
],
... | json/170_1_001_qa0007 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 8,
"clip_file": "170_1_001_qa0008.mp4",
"clip_offset": 96,
"qa": {
"answer": "B",
"answer_text": "Person 0",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 2",
"B) Person 0",
"C) Person 1",
"D) Person 3"
],
... | json/170_1_001_qa0008 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 9,
"clip_file": "170_1_001_qa0009.mp4",
"clip_offset": 92.5,
"qa": {
"answer": "D",
"answer_text": "Person 0 reaches for a playing card on the table.",
"category": "C3",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 0 points at Person 1.",
... | json/170_1_001_qa0009 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 12,
"clip_file": "170_1_001_qa0010.mp4",
"clip_offset": 93,
"qa": {
"answer": "C",
"answer_text": "No gaze interaction occurred.",
"category": "C4",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 0 and Person 3 made eye contact.",
"B) Person... | json/170_1_001_qa0010 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 9,
"clip_file": "170_1_002_qa0000.mp4",
"clip_offset": 74.5,
"qa": {
"answer": "C",
"answer_text": "Person 2",
"category": "T1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 3",
"B) No one",
"C) Person 2",
"D) Person 1"
],
... | json/170_1_002_qa0000 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5,
"clip_file": "170_1_002_qa0001.mp4",
"clip_offset": 17,
"qa": {
"answer": "D",
"answer_text": "Looking at the same thing",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 1 follows Person 2's gaze",
"B) Person 0 and Perso... | json/170_1_002_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 9.5,
"clip_file": "170_1_002_qa0002.mp4",
"clip_offset": 60,
"qa": {
"answer": "B",
"answer_text": "5.5 seconds",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) 7.5 seconds",
"B) 5.5 seconds",
"C) 1.5 seconds",
"D) 3... | json/170_1_002_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6,
"clip_file": "170_1_002_qa0003.mp4",
"clip_offset": 21,
"qa": {
"answer": "B",
"answer_text": "Person 0 and Person 2",
"category": "T4",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 1 and Person 2",
"B) Person 0 and Person 2",
"... | json/170_1_002_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7,
"clip_file": "170_1_002_qa0004.mp4",
"clip_offset": 28.5,
"qa": {
"answer": "B",
"answer_text": "Person 1",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 1",
"C) Both at the same time",
"D) Ne... | json/170_1_002_qa0004 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 3.5,
"clip_file": "170_1_002_qa0005.mp4",
"clip_offset": 53.5,
"qa": {
"answer": "C",
"answer_text": "Person 3",
"category": "T6",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 0",
"C) Person 3",
"D) Person 2"
... | json/170_1_002_qa0005 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "170_1_002_qa0008.mp4",
"clip_offset": 38.5,
"qa": {
"answer": "A",
"answer_text": "Both Person 2 and Person 3",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Both Person 2 and Person 3",
"B) Person 3",
"C... | json/170_1_002_qa0008 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7.5,
"clip_file": "170_1_002_qa0009.mp4",
"clip_offset": 97,
"qa": {
"answer": "C",
"answer_text": "Pointing",
"category": "G2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Reaching",
"B) Giving",
"C) Pointing",
"D) Showing"
],
... | json/170_1_002_qa0009 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 8.5,
"clip_file": "170_1_002_qa0010.mp4",
"clip_offset": 96.5,
"qa": {
"answer": "B",
"answer_text": "An object",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 3",
"B) An object",
"C) Person 2",
"D) Person 1"
... | json/170_1_002_qa0010 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7,
"clip_file": "170_1_003_qa0000.mp4",
"clip_offset": 24.5,
"qa": {
"answer": "C",
"answer_text": "Person 3",
"category": "T1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) No one",
"B) Person 1",
"C) Person 3",
"D) Person 2"
],
... | json/170_1_003_qa0000 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4,
"clip_file": "170_1_003_qa0001.mp4",
"clip_offset": 65.5,
"qa": {
"answer": "D",
"answer_text": "Looking at the same thing",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Everyone suddenly shifts gaze",
"B) Making eye contact"... | json/170_1_003_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5,
"clip_file": "170_1_003_qa0002.mp4",
"clip_offset": 29.5,
"qa": {
"answer": "D",
"answer_text": "2.5 seconds",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) 0.5 seconds",
"B) 2.0 seconds",
"C) 1.5 seconds",
"D) 2... | json/170_1_003_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 3.5,
"clip_file": "170_1_003_qa0003.mp4",
"clip_offset": 73,
"qa": {
"answer": "C",
"answer_text": "Person 0 and Person 1",
"category": "T4",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 1 and Person 3",
"B) Person 0 and Person 2",
... | json/170_1_003_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6,
"clip_file": "170_1_003_qa0004.mp4",
"clip_offset": 85.5,
"qa": {
"answer": "D",
"answer_text": "Person 0",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 3",
"C) Person 2",
"D) Person 0"
]... | json/170_1_003_qa0004 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 2.5,
"clip_file": "170_1_003_qa0005.mp4",
"clip_offset": 88.5,
"qa": {
"answer": "B",
"answer_text": "Person 2",
"category": "T6",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 2",
"C) Person 3",
"D) Person 1"
... | json/170_1_003_qa0005 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 18,
"clip_file": "170_1_003_qa0006.mp4",
"clip_offset": 14,
"qa": {
"answer": "D",
"answer_text": "Following someone's gaze",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Everyone suddenly shifts gaze",
"B) Looking at the same... | json/170_1_003_qa0006 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 9,
"clip_file": "170_1_003_qa0009.mp4",
"clip_offset": 59,
"qa": {
"answer": "A",
"answer_text": "An object on the table",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) An object on the table",
"B) Person 3",
"C) Person 0",... | json/170_1_003_qa0009 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6.5,
"clip_file": "170_1_003_qa0010.mp4",
"clip_offset": 64,
"qa": {
"answer": "C",
"answer_text": "Pointing",
"category": "G2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Giving",
"B) Reaching",
"C) Pointing",
"D) Showing"
],
... | json/170_1_003_qa0010 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7,
"clip_file": "170_1_003_qa0011.mp4",
"clip_offset": 113,
"qa": {
"answer": "D",
"answer_text": "Person 1",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 2",
"B) Person 0",
"C) Person 3",
"D) Person 1"
],... | json/170_1_003_qa0011 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6.5,
"clip_file": "170_1_003_qa0012.mp4",
"clip_offset": 62.5,
"qa": {
"answer": "A",
"answer_text": "Person 1",
"category": "C1",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 3",
"C) Person 2",
"D) No one"
... | json/170_1_003_qa0012 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "170_1_003_qa0013.mp4",
"clip_offset": 113,
"qa": {
"answer": "C",
"answer_text": "Person 0",
"category": "C1",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 2 does not follow anyone's gaze",
"B) Person 1",
"C) Per... | json/170_1_003_qa0013 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7.5,
"clip_file": "170_1_003_qa0014.mp4",
"clip_offset": 112,
"qa": {
"answer": "B",
"answer_text": "Person 2",
"category": "C4",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 2",
"C) Person 3",
"D) Person 0"
... | json/170_1_003_qa0014 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "170_1_003_qa0015.mp4",
"clip_offset": 113.5,
"qa": {
"answer": "C",
"answer_text": "Person 3",
"category": "C4",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 1",
"C) Person 3",
"D) Person 2"
... | json/170_1_003_qa0015 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 8.5,
"clip_file": "170_1_004_qa0000.mp4",
"clip_offset": 0.5,
"qa": {
"answer": "D",
"answer_text": "Person 1",
"category": "T1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) No one",
"B) Person 2",
"C) Person 3",
"D) Person 1"
],... | json/170_1_004_qa0000 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6.5,
"clip_file": "170_1_004_qa0001.mp4",
"clip_offset": 1,
"qa": {
"answer": "D",
"answer_text": "They are looking at each other",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) They are looking at the same thing",
"B) One is fol... | json/170_1_004_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 2.5,
"clip_file": "170_1_004_qa0002.mp4",
"clip_offset": 15,
"qa": {
"answer": "D",
"answer_text": "They all suddenly shift their gaze",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) They close their eyes",
"B) They make eye cont... | json/170_1_004_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "170_1_004_qa0003.mp4",
"clip_offset": 7.5,
"qa": {
"answer": "C",
"answer_text": "No one",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 0",
"B) Person 1",
"C) No one",
"D) Person 3"
],
... | json/170_1_004_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6,
"clip_file": "170_1_004_qa0004.mp4",
"clip_offset": 7,
"qa": {
"answer": "D",
"answer_text": "Pointing",
"category": "G2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Giving",
"B) Showing",
"C) Reaching",
"D) Pointing"
],
... | json/170_1_004_qa0004 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "170_2_001_qa0001.mp4",
"clip_offset": 25,
"qa": {
"answer": "C",
"answer_text": "Everyone suddenly shifts gaze",
"category": "T2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Following someone's gaze",
"B) Eye contact between ... | json/170_2_001_qa0001 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 6.5,
"clip_file": "170_2_001_qa0002.mp4",
"clip_offset": 39.5,
"qa": {
"answer": "B",
"answer_text": "2.5 seconds",
"category": "T3",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) 0.5 seconds",
"B) 2.5 seconds",
"C) 1.5 seconds",
"D)... | json/170_2_001_qa0002 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 7.5,
"clip_file": "170_2_001_qa0003.mp4",
"clip_offset": 38.5,
"qa": {
"answer": "C",
"answer_text": "Person 2 and Person 3",
"category": "T4",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 0 and Person 1",
"B) Person 0 and Person 2",
... | json/170_2_001_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "170_2_001_qa0004.mp4",
"clip_offset": 38,
"qa": {
"answer": "B",
"answer_text": "Person 0",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 1",
"B) Person 0",
"C) Person 2",
"D) Person 3"
]... | json/170_2_001_qa0004 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5,
"clip_file": "170_2_001_qa0005.mp4",
"clip_offset": 24.5,
"qa": {
"answer": "D",
"answer_text": "Person 3",
"category": "T6",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 2",
"B) Person 1",
"C) Person 0",
"D) Person 3"
]... | json/170_2_001_qa0005 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5.5,
"clip_file": "170_2_001_qa0006.mp4",
"clip_offset": 4,
"qa": {
"answer": "D",
"answer_text": "Person 0",
"category": "T5",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 3",
"B) No one",
"C) Person 2",
"D) Person 0"
],
... | json/170_2_001_qa0006 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 17,
"clip_file": "170_2_001_qa0009.mp4",
"clip_offset": 32,
"qa": {
"answer": "D",
"answer_text": "Person 2",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Person 1",
"B) No one",
"C) Person 3",
"D) Person 2"
],
... | json/170_2_001_qa0009 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 17,
"clip_file": "170_2_001_qa0010.mp4",
"clip_offset": 32.5,
"qa": {
"answer": "B",
"answer_text": "Pointing",
"category": "G2",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) Showing",
"B) Pointing",
"C) Reaching",
"D) Giving"
],
... | json/170_2_001_qa0010 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 16.5,
"clip_file": "170_2_001_qa0011.mp4",
"clip_offset": 31.5,
"qa": {
"answer": "D",
"answer_text": "The eye contact begins before the pointing gesture starts.",
"category": "C1",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) The eye contact begins ... | json/170_2_001_qa0011 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 16.5,
"clip_file": "170_2_001_qa0012.mp4",
"clip_offset": 32.5,
"qa": {
"answer": "B",
"answer_text": "Person 3",
"category": "C2",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Person 2 makes no eye contact during this time.",
"B) Person 3",
... | json/170_2_001_qa0012 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 16.5,
"clip_file": "170_2_001_qa0013.mp4",
"clip_offset": 32,
"qa": {
"answer": "D",
"answer_text": "The target makes eye contact with Person 2.",
"category": "C4",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) The target captures the attention of the w... | json/170_2_001_qa0013 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 23,
"clip_file": "170_2_001_qa0014.mp4",
"clip_offset": 25.5,
"qa": {
"answer": "D",
"answer_text": "The attention capture event occurs first at 27.0s.",
"category": "C1",
"difficulty": "medium",
"format": "mcq",
"options": [
"A) Both events start simultaneously ... | json/170_2_001_qa0014 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 4.5,
"clip_file": "171_1_001_qa0003.mp4",
"clip_offset": 105.5,
"qa": {
"answer": "A",
"answer_text": "Person 3",
"category": "T6",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 3",
"B) Person 2",
"C) Person 1",
"D) Person 0"
... | json/171_1_001_qa0003 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 5,
"clip_file": "171_1_001_qa0004.mp4",
"clip_offset": 106,
"qa": {
"answer": "B",
"answer_text": "Person 0",
"category": "T6",
"difficulty": "hard",
"format": "mcq",
"options": [
"A) Person 3",
"B) Person 0",
"C) Person 2",
"D) Person 1"
],... | json/171_1_001_qa0004 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
{
"clip_duration": 10,
"clip_file": "171_1_001_qa0007.mp4",
"clip_offset": 110,
"qa": {
"answer": "B",
"answer_text": "Person 0",
"category": "G1",
"difficulty": "easy",
"format": "mcq",
"options": [
"A) No one",
"B) Person 0",
"C) Person 3",
"D) Person 1"
],
... | json/171_1_001_qa0007 | hf://datasets/arkimjh/werewolf2.0@f5ecdde53230dba6a11710225775eff08a1b97c2/tars/werewolf2.0_jsons_part001.tar |
End of preview. Expand in Data Studio
# Werewolf 2.0 Benchmark

## Description

Werewolf 2.0 is a video-based social reasoning benchmark for evaluating gaze and gesture understanding in multi-person scenarios. Each video clip is paired with a multiple-choice question (MCQ) that tests fine-grained comprehension of social cues such as gaze targets, gesture types, and cross-modal attention dynamics.
## Dataset Statistics

| Metric | Count |
|---|---|
| Video clips | 1,196 |
| QA pairs (MCQ) | 1,196 |
| Categories | 16 (T1-T6, G1-G6, C1-C4) |
| Answer choices | 4 (A/B/C/D) |
| Video frame rate | 2 fps |
## 16-Category QA Taxonomy
| Cat | Name | Difficulty |
|---|---|---|
| T1 | Gaze Target Identification | Easy |
| T2 | Gaze Event Classification | Easy |
| T3 | Mutual Gaze Recognition | Medium |
| T4 | Gaze Following | Medium |
| T5 | Temporal Gaze Reasoning | Hard |
| T6 | Group Attention Dynamics | Hard |
| G1 | Gesture Recognition | Easy |
| G2 | Gesture Type Classification | Easy |
| G3 | Gesture Temporal Reasoning | Medium |
| G4 | Reciprocal Gesture Patterns | Medium |
| G5 | Gesture Frequency | Hard |
| G6 | Gesture Sequence Chains | Hard |
| C1 | Gaze-Gesture Temporal Alignment | Medium |
| C2 | Gaze Response to Gesture | Medium |
| C3 | Eye Contact During Interaction | Hard |
| C4 | Cross-Modal Person Dynamics | Hard |
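When filtering or bucketing results by category, the taxonomy above can be kept as a small lookup table. The dict below is transcribed from the table; the `difficulty` helper is an illustrative name of our own, not part of the dataset:

```python
# Category code -> (name, difficulty), transcribed from the taxonomy table above.
CATEGORIES = {
    "T1": ("Gaze Target Identification", "easy"),
    "T2": ("Gaze Event Classification", "easy"),
    "T3": ("Mutual Gaze Recognition", "medium"),
    "T4": ("Gaze Following", "medium"),
    "T5": ("Temporal Gaze Reasoning", "hard"),
    "T6": ("Group Attention Dynamics", "hard"),
    "G1": ("Gesture Recognition", "easy"),
    "G2": ("Gesture Type Classification", "easy"),
    "G3": ("Gesture Temporal Reasoning", "medium"),
    "G4": ("Reciprocal Gesture Patterns", "medium"),
    "G5": ("Gesture Frequency", "hard"),
    "G6": ("Gesture Sequence Chains", "hard"),
    "C1": ("Gaze-Gesture Temporal Alignment", "medium"),
    "C2": ("Gaze Response to Gesture", "medium"),
    "C3": ("Eye Contact During Interaction", "hard"),
    "C4": ("Cross-Modal Person Dynamics", "hard"),
}

def difficulty(category: str) -> str:
    """Return the nominal difficulty tier for a QA category code."""
    return CATEGORIES[category][1]
```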
## Dataset Structure

```
werewolf2.0/
├── tars/
│   ├── werewolf2.0_videos_part001.tar → video/*.mp4
│   └── werewolf2.0_jsons_part001.tar → json/*.json
└── README.md
```
Each JSON file contains:

```json
{
  "video_name": "153_1_001",
  "clip_file": "153_1_001_qa0000.mp4",
  "clip_offset": 5.5,
  "clip_duration": 5.5,
  "qa": {
    "category": "T1",
    "difficulty": "easy",
    "format": "mcq",
    "question": "Between 1.5s and 3.5s, whose gaze does Person 3 follow?",
    "options": ["A) Person 0", "B) Person 2", "C) Person 4", "D) No one"],
    "answer": "A",
    "answer_text": "Person 0"
  }
}
```
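A record in this shape can be loaded and sanity-checked with the standard `json` module. `load_qa` below is an illustrative helper (not part of the dataset), assuming only the fields shown above:

```python
import json
from pathlib import Path

def load_qa(path):
    """Load one QA JSON file and verify its MCQ fields are consistent."""
    record = json.loads(Path(path).read_text())
    qa = record["qa"]
    # Options are formatted "A) text"; recover the letter prefix of each one.
    letters = [opt.split(")", 1)[0] for opt in qa["options"]]
    if qa["answer"] not in letters:
        raise ValueError(f"answer {qa['answer']!r} not among options {letters}")
    return record
```

For `mcq`-format records this catches an answer key pointing at a missing option before it reaches an evaluation loop.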
## Usage

```python
from huggingface_hub import snapshot_download
import tarfile
from pathlib import Path

# Download the dataset snapshot
snapshot_download(
    repo_id="arkimjh/werewolf2.0",
    repo_type="dataset",
    local_dir="./werewolf2.0"
)

# Extract every tar shard into video/ and json/ subdirectories
for tar_file in Path("./werewolf2.0/tars").glob("*.tar"):
    with tarfile.open(tar_file) as tf:
        tf.extractall("./werewolf2.0/")
```
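Once extracted, the `json/` directory can be scanned directly, for example to tally QA pairs per taxonomy category. This is a minimal sketch assuming the layout produced by the extraction step above:

```python
import json
from collections import Counter
from pathlib import Path

def category_counts(json_dir):
    """Count QA pairs per taxonomy category across all extracted JSON files."""
    counts = Counter()
    for path in Path(json_dir).glob("*.json"):
        counts[json.loads(path.read_text())["qa"]["category"]] += 1
    return counts

# Example: category_counts("./werewolf2.0/json")
```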
## Citation

TBD