| id | url | source | year | license_type | hash | title | abs |
|---|---|---|---|---|---|---|---|
| 1809.09600 | https://arxiv.org/pdf/1809.09600.pdf | arxiv | 2018 | cc by 4.0 | 17e5df4028138faf704cd905582e9c96 | HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering | Existing question answering (QA) datasets fail to train QA systems to perform complex reasoning and provide explanations for answers. We introduce HotpotQA, a new dataset with 113k Wikipedia-based question-answer pairs with four key features: (1) the questions require finding and reasoning over multiple supporting docu... |
| 2009.07758 | https://arxiv.org/pdf/2009.07758.pdf | arxiv | 2020 | cc by 4.0 | d00711f9fe400b0f9063bc76a682f14f | GLUCOSE: GeneraLized and COntextualized Story Explanations | When humans read or listen, they make implicit commonsense inferences that frame their understanding of what happened and why. As a step toward AI systems that can build similar mental models, we introduce GLUCOSE, a large-scale dataset of implicit commonsense causal knowledge, encoded as causal mini-theories about the... |
| N19-1423 | https://aclanthology.org/N19-1423.pdf | acl anthology | 2019 | acl license | 9ffe961d898ddbaeafef32cedda9a64b | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models (Peters et al., 2018a; Radford et al., 2018), BERT is designed to pre-train deep bidirectional representations from unlabeled text ... |
| 2301.12345 | https://arxiv.org/pdf/2301.12345.pdf | arxiv | 2023 | cc by 4.0 | 2b08a8f62a0c53ec3f13020d0ca1e2d3 | Chemotactic motility-induced phase separation | Collectives of actively-moving particles can spontaneously separate into dilute and dense phases -- a fascinating phenomenon known as motility-induced phase separation (MIPS). MIPS is well-studied for randomly-moving particles with no directional bias. However, many forms of active matter exhibit collective chemotaxis,... |
| P16-1174 | https://aclanthology.org/P16-1174 | acl anthology | 2016 | acl license | 5e6524f4b776a121f6325eb15a00bd7f | A Trainable Spaced Repetition Model for Language Learning | |
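Each row in the table above is one record with the fields `id`, `url`, `source`, `year`, `license_type`, `hash`, `title`, and `abs`. A minimal sketch of working with these records as plain Python dicts (field names taken from the table header; only the metadata columns are reproduced here, and this stands in for however the full dataset is actually loaded):

```python
from collections import Counter

# The five preview records from the table above, reduced to their
# metadata fields for brevity (abs and title omitted).
records = [
    {"id": "1809.09600", "source": "arxiv", "year": "2018", "license_type": "cc by 4.0"},
    {"id": "2009.07758", "source": "arxiv", "year": "2020", "license_type": "cc by 4.0"},
    {"id": "N19-1423", "source": "acl anthology", "year": "2019", "license_type": "acl license"},
    {"id": "2301.12345", "source": "arxiv", "year": "2023", "license_type": "cc by 4.0"},
    {"id": "P16-1174", "source": "acl anthology", "year": "2016", "license_type": "acl license"},
]

# Count records per source — the preview shows exactly two source values.
by_source = Counter(r["source"] for r in records)
print(by_source)  # Counter({'arxiv': 3, 'acl anthology': 2})

# Filter to one source, e.g. the ACL Anthology papers.
acl_ids = [r["id"] for r in records if r["source"] == "acl anthology"]
print(acl_ids)  # ['N19-1423', 'P16-1174']
```

The `source` and `license_type` columns each take only two distinct values in this preview, so they behave like categorical fields; `id` follows arXiv numbering for arXiv records and Anthology IDs for ACL records.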