Statement_Equivalence / added_tokens.json
Commit: Upload tokenizer (e12fea0) by MattStammers
{
  "[CLS]": 101,
  "[MASK]": 103,
  "[PAD]": 0,
  "[SEP]": 102,
  "[UNK]": 100
}
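The map above pairs BERT-style special tokens with their vocabulary IDs. As a minimal sketch (the inline JSON mirrors the file contents; the `id_to_token` helper is illustrative, not part of the repo), it can be loaded and inverted for ID-to-token lookups like so:

```python
import json

# The added_tokens.json contents: special tokens mapped to vocabulary IDs.
added_tokens = json.loads("""
{
  "[CLS]": 101,
  "[MASK]": 103,
  "[PAD]": 0,
  "[SEP]": 102,
  "[UNK]": 100
}
""")

# Invert the map to recover a token from its ID, e.g. when inspecting
# raw model inputs or outputs.
id_to_token = {v: k for k, v in added_tokens.items()}

print(id_to_token[101])  # [CLS]
print(added_tokens["[PAD]"])  # 0
```

Note that `[PAD]` sits at ID 0, the conventional padding slot, while the remaining special tokens occupy IDs 100-103, matching the standard BERT vocabulary layout.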