Context length for compression
#2
by Money123001 - opened
What is the maximum context length it can compress at a time?
Thanks for the question. It's the same as the base model xlm-roberta-large, i.e., 512 tokens.
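Since the 512-token limit applies per forward pass, a longer input would need to be split into chunks of at most 512 tokens before compression. A minimal sketch of that splitting step (the whitespace split here is just a stand-in for the real xlm-roberta-large subword tokenizer, and `chunk_tokens` is a hypothetical helper, not part of any library):

```python
def chunk_tokens(tokens, max_len=512):
    # Split a token list into consecutive windows that each
    # fit within the model's 512-token context limit.
    return [tokens[i:i + max_len] for i in range(0, len(tokens), max_len)]

# Whitespace tokens stand in for real subword tokens (illustrative only).
tokens = ("word " * 1200).split()
chunks = chunk_tokens(tokens)
print(len(chunks), [len(c) for c in chunks])  # → 3 [512, 512, 176]
```

Each chunk can then be compressed independently and the outputs concatenated. Note that subword tokenization typically produces more tokens than whitespace splitting, so in practice the chunk boundaries should be computed with the model's own tokenizer.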
qianhuiwu changed discussion status to closed