I assume the model isn't compatible with the Transformers library, hence this PR removes the library name tag.
Hi @nielsr. We implemented ReT as a subclass of PreTrainedModel, and it supports several PreTrainedModels available on the Hub. That's why we added the Transformers library tag.
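For context, a custom model becomes compatible with the Transformers ecosystem by subclassing `PreTrainedModel` with a matching `PretrainedConfig`. The sketch below is only illustrative: the `ReTConfig`/`ReTModel` class names, the `"ret"` model type, and the `hidden_size` field are assumptions, not the actual ReT implementation.

```python
# Minimal sketch of a custom transformers PreTrainedModel subclass.
# Class names, model_type, and config fields are illustrative assumptions.
import torch
from torch import nn
from transformers import PretrainedConfig, PreTrainedModel

class ReTConfig(PretrainedConfig):
    model_type = "ret"  # hypothetical model_type identifier

    def __init__(self, hidden_size=64, **kwargs):
        self.hidden_size = hidden_size
        super().__init__(**kwargs)

class ReTModel(PreTrainedModel):
    config_class = ReTConfig

    def __init__(self, config):
        super().__init__(config)
        # Toy layer standing in for the real architecture
        self.proj = nn.Linear(config.hidden_size, config.hidden_size)

    def forward(self, inputs):
        return self.proj(inputs)

config = ReTConfig(hidden_size=8)
model = ReTModel(config)
out = model(torch.randn(2, 8))
print(out.shape)  # torch.Size([2, 8])
```

A subclass structured this way inherits `save_pretrained`/`from_pretrained`, which is what makes the `library_name: transformers` tag meaningful on the Hub.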