What this fixes

transformers 5+ expects PreTrainedModel.post_init() to run so that internal fields such as all_tied_weights_keys exist. This repo's remote modeling_norbert.py never called post_init(), which caused AttributeError: 'NorbertForTokenClassification' object has no attribute 'all_tied_weights_keys' when loading with trust_remote_code=True (e.g. in Docker builds).

What changed

  • Call self.post_init() after the base NorbertModel modules are built.
  • Call self.post_init() after the task heads (Classifier) are added, and likewise in NorbertForMaskedLM, matching the pattern used in ltg/norbert3-large.
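The pattern above can be sketched as follows. This is a simplified, hypothetical stand-in for the repo's actual modeling code, not the real NorbertModel implementation: the class bodies and config fields (vocab_size, hidden_size, num_labels) are illustrative, and only the placement of the post_init() calls mirrors the change.

```python
import torch.nn as nn
from transformers import PretrainedConfig, PreTrainedModel


class NorbertConfig(PretrainedConfig):
    model_type = "norbert"


class NorbertModel(PreTrainedModel):
    config_class = NorbertConfig

    def __init__(self, config):
        super().__init__(config)
        self.embedding = nn.Embedding(config.vocab_size, config.hidden_size)
        # post_init() runs weight init and the internal bookkeeping that
        # transformers 5+ relies on (e.g. tied-weight tracking); skipping it
        # is what triggered the AttributeError during from_pretrained().
        self.post_init()


class NorbertForTokenClassification(PreTrainedModel):
    config_class = NorbertConfig

    def __init__(self, config):
        super().__init__(config)
        self.model = NorbertModel(config)
        self.classifier = nn.Linear(config.hidden_size, config.num_labels)
        # Called again after the task head is attached, so the head's
        # weights are initialized and weight tying is recorded as well.
        self.post_init()
```

With both calls in place, instantiating the task model no longer depends on transformers' backward-compatibility paths to populate those internal fields.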

Notes

  • No weight or config.json changes — Python-only.
  • Relates to CVE-2026-1839 only in the sense that addressing it requires transformers 5+; this change unblocks loading the model with current transformers releases.
Marentius changed pull request status to open
Kushtrim changed pull request status to merged