# Learnings

## Integrations

- Keep the `training` optional dependency pinned to `trl>=0.14.0,<0.15.0` while the core runtime uses `torch==2.2.2`, because newer TRL releases can break import-time compatibility (for example, `FSDPModule` import failures). *(F006)*
- Rollout generation must handle both callable HF tokenizers and non-callable notebook/test tokenizers by branching the tokenization flow and falling back to shape-agnostic decode extraction from `model.generate` outputs. *(F015)*
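A sketch of how the pin for *(F006)* might look in a `pyproject.toml`; the file layout and the `training` extra name are assumptions, only the version specifiers come from the note above:

```toml
[project]
dependencies = [
    "torch==2.2.2",  # core runtime pin
]

[project.optional-dependencies]
training = [
    # Cap below 0.15: newer TRL releases have failed at import time
    # against this torch pin (e.g. FSDPModule import errors).
    "trl>=0.14.0,<0.15.0",
]
```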
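The tokenizer-branching note *(F015)* can be sketched as below. This assumes only that callable (HF-style) tokenizers return a mapping containing `input_ids`, and that non-callable test/notebook tokenizers expose `encode`/`decode`; the helper names are hypothetical:

```python
def tokenize_prompt(tokenizer, prompt):
    """Branch between a callable HF-style tokenizer and a plain object
    that only exposes .encode (e.g. a notebook/test double)."""
    if callable(tokenizer):
        # HF tokenizers are callable and return a dict-like with "input_ids"
        return tokenizer(prompt)["input_ids"]
    # Fallback path for non-callable tokenizers
    return tokenizer.encode(prompt)


def extract_text(tokenizer, generated):
    """Shape-agnostic decode of a model.generate-style output: accept
    either a flat sequence of token ids or a batched [batch, seq] nesting."""
    if generated and isinstance(generated[0], (list, tuple)):
        generated = generated[0]  # take the first sequence in the batch
    return tokenizer.decode(list(generated))
```

The same two helpers serve both environments: real HF tokenizers take the callable branch, while lightweight test doubles fall through to `encode`, and `extract_text` does not care whether generation returned a batched or flat id sequence.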