# Custom dependencies for the RNNLM (creative-help) Inference Endpoint.
# The base stack (torch, transformers) is provided by the Inference Endpoints container.

# The RNNLM tokenizer uses spaCy for tokenization and entity extraction.
# Pinned to 3.7.x to stay compatible with the en_core_web_sm 3.7.0 wheel below.
spacy>=3.7,<3.8

# English spaCy model - required for RNNLMTokenizer (entity recognition, tokenization).
# Installed directly from the GitHub release wheel, since pip cannot run
# `python -m spacy download en_core_web_sm` inside the container.
https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.7.0/en_core_web_sm-3.7.0-py3-none-any.whl

# NumPy (used by tokenization_utils).
numpy