GPT-Neo-125M for TinyBrain

This repository packages EleutherAI/gpt-neo-125M for the TinyBrain runtime on Apple Watch.

TinyBrain bundle contract

The conversion script produces a single bundle archive:

  • gptneo125_only_logits.mlmodelc.zip

The zip layout is:

  • gptneo125_only_logits.mlmodelc/
  • tokenizer/
  • metadata.json

Runtime interface

  • Task: causal language modeling
  • Inputs:
    • input_ids (Int32, shape 1 x 16)
    • attention_mask (Int32, shape 1 x 16)
  • Output:
    • logits (next-token scores for each position, shape 1 x 16 x vocab size)
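
Because both inputs have a fixed 1 x 16 shape, callers must right-pad shorter prompts and truncate longer ones before invoking the model. A minimal sketch of that preparation (the helper name and the default pad_id of 0 are illustrative assumptions, not part of the bundle contract; use the pad/eos id from the bundled tokenizer):

```python
from typing import List, Tuple

SEQ_LEN = 16  # fixed input length required by the converted model

def prepare_inputs(token_ids: List[int], pad_id: int = 0) -> Tuple[List[int], List[int]]:
    """Pad or truncate token ids to SEQ_LEN and build the matching attention mask.

    pad_id=0 is an assumption for illustration; real callers should take the
    pad (or eos) id from tokenizer_config.json in the bundle.
    """
    ids = token_ids[:SEQ_LEN]          # truncate anything past 16 tokens
    mask = [1] * len(ids)              # 1 = real token
    pad = SEQ_LEN - len(ids)
    return ids + [pad_id] * pad, mask + [0] * pad  # 0 = padding
```

The two returned lists map directly onto the input_ids and attention_mask tensors described above.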

Files included in tokenizer/

  • tokenizer.json
  • vocab.json
  • merges.txt
  • special_tokens_map.json
  • tokenizer_config.json
  • token_decoder.json

Build

Run:

python3 convert_gptneo125_to_coreml.py

The script:

  1. Downloads the model and tokenizer from Hugging Face.
  2. Converts the model to Core ML.
  3. Compiles the package to .mlmodelc.
  4. Writes TinyBrain runtime metadata.
  5. Packages the compiled model, tokenizer, and metadata into one zip.
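
Step 5 can be sketched with the standard library; the staging layout mirrors the zip contents listed earlier (the helper and its paths are illustrative, not the conversion script's actual code):

```python
import json
import shutil
import tempfile
from pathlib import Path

def package_bundle(model_dir: Path, tokenizer_dir: Path,
                   metadata: dict, out_stem: str) -> Path:
    """Stage the compiled model, tokenizer files, and metadata, then zip them."""
    staging = Path(tempfile.mkdtemp())
    # Keep the compiled model's directory name (e.g. *.mlmodelc) at the zip root.
    shutil.copytree(model_dir, staging / model_dir.name)
    shutil.copytree(tokenizer_dir, staging / "tokenizer")
    (staging / "metadata.json").write_text(json.dumps(metadata, indent=2))
    # make_archive zips the contents of staging, yielding out_stem + ".zip".
    return Path(shutil.make_archive(out_stem, "zip", root_dir=staging))
```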