GPT 5 Codex
This model is still in development.
For the most reliable performance, use the following sampling parameters:

- temperature: 1
- top_k: 40
- min_p: 0.00
- top_p: 1.00
- repeat_penalty: 1.0 (off)

This gpt_oss model was trained 2x faster with Unsloth and Hugging Face's TRL library.
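As a minimal sketch, the recommended sampling parameters above can be collected into a dictionary and passed to a Hugging Face `transformers` `model.generate` call. The parameter names below follow the `transformers` generate API (where the repeat penalty is spelled `repetition_penalty`); the usage line is illustrative and assumes a model and tokenized inputs are already loaded.

```python
# Recommended sampling parameters for this model, keyed by the names the
# transformers generate API expects.
sampling_params = {
    "do_sample": True,          # enable sampling so temperature/top_k/top_p apply
    "temperature": 1.0,
    "top_k": 40,
    "min_p": 0.0,
    "top_p": 1.0,
    "repetition_penalty": 1.0,  # 1.0 disables the penalty ("off")
}

# Illustrative usage, assuming `model` and `inputs` already exist:
# outputs = model.generate(**inputs, **sampling_params, max_new_tokens=256)
```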
Precisions: 8-bit, 16-bit
Base model: openai/gpt-oss-20b