stage 3:

```json
{
  "n": 256,
  "fmt_strict_rate": 0.00390625,
  "fmt_loose_rate": 0.25390625,
  "fmt_bad_rate": 0.7421875,
  "comp_tok_mean": 168.36328125,
  "comp_tok_min": 7,
  "comp_tok_max": 1142,
  "len_score_mean": 0.6488464355468752,
  "answer_exact_rate": 0.0,
  "answer_contain_rate": 0.1875
}
```
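A quick sanity check on these numbers (an illustrative sketch, not part of the evaluation code): with `n = 256`, each reported rate is an integer count divided by 256, and the three format buckets together cover every sample.

```python
# Hypothetical illustration: each rate times n = 256 recovers an integer
# count of completions, and the three format buckets partition the samples.
n = 256
rates = {
    "fmt_strict_rate": 0.00390625,  # 1 completion matched the strict format
    "fmt_loose_rate": 0.25390625,   # 65 matched only a looser pattern
    "fmt_bad_rate": 0.7421875,      # 190 were malformed
}

counts = {k: round(v * n) for k, v in rates.items()}
print(counts)  # {'fmt_strict_rate': 1, 'fmt_loose_rate': 65, 'fmt_bad_rate': 190}

# The three format buckets account for all 256 samples.
assert sum(counts.values()) == n
```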
Model: Atomheart-Father/llama_ppo_stage_0