ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51

Running the SFT model with PPO for 51 steps.
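Since the card itself gives no usage instructions, here is a minimal, hedged sketch of loading the checkpoint with the standard `transformers` API. The `load` helper is illustrative (not part of this repo), and it assumes the repository hosts an ordinary causal-LM checkpoint, as the safetensors metadata (7B params, F32) suggests.

```python
# Repo id as published on the Hugging Face Hub.
REPO_ID = "ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51"

def load(repo_id: str = REPO_ID):
    """Illustrative helper: fetch tokenizer and model for this checkpoint.

    Assumption: the checkpoint loads with the generic Auto* classes.
    Imported lazily so this module can be inspected without
    `transformers` installed (the download itself is ~28 GB in F32).
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```

For a 7B model stored in F32, loading with `torch_dtype="auto"` (or casting to half precision) is a common way to cut memory use roughly in half on GPU.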

Downloads last month: 306
Safetensors
Model size: 7B params
Tensor type: F32
Inference Providers
This model isn't deployed by any Inference Provider.

Model tree for ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51

Quantizations
1 model