ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2 (7k steps).

Downloads last month: 288
Model size: 7B params (Safetensors)
Tensor type: F32

Model tree for ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2

Quantizations: 1 model