# payelb/UltraFeedback_openbmb_TinyLlama-1.1B_aligned_with_baseline_RM

Base model: TinyLlama/TinyLlama-1.1B-Chat-v1.0

Alignment dataset: openbmb/UltraFeedback

Reward model: payelb/UltraFeedback_openbmb_roberta-base_1k_fixed_baseline

Method: PPO alignment with LoRA adapters.
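With LoRA, the base model's weights stay frozen and each adapted linear layer learns a low-rank update: y = Wx + (alpha / r) · B(Ax), where A (r × d_in) and B (d_out × r) are the only trained matrices. A minimal sketch of that forward pass, with toy shapes and values chosen purely for illustration (nothing here is taken from this model's actual adapter config):

```python
def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def lora_forward(x, W, A, B, alpha, r):
    """Frozen base projection W plus scaled low-rank update (alpha/r) * B(Ax)."""
    base = matvec(W, x)
    delta = matvec(B, matvec(A, x))
    scale = alpha / r
    return [b + scale * d for b, d in zip(base, delta)]

# Toy example: d_in = d_out = 2, rank r = 1 (assumed values, not this model's)
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weight
A = [[0.5, 0.5]]               # r x d_in, trainable
B = [[1.0], [-1.0]]            # d_out x r, trainable
print(lora_forward([2.0, 4.0], W, A, B, alpha=2, r=1))  # → [8.0, -2.0]
```

In practice a library such as `peft` wraps the model's attention/MLP projections this way; only A and B receive gradients during PPO.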

Notes:
- Reward scores are normalized and clipped before the PPO update
- KL control against the reference model is enabled
- pad_token_id and eos_token_id are set explicitly
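The first two notes can be sketched as a single reward-shaping step: whiten the raw reward-model scores over the batch, clip the result, then subtract a KL penalty so the policy stays close to the reference model. The coefficient and clip range below are illustrative assumptions, not values from this card:

```python
import statistics

def shape_rewards(raw_rewards, kl_divs, kl_coef=0.1, clip=5.0):
    """Normalize (whiten) raw reward scores, clip them, then apply a KL
    penalty. kl_coef and clip are assumed example values."""
    mean = statistics.fmean(raw_rewards)
    std = statistics.pstdev(raw_rewards) or 1.0  # avoid divide-by-zero
    shaped = []
    for r, kl in zip(raw_rewards, kl_divs):
        z = (r - mean) / std                 # normalization
        z = max(-clip, min(clip, z))         # clipping
        shaped.append(z - kl_coef * kl)      # KL control penalty
    return shaped

print(shape_rewards([0.2, 1.5, -0.7, 3.0], [0.1, 0.3, 0.2, 0.5]))
```

Libraries like `trl` fold an equivalent per-token KL penalty into the PPO reward; this standalone version just makes the arithmetic visible.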