Fara-TARS-7B / mergekit_config.yml
models:
  - model: microsoft/Fara-7B
  - model: ByteDance-Seed/UI-TARS-1.5-7B
merge_method: slerp
base_model: microsoft/Fara-7B
dtype: bfloat16
parameters:
  t:
    # 5-point gradient across layer depth:
    # 0.1 (start): mostly Fara -> ensures input understanding and English grammar.
    # 0.3 -> 0.5 (middle): blends in UI-TARS capability for reasoning and logic.
    # 0.1 (end): mostly Fara -> ensures the output stops correctly and doesn't loop.
    - value: [0.1, 0.3, 0.5, 0.3, 0.1]
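As a rough illustration of what this config asks mergekit to do (a minimal sketch, not mergekit's actual implementation), the snippet below shows spherical linear interpolation (slerp) between two weight tensors and how a per-layer `t` could be read off the 5-point gradient above. The function names `slerp` and `t_for_layer` and the layer count are hypothetical; mergekit's internals differ.

```python
# Hypothetical sketch: slerp merging with a 5-point t gradient.
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between flattened tensors v0 and v1.

    t = 0 returns v0 (the base model, Fara); t = 1 returns v1 (UI-TARS).
    """
    a = v0 / (np.linalg.norm(v0) + eps)
    b = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(a, b), -1.0, 1.0)
    omega = np.arccos(dot)        # angle between the two weight directions
    if omega < eps:               # nearly parallel: fall back to plain lerp
        return (1.0 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

def t_for_layer(layer, n_layers, anchors=(0.1, 0.3, 0.5, 0.3, 0.1)):
    """Linearly interpolate the 5-point gradient across layer depth."""
    xs = np.linspace(0.0, 1.0, len(anchors))
    return float(np.interp(layer / (n_layers - 1), xs, anchors))
```

With this sketch, the first and last layers get t = 0.1 (mostly Fara), while the middle layer gets t = 0.5 (an even blend), matching the gradient's stated intent.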