This is a merge of pre-trained language models created using mergekit.
This model was merged using the TIES merge method (arXiv:2306.01708, "Resolving Interference When Merging Models"), with WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B as the base model.
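For intuition, the sketch below illustrates the three TIES steps for a single weight tensor: trim each task vector to its highest-magnitude entries, elect a per-parameter sign, then average only the deltas that agree with it. This is a simplified illustration, not mergekit's implementation; the `density` and `weights` arguments mirror the per-model fields in the configuration further down.

```python
# Minimal, illustrative sketch of TIES merging for one weight tensor.
# NOT mergekit's implementation; `density` and `weights` mirror the
# per-model fields in the YAML configuration below.
import torch

def ties_merge(base, tuned, density=1.0, weights=None):
    weights = weights or [1.0] * len(tuned)
    deltas = []
    for t in tuned:
        delta = t - base  # task vector: fine-tuned weights minus base
        k = int(density * delta.numel())
        if k < delta.numel():  # trim: keep the top-k entries by magnitude
            if k == 0:
                delta = torch.zeros_like(delta)
            else:
                cutoff = delta.abs().flatten().kthvalue(delta.numel() - k).values
                delta = torch.where(delta.abs() > cutoff, delta,
                                    torch.zeros_like(delta))
        deltas.append(delta)
    stacked = torch.stack([w * d for w, d in zip(weights, deltas)])
    elected = stacked.sum(dim=0).sign()   # elect a sign per parameter
    mask = stacked.sign() == elected      # keep only agreeing deltas
    merged = (stacked * mask).sum(dim=0) / mask.sum(dim=0).clamp(min=1)
    return base + merged
```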
The following models were included in the merge:

- Locutusque/StockQwen-2.5-7B
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Locutusque/StockQwen-2.5-7B
    parameters:
      density: 1
      weight: 1
  - model: WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B
    parameters:
      density: 1
      weight: 1
merge_method: ties
base_model: WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
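To reproduce the merge, the configuration above can be saved to a file (e.g. `config.yml`) and passed to mergekit's `mergekit-yaml` command. Once merged, the weights load like any other causal language model; the snippet below is a minimal usage sketch, where the repo id is a hypothetical placeholder, not this model's actual location.

```python
# Minimal usage sketch; "your-username/merged-model" is a hypothetical
# placeholder for wherever the merged weights are actually stored.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/merged-model"  # placeholder, not a real repo
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```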