This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with meta-llama/Meta-Llama-3.1-8B as the base. TIES reduces interference between fine-tuned models by trimming low-magnitude parameter deltas (the fraction kept is set by `density`) and resolving sign conflicts across models before averaging (each model's contribution scaled by `weight`).
The following models were included in the merge:

- mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated
- meta-llama/Meta-Llama-3.1-8B-Instruct
The following YAML configuration was used to produce this model:
```yaml
base_model: meta-llama/Meta-Llama-3.1-8B
chat_template: auto
dtype: float16
merge_method: ties
models:
  - model: mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated
    parameters:
      density: 0.5
      weight: 0.5
  - model: meta-llama/Meta-Llama-3.1-8B-Instruct
    parameters:
      density: 0.5
      weight: 0.5
parameters:
  int8_mask: true
  normalize: false
```
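As a sketch of how a configuration like this can be executed, the snippet below uses mergekit's Python API (the `mergekit-yaml` CLI is the usual shortcut); the `config.yaml` path and `./merged-model` output directory are placeholders, not names from this card.

```python
# Sketch: run the TIES merge defined above via mergekit's Python API.
# Equivalent CLI: mergekit-yaml config.yaml ./merged-model
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown in this card (placeholder path).
with open("config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./merged-model",  # placeholder output directory
    options=MergeOptions(
        cuda=False,              # set True to run the merge on GPU
        copy_tokenizer=True,     # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```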
Detailed results from the Open LLM Leaderboard evaluation can be found here.
| Metric | Value (%) |
|---|---|
| Avg. | 20.77 |
| IFEval (0-shot) | 45.51 |
| BBH (3-shot) | 28.91 |
| MATH Lvl 5 (4-shot) | 11.63 |
| GPQA (0-shot) | 2.24 |
| MuSR (0-shot) | 6.59 |
| MMLU-PRO (5-shot) | 29.76 |
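For completeness, here is a minimal sketch of loading and querying the merged model with Hugging Face Transformers; `your-username/merged-model` is a hypothetical repo id, and `float16` matches the `dtype` in the merge configuration.

```python
# Minimal sketch: generate text with the merged model.
# "your-username/merged-model" is a placeholder; substitute the actual model id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/merged-model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # same precision as the merge's dtype
    device_map="auto",
)

prompt = "Explain the TIES merge method in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```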