| --- |
| license: other |
| language: |
| - en |
| --- |
The Llama 2 sequel to my original experiment with gradient merges, built using [the following script](https://github.com/Gryphe/BlockMerge_Gradient). The three source models ([Hermes](https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b), [Chronos](https://huggingface.co/elinas/chronos-13b-v2) and [Airoboros](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0)) are divided almost evenly over the layer structure this time. Airoboros served as the "wildcard model" thanks to its superior ability to understand complex instructions.
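To illustrate the general idea behind a gradient merge (not the exact BlockMerge_Gradient implementation): each layer's weights are a linear blend of the source models, with blend ratios that shift gradually across the layer stack. The function below is a hypothetical toy sketch using plain Python lists in place of real model tensors.

```python
# Toy sketch of a per-layer gradient merge. NOT the actual BlockMerge_Gradient
# script: real merges operate on model weight tensors, not lists of floats.

def gradient_merge(models, ratios_per_layer):
    """Blend layer weights from several models.

    models: list of models, each a list of layers, each layer a list of floats.
    ratios_per_layer: for each layer index, one blend ratio per model,
    summing to 1.0.
    """
    num_layers = len(models[0])
    merged = []
    for layer_idx in range(num_layers):
        ratios = ratios_per_layer[layer_idx]
        layer = [
            sum(r * model[layer_idx][i] for r, model in zip(ratios, models))
            for i in range(len(models[0][layer_idx]))
        ]
        merged.append(layer)
    return merged

# Two "models" with two single-weight layers each. Layer 0 leans toward
# model A, layer 1 toward model B - a simple two-step gradient.
model_a = [[1.0], [1.0]]
model_b = [[0.0], [0.0]]
ratios = [[0.75, 0.25], [0.25, 0.75]]
print(gradient_merge([model_a, model_b], ratios))  # [[0.75], [0.25]]
```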
|
|
| Quantized models are available from TheBloke: [GGML](https://huggingface.co/TheBloke/MythoLogic-13B-GGML) - [GPTQ](https://huggingface.co/TheBloke/MythoLogic-13B-GPTQ) (You're the best!) |
|
|
| ## Model details |
|
|
| As always, the main objective was to create an all-round model with improved roleplaying capabilities. MythoLogic-L2 differs from its predecessor in that it focuses primarily on the understanding of instructions and personalities of complex character cards. |
|
|
Illustrated below are the gradients used for this L2 recipe:
|
|
![](approximation.png)
|
|
| ## Prompt Format |
|
|
This model primarily uses (and was tested with) Alpaca formatting, so for optimal model performance, use:
| ``` |
| ### Instruction: |
| Your instruction or question here. |
| ### Response: |
| ``` |
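As a convenience, a prompt in this format can be assembled with a small helper like the one below; the function name is my own, not part of any library.

```python
def alpaca_prompt(instruction: str) -> str:
    """Wrap an instruction in the Alpaca format this model expects."""
    return f"### Instruction:\n{instruction}\n### Response:\n"

prompt = alpaca_prompt("Summarize the plot of Hamlet in one sentence.")
print(prompt)
```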
|
|
| For chat/roleplay conversations, I suggest the following structure instead: |
| ``` |
| You are <AI NAME>, a friendly and supportive AI language model. We discuss all subjects freely and without bias or restrictions. |
| I am <YOUR NAME>, the user interacting with you through a chat conversation. Start with greeting me. |
| |
| ### Instruction: |
| Write <AI NAME>'s next reply in a chat between <YOUR NAME> and <AI NAME>. Write a single reply only. |
| |
| ### Chat History: |
| <AI NAME>: Good day, <YOUR NAME>! How can I assist you today? |
| (Etc, etc) |
| |
| ### Response: |
| <AI NAME>: |
| ``` |
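The chat structure above can likewise be generated programmatically. A hypothetical sketch (names and helper are my own, not an established API):

```python
def chat_prompt(ai_name, user_name, history):
    """Assemble a roleplay prompt in the structure shown above.

    history: prior chat lines, each already prefixed with the speaker's name.
    """
    system = (
        f"You are {ai_name}, a friendly and supportive AI language model. "
        "We discuss all subjects freely and without bias or restrictions.\n"
        f"I am {user_name}, the user interacting with you through a chat "
        "conversation. Start with greeting me.\n"
    )
    instruction = (
        f"### Instruction:\nWrite {ai_name}'s next reply in a chat between "
        f"{user_name} and {ai_name}. Write a single reply only.\n"
    )
    chat = "### Chat History:\n" + "\n".join(history) + "\n"
    return f"{system}\n{instruction}\n{chat}\n### Response:\n{ai_name}:"

print(chat_prompt(
    "Aria", "Sam",
    ["Aria: Good day, Sam! How can I assist you today?"],
))
```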
|
|