---
base_model:
- Sorawiz/MistralCreative-24B-Chat
- Gryphe/Pantheon-RP-1.8-24b-Small-3.1
- ReadyArt/Forgotten-Abomination-24B-v4.0
- ReadyArt/Forgotten-Transgression-24B-v4.1
- ReadyArt/Gaslight-24B-v1.0
- ReadyArt/The-Omega-Directive-M-24B-v1.0
- anthracite-core/Mistral-Small-3.1-24B-Instruct-2503-HF
library_name: transformers
tags:
- mergekit
- merge
---

# Chat Template

Mistral Instruct

```
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ .Response }}<|im_end|>
```

ChatML

```
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ .Response }}{{ if .Response }}<|im_end|>{{ end }}
```

# GGUF

Thank you, [mradermacher](https://huggingface.co/mradermacher), for creating the GGUF versions of this model.

* Static quants - [mradermacher/MistralCreative-24B-Instruct-GGUF](https://huggingface.co/mradermacher/MistralCreative-24B-Instruct-GGUF)
* Imatrix quants - [mradermacher/MistralCreative-24B-Instruct-i1-GGUF](https://huggingface.co/mradermacher/MistralCreative-24B-Instruct-i1-GGUF)

# Merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [anthracite-core/Mistral-Small-3.1-24B-Instruct-2503-HF](https://huggingface.co/anthracite-core/Mistral-Small-3.1-24B-Instruct-2503-HF) as the base model.
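At a high level, TIES builds a task vector (fine-tune minus base) for each model, trims small-magnitude entries, elects a majority sign per parameter, and averages only the entries that agree with that sign. The sketch below illustrates the idea on plain Python lists; the function name and structure are illustrative, not mergekit's actual implementation.

```python
def ties_merge(base, finetuned, density=1.0):
    """Sketch of TIES merging for a single flat parameter vector.

    `base` and each entry of `finetuned` are equal-length lists of floats.
    Hypothetical helper, not mergekit's actual code.
    """
    # 1. Task vectors: difference between each fine-tune and the base.
    deltas = [[f - b for f, b in zip(ft, base)] for ft in finetuned]

    # 2. Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(density * len(d)))
        thresh = sorted(abs(x) for x in d)[-k]
        trimmed.append([x if abs(x) >= thresh else 0.0 for x in d])

    # 3. Elect a sign per parameter by summing the trimmed task vectors.
    def signum(s):
        return (s > 0) - (s < 0)

    sign = [signum(sum(d[i] for d in trimmed)) for i in range(len(base))]

    # 4. Disjoint merge: average only entries agreeing with the elected sign.
    merged = []
    for i in range(len(base)):
        vals = [d[i] for d in trimmed if d[i] * sign[i] > 0]
        merged.append(sum(vals) / len(vals) if vals else 0.0)

    # 5. Add the merged task vector back onto the base weights.
    return [b + m for b, m in zip(base, merged)]
```

With two task vectors of `[1, 2]` and `[3, -2]` over a zero base, the second parameter has conflicting signs that cancel, so only the first parameter survives the merge.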
### Models Merged

The following models were included in the merge:
* [ReadyArt/The-Omega-Directive-M-24B-v1.0](https://huggingface.co/ReadyArt/The-Omega-Directive-M-24B-v1.0)
* [Sorawiz/MistralCreative-24B-Test-U](https://huggingface.co/Sorawiz/MistralCreative-24B-Test-U)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
name: Sorawiz/MistralCreative-24B-Test-E
merge_method: dare_ties
base_model: Sorawiz/MistralCreative-24B-Chat
models:
  - model: Sorawiz/MistralCreative-24B-Chat
    parameters:
      weight: 0.20
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      weight: 0.20
  - model: ReadyArt/Forgotten-Transgression-24B-v4.1
    parameters:
      weight: 0.30
  - model: ReadyArt/Forgotten-Abomination-24B-v4.0
    parameters:
      weight: 0.30
parameters:
  density: 1
tokenizer:
  source: union
chat_template: auto
---
name: Sorawiz/MistralCreative-24B-Test-U
merge_method: dare_ties
base_model: Sorawiz/MistralCreative-24B-Test-E
models:
  - model: Sorawiz/MistralCreative-24B-Test-E
    parameters:
      weight: 0.3
  - model: ReadyArt/Gaslight-24B-v1.0
    parameters:
      weight: 0.5
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      weight: 0.2
parameters:
  density: 0.70
tokenizer:
  source: union
chat_template: auto
---
models:
  - model: anthracite-core/Mistral-Small-3.1-24B-Instruct-2503-HF
  - model: Sorawiz/MistralCreative-24B-Test-U
    parameters:
      density: 1.00
      weight: 1.00
  - model: ReadyArt/The-Omega-Directive-M-24B-v1.0
    parameters:
      density: 1.00
      weight: 1.00
merge_method: ties
base_model: anthracite-core/Mistral-Small-3.1-24B-Instruct-2503-HF
parameters:
  normalize: true
dtype: float32
```
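The `dare_ties` stages in the configuration combine TIES sign election with DARE sparsification: before merging, each task vector is randomly dropped at rate `1 - density` and the surviving entries are rescaled so the sparsified vector stays an unbiased estimate of the original. A minimal sketch of that DARE step (hypothetical helper, not mergekit's actual code):

```python
import random

def dare_sparsify(delta, density, seed=0):
    """DARE: randomly drop task-vector entries, rescale the survivors.

    Keeps each entry with probability `density` and divides it by `density`,
    so the expected value of every entry matches the original task vector.
    """
    rng = random.Random(seed)
    return [x / density if rng.random() < density else 0.0 for x in delta]
```

With `density: 1` (as in the first stage above) nothing is dropped and the task vector passes through unchanged; with `density: 0.70` (the second stage) roughly 30% of entries are zeroed and the rest are scaled up by 1/0.7.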