---
license: apache-2.0
base_model:
- OpenPipe/mistral-ft-optimized-1218
- mlabonne/NeuralHermes-2.5-Mistral-7B
tags:
- merge
- mergekit
- lazymergekit
- OpenPipe/mistral-ft-optimized-1218
- mlabonne/NeuralHermes-2.5-Mistral-7B
---

# Marcoro14-7B-slerp

Marcoro14-7B-slerp is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co./OpenPipe/mistral-ft-optimized-1218)
* [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co./mlabonne/NeuralHermes-2.5-Mistral-7B)

## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: OpenPipe/mistral-ft-optimized-1218
        layer_range: [0, 32]
      - model: mlabonne/NeuralHermes-2.5-Mistral-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: OpenPipe/mistral-ft-optimized-1218
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```

The interpolation factor `t` is graded across the 32 layers (t = 0 keeps the base model, t = 1 takes NeuralHermes): self-attention tensors blend increasingly toward NeuralHermes in deeper layers, MLP tensors follow the opposite gradient, and all remaining tensors use a constant t = 0.5.
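The merge can be reproduced with mergekit's Python API. The following is a minimal sketch, not this model's exact build script: it assumes the configuration above has been saved to a local `config.yaml`, that `mergekit` is installed (`pip install mergekit`), and the output directory is illustrative.

```python
# Minimal sketch of re-running this merge via mergekit's Python API.
# Assumes the YAML configuration above is saved as ./config.yaml;
# the output directory name is illustrative.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./Marcoro14-7B-slerp",  # where the merged weights are written
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer to the output
    ),
)
```

Equivalently, mergekit's `mergekit-yaml` command-line entry point runs the same merge directly from the YAML file.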
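## 💻 Usage

A typical way to query the merged model is through the 🤗 Transformers `pipeline` API. The sketch below assumes the model is published as `mlabonne/Marcoro14-7B-slerp` (matching this card), a GPU with enough memory, and that `transformers` and `accelerate` are installed; the sampling parameters are illustrative defaults, not tuned values.

```python
# Minimal inference sketch with 🤗 Transformers; sampling values are illustrative.
import torch
import transformers
from transformers import AutoTokenizer

model = "mlabonne/Marcoro14-7B-slerp"

tokenizer = AutoTokenizer.from_pretrained(model)
# Format a single-turn conversation; assumes the tokenizer ships a chat template
# (inherited from the base model's tokenizer).
messages = [{"role": "user", "content": "What is a large language model?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",  # requires `accelerate`
)

outputs = pipeline(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```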