---
base_model:
- 152334H/miqu-1-70b-sf
library_name: transformers
tags:
- mergekit
- merge
---
# miqu-1-120b
This is a 120b frankenmerge of miqu-1-70b created by interleaving layers of miqu-1-70b-sf with itself using mergekit.
Inspired by Venus-120b-v1.2, MegaDolphin-120b, and goliath-120b.
## Prompt template: Mistral

```
<s>[INST] {prompt} [/INST]
```
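A minimal generation sketch with transformers (the repo id below is an assumption for illustration; adjust `torch_dtype` and `device_map` to your hardware):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "wolfram/miqu-1-120b"  # assumed repo id; replace with the actual one
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Mistral template: the tokenizer already prepends <s>, so only
# [INST] ... [/INST] needs to be written out.
prompt = "[INST] Explain the difference between a merge and a finetune. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```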
## Merge Details

### Merge Method
This model was merged using the passthrough merge method, which copies the selected layer ranges verbatim (no weight interpolation) and stacks them into a single deeper model.
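To make that concrete, here is a simplified sketch of the idea for a Llama-style state dict; `passthrough_merge` is a hypothetical helper, and the real mergekit additionally handles configs, tokenizers, and sharded checkpoints:

```python
def passthrough_merge(state_dict, slices):
    """Re-stack decoder-layer slices of one model into a deeper model."""
    merged = {}
    out_idx = 0
    for start, end in slices:
        for layer in range(start, end):
            src = f"model.layers.{layer}."
            dst = f"model.layers.{out_idx}."
            for key, tensor in state_dict.items():
                if key.startswith(src):
                    merged[dst + key[len(src):]] = tensor
            out_idx += 1
    # Embeddings, final norm, and lm_head pass through unchanged.
    for key, tensor in state_dict.items():
        if not key.startswith("model.layers."):
            merged[key] = tensor
    return merged
```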
### Models Merged

The following models were included in the merge:
- 152334H/miqu-1-70b-sf
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 20]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [10, 30]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [20, 40]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [30, 50]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [40, 60]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [50, 70]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [60, 80]
    model: 152334H/miqu-1-70b-sf
```
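The merge is reproducible by pointing mergekit's `mergekit-yaml` entry point at this file. As a quick sanity check on the slice arithmetic, the seven overlapping 20-layer slices of the 80-layer base model stack to 140 layers, which is what puts the result at roughly 120b parameters:

```python
# Back-of-the-envelope count for the config above (assumption: parameter
# count scales roughly linearly with the number of decoder layers).
slices = [(0, 20), (10, 30), (20, 40), (30, 50), (40, 60), (50, 70), (60, 80)]
total_layers = sum(end - start for start, end in slices)
print(total_layers)                    # 140 layers vs. 80 in miqu-1-70b
print(round(70 * total_layers / 80))   # ~122b, labeled "120b"
```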
## Credits & Special Thanks
- original model: mistralai (Mistral AI_)
- leaked model: miqudev/miqu-1-70b
- f16 model: 152334H/miqu-1-70b-sf
- mergekit: arcee-ai/mergekit: Tools for merging pretrained large language models.
- mergekit_config.yml: nsfwthrowitaway69/Venus-120b-v1.2