---
base_model:
  - Sao10K/L3-8B-Stheno-v3.2
  - NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS as the base model.
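
For intuition, the sketch below shows what a DARE TIES merge does for a single weight tensor: each fine-tuned model's delta from the base is randomly sparsified and rescaled (DARE), then a per-parameter sign is elected and only agreeing contributions are summed back onto the base (TIES). This is an illustrative simplification, not mergekit's actual implementation; the function name and arguments are hypothetical.

```python
import torch

def dare_ties_merge(base, finetuned, densities, weights, seed=0):
    """Illustrative single-tensor DARE TIES merge (not mergekit's exact code).

    base:      base model tensor
    finetuned: list of tensors from the models being merged
    densities: fraction of each delta to keep (DARE drop rate = 1 - density)
    weights:   per-model blend weights
    """
    torch.manual_seed(seed)
    deltas = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base                         # task vector vs. the base model
        keep = torch.rand_like(delta) < density   # DARE: random keep mask
        delta = delta * keep / density            # rescale the surviving entries
        deltas.append(weight * delta)

    stacked = torch.stack(deltas)
    # TIES: elect a per-parameter sign from the summed deltas,
    # then keep only the contributions that agree with it.
    sign = torch.sign(stacked.sum(dim=0))
    agree = torch.sign(stacked) == sign
    merged_delta = torch.where(agree, stacked, torch.zeros_like(stacked)).sum(dim=0)
    return base + merged_delta
```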

### Models Merged

The following models were included in the merge:

* Sao10K/L3-8B-Stheno-v3.2

### Configuration

The following YAML configuration was used to produce this model:


```yaml
slices:
- sources:
  - layer_range: [0, 16]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.5
      weight: 1.0
  - layer_range: [0, 16]
    model: Sao10K/L3-8B-Stheno-v3.2
    parameters:
      density: 0.5
      weight: 0.9
- sources:
  - layer_range: [16, 24]
    model: Sao10K/L3-8B-Stheno-v3.2
    parameters:
      density: 0.75
      weight: 0.5
  - layer_range: [16, 24]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.25
      weight: 0.5
- sources:
  - layer_range: [24, 32]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.5
      weight: 0.5
  - layer_range: [24, 32]
    model: Sao10K/L3-8B-Stheno-v3.2
    parameters:
      density: 0.5
      weight: 1.0
merge_method: dare_ties
base_model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
parameters:
  int8_mask: true
dtype: bfloat16
```
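
The merged model can be loaded like any other Llama-3-based checkpoint via transformers. The snippet below is a minimal sketch: the repository id `Alsebay/L3-8B-SMaid-v0.1` is assumed from this model card, and `device_map="auto"` additionally requires the accelerate package.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id for this merge; adjust if the model lives elsewhere.
model_id = "Alsebay/L3-8B-SMaid-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short greeting."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```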