# GoldenGidget-20B

Interesting results, still testing.

*(Gidget image)*

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the SLERP merge method.
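
SLERP (spherical linear interpolation) blends two models along the arc between their weight tensors on the hypersphere rather than along a straight line, which avoids the norm shrinkage that plain linear averaging causes between dissimilar tensors. A minimal NumPy sketch of the idea (an illustration only, not mergekit's actual implementation):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between tensors v0 and v1 at factor t."""
    a = v0 / (np.linalg.norm(v0) + eps)  # unit-norm copies, used only
    b = v1 / (np.linalg.norm(v1) + eps)  # to measure the angle between tensors
    dot = float(np.clip(np.sum(a * b), -1.0, 1.0))
    if abs(dot) > 0.9995:
        # Nearly parallel tensors: slerp degenerates to lerp and sin(theta)
        # approaches zero, so fall back to linear interpolation for stability.
        return (1.0 - t) * v0 + t * v1
    theta = np.arccos(dot)
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

At t = 0 the result is one model's tensor unchanged; at t = 1 it is the other's.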

### Models Merged

The following models were included in the merge:

* [Undi95/PsyMedRP-v1-20B](https://huggingface.co/Undi95/PsyMedRP-v1-20B)
* [Undi95/MXLewd-L2-20B](https://huggingface.co/Undi95/MXLewd-L2-20B) (base model)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: slerp
dtype: bfloat16

slices:
  - sources:
      - model: Undi95/PsyMedRP-v1-20B
        layer_range: [0, 62]
      - model: Undi95/MXLewd-L2-20B
        layer_range: [0, 62]
    base_model: Undi95/MXLewd-L2-20B
    parameters:
      t:
        - 0.5
        - 0.72
        - 0.69
        - 0.47
        - 0.28
        - 0.31
        - 0.53
        - 0.72
        - 0.68
        - 0.45
        - 0.28
        - 0.33
        - 0.56
        - 0.72
        - 0.66
        - 0.43
        - 0.27
        - 0.34
        - 0.59
        - 0.71
        - 0.64
        - 0.41
        - 0.27
        - 0.36
        - 0.62
        - 0.70
        - 0.61
        - 0.39
        - 0.27
        - 0.38
        - 0.65
        - 0.68
        - 0.59
        - 0.37
        - 0.27
        - 0.41
        - 0.67
        - 0.66
        - 0.56
        - 0.35
        - 0.27
        - 0.44
        - 0.69
        - 0.63
        - 0.54
        - 0.33
        - 0.28
        - 0.47
        - 0.70
        - 0.60
        - 0.51
        - 0.31
        - 0.29
        - 0.50
        - 0.71
        - 0.57
        - 0.49
        - 0.30
        - 0.30
        - 0.53
        - 0.71
        - 0.54
        - 0.46
        - 0.29
        - 0.31
```
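
The list of values under `t` defines a per-layer gradient: mergekit spaces the anchor values evenly across the slice and interpolates between them, so each of the 62 layers gets its own SLERP factor (values near 0 keep a layer close to the base model Undi95/MXLewd-L2-20B, values near 1 pull it toward Undi95/PsyMedRP-v1-20B). A rough sketch of that mapping (an approximation for illustration, not mergekit's code):

```python
import numpy as np

# First few anchors from the config above, shortened for brevity.
anchors = [0.5, 0.72, 0.69, 0.47, 0.28, 0.31]
num_layers = 62  # matches layer_range [0, 62]

anchor_pos = np.linspace(0.0, 1.0, len(anchors))  # even spacing of anchors
layer_pos = np.linspace(0.0, 1.0, num_layers)     # relative depth of each layer
per_layer_t = np.interp(layer_pos, anchor_pos, anchors)
print(per_layer_t.round(2))  # one interpolation factor per layer
```

To reproduce the merge, save the YAML above as `config.yml` and run `mergekit-yaml config.yml ./output-model-directory` (assuming mergekit is installed).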
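
The merged weights load like any other causal LM with Hugging Face transformers. A minimal usage sketch, using the model ID from this page:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Elfrino/GoldenGidget-20B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype: bfloat16
    device_map="auto",           # requires the accelerate package
)

inputs = tokenizer("Hello,", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```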