---
base_model: []
tags:
- mergekit
- merge

---
# 0x01-8x7B-hf

![grinning female android, cyberpunk, robotic, biomechanical, serial number "0x01"](https://files.catbox.moe/je2zar.png)

here we go again. multi-step merge, various models involved at various ratios with various methods. 

this thing came to me in a fever dream when I was hung over, but after slightly tweaking the recipe it turned out surprisingly decent. use it with the settings included below.

## Update: 
The following settings have proved to work well, too:
- Context: https://files.catbox.moe/q91rca.json
- Instruct: https://files.catbox.moe/2w8ja2.json
- Textgen: https://files.catbox.moe/s25rad.json

## Constituent parts
```yaml
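# note: "base+adapter" entries denote a LoRA adapter (right of the +) applied to the base model (left of the +)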
# primordial_slop_a:
  - model: mistralai/Mixtral-8x7B-v0.1+retrieval-bar/Mixtral-8x7B-v0.1_case-briefs
  - model: mistralai/Mixtral-8x7B-v0.1+SeanWu25/Mixtral_8x7b_Medicine
  - model: mistralai/Mixtral-8x7B-v0.1+SeanWu25/Mixtral_8x7b_WuKurtz
  - model: mistralai/Mixtral-8x7B-v0.1+Epiculous/crunchy-onion-lora
  - model: mistralai/Mixtral-8x7B-v0.1+maxkretchmer/gc-mixtral
# primordial_slop_b:
  - model: Envoid/Mixtral-Instruct-ITR-8x7B
  - model: crestf411/daybreak-mixtral-8x7b-v1.0-hf
  - model: NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
  - model: orangetin/OpenHermes-Mixtral-8x7B
  - model: mistralai/Mixtral-8x7B-Instruct-v0.1+idegroup/PhyAssistant
  - model: ycros/crunchy-onion-nx
  - model: jondurbin/bagel-dpo-8x7b-v0.2
  - model: amoldwalunj/Mixtral-8x7B-Instruct-v0.1-legal_finetune_mixtral_32k
# primordial_slop_c: a+b
# primordial_slop_d:
  - model: Sao10K/Sensualize-Mixtral-bf16
  - model: Envoid/Mixtral-Instruct-ITR-DADA-8x7B
```
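The exact methods and ratios for each step aren't reproduced here, but for orientation, a single intermediate step (e.g. `primordial_slop_a`) would look something like the mergekit config below. The `dare_ties` method and the `weight`/`density` values are placeholders for illustration, not the actual recipe.

```yaml
# hypothetical single-step config: method, weights and densities are placeholders
merge_method: dare_ties
base_model: mistralai/Mixtral-8x7B-v0.1
models:
  - model: mistralai/Mixtral-8x7B-v0.1+retrieval-bar/Mixtral-8x7B-v0.1_case-briefs
    parameters:
      weight: 0.2
      density: 0.5
  - model: mistralai/Mixtral-8x7B-v0.1+SeanWu25/Mixtral_8x7b_Medicine
    parameters:
      weight: 0.2
      density: 0.5
  - model: mistralai/Mixtral-8x7B-v0.1+Epiculous/crunchy-onion-lora
    parameters:
      weight: 0.2
      density: 0.5
dtype: bfloat16
```

Each step like this is run with `mergekit-yaml <config.yml> <output_dir>`, and the resulting intermediate model is then used as an input to the next step (e.g. `primordial_slop_c` combines the outputs of `a` and `b`), ending with the final 0x01-8x7B-hf.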
---
## Quantized versions:
- GGUF iMat: [Quant-Cartel/0x01-8x7b-iMat-GGUF](https://huggingface.co/Quant-Cartel/0x01-8x7b-iMat-GGUF)
- exl2 rpcal: [Quant-Cartel/0x01-8x7b-exl2-rpcal](https://huggingface.co/Quant-Cartel/0x01-8x7b-exl2-rpcal)