Update README.md
README.md (changed)
@@ -39,21 +39,47 @@ The following models were included in the merge:
The following YAML configuration was used to produce this model:

```yaml
models:
  - model: ChaoticNeutrals/Poppy_Porpoise-v0.7-L3-8B
    parameters:
      weight: 0.4
  - model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      weight: 0.5
  - model: jondurbin/bagel-8b-v1.0
    parameters:
      weight: 0.1
merge_method: dare_ties
dtype: bfloat16
base_model: Undi95/Meta-Llama-3-8B-hf
name: RPPart
---
models:
  - model: Weyaxi/Einstein-v6.1-Llama3-8B
    parameters:
      weight: 0.6
  - model: VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct
    parameters:
      weight: 0.3
  - model: aaditya/OpenBioLLM-Llama3-8B
    parameters:
      weight: 0.1
merge_method: dare_ties
base_model: Undi95/Meta-Llama-3-8B-hf
dtype: bfloat16
name: InstructPart
---
models:
  - model: RPPart
    parameters:
      weight: 0.39
  - model: InstructPart
    parameters:
      weight: 0.26
merge_method: dare_ties
base_model: Undi95/Meta-Llama-3-8B-Instruct-hf
dtype: bfloat16
name: Llama-3-8B-Ultra-Instruct
```

### Chat Template (Llama 3 Official)
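For intuition about what `merge_method: dare_ties` does with the `weight` values in the config, here is a minimal NumPy sketch of the idea on flat parameter vectors: each fine-tuned model's delta from the base is randomly sparsified and rescaled (DARE), then per-parameter sign election keeps only contributions agreeing with the majority direction (TIES). This is an illustrative toy, not mergekit's implementation; the function name, `drop_p` parameter, and flat-vector representation are assumptions for the example.

```python
import numpy as np

def dare_ties_merge(base, finetuned, weights, drop_p=0.5, rng=None):
    """Toy DARE-TIES merge over flat 1-D parameter vectors.

    base      -- base-model parameters
    finetuned -- list of fine-tuned parameter vectors (same shape as base)
    weights   -- per-model mixing weights (like `parameters.weight` above)
    drop_p    -- fraction of each delta randomly dropped (DARE)
    """
    rng = np.random.default_rng(rng)
    scaled_deltas = []
    for ft, w in zip(finetuned, weights):
        delta = ft - base
        # DARE: randomly drop a fraction of the delta, rescale survivors
        keep = rng.random(delta.shape) >= drop_p
        delta = delta * keep / (1.0 - drop_p)
        scaled_deltas.append(w * delta)
    stacked = np.stack(scaled_deltas)
    # TIES: elect a per-parameter majority sign, drop disagreeing terms
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0)
    return base + merged_delta
```

With `drop_p=0` and a single model at weight 1.0, the merge reduces to that model; models pulling a parameter in opposite directions with equal strength cancel, which is why the sign election helps reduce interference between the three donor models in each stage.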