RJuro committed on
Commit
b1e6ef7
1 Parent(s): 9c64819

Create README.md

---
base_model:
- mlabonne/NeuralBeagle14-7B
- danish-foundation-models/munin-7b-alpha
tags:
- mergekit
- merge
---

# beagle_munin

This is a merge of pre-trained language models created with [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged with the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, using [danish-foundation-models/munin-7b-alpha](https://huggingface.co/danish-foundation-models/munin-7b-alpha) as the base model.
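To make the method concrete: DARE randomly drops a fraction of each fine-tuned model's delta (its difference from the base weights) and rescales the survivors, and TIES then resolves sign conflicts between deltas before summing. The toy NumPy sketch below illustrates this idea on flat weight vectors; the function name and the per-vector treatment are illustrative assumptions, not mergekit's actual per-tensor implementation.

```python
import numpy as np

def dare_ties_merge(base, deltas, weights, density, seed=0):
    """Illustrative DARE-TIES merge on flat weight vectors.

    base    : base model weights (1-D array)
    deltas  : list of (finetuned - base) weight arrays
    weights : per-model merge weights (e.g. 0.6 above)
    density : fraction of delta entries to keep (e.g. 0.53 above)
    """
    rng = np.random.default_rng(seed)
    processed = []
    for delta, w in zip(deltas, weights):
        # DARE: randomly drop (1 - density) of the delta entries...
        mask = rng.random(delta.shape) < density
        # ...and rescale survivors so the expected delta is preserved.
        processed.append(w * delta * mask / density)
    stacked = np.stack(processed)
    # TIES: elect a dominant sign per parameter and zero out
    # contributions that disagree with it before summing.
    elected = np.sign(stacked.sum(axis=0))
    merged_delta = np.where(np.sign(stacked) == elected, stacked, 0.0).sum(axis=0)
    return base + merged_delta
```

With a single non-base model, as in this merge, the sign election is trivial and the method reduces to DARE's drop-and-rescale on the NeuralBeagle delta.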

### Models Merged

The following models were included in the merge:
* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: danish-foundation-models/munin-7b-alpha
    # No parameters necessary for base model
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: 0.53
      weight: 0.6
merge_method: dare_ties
base_model: danish-foundation-models/munin-7b-alpha
parameters:
  int8_mask: true
dtype: bfloat16
```
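
A merge defined by a configuration like the one above can typically be reproduced with the `mergekit-yaml` command from the mergekit package; the config filename and output directory below are placeholders.

```shell
# Save the YAML above as config.yaml, then run the merge.
# ./beagle_munin is a placeholder output directory.
pip install mergekit
mergekit-yaml config.yaml ./beagle_munin
```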