ajibawa-2023 committed on
Commit 21de3a7 (1 parent: 9e4bcc8)

Update README.md

Files changed (1)
  1. README.md +1 -0
README.md CHANGED
@@ -20,6 +20,7 @@ All the credit goes to the Open-Orca team for releasing SlimOrca dataset.
 Check examples given below.

 **Training:**
+
 The entire dataset was trained on 4 x A100 80GB GPUs. Training for 2 epochs took almost 114 hours. The Axolotl & DeepSpeed codebases were used for training.
 The entire dataset was trained on Meta's Llama-3.
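
The README excerpt above mentions fine-tuning on 4 x A100 80GB GPUs for 2 epochs with the Axolotl & DeepSpeed codebases. As a rough illustration only (the actual run was driven by an Axolotl config, which is not shown in this commit), a minimal DeepSpeed-backed fine-tuning sketch using the Hugging Face Trainer might look like the following; the model name, data file, column name, and ds_config.json path are all assumptions.

```python
# Minimal sketch of DeepSpeed-backed causal-LM fine-tuning with the Hugging Face Trainer.
# This is NOT the repo's actual Axolotl configuration; the model name, dataset fields,
# and ds_config.json path are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "meta-llama/Meta-Llama-3-8B"   # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token   # Llama tokenizers ship without a pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Assumed JSONL file with a "text" column already formatted as chat transcripts.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="out",
    num_train_epochs=2,                 # matches the 2 epochs mentioned in the README
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    bf16=True,
    deepspeed="ds_config.json",         # assumed DeepSpeed ZeRO config file
    logging_steps=10,
    save_strategy="epoch",
)

# Dynamic padding plus standard next-token labels for causal LM training.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

trainer = Trainer(model=model, args=args, train_dataset=tokenized, data_collator=collator)
trainer.train()
```

Such a script would typically be launched across the four GPUs with the `deepspeed` or `accelerate launch` launchers; Axolotl wraps an equivalent training loop behind its config-driven CLI.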