johnrachwanpruna committed
Commit f9c4e66
1 Parent(s): 3717df8

Upload README.md with huggingface_hub

Files changed (1): README.md (+6 -4)
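
The commit message says the file was pushed with `huggingface_hub`. For reference, a minimal sketch of how such an upload is typically done with that library — the repo id is taken from the diff below, and the authentication setup and exact call the uploader used are assumptions, not part of the commit record:

```python
# Hypothetical sketch of an upload like this commit's; assumes you are
# already authenticated (e.g. via `huggingface-cli login`).
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="README.md",                     # local file to push
    path_in_repo="README.md",                        # destination path in the repo
    repo_id="PrunaAI/gemma-1.1-2b-it-GGUF-smashed",  # repo id as corrected in the diff
    commit_message="Upload README.md with huggingface_hub",
)
```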
README.md CHANGED
````diff
@@ -24,6 +24,8 @@ tags:
  [![LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue)](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
  [![Discord](https://img.shields.io/badge/Discord-Join%20Us-blue?style=social&logo=discord)](https://discord.gg/CP4VSgck)

+ ## This repo contains GGUF versions of the google/gemma-1.1-2b-it model.
+
  # Simply make AI models cheaper, smaller, faster, and greener!

  - Give a thumbs up if you like this model!
@@ -71,7 +73,7 @@ The following clients/libraries will automatically download models for you, prov
  * Faraday.dev

  - **Option A** - Downloading in `text-generation-webui`:
- - **Step 1**: Under Download Model, you can enter the model repo: PrunaAI/gemma-1.1-2b-it-GGUF-smashed-smashed and below it, a specific filename to download, such as: phi-2.IQ3_M.gguf.
+ - **Step 1**: Under Download Model, you can enter the model repo: PrunaAI/gemma-1.1-2b-it-GGUF-smashed and below it, a specific filename to download, such as: phi-2.IQ3_M.gguf.
  - **Step 2**: Then click Download.

  - **Option B** - Downloading on the command line (including multiple files at once):
@@ -81,14 +83,14 @@ pip3 install huggingface-hub
  ```
  - **Step 2**: Then you can download any individual model file to the current directory, at high speed, with a command like this:
  ```shell
- huggingface-cli download PrunaAI/gemma-1.1-2b-it-GGUF-smashed-smashed gemma-1.1-2b-it.IQ3_M.gguf --local-dir . --local-dir-use-symlinks False
+ huggingface-cli download PrunaAI/gemma-1.1-2b-it-GGUF-smashed gemma-1.1-2b-it.IQ3_M.gguf --local-dir . --local-dir-use-symlinks False
  ```
  <details>
  <summary>More advanced huggingface-cli download usage (click to read)</summary>
  Alternatively, you can also download multiple files at once with a pattern:

  ```shell
- huggingface-cli download PrunaAI/gemma-1.1-2b-it-GGUF-smashed-smashed --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
+ huggingface-cli download PrunaAI/gemma-1.1-2b-it-GGUF-smashed --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
  ```

  For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
@@ -102,7 +104,7 @@ pip3 install hf_transfer
  And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:

  ```shell
- HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download PrunaAI/gemma-1.1-2b-it-GGUF-smashed-smashed gemma-1.1-2b-it.IQ3_M.gguf --local-dir . --local-dir-use-symlinks False
+ HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download PrunaAI/gemma-1.1-2b-it-GGUF-smashed gemma-1.1-2b-it.IQ3_M.gguf --local-dir . --local-dir-use-symlinks False
  ```

  Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
````
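
For readers following the corrected instructions, the single-file `huggingface-cli download` command above has a direct Python equivalent. A minimal sketch — the repo id and filename come from the diff; this snippet is illustrative, not part of the model card:

```python
# Sketch: programmatic equivalent of the corrected huggingface-cli command.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="PrunaAI/gemma-1.1-2b-it-GGUF-smashed",  # corrected repo id from the diff
    filename="gemma-1.1-2b-it.IQ3_M.gguf",           # single GGUF file to fetch
    local_dir=".",                                   # download into the current directory
)
print(path)  # local path of the downloaded file
```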
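The pattern download and the `hf_transfer` acceleration can likewise be reproduced in Python. A hedged sketch, assuming `hf_transfer` is installed (`pip3 install hf_transfer`) and noting that the flag must be set before `huggingface_hub` is imported, since the library reads it at import time:

```python
import os

# hf_transfer is only picked up if the flag is set before huggingface_hub is imported.
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "1"

from huggingface_hub import snapshot_download

# Download every file matching the Q4_K pattern, mirroring --include='*Q4_K*gguf'.
snapshot_download(
    repo_id="PrunaAI/gemma-1.1-2b-it-GGUF-smashed",
    allow_patterns=["*Q4_K*gguf"],
    local_dir=".",
)
```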