starmpcc committed
Commit 9207fa5
1 Parent(s): 2af7a77

Update README.md

Files changed (1)
  1. README.md +14 -10
README.md CHANGED
@@ -12,7 +12,7 @@ tags:
 
 <!-- Provide a quick summary of what the model is/does. -->
 
-This is official model checkpoint for Asclepius-7B [arxiv](todo)
+This is official model checkpoint for Asclepius-7B [(arxiv)](https://arxiv.org/abs/2309.00237)
 This model is the first publicly shareable clinical LLM, trained with synthetic data.
 
 ## Model Details
@@ -33,7 +33,7 @@ This model is the first publicly shareable clinical LLM, trained with synthetic
 <!-- Provide the basic links for the model. -->
 
 - **Repository:** https://github.com/starmpcc/Asclepius
-- **Paper [optional]:** TODO Arxiv
+- **Paper:** https://arxiv.org/abs/2309.00237
 - **Data:** https://huggingface.co/datasets/starmpcc/Asclepius-Synthetic-Clinical-Notes
 
 ## Uses
@@ -116,7 +116,7 @@ https://huggingface.co/datasets/starmpcc/Asclepius-Synthetic-Clinical-Notes
 
 - We followed config used in [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca)
 -
-#### Speeds, Sizes, Times [optional]
+#### Speeds, Sizes, Times
 
 <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
 - Pre-Training (1 epoch): 1h 33m with 8x A100 80G
@@ -124,17 +124,21 @@ https://huggingface.co/datasets/starmpcc/Asclepius-Synthetic-Clinical-Notes
 
 
 
-## Citation [optional]
+## Citation
 
 <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
 
 **BibTeX:**
 
-[More Information Needed]
-
-**APA:**
-
-[More Information Needed]
-
+```
+@misc{kweon2023publicly,
+      title={Publicly Shareable Clinical Large Language Model Built on Synthetic Clinical Notes},
+      author={Sunjun Kweon and Junu Kim and Jiyoun Kim and Sujeong Im and Eunbyeol Cho and Seongsu Bae and Jungwoo Oh and Gyubok Lee and Jong Hak Moon and Seng Chan You and Seungjin Baek and Chang Hoon Han and Yoon Bin Jung and Yohan Jo and Edward Choi},
+      year={2023},
+      eprint={2309.00237},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
+}
+```
 
 