akoksal committed on
Commit 3ee8c4f
1 Parent(s): 032665d

Update README.md

Files changed (1):
  1. README.md +9 -1
README.md CHANGED

````diff
@@ -47,7 +47,7 @@ The LongForm dataset is created by leveraging English corpus examples with augme
 
 Github Repo: https://github.com/akoksal/LongForm
 
-### For LongForm-OPT models: Use [EOI] to indicate the end of instruction.
+### For LongForm OPT and LLaMA models: Use [EOI] to indicate the end of instruction.
 
 LongForm-**T5-XL**: https://huggingface.co/akoksal/LongForm-T5-XL
 
@@ -86,11 +86,19 @@ We provide in-depth evaluation of LongForm models and baselines in the paper. We
 | [**LongForm-OPT-6.7B**](https://huggingface.co/akoksal/LongForm-OPT-6.7B) | 17.7 | 16.9 | 17.2 | 19.0 |
 | [**LongForm-LLaMA-7B**](https://huggingface.co/akoksal/LongForm-LLaMA-7B-diff)‡ | **19.7** | **21.7** | **18.6** | 18.9 |
 
+Smaller versions of LongForm-OPT models are also available:
+- [**LongForm-OPT-1.3B**](https://huggingface.co/akoksal/LongForm-OPT-1.3B)
+- [**LongForm-OPT-350M**](https://huggingface.co/akoksal/LongForm-OPT-350M)
+- [**LongForm-OPT-125M**](https://huggingface.co/akoksal/LongForm-OPT-125M)
+
 ‡: We can just release the difference between LongForm-LLaMA-7B and pretrained LLaMA-7B publicly due to restrictions of LLaMA models.
 
 ## Limitations
 The LongForm dataset and models mainly focus on long text generation and have limitations regarding structured prediction tasks in NLP. Additionally, we observe that LongForm models may present hallucination problems similar to those found in LLMs.
 
+## License
+The LongForm project is subject to a MIT License with custom limitations for restrictions imposed by OpenAI (for the instruction generation part), as well as the license of language models (OPT, LLaMA, and T5).
+
 ## Citation
 ```
 @misc{koksal2023longform,
````
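
The retitled heading in this commit documents a prompt convention: LongForm OPT and LLaMA models expect the instruction to end with the `[EOI]` marker, while the T5 variants do not use it. A minimal sketch of that convention follows; the helper name and structure are this example's own illustration, not part of the LongForm repo:

```python
# Hypothetical helper illustrating the [EOI] convention from the README:
# LongForm OPT and LLaMA models mark the end of the instruction with
# "[EOI]", while T5 variants take the raw instruction (assumption based
# on the heading above, which names only OPT and LLaMA).
def format_longform_prompt(instruction: str, model_family: str) -> str:
    """Return the prompt string for a given LongForm model family."""
    if model_family.lower() in {"opt", "llama"}:
        return f"{instruction} [EOI]"
    return instruction  # e.g. T5 variants: no end-of-instruction marker

print(format_longform_prompt("Write an essay about meditation.", "opt"))
# → Write an essay about meditation. [EOI]
```

The formatted string would then be tokenized and passed to `model.generate` as usual for the chosen checkpoint.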