shenbinqian committed on
Commit dceb610
1 Parent(s): 0067ff7

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -1,5 +1,5 @@
 ---
-license: mit
+license: cc-by-sa-4.0
 tags:
 - generated_from_trainer
 metrics:
@@ -20,7 +20,7 @@ should probably proofread and complete it, then remove this comment. -->
 # roberta-large-finetuned-abbr-filtered-plod
 
 This model is a fine-tuned version of the [roberta-large](https://huggingface.co/roberta-large) on the [PLODv2 filtered dataset](https://github.com/shenbinqian/PLODv2-CLM4AbbrDetection).
-It achieves the following results on the test set:
+It is released with our LREC-COLING 2024 publication (coming soon). It achieves the following results on the test set:
 
 Results on abbreviations:
 - Precision: 0.9073
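
Since the model card describes a token-classification model for abbreviation detection, a minimal usage sketch may be helpful. The repository id `shenbinqian/roberta-large-finetuned-abbr-filtered-plod` below is an assumption inferred from the committer name and the README title; it is not confirmed by this diff.

```python
# Minimal sketch: running the fine-tuned model for abbreviation detection.
# MODEL_ID is an assumption based on the committer name and README title.
from transformers import pipeline

MODEL_ID = "shenbinqian/roberta-large-finetuned-abbr-filtered-plod"  # assumed repo id

# Token-classification pipeline; adjacent sub-word predictions are merged
# into word-level spans with the "simple" aggregation strategy.
detector = pipeline("token-classification", model=MODEL_ID, aggregation_strategy="simple")

text = "The World Health Organization (WHO) publishes an annual report."
for span in detector(text):
    print(span["word"], span["entity_group"], round(float(span["score"]), 4))
```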