---
language:
- multilingual
- ar
- bn
- de
- el
- en
- es
- fi
- fr
- hi
- id
- it
- ja
- ko
- nl
- pl
- pt
- ru
- sv
- sw
- te
- th
- tr
- vi
- zh
thumbnail: https://github.com/studio-ousia/luke/raw/master/resources/luke_logo.png
tags:
- luke
- named entity recognition
- relation classification
- question answering
license: apache-2.0
---

## mLUKE

**mLUKE** (multilingual LUKE) is a multilingual extension of LUKE.

Please check the [official repository](https://github.com/studio-ousia/luke) for more details and updates.

This is the mLUKE large model with 24 hidden layers and a hidden size of 1024. The model has 868M parameters in total (561M for the word embeddings and encoder, 307M for the entity embeddings). It was initialized with the weights of XLM-RoBERTa (large) and trained on the December 2020 version of Wikipedia in 24 languages.
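
As a rough sanity check, the stated parameter counts can be approximately reproduced from the architecture. The constants below are assumptions, not values taken from this card: XLM-RoBERTa's 250,002-subword vocabulary, a ~1.2M-entry entity vocabulary, and LUKE's 256-dimensional entity embeddings (stored small and projected up to the hidden size).

```python
# Back-of-the-envelope estimate of the parameter counts quoted above.
# Assumed (not from this card): vocab sizes and entity-embedding size.
HIDDEN = 1024            # hidden size of the large model
FFN = 4096               # feed-forward inner size (4 x hidden)
LAYERS = 24              # number of transformer layers
VOCAB = 250_002          # XLM-RoBERTa subword vocabulary (assumed)
MAX_POS = 514            # XLM-RoBERTa position embeddings (assumed)
ENTITY_VOCAB = 1_200_000 # entity vocabulary size (assumed)
ENTITY_EMB = 256         # LUKE's reduced entity-embedding size (assumed)

def dense(n_in, n_out):
    """Weights plus bias of one linear layer."""
    return n_in * n_out + n_out

# One transformer layer: self-attention (Q, K, V, output projections),
# feed-forward block, and two layer norms (weight + bias each).
attn = 4 * dense(HIDDEN, HIDDEN)
ffn = dense(HIDDEN, FFN) + dense(FFN, HIDDEN)
norms = 2 * 2 * HIDDEN
per_layer = attn + ffn + norms

# Word-side parameters: subword + position embeddings, embedding layer norm,
# then the 24-layer encoder stack.
word_embeddings = VOCAB * HIDDEN + MAX_POS * HIDDEN + 2 * HIDDEN
encoder = LAYERS * per_layer
word_and_encoder = word_embeddings + encoder

# Entity-side parameters: the entity embedding table at the reduced size.
entity_embeddings = ENTITY_VOCAB * ENTITY_EMB

print(f"word embeddings + encoder ~ {word_and_encoder / 1e6:.0f}M (card: 561M)")
print(f"entity embeddings ~ {entity_embeddings / 1e6:.0f}M (card: 307M)")
```

Under these assumptions the estimate lands within about 1% of the 561M and 307M figures quoted above; the small remainder plausibly comes from details not modeled here, such as the entity-embedding projection layer and entity position embeddings.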

### Citation

If you find mLUKE useful for your work, please cite the following paper:

```latex
@inproceedings{ri-etal-2022-mluke,
    title = "m{LUKE}: {T}he Power of Entity Representations in Multilingual Pretrained Language Models",
    author = "Ri, Ryokan  and
      Yamada, Ikuya  and
      Tsuruoka, Yoshimasa",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    year = "2022",
    url = "https://aclanthology.org/2022.acl-long.505",
}
```