---
language:
- en
- es
- fr
- nl
- de
- fi
- ru
- tr
- ko
- zh
- ja
- th
- pt
- it
- sv
- hu
- pl
- et
- hr
- uk
- el
- da
- he
tags:
- biomedical
- bionlp
- entity linking
- embedding
- bert
---

A multilingual **BERGAMOT** (**B**iomedical **E**ntity **R**epresentation with **G**raph-**A**ugmented **M**ulti-**O**bjective **T**ransformer) model pre-trained on UMLS (version 2020AB) using a Graph Attention Network (GAT) encoder.

For technical details, see our [NAACL 2024 paper](https://aclanthology.org/2024.findings-naacl.288). [Here is the poster](https://github.com/Andoree/BERGAMOT/blob/main/BERGAMOT_poster_naacl.jpg) for our paper.

For the pretraining code, see our GitHub repository: [https://github.com/Andoree/BERGAMOT](https://github.com/Andoree/BERGAMOT).

## Citation

```bibtex
@inproceedings{sakhovskiy-et-al-2024-bergamot,
    title = "Biomedical Entity Representation with Graph-Augmented Multi-Objective Transformer",
    author = "Sakhovskiy, Andrey and Semenova, Natalia and Kadurin, Artur and Tutubalina, Elena",
    booktitle = "Findings of the Association for Computational Linguistics: NAACL 2024",
    month = jun,
    year = "2024",
    address = "Mexico City, Mexico",
    publisher = "Association for Computational Linguistics",
}
```
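
## Usage

A minimal sketch of obtaining entity embeddings with the Hugging Face `transformers` library. This assumes the model exposes a standard BERT encoder interface and that the [CLS] token is used as the mention representation; the model id below is a placeholder and the pooling choice is an assumption, not a statement of the official inference recipe.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder id: replace with this repository's actual Hugging Face Hub id.
model_id = "<bergamot-model-id>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

# Example biomedical entity mentions in several languages.
mentions = ["myocardial infarction", "infarto de miocardio", "инфаркт миокарда"]

inputs = tokenizer(mentions, padding=True, truncation=True, max_length=32, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the [CLS] token embedding as the entity representation
# (an assumption; mean pooling over tokens is a common alternative).
embeddings = outputs.last_hidden_state[:, 0, :]

# Cosine similarity between the English mention and its translations.
embeddings = torch.nn.functional.normalize(embeddings, dim=-1)
similarities = embeddings[0] @ embeddings[1:].T
print(similarities)
```

Such embeddings can then be compared against pre-computed UMLS concept-name embeddings (e.g., by nearest-neighbor search) for biomedical entity linking.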