
A multilingual BERGAMOT (Biomedical Entity Representation with Graph-Augmented Multi-Objective Transformer) model pre-trained on UMLS (version 2020AB) using a Graph Attention Network (GAT) encoder.

For technical details, see our NAACL 2024 paper.

Here is the poster for our paper.

For the pretraining code, see our GitHub repository: https://github.com/Andoree/BERGAMOT.
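
Usage

A minimal sketch of how one might obtain entity embeddings with this model, assuming the checkpoint exposes the standard transformers AutoModel interface and that the [CLS] token embedding is used as the mention representation (as is common for biomedical entity representation models); the model identifier below is a placeholder for this model's Hub ID.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder identifier: replace with this model's Hugging Face Hub ID.
model_name = "<this-model-id>"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

# Multilingual biomedical mentions to encode.
mentions = ["myocardial infarction", "инфаркт миокарда", "heart attack"]
inputs = tokenizer(mentions, padding=True, truncation=True,
                   max_length=25, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)
    # Take the [CLS] embedding of each mention as its representation.
    embeddings = outputs.last_hidden_state[:, 0, :]

# Cosine similarity of the first mention to the others.
sims = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1:], dim=-1)
print(sims)
```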

Citation

@inproceedings{sakhovskiy-et-al-2024-bergamot,
    title = "Biomedical Entity Representation with Graph-Augmented Multi-Objective Transformer",
    author = "Sakhovskiy, Andrey and Semenova, Natalia and Kadurin, Artur and Tutubalina, Elena",
    booktitle = "Findings of the Association for Computational Linguistics: NAACL 2024",
    month = jun,
    year = "2024",
    address = "Mexico City, Mexico",
    publisher = "Association for Computational Linguistics",
}