Triplet-Trained Vector Space and Sieve-Based Search Improve Biomedical Concept Normalization
dc.contributor.author | Xu, D.
dc.contributor.author | Bethard, S.
dc.date.accessioned | 2022-03-17T01:56:58Z
dc.date.available | 2022-03-17T01:56:58Z
dc.date.issued | 2021
dc.identifier.citation | Xu, D., & Bethard, S. (2021, June). Triplet-Trained Vector Space and Sieve-Based Search Improve Biomedical Concept Normalization. In Proceedings of the 20th Workshop on Biomedical Language Processing (pp. 11-22).
dc.identifier.isbn | 9781954085404
dc.identifier.doi | 10.18653/v1/2021.bionlp-1.2
dc.identifier.uri | http://hdl.handle.net/10150/663578
dc.description.abstract | Concept normalization, the task of linking textual mentions of concepts to concepts in an ontology, is critical for mining and analyzing biomedical texts. We propose a vector-space model for concept normalization, where mentions and concepts are encoded via transformer networks that are trained via a triplet objective with online hard triplet mining. The transformer networks refine existing pre-trained models, and the online triplet mining makes training efficient even with hundreds of thousands of concepts by sampling training triples within each mini-batch. We introduce a variety of strategies for searching with the trained vector-space model, including approaches that incorporate domain-specific synonyms at search time with no model retraining. Across five datasets, our models that are trained only once on their corresponding ontologies are within 3 points of state-of-the-art models that are retrained for each new domain. Our models can also be trained for each domain, achieving new state-of-the-art on multiple datasets. © 2021 Association for Computational Linguistics
dc.language.iso | en
dc.publisher | Association for Computational Linguistics (ACL)
dc.rights | Copyright © 2021 Association for Computational Linguistics. Licensed under a Creative Commons Attribution 4.0 International License.
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/
dc.title | Triplet-Trained Vector Space and Sieve-Based Search Improve Biomedical Concept Normalization
dc.type | Proceedings
dc.type | text
dc.contributor.department | School of Information, University of Arizona
dc.identifier.journal | Proceedings of the 20th Workshop on Biomedical Language Processing, BioNLP 2021
dc.description.note | Open access proceedings
dc.description.collectioninformation | This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.
dc.eprint.version | Final published version
dc.source.journaltitle | Proceedings of the 20th Workshop on Biomedical Language Processing, BioNLP 2021
refterms.dateFOA | 2022-03-17T01:56:58Z
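
The abstract above outlines the method at a high level: a transformer encoder is fine-tuned with a triplet objective using online (batch-hard) triplet mining, and normalization is performed by nearest-neighbour search in the resulting vector space, with domain-specific synonyms added at search time without retraining. The snippet below is a minimal illustrative sketch of those two pieces, written against plain PyTorch; it is not the authors' released code, and the function names, margin value, and toy embedding shapes are assumptions made for illustration only.

```python
# Illustrative sketch (not the authors' code): batch-hard triplet mining over
# mention/concept embeddings, plus cosine nearest-neighbour concept search.
# Embeddings are assumed to come from a transformer encoder fine-tuned on
# ontology synonyms; here they are just torch tensors.
import torch
import torch.nn.functional as F


def batch_hard_triplet_loss(embeddings: torch.Tensor,
                            concept_ids: torch.Tensor,
                            margin: float = 0.5) -> torch.Tensor:
    """For each anchor in the mini-batch, take its hardest positive (farthest
    embedding with the same concept id) and hardest negative (closest embedding
    with a different concept id), then apply a margin-based triplet loss."""
    emb = F.normalize(embeddings, dim=1)
    dist = 1.0 - emb @ emb.t()                       # pairwise cosine distance
    same = concept_ids.unsqueeze(0) == concept_ids.unsqueeze(1)
    eye = torch.eye(len(concept_ids), dtype=torch.bool, device=emb.device)

    pos_mask = same & ~eye                           # same concept, not self
    neg_mask = ~same                                 # different concept

    hardest_pos = dist.masked_fill(~pos_mask, 0.0).max(dim=1).values
    hardest_neg = dist.masked_fill(~neg_mask, float("inf")).min(dim=1).values

    loss = F.relu(hardest_pos - hardest_neg + margin)
    # Only anchors with at least one positive and one negative contribute.
    valid = pos_mask.any(dim=1) & neg_mask.any(dim=1)
    return loss[valid].mean() if valid.any() else loss.sum() * 0.0


def nearest_concepts(mention_emb: torch.Tensor,
                     concept_embs: torch.Tensor,
                     concept_ids: list[str],
                     top_k: int = 5) -> list[tuple[str, float]]:
    """Rank ontology concepts by cosine similarity to a mention embedding.
    New synonyms can be supported at search time by appending rows to
    `concept_embs` (and ids to `concept_ids`) without retraining the encoder."""
    sims = F.normalize(concept_embs, dim=1) @ F.normalize(mention_emb, dim=0)
    scores, idx = sims.topk(min(top_k, len(concept_ids)))
    return [(concept_ids[i], float(s)) for i, s in zip(idx.tolist(), scores.tolist())]


if __name__ == "__main__":
    # Toy usage: four mention embeddings covering two concepts.
    emb = torch.randn(4, 8, requires_grad=True)
    ids = torch.tensor([0, 0, 1, 1])
    print(batch_hard_triplet_loss(emb, ids).item())
    print(nearest_concepts(torch.randn(8), torch.randn(3, 8), ["C1", "C2", "C3"]))
```

Because triplets are formed only within each mini-batch, the loss never enumerates all mention-concept pairs, which is consistent with the abstract's claim that training stays efficient even for ontologies with hundreds of thousands of concepts.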