Triplet-Trained Vector Space and Sieve-Based Search Improve Biomedical Concept Normalization
Affiliation
School of Information, University of Arizona

Issue Date
2021

Citation
Xu, D., & Bethard, S. (2021, June). Triplet-Trained Vector Space and Sieve-Based Search Improve Biomedical Concept Normalization. In Proceedings of the 20th Workshop on Biomedical Language Processing (pp. 11–22).

Rights
Copyright © 2021 Association for Computational Linguistics. Licensed under a Creative Commons Attribution 4.0 International License.

Collection Information
This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.

Abstract
Concept normalization, the task of linking textual mentions of concepts to concepts in an ontology, is critical for mining and analyzing biomedical texts. We propose a vector-space model for concept normalization, where mentions and concepts are encoded via transformer networks that are trained via a triplet objective with online hard triplet mining. The transformer networks refine existing pre-trained models, and the online triplet mining makes training efficient even with hundreds of thousands of concepts by sampling training triples within each mini-batch. We introduce a variety of strategies for searching with the trained vector-space model, including approaches that incorporate domain-specific synonyms at search time with no model retraining. Across five datasets, our models that are trained only once on their corresponding ontologies are within 3 points of state-of-the-art models that are retrained for each new domain. Our models can also be trained for each domain, achieving a new state of the art on multiple datasets.
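
The training trick named in the abstract, online hard triplet mining within each mini-batch, can be sketched concretely. The snippet below is not the authors' released code; it is a minimal PyTorch illustration under stated assumptions: batch-hard mining, Euclidean distance, a margin of 0.5, and the name batch_hard_triplet_loss are illustrative choices, not details taken from the paper.

    import torch
    import torch.nn.functional as F

    def batch_hard_triplet_loss(embeddings: torch.Tensor,
                                concept_ids: torch.Tensor,
                                margin: float = 0.5) -> torch.Tensor:
        """embeddings: (B, d) transformer encodings of mentions/concept names;
        concept_ids: (B,) integer concept labels for each row."""
        # Pairwise Euclidean distances between all embeddings in the batch.
        dist = torch.cdist(embeddings, embeddings, p=2)

        same = concept_ids.unsqueeze(0) == concept_ids.unsqueeze(1)
        eye = torch.eye(len(concept_ids), dtype=torch.bool,
                        device=embeddings.device)
        pos_mask = same & ~eye    # same concept, excluding the anchor itself
        neg_mask = ~same          # any example of a different concept

        # Hardest positive per anchor: farthest in-batch example of the same concept.
        hardest_pos = (dist * pos_mask).max(dim=1).values
        # Hardest negative per anchor: nearest in-batch example of another concept.
        hardest_neg = dist.masked_fill(~neg_mask, float("inf")).min(dim=1).values

        # Margin triplet loss over anchors that have at least one in-batch positive.
        loss = F.relu(hardest_pos - hardest_neg + margin)
        valid = pos_mask.any(dim=1)
        return loss[valid].mean() if valid.any() else loss.sum() * 0.0

Under a scheme like this, each mini-batch must contain multiple surface forms per sampled concept so that in-batch positives exist; triplets are then formed on the fly rather than enumerated in advance, which is what keeps training tractable even with hundreds of thousands of concepts.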

Note
Open access journal

ISBN
9781954085404

Version
Final published version

DOI
10.18653/v1/2021.bionlp-1.2

Collections
UA Faculty Publications