
dc.contributor.author: Mithun, M.P.
dc.contributor.author: Suntwal, S.
dc.contributor.author: Surdeanu, M.
dc.date.accessioned: 2022-05-19T23:19:49Z
dc.date.available: 2022-05-19T23:19:49Z
dc.date.issued: 2021
dc.identifier.citation: Mithun, M. P., Suntwal, S., & Surdeanu, M. (2021, November). Students Who Study Together Learn Better: On the Importance of Collective Knowledge Distillation for Domain Transfer in Fact Verification. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (pp. 6968-6973).
dc.identifier.isbn: 9781955917094
dc.identifier.doi: 10.18653/v1/2021.emnlp-main.558
dc.identifier.uri: http://hdl.handle.net/10150/664432
dc.description.abstract: While neural networks produce state-of-the-art performance in several NLP tasks, they depend heavily on lexicalized information, which transfers poorly between domains. Previous work (Suntwal et al., 2019) proposed delexicalization as a form of knowledge distillation to reduce dependency on such lexical artifacts. However, a critical issue remains unsolved: how much delexicalization should be applied? A little helps reduce over-fitting, but too much discards useful information. We propose Group Learning (GL), a knowledge and model distillation approach for fact verification. In our method, while multiple student models have access to different delexicalized data views, they are encouraged to independently learn from each other through pair-wise consistency losses. In several cross-domain experiments between the FEVER and FNC fact verification datasets, we show that our approach learns the best delexicalization strategy for the given training dataset and outperforms state-of-the-art classifiers that rely on the original data. © 2021 Association for Computational Linguistics
dc.language.iso: en
dc.publisher: Association for Computational Linguistics (ACL)
dc.rights: Copyright © 2021 Association for Computational Linguistics, licensed under a Creative Commons Attribution 4.0 International License.
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: Students Who Study Together Learn Better: On the Importance of Collective Knowledge Distillation for Domain Transfer in Fact Verification
dc.type: Proceedings
dc.type: text
dc.contributor.department: University of Arizona
dc.identifier.journal: EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings
dc.description.note: Open access journal
dc.description.collectioninformation: This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.
dc.eprint.version: Final published version
dc.source.journaltitle: EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings
refterms.dateFOA: 2022-05-19T23:19:49Z
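
Note: The abstract above describes pair-wise consistency losses that encourage multiple student models, each trained on a different delexicalized view, to agree with one another. The following PyTorch sketch illustrates one plausible form of such a loss. It is an assumption-based illustration, not the paper's exact formulation: the function name group_learning_loss, the symmetric-KL pairing, and the consistency_weight term are all hypothetical choices made for this example.

# Minimal sketch of a pair-wise consistency loss among student models,
# assuming PyTorch. All names and the loss weighting are illustrative
# assumptions, not the authors' published formulation.
import torch
import torch.nn.functional as F

def group_learning_loss(student_logits, labels, consistency_weight=1.0):
    """student_logits: list of [batch, num_classes] tensors, one per
    student (each student sees a different delexicalized data view)."""
    # Supervised cross-entropy term for every student.
    ce = sum(F.cross_entropy(logits, labels) for logits in student_logits)
    # Pair-wise consistency: pull each student's output distribution
    # toward every other student's (symmetric KL over all pairs).
    kl = 0.0
    n = len(student_logits)
    for i in range(n):
        for j in range(i + 1, n):
            p = F.log_softmax(student_logits[i], dim=-1)
            q = F.log_softmax(student_logits[j], dim=-1)
            kl = kl + F.kl_div(p, q, log_target=True, reduction="batchmean")
            kl = kl + F.kl_div(q, p, log_target=True, reduction="batchmean")
    return ce + consistency_weight * kl

# Example usage: three students, a batch of 4, 3 classes
# (e.g., the three FEVER verdict labels).
logits = [torch.randn(4, 3, requires_grad=True) for _ in range(3)]
labels = torch.tensor([0, 1, 2, 0])
loss = group_learning_loss(logits, labels)
loss.backward()

In this sketch the consistency term is what lets the group converge on a shared strategy even though each student trains on a differently delexicalized view; the paper should be consulted for the actual losses and training procedure.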


Files in this item

Name: 2021.emnlp-main.558.pdf
Size: 198.5 KB
Format: PDF
Description: Final Published Version

