A Human-machine Interface for Few-shot Rule Synthesis for Information Extraction
Affiliation: University of Arizona
Citation: Robert Vacareanu, George C.G. Barbosa, Enrique Noriega-Atala, Gus Hahn-Powell, Rebecca Sharp, Marco A. Valenzuela-Escárcega, and Mihai Surdeanu. 2022. A Human-machine Interface for Few-shot Rule Synthesis for Information Extraction. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: System Demonstrations, pages 64–70, Hybrid: Seattle, Washington + Online. Association for Computational Linguistics.
Journal: NAACL 2022 - 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Demonstrations Session
Rights: Copyright © 2022 Association for Computational Linguistics. This is an open access article licensed under a Creative Commons Attribution 4.0 International License.
Collection Information: This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at firstname.lastname@example.org.
Abstract: We propose a system that assists a user in constructing transparent information extraction models, consisting of patterns (or rules) written in a declarative language, through program synthesis. Users of our system can specify their requirements through examples, which are collected with a search interface. The rule-synthesis system proposes rule candidates and the results of applying them to a textual corpus; the user has the option to accept the candidate, request another option, or adjust the examples provided to the system. Through an interactive evaluation, we show that our approach generates high-precision rules even in a 1-shot setting. In a second evaluation on a widely used relation extraction dataset (TACRED), our method generates rules that considerably outperform manually written patterns. Our code, demo, and documentation are available at https://clulab.github.io/odinsynth/.
Note: Open access journal
Version: Final published version