FINE-TUNING TRANSFORMER-BASED NATURAL LANGUAGE GENERATION ALGORITHMS FOR USDA GRAINS REPORTS FOR FARMERS, PRODUCERS, AND SMALL BUSINESSES
Publisher
The University of Arizona.
Rights
Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Abstract
The Transformer architecture is a deep learning model that adopts the mechanism of self-attention: the ability of the model to attend to different parts of the input sequence when making predictions. The Transformer weights the significance of each part of the input data differently, learning context and meaning by tracking relationships in sequential data, such as the words in a sentence. For Natural Language Generation (NLG), the application of the Transformer was nothing short of a breakthrough. Self-attention and multi-head attention models have proven their efficacy across a variety of textual tasks, including classification, translation, summarization, and generation. However, incorporating numerical information into these tasks is a relatively novel frontier, especially when that information is essential for judging whether summary generation can be automated effectively. In this research, I focus on evaluating pre-trained Transformer-based NLG models, such as the widely popularized, open-source BERT and T5, on USDA grain reports to produce high-quality, information-rich summaries. I conduct a comparative analysis of these models' performance on a small collection of numerically oriented commodities reports and compute two simple yet informative metrics to judge which models perform best at relaying numerical information in their summaries.
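To make the evaluation setup concrete, the sketch below is an illustration rather than the thesis's exact pipeline: it summarizes a hypothetical grain-report snippet with a pretrained T5 checkpoint from the Hugging Face transformers library and scores the summary with a simple numeric-recall measure (the fraction of numbers in the source that reappear in the summary). The checkpoint name, the report text, and the metric itself are all assumptions made for this example.

```python
# Illustrative sketch only: summarize a hypothetical grain-report snippet with a
# pretrained T5 checkpoint and measure how many of the source's numbers survive
# in the generated summary. The checkpoint, report text, and metric are assumptions,
# not the exact models, data, or metrics used in the thesis.
import re

from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "t5-small"  # assumed checkpoint; any T5 variant usable with a summarization prefix works
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Hypothetical report excerpt standing in for a USDA grain report.
report = (
    "Corn export inspections for the week were 1,250,400 metric tons, up 12 percent "
    "from the previous week. Wheat inspections totaled 402,700 metric tons."
)

# T5 is a text-to-text model; summarization is requested with the "summarize:" prefix.
inputs = tokenizer("summarize: " + report, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=60, num_beams=4)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)


def numeric_recall(source: str, generated: str) -> float:
    """Fraction of the numbers in the source text that reappear in the generated text."""
    extract = lambda text: set(re.findall(r"\d[\d,.]*\d|\d", text))
    source_numbers = extract(source)
    if not source_numbers:
        return 1.0
    return len(source_numbers & extract(generated)) / len(source_numbers)


print("Summary:", summary)
print("Numeric recall:", numeric_recall(report, summary))
```

A recall of 1.0 would mean every figure in the source report also appears in the generated summary; lower values indicate numerical information dropped during generation.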
Type
Electronic thesis
text
Degree Name
B.S.
Degree Level
bachelors
Degree Program
Computer Science
Honors College