Author
Kreso, Marko
Issue Date
2022
Keywords
Mixed Abstractive Extractive
Natural Language Processing
NLP
PubMed
Text Summarization
Transformers
Advisor
Surdeanu, Mihai
Publisher
The University of Arizona.
Rights
Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction, presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Abstract
This paper presents a novel unsupervised summarization layer, called BART-textrank, that combines abstractive and extractive techniques to produce a final extractive summary. The layer is versatile because it can be added on top of any abstractive summarizer without additional training. Used in conjunction with a base-size transformer, it achieves state-of-the-art performance on a few metrics when compared with transformer methods of similar parameter size, and it remains competitive with large transformers. The competitiveness of BART-textrank is apparent in Figure 4.2.
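The abstract describes a two-stage design: an abstractive model drafts a summary, and an unsupervised extractive step then selects the final sentences from the source document. The sketch below only illustrates that general idea and is not the thesis's BART-textrank implementation; the checkpoint name, the TF-IDF sentence similarity, and the abstractive_then_extractive helper are illustrative assumptions.

```python
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from transformers import pipeline


def abstractive_then_extractive(document: str, num_sentences: int = 3) -> str:
    """Sketch of a two-stage summarizer: BART abstractive draft, then a
    TextRank-style extractive pass over the document's own sentences."""
    # Stage 1: abstractive draft from a pretrained BART checkpoint (assumed choice).
    summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
    draft = summarizer(document, max_length=142, min_length=30,
                       do_sample=False)[0]["summary_text"]

    # Stage 2: build a sentence-similarity graph over the source sentences and run
    # personalized PageRank (the core of TextRank), biasing the ranking toward
    # sentences that resemble the abstractive draft.
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    tfidf = TfidfVectorizer().fit(sentences + [draft])
    sent_vecs = tfidf.transform(sentences)
    draft_vec = tfidf.transform([draft])

    graph = nx.from_numpy_array(cosine_similarity(sent_vecs))
    bias = cosine_similarity(sent_vecs, draft_vec).ravel()
    personalization = {i: float(b) + 1e-6 for i, b in enumerate(bias)}
    scores = nx.pagerank(graph, personalization=personalization)

    # Return the top-ranked sentences in their original order (extractive output).
    top = sorted(sorted(scores, key=scores.get, reverse=True)[:num_sentences])
    return ". ".join(sentences[i] for i in top) + "."
```

Because the second stage needs no labels or fine-tuning, the same extractive pass can in principle be placed on top of any abstractive summarizer, which is the property the abstract emphasizes.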
Type
text
Electronic Thesis
Degree Name
M.S.
Degree Level
masters
Degree Program
Graduate College
Computer Science