Show simple item record

dc.contributor.advisor: Bethard, Steven
dc.contributor.author: Zhang, Jiacheng
dc.creator: Zhang, Jiacheng
dc.date.accessioned: 2021-06-22T03:12:07Z
dc.date.available: 2021-06-22T03:12:07Z
dc.date.issued: 2021
dc.identifier.citation: Zhang, Jiacheng. (2021). General Benefits of Mono-Lingual Pre-Training in Transformers (Master's thesis, University of Arizona, Tucson, USA).
dc.identifier.uri: http://hdl.handle.net/10150/660173
dc.description.abstract: Pre-trained transformers are a class of neural networks behind many recent natural language processing systems. Their success is often attributed to linguistic knowledge injected during the pre-training process. In this work, we make multiple attempts to surgically remove language-specific knowledge from BERT. Surprisingly, these interventions often do little damage to BERT's performance on GLUE tasks. By contrasting against non-pre-trained transformers with oracle initialization, we argue that when it comes to explaining how BERT works, there is a sizable void below linguistic probing and above model initialization.
dc.language.iso: en
dc.publisher: The University of Arizona.
dc.rights: Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction, presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: BERT
dc.subject: pre-training
dc.subject: transformers
dc.title: General Benefits of Mono-Lingual Pre-Training in Transformers
dc.type: text
dc.type: Electronic Thesis
thesis.degree.grantor: University of Arizona
thesis.degree.level: masters
dc.contributor.committeemember: Surdeanu, Mihai
dc.contributor.committeemember: Barnard, Kobus
thesis.degree.discipline: Graduate College
thesis.degree.discipline: Computer Science
thesis.degree.name: M.S.
refterms.dateFOA: 2021-06-22T03:12:08Z


Files in this item

Name: azu_etd_18735_sip1_m.pdf
Size: 621.1Kb
Format: PDF