Does BERT really agree? Fine-grained Analysis of Lexical Dependence on a Syntactic Task

Detailed bibliography

Published in: arXiv.org
Main authors: Lasri, Karim; Lenci, Alessandro; Poibeau, Thierry
Medium: Paper
Language: English
Publication details: Ithaca: Cornell University Library, arXiv.org, 14.04.2022
Subject: Sentences
ISSN: 2331-8422
Online access: Get full text
Abstract
Although transformer-based Neural Language Models demonstrate impressive performance on a variety of tasks, their generalization abilities are not well understood. They have been shown to perform strongly on subject-verb number agreement in a wide array of settings, suggesting that they learned to track syntactic dependencies during their training even without explicit supervision. In this paper, we examine the extent to which BERT is able to perform lexically-independent subject-verb number agreement (NA) on targeted syntactic templates. To do so, we disrupt the lexical patterns found in naturally occurring stimuli for each targeted structure in a novel fine-grained analysis of BERT's behavior. Our results on nonce sentences suggest that the model generalizes well for simple templates, but fails to perform lexically-independent syntactic generalization when as little as one attractor is present.
Copyright 2022. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
DOI: 10.48550/arxiv.2204.06889
URI: https://www.proquest.com/docview/2650325055