Neural Encoding and Decoding With Distributed Sentence Representations

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 32, No. 2, pp. 589-603
Main Authors: Sun, Jingyuan, Wang, Shaonan, Zhang, Jiajun, Zong, Chengqing
Format: Journal Article
Language:English
Published: United States IEEE 01.02.2021
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects:
ISSN: 2162-237X; EISSN: 2162-2388
Abstract Building computational models that account for the cortical representation of language plays an important role in understanding the human linguistic system. Recent progress in distributed semantic models (DSMs), especially transformer-based methods, has driven advances in many language understanding tasks, making DSMs a promising methodology for probing brain language processing. DSMs have been shown to reliably explain cortical responses to word stimuli. However, characterizing brain activity during sentence processing has been far less thoroughly explored with DSMs, especially with deep neural network-based methods. How do cortical sentence representations relate to DSMs? Which linguistic features captured by a DSM best explain its correlation with the brain activity evoked by sentence stimuli? Can distributed sentence representations help reveal the semantic selectivity of different brain areas? We address these questions through the lens of neural encoding and decoding, fueled by the latest developments in natural language representation learning. We begin by evaluating the ability of 12 DSMs, spanning a wide range of model families, to predict and decipher the functional magnetic resonance imaging (fMRI) images of humans reading sentences. Most models deliver high accuracy in the left middle temporal gyrus (LMTG) and left occipital complex (LOC). Notably, encoders trained with transformer-based DSMs consistently outperform the other unsupervised structured models and all the unstructured baselines. With probing and ablation tasks, we further find that differences in the DSMs' performance in modeling brain activity can be at least partially explained by the granularity of their semantic representations. We also illustrate the DSMs' selectivity for concept categories and show that topics are represented by spatially overlapping and distributed cortical patterns. Our results corroborate and extend previous findings on the relation between DSMs and neural activation patterns and contribute to building solid brain-machine interfaces with deep neural network representations.
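The encoding/decoding evaluation described in the abstract amounts to learning a regularized linear map between distributed sentence embeddings and fMRI voxel responses and scoring held-out predictions with a pairwise (2-vs-2) test. The sketch below illustrates that general setup; it is not the authors' implementation, and the embedding dimensionality, voxel count, ridge penalty, and the random toy data are illustrative assumptions only.

```python
# Minimal sketch (assumptions only) of a neural encoding / decoding pipeline:
# a ridge-regression map between sentence embeddings and fMRI voxel responses,
# evaluated with cross-validated pairwise (2-vs-2) accuracy.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold
from scipy.spatial.distance import cosine

rng = np.random.default_rng(0)
n_sentences, emb_dim, n_voxels = 240, 768, 5000     # hypothetical sizes
X = rng.standard_normal((n_sentences, emb_dim))      # sentence embeddings from a DSM
Y = rng.standard_normal((n_sentences, n_voxels))     # fMRI voxel responses

def pairwise_accuracy(true, pred):
    """2-vs-2 test: a prediction should be closer (cosine distance) to its own
    target than to the other item's target."""
    correct, total = 0, 0
    for i in range(len(true)):
        for j in range(i + 1, len(true)):
            match = cosine(pred[i], true[i]) + cosine(pred[j], true[j])
            mismatch = cosine(pred[i], true[j]) + cosine(pred[j], true[i])
            correct += match < mismatch
            total += 1
    return correct / total

def cross_validated_score(inputs, targets, alpha=1.0, n_splits=5):
    """Fit a ridge map on training folds, score 2-vs-2 accuracy on held-out folds."""
    scores = []
    for train, test in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(inputs):
        model = Ridge(alpha=alpha).fit(inputs[train], targets[train])
        scores.append(pairwise_accuracy(targets[test], model.predict(inputs[test])))
    return float(np.mean(scores))

# Encoding: sentence embeddings -> voxel activity.
print("encoding 2-vs-2 accuracy:", cross_validated_score(X, Y))
# Decoding: voxel activity -> sentence embeddings (same machinery, inputs swapped).
print("decoding 2-vs-2 accuracy:", cross_validated_score(Y, X))
```

In the study's setting, X would hold sentence representations from one of the 12 DSMs and Y the corresponding voxel responses; swapping inputs and targets turns the same linear machinery into a decoder.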
Author Zhang, Jiajun
Wang, Shaonan
Sun, Jingyuan
Zong, Chengqing
Author_xml – sequence: 1
  givenname: Jingyuan
  orcidid: 0000-0001-8745-6104
  surname: Sun
  fullname: Sun, Jingyuan
  organization: National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences (CAS), Beijing, China
– sequence: 2
  givenname: Shaonan
  orcidid: 0000-0001-5455-1359
  surname: Wang
  fullname: Wang, Shaonan
  email: shaonan.wang@nlpr.ia.ac.cn
  organization: National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences (CAS), Beijing, China
– sequence: 3
  givenname: Jiajun
  orcidid: 0000-0001-5293-7434
  surname: Zhang
  fullname: Zhang, Jiajun
  organization: National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences (CAS), Beijing, China
– sequence: 4
  givenname: Chengqing
  orcidid: 0000-0002-9864-3818
  surname: Zong
  fullname: Zong, Chengqing
  organization: National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences (CAS), Beijing, China
BackLink https://www.ncbi.nlm.nih.gov/pubmed/33052868 (View this record in MEDLINE/PubMed)
CODEN ITNNAL
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
DOI 10.1109/TNNLS.2020.3027595
Discipline Computer Science
EISSN 2162-2388
EndPage 603
ExternalDocumentID 33052868
10_1109_TNNLS_2020_3027595
9223750
Genre orig-research
Research Support, Non-U.S. Gov't
Journal Article
GrantInformation_xml – fundername: Natural Science Foundation of China
  grantid: 61906189
  funderid: 10.13039/501100001809
– fundername: Beijing Municipal Science and Technology Project
  grantid: Z181100008918017
  funderid: 10.13039/501100009592
– fundername: Beijing Advanced Innovation Center for Language Resources and the Beijing Academy of Artificial Intelligence
  grantid: BAAI2019QN0504
  funderid: 10.13039/501100012424
ISICitedReferencesCount 31
ISSN 2162-237X
Issue 2
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
ORCID 0000-0001-5455-1359
0000-0001-5293-7434
0000-0002-9864-3818
0000-0001-8745-6104
PMID 33052868
PQID 2487437813
PQPubID 85436
PageCount 15
PublicationCentury 2000
PublicationDate 2021-02-01
PublicationDateYYYYMMDD 2021-02-01
PublicationDate_xml – month: 02
  year: 2021
  text: 2021-02-01
  day: 01
PublicationDecade 2020
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: Piscataway
PublicationTitle IEEE Transactions on Neural Networks and Learning Systems
PublicationTitleAbbrev TNNLS
PublicationTitleAlternate IEEE Trans Neural Netw Learn Syst
PublicationYear 2021
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 589
SubjectTerms Ablation
Algorithms
Artificial neural networks
Brain
Brain - diagnostic imaging
Brain mapping
Brain modeling
Brain-Computer Interfaces
Brain–machine interfaces
Cerebral Cortex - anatomy & histology
Cerebral Cortex - physiology
Coders
Computational neuroscience
Computer Simulation
Decoding
Deep Learning
distributed semantic representations
Encoding
Functional magnetic resonance imaging
Humans
Image Processing, Computer-Assisted
Interfaces
Language
Linguistics
Machine learning
Magnetic Resonance Imaging
Model accuracy
Natural Language Processing
Neural coding
neural decoding
neural encoding
Neural networks
Neural Networks, Computer
Neuroimaging
Occipital Lobe - diagnostic imaging
Reading
Representations
Reproducibility of Results
Selectivity
Semantics
Sentences
Stimuli
Task analysis
Temporal gyrus
Temporal Lobe - diagnostic imaging
Transformers
Title Neural Encoding and Decoding With Distributed Sentence Representations
URI https://ieeexplore.ieee.org/document/9223750
https://www.ncbi.nlm.nih.gov/pubmed/33052868
https://www.proquest.com/docview/2487437813
https://www.proquest.com/docview/2451380833
Volume 32