OpCodeBERT: A Method for Python Code Representation Learning by BERT With Opcode

Bibliographic Details
Published in: IEEE Transactions on Software Engineering, Vol. 51, no. 11, pp. 3103-3116
Main Authors: Qiu, Canyu; Liu, Jianxun; Xiao, Xiaocong; Xiao, Yong
Format: Journal Article
Language: English
Published: New York: IEEE / IEEE Computer Society, 01.11.2025
ISSN: 0098-5589 (print); 1939-3520 (electronic)
Online Access: https://doi.org/10.1109/TSE.2025.3610244
Abstract Programming language pre-training models have made significant progress in code representation learning in recent years. Although structures such as data flow and the Abstract Syntax Tree (AST) have been widely used to enhance code representation, no prior work, to date, has specifically explored using the intermediate code of source programs for this purpose. For example, Python's intermediate code, the opcode, captures not only the stack-based data input and output during program execution but also the concrete execution order and control-flow information; such information is absent from, or difficult to recover directly in, source code, data flow, ASTs, and other structures. In this paper, we propose the OpCodeBERT approach ( https://github.com/qcy321/OpCodeBERT ), the first to utilize Python opcode for code representation learning; it improves code representations by jointly encoding the underlying execution logic, comments, and source code. To support training on opcode, we filter the public datasets to exclude unparsable samples and propose a novel opcode-to-sequence mapping method that converts opcode into a form suitable for model input. We pre-train OpCodeBERT with a two-stage masked language modeling (MLM) objective and multi-modal contrastive learning. To evaluate its effectiveness, we conduct experiments on multiple downstream tasks. The results show that OpCodeBERT performs strongly on these tasks, validating the benefit of incorporating opcode and demonstrating the feasibility of this method for code representation learning.
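
For illustration, the minimal sketch below shows how Python's standard-library dis module exposes the opcode stream of a function, and how that stream could be flattened into a token sequence for model input. The example function and the opcode_to_sequence helper, including its naive opname/argument flattening, are assumptions made for this sketch; they are not the paper's actual opcode-to-sequence mapping, which is described in the full text.

    import dis

    def example(nums):
        """Return the squares of the positive numbers in nums."""
        return [n * n for n in nums if n > 0]

    def opcode_to_sequence(fn):
        # Hypothetical flattening: one token per opcode name
        # (e.g. LOAD_FAST), plus one token for the human-readable
        # argument when the instruction has one.
        tokens = []
        for ins in dis.get_instructions(fn):
            tokens.append(ins.opname)
            if ins.argrepr:
                tokens.append(ins.argrepr)
        return tokens

    print(opcode_to_sequence(example))
    # The exact output varies across CPython versions; each element
    # is an opcode name such as 'LOAD_FAST' or an argument string.

The instruction order and jump targets in this stream are what carry the execution-order and control-flow information that the abstract contrasts with source code and ASTs.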
Author Qiu, Canyu
Liu, Jianxun
Xiao, Yong
Xiao, Xiaocong
Author_xml – sequence: 1
  givenname: Canyu
  orcidid: 0009-0004-9261-1446
  surname: Qiu
  fullname: Qiu, Canyu
  email: qcanyu66@gmail.com
  organization: School of Computer Science and Engineering, Hunan University of Science and Technology, and Hunan Provincial Key Laboratory for Services Computing and Novel Software Technology, Xiangtan, China
– sequence: 2
  givenname: Jianxun
  orcidid: 0000-0003-0722-152X
  surname: Liu
  fullname: Liu, Jianxun
  email: ljx529@gmail.com
  organization: School of Computer Science and Engineering, Hunan University of Science and Technology, and Hunan Provincial Key Laboratory for Services Computing and Novel Software Technology, Xiangtan, China
– sequence: 3
  givenname: Xiaocong
  orcidid: 0009-0003-9626-2960
  surname: Xiao
  fullname: Xiao, Xiaocong
  email: xiaocongxiao1102@gmail.com
  organization: School of Computer Science and Engineering, Hunan University of Science and Technology, and Hunan Provincial Key Laboratory for Services Computing and Novel Software Technology, Xiangtan, China
– sequence: 4
  givenname: Yong
  orcidid: 0000-0001-8239-6149
  surname: Xiao
  fullname: Xiao, Yong
  email: yongx853@gmail.com
  organization: School of Computer Science and Engineering, Hunan University of Science and Technology, and Hunan Provincial Key Laboratory for Services Computing and Novel Software Technology, Xiangtan, China
CODEN IESEDJ
ContentType Journal Article
Copyright Copyright IEEE Computer Society 2025
DOI 10.1109/TSE.2025.3610244
Discipline Computer Science
EISSN 1939-3520
EndPage 3116
Genre orig-research
GrantInformation_xml – fundername: National Key Research and Development Program of China
  grantid: 2020YFB1707602
  funderid: 10.13039/501100012166
– fundername: National Natural Science Foundation of China
  grantid: 61866013
  funderid: 10.13039/501100001809
ISSN 0098-5589
IsPeerReviewed true
IsScholarly true
Issue 11
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
PageCount 14
PublicationDate 2025-11-01
PublicationPlace New York
PublicationTitle IEEE transactions on software engineering
PublicationTitleAbbrev TSE
PublicationYear 2025
Publisher IEEE
IEEE Computer Society
StartPage 3103
SubjectTerms Biological system modeling
Codes
contrastive learning
Data models
Effectiveness
Learning
Logic
MLM
Programming languages
Python
Python opcode
Representation learning
Representations
Semantics
Source code
Source coding
Syntactics
Training
underlying execution logic
URI https://ieeexplore.ieee.org/document/11169752
https://www.proquest.com/docview/3273107171
Volume 51