The complexity dynamics of grokking

Bibliographic Details
Published in: Physica D, Vol. 482, Art. no. 134859
Main Authors: DeMoss, Branton; Sapora, Silvia; Foerster, Jakob; Hawes, Nick; Posner, Ingmar
Format: Journal Article
Language: English
Published: Elsevier B.V., 1 November 2025
Subjects: Algorithmic complexity; Grokking; Phase transition
ISSN: 0167-2789
Online Access: https://dx.doi.org/10.1016/j.physd.2025.134859
Abstract:
We demonstrate the existence of a complexity phase transition in neural networks by studying the grokking phenomenon, where networks suddenly transition from memorization to generalization long after overfitting their training data. To characterize this phase transition, we introduce a theoretical framework for measuring complexity based on rate–distortion theory and Kolmogorov complexity, which can be understood as principled lossy compression for networks. We find that properly regularized networks exhibit a sharp phase transition: complexity rises during memorization, then falls as the network discovers a simpler underlying pattern that generalizes. In contrast, unregularized networks remain trapped in a high-complexity memorization phase. We establish an explicit connection between our complexity measure and generalization bounds, providing a theoretical foundation for the link between lossy compression and generalization. Our framework achieves compression ratios 30–40× better than naïve approaches, enabling precise tracking of complexity dynamics. Finally, we introduce a regularization method based on spectral entropy that encourages networks toward low-complexity representations by penalizing their intrinsic dimension.
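The "principled lossy compression" reading of the abstract can be made concrete with a minimal sketch, shown below. The specifics are assumptions, not the paper's pipeline: uniform 8-bit quantization as the lossy rate-distortion step, zlib as a stand-in lossless coder, and the function name lossy_complexity_bits. The compressed size is a computable upper bound on Kolmogorov complexity; the paper's reported 30–40× improvement over naïve approaches presumably comes from a more careful procedure than this.

```python
# Minimal sketch (illustrative assumptions, not the authors' exact method):
# estimate model complexity by quantizing the weights (lossy step) and
# measuring the losslessly compressed size (upper bound on Kolmogorov complexity).
import zlib
import numpy as np

def lossy_complexity_bits(weights: np.ndarray, n_bits: int = 8) -> int:
    """Upper-bound the description length of `weights`, in bits."""
    lo, hi = float(weights.min()), float(weights.max())
    levels = 2 ** n_bits - 1
    # Lossy step (rate-distortion): uniform quantization to `levels` + 1 values.
    q = np.round((weights - lo) / (hi - lo + 1e-12) * levels).astype(np.uint8)
    # Lossless step: compressed size serves as a computable complexity proxy.
    return 8 * len(zlib.compress(q.tobytes(), 9))

rng = np.random.default_rng(0)
memorizer = rng.normal(size=(256, 256))         # unstructured weights: compress poorly
generalizer = np.tile(memorizer[:1], (256, 1))  # low-rank structure: compress well
print(lossy_complexity_bits(memorizer), lossy_complexity_bits(generalizer))
```

On these toy inputs the unstructured matrix barely compresses while the rank-1 matrix compresses drastically, mirroring the claimed complexity drop when a network trades memorized noise for a simple general rule.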
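The abstract's link between the complexity measure and generalization bounds has a textbook Occam-style form, given below for orientation only; it need not match the paper's exact result. Here R(h) is the true risk, \hat{R}_m(h) the empirical risk on m i.i.d. samples, and K(h) the length in bits of a prefix-free description of hypothesis h.

```latex
% Standard Occam-style compression bound (illustrative; not necessarily
% the paper's statement): with probability at least 1 - \delta,
% simultaneously for every hypothesis h,
\[
  R(h) \;\le\; \hat{R}_m(h) \;+\; \sqrt{\frac{K(h)\,\ln 2 \;+\; \ln(1/\delta)}{2m}} .
\]
```

Shorter compressed descriptions give tighter bounds, which is why tracking a lossy-compression complexity during training is informative about generalization.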
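The closing claim, a regularizer "based on spectral entropy ... penalizing their intrinsic dimension", can likewise be sketched. The penalty below is an assumption modeled on the effective-rank idea (the exponential of the spectral entropy); the names spectral_entropy, regularized_loss, and the weight lam are illustrative, not the paper's API.

```python
# Hypothetical spectral-entropy regularizer (illustrative sketch): the entropy
# of a weight matrix's normalized singular values; exp(entropy) is the
# effective rank, so penalizing it pushes layers toward low intrinsic dimension.
import numpy as np

def spectral_entropy(W: np.ndarray, eps: float = 1e-12) -> float:
    """Shannon entropy (nats) of the normalized singular-value spectrum of W."""
    s = np.linalg.svd(W, compute_uv=False)        # singular values
    p = s / (s.sum() + eps)                       # treat spectrum as a distribution
    return float(-np.sum(p * np.log(p + eps)))    # exp of this is the effective rank

def regularized_loss(task_loss, weight_matrices, lam=1e-2):
    # Combined objective: task loss plus summed spectral-entropy penalty.
    return task_loss + lam * sum(spectral_entropy(W) for W in weight_matrices)

W_low = np.outer(np.ones(64), np.ones(64))                 # rank 1: entropy near 0
W_high = np.random.default_rng(0).normal(size=(64, 64))    # near-full rank: high entropy
print(spectral_entropy(W_low), spectral_entropy(W_high))
```

In a real training loop the penalty would be computed with a differentiable SVD (e.g., torch.linalg.svdvals) so its gradient can flow into the weights.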
Authors:
1. DeMoss, Branton (bdemoss@robots.ox.ac.uk; ORCID 0000-0001-6828-6787)
2. Sapora, Silvia
3. Foerster, Jakob
4. Hawes, Nick
5. Posner, Ingmar
Copyright: 2025 The Authors
DOI: 10.1016/j.physd.2025.134859
License: This is an open access article under the CC BY license.