Bayesian Variational Inference for Exponential Random Graph Models

Published in: Journal of Computational and Graphical Statistics, Volume 29, Issue 4, pp. 910-928
Main authors: Tan, Linda S. L.; Friel, Nial
Format: Journal Article
Language: English
Publication details: Alexandria: Taylor & Francis, 01.10.2020
ISSN: 1061-8600 (print); 1537-2715 (online)
Online access: Get full text
Abstract
Deriving Bayesian inference for exponential random graph models (ERGMs) is a challenging "doubly intractable" problem as the normalizing constants of the likelihood and posterior density are both intractable. Markov chain Monte Carlo (MCMC) methods which yield Bayesian inference for ERGMs, such as the exchange algorithm, are asymptotically exact but computationally intensive, as a network has to be drawn from the likelihood at every step using, for instance, a "tie no tie" sampler. In this article, we develop a variety of variational methods for Gaussian approximation of the posterior density and model selection. These include nonconjugate variational message passing based on an adjusted pseudolikelihood and stochastic variational inference. To overcome the computational hurdle of drawing a network from the likelihood at each iteration, we propose stochastic gradient ascent with biased but consistent gradient estimates computed using adaptive self-normalized importance sampling. These methods provide attractive fast alternatives to MCMC for posterior approximation. We illustrate the variational methods using real networks and compare their accuracy with results obtained via MCMC and Laplace approximation. Supplementary materials for this article are available online.
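As a minimal sketch of why the problem is "doubly intractable" (in our own notation, which may differ from the paper's), the ERGM likelihood has a normalizing constant that sums over all possible networks, and the posterior has a second intractable constant:

\[
p(y \mid \theta) = \frac{\exp\{\theta^{\top} s(y)\}}{z(\theta)}, \qquad
z(\theta) = \sum_{y' \in \mathcal{Y}} \exp\{\theta^{\top} s(y')\}, \qquad
p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{p(y)},
\]

where s(y) is the vector of network statistics and \mathcal{Y} is the set of all graphs on the given node set; both z(\theta) and p(y) are intractable for all but trivially small networks.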
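To illustrate the self-normalized importance sampling idea mentioned in the abstract, here is a minimal sketch assuming networks have been simulated once at a reference parameter theta0 (e.g. with a tie/no-tie sampler) and their sufficient statistics stored; the estimator targets grad log z(theta) = E_{p(y|theta)}[s(y)], a quantity arising in gradients of variational objectives for ERGMs. The function name and interface are our own, not taken from the paper or its supplementary code.

import numpy as np

def snis_grad_logz(theta, theta0, stats):
    # stats: (S, p) array of sufficient statistics s(y_i) for S networks
    # simulated from the ERGM at theta0; no new networks are drawn here.
    # Returns a biased but consistent estimate of E_{p(y|theta)}[s(y)],
    # which equals grad log z(theta).
    log_w = stats @ (theta - theta0)   # unnormalized log importance weights
    log_w -= log_w.max()               # stabilize before exponentiating
    w = np.exp(log_w)
    w /= w.sum()                       # self-normalization: z(theta)/z(theta0) cancels
    return w @ stats                   # weighted average of network statistics

The weights are proportional to exp{(theta - theta0)^T s(y_i)}; self-normalization removes the unknown ratio of normalizing constants, which is why the estimate is biased but consistent. The abstract describes the scheme as adaptive; this sketch omits any adaptation and keeps theta0 fixed.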
Author Friel, Nial
Tan, Linda S. L.
Author_xml – sequence: 1
  givenname: Linda S. L.
  surname: Tan
  fullname: Tan, Linda S. L.
  email: statsll@nus.edu.sg
  organization: Department of Statistics and Applied Probability, National University of Singapore
– sequence: 2
  givenname: Nial
  surname: Friel
  fullname: Friel, Nial
  organization: School of Mathematics and Statistics, University College Dublin
ContentType Journal Article
Copyright 2020 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America
DOI 10.1080/10618600.2020.1740714
Discipline Statistics
Mathematics
EISSN 1537-2715
EndPage 928
Genre Article
GrantInformation_xml – fundername: Insight Centre for Data Analytics is supported by Science Foundation Ireland
  grantid: SFI/12/RC/
ISSN 1061-8600
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 4
Language English
OpenAccessLink http://hdl.handle.net/10197/12008
PageCount 19
PublicationDate 2020-10-01
PublicationPlace Alexandria
PublicationPlace_xml – name: Alexandria
PublicationTitle Journal of computational and graphical statistics
PublicationYear 2020
Publisher Taylor & Francis
StartPage 910
SubjectTerms Adaptive sampling
Adaptive self-normalized importance sampling
Adjusted pseudolikelihood
Algorithms
Approximation
Ascent
Bayesian analysis
Density
Exponential random graph model
Importance sampling
Importance weighted lower bound
Iterative methods
Markov chains
Mathematical analysis
Message passing
Monte Carlo simulation
Nonconjugate variational message passing
Normalizing (statistics)
Statistical inference
Stochastic variational inference
Variational methods
Title Bayesian Variational Inference for Exponential Random Graph Models
URI https://www.tandfonline.com/doi/abs/10.1080/10618600.2020.1740714
https://www.proquest.com/docview/2471676574
Volume 29