Edge-Enhanced Dual-Stream Perception Network for Monocular Depth Estimation
| Published in: | Electronics (Basel), Vol. 13, No. 9, p. 1652 |
|---|---|
| Main Authors: | Liu, Zihang; Wang, Quande |
| Format: | Journal Article |
| Language: | English |
| Published: | Basel: MDPI AG, 01.05.2024 |
| Subjects: | Accuracy; Algorithms; Analysis; Artificial neural networks; Autonomous navigation; Coders; Datasets; Deep learning; Estimation; Neural networks; Robots |
| ISSN: | 2079-9292 |
| Online Access: | Full text |
| Abstract | Estimating depth from a single RGB image has a wide range of applications, such as robot navigation and autonomous driving. Currently, Convolutional Neural Networks based on an encoder–decoder architecture are the most popular methods for estimating depth maps. However, convolutional operators are limited in modeling long-range dependencies, which often leads to inaccurate depth predictions at object edges. To address these issues, this paper introduces a new edge-enhanced dual-stream monocular depth estimation method. ResNet and Swin Transformer are combined to extract local and global features more effectively, which benefits depth estimation. To better integrate information from the two encoder branches and the shallow branch of the decoder, we designed a lightweight decoder based on a multi-head Cross-Attention Module. Furthermore, to improve the boundary clarity of objects in the depth map, a loss function with an additional penalty on depth estimation errors at object edges is presented. Results on three datasets, NYU Depth V2, KITTI, and SUN RGB-D, show that the proposed method achieves better performance for monocular depth estimation, and that it generalizes well to various scenarios and real-world images. |
|---|---|
| Audience | Academic |
| Author | Wang, Quande; Liu, Zihang |
| Author_xml | – sequence: 1 givenname: Zihang surname: Liu fullname: Liu, Zihang – sequence: 2 givenname: Quande surname: Wang fullname: Wang, Quande |
| CitedBy_id | crossref_primary_10_1016_j_optlastec_2025_113892 crossref_primary_10_1109_ACCESS_2025_3579429 crossref_primary_10_3390_electronics13204020 crossref_primary_10_3390_s24237752 |
| Cites_doi | 10.3390/electronics12020350 10.1109/TCYB.2013.2265378 10.1109/ICCV48922.2021.01196 10.1364/JOSAA.8.000377 10.1109/TPAMI.2015.2505283 10.3390/electronics12061450 10.1109/CVPR.2017.243 10.1109/CVPR.2017.634 10.1007/978-3-642-33715-4_54 10.1049/ell2.13019 10.1109/CVPR.2015.7298655 10.1109/TCSVT.2021.3049869 10.1007/s11263-007-0071-y 10.1007/978-3-030-01267-0_4 10.1109/ICVR55215.2022.9847988 10.1109/ICCV48922.2021.01596 10.1109/CVPR.2018.00037 10.1007/s00521-021-06462-0 10.1007/978-3-030-58574-7_35 10.3390/electronics12224669 10.1109/CVPR.2012.6248074 10.1109/3DV.2016.32 10.1007/978-3-030-01219-9_14 10.1109/JSEN.2021.3120753 10.3390/app13179940 10.1109/TCCN.2024.3360527 10.1109/TNNLS.2011.2180025 10.1109/CVPR.2018.00214 10.24963/ijcai.2019/98 10.1007/978-3-319-46484-8_45 |
| ContentType | Journal Article |
| Copyright | COPYRIGHT 2024 MDPI AG 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
| DOI | 10.3390/electronics13091652 |
| DatabaseName | CrossRef Electronics & Communications Abstracts Technology Research Database ProQuest SciTech Collection ProQuest Technology Collection ProQuest Central (Alumni) ProQuest Central UK/Ireland Advanced Technologies & Computer Science Collection ProQuest Central Essentials ProQuest Central Technology Collection ProQuest One Community College ProQuest Central SciTech Premium Collection Advanced Technologies Database with Aerospace Advanced Technologies & Aerospace Database ProQuest Advanced Technologies & Aerospace Collection ProQuest Central Premium ProQuest One Academic Publicly Available Content Database ProQuest One Academic Middle East (New) ProQuest One Academic Eastern Edition (DO NOT USE) ProQuest One Applied & Life Sciences ProQuest One Academic (retired) ProQuest One Academic UKI Edition ProQuest Central China |
| DatabaseTitle | CrossRef Publicly Available Content Database Advanced Technologies & Aerospace Collection Technology Collection Technology Research Database ProQuest One Academic Middle East (New) ProQuest Advanced Technologies & Aerospace Collection ProQuest Central Essentials ProQuest One Academic Eastern Edition Electronics & Communications Abstracts ProQuest Central (Alumni Edition) SciTech Premium Collection ProQuest One Community College ProQuest Technology Collection ProQuest SciTech Collection ProQuest Central China ProQuest Central Advanced Technologies & Aerospace Database ProQuest One Applied & Life Sciences ProQuest One Academic UKI Edition ProQuest Central Korea ProQuest Central (New) ProQuest One Academic Advanced Technologies Database with Aerospace ProQuest One Academic (New) |
| DatabaseTitleList | Publicly Available Content Database CrossRef |
| Discipline | Engineering |
| EISSN | 2079-9292 |
| ExternalDocumentID | A793567357 10_3390_electronics13091652 |
| GeographicLocations | Germany |
| ISICitedReferencesCount | 6 |
| ISSN | 2079-9292 |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 9 |
| Language | English |
| OpenAccessLink | https://www.proquest.com/publiccontent/docview/3053158080?pq-origsite=%requestingapplication% |
| PQID | 3053158080 |
| PQPubID | 2032404 |
| PublicationCentury | 2000 |
| PublicationDate | 2024-05-01 |
| PublicationDecade | 2020 |
| PublicationPlace | Basel |
| PublicationTitle | Electronics (Basel) |
| PublicationYear | 2024 |
| Publisher | MDPI AG |
| SourceID | proquest gale crossref |
| SourceType | Aggregation Database Enrichment Source Index Database |
| StartPage | 1652 |
| SubjectTerms | Accuracy; Algorithms; Analysis; Artificial neural networks; Autonomous navigation; Coders; Datasets; Deep learning; Estimation; Neural networks; Robots |
| Title | Edge-Enhanced Dual-Stream Perception Network for Monocular Depth Estimation |
| URI | https://www.proquest.com/docview/3053158080 |
| Volume | 13 |
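The abstract above mentions a loss function with an additional penalty on depth estimation errors at object edges, but the record does not give its exact formulation. Below is a minimal sketch of one common way to implement such an edge-weighted depth loss, assuming PyTorch; the Sobel-based edge mask, the `edge_weight` hyper-parameter, and all function names are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch of an edge-weighted depth loss (assumes PyTorch).
# It only illustrates the general idea of penalising depth errors more
# heavily at object edges; it is not the formulation from the paper.
import torch
import torch.nn.functional as F


def sobel_edges(img: torch.Tensor) -> torch.Tensor:
    """Per-pixel edge magnitude for a (B, 1, H, W) depth map."""
    kx = torch.tensor([[-1., 0., 1.],
                       [-2., 0., 2.],
                       [-1., 0., 1.]], device=img.device).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    gx = F.conv2d(img, kx, padding=1)
    gy = F.conv2d(img, ky, padding=1)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)


def edge_weighted_l1(pred: torch.Tensor, gt: torch.Tensor,
                     edge_weight: float = 2.0) -> torch.Tensor:
    """L1 depth loss with extra weight where the ground truth has edges.

    `edge_weight` is a hypothetical hyper-parameter: an error at a strong
    edge pixel costs up to (1 + edge_weight) times an error elsewhere.
    """
    abs_err = (pred - gt).abs()
    edges = sobel_edges(gt)
    # Normalise edge magnitudes to [0, 1] per image before weighting.
    edges = edges / (edges.amax(dim=(2, 3), keepdim=True) + 1e-6)
    weights = 1.0 + edge_weight * edges
    return (weights * abs_err).mean()


# Usage: pred and gt are (B, 1, H, W) depth tensors.
# loss = edge_weighted_l1(pred, gt)
```

In practice the edge mask could equally be derived from the input RGB image or a boundary detector, and the weighting could be applied to a scale-invariant or gradient-based term instead of plain L1; the weighted-L1 form is simply the most compact illustration.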