Learning Low-Precision Structured Subnetworks Using Joint Layerwise Channel Pruning and Uniform Quantization

Published in: Applied Sciences, Volume 12, Issue 15, p. 7829
Main Authors: Zhang, Xinyu; Colbert, Ian; Das, Srinjoy
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 1 August 2022
ISSN: 2076-3417
Abstract Pruning and quantization are core techniques used to reduce the inference costs of deep neural networks. Among the state-of-the-art pruning techniques, magnitude-based pruning algorithms have demonstrated consistent success in the reduction of both weight and feature map complexity. However, we find that existing measures of neuron (or channel) importance estimation used for such pruning procedures have at least one of two limitations: (1) failure to consider the interdependence between successive layers; and/or (2) performing the estimation in a parametric setting or by using distributional assumptions on the feature maps. In this work, we demonstrate that the importance rankings of the output neurons of a given layer strongly depend on the sparsity level of the preceding layer, and therefore, naïvely estimating neuron importance to drive magnitude-based pruning will lead to suboptimal performance. Informed by this observation, we propose a purely data-driven nonparametric, magnitude-based channel pruning strategy that works in a greedy manner based on the activations of the previous sparsified layer. We demonstrate that our proposed method works effectively in combination with statistics-based quantization techniques to generate low precision structured subnetworks that can be efficiently accelerated by hardware platforms such as GPUs and FPGAs. Using our proposed algorithms, we demonstrate increased performance per memory footprint over existing solutions across a range of discriminative and generative networks.
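The greedy, activation-driven pruning and statistics-based quantization described in the abstract can be sketched as follows. This is a minimal NumPy illustration under assumed conventions, not the authors' implementation: the L1 norm of each channel's activations stands in for the importance measure, ReLU fully connected layers stand in for the network, and max-abs scaling stands in for the quantization statistic. The key point from the paper is that each layer's channel ranking is computed on activations that have already passed through the sparsified preceding layers.

```python
import numpy as np

def greedy_layerwise_prune(weights, x, keep_ratio=0.5):
    """Prune output channels layer by layer, ranking each layer's channels
    on activations produced by the already-sparsified preceding layers.
    weights: list of [C_out, C_in] matrices; x: calibration batch [C_in, N]."""
    pruned, h = [], x
    keep = np.arange(weights[0].shape[1])      # input channels of layer 0
    for W in weights:
        W = W[:, keep]                         # drop inputs pruned upstream
        a = np.maximum(W @ h, 0.0)             # ReLU feature maps [C_out, N]
        importance = np.abs(a).sum(axis=1)     # data-driven L1 channel importance
        k = max(1, int(round(keep_ratio * W.shape[0])))
        keep = np.sort(np.argsort(importance)[-k:])
        pruned.append(W[keep])                 # structured (row) pruning
        h = a[keep]                            # propagate sparsified activations
    return pruned

def uniform_quantize(W, bits=8):
    """Symmetric uniform quantization with a statistics-based (max-abs) scale."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(np.abs(W).max() / qmax, 1e-12)
    q = np.clip(np.round(W / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale
```

Because pruning removes whole rows (and the corresponding columns of the next layer) and quantization stores integer tensors with a single scale per tensor, the resulting subnetwork maps directly onto dense low-precision kernels on GPUs and FPGAs, which is the hardware-efficiency argument the abstract makes.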
ORCID: Colbert, Ian: 0000-0002-1669-5519; Das, Srinjoy: 0000-0003-3821-8112
Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
DOI: 10.3390/app12157829
Subject Terms: Algorithms; channel pruning; Energy consumption; joint pruning; layerwise pruning; Neural networks; Performance evaluation; quantization; Sparsity