Learning Low-Precision Structured Subnetworks Using Joint Layerwise Channel Pruning and Uniform Quantization
| Published in: | Applied Sciences, Vol. 12, No. 15, p. 7829 |
|---|---|
| Main Authors: | Zhang, Xinyu; Colbert, Ian; Das, Srinjoy |
| Format: | Journal Article |
| Language: | English |
| Published: | Basel: MDPI AG, 01.08.2022 |
| Subjects: | Algorithms; channel pruning; Energy consumption; joint pruning; layerwise pruning; Neural networks; Performance evaluation; quantization; Sparsity |
| ISSN: | 2076-3417 |
| Online Access: | https://doi.org/10.3390/app12157829; https://doaj.org/article/7749a2fc55bd4415a03d5ee7d2d8e281 |

| Abstract | Pruning and quantization are core techniques used to reduce the inference costs of deep neural networks. Among the state-of-the-art pruning techniques, magnitude-based pruning algorithms have demonstrated consistent success in the reduction of both weight and feature map complexity. However, we find that existing measures of neuron (or channel) importance estimation used for such pruning procedures have at least one of two limitations: (1) failure to consider the interdependence between successive layers; and/or (2) performing the estimation in a parametric setting or by using distributional assumptions on the feature maps. In this work, we demonstrate that the importance rankings of the output neurons of a given layer strongly depend on the sparsity level of the preceding layer, and therefore, naïvely estimating neuron importance to drive magnitude-based pruning will lead to suboptimal performance. Informed by this observation, we propose a purely data-driven nonparametric, magnitude-based channel pruning strategy that works in a greedy manner based on the activations of the previous sparsified layer. We demonstrate that our proposed method works effectively in combination with statistics-based quantization techniques to generate low precision structured subnetworks that can be efficiently accelerated by hardware platforms such as GPUs and FPGAs. Using our proposed algorithms, we demonstrate increased performance per memory footprint over existing solutions across a range of discriminative and generative networks. |
|---|---|
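
The record contains no implementation details beyond the abstract, so the following is a minimal, hypothetical PyTorch sketch of the general recipe it describes: output channels of each convolutional layer are ranked by the magnitude of the activations they produce once the preceding layers have already been sparsified, low-ranked channels are pruned greedily, and the surviving weights are uniformly quantized from simple tensor statistics. All names and design choices here (`prune_and_quantize`, `keep_ratio`, mean-absolute-activation scoring, symmetric max-abs scaling) are illustrative assumptions, not the authors' published algorithm or code.

```python
# Hypothetical sketch, not the authors' code: greedy layerwise channel pruning
# driven by activations of the already-sparsified preceding layers, followed by
# statistics-based uniform quantization of the surviving weights.
import torch
import torch.nn as nn


def channel_importance(activations: torch.Tensor) -> torch.Tensor:
    """Mean absolute activation per output channel: (N, C, H, W) -> (C,)."""
    return activations.abs().mean(dim=(0, 2, 3))


def prune_output_channels(conv: nn.Conv2d, keep_idx: torch.Tensor) -> None:
    """Zero out every output channel of `conv` that is not in `keep_idx`."""
    mask = torch.zeros(conv.out_channels, dtype=torch.bool, device=conv.weight.device)
    mask[keep_idx] = True
    conv.weight[~mask] = 0.0
    if conv.bias is not None:
        conv.bias[~mask] = 0.0


def uniform_quantize(w: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """Symmetric uniform quantization using the tensor's max-abs statistic."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    return torch.round(w / scale).clamp(-qmax, qmax) * scale


@torch.no_grad()
def prune_and_quantize(model: nn.Sequential, calib: torch.Tensor,
                       keep_ratio: float = 0.5, num_bits: int = 8) -> None:
    """Walk the network front to back so each layer is scored on activations
    that already reflect the pruning of all earlier layers."""
    x = calib
    for layer in model:
        if isinstance(layer, nn.Conv2d):
            out = layer(x)                               # response before pruning this layer
            scores = channel_importance(out)             # data-driven channel ranking
            k = max(1, int(keep_ratio * layer.out_channels))
            prune_output_channels(layer, torch.topk(scores, k).indices)
            layer.weight.copy_(uniform_quantize(layer.weight, num_bits))
            x = layer(x)                                 # propagate sparsified feature maps
        else:
            x = layer(x)


# Example usage on a toy network with a small calibration batch.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
)
prune_and_quantize(model, torch.randn(8, 3, 32, 32), keep_ratio=0.5, num_bits=8)
```

A real structured-pruning flow would physically remove the zeroed channels (and the matching input channels of the following layer) to realize the memory and latency savings on GPUs or FPGAs; the sketch only masks them to keep the example short.
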
| Authors | Zhang, Xinyu; Colbert, Ian (ORCID: 0000-0002-1669-5519); Das, Srinjoy (ORCID: 0000-0003-3821-8112) |
|---|---|
| Cited by | 10.1007/s13042-024-02229-w; 10.3390/electronics12071683 |
| Copyright | 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
| DOI | 10.3390/app12157829 |
| Discipline | Engineering Sciences (General) |