Representation and compression of Residual Neural Networks through a multilayer network based approach
| Published in: | Expert Systems with Applications, Volume 215, Article 119391 |
|---|---|
| Main Authors: | |
| Format: | Journal Article |
| Language: | English |
| Published: | Elsevier Ltd, 01.04.2023 |
| Subjects: | |
| ISSN: | 0957-4174, 1873-6793 |
| Online Access: | Get full text |
| Summary: | In recent years, different types of Residual Neural Networks (ResNets, for short) have been introduced to improve the performance of deep Convolutional Neural Networks. To cope with the possible redundancy of the layer structure of ResNets and to use them on devices with limited computational capabilities, several tools for exploring and compressing such networks have been proposed. In this paper, we provide a contribution in this setting. In particular, we propose an approach for the representation and compression of a ResNet based on the use of a multilayer network. This is a structure sufficiently powerful to represent and manipulate a ResNet, as well as other families of deep neural networks. Our compression approach uses a multilayer network to represent a ResNet and to identify the possibly redundant convolutional layers belonging to it. Once such layers are identified, it prunes them and some related ones, obtaining a new compressed ResNet. Experimental results demonstrate the suitability and effectiveness of the proposed approach. Highlights: representation of a ResNet model through a multilayer network; an approach to compress a ResNet through its multilayer network representation; a compression approach tested on ResNet20, ResNet56 and ResNet110 with CIFAR10 and CIFAR100. |
|---|---|
| DOI: | 10.1016/j.eswa.2022.119391 |
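The summary above only outlines the pipeline (represent the ResNet as a multilayer network, detect redundant convolutional layers, prune them). The sketch below is a rough illustration of that general idea, not the authors' actual algorithm: it maps the convolutional layers of a torchvision ResNet-18 (standing in for the ResNet20/56/110 models studied in the paper) onto a networkx graph whose `layer` attribute plays the role of the multilayer level, and flags layers with a low mean absolute weight as hypothetical pruning candidates. The scoring rule, the threshold, and the choice of ResNet-18 are assumptions made purely for the example.

```python
# Illustrative sketch only: build a multilayer-style graph over a ResNet's
# convolutional layers and flag candidate layers for pruning.
# The redundancy score (mean absolute filter weight) is a placeholder
# assumption, NOT the criterion used in the cited paper.
import networkx as nx
import torch
import torchvision.models as models

# torchvision's ResNet-18 stands in for the CIFAR ResNet20/56/110 variants.
model = models.resnet18(weights=None)

G = nx.DiGraph()
prev = None
for name, module in model.named_modules():
    if isinstance(module, torch.nn.Conv2d):
        stage = name.split(".")[0]  # e.g. 'conv1', 'layer1', 'layer2', ...
        score = module.weight.detach().abs().mean().item()
        # 'layer' attribute encodes the multilayer level of the node.
        G.add_node(name, layer=stage, score=score)
        if prev is not None:
            # Ordering edge between consecutive conv layers (illustrative only;
            # not the network's true computation graph).
            G.add_edge(prev, name)
        prev = name

# Flag the "least important" conv layers as pruning candidates (toy criterion).
threshold = 0.01
candidates = [n for n, d in G.nodes(data=True) if d["score"] < threshold]
print(f"{G.number_of_nodes()} conv layers, {len(candidates)} pruning candidates")
```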