A privacy protection approach in edge-computing based on maximized DNN partition strategy with energy saving

Bibliographic Details
Title: A privacy protection approach in edge-computing based on maximized DNN partition strategy with energy saving
Authors: Guo Chaopeng, Lin Zhengqing, Song Jie
Source: Journal of Cloud Computing: Advances, Systems and Applications, Vol 12, Iss 1, Pp 1-16 (2023)
Publisher Information: SpringerOpen, 2023.
Publication Year: 2023
Collection: LCC: Computer engineering. Computer hardware; LCC: Electronic computers. Computer science
Subject Terms: Privacy protection, Edge-intelligent, DNN partition, Mixed-precision quantization, Energy optimization, Computer engineering. Computer hardware, TK7885-7895, Electronic computers. Computer science, QA75.5-76.95
Description: Abstract With the development of deep neural network (DNN) techniques, applications of DNNs show state-of-the-art performance. In the cloud-edge collaborative mode, edge devices upload raw data, such as texts, images, and videos, to the cloud for processing. Then, the cloud returns prediction or classification results. Although edge devices take advantage of the powerful performance of DNNs, there are also colossal privacy protection risks. A DNN partition strategy can effectively solve the privacy problems by offloading part of the DNN model to the edge, so that encoded features are transmitted rather than the original data. We explore the relationship between privacy and the intermediate results of the DNN. The more layers offloaded to the edge, the more abstract the transmitted features become, which is more conducive to privacy protection. We propose a privacy protection approach based on a maximum DNN partition strategy. Besides, a mixed-precision quantization approach is adopted to reduce the energy use of edge devices. The experiments show that our method increases model privacy by up to 20% across various DNN architectures. Through the energy-aware mixed-precision quantization approach, the model's energy consumption is reduced by up to 5x compared to the typical edge-cloud solution.
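The partition idea the abstract describes can be sketched minimally: the edge device runs the first k layers of the model, quantizes the intermediate features to low precision, and transmits those instead of the raw input; the cloud dequantizes and runs the remaining layers. The toy network, the uniform 8-bit quantizer, and all names below are illustrative assumptions, not the authors' implementation (which uses a maximized partition point and energy-aware mixed-precision bit allocation).

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class TinyDNN:
    """Toy fully-connected DNN used to illustrate layer-wise partitioning."""
    def __init__(self, sizes, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = [rng.standard_normal((a, b)) * 0.1
                        for a, b in zip(sizes[:-1], sizes[1:])]

    def forward_range(self, x, start, end):
        # Run layers [start, end) and return the intermediate features.
        for w in self.weights[start:end]:
            x = relu(x @ w)
        return x

def quantize(x, bits):
    """Uniform affine quantization of features to `bits` bits (illustrative)."""
    levels = 2 ** bits - 1
    lo, hi = float(x.min()), float(x.max())
    scale = (hi - lo) / levels if hi > lo else 1.0
    q = np.round((x - lo) / scale).astype(np.uint8)
    return q, lo, scale

def dequantize(q, lo, scale):
    return q.astype(np.float64) * scale + lo

# Partition at layer k: the edge runs layers 0..k and sends only
# quantized abstract features, never the raw input, to the cloud.
model = TinyDNN([8, 16, 16, 4])
x = np.random.default_rng(1).standard_normal((1, 8))   # raw data stays on-device
k = 2
edge_feat = model.forward_range(x, 0, k)               # edge-side computation
q, lo, scale = quantize(edge_feat, bits=8)             # low-precision payload
cloud_in = dequantize(q, lo, scale)                    # cloud reconstructs features
out = model.forward_range(cloud_in, k, len(model.weights))
```

Pushing k higher (a "maximized" partition) makes the transmitted features more abstract and harder to invert, at the cost of more edge computation; lowering the bit width of the quantizer is what saves transmission and computation energy.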
Document Type: article
File Description: electronic resource
Language: English
ISSN: 2192-113X
Relation: https://doaj.org/toc/2192-113X
DOI: 10.1186/s13677-023-00404-y
Access URL: https://doaj.org/article/bf12e847683443868dbb3b2e28ec0e3b
Accession Number: edsdoj.bf12e847683443868dbb3b2e28ec0e3b
Database: Directory of Open Access Journals