DistrEdge: Speeding up Convolutional Neural Network Inference on Distributed Edge Devices

Detailed bibliography
Published in: Proceedings - IEEE International Parallel and Distributed Processing Symposium, pp. 1097–1107
Main authors: Hou, Xueyu, Guan, Yongjie, Han, Tao, Zhang, Ning
Format: Conference paper
Language: English
Published: IEEE, 01.05.2022
ISSN: 1530-2075
Description
Summary: As the number of edge devices with computing resources (e.g., embedded GPUs, mobile phones, and laptops) increases, recent studies demonstrate that it can be beneficial to collaboratively run convolutional neural network (CNN) inference on more than one edge device. However, these studies make strong assumptions on the devices' conditions, and their application is far from practical. In this work, we propose a general method, called DistrEdge, to provide CNN inference distribution strategies in environments with multiple IoT edge devices. By addressing heterogeneity in devices, network conditions, and the nonlinear characteristics of CNN computation, DistrEdge adapts to a wide range of cases (e.g., different network conditions, various device types) using deep reinforcement learning. We utilize the latest embedded AI computing devices (e.g., NVIDIA Jetson products) to construct heterogeneous device setups in our experiments. Based on our evaluations, DistrEdge properly adjusts the distribution strategy according to the devices' computing characteristics and the network conditions, achieving a 1.1× to 3× speedup over state-of-the-art methods.
DOI: 10.1109/IPDPS53621.2022.00110
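
The abstract only outlines the approach at a high level. As a concrete illustration of the core technique such distributed-inference systems build on, the sketch below spatially partitions one convolution layer's input across two simulated devices and merges the partial outputs. This is a minimal, hypothetical Python/PyTorch example under assumptions of the editor's choosing (a 3x3 convolution, two equal horizontal slices, a one-row halo); it is not the paper's implementation, and DistrEdge's contribution lies in choosing such partitions adaptively with deep reinforcement learning.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Layer to distribute and a full input image.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
x = torch.randn(1, 3, 224, 224)

# Reference output computed monolithically on one device.
full = conv(x)

# Split the input into two horizontal slices. Each slice carries a
# "halo" of kernel_size // 2 extra rows so border outputs match the
# monolithic computation.
halo = conv.kernel_size[0] // 2       # = 1 for a 3x3 kernel
top = x[:, :, :112 + halo, :]         # rows 0..112  -> simulated device 0
bottom = x[:, :, 112 - halo:, :]      # rows 111..223 -> simulated device 1

# Each simulated device applies the same weights to its slice; rows
# corrupted by per-slice zero padding at the cut are discarded on merge.
out_top = conv(top)[:, :, :112, :]
out_bottom = conv(bottom)[:, :, halo:, :]

merged = torch.cat([out_top, out_bottom], dim=2)
print(torch.allclose(full, merged, atol=1e-6))  # True: partitions agree

In a real deployment the slices would be unequal, weighted by each device's measured compute speed and link bandwidth; per the abstract, that search over heterogeneous devices and network conditions is what DistrEdge delegates to a deep reinforcement learning agent.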