Distributed Computation Offloading in Mobile Fog Computing: A Deep Neural Network Approach

Bibliographic Details
Published in: IEEE Communications Letters, Volume 26, Issue 3, pp. 696-700
Main Authors: Yang, Zhongjun; Bai, Wenle
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.03.2022
ISSN: 1089-7798, 1558-2558
Description
Summary: In this letter, the performance of computation offloading is studied, with the aim of minimizing the energy consumed in offloading under delay constraints. To solve the binary computation-offloading decision problem, a learning-based offloading algorithm built on distributed deep neural networks (DNNs) is proposed, which uses multiple parallel DNNs to generate offloading decisions. The DNNs are further improved by back-propagation with cross-entropy as the loss function, using the newly generated decisions as a shared training set. In addition, an innovative hierarchical offloading model is presented, from which closed-form expressions for the offloading delay and energy are derived. The Delay-Energy Weighted Sum (DEWS) metric is then defined and introduced as the system utility to formulate a gradient optimization problem. Extensive simulations indicate that the algorithm modified with the DEWS metric achieves a significant reduction in energy consumption with a comparably smaller total delay while maintaining high offloading accuracy.
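The summary outlines a decision loop: several parallel DNNs each propose a candidate binary offloading decision, the candidate with the best DEWS utility is kept, and all DNNs are retrained on the accumulated best decisions. The sketch below is only an illustration of that loop under assumptions; the network sizes, the toy delay/energy proxies inside `dews`, and the DEWS weights are placeholders and do not reproduce the closed-form expressions or parameters from the paper.

```python
# Minimal sketch (not the authors' exact model): K parallel DNNs each map task
# features to a candidate binary offloading decision; the candidate with the
# lowest DEWS (weighted sum of delay and energy, both toy proxies here) is kept
# and added to a shared training set used to retrain all DNNs with a
# binary cross-entropy loss via back-propagation.
import torch
import torch.nn as nn

K, IN_DIM, N_TASKS = 3, 8, 4          # parallel DNNs, feature size, tasks (assumed values)
W_DELAY, W_ENERGY = 0.5, 0.5          # DEWS weights (illustrative, not from the paper)

dnns = [nn.Sequential(nn.Linear(IN_DIM, 32), nn.ReLU(),
                      nn.Linear(32, N_TASKS), nn.Sigmoid()) for _ in range(K)]
optims = [torch.optim.Adam(m.parameters(), lr=1e-3) for m in dnns]
loss_fn = nn.BCELoss()
memory = []                           # shared training set of (features, best decision)

def dews(decision, feats):
    """Placeholder Delay-Energy Weighted Sum; the letter's closed-form
    delay/energy expressions are not reproduced here."""
    delay = (decision * feats[:N_TASKS]).sum()         # toy delay proxy
    energy = ((1 - decision) * feats[:N_TASKS]).sum()  # toy energy proxy
    return W_DELAY * delay + W_ENERGY * energy

def offload_step(feats):
    # 1. Each DNN proposes a candidate binary decision (quantized output).
    candidates = [(m(feats) > 0.5).float() for m in dnns]
    # 2. Keep the candidate with the smallest DEWS as the system decision.
    best = min(candidates, key=lambda d: dews(d, feats))
    memory.append((feats, best))
    # 3. Retrain every DNN on the shared memory with cross-entropy loss.
    x = torch.stack([f for f, _ in memory])
    y = torch.stack([d for _, d in memory])
    for m, opt in zip(dnns, optims):
        opt.zero_grad()
        loss_fn(m(x), y).backward()
        opt.step()
    return best

print(offload_step(torch.rand(IN_DIM)))
```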
DOI: 10.1109/LCOMM.2021.3138800