On Model Coding for Distributed Inference and Transmission in Mobile Edge Computing Systems

Bibliographic Details
Published in: IEEE Communications Letters, Vol. 23, No. 6, pp. 1065-1068
Main Authors: Zhang, Jingjing; Simeone, Osvaldo
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.06.2019
ISSN: 1089-7798, 1558-2558
Description
Summary: Consider a mobile edge computing system in which users wish to obtain the result of a linear inference operation on locally measured input data. Unlike the offloaded input data, the model weight matrix is distributed across the wireless Edge Nodes (ENs). The ENs have non-deterministic computing times, and they can cooperatively transmit any shared computed output back to the users. This letter investigates the potential advantages of coding the model information prior to storage at the ENs. Through an information-theoretic analysis, it is concluded that, while generally limiting cooperation opportunities, coding is instrumental in reducing the overall computation-plus-communication latency.
DOI: 10.1109/LCOMM.2019.2911496
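
The summary above centers on coding the model weight matrix across edge nodes whose computing times are non-deterministic (i.e., some nodes may straggle). As an illustration of the general idea, and not of the letter's specific scheme, the following Python sketch shows MDS-style coding of a weight matrix for distributed linear inference, where the result y = W x can be recovered from the outputs of any k of the n edge nodes. The matrix sizes, the random real-valued encoding matrix, and the simple straggler model are all assumptions made for this example.

import numpy as np

rng = np.random.default_rng(0)

m, d = 8, 5        # dimensions of the model weight matrix W (assumed for the example)
k, n = 4, 6        # W is split into k row blocks and encoded into n coded blocks

W = rng.standard_normal((m, d))   # model weights held by the network operator
x = rng.standard_normal(d)        # input data measured locally by a user

# Split W into k row blocks and mix them with a random real-valued encoding
# matrix G; every k x k submatrix of G is invertible with probability one,
# which gives the MDS-like "any k out of n" recovery property.
blocks = np.stack(np.split(W, k, axis=0))        # shape (k, m // k, d)
G = rng.standard_normal((n, k))                  # encoding matrix (assumption)
coded = np.einsum('ik,kbd->ibd', G, blocks)      # coded block stored at EN i

# Each EN multiplies its stored coded block by the offloaded input x;
# stragglers are modeled simply as ENs whose output never arrives.
partials = coded @ x                             # shape (n, m // k)
fast_ens = rng.choice(n, size=k, replace=False)  # any k responders suffice

# Decoding: solve the k x k linear system defined by the responding ENs.
decoded = np.linalg.solve(G[fast_ens], partials[fast_ens])
y_hat = decoded.reshape(m)

assert np.allclose(y_hat, W @ x)                 # matches uncoded inference y = W x

This sketch only captures the straggler-mitigation side of the trade-off; the letter's observation that coding generally limits opportunities for cooperative transmission of shared outputs back to the users is not modeled here.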