Cache Placement in Fog-RANs: From Centralized to Distributed Algorithms

Full Description

Bibliographic Details
Published in: IEEE Transactions on Wireless Communications, Vol. 16, No. 11, pp. 7039-7051
Main Authors: Liu, Juan; Bai, Bo; Zhang, Jun; Letaief, Khaled B.
Format: Journal Article
Language: English
Published: IEEE, 01.11.2017
ISSN:1536-1276
Description
Abstract: To deal with the rapid growth of high-speed and/or ultra-low-latency data traffic from massive numbers of mobile users, fog radio access networks (Fog-RANs) have emerged as a promising architecture for next-generation wireless networks. In Fog-RANs, edge nodes and user terminals possess storage, computation, and communication capabilities to varying degrees, which provides high flexibility in network operation, ranging from fully centralized to fully distributed. In this paper, we study the cache placement problem in Fog-RANs, taking into account flexible physical-layer transmission schemes and the diverse content preferences of different users. We develop both centralized and distributed transmission-aware cache placement strategies to minimize the users' average download delay subject to storage capacity constraints. In the centralized mode, the cache placement problem is transformed into a matroid-constrained submodular maximization problem, and an approximation algorithm is proposed that finds a solution within a constant factor of the optimum. In the distributed mode, a belief-propagation-based distributed algorithm is proposed to provide a suboptimal solution, with iterative updates at each base station (BS) based on locally collected information. Simulation results show that, by exploiting caching and cooperation gains, the proposed transmission-aware caching algorithms can greatly reduce the users' average download delay.
DOI:10.1109/TWC.2017.2737015
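
The centralized strategy described in the abstract casts cache placement as matroid-constrained submodular maximization and solves it with an approximation algorithm. Purely as an illustration of that class of methods, and not the authors' exact formulation, the Python sketch below applies the standard greedy rule under per-base-station storage budgets (a partition matroid). The function name greedy_cache_placement and the delay_reduction callback are hypothetical, and the objective is assumed to be monotone submodular so that greedy selection stays within a constant factor of the optimum.

    from itertools import product

    def greedy_cache_placement(num_bs, num_files, capacity, delay_reduction):
        # Greedy file placement into BS caches under per-BS storage budgets
        # (a partition matroid constraint). delay_reduction(placement, candidate)
        # is an assumed callback returning the marginal decrease in average
        # download delay from adding candidate = (bs, file) to the current set.
        placement = set()              # chosen (bs, file) pairs
        load = [0] * num_bs            # number of files cached at each BS so far
        candidates = set(product(range(num_bs), range(num_files)))
        while True:
            best, best_gain = None, 0.0
            for bs, f in candidates:
                if load[bs] >= capacity[bs]:      # skip BSs whose cache is full
                    continue
                gain = delay_reduction(placement, (bs, f))
                if gain > best_gain:
                    best, best_gain = (bs, f), gain
            if best is None:                      # no remaining candidate helps
                break
            placement.add(best)
            load[best[0]] += 1
            candidates.discard(best)
        return placement

The distributed mode described in the abstract instead relies on belief-propagation-style message passing, with each base station iteratively updating its placement from locally collected information; that scheme is not reproduced here.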