A new diffusion sparse RLS algorithm with improved convergence characteristics


Bibliographic Details
Published in: IEEE International Conference on Circuits and Systems (Online), pp. 2651–2654
Main Authors: Das, Bijit Kumar, Chakraborty, Mrityunjoy
Format: Conference Proceeding; Journal Article
Language: English
Published: IEEE, 01.05.2016
ISSN: 2379-447X
Description
Summary: A new sparsity-aware recursive least squares (RLS) algorithm is proposed for distributed learning in a diffusion network. The algorithm deploys an RLS-based adaptive filter at each node, made sparsity aware by regularizing the conventional RLS cost function with a sparsity-promoting penalty. The regularization introduces certain "zero-attracting" terms into the RLS update equation that help shrink the coefficients. Each node shares its tap-weight information with every other node in its neighborhood and refines its own estimate by linearly combining the incoming tap-weight estimates from neighboring nodes using a set of pre-defined weights. Results on both first- and second-order convergence of the algorithm are also provided. Simulations show that the proposed scheme outperforms existing algorithms in both convergence speed and steady-state excess mean-square error.
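The summary above outlines three ingredients: a per-node RLS recursion, a "zero-attracting" shrinkage term from the sparsity penalty, and a diffusion step that linearly combines neighboring estimates. A minimal sketch of that structure is given below; it is illustrative only, not the paper's exact algorithm. The ring topology, the ℓ1 (sign-based) attractor, and all parameter values (`lam`, `rho`, network size, noise level) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 16                              # filter length
w_true = np.zeros(M)                # sparse unknown system to identify
w_true[[2, 7, 11]] = [0.8, -0.5, 0.3]

N = 4                               # nodes on a ring: each node averages
C = np.zeros((N, N))                # itself and its two neighbours, so
for k in range(N):                  # every row of C sums to 1
    for l in (k - 1, k, k + 1):
        C[k, l % N] = 1.0 / 3.0

lam = 0.995                         # RLS forgetting factor (assumed)
rho = 1e-5                          # zero-attraction strength (assumed)

w = np.zeros((N, M))                # per-node weight estimates
P = np.stack([np.eye(M) * 100.0 for _ in range(N)])  # inverse correlations

for _ in range(400):
    psi = np.empty((N, M))
    for k in range(N):
        x = rng.standard_normal(M)                  # node-local regressor
        d = w_true @ x + 0.01 * rng.standard_normal()
        pi = P[k] @ x
        g = pi / (lam + x @ pi)                     # RLS gain vector
        e = d - w[k] @ x                            # a priori error
        # conventional RLS step plus a zero-attracting term that shrinks
        # coefficients toward zero (sign() is the l1 subgradient)
        psi[k] = w[k] + g * e - rho * np.sign(w[k])
        P[k] = (P[k] - np.outer(g, pi)) / lam       # inverse-correlation update
    w = C @ psi                                     # diffusion: combine neighbours

msd = float(np.mean(np.sum((w - w_true) ** 2, axis=1)))
print(f"mean-square deviation after 400 iterations: {msd:.2e}")
```

The diffusion step `w = C @ psi` is what distinguishes the scheme from running independent sparse RLS filters: each node's final estimate is a convex combination of its neighborhood's intermediate estimates, which averages out node-local noise.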
DOI: 10.1109/ISCAS.2016.7539138