Online sequential echo state network with sparse RLS algorithm for time series prediction
| Published in: | Neural Networks Vol. 118; pp. 32-42 |
|---|---|
| Main Authors: | , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | United States: Elsevier Ltd, 01.10.2019 |
| Subjects: | |
| ISSN: | 0893-6080, 1879-2782 |
| Summary: | Recently, echo state networks (ESNs) have been widely used for time series prediction. To meet the demands of practical applications and to avoid overfitting, the online sequential ESN with sparse recursive least squares (OSESN-SRLS) algorithm is proposed. First, ℓ0 and ℓ1 norm sparsity penalties on the output weights are separately employed to control the network size. Second, the sparse recursive least squares (SRLS) algorithm and the subgradient technique are combined to estimate the output weight matrix. Third, an adaptive selection mechanism for the ℓ0 or ℓ1 norm regularization parameter is designed. With the selected regularization parameter, the developed SRLS is proved to perform comparably to or better than regular RLS. Furthermore, the convergence of OSESN-SRLS is analyzed theoretically to guarantee its effectiveness. Simulation results show that the proposed OSESN-SRLS consistently outperforms existing ESNs in estimation accuracy and network compactness. |
|---|---|
| Highlights: | • The online sequential ESN with a sparse RLS algorithm is studied to improve estimation accuracy and network compactness. • The network size is controlled by ℓ0 and ℓ1 norm sparsity penalties. • Estimation performance is improved by a regularization-parameter selection rule. • The algorithm's convergence is analyzed to guarantee its effectiveness. |
| DOI: | 10.1016/j.neunet.2019.05.006 |
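The summary describes combining recursive least squares with a subgradient sparsity penalty on the ESN output weights. As a rough illustration of that idea only (not the paper's exact OSESN-SRLS formulation), the sketch below runs RLS with an ℓ1 subgradient correction on a toy echo state reservoir predicting a sine series one step ahead; the reservoir setup, variable names, forgetting factor, and penalty weight are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny echo state reservoir driven by a sine series.
n_res = 30
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state property)

T = 300
u = np.sin(0.2 * np.arange(T + 1))  # input series; target is the next value
x = np.zeros(n_res)

# RLS state: output weight vector w, inverse correlation matrix P.
w = np.zeros(n_res)
P = 100.0 * np.eye(n_res)
lam = 0.999    # forgetting factor (assumed)
gamma = 1e-5   # l1 penalty weight (assumed small)

errs = []
for t in range(T):
    x = np.tanh(W_in * u[t] + W @ x)      # reservoir state update
    d = u[t + 1]                          # one-step-ahead target
    e = d - w @ x                         # a priori prediction error
    k = P @ x / (lam + x @ P @ x)         # gain vector
    # Standard RLS step plus an l1 subgradient correction that shrinks
    # small output weights toward zero (sparsity on the readout).
    w = w + k * e - gamma * (P @ np.sign(w))
    P = (P - np.outer(k, x @ P)) / lam
    errs.append(e ** 2)

print("mean squared a priori error (last 100 steps):", np.mean(errs[-100:]))
```

With the penalty weight set to zero this reduces to ordinary RLS; the subgradient term is what biases small readout weights toward zero, which is the mechanism the abstract's ℓ1 variant relies on.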