Search results - (( kernel recursive least squares algorithm ) OR ( kernel recursive past squares algorithm ))
-
1
Authors:
Source: IEEE Transactions on Signal Processing; Aug2004, Vol. 52 Issue 8, p2275-2285, 11p
Keywords: ALGORITHMS, ALGEBRA, ONLINE data processing, KERNEL functions, COMPLEX variables, LEAST squares
-
2
Authors: et al.
Source: IEEE Transactions on Signal Processing; Oct2009, Vol. 57 Issue 10, p3801-3814, 14p, 4 Black and White Photographs, 2 Charts, 9 Graphs
Keywords: KERNEL functions, HILBERT space, KALMAN filtering, ESTIMATION theory, RECURSIVE functions
-
3
Authors:
Source: Nonlinear Dynamics; May2021, Vol. 104 Issue 3, p2515-2530, 16p
-
4
Authors:
Source: Kernel Adaptive Filtering: A Comprehensive Introduction; 2010, p94-123, 30p
-
5
Authors:
Source: Kernel Adaptive Filtering: A Comprehensive Introduction; 2010, p124-151, 28p
-
6
Authors: et al.
Source: Journal of Animal Science. Nov2020, Vol. 98 Issue 11, p1-10. 10p. 3 Charts, 2 Graphs.
PDF Full Text -
7
Authors: et al.
Source: Artificial Neural Networks - ICANN 2006 (9783540388715); 2006, p381-390, 10p
-
8
Authors: et al.
Additional authors: et al.
Keywords: 620 - Engineering and allied operations, Machine learning, Forecasts, Kernel adaptive filtering, Dictionary, Learning rate, Kernel bandwidth, Adaptive clustering
File description: application/pdf
Availability: https://repositorio.unal.edu.co/handle/unal/75985
-
9
Authors: et al.
Source: Journal of Animal Breeding & Genetics. Sep2020, Vol. 137 Issue 5, p423-437. 15p.
Keywords: Forecasting, Path analysis (Statistics), Genealogy, Information resources, Kinship, Heritability
HTML Full Text PDF Full Text -
10
Authors: Vaerenbergh, Steven Van
Thesis Advisors: Santamaría Caballero, Ignacio, Universidad de Cantabria. Departamento de Ingeniería de Comunicaciones
Source: TDR (Tesis Doctorales en Red)
Keywords: spectral clustering, multiple-input multiple-output systems (MIMO), blind equalization of nonlinear systems, identification of nonlinear systems, signal processing, kernel methods, machine learning, kernel adaptive filtering, postnonlinear blind source separation (BSS), adaptive kernel canonical correlation analysis, Signal Theory and Communications
Classification: 621.3
File description: application/pdf
-
11
Authors: et al.
Source: Entropy; Jun2019, Vol. 21 Issue 6, p588, 1p
-
12
Authors:
Resource Type: eBook.
Keywords: Kernel functions, Adaptive filters
Categories: SCIENCE / Waves & Wave Mechanics
-
13
Authors: et al.
Relation: Norges forskningsråd: 274717
Availability: https://hdl.handle.net/11250/3047904
-
14
Authors:
Source: 2021 IEEE 93rd Vehicular Technology Conference (VTC2021-Spring), pp. 1-5
-
15
Authors:
Source: Nonlinear Dynamics. 104:2515-2530
Keywords: 0202 electrical engineering, electronic engineering, information engineering, 02 engineering and technology
Access URL: https://experts.illinois.edu/en/publications/deep-kernel-recursive-least-squares-algorithm
https://link.springer.com/article/10.1007/s11071-021-06416-0 -
16
Authors: et al.
Source: IEEE Signal Processing Letters. 27:361-365
Keywords: 0202 electrical engineering, electronic engineering, information engineering, 02 engineering and technology
-
17
Authors:
Source: IEEE Signal Processing Letters. 27:1365-1369
Keywords: 0202 electrical engineering, electronic engineering, information engineering, 02 engineering and technology
-
18
Authors:
Source: Turkish Journal of Veterinary & Animal Sciences. 2019, Vol. 43 Issue 4, p500-506. 7p.
Keywords: Body weight, Machine learning, Sheep, Standard deviations, Support vector machines, Sheep breeds, Body-weight-supported treadmill training
Geographic categories: Pakistan
PDF Full Text -
19
Authors:
Source: IEEE Access, Vol 6, Pp 74687-74698 (2018)
Keywords: Health diagnosis, kernel adaptive filtering, pruning method, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
File description: electronic resource
-
20
Authors: et al.
Source: IEEE Transactions on Cybernetics. 49:1160-1172
Keywords: 13. Climate action, 0202 electrical engineering, electronic engineering, information engineering, 02 engineering and technology
Access URL: https://pubmed.ncbi.nlm.nih.gov/29994647
https://www.ncbi.nlm.nih.gov/pubmed/29994647
https://europepmc.org/article/MED/29994647
https://jglobal.jst.go.jp/en/detail?JGLOBAL_ID=201902234411479087
https://doi.org/10.1109/TCYB.2018.2789686
https://ieeexplore.ieee.org/document/8291833
http://ieeexplore.ieee.org/document/8291833/