Sketch Kernel Ridge Regression Using Circulant Matrix: Algorithm and Theory

Detailed Bibliography
Published in: IEEE Transactions on Neural Networks and Learning Systems, Volume 31, Issue 9, pp. 3512-3524
Main Authors: Yin, Rong; Liu, Yong; Wang, Weiping; Meng, Dan
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.09.2020
Subjects:
ISSN: 2162-237X; EISSN: 2162-2388
Online Access: Get full text
Abstract Kernel ridge regression (KRR) is a powerful method for nonparametric regression. The time and space complexity of computing the KRR estimate directly are $\mathcal{O}(n^{3})$ and $\mathcal{O}(n^{2})$, respectively, which are prohibitive for large-scale data sets, where $n$ is the number of data. In this article, we propose a novel random sketch technique based on the circulant matrix that achieves savings in storage space and accelerates the solution of the KRR approximation. The circulant matrix has the following advantages: it can save time complexity by using the fast Fourier transform (FFT) to compute the product of a matrix and a vector, its space complexity is linear, and the circulant matrix, whose entries in the first column are independent of each other and obey the Gaussian distribution, is almost as effective as the i.i.d. Gaussian random matrix for approximating KRR. Combining the characteristics of the circulant matrix and our careful design, theoretical analysis and experimental results demonstrate that our proposed sketch method, making kernel methods scalable and practical for large-scale data problems, outperforms the state-of-the-art KRR estimates in time complexity while retaining similar accuracies. Meanwhile, our sketch method provides a theoretical bound that keeps the optimal convergence rate for approximating KRR.
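The abstract describes two computational ingredients: applying a circulant matrix to a vector in O(n log n) time via the FFT, and using a partial circulant matrix with an i.i.d. Gaussian first column as the random sketch for KRR. The following NumPy code is a minimal sketch of those two ideas in a generic sketch-and-solve form; the function names (circulant_apply, sketched_krr), the 1/sqrt(m) scaling of the Gaussian column, and the m-by-m system being solved are illustrative assumptions, not necessarily the exact estimator or constants analyzed in the article.

import numpy as np

def circulant_apply(c, X, m):
    # Apply the first m rows of the n-by-n circulant matrix whose first
    # column is c to every column of X, via the FFT identity
    # C @ x = ifft(fft(c) * fft(x)).  Only c (length n) is stored, so the
    # sketch costs O(n) memory and O(n log n) time per column.
    X = np.asarray(X, dtype=float).reshape(len(c), -1)
    Fc = np.fft.fft(c)
    CX = np.real(np.fft.ifft(Fc[:, None] * np.fft.fft(X, axis=0), axis=0))
    return CX[:m]

def sketched_krr(K, y, lam, m, seed=0):
    # Generic sketch-and-solve KRR with a partial circulant sketch S
    # (first m rows of a circulant matrix with Gaussian first column):
    # restrict the dual variable to alpha = S.T @ beta and solve an
    # m-by-m system instead of the full n-by-n one.
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    c = rng.standard_normal(n) / np.sqrt(m)   # Gaussian first column
    SK = circulant_apply(c, K, m)             # S @ K        (m x n)
    SKS = circulant_apply(c, SK.T, m)         # S @ K @ S.T  (m x m)
    # Normal equations of  min_beta ||y - K S^T beta||^2 / n
    #                               + lam * beta^T S K S^T beta :
    # (S K K S^T + lam * n * S K S^T) beta = S K y
    A = SK @ SK.T + lam * n * SKS
    beta = np.linalg.solve(A, SK @ y)
    # Recover alpha = S.T @ beta: S.T is the first m columns of C.T, and
    # C.T is itself circulant with first column roll(c[::-1], 1).
    beta_pad = np.zeros(n)
    beta_pad[:m] = beta
    alpha = circulant_apply(np.roll(c[::-1], 1), beta_pad, n).ravel()
    return alpha                              # predictor: k(x, X) @ alpha

# Toy usage with a Gaussian RBF kernel (illustrative only):
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(2000, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(2000)
K = np.exp(-(X - X.T) ** 2 / (2 * 0.3 ** 2))
alpha = sketched_krr(K, y, lam=1e-3, m=200)
print("train MSE:", np.mean((K @ alpha - y) ** 2))

Note that forming and storing the dense kernel matrix K still takes O(n^2) memory in this toy example; the point illustrated is only that the sketch S is never materialized and is applied through the FFT.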
Author Wang, Weiping
Yin, Rong
Meng, Dan
Liu, Yong
Author_xml – sequence: 1
  givenname: Rong
  orcidid: 0000-0003-1894-7561
  surname: Yin
  fullname: Yin, Rong
  email: yinrong@iie.ac.cn
  organization: Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China
– sequence: 2
  givenname: Yong
  orcidid: 0000-0002-6739-621X
  surname: Liu
  fullname: Liu, Yong
  email: liuyong@iie.ac.cn
  organization: Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China
– sequence: 3
  givenname: Weiping
  surname: Wang
  fullname: Wang, Weiping
  organization: Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China
– sequence: 4
  givenname: Dan
  surname: Meng
  fullname: Meng, Dan
  organization: Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China
CODEN ITNNAL
CitedBy_id crossref_primary_10_1109_TCYB_2020_2987810
crossref_primary_10_1007_s10489_023_04606_4
crossref_primary_10_1016_j_patcog_2023_109722
crossref_primary_10_3390_app131810317
crossref_primary_10_3390_eng5010021
crossref_primary_10_1016_j_neucom_2024_128640
crossref_primary_10_1109_TNNLS_2019_2958922
crossref_primary_10_1109_TKDE_2022_3222146
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
DBID 97E
RIA
RIE
AAYXX
CITATION
7QF
7QO
7QP
7QQ
7QR
7SC
7SE
7SP
7SR
7TA
7TB
7TK
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
P64
7X8
DOI 10.1109/TNNLS.2019.2944959
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Aluminium Industry Abstracts
Biotechnology Research Abstracts
Calcium & Calcified Tissue Abstracts
Ceramic Abstracts
Chemoreception Abstracts
Computer and Information Systems Abstracts
Corrosion Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
Materials Business File
Mechanical & Transportation Engineering Abstracts
Neurosciences Abstracts
Solid State and Superconductivity Abstracts
METADEX
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
Materials Research Database
ProQuest Computer Science Collection
Civil Engineering Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
DatabaseTitle CrossRef
Materials Research Database
Technology Research Database
Computer and Information Systems Abstracts – Academic
Mechanical & Transportation Engineering Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Materials Business File
Aerospace Database
Engineered Materials Abstracts
Biotechnology Research Abstracts
Chemoreception Abstracts
Advanced Technologies Database with Aerospace
ANTE: Abstracts in New Technology & Engineering
Civil Engineering Abstracts
Aluminium Industry Abstracts
Electronics & Communications Abstracts
Ceramic Abstracts
Neurosciences Abstracts
METADEX
Biotechnology and BioEngineering Abstracts
Computer and Information Systems Abstracts Professional
Solid State and Superconductivity Abstracts
Engineering Research Database
Calcium & Calcified Tissue Abstracts
Corrosion Abstracts
MEDLINE - Academic
DatabaseTitleList Materials Research Database
MEDLINE - Academic

Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
– sequence: 2
  dbid: 7X8
  name: MEDLINE - Academic
  url: https://search.proquest.com/medline
  sourceTypes: Aggregation Database
DeliveryMethod fulltext_linktorsrc
Discipline Computer Science
EISSN 2162-2388
EndPage 3524
ExternalDocumentID 10_1109_TNNLS_2019_2944959
8886715
Genre orig-research
GrantInformation_xml – fundername: Beijing Municipal Science and Technology Project
  grantid: Z191100007119002
  funderid: 10.13039/501100009592
– fundername: Excellent Talent Introduction of the Institute of Information Engineering of CAS
  grantid: Y7Z0111107
  funderid: 10.13039/501100002367
– fundername: CCF-Tencent Open Fund
– fundername: National Natural Science Foundation of China
  grantid: 61703396; 61673293
  funderid: 10.13039/501100001809
– fundername: Youth Innovation Promotion Association CAS
  funderid: 10.13039/501100002367
GroupedDBID 0R~
4.4
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABQJQ
ABVLG
ACIWK
ACPRK
AENEX
AFRAH
AGQYO
AGSQL
AHBIQ
AKJIK
AKQYR
ALMA_UNASSIGNED_HOLDINGS
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
EBS
EJD
IFIPE
IPLJI
JAVBF
M43
MS~
O9-
OCL
PQQKQ
RIA
RIE
RNS
AAYXX
CITATION
7QF
7QO
7QP
7QQ
7QR
7SC
7SE
7SP
7SR
7TA
7TB
7TK
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
P64
7X8
IEDL.DBID RIE
ISICitedReferencesCount 14
ISSN 2162-237X
2162-2388
IngestDate Sun Nov 09 13:06:46 EST 2025
Sun Nov 30 04:19:59 EST 2025
Tue Nov 18 22:33:42 EST 2025
Sat Nov 29 01:40:04 EST 2025
Wed Aug 27 02:32:11 EDT 2025
IsPeerReviewed false
IsScholarly true
Issue 9
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
content type line 23
ORCID 0000-0002-6739-621X
0000-0003-1894-7561
PQID 2439703586
PQPubID 85436
PageCount 13
ParticipantIDs proquest_miscellaneous_2311655363
crossref_citationtrail_10_1109_TNNLS_2019_2944959
proquest_journals_2439703586
crossref_primary_10_1109_TNNLS_2019_2944959
ieee_primary_8886715
PublicationCentury 2000
PublicationDate 2020-09-01
PublicationDateYYYYMMDD 2020-09-01
PublicationDate_xml – month: 09
  year: 2020
  text: 2020-09-01
  day: 01
PublicationDecade 2020
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE Transactions on Neural Networks and Learning Systems
PublicationTitleAbbrev TNNLS
PublicationYear 2020
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
SSID ssj0000605649
Score 2.404996
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 3512
SubjectTerms Acceleration
Algorithms
Approximation
Approximation algorithms
Circulant matrix
Complexity
Convergence
Fast Fourier transformations
Fourier analysis
Fourier transforms
Kernel
kernel ridge regression (KRR)
Kernels
large scale
Mathematical analysis
Matrix algebra
Matrix methods
Normal distribution
random sketch
Regression analysis
Statistical analysis
Theoretical analysis
Time complexity
Training
Title Sketch Kernel Ridge Regression Using Circulant Matrix: Algorithm and Theory
URI https://ieeexplore.ieee.org/document/8886715
https://www.proquest.com/docview/2439703586
https://www.proquest.com/docview/2311655363
Volume 31
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVIEE
  databaseName: IEEE Electronic Library (IEL)
  customDbUrl:
  eissn: 2162-2388
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0000605649
  issn: 2162-237X
  databaseCode: RIE
  dateStart: 20120101
  isFulltext: true
  titleUrlDefault: https://ieeexplore.ieee.org/
  providerName: IEEE
linkProvider IEEE