Automated Model Inference for Gaussian Processes: An Overview of State-of-the-Art Methods and Algorithms


Published in: SN Computer Science, Volume 3, Issue 4, Article 300
Main authors: Berns, Fabian; Hüwel, Jan; Beecks, Christian
Medium: Journal Article
Language: English
Publication details: Singapore: Springer Nature Singapore, 01.07.2022
Publisher: Springer Nature B.V.
ISSN: 2662-995X, 2661-8907
Abstract: Gaussian process models (GPMs) are widely regarded as a prominent tool for learning statistical data models that enable interpolation, regression, and classification. These models are typically instantiated by a Gaussian process with a zero-mean function and a radial basis covariance function. While these default instantiations yield acceptable analytical quality in terms of model accuracy, GPM inference algorithms automatically search for an application-specific model fitting a particular dataset. State-of-the-art methods for automated inference of GPMs search the space of possible models in a rather intricate way and thus incur super-quadratic computation time complexity for model selection and evaluation. Since these properties only enable processing small datasets with low statistical versatility, various methods and algorithms using global as well as local approximations have been proposed for efficient inference of large-scale GPMs. While the latter rely on representing data via local sub-models, global approaches capture the data's inherent characteristics by means of an educated sample. In this paper, we investigate the current state of the art in automated model inference for Gaussian processes and outline the strengths and shortcomings of the respective approaches. A performance analysis backs our theoretical findings and provides further empirical evidence. It indicates that approximated inference algorithms, especially locally approximating ones, deliver superior runtime performance while maintaining the quality level of those using non-approximative Gaussian processes.
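The default instantiation named in the abstract (zero-mean function plus radial basis covariance) and the cubic-in-n exact inference cost that motivates the surveyed approximations can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the paper; the kernel hyperparameters are fixed rather than optimized.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    # Radial basis (squared-exponential) covariance function.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    # Zero-mean GP posterior; the Cholesky factorization costs O(n^3),
    # which is why exact inference does not scale to large datasets.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    K_s = rbf_kernel(x_test, x_train)
    mean = K_s @ alpha
    v = np.linalg.solve(L, K_s.T)
    var = np.diag(rbf_kernel(x_test, x_test)) - np.sum(v ** 2, axis=0)
    return mean, var

x = np.linspace(0.0, 2.0 * np.pi, 20)
y = np.sin(x)
mean, var = gp_posterior(x, y, np.array([np.pi / 2]))
```

Near dense training data the posterior mean closely tracks the underlying function and the posterior variance shrinks toward the noise level.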
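The global-versus-local distinction drawn in the abstract can be illustrated with a toy sketch. Here the "educated sample" of a global approximation is stood in for by a uniform subset of the data, and the local sub-models by a small GP over the query point's nearest neighbours; the surveyed algorithms (e.g. inducing points, patchwork kriging) are considerably more elaborate.

```python
import numpy as np

def rbf(a, b, lengthscale=0.5):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / lengthscale) ** 2)

def gp_mean(x_train, y_train, x_test, noise=1e-2):
    # Zero-mean GP posterior mean with a fixed RBF kernel.
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 500))
y = np.sin(x)
x_test = np.array([5.0])

# "Global" flavour: one GP over a representative sample of all the data.
sample = np.linspace(0, len(x) - 1, 50).astype(int)
global_pred = gp_mean(x[sample], y[sample], x_test)

# "Local" flavour: a small GP over only the points nearest the query.
nearest = np.argsort(np.abs(x - x_test[0]))[:50]
local_pred = gp_mean(x[nearest], y[nearest], x_test)
```

Both variants replace one n × n solve with a 50 × 50 one; the local variant additionally concentrates its budget around the query, which is the effect the paper's runtime comparison highlights.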
ArticleNumber 300
Authors
– Fabian Berns (University of Hagen), fabian.berns@fernuni-hagen.de
– Jan Hüwel (University of Hagen)
– Christian Beecks (University of Hagen; Fraunhofer Institute for Applied Information Technology FIT)
ContentType Journal Article
Copyright The Author(s) 2022. This work is published under the Creative Commons Attribution 4.0 License (http://creativecommons.org/licenses/by/4.0/).
DOI 10.1007/s42979-022-01186-x
Discipline Computer Science
EISSN 2661-8907
ExternalDocumentID PMC9123926 (PMC); 35647556 (PubMed)
Genre Journal Article
Review
GrantInformation
– Horizon 2020, grant 957331 (http://dx.doi.org/10.13039/501100007601)
– FernUniversität in Hagen (3099)
– Ministerium für Innovation, Wissenschaft und Forschung des Landes Nordrhein-Westfalen (http://dx.doi.org/10.13039/501100009591)
ISSN 2662-995X
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 4
Keywords Gaussian processes
Machine learning
Probabilistic machine learning
Language English
License The Author(s) 2022.
Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
OpenAccessLink https://link.springer.com/10.1007/s42979-022-01186-x
PMID 35647556
PublicationDate 2022-07-01
PublicationPlace Singapore
PublicationTitle SN Computer Science
PublicationYear 2022
Publisher Springer Nature Singapore; Springer Nature B.V.
– reference: Snelson E, Ghahramani Z. Local and global sparse gaussian process approximations. In: AISTATS, JMLR Proceedings, vol. 2. JMLR.org 2007;524–531.
– reference: Alsaleh R, Sayed T. Microscopic modeling of cyclists interactions with pedestrians in shared spaces: a gaussian process inverse reinforcement learning approach. Transportmetrica A Transport Sci. 2021. https://doi.org/10.1080/23249935.2021.1898487.
– reference: Csató L, Opper M. Sparse representation for gaussian process models. In: NIPS. MIT Press 2000;444–450.
– reference: MasoudniaSEbrahimpourRMixture of experts: a literature surveyArtif Intell Rev201442227529310.1007/s10462-012-9338-y
– reference: VerrelstJAlonsoLCamps-VallsGDelegidoJMorenoJFRetrieval of vegetation biophysical parameters using gaussian process techniquesIEEE Trans Geosci Remote Sens2012505–21832184310.1109/TGRS.2011.2168962
– reference: Snelson E, Ghahramani Z. Sparse gaussian processes using pseudo-inputs. In: NIPS. 2005;1257–1264.
– reference: Kim H, Teh YW. Scaling up the automatic statistician: scalable structure discovery using gaussian processes. In: AISTATS, Proceedings of machine learning research, vol. 84. PMLR 2018;575–584.
Subject terms: Algorithms; Approximation; Automation; Computer Imaging; Computer Science; Computer Systems Organization and Communication Networks; Data Structures and Information Theory; Datasets; Empirical analysis; Gaussian process; Inference; Information Systems and Communication Service; Interpolation; Knowledge Discovery; Knowledge Engineering and Knowledge Management; Model accuracy; Pattern Recognition and Graphics; Random variables; Review; Review Article; Software Engineering/Programming and Operating Systems; State-of-the-art reviews; Statistical analysis; Vision
Title: Automated Model Inference for Gaussian Processes: An Overview of State-of-the-Art Methods and Algorithms
Online access:
https://link.springer.com/article/10.1007/s42979-022-01186-x
https://www.ncbi.nlm.nih.gov/pubmed/35647556
https://www.proquest.com/docview/2933094661
https://www.proquest.com/docview/2672317790
https://pubmed.ncbi.nlm.nih.gov/PMC9123926