Estimation of prediction error by using K-fold cross-validation

Bibliographic Details
Published in: Statistics and Computing, Vol. 21, No. 2, pp. 137-146
Main Author: Fushiki, Tadayoshi
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.04.2011
ISSN: 0960-3174, 1573-1375
Description
Abstract: Estimation of prediction accuracy is important when our aim is prediction. The training error is an easy estimate of prediction error, but it has a downward bias. On the other hand, K-fold cross-validation has an upward bias. The upward bias may be negligible in leave-one-out cross-validation, but it sometimes cannot be neglected in 5-fold or 10-fold cross-validation, which are favored from a computational standpoint. Since the training error has a downward bias and K-fold cross-validation has an upward bias, there will be an appropriate estimate in a family that connects the two estimates. In this paper, we investigate two families that connect the training error and K-fold cross-validation.
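The abstract's argument can be made concrete with a small numerical sketch. The Python code below (using NumPy and scikit-learn) computes the downward-biased training error and an upward-biased 5-fold cross-validation estimate for a toy linear-regression problem, and then forms a simple convex combination of the two as one illustrative way to connect them. The toy data, the model, and the weight alpha are assumptions for illustration only; they are not the specific families studied in the paper.

import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

# Toy regression data (illustrative assumption, not from the paper).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=1.0, size=100)

# Training error: fit and evaluate on the same data (downward-biased).
model = LinearRegression().fit(X, y)
train_err = mean_squared_error(y, model.predict(X))

# K-fold cross-validation error (upward-biased, especially for small K).
K = 5
cv_errs = []
for train_idx, test_idx in KFold(n_splits=K, shuffle=True, random_state=0).split(X):
    fold_model = LinearRegression().fit(X[train_idx], y[train_idx])
    cv_errs.append(mean_squared_error(y[test_idx], fold_model.predict(X[test_idx])))
cv_err = float(np.mean(cv_errs))

# One hypothetical "family" connecting the two estimates: a convex
# combination indexed by alpha in [0, 1] (illustrative choice only).
alpha = 0.5
combined = alpha * cv_err + (1 - alpha) * train_err
print(f"training error = {train_err:.3f}, {K}-fold CV = {cv_err:.3f}, combined = {combined:.3f}")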
DOI: 10.1007/s11222-009-9153-8