Relative deviation learning bounds and generalization with unbounded loss functions

Bibliographic Details
Published in: Annals of Mathematics and Artificial Intelligence, Vol. 85, No. 1, pp. 45-70
Main Authors: Cortes, Corinna; Greenberg, Spencer; Mohri, Mehryar
Format: Journal Article
Language: English
Published: Cham: Springer International Publishing, 01.01.2019
ISSN: 1012-2443, 1573-7470
Description
Summary: We present an extensive analysis of relative deviation bounds, including detailed proofs of two-sided inequalities and their implications. We also give detailed proofs of two-sided generalization bounds that hold in the general case of unbounded loss functions, under the assumption that a moment of the loss is bounded. We then illustrate how to apply these results in a sample application: the analysis of importance weighting.
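
As a brief illustration of the sample application mentioned in the summary (a generic sketch using standard notation, not quoted from the paper itself), importance weighting estimates the loss under a target distribution P from a sample drawn from a different distribution Q by reweighting each observed loss with the density ratio w(x) = P(x)/Q(x):

% Importance-weighted empirical loss of a hypothesis h over a sample of size m.
% The symbols R_w, L, w, P, Q are illustrative and need not match the paper's notation.
\[
  \widehat{R}_w(h) \;=\; \frac{1}{m} \sum_{i=1}^{m} w(x_i)\, L(h, x_i, y_i),
  \qquad w(x) = \frac{P(x)}{Q(x)}.
\]

Because the weights w(x) may be unbounded, the weighted loss is in general unbounded as well, which is why generalization bounds that only require a bounded moment of the loss are relevant to this application.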
DOI: 10.1007/s10472-018-9613-y