Authors
Corinna Cortes, Yishay Mansour, Mehryar Mohri
Publication date
2010
Journal
Advances in neural information processing systems
Volume
23
Description
This paper presents an analysis of importance weighting for learning from finite samples and gives a series of theoretical and algorithmic results. We point out simple cases where importance weighting can fail, which suggests the need for an analysis of the properties of this technique. We then give both upper and lower bounds for generalization with bounded importance weights and, more significantly, give learning guarantees for the more common case of unbounded importance weights under the weak assumption that the second moment is bounded, a condition related to the Rényi divergence of the training and test distributions. These results are based on a series of novel and general bounds we derive for unbounded loss functions, which are of independent interest. We use these bounds to guide the definition of an alternative reweighting algorithm and report the results of experiments demonstrating its benefits. Finally, we analyze the properties of normalized importance weights, which are also commonly used.
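The quantities discussed in the abstract can be illustrated with a minimal NumPy sketch. The distributions, sample size, and test function below are hypothetical choices for illustration only; the sketch shows the standard importance weights w(x) = p_test(x)/p_train(x), the unnormalized and normalized (self-normalized) importance-weighted estimators, and the empirical second moment of the weights, the quantity the paper relates to the Rényi divergence of order 2 between the two distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D Gaussian training and test distributions (same variance,
# shifted means) chosen purely for illustration.
train_mean, test_mean, sigma = 0.0, 0.5, 1.0

def density(x, mean, sigma):
    """Gaussian probability density function."""
    return np.exp(-((x - mean) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Draw samples from the training distribution only.
x = rng.normal(train_mean, sigma, size=10_000)

# Importance weights: w(x) = p_test(x) / p_train(x).
w = density(x, test_mean, sigma) / density(x, train_mean, sigma)

# Target: estimate E_test[f(X)] for a test function f (here f(x) = x^2,
# whose true test-distribution expectation is test_mean^2 + sigma^2 = 1.25).
f = x**2

# Unnormalized importance-weighted estimator.
est_unnormalized = np.mean(w * f)

# Normalized (self-normalized) importance-weighted estimator,
# using the normalized weights w / sum(w) analyzed in the paper.
est_normalized = np.sum(w * f) / np.sum(w)

# Empirical second moment of the weights; its boundedness is the weak
# assumption under which the paper's guarantees for unbounded weights hold.
second_moment = np.mean(w**2)
```

For these two Gaussians the weights are unbounded but their second moment is finite, so both estimators concentrate around the true value 1.25; the normalized estimator trades a small bias for reduced variance.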
Total citations
407 (by year: 2011: 4, 2012: 9, 2013: 14, 2014: 14, 2015: 21, 2016: 19, 2017: 18, 2018: 23, 2019: 35, 2020: 43, 2021: 57, 2022: 63, 2023: 50, 2024: 37)