Rényi Divergence Based Bounds on Generalization Error

Authors

Eeshan Modak, Tata Institute of Fundamental Research
Himanshu Asnani, Tata Institute of Fundamental Research, Mumbai & University of Washington, Seattle
Vinod Prabhakaran, Tata Institute of Fundamental Research

Abstract

Generalization error captures the degree to which the output of a learning algorithm overfits the training data. We obtain a family of bounds which, under certain assumptions, generalize the bounds developed by Xu and Raginsky (2017) and by Bu, Zou and Veeravalli (2019). Our bounds are based on the Rényi analogue of the Donsker-Varadhan representation of the Kullback-Leibler divergence. We also obtain bounds on the probability that the generalization error exceeds a given threshold, which recover the bounds of Esposito, Gastpar and Issa (2020). Finally, we give a multiplicative lower bound on the expected true loss for the 0-1 loss function.
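
For orientation, the objects mentioned in the abstract can be written out as follows; the notation is standard but illustrative and need not match the manuscript's. With a training sample S = (Z_1, ..., Z_n) drawn i.i.d. from a distribution \mu and W the output of the learning algorithm, the expected generalization error is

\[
\mathrm{gen}(\mu, P_{W \mid S}) := \mathbb{E}\big[ L_\mu(W) - L_S(W) \big],
\qquad
L_\mu(w) = \mathbb{E}_{Z \sim \mu}[\ell(w, Z)],
\quad
L_S(w) = \frac{1}{n} \sum_{i=1}^{n} \ell(w, Z_i).
\]

The Donsker-Varadhan representation of the Kullback-Leibler divergence, whose Rényi analogue underlies the bounds above, states that

\[
D(P \,\|\, Q) = \sup_{f} \Big\{ \mathbb{E}_{P}[f(X)] - \log \mathbb{E}_{Q}\big[ e^{f(X)} \big] \Big\},
\]

and, under appropriate sub-Gaussianity assumptions on the loss \ell (e.g., \ell(w, Z) being \sigma-sub-Gaussian under \mu), the bounds of Xu and Raginsky (2017) and of Bu, Zou and Veeravalli (2019) that are being generalized take the form

\[
\big| \mathrm{gen}(\mu, P_{W \mid S}) \big| \le \sqrt{\frac{2\sigma^2}{n} \, I(S; W)},
\qquad
\big| \mathrm{gen}(\mu, P_{W \mid S}) \big| \le \frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^2 \, I(Z_i; W)}.
\]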

Paper Manuscript