
Paper Accepted to IEEE Information Theory Workshop (ITW) 2020

Title: “Jensen-Shannon Information Based Characterization of the Generalization Error of Learning Algorithms”
Authors: Gholamali Aminian, Laura Toni, and Miguel Rodrigues


Abstract: Generalization error bounds are critical to understanding the performance of machine learning models. In this work, we propose a new information-theoretic generalization error upper bound applicable to supervised learning scenarios. We show that our general bound can be specialized to recover various previous bounds. We also show that, under some conditions, our general bound specializes to a new bound involving the Jensen-Shannon information between a random variable modelling the set of training samples and another random variable modelling the set of hypotheses. We further prove that our bound can be tighter than mutual information-based bounds under some conditions.

Index Terms: Generalization Error Bounds, Mutual Information, Jensen-Shannon Information
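The Jensen-Shannon information in the abstract is built on the Jensen-Shannon divergence, a symmetrized and bounded variant of the KL divergence. The paper's actual bound is not reproduced here; the following is only an illustrative sketch of the underlying divergence for discrete distributions, with all function names and example distributions chosen for illustration.

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p || q) for discrete distributions, in nats.
    Terms with p[i] == 0 contribute zero by convention."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL of p and q to their
    mixture m = (p + q) / 2. Symmetric and bounded by log(2)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Example: JS divergence between a fair and a biased coin.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(js_divergence(p, q))  # same value as js_divergence(q, p)
```

Unlike the KL divergence, this quantity is finite even when the two distributions have disjoint supports, which is one reason Jensen-Shannon-based quantities can yield bounds where mutual information-based ones degenerate.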
