Speaker: Édouard Pineau (Télécom Paris / Safran)
Date: 22/01/2020
Time: 11:00 am - 12:00 pm
Location: Paris-Rennes Room (EIT Digital)
Abstract
Reference: Belghazi et al., "Mutual Information Neural Estimation", 2018.
The mutual information (MI) of two random variables measures their mutual dependence: it quantifies the "amount of information" obtained about one variable by observing the other. Despite being a pivotal quantity across data science, MI has historically been difficult to compute. Exact computation is tractable only for discrete variables or for a limited family of problems where the probability distributions are known; beyond these cases, common approaches to MI estimation scale poorly with dimensionality and sample size. I will present a method for efficiently estimating a strongly consistent lower bound on mutual information that can be used in numerous machine learning problems.
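To give a flavor of the approach, MINE trains a neural network T to maximize the Donsker-Varadhan lower bound I(X;Y) ≥ E_P[T(x,y)] − log E_{P_X⊗P_Y}[exp T(x,y)], where the product-of-marginals expectation is approximated by shuffling one variable within the batch. Below is a minimal sketch in PyTorch; the network architecture, optimizer settings, and toy data are illustrative assumptions, not details from the talk or the paper.

```python
# Minimal sketch of the MINE lower bound (Belghazi et al., 2018).
# Architecture, hyperparameters, and data are illustrative choices.
import math
import torch
import torch.nn as nn

class StatisticsNetwork(nn.Module):
    """Small MLP implementing the statistics network T(x, y)."""
    def __init__(self, x_dim, y_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=1))

def mine_lower_bound(T, x, y):
    """Donsker-Varadhan estimate: E_P[T] - log E_{P_X ⊗ P_Y}[exp(T)].

    Marginal samples are obtained by shuffling y within the batch.
    (The paper also corrects the gradient bias of the log-term with a
    moving average; that refinement is omitted in this sketch.)
    """
    joint = T(x, y).mean()
    y_shuffled = y[torch.randperm(y.size(0))]
    # log mean exp = logsumexp - log N
    marginal = torch.logsumexp(T(x, y_shuffled), dim=0).squeeze() - math.log(y.size(0))
    return joint - marginal

# Toy usage: Y = X + noise, so the true MI is strictly positive.
x = torch.randn(512, 1)
y = x + 0.2 * torch.randn(512, 1)
T = StatisticsNetwork(1, 1)
opt = torch.optim.Adam(T.parameters(), lr=1e-3)
for step in range(500):
    opt.zero_grad()
    loss = -mine_lower_bound(T, x, y)  # maximize the bound
    loss.backward()
    opt.step()
print(f"Estimated MI lower bound: {-loss.item():.3f} nats")
```

Because the bound holds for any T, maximizing it over the network parameters tightens the estimate from below, which is the source of the strong consistency result mentioned in the abstract.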