BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
BEGIN:VEVENT
UID:523@lincs.fr
DTSTART;TZID=Europe/Paris:20200122T110000
DTEND;TZID=Europe/Paris:20200122T120000
DTSTAMP:20200127T080001Z
URL:https://www.lincs.fr/events/mutual-information-neural-estimation/
SUMMARY:Mutual Information Neural Estimation
DESCRIPTION:Reference: Mutual Information Neural Estimation\, Belghazi et
 al.\, 2018.\n\nThe mutual information (MI) of two random variables is a
 measure of the mutual dependence between them. More specifically\, it
 quantifies the "amount of information" obtained about one random variable
 by observing the other. Despite being a pivotal quantity across data
 science\, mutual information has historically been difficult to compute.
 Exact computation is tractable only for discrete variables or for a
 limited family of problems where the probability distributions are known.
 Common approaches for MI estimation generally do not scale with
 dimensionality or sample size. I will present a method for efficiently
 estimating a strongly consistent lower bound of mutual information that
 can be used in numerous machine learning problems.\n\nSlides.\n\nNotebook.
CATEGORIES:Network Theory,Working Group
LOCATION:Paris-Rennes Room (EIT Digital)\, 23 avenue d'Italie\, 75013
 Paris\, France
X-APPLE-STRUCTURED-LOCATION;VALUE=URI;X-ADDRESS=23 avenue d'Italie\, 75013
 Paris\, France;X-APPLE-RADIUS=100;X-TITLE=Paris-Rennes Room (EIT
 Digital):geo:0,0
END:VEVENT
BEGIN:VTIMEZONE
TZID:Europe/Paris
X-LIC-LOCATION:Europe/Paris
BEGIN:STANDARD
DTSTART:20191027T020000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
END:VTIMEZONE
END:VCALENDAR