BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
BEGIN:VEVENT
UID:502@lincs.fr
DTSTART;TZID=Europe/Paris:20191009T103000
DTEND;TZID=Europe/Paris:20191009T120000
DTSTAMP:20191107T082842Z
URL:https://www.lincs.fr/events/transformer-models-in-artificial-intellige
 nce-for-natural-language-processing/
SUMMARY:Transformer models in Artificial Intelligence for Natural Language
 Processing
DESCRIPTION:\nWe'll explore recent Deep Learning models for Natural
 Language Processing based on the ("post-Recurrent Neural Network")
 Transformer architecture described in Attention Is All You Need
 (Vaswani et al.\, 2017). We'll understand the intuition behind this
 architecture and how it was applied to supervised Seq2Seq tasks. Then
 we'll see how Devlin et al. (2018) pre-trained a deep bidirectional
 transformer called BERT\, a model producing state-of-the-art results on
 several Natural Language Understanding tasks. If time permits\, we'll
 dig into models derived from the Transformer\, such as OpenAI GPT-2
 (Radford et al.\, 2019). GPT-2 achieves astounding results in Natural
 Language Generation and has been reported in the press as "the text
 generator performing too well to be released"...\nSlides to the
 presentation\n
CATEGORIES:Network Theory,Working Group
LOCATION:Telecom Paristech\, I304 (3rd floor)\, 23\, avenue d'Italie\,
 Paris\, 75013\, France
X-APPLE-STRUCTURED-LOCATION;VALUE=URI;X-ADDRESS=23\, avenue d'Italie\,
 Paris\, 75013\, France;X-APPLE-RADIUS=100;X-TITLE=Telecom Paristech\, I304
 (3rd floor):geo:0,0
END:VEVENT
BEGIN:VTIMEZONE
TZID:Europe/Paris
X-LIC-LOCATION:Europe/Paris
BEGIN:DAYLIGHT
DTSTART:20190331T030000
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
END:DAYLIGHT
END:VTIMEZONE
END:VCALENDAR