Cornell University
Cornell Tech, Bloomberg Center, Room 061
Free Event

Learning Machines Seminar Series

What: LMSS @ Cornell Tech: Kyunghyun Cho (NYU/Facebook AI Research)
When: Friday, March 6th, 12:15 p.m. (lunch served at 12:00 p.m.)
Where: Room 061, Bloomberg Center, Cornell Tech

 

"Inconsistency of a recurrent language model: a question i forgot to ask in 2014"

In this talk, I will go back to the basics of neural sequence modeling and ask the glaringly obvious question I forgot to ask in 2014: "is density estimation a good strategy for sequence generation?" I take an initial stab at belatedly answering this question by empirically investigating the discrepancy between density estimation and sequence generation, studying the effectiveness of two distinct approaches to neural sequence modeling (Lee, Tran, Firat & Cho, 2020, under review), and theoretically studying the inconsistency of incomplete decoding in a recurrent language model (Welleck, Kulikov, Kim, Pang & Cho, 2020, under review). I will conclude the talk by discussing some lessons I have learned over the past half year of trying to answer this question.
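(For readers unfamiliar with the "inconsistency of incomplete decoding" mentioned above, the toy Python sketch below, which is not taken from the papers, illustrates the idea: a model that assigns positive probability to <eos> at every step terminates almost surely under ancestral sampling, yet a greedy decoder applied to the very same model never emits <eos> and so never terminates. All names and numbers here are illustrative.)

    import numpy as np

    # Toy "recurrent" LM: the same next-token distribution at every step.
    # <eos> has positive probability but is never the most likely token.
    VOCAB = ["a", "b", "<eos>"]
    PROBS = np.array([0.5, 0.4, 0.1])

    def ancestral_sample(max_steps=10000, seed=0):
        # Sampling from the full distribution terminates with probability 1.
        rng = np.random.default_rng(seed)
        seq = []
        for _ in range(max_steps):
            tok = VOCAB[rng.choice(len(VOCAB), p=PROBS)]
            if tok == "<eos>":
                return seq
            seq.append(tok)
        return seq  # practically unreachable

    def greedy_decode(max_steps=20):
        # Greedy decoding (an "incomplete" decoder) always picks "a",
        # so it never emits <eos>: without the cap it would run forever.
        seq = []
        for _ in range(max_steps):
            tok = VOCAB[int(np.argmax(PROBS))]
            if tok == "<eos>":
                return seq
            seq.append(tok)
        return seq

    print("ancestral:", ancestral_sample())  # a finite sequence
    print("greedy   :", greedy_decode())     # ['a', 'a', ...] hits the step cap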

 

BIO

Kyunghyun Cho is an associate professor of computer science and data science at New York University, a research scientist at Facebook AI Research, and a CIFAR Associate Fellow. He was a postdoctoral fellow at the University of Montreal until summer 2015 under the supervision of Prof. Yoshua Bengio, and received his PhD and MSc degrees from Aalto University in early 2014 under the supervision of Prof. Juha Karhunen, Dr. Tapani Raiko, and Dr. Alexander Ilin. He tries his best to find a balance among machine learning, natural language processing, and life, but almost always fails to do so.
