Online Hyperparameter Optimization by Real-time Recurrent Learning

Speaker: Professor Kyunghyun Cho
Affiliation: New York University

Zoom: https://ucla.zoom.us/j/96763629578?pwd=YVY2bVJRb1pNODVoa29jUDhkQTVHdz09
Abstract:  Conventional hyperparameter optimization methods are computationally intensive and hard to generalize to scenarios that require dynamically adapting hyperparameters, such as life-long learning. Here, we propose an online hyperparameter optimization algorithm that is asymptotically exact and computationally tractable, both theoretically and practically. Our framework takes advantage of the analogy between hyperparameter optimization and parameter learning in recurrent neural networks (RNNs). It adapts a well-studied family of online learning algorithms for RNNs to tune hyperparameters and network parameters simultaneously, without repeatedly rolling out iterative optimization. This procedure yields systematically better generalization performance compared to standard methods, at a fraction of wallclock time. (This is work done with Daniel Jiwoong Im and Cristina Savin.)
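The abstract's core idea, carrying hyperparameter sensitivities forward through the optimization trajectory the way RTRL carries state sensitivities through an RNN, can be illustrated with a toy sketch. This is not the authors' code; the quadratic loss, the scalar learning rate as the single hyperparameter, and all step sizes below are illustrative assumptions:

```python
# Toy sketch of forward-mode (RTRL-style) online hyperparameter optimization.
# NOT the authors' implementation: the quadratic loss, scalar learning rate,
# and step sizes are assumptions for illustration only.
#
# Train loss L(theta) = 0.5 * theta^2, so grad = theta and Hessian = 1.
# The sensitivity z_t = d theta_t / d eta is carried forward step by step,
# analogous to how RTRL carries d(hidden state)/d(weights) through an RNN.

def online_hyperopt(steps=500, theta0=5.0, eta0=0.01, beta=0.002):
    theta, eta, z = theta0, eta0, 0.0
    for _ in range(steps):
        g, h = theta, 1.0                   # gradient and Hessian of train loss
        theta_next = theta - eta * g        # SGD step on the parameter
        z = z - g - eta * h * z             # forward sensitivity recursion
        theta = theta_next
        hyper_grad = theta * z              # d L_val / d eta (here L_val = L)
        eta = max(eta - beta * hyper_grad, 1e-4)  # online update of eta itself
    return theta, eta

final_theta, final_eta = online_hyperopt()
```

Both the parameter and the hyperparameter are updated in a single pass, with no repeated unrolling or restarting of the inner optimization, which is the source of the wallclock savings the abstract mentions.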
Biography: Kyunghyun Cho is an associate professor of computer science and data science at New York University and a CIFAR Fellow of Learning in Machines & Brains. He was a research scientist at Facebook AI Research from June 2017 to May 2020 and a postdoctoral fellow at the University of Montreal until Summer 2015 under the supervision of Prof. Yoshua Bengio, after receiving MSc and PhD degrees from Aalto University in April 2011 and April 2014, respectively, under the supervision of Prof. Juha Karhunen, Dr. Tapani Raiko and Dr. Alexander Ilin. He tries his best to find a balance among machine learning, natural language processing, and life, but almost always fails to do so.

Date/Time:
Feb 11, 2021, 11:00 am - 12:15 pm

Location:
Via Zoom Only