
Dr. Oliver Hinder: Gradient Descent for Solving Linear Programs
About this audio
Oliver Hinder is an Assistant Professor in the Industrial Engineering Department at the University of Pittsburgh. Before that, he was a visiting post-doc in the Optimization and Algorithms group at Google in New York, and he received his PhD in Management Science and Engineering from Stanford in 2019, working with Professor Yinyu Ye. He studies optimization, including gradient descent methods for both convex and nonconvex problems.
We chat about Oliver's move from New Zealand to the U.S. to start his PhD at Stanford; we talk about some of his recent work on gradient descent methods for solving LPs to high accuracy, and how restarts can benefit algorithms like these. Finally, we touch on automated parameter tuning in machine learning, especially in deep learning, which is widely used in many applications.
Check it out!
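
To give a flavor of the restarted gradient-type methods for LPs mentioned above, here is a minimal sketch of a restarted primal-dual hybrid gradient (PDHG) loop on a standard-form LP (minimize c^T x subject to Ax = b, x >= 0). The step sizes, the fixed-factor restart rule, and the toy example are illustrative assumptions, not Oliver's exact algorithm or a production solver.

```python
# Minimal sketch: restarted PDHG for  min c^T x  s.t.  A x = b, x >= 0.
# Step sizes, restart rule, and the toy LP below are illustrative assumptions.
import numpy as np

def kkt_residual(A, b, c, x, y):
    """Rough optimality measure: primal infeasibility + dual infeasibility + duality gap."""
    primal = np.linalg.norm(A @ x - b)
    dual = np.linalg.norm(np.minimum(c - A.T @ y, 0.0))  # reduced costs should be >= 0
    gap = abs(c @ x - b @ y)
    return primal + dual + gap

def restarted_pdhg(A, b, c, iters=20000, restart_factor=0.5):
    m, n = A.shape
    eta = 0.9 / np.linalg.norm(A, 2)  # tau = sigma = eta keeps tau*sigma*||A||^2 < 1
    x, y = np.zeros(n), np.zeros(m)
    x_avg, y_avg, count = x.copy(), y.copy(), 0
    best = kkt_residual(A, b, c, x, y)
    for _ in range(iters):
        x_new = np.maximum(x - eta * (c - A.T @ y), 0.0)  # projected primal gradient step
        y = y + eta * (b - A @ (2 * x_new - x))           # dual step with extrapolation
        x = x_new
        count += 1
        x_avg += (x - x_avg) / count                      # running average of iterates
        y_avg += (y - y_avg) / count
        res = kkt_residual(A, b, c, x_avg, y_avg)
        if res <= restart_factor * best:                  # restart from the averaged point
            x, y = x_avg.copy(), y_avg.copy()
            x_avg, y_avg, count = x.copy(), y.copy(), 0
            best = res
    return x_avg, y_avg

if __name__ == "__main__":
    # Toy LP: min -x1 - 2*x2  s.t.  x1 + x2 <= 4, x2 <= 3, x >= 0 (slacks s, t added).
    A = np.array([[1.0, 1.0, 1.0, 0.0],
                  [0.0, 1.0, 0.0, 1.0]])
    b = np.array([4.0, 3.0])
    c = np.array([-1.0, -2.0, 0.0, 0.0])
    x, y = restarted_pdhg(A, b, c)
    print("x =", np.round(x, 3), "objective =", round(c @ x, 3))  # expect x1=1, x2=3, obj=-7
```

The idea behind the restart is that the averaged iterate tends to make steadier progress than the last iterate, so periodically resetting the algorithm to that average (whenever the optimality measure has dropped by a fixed factor) can speed up convergence to high accuracy.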