
A Seminar by Dr. P. Balamurugan

Title: Efficient Algorithms for a Class of Saddle-Point Optimization Problems.

Speaker: Dr. P. Balamurugan, SIERRA Project Team, INRIA-École Normale Supérieure.

Time: 10:30 am, Tuesday, 25 October 2016.

Venue: IEOR Teaching Lab, Ground Floor, IEOR Building.

Abstract: Many convex optimization problems arising in machine learning can be written as a weighted sum of a suitable regularizer and a loss term. For typical applications such as classification and regression, the associated primal and dual optimization problems contain terms that decompose over the training examples. This property is exploited by the widely popular primal stochastic gradient descent method and the dual coordinate ascent method. Despite their excellent generalization performance, both methods yield only moderately accurate solutions. The recently popularized stochastic average gradient (SAG) method, its variant SAGA, and the stochastic variance reduced gradient (SVRG) method rectify this situation by providing highly accurate solutions. In this talk, I will discuss an extension of the SAG, SAGA and SVRG methods to saddle-point problems. I will first motivate the need for this extension with suitable examples. Since the extension does not reside in the favorable convex optimization framework, I will also present the challenges that need to be handled. I will conclude with some natural generalizations, a bit of theory, and a demonstration of the theoretical results through experiments. This is joint work with Prof. Francis Bach and will appear at NIPS 2016.
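
To make the finite-sum saddle-point setting concrete, the sketch below applies an SVRG-style update to a toy regularized bilinear problem min_x max_y (lam/2)||x||^2 + y'Ax - (mu/2)||y||^2 - b'y, whose coupling term decomposes over the rows of A. This is my own Python illustration (the function name svrg_saddle and the formulation are assumptions for exposition, not the algorithm presented in the talk or the paper):

    import numpy as np

    def svrg_saddle(A, b, lam, mu, step=2e-3, epochs=200, seed=0):
        """Illustrative SVRG-style method for the toy saddle-point problem
            min_x max_y (lam/2)||x||^2 + y'Ax - (mu/2)||y||^2 - b'y.
        Each inner step touches a single row of A; a full operator
        evaluation at a periodic snapshot corrects the variance."""
        rng = np.random.default_rng(seed)
        n, d = A.shape
        x, y = np.zeros(d), np.zeros(n)

        def full_op(x, y):
            # (gradient in x, minus gradient in y): a strongly monotone
            # operator (for lam, mu > 0) whose root is the saddle point.
            return lam * x + A.T @ y, -(A @ x - mu * y - b)

        def partial_op(i, x, y):
            # Per-example estimate, unbiased under uniform sampling of i:
            # E_i[partial_op(i, x, y)] = full_op(x, y).
            gx = lam * x + n * y[i] * A[i]
            gy = np.zeros(n)
            gy[i] = -n * (A[i] @ x - mu * y[i] - b[i])
            return gx, gy

        for _ in range(epochs):
            xs, ys = x.copy(), y.copy()     # snapshot iterate
            Gx, Gy = full_op(xs, ys)        # full operator at the snapshot
            for _ in range(n):
                i = rng.integers(n)
                gx, gy = partial_op(i, x, y)
                sx, sy = partial_op(i, xs, ys)
                # Variance-reduced direction; simultaneous update in x and y.
                x = x - step * (gx - sx + Gx)
                y = y - step * (gy - sy + Gy)
        return x, y

A minimal usage check (again purely illustrative): the two printed first-order optimality residuals shrink toward zero as the number of epochs grows.

    rng = np.random.default_rng(1)
    A, b = rng.standard_normal((30, 5)), rng.standard_normal(30)
    x, y = svrg_saddle(A, b, lam=1.0, mu=1.0)
    print(np.linalg.norm(1.0 * x + A.T @ y),   # grad in x at the saddle
          np.linalg.norm(A @ x - 1.0 * y - b)) # grad in y at the saddle

The point visible even in this toy version is the one the abstract highlights: each inner step costs only one training example, yet the snapshot correction keeps the estimator's variance shrinking, which is what allows highly accurate solutions rather than the moderately accurate ones produced by plain stochastic gradient steps.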

Speaker bio: P. Balamurugan is a post-doctoral researcher in the SIERRA Project Team, INRIA-École Normale Supérieure, advised by Professor Francis Bach. He received his PhD from the Department of Computer Science and Automation, Indian Institute of Science, advised by Dr. Shirish Shevade. His primary research interests lie at the intersection of machine learning and mathematical optimization, with particular emphasis on large-scale sparse classification, structured output learning, and semi-supervised and unsupervised learning problems.
 
