
Talk by Dr. Arjun Kodagehalli Ramachandra

Speaker: Dr. Arjun Kodagehalli Ramachandra

Title: When submodularity meets uncertainty

Venue: IEOR Seminar Room, 2nd floor

Date and Time: 12th April, 5-6 pm

Abstract: Submodularity provides a natural framework for modeling a large class of discrete optimization problems, including influence maximization, contention resolution, mechanism design, resource allocation, and several machine learning problems. As a property of set functions, submodularity captures the notion of diminishing returns in the discrete space. Theoretically, it has intrigued researchers due to its strong structural similarity with both convex and concave functions, leading to continuous extensions that are useful in deriving approximation guarantees for deterministic submodular optimization problems. Distributionally robust submodular optimization, however, involves evaluating or approximating the worst-case expected value of a submodular function (subject to random input) over a set of joint distributions. Traditional approaches to this problem assume the random inputs to be independent. For example, with monotone submodular functions, Agrawal et al. (2012) showed that the "loss" in optimality, or correlation gap (incurred by ignoring the correlation structure of the random set and assuming independence instead), is at most e/(e-1). In reality, however, more complex notions of randomness are often encountered, such as when weak correlations coexist with higher-order dependencies. Motivated by the need to incorporate more realistic notions of randomness, and by the curiosity to understand the interplay between functional properties and randomness, we study the behavior of monotone submodular set functions with uncorrelated Bernoulli random input. We show that in this scenario the e/(e-1) bound on the correlation gap can be improved to 4/3 (and that this is tight) in the following cases: (a) for a small number of random inputs with general marginal probabilities, and (b) for a general number of random inputs with small marginal probabilities. The results automatically extend to random inputs with positive, negative, and mixed correlations.
Our results illustrate a fundamental difference in the behavior of submodular functions under weaker notions of independence, with potential ramifications for improving existing algorithmic approximations.
(This is joint work with Karthik Natarajan, Singapore University of Technology and Design.)
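The two ideas at the heart of the abstract — diminishing returns and the correlation gap — can be illustrated with a small self-contained sketch. The example below is hypothetical and not taken from the talk: it uses a toy coverage function (a standard monotone submodular set function) to show the diminishing-returns inequality, and then compares the expected value under independent Bernoulli inputs against a negatively correlated distribution with the same marginals. The observed ratio stays below the e/(e-1) bound of Agrawal et al. (2012).

```python
import math

# Hypothetical toy instance (not from the talk): each item covers a
# subset of a small ground set; f(S) counts the elements covered by S.
cover = {0: {0, 1}, 1: {1, 2}, 2: {2, 3}}  # item -> elements it covers

def f(S):
    """Coverage value: number of ground elements covered by the items in S."""
    covered = set()
    for i in S:
        covered |= cover[i]
    return len(covered)

# Diminishing returns: adding item e to a set S helps at least as much
# as adding it to a superset T of S.
S, T, e = {0}, {0, 1}, 2
gain_small = f(S | {e}) - f(S)  # marginal gain on the smaller set
gain_large = f(T | {e}) - f(T)  # marginal gain on the larger set
assert gain_small >= gain_large  # the submodularity inequality

# Correlation gap for items {0, 1}, each present with marginal p = 0.5.
p = 0.5
# Independent Bernoulli inputs:
E_indep = ((1 - p) ** 2 * f(set()) + p * (1 - p) * f({0})
           + (1 - p) * p * f({1}) + p * p * f({0, 1}))
# Negatively correlated distribution with the same marginals
# (exactly one of the two items appears):
E_corr = 0.5 * f({0}) + 0.5 * f({1})
gap = E_corr / E_indep
assert gap <= math.e / (math.e - 1)  # within the e/(e-1) bound
```

In this instance the ratio works out to 8/7, well inside the e/(e-1) ≈ 1.582 bound; the talk's 4/3 bound concerns the worst case over all monotone submodular functions in the stated regimes, which a single toy example of course does not establish.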


Speaker Bio: Arjun is currently a postdoctoral research fellow at the Singapore University of Technology and Design (SUTD). He holds a Ph.D. in Operations Research from the National University of Singapore, and an Integrated Master's degree in Mathematics and Computing from the Indian Institute of Technology, Kharagpur. His research interests broadly lie in optimization under uncertainty and data-driven decision-making under tail-based imbalance.

Host: Prof. Manjesh Hanawal, TCA2I & IEOR. 
