Speaker: Kartik Ahuja
Affiliation: UCLA Ph.D. Candidate
Abstract: In many engineering and machine learning applications, we encounter optimization problems (e.g., resource allocation, clustering) for which finding the exact solution is computationally intractable. In such problems, ad-hoc approximate solutions are often used, and these come with no performance guarantees. Our goal is to develop approximate optimization methods with two features: a) provable performance guarantees; and b) computational tractability. In this talk, we focus on several challenging problems in resource allocation and machine learning and develop such optimization methods for them. In the first part of the talk, we discuss optimization methods for resource allocation problems that arise in the design of two kinds of engineering systems: wireless networks and healthcare systems.
Dense deployment of heterogeneous small cells (e.g., picocells, femtocells) is becoming the most effective way to meet the exploding demand for wireless spectrum. Given the large scale of these deployments, computing resource-sharing policies in a centralized system can be prohibitive in both computation and communication. To this end, we propose a general framework for distributed multi-agent resource sharing. We show that the proposed framework significantly outperforms the state of the art, and we prove quite general constant-factor approximation guarantees with respect to the optimal centralized solutions.
Screening plans are used for the early detection of several diseases, such as breast cancer and colon cancer. Existing screening plans are not personalized to the history and demographics of the subject; as a result, they can delay the detection of the disease or, in other cases, trigger unnecessary invasive tests such as biopsies. We show that constructing exactly optimal personalized screening plans, which minimize the number of screens subject to a tolerance on the detection delay, is computationally intractable. We develop a framework that solves the problem approximately, establish general performance guarantees, and show that the proposed solution is computationally tractable. We apply the framework to breast cancer screening and demonstrate its utility in comparison to existing clinical guidelines.
In the second part of this talk, we develop optimization methods useful for machine learning applications. We focus on the problem of KL divergence estimation. KL divergence is a fundamental quantity used in many disciplines, such as machine learning, statistics, and information theory. We develop an optimization-based approach to estimating the KL divergence that relies on the Donsker-Varadhan representation. The state-of-the-art estimator based on this representation requires solving a non-convex optimization problem and hence is not consistent. We propose a convex reformulation that yields an estimator we prove to be consistent. We also carry out experiments showing that the proposed estimator outperforms the competing estimator.
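For context, the Donsker-Varadhan representation referenced above can be stated in its standard variational form (the talk's specific convex reformulation is not reproduced here):

```latex
% Donsker-Varadhan variational representation of KL divergence:
% the supremum is over (measurable, bounded) functions T on the sample space.
D_{\mathrm{KL}}(P \,\|\, Q)
  \;=\; \sup_{T} \;\Big( \mathbb{E}_{P}\!\left[\, T \,\right]
  \;-\; \log \mathbb{E}_{Q}\!\left[\, e^{T} \,\right] \Big)
```

An estimator is obtained by replacing the two expectations with sample averages over data drawn from $P$ and $Q$ and restricting $T$ to a parametric class; when that class is a neural network, the resulting problem is non-convex, which is the source of the inconsistency the talk addresses.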
Biography: Kartik received his B.Tech.-M.Tech. dual degree in Electrical Engineering from the Indian Institute of Technology Kanpur. He then joined UCLA’s Electrical and Computer Engineering Department for a Ph.D. Kartik’s research focuses on developing optimization, game theory, and machine learning methods. He has applied these methods to problems in resource allocation in engineering and interpretability in machine learning. His research has been featured in an IEEE Spotlight, on IEEE Xplore, and in the IEEE MMTC Letter, and was nominated for the Best Paper Award at IEEE Globecom. He is the recipient of the 2014 Guru Krupa Fellowship and the 2018-19 Dissertation Year Fellowship.
For more information, contact Prof. Gregory J. Pottie (firstname.lastname@example.org)
Date(s) - Aug 29, 2019
11:00 am - 1:00 pm
E-IV Tesla Room #53-125
420 Westwood Plaza - 5th Flr., Los Angeles CA 90095