Estimation of Graphical Models: Convex Formulations and Algorithms

Speaker: Jinchao Li
Affiliation: UCLA Ph.D. Candidate

Abstract:  A Gaussian graphical model is a graph representation of conditional independence relations among Gaussian random variables. A fundamental problem in the estimation of Gaussian graphical models is the selection of the graph topology given relatively small amounts of data. This problem is often solved via L1-regularized maximum likelihood estimation, for which many large-scale convex optimization algorithms have been developed. We consider several extensions of Gaussian graphical models and develop fast algorithms based on convex optimization methods.
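
As background (notation and code not taken from the talk), the L1-regularized maximum likelihood problem referred to above is commonly written as minimizing tr(CX) - log det X + λ Σ_{i≠j} |X_ij| over positive definite matrices X, where C is the sample covariance and X estimates the inverse covariance; zero entries of X correspond to missing edges in the graph. A minimal sketch in Python with CVXPY, using synthetic data and an illustrative value of λ, shows the formulation:

    import numpy as np
    import cvxpy as cp

    # Synthetic data; in practice C is the sample covariance of the observations.
    rng = np.random.default_rng(0)
    C = np.cov(rng.standard_normal((200, 5)), rowvar=False)

    p = C.shape[0]
    lam = 0.1                              # regularization parameter (illustrative)
    X = cp.Variable((p, p), PSD=True)      # estimate of the inverse covariance

    # L1 penalty on the off-diagonal entries only.
    offdiag = np.ones((p, p)) - np.eye(p)
    objective = cp.Minimize(cp.trace(C @ X) - cp.log_det(X)
                            + lam * cp.norm1(cp.multiply(offdiag, X)))
    cp.Problem(objective).solve()
    print(np.round(X.value, 3))            # near-zero entries suggest missing edges

A general-purpose solver like this scales poorly with the problem dimension, which is what motivates the specialized large-scale algorithms mentioned above.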

As a first extension, we consider the restricted sparse inverse covariance selection problem, in which the set of zero entries of the inverse covariance matrix is partially known and an L1-norm penalty is applied to the remaining entries. The proximal Newton method is an attractive algorithm for this problem because its key computations, the evaluation of the gradient and Hessian of the log-likelihood function, can be implemented efficiently with sparse chordal matrix techniques. We analyze the convergence of the inexact proximal Newton method for the penalized maximum likelihood problem; the analysis applies to a wider class of problems with a self-concordant term in the objective. Numerical results indicate that the method reaches high accuracy even with inexact computation of the proximal Newton steps.
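
To make the proximal Newton step concrete (standard notation for this problem class, not necessarily the formulation used in the talk), write the smooth part of the objective as f(X) = tr(CX) - log det X, so that the gradient is ∇f(X) = C - X^{-1} and the Hessian acts as ∇²f(X)[V] = X^{-1} V X^{-1}. One iteration minimizes a second-order model of f plus the unchanged nonsmooth penalty φ:

    V_k = \operatorname*{argmin}_V \; \operatorname{tr}\!\big((C - X_k^{-1})V\big)
          + \tfrac{1}{2}\operatorname{tr}\!\big(X_k^{-1} V X_k^{-1} V\big) + \phi(X_k + V),
    \qquad X_{k+1} = X_k + t_k V_k,

where φ collects the L1 penalty on the free entries (and fixes the entries known to be zero), and t_k is a step size chosen by line search. "Inexact" refers to solving this subproblem only approximately.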

As a second extension, we apply Gaussian graphical models to time series, with a focus on the estimation of multiple time series graphical models that have similar or identical graph structures but different edge coefficient values. We formulate a joint estimation problem for multiple time series graphical models, with a group penalty that couples the edge coefficients across models. We apply the Douglas-Rachford algorithm to solve the joint estimation problem and provide model selection methods for choosing the parameters. Examples with both synthetic and real data (fMRI brain activity and international stock markets) demonstrate the advantage of the joint estimation method.
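
To illustrate the kind of coupling a group penalty provides (the exact penalty in the talk may differ; in particular, the edge coefficients of a time series model are typically vectors rather than scalars), suppose there are K models and let Θ^{(k)}_{ij} denote the coefficient of edge (i, j) in model k. A group-lasso style penalty sums the cross-model norms,

    \lambda \sum_{i \neq j} \Big( \sum_{k=1}^{K} \big\| \Theta^{(k)}_{ij} \big\|_2^2 \Big)^{1/2},

so that an edge tends to be selected or removed in all K models at once, while its coefficient values can still differ across models.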

The last extension generalizes Gaussian graphical models for time series to include latent variables. We illustrate the effect of latent variables on the conditional independence structure and describe a Gaussian graphical model for time series with latent variables. First-order splitting methods, including the primal-dual Douglas-Rachford algorithm and Spingarn’s method, are applied to this problem. Simulations with synthetic data demonstrate how the method recovers the graph topology.
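
For intuition about why latent variables matter (shown here for the static, non-time-series case, which is only an analogue of the model in the talk), marginalizing out hidden variables turns a sparse inverse covariance into a sparse-plus-low-rank one, which leads to convex formulations of the form

    \text{minimize} \;\; \operatorname{tr}\big(C(S - L)\big) - \log\det(S - L)
        + \lambda\big(\gamma\|S\|_1 + \operatorname{tr}(L)\big)
    \quad \text{subject to} \;\; S - L \succ 0, \;\; L \succeq 0,

where the sparsity pattern of S encodes the conditional independence structure among the observed variables and the rank of L reflects the number of latent variables.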

Biography:  Jinchao Li received his B.S. in Electrical Engineering from Zhejiang University, Zhejiang, China, in 2009. He subsequently received an M.S. in Electrical Engineering and an M.S. in Statistics from UCLA in 2010 and 2015, respectively. He is currently a Ph.D. candidate in the Department of Electrical Engineering under the mentorship of Professor Lieven Vandenberghe. His research interests include convex optimization, graphical models, statistical modeling, and machine learning.

For more information, contact Prof. L. Vandenberghe (vandenbe@ee.ucla.edu)

Date/Time:
Date(s) - Nov 23, 2015
12:00 pm - 2:00 pm

Location:
E-IV Tesla Room #53-125
420 Westwood Plaza - 5th Flr., Los Angeles CA 90095