Graphical Model Selection for Gaussian Conditional Random Fields in the Presence of Latent Variables

Abstract

We consider the problem of learning a conditional Gaussian graphical model in the presence of latent variables. Building on recent advances in this field, we suggest a method that decomposes the parameters of a conditional Markov random field into the sum of a sparse and a low-rank matrix. We derive convergence bounds for this estimator and show that it is well-behaved in the high-dimensional regime as well as “sparsistent” (i.e., capable of recovering the graph structure). We then show how proximal gradient algorithms and semi-definite programming techniques can be employed to fit the model to thousands of variables. Through extensive simulations, we illustrate the conditions required for identifiability and show that there is a wide range of situations in which this model performs significantly better than its counterparts, for example, by accommodating more latent variables. Finally, the suggested method is applied to two datasets comprising individual-level data on genetic variants and metabolite levels. We show that our results replicate better than those of alternative approaches and exhibit enriched biological signal. Supplementary materials for this article are available online.
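To make the sparse-plus-low-rank decomposition concrete, below is a minimal, self-contained sketch, not the authors' implementation. It illustrates a generic proximal-gradient scheme for splitting a matrix into a sparse part (l1 penalty, soft-thresholding) and a low-rank part (nuclear-norm penalty, singular-value thresholding), using a toy least-squares objective in place of the conditional Gaussian likelihood that the paper actually optimizes. All function names, penalty values, and the synthetic data are illustrative assumptions.

```python
# Illustrative sketch only: generic proximal-gradient updates for a
# sparse + low-rank split of a matrix M, minimizing
#   0.5 * ||S + L - M||_F^2 + lam_s * ||S||_1 + lam_l * ||L||_*
# The paper's estimator instead works with a conditional Gaussian
# likelihood and is fitted with proximal-gradient / SDP methods.
import numpy as np


def soft_threshold(A, tau):
    """Entrywise soft-thresholding: proximal operator of tau * ||.||_1."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)


def svd_threshold(A, tau):
    """Singular-value soft-thresholding: proximal operator of tau * ||.||_*."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt


def sparse_plus_low_rank(M, lam_s=0.1, lam_l=0.5, step=1.0, n_iter=200):
    """Alternating proximal-gradient updates for S (sparse) and L (low rank)."""
    S = np.zeros_like(M)
    L = np.zeros_like(M)
    for _ in range(n_iter):
        grad_S = S + L - M                      # gradient of the smooth term in S
        S = soft_threshold(S - step * grad_S, step * lam_s)
        grad_L = S + L - M                      # gradient of the smooth term in L
        L = svd_threshold(L - step * grad_L, step * lam_l)
    return S, L


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p, r = 50, 2
    low_rank = rng.normal(size=(p, r)) @ rng.normal(size=(r, p))
    sparse = np.where(rng.random((p, p)) < 0.02, rng.normal(size=(p, p)), 0.0)
    M = sparse + low_rank + 0.01 * rng.normal(size=(p, p))
    S_hat, L_hat = sparse_plus_low_rank(M)
    print("rank of L_hat:", np.linalg.matrix_rank(L_hat, tol=1e-6))
    print("nonzeros in S_hat:", int((np.abs(S_hat) > 1e-8).sum()))
```

In this toy setting the sparse part would correspond to the direct conditional dependencies among observed variables and the low-rank part to the effect of a small number of latent variables, which is the intuition behind the decomposition described in the abstract.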

Cite (APA)
Frot, B., Jostins, L., & McVean, G. (2019). Graphical Model Selection for Gaussian Conditional Random Fields in the Presence of Latent Variables. Journal of the American Statistical Association, 114(526), 723–734. https://doi.org/10.1080/01621459.2018.1434531
