Joint Learning of Modular Structures from Multiple Data Types

  • Azizi, E.

Abstract

A commonly used technique for understanding underlying dependency structures among objects is the module networks model of Segal et al., which assumes a shared conditional probability distribution for the objects within each module. However, learning structures from object variables alone can lead to spurious dependencies, and imposing structural assumptions may be required to avoid over-fitting. We propose an extended model, inspired by module networks and stochastic blockmodels, for jointly learning structures from observed object variables (e.g. gene expression in gene regulatory networks) and relational data among objects (e.g. protein-DNA interactions). By integrating these complementary data types, we avoid additional structural assumptions. We illustrate the theoretical and practical significance of the model and develop a reversible-jump MCMC learning procedure for inferring modules and model parameters. We demonstrate the accuracy and scalability of our method on synthetic and genomic datasets.
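The abstract describes scoring a module assignment with two complementary terms: a module-network term, in which objects within a module share one conditional distribution over their observed variables, and a stochastic-blockmodel term, in which the probability of a pairwise interaction depends only on the two objects' modules. The sketch below is not the paper's model; it is a toy illustration of that joint score, assuming unit-variance Gaussian module distributions and Bernoulli edge probabilities, with all function and parameter names hypothetical.

```python
import math

def joint_log_likelihood(expr, edges, assign, module_means, block_p):
    """Toy joint score for a module assignment `assign`:
    a Gaussian module-network term on expression values `expr`
    plus a Bernoulli blockmodel term on the interaction set `edges`."""
    ll = 0.0
    # Module-network term: objects in the same module share a
    # distribution (here, a unit-variance Gaussian around the
    # module's mean expression).
    for i, x in enumerate(expr):
        mu = module_means[assign[i]]
        ll += -0.5 * (x - mu) ** 2 - 0.5 * math.log(2 * math.pi)
    # Blockmodel term: the probability of an edge between two
    # objects depends only on their module memberships.
    n = len(expr)
    for i in range(n):
        for j in range(i + 1, n):
            p = block_p[assign[i]][assign[j]]
            ll += math.log(p) if (i, j) in edges else math.log(1 - p)
    return ll
```

A sampler such as the reversible-jump MCMC procedure mentioned in the abstract would compare scores like this across candidate assignments (and across numbers of modules); here, an assignment consistent with both data types scores higher than one consistent with neither.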

Citation (APA)

Azizi, E. (2013). Joint Learning of Modular Structures from Multiple Data Types. In NIPS Workshop of Frontiers of Network Analysis: Methods, Models, and Applications. Lake Tahoe, NV. Retrieved from http://snap.stanford.edu/networks2013/
