We present a class of algorithms for learning the structure of graphical
models from data. The algorithms are based on a measure known as
the kernel generalized variance (KGV), which essentially allows us
to treat all variables on an equal footing as Gaussians in a feature
space obtained from Mercer kernels. Thus we are able to learn hybrid
graphs involving discrete and continuous variables of arbitrary type.
We explore the computational properties of our approach, showing
how to use the kernel trick to compute the relevant statistics in
linear time. We illustrate our framework with experiments involving
discrete and continuous data.
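As a concrete illustration of the KGV idea, the sketch below estimates a KGV-style dependence measure between two one-dimensional samples. It is a minimal reconstruction, not the authors' implementation: the RBF bandwidth `sigma`, the regularizer `kappa`, and the exact regularization scheme are assumptions, and the naive Gram-matrix computation is cubic time (the linear-time version mentioned in the abstract relies on low-rank approximations that are omitted here). The measure is built from regularized kernel canonical correlations: with centered Gram matrices K1, K2, the block r12 = (K1 + nκI)⁻¹ K1 K2 (K2 + nκI)⁻¹ has singular values that act like canonical correlations, and the KGV-style score is -½ log det(I − r12 r12ᵀ).

```python
import numpy as np

def center(K):
    """Double-center a Gram matrix (feature-space mean removal)."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def rbf_gram(x, sigma=1.0):
    """Gaussian (RBF) Gram matrix for a 1-D sample; sigma is an assumed bandwidth."""
    d = x[:, None] - x[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def kgv_score(x, y, sigma=1.0, kappa=1e-2):
    """KGV-style dependence score between samples x and y (a sketch).

    Computes -1/2 log det(I - r12 r12^T), where r12 is built from
    regularized, centered Gram matrices; larger values indicate
    stronger dependence, and ~0 indicates (near) independence.
    """
    n = len(x)
    K1 = center(rbf_gram(x, sigma))
    K2 = center(rbf_gram(y, sigma))
    # R_i = (K_i + n*kappa*I)^{-1} K_i  (regularized "whitening")
    R1 = np.linalg.solve(K1 + n * kappa * np.eye(n), K1)
    R2 = np.linalg.solve(K2 + n * kappa * np.eye(n), K2)
    # Cross-correlation block; its singular values play the role of
    # regularized kernel canonical correlations rho_i.
    r12 = R1 @ R2.T
    # KGV-style score: -1/2 * sum_i log(1 - rho_i^2), via a log-determinant.
    _, logdet = np.linalg.slogdet(np.eye(n) - r12 @ r12.T)
    return -0.5 * logdet

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y_independent = rng.normal(size=100)
y_dependent = x + 0.1 * rng.normal(size=100)
score_ind = kgv_score(x, y_independent)
score_dep = kgv_score(x, y_dependent)
```

Because both variables enter only through their Gram matrices, the same score applies unchanged to discrete data (via a kernel on the discrete values), which is what lets the framework treat hybrid graphs uniformly.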