Localised Mixtures of Experts for Mixture of Regressions

  • Bouchard G

Abstract

In this paper, an alternative to the Mixture of Experts (ME) model, called the localised mixture of experts, is studied. It corresponds to an ME in which the experts are linear regressions and the gating network is a Gaussian classifier. The distribution of the underlying regressors can be assumed to be Gaussian, so that the joint distribution is a Gaussian mixture. This yields a substantial speed-up of the EM algorithm for localised ME. Conversely, when studying Gaussian mixtures with specific constraints, the standard EM algorithm for mixtures of experts can be used to carry out maximum likelihood estimation. Several useful constrained models, and the corresponding modifications to the EM algorithm, are described.
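As a rough illustration of the correspondence the abstract describes (this sketch, its data, and its names are illustrative assumptions, not taken from the paper): fitting a single Gaussian mixture to the joint samples (x, y) with an off-the-shelf EM implementation already yields a localised mixture of experts, because the conditional p(y | x) of a joint Gaussian mixture is a mixture of linear regressions gated by a Gaussian classifier in x.

```python
# Minimal sketch: fit a Gaussian mixture to the joint (x, y) via standard EM,
# then read off the implied localised mixture-of-experts predictor E[y | x].
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy data: two local linear regimes in one input dimension.
x = rng.uniform(-3, 3, size=500)
y = np.where(x < 0, 1.5 * x + 1, -0.5 * x - 1) + 0.2 * rng.standard_normal(500)
Z = np.column_stack([x, y])                      # joint samples (x, y)

K, d_x = 2, 1
gm = GaussianMixture(n_components=K, covariance_type="full").fit(Z)

def predict(x_new):
    """E[y | x] under the joint Gaussian mixture: a localised mixture of
    linear experts with a Gaussian-classifier gating network."""
    x_new = np.atleast_2d(x_new).reshape(-1, d_x)
    gates = np.zeros((len(x_new), K))
    experts = np.zeros((len(x_new), K))
    for k in range(K):
        m, S = gm.means_[k], gm.covariances_[k]
        m_x, m_y = m[:d_x], m[d_x:]
        S_xx, S_yx = S[:d_x, :d_x], S[d_x:, :d_x]
        # Gating: posterior responsibility of component k given x alone.
        gates[:, k] = gm.weights_[k] * multivariate_normal.pdf(x_new, m_x, S_xx)
        # Expert k: the linear regression implied by the joint Gaussian.
        beta = S_yx @ np.linalg.inv(S_xx)
        experts[:, k] = (m_y + (x_new - m_x) @ beta.T).ravel()
    gates /= gates.sum(axis=1, keepdims=True)
    return (gates * experts).sum(axis=1)

print(predict([-2.0, 2.0]))   # roughly [-2, -2] for this toy setup
```

Here the gating weights come from the component posteriors given x only, and each expert is the conditional Gaussian regression of y on x within its component, which is the localised-ME structure the abstract refers to.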

Citation (APA)

Bouchard, G. (2003). Localised Mixtures of Experts for Mixture of Regressions (pp. 155–164). https://doi.org/10.1007/978-3-642-18991-3_18
