Independent component analysis: recent advances


Abstract

Independent component analysis is a probabilistic method for learning a linear transform of a random vector. The goal is to find components that are maximally independent and non-Gaussian (non-normal). Its fundamental difference from classical multivariate statistical methods is in the assumption of non-Gaussianity, which enables the identification of the original, underlying components, in contrast to classical methods. The basic theory of independent component analysis was mainly developed in the 1990s and summarized, for example, in our monograph in 2001. Here, we provide an overview of some recent developments in the theory since the year 2000. The main topics are: analysis of causal relations, testing independent components, analysing multiple datasets (three-way data), modelling dependencies between the components, and improved methods for estimating the basic model.
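The abstract's description of ICA as learning a linear transform of a random vector can be made concrete with a small numerical sketch. The example below is not from the paper: the uniform sources and the mixing matrix are hypothetical choices used only for illustration. It generates two independent non-Gaussian sources, mixes them linearly, and recovers them with scikit-learn's FastICA; because the sources are non-Gaussian, the original components are identifiable up to permutation, sign and scaling.

```python
# Minimal sketch of the basic ICA model x = A s, with illustrative
# (hypothetical) sources and mixing matrix.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Two independent, non-Gaussian (uniform) source signals, n_samples x 2.
n_samples = 2000
s = rng.uniform(-1.0, 1.0, size=(n_samples, 2))

# Observed data x = s A^T for an arbitrary square mixing matrix A.
A = np.array([[1.0, 0.5],
              [0.3, 2.0]])
x = s @ A.T

# Estimate the unmixing transform from the observations alone.
ica = FastICA(n_components=2, random_state=0)
s_hat = ica.fit_transform(x)   # estimated independent components
A_hat = ica.mixing_            # estimated mixing matrix
print(A_hat.round(2))          # matches A up to permutation/sign/scale
```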

Author-supplied keywords

  • blind source separation
  • causal analysis
  • electrical engineering
  • independent component analysis
  • non-Gaussianity
  • pattern recognition
  • statistics

Authors

  • Aapo Hyvärinen
