Error-Gated Hebbian Rule: A Local Learning Rule for Principal and Independent Component Analysis

Abstract

We developed a biologically plausible unsupervised learning algorithm, the error-gated Hebbian rule (EGHR)-β, that performs principal component analysis (PCA) and independent component analysis (ICA) in a single-layer feedforward neural network. If parameter β = 1, it extracts the subspace spanned by the major principal components, similar to Oja's subspace rule for PCA. If β = 0, it separates independent sources, similar to the Bell-Sejnowski ICA rule, but without requiring equal numbers of input and output neurons. Unlike these engineering rules, the EGHR-β can be easily implemented in a biological or neuromorphic circuit because it uses only local information available at each synapse. We analytically and numerically demonstrate the reliability of the EGHR-β in extracting and separating major sources from high-dimensional input. By adjusting β, the EGHR-β can extract sources that are missed by the conventional engineering approach that first applies PCA and then ICA. In particular, the proposed rule successfully extracts hidden natural images even in the presence of dominant or non-Gaussian noise components. The results highlight the reliability and utility of the EGHR-β for large-scale parallel computation of PCA and ICA and its future implementation in neuromorphic hardware.
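To make the idea of an error-gated Hebbian update concrete, the sketch below shows one plausible form of such a rule: a standard Hebbian outer product g(u)xᵀ, gated by a global scalar error term that interpolates (via β) between PCA-like and ICA-like behavior. The specific cost G, gain g, and constant E0 used here are illustrative assumptions, not the paper's exact definitions; consult the article for the actual EGHR-β equations.

```python
import numpy as np

def eghr_beta_step(W, x, beta, E0, lr=1e-3):
    """One hypothetical error-gated Hebbian update.

    Assumed form (an assumption for illustration, not the paper's equation):
        dW ∝ (beta * E0 - E(u)) * g(u) x^T,  with u = W x,
    where E(u) = sum_k G(u_k) is a scalar "error" shared by all synapses,
    and g = G' is a pointwise nonlinearity. Each synapse W[i, j] only needs
    its pre- and postsynaptic activities (x[j], u[i]) plus the global scalar
    gate, which is what makes the rule local.
    """
    u = W @ x
    G = np.log(np.cosh(u))   # assumed smooth per-output cost
    g = np.tanh(u)           # its derivative, the Hebbian gain
    E = G.sum()              # global scalar error gating the update
    W += lr * (beta * E0 - E) * np.outer(g, x)
    return W

# Usage sketch: 2 outputs reading a 4-dimensional input stream.
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((2, 4))
for _ in range(100):
    x = rng.standard_normal(4)
    W = eghr_beta_step(W, x, beta=0.0, E0=1.0)
```

Note that, unlike the Bell-Sejnowski rule, nothing here requires W to be square: the output dimension (2) differs from the input dimension (4), consistent with the abstract's claim that equal neuron counts are not needed.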

Citation (APA)

Isomura, T., & Toyoizumi, T. (2018). Error-Gated Hebbian Rule: A Local Learning Rule for Principal and Independent Component Analysis. Scientific Reports, 8(1). https://doi.org/10.1038/s41598-018-20082-0
