Wirtinger calculus based gradient descent and Levenberg-Marquardt learning algorithms in complex-valued neural networks

Abstract

Complex-valued neural networks (CVNNs) involve nonholomorphic functions in two ways: (i) through their loss functions and (ii) through the widely used activation functions. The derivatives of such functions are defined by Wirtinger calculus. In this paper, we derive two popular algorithms, gradient descent and the Levenberg-Marquardt (LM) algorithm, for parameter optimization in feedforward CVNNs using Wirtinger calculus, which is simpler than the conventional derivation that treats the problem in the real domain. In deriving the LM algorithm, we solve and use the result of a least squares problem in the complex domain, min_z ∥b − (Az + Bz*)∥, which is more general than min_z ∥b − Az∥. Computer simulation results show that, as in the real-valued case, the complex LM algorithm provides much faster learning with higher accuracy than the complex gradient descent algorithm. © 2011 Springer-Verlag.
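To make the two ingredients in the abstract concrete, here is a minimal NumPy sketch, not taken from the paper: the scalar toy loss, the learning rate eta, and all function names are illustrative assumptions. It shows (a) one Wirtinger-calculus gradient descent step for a complex parameter, where the steepest-descent direction of a real loss is given by the conjugate Wirtinger derivative, and (b) a solver for the widely linear least squares problem min_z ∥b − (Az + Bz*)∥ via an equivalent real system.

```python
import numpy as np

def wirtinger_gradient_step(w, z, t, eta=0.1):
    """One gradient step for the toy loss L = |t - w*z|^2 (illustrative).

    For a real-valued loss L(w, w*), steepest descent follows the
    conjugate Wirtinger derivative dL/dw*; here dL/dw* = -(t - wz) z*.
    """
    e = t - w * z                  # complex error
    grad_conj = -e * np.conj(z)    # dL/dw*, the Wirtinger derivative w.r.t. w*
    return w - eta * grad_conj

def widely_linear_lstsq(A, B, b):
    """Solve min_z ||b - (A z + B conj(z))|| by a real reformulation.

    With z = x + iy, one gets A z + B conj(z) = (A+B) x + i (A-B) y,
    which is linear in (x, y), so the problem reduces to an ordinary
    real least squares system over the stacked vector [x; y].
    """
    n = A.shape[1]
    # Real/imaginary parts of (A+B) x + i (A-B) y give the block system:
    M = np.block([
        [(A + B).real, -(A - B).imag],
        [(A + B).imag,  (A - B).real],
    ])
    rhs = np.concatenate([b.real, b.imag])
    xy, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return xy[:n] + 1j * xy[n:]
```

The real reformulation works because Az + Bz* is linear in the real and imaginary parts of z, and the squared complex residual norm equals the squared norm of the stacked real residual; setting B = 0 recovers the ordinary problem min_z ∥b − Az∥.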

Citation (APA)

Amin, M. F., Amin, M. I., Al-Nuaimi, A. Y. H., & Murase, K. (2011). Wirtinger calculus based gradient descent and Levenberg-Marquardt learning algorithms in complex-valued neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7062 LNCS, pp. 550–559). https://doi.org/10.1007/978-3-642-24955-6_66
