Realizing adaptive brain functions subserving perception, cognition, and motor behavior at biological temporal and spatial scales remains out of reach for even the fastest computers. Newly introduced memristive hardware approaches open the opportunity to implement dense, low-power synaptic memories of up to 10^15 bits per square centimeter. Memristors retain the history of their stimulation in their resistive state and require no power to maintain that memory, making them ideal candidates for implementing large arrays of plastic synapses that support learning in neural models. Over the past decades, many learning rules have been proposed in the literature to explain how neural activity shapes synaptic connections to support adaptive behavior. To implement a large variety of learning rules efficiently in hardware, a general, easily parameterized form of learning rule is needed. Such a general learning equation would allow multiple learning rules to be instantiated through different parameterizations, without rewiring the hardware. This paper characterizes a subset of local learning rules amenable to implementation in memristive hardware. The analyzed rules belong to four broad classes: Hebb-rule derivatives with various methods for gating learning and decay; threshold-rule variations, including the covariance and BCM families; input-reconstruction-based learning rules; and explicit temporal-trace-based rules. © 2011 IEEE.
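The rule classes listed above are all local: each weight change depends only on presynaptic activity, postsynaptic activity, and the current weight. As an illustration only (this is a minimal sketch, not the unified equation of the Cog Ex Machina platform; the parameter names eta, pre_theta, post_theta, decay, and decay_gate are invented for this example), the snippet below shows how a single parameterized local update can recover a plain Hebb rule, a gated-decay (instar-like) variant, and a covariance-style threshold rule, with a BCM-style sliding-threshold update noted separately because of its multiplicative postsynaptic term.

```python
# Sketch of a parameterized local learning rule (illustrative assumptions only,
# not the actual Cog Ex Machina formulation):
#   Delta w = eta * [ (x - pre_theta) * (y - post_theta) - decay_term ]
# where x is presynaptic activity, y is postsynaptic activity, w the weight.

def local_update(w, x, y, *, eta=0.01,
                 pre_theta=0.0, post_theta=0.0,
                 decay=0.0, decay_gate="none"):
    """Return Delta w for one synapse.

    decay_gate selects how passive decay is gated:
      "none" -> decay * w       (constant decay)
      "pre"  -> decay * x * w   (presynaptically gated, outstar-like)
      "post" -> decay * y * w   (postsynaptically gated, instar-like)
    """
    hebb = (x - pre_theta) * (y - post_theta)
    if decay_gate == "pre":
        decay_term = decay * x * w
    elif decay_gate == "post":
        decay_term = decay * y * w
    else:
        decay_term = decay * w
    return eta * (hebb - decay_term)


# Example parameterizations (values are arbitrary, for illustration):
x, y, w = 0.8, 0.6, 0.3

# 1. Plain Hebb rule: Delta w ~ x * y
print(local_update(w, x, y))

# 2. Instar-like rule: with zero thresholds and post-gated decay,
#    Delta w = eta * (x*y - y*w) = eta * y * (x - w)
print(local_update(w, x, y, decay=1.0, decay_gate="post"))

# 3. Covariance-style threshold rule: thresholds set to mean activities
x_mean, y_mean = 0.5, 0.5
print(local_update(w, x, y, pre_theta=x_mean, post_theta=y_mean))

# 4. BCM-style rule (outside the simple product form above): the postsynaptic
#    threshold slides with a trace of recent activity, e.g. low-passed y**2.
theta_bcm = 0.4
print(0.01 * x * y * (y - theta_bcm))
```

The point of the sketch is only that gating, thresholds, and decay appear as parameters of one update expression, which is the kind of reparameterization (rather than rewiring) that the abstract argues a general hardware learning equation should support.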
Gorchetchnikov, A., Versace, M., Ames, H., Chandler, B., Léveillé, J., Livitz, G., … Qureshi, M. S. (2011). Review and unification of learning framework in Cog Ex Machina platform for memristive neuromorphic hardware. In Proceedings of the International Joint Conference on Neural Networks (pp. 2601–2608). https://doi.org/10.1109/IJCNN.2011.6033558