Neuromorphic computing with Spiking Neural Networks (SNNs) has been proposed as an alternative paradigm for future computation that overcomes the memory bottleneck of conventional computer architectures. Various spike codings have been discussed to improve data transfer and data processing in neuro-inspired computing paradigms. Choosing an appropriate network topology can yield better performance in computation, recognition, and classification. The neuron model is another important factor in designing and implementing SNN systems; simulation and implementation speed, ease of integration with the other elements of the network, and suitability for scalable networks are the criteria for selecting one. Learning algorithms, which modify the synaptic weights, are a significant consideration in training the network. Learning in neuromorphic architectures can be improved both by improving the quality of the artificial synapse and by refining learning algorithms such as spike-timing-dependent plasticity (STDP). In this chapter we propose a new synapse box that can remember and forget. Furthermore, since STDP is the most frequently used unsupervised training method in SNNs, we analyze and review its variants: the temporal order of pre- and postsynaptic spikes across a synapse within a time window defines the different STDP rules, and the rule used for weight modification depends on the importance of stability and on whether Hebbian or anti-Hebbian competition is desired. Finally, we survey the most significant projects that have produced neuromorphic platforms and introduce the advantages and disadvantages of each platform.
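The abstract refers to STDP-based weight modification driven by the relative timing of pre- and postsynaptic spikes. As a minimal illustration of the idea (not the chapter's specific rule), the standard pair-based STDP update can be sketched as follows; all parameter values here are illustrative assumptions:

```python
import math

def stdp_delta_w(delta_t_ms, a_plus=0.1, a_minus=0.12,
                 tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change (illustrative parameters).

    delta_t_ms = t_post - t_pre in milliseconds.
    Pre before post (delta_t > 0) -> potentiation (Hebbian);
    post before pre (delta_t < 0) -> depression.
    """
    if delta_t_ms > 0:
        return a_plus * math.exp(-delta_t_ms / tau_plus)
    elif delta_t_ms < 0:
        return -a_minus * math.exp(delta_t_ms / tau_minus)
    return 0.0

# Pre fires 5 ms before post: weight increases.
dw_pot = stdp_delta_w(5.0)    # positive
# Post fires 5 ms before pre: weight decreases.
dw_dep = stdp_delta_w(-5.0)   # negative
```

An anti-Hebbian variant simply swaps the signs of the two branches; which form is appropriate depends on the stability and competition requirements discussed in the chapter.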
Citation:
Shahsavari, M., Devienne, P., & Boulet, P. (2019). Spiking neural computing in memristive neuromorphic platforms. In Handbook of Memristor Networks (pp. 961–728). Springer International Publishing. https://doi.org/10.1007/978-3-319-76375-0_25