Event-Driven Visual-Tactile Sensing and Learning for Robots

Citations: 69
Readers (Mendeley): 158

Abstract

This work contributes an event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using NeuTouch and a Prophesee event camera) on two robot tasks: container classification and rotational slip detection. On both tasks, we observe good accuracies relative to standard deep learning methods. We have made our visual-tactile datasets freely available to encourage research on multi-modal event-driven robot perception, which we believe is a promising approach towards intelligent, power-efficient robot systems.

Index Terms: Event-Driven Perception, Multi-Modal Learning, Tactile Sensing, Spiking Neural Networks.
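To make the multi-modal spike-based learning idea concrete, the sketch below shows one way a two-branch spiking classifier over tactile and visual event streams could be wired up in PyTorch. It is a minimal illustration only, not the authors' VT-SNN implementation: the class names (`VTSNNSketch`, `LIFLayer`, `SpikeFn`), the surrogate-gradient leaky integrate-and-fire neuron, and all layer sizes, time steps, and input dimensions are assumptions made for the example, with spike data represented as dense binary tensors of shape (batch, time, features).

```python
# Hypothetical sketch of a two-branch spiking classifier in PyTorch.
# Not the authors' VT-SNN; layer sizes, time steps, and the surrogate-gradient
# LIF neuron below are illustrative assumptions only.
import torch
import torch.nn as nn


class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients only for membrane potentials near the firing threshold.
        return grad_out * (v.abs() < 0.5).float()


class LIFLayer(nn.Module):
    """Fully connected layer of leaky integrate-and-fire neurons."""

    def __init__(self, in_features, out_features, decay=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.decay, self.threshold = decay, threshold

    def forward(self, x):                      # x: (batch, time, in_features)
        batch, steps, _ = x.shape
        v = torch.zeros(batch, self.fc.out_features, device=x.device)
        spikes = []
        for t in range(steps):
            v = self.decay * v + self.fc(x[:, t])        # leaky integration
            s = SpikeFn.apply(v - self.threshold)        # emit spikes
            v = v - s * self.threshold                   # soft reset
            spikes.append(s)
        return torch.stack(spikes, dim=1)      # (batch, time, out_features)


class VTSNNSketch(nn.Module):
    """Tactile and visual spike streams -> per-modality encoders -> fused head."""

    def __init__(self, tact_dim, vis_dim, hidden=64, classes=20):
        super().__init__()
        self.tact = LIFLayer(tact_dim, hidden)
        self.vis = LIFLayer(vis_dim, hidden)
        self.head = LIFLayer(2 * hidden, classes)

    def forward(self, tact_spikes, vis_spikes):
        fused = torch.cat([self.tact(tact_spikes), self.vis(vis_spikes)], dim=-1)
        out = self.head(fused)                 # output spike trains per class
        return out.sum(dim=1)                  # spike counts act as class scores


# Toy usage: random binary "event" tensors standing in for tactile/camera data.
model = VTSNNSketch(tact_dim=78, vis_dim=256)
tact = torch.randint(0, 2, (4, 100, 78)).float()
vis = torch.randint(0, 2, (4, 100, 256)).float()
print(model(tact, vis).shape)                  # torch.Size([4, 20])
```

In this sketch, classification is read out from output spike counts; the actual paper's training setup, loss, and spiking-network framework may differ.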

Cite (APA)

Taunyazov, T., Sng, W., See, H. H., Lim, B., Kuan, J., Ansari, A. F., … Soh, H. (2020). Event-Driven Visual-Tactile Sensing and Learning for Robots. In Robotics: Science and Systems. MIT Press Journals. https://doi.org/10.15607/RSS.2020.XVI.020
