Optical lace for synthetic afferent neural networks

Abstract

Whereas vision dominates sensing in robots, animals with limited vision deftly navigate their environment using other forms of perception, such as touch. Efforts have been made to apply artificial skins with tactile sensing to robots for similarly sophisticated mobile and manipulative skills. However, the ability to functionally mimic the afferent sensory neural network, required for distributed sensing and communication networks throughout the body, is still missing. This limitation is partially due to the lack of cointegration of the mechanosensors in the body of the robot. Here, lacings of stretchable optical fibers distributed throughout 3D-printed elastomer frameworks created a cointegrated body, sensing, and communication network. This soft, functional structure could localize deformation with submillimeter positional accuracy (error of 0.71 millimeter) and sub-newton force resolution (~0.3 newton).
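
To make the localization claim concrete, the sketch below shows one way distributed lightguide intensity readings could be mapped to a press location by interpolating over a calibration grid. The channel count, grid spacing, the calibrate/localize helpers, and the inverse-distance weighting are illustrative assumptions for this sketch, not the reconstruction method reported in the paper.

```python
# Hypothetical sketch: localizing a press from distributed lightguide readings.
# The calibration grid, sensor count, and nearest-neighbor matching are
# illustrative assumptions, not the authors' published reconstruction method.
import numpy as np

def calibrate(press_locations, intensity_patterns):
    """Store intensity patterns recorded while pressing at known locations.

    press_locations   : (N, 2) array of x, y press coordinates in millimeters
    intensity_patterns: (N, M) array of photodetector outputs from M lightguides,
                        normalized against the undeformed baseline
    """
    return (np.asarray(press_locations, dtype=float),
            np.asarray(intensity_patterns, dtype=float))

def localize(reading, press_locations, intensity_patterns, k=3):
    """Estimate a press location as the inverse-distance-weighted average of the
    k calibration points whose intensity patterns best match the new reading."""
    d = np.linalg.norm(intensity_patterns - np.asarray(reading, dtype=float), axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-9)            # closer-matching patterns get more weight
    return (w[:, None] * press_locations[nearest]).sum(axis=0) / w.sum()

if __name__ == "__main__":
    # Toy calibration: a 5x5 grid of presses sensed by 4 simulated lightguide channels.
    rng = np.random.default_rng(0)
    mix = 0.05 * rng.normal(size=(2, 4))     # fake press-location -> intensity coupling
    xs, ys = np.meshgrid(np.linspace(0, 40, 5), np.linspace(0, 40, 5))
    locations = np.column_stack([xs.ravel(), ys.ravel()])
    locs, pats = calibrate(locations, np.tanh(locations @ mix))

    probe = np.array([18.0, 22.0])           # "unknown" press, in millimeters
    estimate = localize(np.tanh(probe @ mix), locs, pats)
    print(f"estimated press at {np.round(estimate, 2)} mm (true location {probe} mm)")
```

Running the script prints an estimated press location close to the true one; a physical device would replace the synthetic response with measured photodetector outputs from the embedded lightguides.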

Citation (APA)

Xu, P. A., Mishra, A. K., Bai, H., Aubin, C. A., Zullo, L., & Shepherd, R. F. (2019). Optical lace for synthetic afferent neural networks. Science Robotics, 4(34). https://doi.org/10.1126/scirobotics.aaw6304
