Data-Driven Multi-modal Haptic Rendering Combining Force, Tactile, and Thermal Feedback


Abstract

We introduce a data-driven multi-modal haptic rendering system that simultaneously presents force, tactile, and thermal feedback. To deliver the three modalities together, a vibration actuator and a Peltier module are attached to a force-feedback device. Several haptic properties of an object (shape, texture, friction, and viscoelasticity) are considered as components of force rendering. For tactile feedback, we combine contact transients and texture vibrations produced when the user contacts and explores a surface. The thermal sensation between the skin and an object is rendered by considering both heat flux and the initial temperatures of the object and the skin. Rendering models for all the modalities are built in a data-driven manner from data collected during real interactions. We expect that our multi-modal rendering system improves the realism of haptic sensation in virtual environments.
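The role of the initial temperatures mentioned above can be illustrated with the classic semi-infinite-body contact model, in which the interface temperature at first touch is a weighted average of the two initial temperatures, weighted by thermal effusivity. This is a generic textbook sketch, not the paper's data-driven thermal model, and the material values below are rough illustrative figures:

```python
def contact_temperature(t_skin, t_obj, e_skin, e_obj):
    """Interface temperature when two semi-infinite bodies touch.

    Each effusivity e = sqrt(k * rho * c) summarizes how strongly a
    material pulls the contact temperature toward its own initial
    temperature. High-effusivity objects (metals) dominate the average,
    which is why metal at room temperature feels cold to the skin.
    """
    return (e_skin * t_skin + e_obj * t_obj) / (e_skin + e_obj)

# Illustrative values: skin at 33 C, object at 23 C.
# Effusivities in J/(m^2 K s^0.5), approximate.
e_skin = 1000.0       # roughly human skin
e_aluminum = 24000.0  # roughly aluminum

t_c = contact_temperature(33.0, 23.0, e_skin, e_aluminum)
print(round(t_c, 1))  # interface temperature stays near the metal's 23 C
```

The gap between the skin's initial temperature and this interface temperature drives the heat flux that a Peltier module would be commanded to reproduce.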

Citation (APA)

Cho, S., Choi, H., Shin, S., & Choi, S. (2019). Data-Driven Multi-modal Haptic Rendering Combining Force, Tactile, and Thermal Feedback. In Lecture Notes in Electrical Engineering (Vol. 535, pp. 69–74). Springer Verlag. https://doi.org/10.1007/978-981-13-3194-7_15
