Haptic user interface integration for 3D game engines

Abstract

The human senses of touch provide important information about the environment; when they are combined with eyesight, we may obtain all the necessary information about our surroundings. In human-computer interaction, visual information is provided by displays, while touch sensations are conveyed by special devices called "haptic" devices. Haptic devices are used in many fields, such as computer-aided design, remote surgery, medical simulation environments, and training simulators for both military and medical applications. Besides touch sensations, haptic devices also provide force feedback, which allows realistic environments to be designed in virtual reality applications. Haptic devices can be categorized into three classes: tactile devices, kinesthetic devices, and hybrid devices. Tactile devices stimulate the skin to create contact sensations, kinesthetic devices apply forces to guide or inhibit body movement, and hybrid devices attempt to combine tactile and kinesthetic feedback. Among these, kinesthetic devices, which exert controlled forces on the human body, are the most suitable type for applications such as surgical simulation.

In educational settings that require skill-based improvement, the senses of touch are very important. In some cases, providing such an educational environment is expensive and risky, and may also raise ethical issues. Surgical education is one of these fields: traditional training takes place in the operating room on real patients, which is very expensive, requires long periods of time, does not allow trial-and-error learning, is stressful for both educators and learners, and raises several ethical concerns. Simulation environments supported by haptic user interfaces provide a safer educational alternative, and several studies show evidence of the educational benefits of this type of training (Tsuda et al., 2009; Sutherland et al., 2006). Similarly, this technology can also be successfully integrated into the physical rehabilitation of conditions requiring motor-skill improvement (Kampiopiotis & Theodorakou, 2003). Hence, simulation environments today offer opportunities for creating low-cost and more effective training and educational settings.

Combining three-dimensional (3D) simulation environments with haptic interfaces is an important step for advancing human-computer interaction. Haptic devices alone, however, do not provide a full simulation environment; the interaction must be supported by software. Game engines offer great flexibility for creating 3D simulation environments, and Unity3D is one such tool, providing both a game engine and a physics engine. Many studies combine these two technologies to create educational and training environments, but few describe how they can be integrated to build a simulation environment that also provides a haptic interface. Several issues need to be handled to create such an integration. First, the haptic device's control libraries need to be integrated into the game engine. Second, the game engine's simulation representations and real-time interaction features need to be coordinated with the haptic device's degrees of freedom and its force-feedback rate and capabilities. This study presents the architecture used to integrate the Unity3D game engine with the PHANToM haptic device to create a surgical education simulation environment. The methods used to build this integration and to handle the synchronization problems are described, together with the algorithms developed to improve synchronization and user feedback, such as providing a smooth feel and force feedback for the haptic interaction. We believe this study will be helpful for those creating simulation environments with Unity3D technology and PHANToM haptic interfaces. © 2014 Springer International Publishing.
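The paper itself details the integration architecture; as a rough illustration of the first issue above (bringing a haptic control library into the game engine), a native wrapper around the device's C/C++ library can be exposed to Unity3D scripts through P/Invoke. The sketch below assumes a hypothetical wrapper named HapticPlugin with made-up exported functions; it is not the authors' implementation.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Minimal sketch of bridging a native haptic wrapper into a Unity script.
// "HapticPlugin" and its exported functions are hypothetical placeholders
// for a C/C++ wrapper built around the vendor's haptic control library.
public class HapticBridge : MonoBehaviour
{
    [DllImport("HapticPlugin")] private static extern bool InitHapticDevice();
    [DllImport("HapticPlugin")] private static extern void ShutdownHapticDevice();
    [DllImport("HapticPlugin")] private static extern void GetProxyPosition(double[] xyz);

    private readonly double[] proxy = new double[3];

    void Start()
    {
        if (!InitHapticDevice())
            Debug.LogError("Haptic device could not be initialized.");
    }

    void Update()
    {
        // Read the stylus (proxy) position once per rendered frame and map it
        // into the Unity scene; the scale factor and axis conventions here are
        // arbitrary and depend on the scene setup.
        GetProxyPosition(proxy);
        transform.position = new Vector3((float)proxy[0], (float)proxy[1], (float)proxy[2]) * 0.001f;
    }

    void OnDestroy()
    {
        ShutdownHapticDevice();
    }
}
```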
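The second issue, synchronizing the high-rate haptic servo loop (typically around 1 kHz) with Unity's much slower frame rate while keeping the force feedback smooth, can be handled in several ways; the abstract does not specify the authors' algorithm. One generic pattern is to pass force commands from the Unity update loop to the haptic thread through a small thread-safe buffer and low-pass filter them, as sketched below with an exponential filter of our own choosing.

```csharp
using UnityEngine;

// Illustrative smoothing of force commands handed from Unity's frame-rate
// loop to a high-rate haptic servo thread. The exponential filter and its
// coefficient are generic choices, not the algorithm described in the paper.
public class ForceSmoother
{
    private Vector3 filtered = Vector3.zero;
    private readonly float alpha;              // 0..1, higher = less smoothing
    private readonly object gate = new object();

    public ForceSmoother(float alpha = 0.2f)
    {
        this.alpha = alpha;
    }

    // Called from Unity's update/physics loop with the raw contact force.
    public void Push(Vector3 rawForce)
    {
        lock (gate)
        {
            filtered = Vector3.Lerp(filtered, rawForce, alpha);
        }
    }

    // Called from the haptic servo thread; returns the latest smoothed value
    // so the 1 kHz loop never waits on the rendering frame rate.
    public Vector3 Latest()
    {
        lock (gate)
        {
            return filtered;
        }
    }
}
```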

Citation (APA)

Sengul, G., Çağıltay, N. E., Özçelik, E., Tuner, E., & Erol, B. (2014). Haptic user interface integration for 3D game engines. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8512 LNCS, pp. 654–662). Springer Verlag. https://doi.org/10.1007/978-3-319-07227-2_62
