Robotic and virtual reality BCIs using spatial tactile and auditory oddball paradigms


Abstract

The paper reviews nine robotic and virtual reality (VR) brain-computer interface (BCI) projects developed by the author, in collaboration with his graduate students, within the BCI-lab research group during its association with the University of Tsukuba, Japan. The nine novel approaches are discussed as applied to direct brain-robot and brain-virtual-reality-agent control interfaces using tactile and auditory BCI technologies. The BCI user's intentions are decoded from brainwaves in real time using non-invasive electroencephalography (EEG) and translated into thought-based control commands for a symbiotic robot or virtual reality agent. A communication protocol between the BCI output and the robot or the virtual environment is realized in a symbiotic communication scenario using a user datagram protocol (UDP), which constitutes an internet of things (IoT) control scenario. Results obtained from healthy users reproducing simple brain-robot and brain-virtual-agent control tasks in online experiments support the research goal of interacting with robotic devices and virtual reality agents through symbiotic thought-based BCI technologies. An offline BCI classification accuracy boosting method, using a previously proposed information-geometry-derived approach, is also discussed in order to further support the reviewed robotic and virtual reality thought-based control paradigms.
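The UDP link between the BCI classifier output and the robot or virtual agent can be sketched as below. This is a minimal illustration only: the JSON message schema, the port number, and the function names are assumptions for the sketch, not the paper's actual protocol.

```python
import json
import socket


def send_bci_command(command: str, host: str = "127.0.0.1", port: int = 9000) -> bytes:
    """Encode a decoded BCI intention as JSON and send it over UDP.

    NOTE: the {"cmd": ...} schema and default port are illustrative
    assumptions, not the protocol described in the paper.
    """
    payload = json.dumps({"cmd": command}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
    return payload


def receive_bci_command(sock: socket.socket) -> str:
    """Block on a bound UDP socket and decode one command message."""
    data, _addr = sock.recvfrom(1024)
    return json.loads(data.decode("utf-8"))["cmd"]
```

Because UDP is connectionless, the robot or virtual-environment side simply binds a socket and polls for the latest decoded intention, which fits the fire-and-forget nature of streaming classifier outputs in an IoT-style control loop.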

Citation (APA)

Rutkowski, T. M. (2016). Robotic and virtual reality BCIs using spatial tactile and auditory oddball paradigms. Frontiers in Neurorobotics. Frontiers Research Foundation. https://doi.org/10.3389/fnbot.2016.00020
