Development and Performance Evaluation of a Neural Signal Based Computer Interface

  • Choi C
  • Kim J

Abstract

The use of personal computers has increased dramatically since the 1990s, enabling tremendous achievements in information searching (Internet browsing) and communication (e-mail) around the world. People commonly operate computers through standard interfaces such as the keyboard and mouse, which require physical contact and movement; these interactions inherently involve delicate, coordinated movement of the upper limb, wrist, palm, and fingers. Some people, however, cannot use these interfaces because of physical disabilities such as spinal cord injuries (SCIs), paralysis, and limb amputation. In 2005, the Ministry of Health and Welfare in South Korea estimated that approximately one million people in the country were living with motor disabilities, a number that has risen steadily since 1995. It has also been reported that more than 500,000 individuals are living with SCIs in North America and Europe (Guertin, 2005). If people with disabilities could use computers for tasks such as reading and writing documents, communicating with others, and browsing the Internet, they could carry out a wider range of activities independently.

Alternative methods for providing individuals with disabilities access to computing environments include direct contact with physical keyboards, as shown in Fig. 1 (a), through the use of mouth sticks and head sticks; however, these devices are inaccurate and inconvenient to use. Another notable computer interface is the eye-movement tracking system, shown in Fig. 1 (b). This interface can perform as fast as, or even faster than, a mouse (Sibert & Jacob, 2000). This is because eye gaze supports hand-movement planning (Johansson et al., 2001); signals derived from eye movement therefore arrive sooner than those derived from hand movement.

Eye movements, however, like other passive and non-command inputs (e.g., gestures and conversational speech), are often neither intentional nor conscious. Whenever a user looks at a point on the computer monitor, a command is activated (Jacob, 1993); consequently, the user cannot look anywhere on the monitor without issuing a command, and the eye-movement tracking system thus produces unintended results. Currently, biomedical scientists are making new advances in computer interface technology with the development of a neural-signal-based computer interface that directly bridges the human nervous system and the computer. This neural …
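To make the "Midas touch" problem described above concrete, the following sketch (not from the chapter; the function name and data format are illustrative assumptions) models a naive gaze-driven interface in which every fixation on the monitor is treated as a command. It shows why such a system cannot distinguish an incidental glance from a deliberate selection.

```python
# Illustrative sketch of a naive gaze-as-command interface.
# Assumption (not from the chapter): fixations arrive as (x, y)
# coordinates on the monitor, and the interface maps each one
# directly to a click command.

def gaze_to_commands(fixations):
    """Map every gaze fixation to a command.

    fixations: list of (x, y) gaze points on the monitor.
    Returns the list of commands the interface would issue.
    """
    # Every fixation becomes a click -- the user cannot merely
    # look at a point without also "pressing" it.
    return [("click", x, y) for (x, y) in fixations]

# An incidental glance at (10, 20) is indistinguishable from an
# intentional selection at (300, 400): both produce commands.
print(gaze_to_commands([(10, 20), (300, 400)]))
```

Because the mapping from gaze to command is unconditional, unintended activations are built into the design; this is the limitation that motivates interfaces driven by intentional signals, such as the neural-signal-based approach the chapter develops.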

Citation (APA)

Choi, C., & Kim, J. (2010). Development and Performance Evaluation of a Neural Signal Based Computer Interface. In Human-Robot Interaction. InTech. https://doi.org/10.5772/8136
