Control of humanoid robot via motion-onset visual evoked potentials

Abstract

This paper investigates controlling humanoid robot behavior via motion-onset-specific N200 potentials. In this study, N200 potentials are elicited by moving a blue bar across robot images that intuitively represent the robot behaviors to be controlled by mind. We present the individual impact of each subject on the N200 potentials and discuss how to handle this individuality to obtain high accuracy. The results show an average off-line accuracy of 93% for hitting targets across five subjects, so we use this major component of the motion-onset visual evoked potential (mVEP) to encode subjects' mental activities and to perform two types of on-line operation tasks: navigating a humanoid robot through an office environment with an obstacle and picking up an object. We also discuss the factors that affect the on-line control success rate and the total time needed to complete an on-line operation task.
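The paradigm summarized above, inferring which robot-behavior image the subject attends to from the N200 component time-locked to the onset of the moving bar, can be illustrated with a simple classification sketch. The snippet below is not the authors' pipeline; it uses synthetic epochs, an assumed N200 window of roughly 180-300 ms after motion onset, and a linear discriminant classifier, purely to show how mean-amplitude features in that window could separate target from non-target epochs.

```python
# Hedged sketch (not the authors' method): target vs. non-target classification
# of motion-onset VEP epochs using mean amplitude in an assumed N200 window.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

RNG = np.random.default_rng(0)

FS = 250                                           # sampling rate in Hz (assumed)
N_CHANNELS = 3                                     # e.g. parieto-occipital channels (assumed)
EPOCH_LEN = int(0.5 * FS)                          # 500 ms epoch after motion onset
N200_WIN = slice(int(0.18 * FS), int(0.30 * FS))   # ~180-300 ms post-onset window

def synth_epoch(is_target: bool) -> np.ndarray:
    """Generate one synthetic epoch of shape (channels, samples)."""
    epoch = RNG.normal(0.0, 5.0, size=(N_CHANNELS, EPOCH_LEN))  # background EEG noise
    if is_target:
        # Target epochs carry an extra negative deflection in the N200 window.
        epoch[:, N200_WIN] -= 4.0
    return epoch

def n200_features(epoch: np.ndarray) -> np.ndarray:
    """Feature vector: mean amplitude per channel inside the N200 window."""
    return epoch[:, N200_WIN].mean(axis=1)

# Build a labelled set of target / non-target epochs.
labels = RNG.integers(0, 2, size=400)
X = np.array([n200_features(synth_epoch(bool(y))) for y in labels])

# Linear discriminant analysis is a common, simple choice for ERP features.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

In an actual mVEP speller or robot-control interface, each on-screen option would be stimulated in turn and the option whose epochs yield the strongest target-like response would be issued as the robot command; the 93% off-line figure reported in the abstract refers to the authors' own feature extraction and classification, not to this sketch.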

Citation (APA)

Li, W., Li, M., & Zhao, J. (2015). Control of humanoid robot via motion-onset visual evoked potentials. Frontiers in Systems Neuroscience, 8, 247. https://doi.org/10.3389/fnsys.2014.00247
