Eye Controlled Rover for Mapping the Environment

Abstract

This work focuses on implementing several robotic subsystems to create a prototype rover that is controlled by the user's eyeball movement, captured through a webcam, and that maps the surrounding environment. The rover is fitted with a first-person-view camera system that acts as its eye, so the user can deploy it along the desired path. The mapping circuitry consists of a time-of-flight sensor that measures the distance to surrounding objects, and the resulting map is plotted on a sketchpad (Processing software). Because the rover is driven by an autonomous eye-tracking system, the user can control it from a remote location even when the target area cannot be reached in person; the rover and the user thus each have their own eyes to observe the area of interest from a distance. A NodeMCU mounted on the rover receives the control data and feeds it to the motor driver mounted on top of the chassis. The motor driver drives the motors accordingly, rotating the wheels clockwise or anticlockwise as commanded, so that the rover moves forward, turns left or right, or stops.
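
The paper itself does not include code; the following is a minimal sketch of the control pipeline described above, assuming an OpenCV-based webcam eye tracker on the user's side and a NodeMCU that accepts drive commands over Wi-Fi. The endpoint URL (192.168.4.1/cmd), the command names (forward/left/right/stop), and the threshold values are illustrative assumptions, not taken from the paper.

# Sketch of webcam-based eye control (assumed pipeline, not the authors' code).
# Requires: opencv-python, requests. The NodeMCU URL and command names are placeholders.
import cv2
import requests

NODEMCU_URL = "http://192.168.4.1/cmd"   # hypothetical rover endpoint
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def gaze_to_command(eye_gray):
    """Classify the pupil position in a cropped eye image as a drive command."""
    # Dark pixels correspond to the pupil; threshold and take their centroid.
    _, mask = cv2.threshold(eye_gray, 45, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return "stop"                        # no pupil found
    cx = m["m10"] / m["m00"]                 # pupil x-centroid in pixels
    ratio = cx / eye_gray.shape[1]           # 0 = far left, 1 = far right
    if ratio < 0.35:
        return "left"
    if ratio > 0.65:
        return "right"
    return "forward"                         # centred gaze drives forward

cap = cv2.VideoCapture(0)                    # webcam facing the user
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, 1.3, 5)
    cmd = "stop"
    if len(eyes) > 0:
        x, y, w, h = eyes[0]
        cmd = gaze_to_command(gray[y:y + h, x:x + w])
    try:
        # The rover-side NodeMCU is assumed to expose a simple HTTP parameter.
        requests.get(NODEMCU_URL, params={"move": cmd}, timeout=0.2)
    except requests.RequestException:
        pass                                 # ignore transient Wi-Fi drops
    cv2.imshow("eye control", frame)
    if cv2.waitKey(30) & 0xFF == 27:         # Esc quits
        break
cap.release()
cv2.destroyAllWindows()

On the rover side, the NodeMCU would map each received command to the motor-driver input pins, which in turn set the wheel rotation direction; the mapping display would plot the time-of-flight readings against heading in the Processing sketchpad.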

Citation (APA)

Rajendhar, P., Belwin Edward, J., & Badar, A. Q. H. (2021). Eye Controlled Rover for Mapping the Environment. In Lecture Notes in Electrical Engineering (Vol. 700, pp. 3247–3258). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-981-15-8221-9_302
