Head pose detection for a wearable parrot-inspired robot based on deep learning

6 Citations · 24 Readers (Mendeley)

Abstract

Extensive research has been conducted on human head pose detection, and several applications for such systems have been identified. Deep-learning-based head pose detection is one such approach: the problem has been studied for several decades, and implementations report high success rates. However, among the many pet robots designed and developed for various needs, wearable pet robots, and head pose detection models for wearable pet robots, are entirely absent. Designing a wearable pet robot capable of head pose detection can open new opportunities for research and development of such systems. In this paper, we present a novel head pose detection system for a wearable parrot-inspired pet robot that uses images taken from the wearer's shoulder. This is the first time head pose detection has been studied in wearable robots and using images from a side angle. We trained an AlexNet convolutional neural network on images from our database for the head pose detection system. Tested on 250 images, the system achieved an accuracy of 94.4% across five head poses: left, left intermediate, straight, right, and right intermediate.
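The reported evaluation (250 test images, five pose classes, 94.4% accuracy) can be sketched as a simple label-matching computation. This is a hypothetical illustration, not the authors' code: the pose names follow the abstract, while the prediction lists below are made-up placeholders chosen only so the arithmetic reproduces the reported figure (236 of 250 correct is 94.4%).

```python
# Five head pose classes named in the abstract.
POSES = ["left", "left intermediate", "straight", "right", "right intermediate"]

def accuracy(predicted, actual):
    """Fraction of test images whose predicted pose matches the ground-truth label."""
    assert len(predicted) == len(actual), "prediction/label count mismatch"
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

# Illustrative placeholder data only: 236 correct out of 250 yields 94.4%.
predicted = ["straight"] * 236 + ["left"] * 14
actual = ["straight"] * 250
print(f"accuracy: {accuracy(predicted, actual):.1%}")  # accuracy: 94.4%
```

In the paper's setting, `predicted` would come from the AlexNet classifier's output over shoulder-view images and `actual` from the annotated database labels.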

Citation (APA)
Bharatharaj, J., Huang, L., Mohan, R. E., Pathmakumar, T., Krägeloh, C., & Al-Jumaily, A. (2018). Head pose detection for a wearable parrot-inspired robot based on deep learning. Applied Sciences (Switzerland), 8(7). https://doi.org/10.3390/app8071081
