Multi-User Egocentric Online System for Unsupervised Assistance on Object Usage

  • Damen D
  • Haines O
  • Leelasawassuk T
  • Calway A
  • Mayol-Cuevas W

Abstract

We present an online, fully unsupervised approach for automatically extracting video guides of how objects are used from wearable gaze trackers worn by multiple users. Given egocentric video and eye gaze from multiple users performing tasks, the system discovers task-relevant objects and automatically extracts guidance videos on how these objects have been used. In the assistive mode, the paper proposes a method for selecting a suitable video guide to be displayed to a novice user, indicating how to use an object, purely triggered by the user's gaze. The approach is tested on a variety of daily tasks ranging from opening a door to preparing coffee and operating a gym machine.
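
The assistive mode described in the abstract selects a guide purely from where the wearer is looking. Below is a minimal illustrative sketch of that idea, assuming the discovered task-relevant objects are available as 2D image regions with an associated guide clip and that gaze arrives as a frame coordinate; the names and data layout are hypothetical and not the paper's implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ObjectRegion:
    """A discovered task-relevant object in the current frame (hypothetical layout)."""
    name: str                            # e.g. "coffee machine"
    bbox: Tuple[int, int, int, int]      # (x_min, y_min, x_max, y_max) in pixels
    guide_video: str                     # path to the extracted usage-guide clip

def select_guide(gaze_xy: Tuple[int, int],
                 objects: List[ObjectRegion]) -> Optional[str]:
    """Return the guide video for the object the user is fixating, or None
    if the gaze point falls on no known object region."""
    gx, gy = gaze_xy
    for obj in objects:
        x0, y0, x1, y1 = obj.bbox
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return obj.guide_video
    return None

# Hypothetical usage: two objects learned from earlier users' recordings.
objects = [
    ObjectRegion("door handle", (100, 200, 180, 260), "guides/door.mp4"),
    ObjectRegion("coffee machine", (300, 120, 520, 400), "guides/coffee.mp4"),
]
print(select_guide((350, 200), objects))  # -> guides/coffee.mp4
```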

Cite

APA

Damen, D., Haines, O., Leelasawassuk, T., Calway, A., & Mayol-Cuevas, W. (2015). Multi-User Egocentric Online System for Unsupervised Assistance on Object Usage (pp. 481–492). https://doi.org/10.1007/978-3-319-16199-0_34
