W!NCE

  • Rostaminia S
  • Lamson A
  • Maji S
  • Rahman T
  • Ganesan D
Citations: N/A
Readers (Mendeley): 15

Abstract

The ability to unobtrusively and continuously monitor one's facial expressions has implications for a variety of application domains, ranging from affective computing to healthcare and the entertainment industry. The standard Facial Action Coding System (FACS) along with camera-based methods has been shown to provide objective indicators of facial expressions; however, these approaches can be fairly limited for mobile applications due to privacy concerns and awkward positioning of the camera. To bridge this gap, W!NCE re-purposes a commercially available Electrooculography-based eyeglass (J!NS MEME) to continuously and unobtrusively sense upper facial action units with high fidelity. W!NCE detects facial gestures using a two-stage processing pipeline involving motion artifact removal and facial action detection. We validate our system's applicability through extensive evaluation on data from 17 users under stationary and ambulatory settings, a pilot study for continuous pain monitoring, and several performance benchmarks. Our results are very encouraging, showing that we can detect five distinct facial action units with a mean F1 score of 0.88 in stationary and 0.82 in ambulatory settings, and that we can accurately detect facial gestures that occur due to pain.
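The two-stage pipeline described above can be sketched as follows. This is a minimal illustration only: the moving-average high-pass filter and the simple amplitude-threshold detector are placeholder assumptions, not the paper's actual artifact-removal or classification models.

```python
# Illustrative two-stage pipeline for EOG-style signals:
#   Stage 1: motion-artifact removal (here: subtract a moving-average
#            baseline, a stand-in for the paper's method).
#   Stage 2: facial-action detection (here: a placeholder amplitude
#            threshold, a stand-in for the paper's classifier).

def remove_motion_artifacts(signal, window=5):
    """Suppress slow drift by subtracting a trailing moving average."""
    cleaned = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        baseline = sum(signal[lo:i + 1]) / (i + 1 - lo)
        cleaned.append(signal[i] - baseline)
    return cleaned

def detect_action_units(cleaned, threshold=0.8):
    """Flag samples whose cleaned amplitude exceeds a fixed threshold."""
    return [abs(x) > threshold for x in cleaned]

# Example: slow drift with one sharp spike standing in for a facial gesture.
raw = [0.0, 0.1, 0.2, 0.3, 2.0, 0.5, 0.6]
cleaned = remove_motion_artifacts(raw)
events = detect_action_units(cleaned)
```

In this toy trace only the spike at index 4 survives baseline removal and crosses the threshold; a real system would replace the threshold stage with a trained per-action-unit classifier.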

Citation (APA)
Rostaminia, S., Lamson, A., Maji, S., Rahman, T., & Ganesan, D. (2019). W!NCE. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 3(1), 1–26. https://doi.org/10.1145/3314410
