Recognizing Eating from Body-Worn Sensors

  • Mirtchouk M
  • Lustig D
  • Smith A
  • Ching I
  • Zheng M
  • Kleinberg S

Abstract

Automated dietary monitoring solutions that can determine when, what, and how much individuals consume are needed for many applications, such as providing feedback to individuals with chronic disease. Advances in body-worn sensors have led to systems with high accuracy for detecting meals and even identifying which foods are consumed in each bite. However, most evaluations are done in controlled lab settings with restricted meal choices, little background noise, and subjects focused on eating. For these systems to be adopted by users, it is critical that they work well in realistic situations and can handle confounding factors such as background noise, shared meals, and multi-tasking. Work in realistic environments usually yields lower accuracy and faces challenges in determining ground truth. Most critically, there has been a significant gap between lab and free-living environments, compounded by the fact that data are usually collected from different individuals in each setting, making it difficult to determine how the accuracy gap can be closed. We present a multi-modality study of eating recognition using body-worn motion (head, wrists) and audio (earbud microphone) sensors for 12 participants (6 from the lab study, 6 new, to test generalizability). In contrast to the lab, where audio alone had the highest accuracy, we now find that a combination of sensing modalities (audio, motion) is needed, while sensor placement (head vs. wrist) is not critical. We further find that lab data does generalize to other participants, but while personal free-living data improves accuracy, more data from others can actually lead to worse performance.

Citation (APA)

Mirtchouk, M., Lustig, D., Smith, A., Ching, I., Zheng, M., & Kleinberg, S. (2017). Recognizing Eating from Body-Worn Sensors. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1(3), 1–20. https://doi.org/10.1145/3131894
