Perceiving Humans: From Monocular 3D Localization to Social Distancing

Abstract

Perceiving humans in the context of Intelligent Transportation Systems (ITS) often relies on multiple cameras or expensive LiDAR sensors. In this work, we present a new cost-effective vision-based method that perceives humans' locations in 3D and their body orientation from a single image. We address the challenges of this ill-posed monocular 3D task by proposing a neural network architecture that predicts confidence intervals instead of point estimates. Our neural network estimates human 3D body locations and orientations with a measure of uncertainty. Our proposed solution (i) is privacy-safe, (ii) works with any fixed or moving camera, and (iii) does not rely on ground-plane estimation. We demonstrate the performance of our method on three applications: locating humans in 3D, detecting social interactions, and verifying compliance with the safety measures introduced in response to the COVID-19 outbreak. We show that it is possible to rethink the concept of 'social distancing' as a form of social interaction rather than a simple location-based rule. In the spirit of open science, we publicly share the source code.
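
The two ideas highlighted in the abstract, predicting a confidence interval rather than a point estimate for monocular distance, and treating 'social distancing' as an interaction (proximity plus body orientation) rather than a purely location-based rule, can be illustrated with a short sketch. This is not the authors' implementation: the Laplace negative log-likelihood, the helper names (laplace_nll, is_social_interaction, _wrap) and the thresholds MAX_DISTANCE_M and MAX_FACING_DEG are assumptions made for illustration only.

```python
import math

def laplace_nll(d_true: float, mu: float, log_b: float) -> float:
    """Negative log-likelihood of a Laplace distribution (illustrative).

    mu is a predicted distance and exp(log_b) its predicted spread;
    minimizing this jointly learns the estimate and its uncertainty,
    from which a confidence interval (e.g. mu +/- 2*b) can be read off.
    """
    b = math.exp(log_b)
    return abs(d_true - mu) / b + math.log(2.0 * b)

MAX_DISTANCE_M = 2.0   # hypothetical proximity threshold (meters)
MAX_FACING_DEG = 40.0  # hypothetical mutual-orientation threshold (degrees)

def _wrap(angle: float) -> float:
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))

def is_social_interaction(p1, yaw1, p2, yaw2) -> bool:
    """Flag a pair as interacting if they are close AND roughly facing
    each other, rather than merely close (a location-only rule).

    p1, p2 are (x, z) ground-plane positions in meters; yaw1, yaw2 are
    body orientations in radians, measured in the same x-z frame.
    """
    dx, dz = p2[0] - p1[0], p2[1] - p1[1]
    if math.hypot(dx, dz) > MAX_DISTANCE_M:
        return False
    # Angle between each person's heading and the direction of the other.
    err1 = abs(math.degrees(_wrap(math.atan2(dz, dx) - yaw1)))
    err2 = abs(math.degrees(_wrap(math.atan2(-dz, -dx) - yaw2)))
    return err1 < MAX_FACING_DEG and err2 < MAX_FACING_DEG

if __name__ == "__main__":
    # Two people 1.5 m apart, turned toward each other: flagged as interacting.
    print(is_social_interaction((0.0, 0.0), 0.0, (1.5, 0.0), math.pi))
```

In this sketch, the learned spread b yields a per-person confidence interval around the estimated distance, and two people are flagged as interacting only when they are both near each other and mutually oriented, which is the distinction the abstract draws between social interaction and a simple location-based rule.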

Citation (APA)

Bertoni, L., Kreiss, S., & Alahi, A. (2022). Perceiving Humans: From Monocular 3D Localization to Social Distancing. IEEE Transactions on Intelligent Transportation Systems, 23(7), 7401–7418. https://doi.org/10.1109/TITS.2021.3069376
