Haptic and mixed reality enabled immersive cockpits for tele-operated driving


Abstract

In the last few years, the use of automated guided vehicles (AGVs) and autonomous mobile robots (AMRs) has increased steadily in different verticals such as factories and logistics. However, these vehicles still have technical limitations that hamper their autonomous operation in unpredictable or dynamic environments, requiring them to be supervised and/or controlled by human operators. In such situations, current tele-operated driving (ToD) systems lack the sensory stimulation and spatial perception required to precisely manipulate the AGVs/AMRs, and they also suffer from real-time constraints that limit the accuracy of movement. This chapter describes a proposal to solve these problems by combining low-latency 5G-IoT networks with immersive cockpits equipped with haptic and mixed-reality devices. It also explains how such devices provide intuitive feedback for ToD and facilitate context-aware decision-making. The results are validated in the context of two innovative demonstrations deployed in a seaport environment, where ToD of multiple AGVs/AMRs is supported by a 5G mmWave network infrastructure.

Citation (APA)

Lozano, R., Cantero, M., Fuentes, M., Ruiz, J., Benito, I., & Gomez-Barquero, D. (2023). Haptic and mixed reality enabled immersive cockpits for tele-operated driving. In Shaping the Future of IoT with Edge Intelligence: How Edge Computing Enables the Next Generation of IoT Applications (pp. 301–317). River Publishers. https://doi.org/10.1201/9781032632407-19
