Abstract
We present the multisensor-pipeline (MSP), a lightweight, flexible, and extensible framework for prototyping multimodal-multisensor interfaces based on real-time sensor input. Our open-source framework (available on GitHub) enables researchers and developers to integrate multiple sensors or other data streams via source modules, to add stream- and event-processing capabilities via processor modules, and to connect user interfaces or databases via sink modules in a graph-based processing pipeline. The framework is implemented in Python with few dependencies, which enables a quick setup, execution across multiple operating systems, and direct access to cutting-edge machine learning libraries and models. We showcase the functionality and capabilities of MSP through a sample application that connects a mobile eye tracker, classifies image patches around the user's fixation points, and visualizes the classification results in real time.
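The source/processor/sink decomposition described above can be sketched in plain Python. The class and method names below are illustrative stand-ins, not the actual multisensor-pipeline API; a real pipeline would run modules concurrently on live sensor streams, which this minimal sequential sketch omits.

```python
# Hypothetical sketch of a source -> processor -> sink pipeline.
# These names are illustrative only; they are NOT the MSP API.

class Source:
    """Produces data frames (here: a fixed list standing in for sensor input)."""
    def __init__(self, samples):
        self.samples = samples

    def frames(self):
        yield from self.samples


class Processor:
    """Transforms each incoming frame (here: squares a numeric sample)."""
    def process(self, frame):
        return frame * frame


class Sink:
    """Consumes processed frames (here: collects them into a list)."""
    def __init__(self):
        self.received = []

    def consume(self, frame):
        self.received.append(frame)


class Pipeline:
    """Connects the modules and pushes frames through the chain."""
    def __init__(self, source, processors, sink):
        self.source, self.processors, self.sink = source, processors, sink

    def run(self):
        for frame in self.source.frames():
            for p in self.processors:
                frame = p.process(frame)
            self.sink.consume(frame)


sink = Sink()
Pipeline(Source([1, 2, 3]), [Processor()], sink).run()
print(sink.received)  # [1, 4, 9]
```

In the paper's eye-tracking demo, the source would correspond to the mobile eye tracker, the processor to the image-patch classifier, and the sink to the real-time visualization.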
Citation
Barz, M., Bhatti, O. S., Lüers, B., Prange, A., & Sonntag, D. (2021). Multisensor-Pipeline: A Lightweight, Flexible, and Extensible Framework for Building Multimodal-Multisensor Interfaces. In ICMI 2021 Companion - Companion Publication of the 2021 International Conference on Multimodal Interaction (pp. 13–18). Association for Computing Machinery, Inc. https://doi.org/10.1145/3461615.3485432