Combining Artificial Intelligence, Bio-Sensing and Multimodal Control for Bio-Responsive Interactives

Abstract

We present a framework and prototype process for combining affective bio-sensing, natural multimodal control, and artificial intelligence for bio-responsive interactives. The work explores a more affective, bio-responsive experience for users, singly or in multiuser environments, in real-time and/or virtual reality (VR) interactives, especially where mindfulness and wellness are a goal. The systems and framework can use EEG sensors, heart rate sensors, breath controllers, and/or natural gesture controllers such as hand and body gestures. These bio-sensing and natural gesture inputs are combined with AI-generated musical and visual 2D or 3D forms, where focused attention, embodiment, and entrainment occur through synchronization of musical beats and the generated 2D or 3D visuals with natural body functions or processes. In this paper, we present our system framework and setup, as well as our work-in-progress prototype.
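The entrainment idea described above, synchronizing generated audiovisual tempo with a body rhythm such as heart rate, can be illustrated with a minimal sketch. This is not the authors' implementation: the sensor class, smoothing constant, and tempo-mapping rule below are illustrative assumptions, with a simulated heart-rate stream standing in for a real EEG/heart-rate/breath device.

```python
import math
import random


class SimulatedHeartRateSensor:
    """Stand-in for a real heart-rate sensor feed (hypothetical API)."""

    def __init__(self, base_bpm=72.0):
        self.base_bpm = base_bpm
        self.t = 0

    def read_bpm(self):
        # Slow drift plus small noise, mimicking a resting heart rate.
        self.t += 1
        return self.base_bpm + 4.0 * math.sin(self.t / 20.0) + random.uniform(-1.0, 1.0)


def smooth(prev, sample, alpha=0.1):
    """Exponential moving average to stabilize a noisy bio-signal."""
    return prev + alpha * (sample - prev)


def entrainment_tempo(heart_bpm, target_bpm=60.0, pull=0.25):
    """Set the music/visual beat near the user's heart rate but nudged
    toward a calmer target tempo, one plausible entrainment strategy."""
    return heart_bpm + pull * (target_bpm - heart_bpm)


sensor = SimulatedHeartRateSensor()
hr = sensor.read_bpm()
for _ in range(100):
    hr = smooth(hr, sensor.read_bpm())

# Tempo (in BPM) that a music/visual generator would follow this frame.
tempo = entrainment_tempo(hr)
```

In a live system, `tempo` would drive the beat of the generated music and the pulse rate of the 2D/3D visuals on each update, so that the experience tracks the user's physiology while gently guiding it toward a relaxed state.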

Citation (APA)

Dipaola, S., & Song, M. (2023). Combining Artificial Intelligence, Bio-Sensing and Multimodal Control for Bio-Responsive Interactives. In ACM International Conference Proceeding Series (pp. 318–322). Association for Computing Machinery. https://doi.org/10.1145/3610661.3616183
