Gestural interactions for multi-parameter audio control and audification

Abstract

This paper presents an interactive multi-modal system for real-time, multi-parametric gestural control of audio-processing applications. We argue that such control can ease a range of tasks and demonstrate this with two applications: (1) a musical application, the multi-parametric control of digital audio effects, and (2) a scientific application, the interactive navigation of audifications. For the first application we discuss the use of PCA-based control axes and clustering to obtain dimensionality-reduced control variables. For the second, we show how the tightly closed human-computer loop actively supports the detection and discovery of features in the data under analysis. © Springer-Verlag Berlin Heidelberg 2006.
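The PCA-based dimensionality reduction mentioned for the first application can be illustrated with a minimal sketch. The feature layout, the sigmoid mapping, and the effect parameters (delay time, feedback gain) below are assumptions for illustration only, not the authors' implementation:

```python
import numpy as np

def fit_pca(features, n_components=2):
    """Fit PCA on an (n_frames, n_features) matrix of gesture features."""
    mean = features.mean(axis=0)
    centered = features - mean
    # SVD of the centered data; rows of vt are the principal control axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def project(frame, mean, axes):
    """Project a single feature frame onto the learned control axes."""
    return (frame - mean) @ axes.T

def to_effect_params(coords, param_ranges):
    """Map low-dimensional control coordinates into effect-parameter ranges
    via a squashing nonlinearity (a hypothetical mapping)."""
    normalized = 1.0 / (1.0 + np.exp(-coords))       # squash to (0, 1)
    lows = np.array([lo for lo, _ in param_ranges])
    highs = np.array([hi for _, hi in param_ranges])
    return lows + normalized * (highs - lows)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in for recorded gesture features (e.g. tracked joint positions).
    training_frames = rng.normal(size=(500, 12))
    mean, axes = fit_pca(training_frames, n_components=2)

    live_frame = rng.normal(size=12)                 # one incoming gesture frame
    coords = project(live_frame, mean, axes)
    # Two hypothetical audio-effect parameters: delay time (s), feedback gain.
    params = to_effect_params(coords, [(0.05, 1.0), (0.0, 0.9)])
    print("control coords:", coords, "-> effect params:", params)
```

In a real setup the reduced coordinates would be streamed per frame to the audio engine, so that a few principal gesture axes steer many underlying effect parameters at once.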

Citation (APA)

Hermann, T., Paschalidou, S., Beckmann, D., & Ritter, H. (2006). Gestural interactions for multi-parameter audio control and audification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3881 LNAI, pp. 335–338). https://doi.org/10.1007/11678816_37
