Augmenting bioacoustic cognition with tangible user interfaces

Citations: 2 · Readers: 7 (Mendeley)

This article is free to access.

Abstract

Using a novel visualization and control interface – the Mephistophone – we explore the development of a user interface for the acoustic visualization and analysis of bird calls. Our intention is to utilize embodied computation as an aid to acoustic cognition. The Mephistophone demonstrates ‘mixed initiative’ design, in which humans and systems collaborate toward creative and purposeful goals. The interaction modes of our prototype allow the dextral manipulation of abstract acoustic structure. Combining information visualization, timbre-space exploration, collaborative filtering, feature learning, and human inference tasks, we examine the haptic and visual affordances of a 2.5D tangible user interface (TUI). We explore novel representations within the audio representation space and show how a transition from spectral to timbral visualization can enhance user cognition.

Citation (APA)

Herman, I., Impett, L., Wollner, P. K. A., & Blackwell, A. F. (2015). Augmenting bioacoustic cognition with tangible user interfaces. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9183, pp. 437–448). Springer Verlag. https://doi.org/10.1007/978-3-319-20816-9_42
