Using a novel visualization and control interface – the Mephistophone – we explore the development of a user interface for the acoustic visualization and analysis of bird calls. Our intention is to use embodied computation as an aid to acoustic cognition. The Mephistophone demonstrates ‘mixed initiative’ design, in which humans and systems collaborate toward creative and purposeful goals. The interaction modes of our prototype allow the dextral manipulation of abstract acoustic structure. Combining information visualization, timbre-space exploration, collaborative filtering, feature learning, and human inference tasks, we examine the haptic and visual affordances of a 2.5D tangible user interface (TUI). We explore novel representations of the acoustic space and show how a transition from spectral to timbral visualization can enhance user cognition.
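The abstract does not specify the paper's signal-processing pipeline, but the spectral-to-timbral shift it describes can be illustrated with a minimal sketch: below, a synthetic rising chirp stands in for a bird call, a Hann-windowed short-time FFT gives the spectral view, and the per-frame spectral centroid serves as one simple example of a timbral descriptor (the paper's actual feature set is not stated here). All function names and parameters are illustrative assumptions.

```python
import numpy as np

def stft_magnitude(signal, n_fft=512, hop=256):
    """Magnitude spectrogram via a Hann-windowed short-time FFT."""
    window = np.hanning(n_fft)
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop : i * hop + n_fft] * window
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1))  # shape: (n_frames, n_fft//2 + 1)

def spectral_centroid(mag, sr):
    """Per-frame spectral centroid in Hz -- a simple timbral descriptor."""
    freqs = np.fft.rfftfreq((mag.shape[1] - 1) * 2, d=1.0 / sr)
    return (mag * freqs).sum(axis=1) / (mag.sum(axis=1) + 1e-12)

# Synthetic rising chirp as a stand-in for a bird call (not real data).
sr = 22050
t = np.linspace(0, 1.0, sr, endpoint=False)
call = np.sin(2 * np.pi * (2000 + 3000 * t) * t)  # sweeps upward in pitch

mag = stft_magnitude(call)          # spectral view
centroid = spectral_centroid(mag, sr)  # timbral summary per frame
print(mag.shape, bool(centroid[0] < centroid[-1]))
```

The centroid rises across the chirp, collapsing the full spectrogram into a single timbral trajectory of the kind a tangible interface could render as a physical contour.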
Herman, I., Impett, L., Wollner, P. K. A., & Blackwell, A. F. (2015). Augmenting bioacoustic cognition with tangible user interfaces. In Lecture Notes in Computer Science (Vol. 9183, pp. 437–448). Springer. https://doi.org/10.1007/978-3-319-20816-9_42