InSpire: Integrated Spatial Gesture-Based Direct 3D Modeling and Display


Abstract

In this paper we introduce InSpire, an interactive 3D modeling system that combines an optical see-through “holo-display” with video-based motion sensing and head tracking to co-locate the 3D model display and the user’s gestures. Users can directly create, edit, and manipulate digital geometry, a step towards an intuitive gestural modeling environment that frees designers’ hands from the limitations of 2D mouse input and monitor output and inspires designers’ ideas. We describe our goals and the concepts and implementation behind the prototype, on both the software and hardware sides. In addition, we present several use case examples that explore potential applications. Finally, based on initial user responses to the prototype, we discuss directions for future development.

Citation (APA)

Teng, T., & Johnson, B. R. (2014). Inspire integrated spatial gesture-based direct 3D modeling and display. In ACADIA 2014 - Design Agency: Proceedings of the 34th Annual Conference of the Association for Computer Aided Design in Architecture (Vol. 2014-October, pp. 445–452). ACADIA. https://doi.org/10.52842/conf.acadia.2014.445
