In this paper we introduce InSpire, an interactive 3D modeling system that combines an optical see-through “holo-display” with video-based motion sensing and head tracking to co-locate the 3D model display and the user’s gestures. Users can directly create, edit, and manipulate digital geometry, a step toward an intuitive gesture-based modeling environment that liberates designers’ hands from the limitations of 2D mouse input and monitor output and inspires designers’ ideas. We describe our goals and the concepts and implementation behind the prototype, on both the software and hardware sides. In addition, we present several use-case examples that explore potential applications. Finally, based on initial user responses to the prototype, we discuss some directions for future development.
Citation:
Teng, T., & Johnson, B. R. (2014). InSpire: Integrated spatial gesture-based direct 3D modeling and display. In ACADIA 2014 - Design Agency: Proceedings of the 34th Annual Conference of the Association for Computer Aided Design in Architecture (pp. 445–452). ACADIA. https://doi.org/10.52842/conf.acadia.2014.445