This paper proposes a method for integrating brain-computer interfaces (BCIs) with eye-tracking (E-T) and applying this combination to conceptual architectural design, using AI-driven neurofeedback (NFB) to identify the designer's intent and respond to it dynamically. Drawing on state-of-the-art E-T and BCI solutions integrated into current head-mounted display (HMD) devices, the paper offers insight into the applicability of these solutions and their potential benefits and pitfalls for creating innovative conceptual design instruments. By harnessing artificial intelligence (AI) within a game engine (GE) context, the proposed solution aims to establish a procedural design-interaction approach in which neurofeedback is used to learn and adapt to the user's design intent, without needing to fully model the complex decision-making processes taking place in the designer's mind. While limited in scope, this approach raises several interesting topics and questions, which are discussed in more detail in the final section of the paper.
CITATION
Barsan-Pipu, C. (2020). Artificial Intelligence Applied to Brain-Computer Interfacing with Eye-Tracking for Computer-Aided Conceptual Architectural Design in Virtual Reality Using Neurofeedback. In Proceedings of the 2019 DigitalFUTURES (pp. 124–135). Springer Singapore. https://doi.org/10.1007/978-981-13-8153-9_11