The design of neural architectures is a critical aspect of deep-learning-based methods. In this chapter, we explore the suitability of different neural architectures for the recognition of mobility-related human activities. Neural architecture search (NAS) is attracting considerable attention in the machine learning community and has improved the performance of deep learning models on many tasks, such as language modeling and image recognition. Deep learning techniques have been successfully applied to human activity recognition (HAR). However, the design of competitive architectures remains cumbersome and time-consuming, and relies strongly on domain expertise. To address this, we propose a large-scale systematic experimental setup to design and evaluate neural architectures for HAR applications. Specifically, we use a Bayesian optimization (BO) procedure based on a Gaussian process surrogate model to tune the architectures' hyper-parameters. We train and evaluate more than 600 different architectures, which are then analyzed via the functional ANalysis Of VAriance (fANOVA) framework to assess the relevance of each hyper-parameter. We evaluate our approach on the Sussex-Huawei Locomotion and Transportation (SHL) dataset, a highly versatile, sensor-rich, and precisely annotated dataset of human locomotion modes.

12.1 Introduction

Neural networks are attracting a considerable amount of interest in many fields, achieving state-of-the-art performance. Fields such as speech recognition, natural language processing, and computer vision have benefited greatly from deep-learning techniques (He et al.
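To make the Bayesian optimization procedure mentioned above concrete, the sketch below runs a minimal BO loop with a Gaussian process surrogate and an expected-improvement acquisition over a single toy hyper-parameter. The objective `validation_error`, the kernel length scale, and the search range are all invented for illustration; the chapter's actual setup tunes several architecture hyper-parameters and evaluates each candidate by training a network.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical stand-in for the expensive step: training an architecture with
# hyper-parameter x and returning its validation error (invented function).
def validation_error(x):
    return np.sin(3.0 * x) + 0.5 * x ** 2

def rbf_kernel(a, b, length_scale=0.5):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-6):
    """Posterior mean and std of a zero-mean GP at the query points."""
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    K_s = rbf_kernel(x_obs, x_query)
    alpha = np.linalg.solve(K, y_obs)
    mu = K_s.T @ alpha
    v = np.linalg.solve(K, K_s)
    var = 1.0 - np.sum(K_s * v, axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, best):
    """EI acquisition for minimization: how much we expect to beat `best`."""
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
x_obs = rng.uniform(-2.0, 2.0, size=3)   # a few initial random configurations
y_obs = validation_error(x_obs)
grid = np.linspace(-2.0, 2.0, 200)       # candidate configurations

for _ in range(10):                      # BO loop: fit GP, query max-EI point
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y_obs.min()))]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, validation_error(x_next))

best_x = x_obs[np.argmin(y_obs)]
print(f"best hyper-parameter: {best_x:.3f}, error: {y_obs.min():.3f}")
```

The surrogate makes each suggestion cheap: the GP fit and acquisition maximization cost far less than one network training run, which is what makes BO attractive when each architecture evaluation is expensive.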
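The fANOVA analysis mentioned above can also be illustrated in a simplified form. Real fANOVA (Hutter et al.) fits a random-forest surrogate and decomposes its predicted performance into contributions from every subset of hyper-parameters; the sketch below computes only first-order (main-effect) variance shares on synthetic data, where the hyper-parameter names (`depth`, `width`) and the score model are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 600  # roughly the number of architectures evaluated in the chapter
depth = rng.integers(1, 5, n)      # hypothetical hyper-parameter: layer count
width = rng.integers(5, 10, n)     # hypothetical hyper-parameter: coded units
noise = rng.normal(0.0, 0.02, n)
# Synthetic scores: depth matters a lot, width barely does.
score = 0.7 + 0.05 * depth + 0.001 * width + noise

def marginal_importance(values, score):
    """Fraction of total score variance explained by one
    hyper-parameter's main effect (variance of per-level means)."""
    levels = np.unique(values)
    means = np.array([score[values == v].mean() for v in levels])
    weights = np.array([(values == v).mean() for v in levels])
    grand_mean = (weights * means).sum()
    main_var = (weights * (means - grand_mean) ** 2).sum()
    return main_var / score.var()

imp_depth = marginal_importance(depth, score)
imp_width = marginal_importance(width, score)
print(f"depth: {imp_depth:.2f}, width: {imp_width:.2f}")
```

On this synthetic table, `depth` captures most of the score variance while `width` captures almost none, which is exactly the kind of conclusion the chapter draws from fANOVA about which architecture hyper-parameters deserve tuning effort.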
Osmani, A., & Hamidi, M. (2019). Bayesian Optimization of Neural Architectures for Human Activity Recognition (pp. 171–195). https://doi.org/10.1007/978-3-030-13001-5_12