Amyotrophic lateral sclerosis (ALS) is a progressive neurodegenerative disease of the brain and spinal cord that leads to paralysis of motor functions. Patients retain the ability to blink, which can be used for communication. Here, we present an Artificial Intelligence (AI) system that uses eye blinks to communicate with the outside world, running in real time on Internet-of-Things (IoT) devices. The system uses a Convolutional Neural Network (CNN) to detect the blinking pattern, defined as a series of Open and Closed eye states. Each pattern is mapped to a collection of words that express the patient's intent. To find the best trade-off between accuracy and latency, we evaluated several convolutional network architectures, including ResNet, SqueezeNet, DenseNet, and InceptionV3. We found that the InceptionV3 architecture, after hyper-parameter fine-tuning on this task, achieved the best performance, with 99.20% accuracy and 94 ms latency. This work demonstrates how the latest advances in deep learning architectures can be adapted for clinical systems that improve patients' quality of life regardless of the point of care.
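The pattern-to-word stage described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the CNN has already labeled each video frame as Open ("O") or Closed ("C"), collapses consecutive repeats into a blink pattern, and looks the pattern up in a vocabulary. The specific patterns and words shown are hypothetical examples.

```python
# Hypothetical sketch of the blink-pattern decoding stage.
# Assumes a per-frame eye-state classifier (the CNN) has already
# produced a sequence of "O" (Open) / "C" (Closed) labels.

def collapse_states(frame_states):
    """Collapse per-frame labels into a blink pattern,
    e.g. ['O', 'O', 'C', 'C', 'O'] -> 'OCO'."""
    pattern = []
    for state in frame_states:
        if not pattern or pattern[-1] != state:
            pattern.append(state)
    return "".join(pattern)

# Illustrative pattern-to-word vocabulary (not from the paper):
# one blink, two blinks, three blinks.
VOCABULARY = {
    "OCO": "yes",
    "OCOCO": "no",
    "OCOCOCO": "water",
}

def decode(frame_states):
    """Map a sequence of frame-level eye states to an intent word."""
    return VOCABULARY.get(collapse_states(frame_states), "<unknown>")
```

In a deployed system the vocabulary and pattern encoding would be tailored to the patient; the sketch only shows the structural idea of mapping Open/Closed sequences to words.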
Citation:
Ramli, A. A., Liu, R., Krishnamoorthy, R., Vishal, I. B., Wang, X., Tagkopoulos, I., & Liu, X. (2020). BWCNN: Blink to Word, a Real-Time Convolutional Neural Network Approach. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12405 LNCS, pp. 133–140). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-59615-6_10