Because pain is an inevitable part of life, this study examines the use of facial expression technology to assist individuals experiencing pain. The self-reporting commonly used to detect discomfort is subjective and cannot be applied to patients of all ages; a standardized method for measuring pain would resolve this issue. Facial monitoring technology is an important tool for measuring pain because it is both easy to use and highly accurate. Accordingly, this article applies deep learning techniques to detect pain from 2D facial expressions and motion. A deep learning model was trained on sequential images from the University of Northern British Columbia (UNBC) dataset, since deep learning can detect motion and assist patients with self-reporting. Our system classifies pain into three categories: not painful, becoming painful, and painful. Its performance was evaluated by comparing its findings with those of a specialist physician: the precision rates for the not painful, becoming painful, and painful classes were 99.75 percent, 92.93 percent, and 95.15 percent, respectively. In sum, this study offers an alternative way to assess pain prior to hospitalization that is straightforward, cost-effective, and easily understood by both the general population and healthcare professionals. The same analysis technique could also be applied to other screening tasks, such as pain detection in infectious diseases.
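The abstract reports per-class precision against a specialist physician's labels. As a minimal sketch of how such per-class precision could be computed (the function name and the example frame labels below are hypothetical; the paper's actual evaluation pipeline is not described here), precision for each class is the fraction of the model's predictions of that class that agree with the physician's reference label:

```python
from collections import Counter

# The three pain classes used in the study.
LABELS = ["not painful", "becoming painful", "painful"]

def per_class_precision(predicted, reference):
    """Per-class precision TP / (TP + FP), comparing model
    predictions against reference (physician) labels."""
    tp = Counter()  # correct predictions per class
    fp = Counter()  # incorrect predictions per class
    for pred, ref in zip(predicted, reference):
        if pred == ref:
            tp[pred] += 1
        else:
            fp[pred] += 1
    return {
        label: tp[label] / (tp[label] + fp[label])
        if (tp[label] + fp[label]) else 0.0
        for label in LABELS
    }

# Hypothetical example: six frames scored by model and physician.
preds = ["not painful", "painful", "becoming painful",
         "painful", "not painful", "painful"]
refs = ["not painful", "painful", "painful",
        "painful", "not painful", "becoming painful"]
print(per_class_precision(preds, refs))
```

In this toy example, "not painful" scores 1.0 while "becoming painful" scores 0.0, since its single prediction disagreed with the physician; the reported figures in the abstract come from the paper's full UNBC evaluation, not from this sketch.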
CITATION STYLE
Pikulkaew, K., Boonchieng, W., Boonchieng, E., & Chouvatut, V. (2021). 2D Facial Expression and Movement of Motion for Pain Identification with Deep Learning Methods. IEEE Access, 9, 109903–109914. https://doi.org/10.1109/ACCESS.2021.3101396