1. MURTADHA A. MOHAMMED AGHA - Computer Engineering Department, University of Mosul, Mosul, Iraq.
2. MAZIN H. AZIZ - Computer Engineering Department, University of Mosul, Mosul, Iraq.
Signs of pain can generally be detected from facial expressions. Facial monitoring is valuable for measuring pain because facial cues are relatively easy to observe and to interpret accurately. Recently, deep learning has been applied to facial expressions and movements to sense pain. In this paper, a dataset from the University of Northern British Columbia (UNBC) was used to train the model. This research proposes a mechanism to detect pain by classifying facial expressions into two categories: painful and non-painful. Two versions of the collected data were considered: a balanced version, in which the two classes are of nearly equal size, and an unbalanced version. Features were extracted from the whole dataset with FaceNet and then classified by a fully connected neural network. After training, the model achieved an accuracy of 99.665% on the unbalanced data and 95.44% on the balanced data. In summary, this paper introduces an alternative technique for detecting pain. The method is straightforward, cost-effective, and easily applicable by both the general public and healthcare professionals.
Facial Expression; Face Recognition; Machine Learning; Pediatric Pain; Facial Action Coding System.
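The abstract describes a pipeline in which pretrained FaceNet embeddings are fed to a fully connected network for binary pain classification. The snippet below is a minimal sketch of such a classifier head, assuming 128-dimensional embeddings have already been extracted with a pretrained FaceNet model; the layer sizes, dropout rate, and variable names are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np
import tensorflow as tf

# Assumption: X holds FaceNet embeddings (one 128-d vector per face image,
# produced beforehand by a pretrained FaceNet model) and y holds binary
# labels (1 = painful, 0 = non-painful). Names and sizes are illustrative.
EMBEDDING_DIM = 128

def build_pain_classifier(embedding_dim: int = EMBEDDING_DIM) -> tf.keras.Model:
    """Small fully connected network on top of FaceNet embeddings."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(embedding_dim,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dropout(0.3),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of pain
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Placeholder data standing in for precomputed FaceNet embeddings.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, EMBEDDING_DIM)).astype("float32")
    y = rng.integers(0, 2, size=(1000,)).astype("float32")

    model = build_pain_classifier()
    # For the unbalanced variant of the data, a class_weight argument
    # could be passed to fit() to compensate for the class imbalance.
    model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```

The sigmoid output with binary cross-entropy matches the two-class (painful vs. non-painful) setup stated in the abstract; handling of the balanced versus unbalanced splits would be done at the data-preparation stage.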