A convolutional neural network can evaluate thermal infrared images of human faces and determine with 93 percent accuracy whether the person is drunk. The system, described in the International Journal of Intelligent Information and Database Systems, could be deployed in places where drunk driving and drunken behaviour are common problems. There are more than a million deaths worldwide each year from road traffic accidents, and a large proportion of those are a direct result of drunkenness.
Kha Tu Huynh and Huynh Phuong Thanh Nguyen of Vietnam National University of Ho Chi Minh City explain that earlier efforts to detect drunkenness automatically have focused on indicators such as eye state, head position, or functional state. However, such systems can be confounded by other factors. The team argues that analysis of thermal imaging offers a less ambiguous approach that is also non-invasive and could allow the authorities to screen people in city centres or at events where alcohol is likely to be consumed and people may then attempt to drive home.
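The paper's title suggests the network is trained on thermal face images to which Gaussian noise and blur have been added. As a purely illustrative sketch, not the authors' published architecture, a small binary classifier of this kind could be set up as follows in Keras; the 128x128 single-channel input size, layer sizes, and noise level are assumptions rather than details from the paper, and blur augmentation would be applied in preprocessing:

    from tensorflow.keras import layers, models

    def build_model(input_shape=(128, 128, 1)):
        # Single-channel thermal face image in, probability of "drunk" out.
        model = models.Sequential([
            layers.Input(shape=input_shape),
            layers.GaussianNoise(0.05),             # noise augmentation; the strength is an assumption
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(128, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(64, activation="relu"),
            layers.Dropout(0.5),
            layers.Dense(1, activation="sigmoid"),  # estimated probability the subject is drunk
        ])
        model.compile(optimizer="adam",
                      loss="binary_crossentropy",
                      metrics=["accuracy"])
        return model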
The researchers stress that any system designed to identify inebriated people must have a very low rate of both false positives and false negatives. After all, a false negative might see a drunk person driving their car, whereas too many false positives would prevent sober drivers from using their vehicles and lead to frustration and a loss of trust in the system among the public.
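As a simple illustration of how those two error rates would be quantified (the labels below are invented for illustration, not data from the study; 1 marks a drunk subject, 0 a sober one):

    def error_rates(y_true, y_pred):
        # False-positive rate: sober people wrongly flagged as drunk.
        # False-negative rate: drunk people wrongly passed as sober.
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
        negatives = sum(1 for t in y_true if t == 0)
        positives = sum(1 for t in y_true if t == 1)
        fpr = fp / negatives if negatives else 0.0
        fnr = fn / positives if positives else 0.0
        return fpr, fnr

    y_true = [1, 1, 0, 0, 0, 1, 0, 1]   # invented ground-truth labels
    y_pred = [1, 0, 0, 1, 0, 1, 0, 1]   # invented model predictions
    print(error_rates(y_true, y_pred))  # (0.25, 0.25)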
There will always be a compromise in any such system, and erring on the side of caution would be preferable. Optimising the classification with larger training datasets drawn from a diverse population of thermal images should, however, bring it closer to the ideal: the theoretically unachievable 100% accuracy with zero false positives and zero false negatives.
Huynh, K.T. and Nguyen, H.P.T. (2022) ‘Drunkenness detection using a CNN with adding Gaussian noise and blur in the thermal infrared images’, Int. J. Intelligent Information and Database Systems, Vol. 15, No. 4, pp.398–419.