A computer algorithm that can tell whether you are happy, sad, angry or expressing almost any other emotion would be a boon to the games industry. New research published in the International Journal of Computational Vision and Robotics describes such a system, which is almost 99 percent accurate.
Hyung-Il Choi of the School of Media at Soongsil University in Seoul, Korea, working with Nhan Thi Cao and An Hoa Ton-That of Vietnam National University in Ho Chi Minh City, explains that capturing players' emotions could serve various purposes in interactive games, such as transferring a player's emotions to his or her avatar, or triggering suitable actions to communicate with other players in a range of scenarios, including educational applications.
The team has developed a simple, fast system that they have shown to be almost 99% accurate on thousands of test facial images. Fundamentally, the system uses mathematical processing to measure eyebrow position, eye openness, mouth shape and other factors, and to correlate these measurements with seven basic expressions: anger, disgust, fear, joy, sadness, surprise and neutral. The system works even on facial images just 48 pixels square.
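To give a rough idea of how such a pipeline fits together, the sketch below shows the general approach described above, not the authors' published method: a handful of geometric facial measurements (eyebrow raise, eye openness, mouth shape) are gathered into a feature vector and mapped to one of the seven labels by an off-the-shelf classifier. The landmark layout, feature definitions and choice of classifier are illustrative assumptions, and the training data here is random stand-in data.

```python
# Hypothetical sketch of geometric-feature emotion classification.
# Landmark indices, features and classifier are assumptions for illustration,
# not the method from the paper.
import numpy as np
from sklearn.svm import SVC

EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise", "neutral"]

def geometric_features(landmarks: np.ndarray) -> np.ndarray:
    """Reduce facial landmarks (x, y pairs) to a few coarse measurements.

    The indices below assume a hypothetical 8-point landmark layout; a real
    system would use whatever layout its face-landmark detector provides.
    """
    left_brow, right_brow = landmarks[0], landmarks[1]
    left_eye_top, left_eye_bottom = landmarks[2], landmarks[3]
    mouth_left, mouth_right = landmarks[4], landmarks[5]
    mouth_top, mouth_bottom = landmarks[6], landmarks[7]
    return np.array([
        left_eye_top[1] - left_brow[1],        # left eyebrow raise
        left_eye_top[1] - right_brow[1],       # right eyebrow raise
        left_eye_bottom[1] - left_eye_top[1],  # eye openness
        mouth_right[0] - mouth_left[0],        # mouth width
        mouth_bottom[1] - mouth_top[1],        # mouth openness
    ])

# Train on labelled example faces (random stand-in data), then predict.
rng = np.random.default_rng(0)
train_landmarks = rng.uniform(0, 48, size=(70, 8, 2))  # 70 faces, 8 points each
train_labels = [EMOTIONS[i % 7] for i in range(70)]

X = np.array([geometric_features(lm) for lm in train_landmarks])
clf = SVC(kernel="rbf").fit(X, train_labels)

new_face = rng.uniform(0, 48, size=(8, 2))
print(clf.predict([geometric_features(new_face)])[0])
```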
Facial expression recognition has been the focus of much research in recent years, thanks to the emergence of intelligent communication systems, data-driven animation and intelligent game applications, the team reports. "When facial expressions recognition of players is applied in an intelligent game system, the experience can become more interactive, vivid and attractive," the team says. One might also imagine the same system being used to track the facial expressions of actors voicing characters in animated movies and other media, giving those characters more realistic real-time emotional expression.
Cao, N.T., Ton-That, A.H. and Choi, H.-I. (2016) 'An effective facial expression recognition approach for intelligent game systems', International Journal of Computational Vision and Robotics, Vol. 6, No. 3, pp. 223-234.