London, Feb 24 (UNI) Feeling helpless about not being able to gauge your partner's real emotions? A newly-developed facial expression algorithm, capable of processing a sequence of frontal images of moving faces and categorising them, would surely make you smile in wonder.
The software can be applied to video sequences in realistic situations and can identify the facial expression of a person seated in front of a computer screen as one of six prototype expressions: anger, disgust, fear, happiness, sadness and surprise.
The system analyses the face of a person sitting in front of a camera connected to a computer by means of several boxes, each focused on a part of the user's face. These boxes track the facial movements and determine the person's expression.
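The article does not disclose how the system maps tracked movements to one of the six prototype expressions. A minimal illustrative sketch of one possible approach, assuming hypothetical per-region motion features and nearest-prototype matching (the feature names and numbers below are invented placeholders, not values from the actual system):

```python
import math

# Six prototype expressions named in the article, each paired with a
# hypothetical feature vector: (brow_motion, mouth_motion, eye_opening).
# These numbers are illustrative placeholders, not data from the system.
PROTOTYPES = {
    "anger":     (0.9, 0.3, 0.4),
    "disgust":   (0.6, 0.5, 0.3),
    "fear":      (0.8, 0.6, 0.9),
    "happiness": (0.2, 0.9, 0.5),
    "sadness":   (0.7, 0.2, 0.2),
    "surprise":  (0.5, 0.8, 1.0),
}

def classify(features):
    """Return the prototype expression whose feature vector lies
    closest (in Euclidean distance) to the measured features."""
    return min(
        PROTOTYPES,
        key=lambda name: math.dist(features, PROTOTYPES[name]),
    )

print(classify((0.25, 0.85, 0.55)))  # nearest to the happiness prototype
```

In practice, research systems of this kind typically use trained statistical classifiers rather than hand-set prototypes; this sketch only shows the categorisation step in its simplest form.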
The system, which has a success rate of 89 per cent, can work under adverse conditions where ambient lighting, frontal facial movements or camera displacements produce major changes in facial appearance.
The software could improve relations with e-commerce consumers and give metaverse avatars an unprecedented capability to relate to the person they represent, the journal Pattern Analysis and Applications reported.
It can help identify potential buyers' gestures, determine whether or not they intend to make a purchase and even gauge how satisfied they are with a product or service, helping to reduce the ambiguities of spoken or written language.
UNI XC SYU AS1133