Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor.
ABSTRACT: This work presents the design and analysis of an Adaptive User Interface (AUI) for a desktop application that uses a novel solution for recognizing a user's emotional state through both facial expressions and body posture captured by an RGB-D sensor. Six basic emotions are recognized through facial expressions, in addition to the physiological state, which is recognized through body posture. The facial expressions and body posture are acquired in real time from a Kinect sensor. A scoring system improves recognition by minimizing the confusion between the different emotions, and the implemented solution achieves an accuracy above 90%. The recognized emotion is then used to drive an Automatic AUI in which the user can issue speech commands to modify the User Interface (UI) automatically. A comprehensive user study compares the usability of an Automatic, a Manual, and a Hybrid AUI; the AUIs are evaluated in terms of their efficiency, effectiveness, productivity, and error safety. Additionally, a comprehensive analysis evaluates the results across genders and age groups. Results show that the hybrid adaptation improves usability in terms of productivity and efficiency. Finally, both the automatic and the hybrid AUIs result in a significantly more positive user experience compared to the manual adaptation.
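The scoring system described in the abstract suggests a score-level fusion of the facial and posture channels. The following is a minimal, hypothetical Python sketch of such a fusion under that assumption; the function name, channel weights, and fusion rule are illustrative and are not taken from the paper.

```python
# Hypothetical score-level fusion of facial and posture emotion scores.
# All names, weights, and the fusion rule are illustrative assumptions;
# the paper's actual scoring system may differ.

EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

def fuse_scores(face_scores, posture_scores, face_weight=0.7):
    """Weighted score-level fusion over the six basic emotions.

    face_scores / posture_scores: dicts mapping emotion -> confidence in [0, 1].
    face_weight: relative trust in the facial channel (assumed value).
    """
    posture_weight = 1.0 - face_weight
    fused = {
        e: face_weight * face_scores.get(e, 0.0)
           + posture_weight * posture_scores.get(e, 0.0)
        for e in EMOTIONS
    }
    # Pick the emotion with the highest combined score: the extra evidence
    # from posture can separate easily confused expressions (e.g. fear
    # vs. surprise) that look similar to a face-only classifier.
    return max(fused, key=fused.get), fused

if __name__ == "__main__":
    face = {"fear": 0.41, "surprise": 0.39, "sadness": 0.10}
    posture = {"fear": 0.15, "surprise": 0.55}
    label, scores = fuse_scores(face, posture)
    print(label, scores)  # "surprise" wins once posture evidence is added
```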
SUBMITTER: Medjden S
PROVIDER: S-EPMC7365406 | biostudies-literature | 2020
REPOSITORIES: biostudies-literature