EmoAssist: emotion enabled assistive tool to enhance dyadic conversation for the blind
Abstract
This paper presents the design and implementation of EmoAssist, a smartphone-based system to assist in dyadic conversations. The main goal of the system is to give people who are blind or visually impaired access to non-verbal communication cues. The key functionalities of the system are to predict behavioral expressions (such as a yawn, a closed-lip smile, an open-lip smile, looking away, and sleepiness) and 3-D affective dimensions (valence, arousal, and dominance) from visual cues in order to provide appropriate auditory feedback. A number of challenges related to data communication protocols, efficient face tracking, modeling of behavioral expressions and affective dimensions, the feedback mechanism, and system integration were addressed to build an effective and functional system. In addition, orientation-sensor information from the smartphone was used to correct image alignment and improve robustness in real-world use. Empirical studies show that EmoAssist can predict affective dimensions with acceptable accuracy in natural dyadic conversation (maximum correlation coefficients of 0.76 for valence, 0.78 for arousal, and 0.76 for dominance). The minimum and maximum overall response times are 64.61 milliseconds and 128.22 milliseconds, respectively. Integrating sensor information to correct orientation improved the accuracy of behavioral-expression recognition by 16% on average. A usability study with ten blind participants in social interaction shows that EmoAssist is highly acceptable, with an average acceptability rating of 6.0 on a 7-point Likert scale (where 1 and 7 are the lowest and highest possible ratings, respectively).
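The abstract notes that orientation-sensor information from the phone is used to correct image alignment before expression recognition. A minimal sketch of one way such a correction could work, assuming the sensed roll angle of the device is available; the function names and the rotate-by-roll strategy here are illustrative assumptions, not the paper's actual implementation:

    # Hypothetical sketch: undo the phone's roll so the conversation
    # partner's face appears upright for the expression classifier.
    import cv2
    import numpy as np

    def align_frame(frame: np.ndarray, roll_degrees: float) -> np.ndarray:
        """Rotate a camera frame by the negative of the sensed roll
        angle reported by the phone's orientation sensor."""
        h, w = frame.shape[:2]
        center = (w / 2.0, h / 2.0)
        # 2x3 affine matrix rotating about the image center
        m = cv2.getRotationMatrix2D(center, -roll_degrees, 1.0)
        return cv2.warpAffine(frame, m, (w, h))

    # Example: a frame captured while the phone is tilted 15 degrees
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    upright = align_frame(frame, roll_degrees=15.0)

Normalizing orientation this way would let a single upright-face expression model serve regardless of how the phone is held, which is consistent with the reported accuracy gain from sensor integration.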
Publication Title
Multimedia Tools and Applications
Recommended Citation
Rahman, A., Anam, A., & Yeasin, M. (2017). EmoAssist: emotion enabled assistive tool to enhance dyadic conversation for the blind. Multimedia Tools and Applications, 76(6), 7699-7730. https://doi.org/10.1007/s11042-016-3295-4