IMAPS: A smart phone based real-time framework for prediction of affect in natural dyadic conversation
Abstract
The inability to perceive emotions and affective states is a setback for people who are blind or visually impaired in professional and social communication. Toward developing an assistive-technology solution that facilitates natural dyadic conversation for people with such disabilities, this paper describes the development of a smartphone-based system, the interactive mobile affect perception system (iMAPS), for predicting affective dimensions (valence, arousal, and dominance). The proposed solution uses an Android platform in conjunction with a wireless network to build a fully integrated iMAPS. Empirical analyses were conducted to measure the efficacy and utility of the proposed solution. The framework was found to predict affect dimensions with good accuracy (maximum correlation coefficients of 0.68 for valence, 0.71 for arousal, and 0.67 for dominance) in natural dyadic conversation. The overall minimum and maximum response times are 181 milliseconds and 500 milliseconds, respectively. © 2012 IEEE.
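The accuracy figures above are correlation coefficients between predicted and annotated affect values. As a minimal sketch of that evaluation metric (the annotation values, model, and data here are hypothetical placeholders, not from the paper):

```python
import numpy as np

# Hypothetical continuous annotations and model predictions for one
# affective dimension (e.g., valence) over a conversation segment.
ground_truth = np.array([0.2, 0.5, 0.7, 0.4, 0.6, 0.8, 0.3])
predicted = np.array([0.25, 0.45, 0.65, 0.5, 0.55, 0.75, 0.35])

# Pearson correlation coefficient between prediction and annotation,
# the kind of accuracy measure the abstract reports per dimension.
r = np.corrcoef(ground_truth, predicted)[0, 1]
print(round(r, 2))
```

A coefficient near 1 indicates the predicted trajectory closely tracks the human annotation, which is how per-dimension scores such as 0.68 for valence would be read.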
Publication Title
2012 IEEE Visual Communications and Image Processing, VCIP 2012
Recommended Citation
Rahman, A., Tanveer, M., Anam, A., & Yeasin, M. (2012). IMAPS: A smart phone based real-time framework for prediction of affect in natural dyadic conversation. 2012 IEEE Visual Communications and Image Processing, VCIP 2012. https://doi.org/10.1109/VCIP.2012.6410828