Expression: A dyadic conversation aid using Google Glass for people who are blind or visually impaired
This paper presents 'Expression', an integrated assistive solution built on Google Glass. The key function of the system is to enable the user to perceive social signals during a natural dyadic conversation. The design and implementation of the system addressed a number of technical and research challenges: video acquisition and communication over Wi-Fi, efficient detection and tracking of faces, overheating of Google Glass, robust detection of facial features, modeling of behavioral expressions, and a feedback system for conveying social signals. Performance evaluation was conducted to ensure the completeness and generalizability of the models. Furthermore, usability studies were performed with ten subjects (six visually impaired and four blindfolded) to illustrate the utility of 'Expression'. Subjective evaluation of Expression using a five-point Likert scale yielded an excellent rating (4.383 out of 5).
Proceedings of the 2014 6th International Conference on Mobile Computing, Applications and Services, MobiCASE 2014
Anam, A., Alam, S., & Yeasin, M. (2015). Expression: A dyadic conversation aid using Google Glass for people who are blind or visually impaired. Proceedings of the 2014 6th International Conference on Mobile Computing, Applications and Services, MobiCASE 2014, 57-64. https://doi.org/10.4108/icst.mobicase.2014.257780