Electronic Theses and Dissertations

Expression+: An Assistive Solution for Affect-Enabled Dyadic Conversation

Identifier

4802

Date

2016

Document Type

Dissertation (Access Restricted)

Degree Name

Doctor of Philosophy

Major

Engineering

Concentration

Computer Engineering

Committee Chair

Mohammed Yeasin

Committee Member

Arthur Graesser

Committee Member

Andrew Olney

Committee Member

Madhusudhanan Balasubramanian

Abstract

The main goal of this dissertation is to develop an affect-enabled assistive technology solution, called "Expression+", to facilitate dyadic conversation. The key feature of the system is to predict the interlocutor's behavioral expressions, emotions, and affective states and provide personalized feedback to people who are blind or visually impaired. As part of the system, a deep learning framework was implemented to model a wide range of natural facial and behavioral expressions from a video stream in real time. In addition, a personalized feedback scheme was developed to accommodate users' needs and preferences, and multiple configurations of the system were implemented to provide access to a broader set of users with varying needs, skills, and experiences. A usage data collection mechanism was integrated to understand long-term adoption, as well as performance and usability issues in real-world settings.

The implementation of "Expression+" adopted a hybrid of participatory design, systems thinking, and assistive thinking. It was developed based on the lessons learned and experience gained through implementing systems such as EmoAssist, iFEPS, and Expression, developed under the broader project called Blind Ambition. The main functionalities of "Expression+" were decided through a collaborative effort between the designers and representative users.

The affective model of "Expression+" was developed and evaluated using publicly available datasets with typical emotions. "Expression+" was designed to address many practical issues in deployment and is expected to transform the user experience by providing affect-enabled conversation. Its deployment will provide access to non-verbal cues, facilitating dyadic conversation and improving the quality of life of people who are blind or visually impaired.
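To illustrate the kind of real-time, video-based expression prediction the abstract describes, the following is a minimal Python sketch, not the dissertation's actual pipeline. It assumes OpenCV for face detection, TensorFlow/Keras for inference, and a hypothetical pretrained CNN saved as expression_cnn.h5 that classifies 48x48 grayscale face crops into an assumed basic-emotion label set.

    # Illustrative sketch only; not the "Expression+" implementation.
    # Assumes OpenCV, TensorFlow/Keras, and a hypothetical model file
    # "expression_cnn.h5" mapping 48x48 grayscale face crops to labels.
    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    LABELS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]  # assumed label set

    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    model = load_model("expression_cnn.h5")  # hypothetical pretrained CNN

    cap = cv2.VideoCapture(0)  # default camera as the video stream
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
            # Crop, resize, and normalize the face region for the CNN.
            face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
            probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
            label = LABELS[int(np.argmax(probs))]
            # In an assistive setting, this label would drive personalized
            # feedback (e.g., speech or haptics) rather than console output.
            print(label)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()

In an assistive configuration such as the one the abstract outlines, the predicted label would feed a personalized feedback channel tuned to the user's preferences instead of being printed.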

Comments

Data is provided by the student.

Library Comment

Dissertation or thesis originally submitted to the local University of Memphis Electronic Theses & Dissertations (ETD) Repository.
