A real-time framework for natural multimodal interaction with large screen displays
Abstract
This paper presents a framework for designing a natural multimodal human-computer interaction (HCI) system. The core of the proposed framework is a principled method for combining information derived from audio and visual cues. To achieve natural interaction, the audio and visual modalities are fused, with feedback provided through a large screen display. Careful design, with due consideration of all aspects of the system's interaction cycle and integration, has resulted in a successful system. The performance of the proposed framework has been validated through the development of several prototype systems as well as commercial applications for the retail and entertainment industries. To assess the impact of these multimodal systems (MMS), informal studies have been conducted. It was found that the system performed according to its specifications in 95% of the cases and that users showed ad hoc proficiency, indicating natural acceptance of such systems.
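The abstract refers to fusing audio and visual cues but, at this summary level, gives no implementation details. As a rough illustration only, the sketch below shows one generic approach to such fusion: weighted late fusion of per-command confidence scores from a speech recognizer and a gesture recognizer. The function names, weights, and score dictionaries are illustrative assumptions, not the method described in the paper.

```python
"""Illustrative sketch of weighted late fusion of audio and visual cues.

Assumption: each modality produces per-command confidence scores; the
weights and command names below are hypothetical, not from the paper.
"""

from typing import Dict


def fuse_modalities(speech_scores: Dict[str, float],
                    gesture_scores: Dict[str, float],
                    speech_weight: float = 0.6) -> str:
    """Combine per-command confidences from two modalities by a weighted
    sum and return the highest-scoring command."""
    gesture_weight = 1.0 - speech_weight
    commands = set(speech_scores) | set(gesture_scores)
    fused = {
        cmd: speech_weight * speech_scores.get(cmd, 0.0)
             + gesture_weight * gesture_scores.get(cmd, 0.0)
        for cmd in commands
    }
    return max(fused, key=fused.get)


if __name__ == "__main__":
    # Example: "select" is ambiguous from speech alone but is reinforced
    # by a pointing gesture, so the fused decision picks it.
    speech = {"select": 0.55, "zoom": 0.45}
    gesture = {"select": 0.80, "pan": 0.20}
    print(fuse_modalities(speech, gesture))  # -> "select"
```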
Publication Title
Proceedings - 4th IEEE International Conference on Multimodal Interfaces, ICMI 2002
Recommended Citation
Krahnstoever, N., Kettebekov, S., Yeasin, M., & Sharma, R. (2002). A real-time framework for natural multimodal interaction with large screen displays. Proceedings - 4th IEEE International Conference on Multimodal Interfaces, ICMI 2002, 349-354. https://doi.org/10.1109/ICMI.2002.1167020