SyncWISE: Window Induced Shift Estimation for Synchronization of Video and Accelerometry from Wearable Sensors
Abstract
The development and validation of computational models to detect daily human behaviors (e.g., eating, smoking, brushing) using wearable devices requires labeled data collected from the natural field environment, with tight time synchronization between the micro-behaviors (e.g., start/end times of hand-to-mouth gestures during a smoking puff or an eating gesture) and their associated labels. Video data is increasingly being used for such label collection. Unfortunately, wearable devices and video cameras have independent (and drifting) clocks, which makes tight time synchronization challenging. To address this issue, we present SyncWISE, a Window Induced Shift Estimation approach for Synchronization. We demonstrate the feasibility and effectiveness of our method by synchronizing the timestamps of a wearable camera and a wearable accelerometer using 163 videos representing 45.2 hours of data from 21 participants enrolled in a real-world smoking cessation study. Our approach shows significant improvement over the state-of-the-art, even in the presence of high data loss, achieving 90% synchronization accuracy given a synchronization tolerance of 700 milliseconds. Our method also achieves state-of-the-art synchronization performance on the CMU-MMAC dataset.
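The abstract does not spell out the estimation procedure, but the general idea of recovering a clock offset between a video-derived motion signal and an accelerometer signal can be illustrated with a cross-correlation peak search. The sketch below is a minimal, hypothetical illustration, not the paper's actual pipeline: the function name, the choice of normalized cross-correlation, and the toy signals are all assumptions for demonstration.

```python
import numpy as np

def estimate_shift(video_motion, accel_mag, fs, max_shift_s=5.0):
    """Illustrative (not the paper's) shift estimator: returns the offset in
    seconds by which accel_mag lags video_motion, found as the peak of the
    normalized cross-correlation. Both signals must share sampling rate fs (Hz)."""
    a = (video_motion - video_motion.mean()) / (video_motion.std() + 1e-12)
    b = (accel_mag - accel_mag.mean()) / (accel_mag.std() + 1e-12)
    corr = np.correlate(b, a, mode="full")        # c[L] = sum_n b[n+L] * a[n]
    lags = np.arange(-len(a) + 1, len(b))         # lag axis for 'full' mode
    keep = np.abs(lags) <= int(max_shift_s * fs)  # restrict to plausible shifts
    return lags[keep][np.argmax(corr[keep])] / fs

# Toy check: the "accelerometer" signal is the "video" signal delayed by 0.7 s.
fs = 10                                  # Hz
t = np.arange(0, 30, 1 / fs)
video = np.sin(2 * np.pi * 0.5 * t)      # stand-in for a video motion magnitude
accel = np.roll(video, 7)                # 7 samples = 0.7 s delay
print(estimate_shift(video, accel, fs))  # → 0.7
```

In practice the two streams are sampled at different rates and suffer data loss, so such a correlation would be computed after resampling onto a common time base, and over multiple windows whose individual estimates are combined.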
Publication Title
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Recommended Citation
Zhang, Y., Zhang, S., Liu, M., Daly, E., Battalio, S., Kumar, S., Spring, B., Rehg, J., & Alshurafa, N. (2020). SyncWISE: Window Induced Shift Estimation for Synchronization of Video and Accelerometry from Wearable Sensors. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 4(3). https://doi.org/10.1145/3411824