Authors
Juyoung Lee, Shaurye Aggarwal, Jason Wu, Thad Starner, Woontack Woo
Publication date
2019/9/9
Book
Proceedings of the 2019 ACM International Symposium on Wearable Computers
Pages
123-128
Description
SelfSync enables rapid, robust initiation of a gesture interface using synchronized movement of different body parts. SelfSync is the gestural equivalent of a hotword, such as "OK Google" in a speech interface, and is enabled by the increasing trend of users wearing two or more devices, such as a smartwatch, wireless earbuds, or a smartphone. In a user study comparing five potential SelfSync gestures in isolation, our system averages 96%, 98%, and 88% accuracy in the user-dependent, user-adapted, and user-independent conditions, respectively. When the user has a phone in a pocket and a smartwatch, we suggest twisting the hand about the wrist while moving the leg carrying the phone left and right in synchrony. When the user has a head-worn device and a smartwatch, we suggest twisting the hand while twisting the head left and right.
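The abstract does not spell out the detection algorithm, but a minimal sketch of the general idea, treating strong correlation between the motion signals of two worn devices as the activation trigger, might look like the following. The sampling rate, window length, threshold, and function names are illustrative assumptions, not the paper's published method.

```python
# Minimal sketch (not the paper's published method): flag a "SelfSync"-style
# trigger when angular-velocity magnitudes from two worn IMUs are strongly
# correlated over a short window. All constants below are assumptions.
import numpy as np

FS = 50          # assumed IMU sampling rate (Hz)
WINDOW = 2 * FS  # 2-second analysis window


def gyro_magnitude(gyro_xyz: np.ndarray) -> np.ndarray:
    """Per-sample angular-velocity magnitude from an (N, 3) gyroscope stream."""
    return np.linalg.norm(gyro_xyz, axis=1)


def synchrony_score(watch_gyro: np.ndarray, phone_gyro: np.ndarray) -> float:
    """Pearson correlation of the two devices' motion-magnitude signals."""
    a = gyro_magnitude(watch_gyro[-WINDOW:])
    b = gyro_magnitude(phone_gyro[-WINDOW:])
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.mean(a * b))


def is_selfsync_trigger(watch_gyro: np.ndarray, phone_gyro: np.ndarray,
                        threshold: float = 0.8) -> bool:
    """Fire the gesture 'hotword' when both body parts move in synchrony."""
    return synchrony_score(watch_gyro, phone_gyro) >= threshold


if __name__ == "__main__":
    # Simulate a 2 Hz left-right twist seen by both the watch and the phone.
    t = np.linspace(0, 2, WINDOW)
    twist = np.sin(2 * np.pi * 2 * t)
    watch = np.column_stack([twist, 0.1 * twist, np.zeros_like(t)])
    phone = np.column_stack([0.9 * twist, np.zeros_like(t), 0.1 * twist])
    print(is_selfsync_trigger(watch, phone))  # True for synchronized motion
```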
Total citations
Cited in 2020–2023 (per-year counts from the citation chart not reproduced)