Authors
Per Ola Kristensson, Thomas Nicholson, Aaron Quigley
Publication date
2012/2/14
Book
Proceedings of the 2012 ACM international conference on Intelligent User Interfaces
Pages
89-92
Description
In this paper we present a new bimanual markerless gesture interface for 3D full-body motion tracking sensors, such as the Kinect. Our interface uses a probabilistic algorithm to incrementally predict users' intended one-handed and two-handed gestures while they are still being articulated. It supports scale- and translation-invariant recognition of arbitrarily defined gesture templates in real time. The interface supports two ways of gesturing commands in thin air to displays at a distance. First, users can use one-handed and two-handed gestures to directly issue commands. Second, users can use their non-dominant hand to modulate single-hand gestures. Our evaluation shows that the system recognizes one-handed and two-handed gestures with an accuracy of 92.7%--96.2%.
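The abstract describes incremental, scale- and translation-invariant recognition of gesture templates. As a rough illustration of the general idea (not the authors' actual algorithm), the sketch below normalizes a partially articulated 2D trajectory and converts its distance to prefixes of each stored template into a probability distribution; all function names and parameters here are hypothetical.

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def resample(points, n=32):
    """Resample a trajectory to n equidistant points along its path."""
    total = sum(_dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = total / (n - 1)
    out = [points[0]]
    pts = list(points)
    acc = 0.0
    i = 1
    while i < len(pts):
        d = _dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval:
            # Interpolate a new point at the exact interval boundary.
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the inserted point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def normalize(points):
    """Make a trajectory translation- and scale-invariant:
    centroid at the origin, bounding box scaled to unit size."""
    xs, ys = zip(*points)
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def incremental_probs(partial, templates, n=32, beta=8.0):
    """Assign each template a probability given a partially drawn gesture.

    The partial input is compared against several prefixes of each
    template, so a prediction is available mid-articulation; distances
    are turned into probabilities with a softmax (sharpness beta)."""
    p = normalize(resample(partial, n))
    scores = {}
    for name, tpl in templates.items():
        best = float("inf")
        for frac in (0.25, 0.5, 0.75, 1.0):
            k = max(2, int(len(tpl) * frac))
            prefix = normalize(resample(tpl[:k], n))
            d = sum(_dist(a, b) for a, b in zip(p, prefix)) / n
            best = min(best, d)
        scores[name] = math.exp(-beta * best)
    total = sum(scores.values())
    return {name: s / total for name, s in scores.items()}

# Hypothetical templates: a rightward and an upward hand swipe.
templates = {
    "swipe_right": [(float(i), 0.0) for i in range(11)],
    "swipe_up": [(0.0, float(i)) for i in range(11)],
}
# First half of a rightward swipe, still being articulated.
partial = [(0.5 * i, 0.0) for i in range(6)]
probs = incremental_probs(partial, templates)
```

On this toy input the half-finished rightward swipe already receives most of the probability mass, which is the behavior the paper's incremental prediction targets (the actual system handles 3D skeletal input and two hands).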
Total citations
Cited per year, 2012–2024 (histogram)
Scholar articles
PO Kristensson, T Nicholson, A Quigley - Proceedings of the 2012 ACM international conference …, 2012