Authors
Yezhou Yang, Cornelia Fermüller, Yi Li, Yiannis Aloimonos
Publication date
2015
Conference
IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Description
The grasp type provides crucial information about human action. However, recognizing the grasp type in unconstrained scenes is challenging because of large variations in appearance, occlusions, and geometric distortions. In this paper, we first present a convolutional neural network to classify functional hand grasp types. Experiments on a public static-scene hand data set validate the good performance of the presented method. We then present two applications utilizing grasp type classification: (a) inference of human action intention and (b) fine-level manipulation action segmentation. Experiments on both tasks demonstrate the usefulness of grasp type as a cognitive feature for computer vision. This study shows that the grasp type is a powerful symbolic representation for action understanding, and thus opens new avenues for future research.