Authors
Subramanian Sundaram, Petr Kellnhofer, Yunzhu Li, Jun-Yan Zhu, Antonio Torralba, Wojciech Matusik
Publication date
2019/5/30
Journal
Nature
Volume
569
Issue
7758
Pages
698-702
Publisher
Nature Publishing Group UK
Description
Humans can feel, weigh and grasp diverse objects, and simultaneously infer their material properties while applying the right amount of force—a challenging set of tasks for a modern robot. Mechanoreceptor networks that provide sensory feedback and enable the dexterity of the human grasp remain difficult to replicate in robots. Whereas computer-vision-based robot grasping strategies have progressed substantially with the abundance of visual data and emerging machine-learning tools, there are as yet no equivalent sensing platforms and large-scale datasets with which to probe the use of the tactile information that humans rely on when grasping objects. Studying the mechanics of how humans grasp objects will complement vision-based robotic object handling. Importantly, the inability to record and analyse tactile signals currently limits our understanding of the role of tactile information in the human grasp …
Total citations
2019: 25
2020: 115
2021: 162
2022: 228
2023: 220
2024: 122