Generating Synthetic Hand-Object Contact Maps for Grasping Region Prediction
8 Pages. Posted: 6 Jul 2022
Abstract
From a human point of view, adapting our grasp to objects of different shapes and sizes feels natural, since we continuously learn from interacting with our environment. We have mastered object manipulation: grabbing a hammer by its handle is all but unconscious. Performing the same task from a machine's perspective, however, is particularly challenging, as it requires an in-depth understanding of objects and their manipulation. We can approximate such an understanding by capturing the hand-object contact resulting from human grasps, i.e., contact maps. This has previously been done with cumbersome capture systems and not at scale. In this work, we simplify and accelerate the process by generating contact maps from our interaction with household objects in photorealistic virtual reality environments. To the best of our knowledge, we are the first to generate contact maps at scale from interaction in a virtual scenario. We train an image-to-image translation method to predict grasp regions on objects, demonstrating the usefulness of our generated contact maps. Our purpose with this work is to foster applications where hand-object understanding is necessary, so all the developed tools, code, and generated data are provided.
Keywords: Hand-object contact maps, Synthetic data generation, Object manipulation, Virtual Reality, Object grasping