Stereoscopic Virtual Reality Teleoperation for Human Robot Collaborative Dataset Collection
Yi-Shiuan Tung, Matthew B. Luebbers, Alessandro Roncone, and Bradley Hayes
In 7th International Workshop on Virtual, Augmented, and Mixed-Reality for Human-Robot Interactions (VAM-HRI), Mar 2024
Large and diverse datasets are required to train general-purpose models in NLP, computer vision, and robot manipulation. However, existing robotics datasets feature single robots interacting with static environments, whereas in many real-world scenarios, robots must interact with humans or other dynamic agents. In this work, we present a virtual reality (VR) teleoperation system that enables data collection for human-robot collaborative (HRC) tasks. The human operator using the VR system receives an immersive, high-fidelity egocentric view with a stereoscopic depth effect, providing the situational awareness required to teleoperate the robot remotely across a variety of tasks. We propose to collect data on a set of HRC tasks and introduce a taxonomy for categorizing them. We envision that our VR system will broaden the scope of tasks robots can perform with human collaborators, and that the proposed dataset will enable the development of new algorithms for HRC.