Authors
Amandine Brunetto, Sascha Hornauer, Stella X. Yu, Fabien Moutarde
Publication date
2023/10/1
Conference
2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Pages
1-8
Publisher
IEEE
Description
Vision research has achieved remarkable success in understanding our world, propelled by large datasets of images and videos. Sensor data from radar, LiDAR, and cameras has supported research in robotics and autonomous driving for at least a decade. However, visual sensors may fail under some conditions, and sound has recently shown potential to complement them. Simulated room impulse responses (RIR) in 3D apartment models have become a benchmark dataset for the community, fostering a range of audio-visual research. In simulation, depth can be predicted from sound by learning bat-like perception with a neural network. Concurrently, the same was achieved in reality using RGB-D images and echoes of chirping sounds. Biomimicking bat perception is an exciting new direction but requires dedicated datasets to explore its potential. Therefore, we collected the BatVision dataset to provide large-scale echoes in …