A Dataset and Comparative Study for Vision-Based Relative Position Estimation of Multirotor Teams Flying in Close Proximity
Multirotor teams are useful for inspection, delivery, and construction tasks, in which they might be required to fly very close to each other. In such close-proximity flight, nonlinear aerodynamic effects can cause catastrophic crashes, necessitating each robot's awareness of its surroundings. Existing approaches rely on expensive or heavy perception sensors. Instead, we propose to use the often-ignored yaw degree of freedom of multirotors to spin a single, cheap, and lightweight monocular camera at a high angular rate for omnidirectional awareness. We provide a dataset collected from real-world physical flights as well as 3D-rendered scenes, and we compare two existing learning-based methods in different settings with respect to success rate and the accuracy of relative position estimation and downwash prediction. As an application, we demonstrate that our proposed spinning camera can predict the presence of aerodynamic downwash in a challenging swapping task.
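The abstract's claim that a yaw-spun camera gives omnidirectional awareness reduces to simple sweep geometry. The following minimal sketch (not from the paper; the function names, field-of-view, and yaw-rate values are illustrative assumptions) computes how often a full 360° sweep completes and the worst-case time any fixed bearing stays outside the camera's view:

```python
# Illustrative sketch: revisit-rate arithmetic for a yaw-spinning monocular
# camera. All numeric values below are assumptions, not figures from the paper.
import math

def revisit_period(spin_rate_rad_s: float) -> float:
    """Time for the optical axis to sweep a full 360 degrees."""
    return 2.0 * math.pi / spin_rate_rad_s

def worst_case_blind_time(spin_rate_rad_s: float, hfov_rad: float) -> float:
    """Longest interval a fixed bearing stays outside the horizontal FOV.

    A bearing is in view while it lies within hfov/2 of the optical axis,
    so per revolution it is visible for hfov/omega seconds and occluded
    for (2*pi - hfov)/omega seconds.
    """
    return (2.0 * math.pi - hfov_rad) / spin_rate_rad_s

if __name__ == "__main__":
    hfov = math.radians(90.0)   # assumed horizontal field of view
    spin = math.radians(360.0)  # assumed yaw rate: one revolution per second
    print(f"full 360-degree sweep every {revisit_period(spin):.2f} s")
    print(f"worst-case blind time per bearing: "
          f"{worst_case_blind_time(spin, hfov):.2f} s")
```

Under these assumed numbers, every bearing is revisited once per second with at most 0.75 s of blind time, which suggests why a high angular rate matters: the blind time shrinks inversely with the spin rate.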