Multi-robot motion-formation distributed control with sensor self-calibration: experimental validation
In this paper, we present the design and implementation of a robust distributed motion-formation control algorithm for a team of mobile robots. The primary task for the team is to form a geometric shape, which can be freely translated and rotated at the same time. This approach makes the robots behave as a cohesive whole, which can be useful in tasks such as collaborative transportation. The robustness of the algorithm relies on the fact that each robot employs only local measurements from a laser sensor that does not need to be calibrated off-line. Furthermore, the robots do not need to exchange any information with each other. Being free of sensor calibration and not requiring a communication channel helps the overall system scale to a large number of robots. In addition, since the robots do not need any off-board localization system, but require only relative positions with respect to their neighbors, the team can operate fully autonomously in environments where such localization systems are not available. The algorithm is computationally inexpensive, and the resources of a standard microcontroller suffice. This makes our approach appealing as a complement to other, more demanding algorithms, e.g., processing images from onboard cameras. We validate the performance of the algorithm with a team of four mobile robots equipped with low-cost, commercially available laser scanners.
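To illustrate the kind of control loop the abstract describes, the following is a minimal sketch of a displacement-based formation control step that uses only relative positions of neighbors, as measured in the robot's own frame. It is not the paper's actual algorithm (which also handles free translation and rotation of the shape and laser self-calibration); the function name, gain, and data layout are illustrative assumptions.

```python
import numpy as np

def formation_control_step(rel_positions, desired_offsets, gain=0.5):
    """One velocity command from purely relative measurements (illustrative sketch).

    rel_positions:   dict {neighbor_id: np.array([dx, dy])}, neighbor position
                     measured in this robot's local frame (e.g., from the laser).
    desired_offsets: dict {neighbor_id: np.array([dx*, dy*])}, desired relative
                     position of that neighbor in the target shape.
    Returns a 2D velocity command for this robot; no communication is needed,
    since every quantity is sensed locally.
    """
    velocity = np.zeros(2)
    for j, p_ij in rel_positions.items():
        # Drive each measured relative position toward its desired value.
        velocity += gain * (p_ij - desired_offsets[j])
    return velocity

# Example: keep neighbor 1 at (1, 0) and neighbor 2 at (0, 1) in the shape.
measured = {1: np.array([1.2, 0.1]), 2: np.array([-0.1, 0.8])}
desired = {1: np.array([1.0, 0.0]), 2: np.array([0.0, 1.0])}
print(formation_control_step(measured, desired))  # small corrective velocity
```

Because each update uses only a handful of vector operations per neighbor, a loop of this form fits comfortably on a standard microcontroller, consistent with the computational-cost claim above.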