Authors: Simona Nobili, Marco Camurri, Victor Barasuol, Michele Focchi, Darwin Caldwell, Claudio Semini, Maurice Fallon
In this paper we present a novel system for the state estimation of a dynamically locomoting robot (walking and trotting). The approach fuses four heterogeneous sensor sources (inertial, kinematic, LIDAR and vision) to maintain an accurate and consistent estimate of the robot’s base link velocity and position, in the presence of disturbances such as slips and missteps. The estimator is shown to be robust to changes in the structure and lighting of the environment, as well as to the terrain the robot traverses. Our approach builds upon a modular inertial-driven Extended Kalman Filter which incorporates a rugged, probabilistic leg odometry component with additional inputs from stereo visual odometry and LIDAR registration. We motivate the simultaneous use of both stereo vision and LIDAR to combat operational issues which occur in real applications. To the best of our knowledge, this paper is the first to discuss the complexity of consistently estimating both pose and velocity states, as well as fusing multiple exteroceptive signal sources arriving at largely different frequencies and latencies, in a manner which is acceptable for a quadruped’s feedback controller. A substantial experimental evaluation demonstrates the robustness and accuracy of our system, achieving continuously accurate localization with drift per distance traveled below 1 cm/m.
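The multi-rate fusion idea described above (a high-rate inertial prediction step with lower-rate exteroceptive corrections) can be illustrated with a minimal sketch. This is not the paper's implementation: the 1-D constant-acceleration model, the class name, and the chosen rates (1 kHz IMU, 100 Hz velocity correction standing in for leg odometry, 2 Hz position correction standing in for LIDAR registration) are all illustrative assumptions.

```python
# Hypothetical sketch: an EKF predicts at IMU rate and applies velocity or
# position corrections whenever a slower sensor delivers a measurement.
# The 1-D model and sensor rates are illustrative, not from the paper.
import numpy as np

class MultiRateEKF:
    """Toy 1-D constant-acceleration filter; state x = [position, velocity]."""
    def __init__(self):
        self.x = np.zeros(2)   # [position, velocity]
        self.P = np.eye(2)     # state covariance

    def predict(self, accel, dt, q=1e-3):
        # Inertial (process) update: integrate the measured acceleration.
        F = np.array([[1.0, dt], [0.0, 1.0]])
        self.x = F @ self.x + np.array([0.5 * dt**2, dt]) * accel
        self.P = F @ self.P @ F.T + q * np.eye(2)

    def correct(self, z, H, r):
        # Generic measurement update; H selects position or velocity,
        # so sensors at any rate can be fused through the same step.
        H = np.atleast_2d(np.asarray(H, dtype=float))
        S = H @ self.P @ H.T + r
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ (z - H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P

ekf = MultiRateEKF()
dt = 0.001  # 1 kHz inertial prediction
for k in range(1000):
    t = k * dt
    if k % 10 == 0:    # 100 Hz velocity correction ("leg odometry")
        ekf.correct(z=np.array([t]), H=[0.0, 1.0], r=np.array([[1e-2]]))
    if k % 500 == 0:   # 2 Hz position correction ("LIDAR registration")
        ekf.correct(z=np.array([0.5 * t**2]), H=[1.0, 0.0], r=np.array([[1e-3]]))
    ekf.predict(accel=1.0, dt=dt)

print(ekf.x)  # ~ [0.5, 1.0] after 1 s at a = 1 m/s^2
```

The key design point mirrored here is that every sensor, regardless of rate, passes through the same correction interface, while the inertial prediction alone drives the filter between corrections.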