Vision-based Methods for Mobile Robot Localization and Wheel Sinkage Estimation
REINA, GIULIO
2008-01-01
Abstract
External perception based on vision plays a critical role in developing improved and robust localization algorithms for mobile robots, as well as in gaining important information about the vehicle and the traversed terrain. This paper presents two novel methods to improve mobility on rough terrain using visual input. The first method is a stereovision algorithm for 6-DoF ego-motion estimation, which integrates image intensity information and 3D stereo data using an Iterative Closest Point (ICP) approach. The second method estimates the wheel sinkage of a mobile robot on deformable soil from the visual input of an onboard monocular camera, using an edge-detection strategy. Both methods were implemented and experimentally validated on an all-terrain mobile robot, showing that the proposed techniques can be successfully employed to improve the performance of ground vehicles operating in uncharted environments.
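To illustrate the geometric core of ICP-based ego-motion estimation, the sketch below aligns two 3D point clouds from consecutive stereo frames to recover a rigid 6-DoF pose increment. It is a minimal point-to-point ICP with an SVD (Kabsch) pose solver, not the authors' algorithm, which additionally integrates image intensity information; all function names and the synthetic data are illustrative assumptions.

```python
# Minimal point-to-point ICP sketch for rigid ego-motion estimation between
# two stereo-derived point clouds. Illustrative only; not the paper's method.
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(src, dst, iters=30, tol=1e-6):
    """Brute-force nearest-neighbour ICP; returns the accumulated R, t."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    cur, prev_err = src.copy(), np.inf
    for _ in range(iters):
        # nearest-neighbour correspondences (O(N*M), acceptable for a sketch)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matches = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matches)
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
        err = np.sqrt(d2.min(axis=1)).mean()
        if abs(prev_err - err) < tol:  # stop when the residual settles
            break
        prev_err = err
    return R_tot, t_tot

# Usage: two synthetic clouds related by a small known rotation and translation.
rng = np.random.default_rng(0)
cloud_prev = rng.uniform(-1, 1, size=(200, 3))
a = np.deg2rad(5)
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0, 0, 1]])
t_true = np.array([0.05, 0.0, 0.02])
cloud_curr = cloud_prev @ R_true.T + t_true
R_est, t_est = icp(cloud_prev, cloud_curr)
print("translation estimate:", t_est)   # close to t_true
```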
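For the second method, the following sketch shows one simple way to infer wheel sinkage from an intensity edge in a monocular image: locate the soil/wheel boundary along an image column as the strongest vertical gradient and convert the occluded rim height to metres. This is not the paper's algorithm; the side-looking camera geometry, the calibrated rim-bottom row, and the metres-per-pixel scale are all hypothetical assumptions made for the example.

```python
# Illustrative wheel-sinkage estimate from a single grayscale image column.
# Assumes (hypothetically) a side view of the wheel, a known rim-bottom row,
# and a known metres-per-pixel scale from calibration.
import numpy as np

def estimate_sinkage(gray, column, rim_bottom_row, m_per_px, wheel_radius):
    """Locate the soil line along one column via the strongest vertical
    intensity edge above the rim bottom, then convert it to sinkage [m]."""
    profile = gray[:rim_bottom_row, column].astype(float)
    gradient = np.abs(np.diff(profile))      # 1-D edge strength along the column
    boundary_row = int(gradient.argmax())    # strongest edge taken as the soil line
    sinkage = (rim_bottom_row - boundary_row) * m_per_px
    # clamp to the physically meaningful range [0, wheel_radius]
    return float(np.clip(sinkage, 0.0, wheel_radius))

# Usage on a synthetic image: dark soil occluding the lower part of a bright wheel.
h, w = 240, 320
img = np.full((h, w), 200, dtype=np.uint8)   # bright wheel/background
img[180:, :] = 60                            # dark soil from row 180 downwards
print(estimate_sinkage(img, column=160, rim_bottom_row=220,
                       m_per_px=0.001, wheel_radius=0.15))
# -> roughly (220 - 180) * 0.001 = 0.04 m of sinkage
```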