Vision-Enhanced Localization for Cooperative Robotics
Type of Degree: Thesis
Simultaneous Localization and Mapping (SLAM) is a challenging open problem. In SLAM, a mobile robot must build a spatial map of its environment while estimating its own position within the partially built map. The problem resembles a chicken-and-egg situation: an accurate map cannot be built without knowing the robot's position, while the robot's position cannot be determined without an accurate map. Because the real world is far from ideal, a probabilistic approach is used to model the SLAM system. This thesis addresses the problem with a stereo vision system. The solution has two aspects: one deals with processing the images gathered from the environment, and the other with estimating the robot's ego-position from those images. The first aspect is handled by the Scale-Invariant Feature Transform (SIFT) algorithm, and the second by a Rao-Blackwellized particle filtering algorithm. Results from a MATLAB simulation environment showed that the SLAM system converges well to the real-world values.
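The particle-filtering side of the estimation can be sketched as follows. This is a minimal illustrative example in Python/NumPy, not the thesis's MATLAB implementation: the landmark positions, noise levels, and particle count are all hypothetical, and only the localization half is shown (a full Rao-Blackwellized filter would additionally carry a per-particle map estimate, e.g. one EKF per landmark, conditioned on each particle's trajectory).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical known landmarks and true robot pose (2D, robot held static)
landmarks = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 7.0]])
true_pose = np.array([4.0, 4.0])

n_particles = 500
sigma = 0.5                                   # assumed range-sensor noise std
particles = rng.uniform(0.0, 10.0, size=(n_particles, 2))  # uniform prior

for _ in range(20):
    # Predict: diffuse particles with motion noise
    particles += rng.normal(0.0, 0.1, size=particles.shape)

    # Measure: noisy ranges from the true pose to each landmark
    ranges = np.linalg.norm(landmarks - true_pose, axis=1)
    ranges += rng.normal(0.0, 0.05, size=ranges.shape)

    # Update: weight each particle by the measurement likelihood
    # (log-weights with max subtraction to avoid numerical underflow)
    pred = np.linalg.norm(landmarks[None, :, :] - particles[:, None, :], axis=2)
    logw = -0.5 * np.sum((pred - ranges) ** 2, axis=1) / sigma**2
    logw -= logw.max()
    w = np.exp(logw)
    w /= w.sum()

    # Resample: draw particles in proportion to their weights
    idx = rng.choice(n_particles, size=n_particles, p=w)
    particles = particles[idx]

estimate = particles.mean(axis=0)
print(estimate)  # posterior mean, expected near the true pose
```

The same predict-weight-resample loop underlies the Rao-Blackwellized variant; the "Rao-Blackwellization" refers to solving the map analytically per particle rather than sampling it, which is what makes the approach tractable for SLAM.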