Place recognition and factor graph localization for mobile robots using Google indoor street view

Tennakoon, Kusal B. (2021) Place recognition and factor graph localization for mobile robots using Google indoor street view. Masters thesis, Memorial University of Newfoundland.

PDF (English) - Accepted Version
Available under License - The author retains copyright ownership and moral rights in this thesis. Neither the thesis nor substantial extracts from it may be printed or otherwise reproduced without the author's permission.

Download (103MB)

Abstract

This thesis develops an indoor localization system for mobile robots using Google indoor street view. The proposed system consists of two main modules. The first is a place recognition module based on Google’s indoor street view. Its purpose is to determine the robot’s position in terms of the street view map node closest to the robot’s actual location. This is done by comparing an image captured by the robot’s camera against the indoor street view images; to achieve the best possible accuracy, the input image is compared to every street view image. The module employs two verification stages. The first stage is based on visual similarity among the images; the five images that score highest in this stage become candidates for the second stage, in which the geometric consistency between the images is assessed. The candidate that passes this test with the highest similarity score is declared a match with the input image. The proposed place recognition module is tested on several data sets, and its performance is assessed using standard evaluation metrics.

The second module is the core of the proposed localization system: a graph-based estimation module that incorporates odometry data, visual feedback, and motion data, and is solved via optimization techniques. The result is an estimate of the robot’s pose at specified intervals along its journey. Odometry data provides constraints between successive poses, while visual information from the robot’s camera establishes constraints between the robot’s poses and the map: each such constraint is derived from two images, one from the robot’s camera and one from the map node that best matches it. The localization system is designed to minimize odometry drift.
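The two-stage verification described above can be sketched as follows. This is a toy illustration, not the thesis's actual pipeline: the global descriptors, keypoints, node count, and thresholds are all invented for the example, and the geometric check uses a simple robust-translation test as a stand-in for full RANSAC-style verification.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stage 1: visual similarity -----------------------------------------
# Hypothetical global descriptors (e.g. bag-of-words histograms) for every
# street-view node; the query image is assumed to resemble node 3.
db_descs = rng.random((10, 64))                   # 10 map nodes, 64-D each
query_desc = db_descs[3] + 0.01 * rng.random(64)  # noisy copy of node 3

def shortlist(query, db, k=5):
    """Rank database images by cosine similarity and keep the top k."""
    sims = db @ query / (np.linalg.norm(db, axis=1) * np.linalg.norm(query))
    return np.argsort(sims)[::-1][:k]

candidates = shortlist(query_desc, db_descs)

# --- Stage 2: geometric consistency -------------------------------------
def inlier_count(pts_query, pts_db, tol=2.0):
    """Score a candidate by how many putative keypoint matches agree with
    a single 2-D translation (a stand-in for a full RANSAC check)."""
    t = np.median(pts_db - pts_query, axis=0)     # robust translation
    resid = np.linalg.norm(pts_db - (pts_query + t), axis=1)
    return int(np.sum(resid < tol))

# Synthetic keypoints: the true node's keypoints are a clean shift of the
# query's; other candidates' keypoints are unrelated.
query_kps = rng.random((30, 2)) * 100
scores = {}
for idx in candidates:
    if idx == 3:
        db_kps = query_kps + np.array([5.0, -3.0])  # consistent geometry
    else:
        db_kps = rng.random((30, 2)) * 100          # inconsistent geometry
    scores[int(idx)] = inlier_count(query_kps, db_kps)

best_node = max(scores, key=scores.get)
```

Stage 1 keeps the search cheap by ranking with a single similarity score per node, while stage 2 spends more effort only on the shortlisted candidates, mirroring the coarse-to-fine structure of the module described above.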
The system is simulated and then tested on a data set captured in the basement of the Memorial University of Newfoundland engineering building. Its performance is evaluated using standard error metrics.
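The graph-based estimation module can be illustrated with a minimal least-squares sketch. This is a 1-D toy under stated assumptions, not the thesis's implementation: odometry factors link successive poses with a deliberately biased step measurement, and hypothetical place-recognition factors anchor two poses to known map-node positions, with the optimization correcting the accumulated drift.

```python
import numpy as np

# Hypothetical 1-D scenario: 5 successive robot poses x0..x4.
# True step between poses is 1.0; odometry over-reports by 0.1 per step
# (drift). Place recognition anchors x0 and x4 to known node positions.
true_steps = np.array([1.0, 1.0, 1.0, 1.0])
odom = true_steps + 0.1                 # biased odometry measurements
anchors = {0: 0.0, 4: 4.0}              # map-node positions (assumed known)

n = 5
rows, rhs, weights = [], [], []
for i, d in enumerate(odom):            # odometry factors: x_{i+1} - x_i = d
    r = np.zeros(n)
    r[i + 1], r[i] = 1.0, -1.0
    rows.append(r); rhs.append(d); weights.append(1.0)
for i, p in anchors.items():            # place-recognition factors: x_i = p
    r = np.zeros(n)
    r[i] = 1.0
    rows.append(r); rhs.append(p); weights.append(10.0)  # trust the map more

# Weighted linear least squares over all factors.
A = np.array(rows) * np.array(weights)[:, None]
b = np.array(rhs) * np.array(weights)
x, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Dead-reckoning on the biased odometry alone would drift to x4 ≈ 4.4; with the map anchors in the graph, the optimized estimate pulls x4 back toward the true 4.0, which is the drift-correction behavior the abstract attributes to the visual map constraints. A real system would use SE(2)/SE(3) poses and a nonlinear solver (e.g. a factor-graph library) rather than this linear toy.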

Item Type: Thesis (Masters)
URI: http://research.library.mun.ca/id/eprint/14798
Item ID: 14798
Additional Information: Includes bibliographical references.
Keywords: Factor Graph, Localization, Google Street View, SLAM, Place Recognition
Department(s): Engineering and Applied Science, Faculty of
Date: May 2021
Date Type: Submission
Digital Object Identifier (DOI): https://doi.org/10.48336/254v-ea17
Library of Congress Subject Heading: Google Map (Firm); Indoor positioning systems (Wireless localization); Mobile robots.
