    Fusion-SLAM by combining RGB-D SLAM and Rat SLAM : a thesis presented in partial fulfilment of the requirements for the degree of Master of Engineering in Mechatronics at Massey University, Albany, New Zealand

    Files
    01_front.pdf (89.44Kb)
    02_whole.pdf (4.848Mb)
    Abstract
    Robotic Simultaneous Localization and Mapping (SLAM) is the problem of how to create a map of the environment while localizing the robot within the map being created. This presents a causality dilemma: the map is needed in order to localize the robot, but the robot must also be localized in order to create the map. Past research offers many solutions to this problem, ranging from the Extended Kalman Filter (EKF) to Graph SLAM systems. There has also been extensive research into bio-inspired methods, such as ratSLAM, implemented on aerial and land-based robots. These research setups use sensors such as Time-of-Flight (ToF) devices, e.g. laser scanners, and passive devices, e.g. cameras. Over the past few years Microsoft has developed a new type of combined apparatus, the Kinect. It combines active and passive sensing elements and aligns the data in a way that allows efficient implementation in robotic systems. This has led to the Kinect being used in new research and many studies, mostly around RGB-D SLAM. However, these methods generally require a continuous stream of images and become inaccurate when exposed to ambiguous environments. This thesis presents the design and implementation of a fusion algorithm to solve the robotic SLAM problem. The study starts by analysing existing methods to determine what research has been done. It then introduces the components used in this study and the fusion algorithm. The algorithm incorporates the colour and depth data extraction and manipulation methods used in the RGB-D SLAM system while also implementing a mapping step similar to the grid-cell and firing-field functions found in ratSLAM. This method addresses RGB-D SLAM's weaknesses of requiring a continuous image stream and of handling ambiguous images. An experiment is then conducted on the developed system to determine the extent to which it solves the SLAM problem. Moreover, the success rate for finding a node in a cell and matching its pose is also investigated. In conclusion, this research presents a novel algorithm for successfully solving the robotic SLAM problem. The proposed algorithm also improves the system's navigation efficiency, odometry error correction, and robustness to scan-matching vulnerabilities in feature-sparse views.
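    The abstract describes visual odometry from RGB-D (Kinect) data feeding a coarse, ratSLAM-like map whose cells are excited by a Gaussian-style firing field, with node matching attempted within the active cell. The thesis text is not reproduced on this record page, so the sketch below is only a hypothetical, illustrative reconstruction of that mapping step; the names (PoseCellGrid, MapNode), the grid size, the firing-field width, and the similarity threshold are assumptions, not the author's implementation.

# Hypothetical sketch of a ratSLAM-style pose-cell map fed by RGB-D odometry.
# All names, sizes, and thresholds are illustrative assumptions, not the
# thesis author's actual code.
import numpy as np
from dataclasses import dataclass


@dataclass
class MapNode:
    pose: np.ndarray        # (x, y, theta) estimate from visual odometry
    descriptor: np.ndarray  # compact appearance signature of the RGB-D frame


class PoseCellGrid:
    """Coarse grid of pose cells with a Gaussian 'firing field' around the
    current pose estimate, loosely analogous to grid-cell activity in ratSLAM."""

    def __init__(self, size=50, cell_m=0.25, sigma=1.5):
        self.size = size                       # cells per side
        self.cell_m = cell_m                   # metres per cell
        self.sigma = sigma                     # firing-field width, in cells
        self.activity = np.zeros((size, size))
        self.nodes = {}                        # (i, j) cell -> list of MapNode

    def _cell(self, pose):
        i = int(pose[0] / self.cell_m) % self.size
        j = int(pose[1] / self.cell_m) % self.size
        return i, j

    def inject(self, pose):
        """Excite cells around the current pose with a Gaussian firing field."""
        ci, cj = self._cell(pose)
        ii, jj = np.meshgrid(np.arange(self.size), np.arange(self.size), indexing="ij")
        self.activity += np.exp(-((ii - ci) ** 2 + (jj - cj) ** 2) / (2 * self.sigma ** 2))
        self.activity /= self.activity.max()
        return ci, cj

    def match_or_add(self, pose, descriptor, sim_thresh=0.9):
        """Try to match the frame against nodes stored in the active cell
        (a loop-closure candidate); otherwise store it as a new node."""
        cell = self._cell(pose)
        for node in self.nodes.get(cell, []):
            sim = float(descriptor @ node.descriptor /
                        (np.linalg.norm(descriptor) * np.linalg.norm(node.descriptor) + 1e-9))
            if sim > sim_thresh:
                return node                    # reuse the existing node
        node = MapNode(pose=pose.copy(), descriptor=descriptor)
        self.nodes.setdefault(cell, []).append(node)
        return node


# Usage with synthetic odometry and random frame descriptors, standing in for
# the RGB-D feature-extraction front end described in the abstract.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grid = PoseCellGrid()
    pose = np.zeros(3)
    for step in range(100):
        pose[:2] += rng.normal(0.05, 0.01, size=2)  # stand-in for RGB-D odometry
        descriptor = rng.random(32)                 # stand-in for a frame descriptor
        grid.inject(pose)
        grid.match_or_add(pose, descriptor)
    print("cells with stored nodes:", len(grid.nodes))

    In this sketch the firing field only biases which cell is considered active; the thesis presumably couples it to odometry correction and scan matching as well, which is beyond what can be inferred from the abstract alone.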
    Date
    2016
    Author
    Tubman, Robert Tsunemichi
    Rights
    The Author
    Publisher
    Massey University
    URI
    http://hdl.handle.net/10179/11619
    Collections
    • Theses and Dissertations
