Advanced Mobile Robotics: Final Project
Jizhong Xiao May 2019
1 Introduction of Simultaneous Localization and Mapping
Simultaneous Localization and Mapping (SLAM) is the process of constructing and updating a map of an unexplored environment while simultaneously tracking the robot's own location within it. For example, given an autonomous ground vehicle (AGV) equipped with a Lidar and a camera, the AGV uses these sensors for perception and pose tracking, e.g. the Lidar odometry approach http://www.roboticsproceedings.org/rss10/p07.pdf and the visual SLAM approach https://arxiv.org/pdf/1610.06475.pdf. Meanwhile, the Lidar and camera can provide dense spatial and color registration to build a dense 3D map. The map is an occupancy representation of the visible objects in the field of view.
SLAM differs from visual odometry (VO) in the following aspects:
• SLAM also performs mapping to generate a map which can be used for navigation and obstacle avoidance
• Conventionally, VO estimates the pose using only the previous associated frames, whereas SLAM can update the pose using a constant-motion assumption together with graph optimization.
• VO cannot correct pose drift, whereas SLAM performs loop-closure detection and graph optimization to do so.
Please read the following papers for further reference: 1) https://arxiv.org/pdf/1606.05830.pdf; 2) https://hal.archives-ouvertes.fr/hal-01615897/file/2017-simultaneous_localization_and_mapping_a_survey_of_current_trends_in_autonomous_driving.pdf.
2 Project Description
For the final project, you are required to run ORB-SLAM https://github.com/raulmur/ORB_SLAM2, which is a multi-purpose SLAM architecture. ORB-SLAM supports monocular, stereo, and RGB-D cameras, and allows you to visualize the tracking process.
You are required to run a heavily modified version, listed below, on an example dataset. The goal of the final project is to give you a taste of visual SLAM; we also encourage you to develop more interesting functions on top of the example that we provide.
2.1 System Requirements
In order to compile and run the code, you need a computer running Linux. A dual-boot setup or a virtual machine also works for this project.
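As a quick sanity check before installing dependencies, you can confirm your distribution and architecture (a hedged example: lsb_release ships with Ubuntu, and /etc/os-release exists on most modern Linux distributions):

```shell
# Print the distribution name/version and the machine architecture.
# lsb_release is Ubuntu-specific; /etc/os-release is the common fallback.
lsb_release -a 2>/dev/null || cat /etc/os-release
uname -m
```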
Before compiling and running the code, you need to install the following libraries:
1 – OpenCV: OpenCV is a computer vision and image processing library; please follow these steps to install it:

git clone https://github.com/opencv/opencv.git
cd opencv
mkdir build
cd build
cmake ..
make -j4
sudo make install
You may have to install git first; please follow this link: https://www.liquidweb.com/kb/install-git-ubuntu-16-04-lts/.
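On Ubuntu, git can also be installed directly from the package archive (a minimal sketch of the steps in the linked guide; it requires sudo and network access, and only installs when git is missing):

```shell
# Install git only when it is not already on the PATH (Debian/Ubuntu).
if ! command -v git >/dev/null 2>&1; then
    sudo apt-get update
    sudo apt-get install -y git
fi
git --version
```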
2 – Pangolin: ORB-SLAM requires Pangolin to visualize the trajectory as well as the map; please follow these steps to install it:

sudo apt-get install libglew-dev
sudo apt-get install cmake
sudo apt-get install libpython2.7-dev
sudo apt-get install ffmpeg libavcodec-dev libavutil-dev libavformat-dev libswscale-dev libavdevice-dev
git clone https://github.com/stevenlovegrove/Pangolin.git
cd Pangolin
mkdir build
cd build
cmake ..
cmake --build .
sudo make install
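The OpenCV and Pangolin builds follow the same out-of-source CMake pattern, which can be captured in a small helper (the function name below is our own, not part of either project, and `sudo make install` will still prompt for a password):

```shell
# Generic configure/build/install for a cloned CMake project, as used above.
build_cmake_project() {
    local src_dir="$1"
    mkdir -p "$src_dir/build" &&
    cd "$src_dir/build" &&
    cmake .. &&
    cmake --build . &&
    sudo make install
    cd - >/dev/null
}
# Usage, after cloning: build_cmake_project Pangolin
```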
2.2 Code and Dataset
Code: A heavily modified ORB-SLAM is provided; please download the code through the following link:
https://www.dropbox.com/s/ydmgzj0elpcy3v3/RGBD_SLAM_PCL_MAP.tar.gz?dl=0
After you download the code, unpack the archive first, then enter the folder and execute:
sh build.sh
to compile the project. You can also download the code from the following link:
git clone https://github.com/EricLYang/ORB-SLAM2-Example.git
and the compilation is the same as above. If you have any problem, please report issues at
https://github.com/EricLYang/ORB-SLAM2-Example/issues
Dataset: In order to run the code, a simple dataset is provided; please download it through the following link: https://www.dropbox.com/s/vy8hrolf9xgn914/2019-04-23-23-34-38.tar.gz?dl=0
Unpack the archive first, and then copy the path of the folder where it is located. For example, if you put the dataset under /home/yourname/, then copy this path.
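The unpacking step can be scripted as follows (DATASET_ROOT is our own placeholder for your home folder; the directory name comes from the archive above):

```shell
# Unpack the dataset archive (if present) and record its absolute path.
DATASET_ROOT="$HOME"
ARCHIVE="2019-04-23-23-34-38.tar.gz"
if [ -f "$ARCHIVE" ]; then
    tar -xzf "$ARCHIVE" -C "$DATASET_ROOT"
fi
DATASET_PATH="$DATASET_ROOT/2019-04-23-23-34-38"
echo "Dataset path: $DATASET_PATH"
```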
Run the code: In order to run the code, please go to the ORB-SLAM2 home folder, and execute:
./RGBD_excutable/rgbd_tum Vocabulary/ORBvoc.bin Setting/SETTING.yaml /path/where/you/put/the/dataset /path/where/is/the/dataset/2019-04-23-23-34-38.txt /path/of/recorded/map(do/not/change) false
Please refer to RGBD_SLAM_run.sh.
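The long command line can also be assembled in a small script of your own, similar in spirit to RGBD_SLAM_run.sh (all paths below are placeholders you must edit; the executable name and argument order are exactly those shown above):

```shell
# Build the rgbd_tum command line from placeholder paths, then print it for
# inspection. Run it from the ORB-SLAM2 home folder with: eval "$CMD"
# Note: the recorded-map path argument should not be changed.
DATASET="/path/where/you/put/the/dataset"
CMD="./RGBD_excutable/rgbd_tum Vocabulary/ORBvoc.bin Setting/SETTING.yaml \
$DATASET $DATASET/2019-04-23-23-34-38.txt /path/of/recorded/map false"
echo "$CMD"
```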