
Robot localization with GTSAM

Project for course "Artificial Intelligence in Robotics"

Table of Contents

1. About the project
2. Installing the dependencies
3. Running the simulation
4. Running the experiment
5. Results
6. Possible development
References

1. About the project

Goal of the project

The goal of the project was to prepare a state estimator that uses graph optimization for sensor fusion. To accomplish that, the GTSAM library [1] was chosen.

Sensors

For the project we chose to simulate a TurtleBot3 Waffle in the Gazebo simulator. The three sensors used to create the factor graph were:

  • Odometry,
  • IMU,
  • LiDAR.

From odometry, the position and orientation were available directly. For the IMU, it was enough to integrate the linear acceleration twice to get the position and to integrate the angular velocity once to get the orientation. The LiDAR returns distances to obstacles around the robot, and those values were used to create a point cloud. The ICP (Iterative Closest Point) algorithm [2] takes two consecutive point clouds and finds the rotation matrix and translation vector between them (we chose the Open3D [3] implementation of the algorithm).
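
To illustrate the idea, here is a minimal sketch of pairwise ICP registration with Open3D; the function name and the correspondence threshold are our assumptions, not the project's actual code.

import numpy as np
import open3d as o3d

def relative_transform(prev_points, curr_points, max_corr_dist=0.3):
    # prev_points, curr_points: Nx3 NumPy arrays built from two consecutive LiDAR scans
    prev_pc = o3d.geometry.PointCloud()
    prev_pc.points = o3d.utility.Vector3dVector(prev_points)
    curr_pc = o3d.geometry.PointCloud()
    curr_pc.points = o3d.utility.Vector3dVector(curr_points)
    # Point-to-point ICP with the identity as the initial guess;
    # max_corr_dist (in meters) limits which point pairs get matched
    result = o3d.pipelines.registration.registration_icp(
        curr_pc, prev_pc, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    # 4x4 homogeneous matrix containing the rotation and the translation
    return result.transformation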

Factor graph

The factor graph, built with all of the sensors from the previous subsection, is shown in the picture below.

Factor graph
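
For readers unfamiliar with GTSAM, below is a minimal sketch (not the project's actual estimator) of how measurements from several sources become factors in one graph; the noise sigmas and measurement values are made up for illustration.

import numpy as np
import gtsam

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# Noise models: sigmas for (x, y, theta); the values are illustrative only
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
icp_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2, 0.2, 0.1]))

# Anchor the first pose with a prior factor
graph.add(gtsam.PriorFactorPose2(0, gtsam.Pose2(0.0, 0.0, 0.0), prior_noise))
initial.insert(0, gtsam.Pose2(0.0, 0.0, 0.0))

# Two between-factors for the same pair of consecutive poses:
# one from odometry and one from the ICP result
graph.add(gtsam.BetweenFactorPose2(0, 1, gtsam.Pose2(0.5, 0.0, 0.1), odom_noise))
graph.add(gtsam.BetweenFactorPose2(0, 1, gtsam.Pose2(0.48, 0.01, 0.09), icp_noise))
initial.insert(1, gtsam.Pose2(0.5, 0.0, 0.1))

# Optimize the graph and read back the fused pose estimate
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose2(1))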

2. Installing the dependencies

Create a workspace

mkdir ~/turtlebot_localization_ws && cd ~/turtlebot_localization_ws
git clone https://github.com/mmcza/Robot-localization-with-GTSAM

a.) Using Ubuntu 22.04 and ROS2 Humble (without Docker)

Install the following packages

sudo apt-get install -y ros-humble-turtlebot3-gazebo ros-humble-nav2-bringup ros-humble-navigation2 ros-humble-gtsam python3-pip
pip install open3d gtsam numpy==1.24.4

b.) With Docker

Building the container

cd Robot-localization-with-GTSAM && docker build -t robot_localization .

Starting the container

bash start_container.sh

Opening more terminals in the container

docker exec -ti robot_localization bash

Note

The robot_localization_ws directory is shared between the host and the container. As a result, files inside it might require sudo privileges to save any changes.

Note

The Dockerfile and the script for running the container are based on Rafał Staszak's repository

3. Running the simulation

Workspace preparation

To make the project work, we need to prepare the Gazebo simulator properly:

source /opt/ros/humble/setup.bash
export TURTLEBOT3_MODEL=waffle
export GAZEBO_MODEL_PATH=$GAZEBO_MODEL_PATH:/opt/ros/humble/share/turtlebot3_gazebo/models
source /usr/share/gazebo/setup.bash

Build the package

Inside your workspace (NOT inside your src directory):

colcon build --symlink-install
source install/setup.bash

Start the simulation

In the first terminal, start the estimator:

ros2 run robot_localization estimator

In the second terminal, start the simulator:

ros2 launch nav2_bringup tb3_simulation_launch.py headless:=False

4. Running the experiment

After launching the estimator and the simulator, you have to set the initial position estimate (by clicking the corresponding button in RViz and placing the arrow on the map) and add a goal (so the robot starts to move). You can also visualize the estimated state of the robot.

Instructions on how to start the experiment

The position and orientation of the robot (from the estimator, the ground truth from Gazebo, and odometry) are also saved to a trajectory.csv file, so it is possible to display them and check the results after the experiment has ended.

State estimation
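
As a sketch of how the saved data could be displayed, assuming hypothetical column names in trajectory.csv (the real file may use different headers):

import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("trajectory.csv")
# The column names below are assumptions for illustration
for prefix, label in [("gt", "Ground truth"), ("odom", "Odometry"), ("est", "Estimator")]:
    plt.plot(df[f"{prefix}_x"], df[f"{prefix}_y"], label=label)
plt.xlabel("x [m]")
plt.ylabel("y [m]")
plt.legend()
plt.axis("equal")
plt.show()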

5. Results

Because the odometry was returning essentially the same results as the real position of the robot, additional noise with a normal distribution ($\mu = 0,\ \sigma = 0.1$) was added to the robot's position. The noisy position was then used as the reference to be optimized.
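
A short sketch of such noise injection with NumPy (the variable names are ours):

import numpy as np

true_xy = np.array([1.0, 2.0])  # example ground-truth position
# Zero-mean Gaussian noise with sigma = 0.1, as described above
noisy_xy = true_xy + np.random.normal(loc=0.0, scale=0.1, size=true_xy.shape)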

A comparison of the odometry error (with noise) and the error from our estimator can be seen in the plot below.

Error comparison

The table below shows the statistics for both errors (additionally after removing the first measurement, as it was completely different for the estimator), calculated using the Pandas describe() method.

| Parameter | Estimator | Odometry | Estimator >0s | Odometry >0s |
| --- | --- | --- | --- | --- |
| count | 736 | 736 | 735 | 735 |
| mean | 0.099232 | 0.104481 | 0.096562 | 0.104488 |
| std | 0.088873 | 0.056087 | 0.051536 | 0.056125 |
| min | 0.004427 | 0.002932 | 0.004427 | 0.002932 |
| 25% | 0.059396 | 0.062254 | 0.059389 | 0.062188 |
| 50% | 0.086959 | 0.095967 | 0.086887 | 0.095914 |
| 75% | 0.122585 | 0.135575 | 0.122386 | 0.135611 |
| max | 2.061525 | 0.304993 | 0.314617 | 0.304993 |
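
For reference, a sketch of how such a table can be produced, again assuming hypothetical column names in trajectory.csv:

import numpy as np
import pandas as pd

df = pd.read_csv("trajectory.csv")
# Euclidean position errors; the column names are assumptions for illustration
est_err = np.hypot(df["est_x"] - df["gt_x"], df["est_y"] - df["gt_y"])
odom_err = np.hypot(df["odom_x"] - df["gt_x"], df["odom_y"] - df["gt_y"])
errors = pd.DataFrame({"Estimator": est_err, "Odometry": odom_err})
print(errors.describe())           # full run
print(errors.iloc[1:].describe())  # with the first measurement removed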

The estimator without the initial measurement has the lowest values for all parameters apart from its outliers, min and max (which is interesting, because both the 25% and the 75% quantiles are the lowest). The mean odometry error is ~8% larger than the mean estimator error.

Based on the data from the table, it is noticeable that the estimator does its job and reduces the noise.

The plot below illustrates the comparison of the odometry, estimated, and real trajectories collected from the dataset.

Trajectory comparison

6. Possible development

Due to some issues with sampling the real position of the robot, there is room for improvement. One option would be to add factors to the graph at a constant, equal interval. Another possible solution would be to optimize the graph less frequently (for example, only once every 5 or 10 poses have been added to the graph as initial estimates), as in the sketch below.
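
A sketch of the second idea, optimizing only every N-th pose (the batch size, function name, and loop structure are our assumptions):

import gtsam

OPTIMIZE_EVERY = 10  # assumed batch size, e.g. 5 or 10 poses

def maybe_optimize(graph, initial, pose_count):
    # graph: gtsam.NonlinearFactorGraph, initial: gtsam.Values built so far.
    # Run the relatively expensive batch optimization only every N-th pose;
    # in between, the latest initial estimate can be used as the output.
    if pose_count % OPTIMIZE_EVERY == 0:
        return gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
    return initial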

References

[1] Dellaert, F. and GTSAM Contributors. borglab/gtsam, v4.2a8. Georgia Tech Borg Lab (2022). https://github.com/borglab/gtsam, https://doi.org/10.5281/zenodo.5794541

[2] Zhang, Z. Iterative point matching for registration of free-form curves and surfaces. Int J Comput Vision 13, 119–152 (1994). https://doi.org/10.1007/BF01427149

[3] Zhou, Q., Park, J., and Koltun, V. Open3D: A Modern Library for 3D Data Processing. arXiv (2018). https://doi.org/10.48550/arXiv.1801.09847
