In this session, we work with a simulated mobile robot to integrate sensors such as lidar and an Inertial Measurement Unit (IMU). At the end of this session you'll have a simulated mobile robot driving around.
Using 3D simulators such as Gazebo, we can attach simulated sensors to simulated robots to visualise their sensor data and evaluate different robot algorithms and behaviours.
To complete this session you will need to have the following packages installed:
sudo apt install liburdfdom-tools
sudo apt install ros-$ROS_DISTRO-tf2-tools
sudo apt install ros-$ROS_DISTRO-husky-simulator
sudo apt install ros-$ROS_DISTRO-husky-viz
Notes on using $ROS_DISTRO
- Using the $ROS_DISTRO environment variable will ensure that you install the correct packages for your ROS installation, i.e. running the command:
  echo $ROS_DISTRO
  should return noetic for a Noetic installation.
- Note: If nothing is returned, you may not have "sourced" your ROS environment correctly (e.g. by running source /opt/ros/noetic/setup.bash). For a correctly sourced installation, running the command:
  printenv | grep ROS
  should return a list of ROS-prefixed environment variables.
The Unified Robot Description Format (URDF) is an XML format that describes a robot's hardware, including its chassis, linkages, joints, sensor placement, etc. Take a look at NASA's Robonaut to see what's possible with robot URDFs.
In the first part of this session, we'll build a simple mobile robot model in URDF; we'll then work up to controlling a simulated robot's movement.
In this section, we'll work with the robot_description package in this repository. Please go through the URDF file robot.urdf in its urdf directory, which describes our robot.
Check whether the model is complete and list its components using the commands below.
cd src/robot_description/urdf
check_urdf robot.urdf
The following output will be shown.
robot name is: robot1
---------- Successfully Parsed XML ---------------
root Link: base_link has 3 child(ren)
child(1): wheel_1
child(2): wheel_2
child(3): wheel_3
To launch a simulation of a URDF-modelled robot, we will need to create a launch file:

- Create a ROS launch file named display.launch in src/robot_description/launch.
- Populate it with the following content:

<?xml version="1.0"?>
<launch>
  <arg name="model" />
  <arg name="gui" default="False" />
  <param name="robot_description" command="$(find xacro)/xacro --inorder $(arg model)" />
  <param name="use_gui" value="$(arg gui)"/>
  <node name="joint_state_publisher" pkg="joint_state_publisher" type="joint_state_publisher" />
  <node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" />
  <node name="rviz" pkg="rviz" type="rviz" args="-d $(find urdf_tutorial)/urdf.rviz" />
</launch>

Note: The following packages are used in this launch file and were installed when you installed the desktop or desktop-full version of ROS.

- joint_state_publisher: https://wiki.ros.org/joint_state_publisher
- robot_state_publisher: https://wiki.ros.org/robot_state_publisher
- urdf_tutorial: https://wiki.ros.org/urdf_tutorial
We'll come back to the joint_state_publisher and robot_state_publisher nodes soon. For now, let's launch the simulation by doing the following:
- Build the robot_description package so it is registered in your catkin workspace. From the sensor-integration directory, run:
  catkin build robot_description
- Source the development workspace, to ensure the robot_description package is discoverable in your shell environment (e.g. callable with the rospack command):
  source devel/setup.bash
- Launch the modelled robot using the display.launch and robot.urdf files that we just created:
  roslaunch robot_description display.launch model:=`rospack find robot_description`/urdf/robot.urdf use_gui:=true
- An RViz window should launch. Make the following changes:
  - Set "Fixed Frame" to base_link. (Note: RViz sets this to map by default.)
  - Add a RobotModel display via the "Add" button.
If successful, you should see output similar to the following image.
We briefly break down the parts of robot.urdf in this section.
- <link> element: Describes a rigid body with an inertia, visual features, and collision properties.
  - In the snippet below, we describe the base of the robot, base_link, which has a box geometry with visual attributes such as its color.

<link name="base_link">
  <visual>
    <geometry>
      <box size="0.2 .3 .1"/>
    </geometry>
    <origin rpy="0 0 0" xyz="0 0 0.05"/>
    <material name="white">
      <color rgba="1 1 1 1"/>
    </material>
  </visual>
</link>
- <joint> element: Describes the kinematics and dynamics of a joint between two links, along with its safety limits.
  - In the snippet below, the joint description provides details on how the base_link and wheel_1 links are connected.
  - Take note of the joint type fixed, which means wheel_1 has a fixed connection with base_link, with all degrees of freedom locked.

<joint name="base_to_wheel1" type="fixed">
  <parent link="base_link"/>
  <child link="wheel_1"/>
  <origin xyz="0 0 0"/>
</joint>
For more information on the XML tags of a URDF file, please refer to the URDF XML specification linked in the References below.
You might have noticed a missing component in the robot model shown in RViz. Complete the model in the URDF file to add the missing component and relaunch display.launch using the steps in the previous section (Launch the robot simulation).
Look for the solution below in case you can't complete the model.
Solution: Missing URDF components
<link> "wheel_4"
to add a fourth wheel to the model.
<link name="wheel_4">
<visual>
<geometry>
<cylinder length="0.05" radius="0.05"/>
</geometry>
<origin rpy="0 1.5 0" xyz="-0.1 -0.1 0"/>
<material name="black"/>
</visual>
</link>
<joint> "base_to_wheel4"
to connect wheel_4
to the robot's base_link
.
<joint name="base_to_wheel4" type="fixed">
<parent link="base_link"/>
<child link="wheel_4"/>
<origin xyz="0 0 0"/>
</joint>
We finish our (box) robot description by adding <collision>
and <inertial>
properties
to its links. These properties are required when we run the simulation in an
environment supporting a physics engine, e.g. Gazebo.
A complete description of the robot including these properties is shown in robot1.urdf in the src/robot_description/urdf directory.
Notes:
- Notice the <collision> and <inertial> tags for each <link> element in robot1.urdf.
- Take note of robot1.xacro in the urdf/ directory. .xacro files provide an efficient way to reduce the size and complexity of URDF files by using constants, simple math expressions, and macros.
For more details on collision and inertial properties, go through the tutorial "Adding Physical and Collision Properties to a URDF Model" linked in the References below.
Earlier we used two packages in our launch file to spin up our modelled robot: joint_state_publisher and robot_state_publisher.
A few notes on these packages:
- joint_state_publisher: Publishes a robot's joint state information (position and velocity) for the joints read from its URDF file. This node publishes to the /joint_states topic.
- robot_state_publisher: Broadcasts the state of the robot to the TF transform library. It listens on the /joint_states topic and continuously publishes the relative transforms between the robot's links on TF, using its internal kinematic model that tracks the links with respect to one another. A minimal example of looking up these transforms is shown after this list.
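To get a feel for what robot_state_publisher provides, you can look up one of these transforms with tf2 from a Python node. The following is only a minimal sketch, assuming the display.launch setup above is running so that the base_link and wheel_1 frames from robot.urdf are being broadcast:

#!/usr/bin/env python3
# Minimal sketch: query a transform that robot_state_publisher broadcasts on TF.
# Assumes display.launch (above) is running and the frames base_link and wheel_1 exist.
import rospy
import tf2_ros

rospy.init_node("tf_lookup_example")

tf_buffer = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(tf_buffer)  # subscribes to /tf and /tf_static

rate = rospy.Rate(1.0)
while not rospy.is_shutdown():
    try:
        # Latest available transform from base_link to wheel_1
        t = tf_buffer.lookup_transform("base_link", "wheel_1", rospy.Time(0))
        rospy.loginfo("wheel_1 relative to base_link: x=%.2f y=%.2f z=%.2f",
                      t.transform.translation.x,
                      t.transform.translation.y,
                      t.transform.translation.z)
    except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
            tf2_ros.ExtrapolationException):
        rospy.logwarn_throttle(5.0, "Transform not available yet")
    rate.sleep()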
In this section, we'll work with a simulated model of a Clearpath Husky robot. We will first launch it in Gazebo, which provides the environment that the robot will interact with, and then we'll control its movements.
Important Note: Gazebo downloads and caches its 3D assets from the internet in the background. If you haven't launched a Gazebo world before, it may appear to hang for ten minutes or more while the assets are downloaded. The download servers are slow and often fail. To pre-download the assets, visit this repo and download the assets (Hint: Use "Download ZIP"). To install, unzip the files into ~/.gazebo/models/.
To launch a simulated world and robot, run these commands at the same time in separate terminals:
🐢 Note if using Noetic: "husky_empty_world.launch" is "empty_world.launch"
# Launches Husky with a SICK LMS1XX lidar and an IMU, in an empty world within Gazebo.
export HUSKY_LMS1XX_ENABLED=1
roslaunch husky_gazebo husky_empty_world.launch
# Launches the RViz window showing Husky model and the measurements received by its lidar sensor.
roslaunch husky_viz view_robot.launch
Gazebo and RViz windows should appear similar to these:
Exercises:
- Add objects (e.g. a box) in Gazebo and view the resulting lidar scan lines in RViz.
- Try a Velodyne VLP-16 lidar.
  - See the following documentation: Customize Husky Configuration
  - Note: The VLP-16 will publish a PointCloud2 topic. Is your RViz subscribed to the correct topic?
- Try loading the Husky in a more complex world:
  # Look to the "important note" above if this doesn't load
  roslaunch husky_gazebo husky_playpen.launch
It is (always) recommended to view the TF tree and the map of topics that are active. Make use of the tf2_tools (or rqt_tf_tree) and rqt_graph packages' commands to view this information.
Solution: Viewing the TF tree
Either
- Run the view_frames node:
  rosrun tf2_tools view_frames.py
- and view the generated TF tree:
  evince frames.pdf
Or, run the rqt_tf_tree package:
  rosrun rqt_tf_tree rqt_tf_tree
Solution: Viewing the ROS node graph
Run the rqt_graph package to view the nodes and topics:
rosrun rqt_graph rqt_graph
From the above exercise on viewing the topic graph, you might have noticed that the Husky simulation has a /husky_velocity_controller/cmd_vel topic, published by the /twist_mux node and subscribed to by the gazebo node.
We will publish geometry_msgs/Twist messages on this topic to make Husky move.
rostopic pub -r 10 /husky_velocity_controller/cmd_vel geometry_msgs/Twist '{linear: {x: 0.1, y: 0.0, z: 0.0}, angular: {x: 0.0,y: 0.0,z: 0.0}}'
In this exercise you'll write a ROS node that commands the simulated Husky to move in a circular path. Make use of the husky_controller package in this repository.
- The ROS node will need to publish messages to the /husky_velocity_controller/cmd_vel topic. Use either circle_driver.cpp or circle_driver_python.py to get started with this task.
- Note: Make sure to compile, and to source devel/setup.bash, for your node to be included in the path.
Solution: Making Husky move in circles
A sample way to achieve this task is shown in circle_driver_solution.md; a minimal sketch is also shown below.
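As a rough reference, here is a minimal rospy sketch of one possible approach (not necessarily identical to circle_driver_solution.md): publishing a constant forward velocity together with a constant yaw rate produces a circular path.

#!/usr/bin/env python3
# Minimal sketch: drive Husky in a circle by publishing a constant Twist.
# A constant linear.x combined with a constant angular.z gives a circle of radius linear.x / angular.z.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node("circle_driver_example")
pub = rospy.Publisher("/husky_velocity_controller/cmd_vel", Twist, queue_size=10)

cmd = Twist()
cmd.linear.x = 0.5   # forward speed in m/s
cmd.angular.z = 0.5  # yaw rate in rad/s -> radius = 0.5 / 0.5 = 1 m

rate = rospy.Rate(10)  # keep publishing at 10 Hz so the velocity controller keeps receiving commands
while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()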
- Add an object in Gazebo, like a box, in front of Husky, so that Husky stops when it is less than 5 metres away from the box.
  - Make use of the logic in circle_driver above that publishes commands to Husky's /husky_velocity_controller/cmd_vel topic.
  - Write a ROS node that subscribes to lidar scans, observes the distance to the object, and sends commands similar to the circle_driver node. (A rough sketch follows this list.)
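Below is a rough sketch of such a node. It assumes the Husky's lidar publishes sensor_msgs/LaserScan messages on the /scan topic; verify the actual topic name with rostopic list, as it may differ in your setup.

#!/usr/bin/env python3
# Rough sketch: drive forward, stop when the lidar sees an obstacle closer than 5 m.
# Assumes LaserScan messages on /scan; check the topic name with `rostopic list`.
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

STOP_DISTANCE = 5.0  # metres

class ObstacleStopper(object):
    def __init__(self):
        self.pub = rospy.Publisher("/husky_velocity_controller/cmd_vel", Twist, queue_size=10)
        self.sub = rospy.Subscriber("/scan", LaserScan, self.scan_callback)

    def scan_callback(self, scan):
        # Keep only valid returns, then take the closest remaining range.
        valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
        closest = min(valid) if valid else float("inf")

        cmd = Twist()
        if closest > STOP_DISTANCE:
            cmd.linear.x = 0.5  # keep driving forward
        # else: leave cmd at zero to stop the robot
        self.pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("obstacle_stopper_example")
    ObstacleStopper()
    rospy.spin()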
- Find a way to integrate a Kinect or RealSense camera onto the simulated Husky. Going through its source might give you a few pointers. The Kinect camera may be deprecated in husky_gazebo; if so, use the RealSense.
- URDF XML Specification: http://wiki.ros.org/urdf/XML
- Collision and Inertial properties: http://wiki.ros.org/urdf/Tutorials/Adding%20Physical%20and%20Collision%20Properties%20to%20a%20URDF%20Model
- Xacro file format: http://wiki.ros.org/urdf/Tutorials/Using%20Xacro%20to%20Clean%20Up%20a%20URDF%20File