# Hands-On Exercises
Theory is important, but robotics is a hands-on discipline. In this section, we will apply the concepts we've learned to build and interact with our digital twin. These exercises require working installations of ROS 2, Gazebo, and the `gazebo_ros_pkgs` packages.
## Exercise 1: Spawn a Humanoid from URDF into Gazebo
Objective: To understand the process of taking a robot description file and bringing the robot to life in the Gazebo simulator.
You will need:
- A sample URDF file for a simple humanoid robot. For this exercise, you can use a pre-existing model like the iRobot Create URDF or a simple custom biped model. (A sample file `simple_biped.urdf` should be provided alongside this course material.)
- A Gazebo world file. An empty world is sufficient.
Steps:
1. Create a Launch File: In your ROS 2 package, create a Python launch file (e.g., `spawn_humanoid.launch.py`). This file will automate the process of starting Gazebo and spawning the robot.

2. Include the Gazebo Executable: Your launch file should first start the Gazebo server (`gzserver`) and client (`gzclient`). You can use the `ExecuteProcess` action from `launch.actions` to do this.

   ```python
   # Inside your launch file
   gazebo = ExecuteProcess(
       cmd=['gazebo', '--verbose',
            '-s', 'libgazebo_ros_init.so',
            '-s', 'libgazebo_ros_factory.so',
            world_path],
       output='screen'
   )
   ```

3. Read the URDF File: Your launch file needs to locate and read the contents of your `simple_biped.urdf` file.

4. Define the Spawner Node: The key to spawning the robot is the `spawn_entity.py` script provided by `gazebo_ros`. Create a `Node` action in your launch file that runs this script.

   ```python
   # The spawner node
   spawn_entity = Node(
       package='gazebo_ros',
       executable='spawn_entity.py',
       arguments=[
           '-topic', 'robot_description',
           '-entity', 'simple_biped',
           '-x', '0.0',
           '-y', '0.0',
           '-z', '1.0'
       ],
       output='screen'
   )
   ```

   This node tells Gazebo to listen to the `/robot_description` topic for the robot's model and spawn an entity named `simple_biped` at a height of 1.0 meter.

5. Publish the Robot Description: You need a `robot_state_publisher` node that reads the URDF and publishes it to the `/robot_description` topic.

6. Launch and Observe: Run your launch file with `ros2 launch <your_package> spawn_humanoid.launch.py`. You should see Gazebo open, and after a moment your humanoid model will appear and likely fall over due to gravity, demonstrating that the physics engine is active.
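The steps above can be assembled into one file. The sketch below is one plausible layout, not the only correct one: the package name `my_biped_description` and the `urdf/` subdirectory are assumptions for illustration, and the file only runs on a machine with ROS 2 and Gazebo Classic installed, so treat it as a configuration sketch.

```python
# spawn_humanoid.launch.py -- a sketch assembling the steps above.
# Assumes a (hypothetical) package 'my_biped_description' that installs
# simple_biped.urdf into its share directory; adjust to your own layout.
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import ExecuteProcess
from launch_ros.actions import Node


def generate_launch_description():
    pkg_share = get_package_share_directory('my_biped_description')
    urdf_path = os.path.join(pkg_share, 'urdf', 'simple_biped.urdf')

    # Read the URDF so it can be handed to robot_state_publisher.
    with open(urdf_path, 'r') as f:
        robot_description = f.read()

    # Start Gazebo with the ROS init and factory plugins loaded.
    gazebo = ExecuteProcess(
        cmd=['gazebo', '--verbose',
             '-s', 'libgazebo_ros_init.so',
             '-s', 'libgazebo_ros_factory.so'],
        output='screen'
    )

    # Publish the URDF on the /robot_description topic.
    state_publisher = Node(
        package='robot_state_publisher',
        executable='robot_state_publisher',
        parameters=[{'robot_description': robot_description}],
        output='screen'
    )

    # Ask Gazebo to spawn the model it sees on /robot_description.
    spawn_entity = Node(
        package='gazebo_ros',
        executable='spawn_entity.py',
        arguments=['-topic', 'robot_description',
                   '-entity', 'simple_biped',
                   '-x', '0.0', '-y', '0.0', '-z', '1.0'],
        output='screen'
    )

    return LaunchDescription([gazebo, state_publisher, spawn_entity])
```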
Success Criteria: The humanoid robot model appears in the Gazebo window at the specified coordinates.
## Exercise 2: Simulate a Room with LiDAR + IMU
Objective: To build a simple test environment and simulate sensor data that can be visualized in ROS 2.
Steps:
1. Create a World SDF File: Create a new file named `test_room.world`. In this file, use SDF to define a ground plane and four simple walls to form a box.

   ```xml
   <!-- Abridged example for test_room.world -->
   <sdf version='1.7'>
     <world name='default'>
       <include>
         <uri>model://ground_plane</uri>
       </include>
       <model name='walls'>
         <!-- Define walls using <link> and <collision>/<visual> tags -->
       </model>
     </world>
   </sdf>
   ```

2. Add Sensors to the URDF: Modify your `simple_biped.urdf` file to include plugins for a LiDAR sensor and an IMU sensor. Attach them to relevant links, like the `head` or `torso`.

   ```xml
   <!-- Abridged example for URDF -->
   <gazebo reference="torso_link">
     <sensor name="imu_sensor" type="imu">
       <plugin name="imu_plugin" filename="libgazebo_ros_imu_sensor.so">
         <!-- ROS 2 parameters for topic name, frame, etc. -->
       </plugin>
     </sensor>
   </gazebo>
   <gazebo reference="head_link">
     <sensor name="lidar" type="ray">
       <plugin name="lidar_plugin" filename="libgazebo_ros_ray_sensor.so">
         <!-- ROS 2 parameters for topic name, etc. -->
       </plugin>
       <!-- LiDAR-specific configuration for range, samples, etc. -->
     </sensor>
   </gazebo>
   ```

3. Update the Launch File: Modify your launch file to load your new `test_room.world` instead of an empty world.

4. Launch the Simulation: Run your updated launch file. Your biped should now spawn inside the room you created.

5. Visualize the Data:
   - LiDAR: Open RViz2 (`ros2 run rviz2 rviz2`). Add a `LaserScan` display and set the topic to your LiDAR's topic (e.g., `/scan`). You should see red points indicating where the LiDAR beams are hitting the walls of your room.
   - IMU: In a new terminal, use `ros2 topic echo` to view the raw data from your IMU topic (e.g., `/imu/data`).
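Before launching, you can sanity-check that the sensor blocks actually made it into your URDF. The sketch below is plain XML parsing with Python's standard library, so it needs no ROS installation; the link and sensor names mirror the abridged URDF example above, and the function name is ours, not part of any library.

```python
# Sanity-check that <gazebo> sensor plugins are present in a URDF string.
import xml.etree.ElementTree as ET

def list_gazebo_sensors(urdf_xml):
    """Return (reference_link, sensor_name, sensor_type) for every
    <sensor> declared inside a <gazebo reference="..."> block."""
    root = ET.fromstring(urdf_xml)
    found = []
    for gazebo in root.findall('gazebo'):
        ref = gazebo.get('reference', '')
        for sensor in gazebo.findall('sensor'):
            found.append((ref, sensor.get('name'), sensor.get('type')))
    return found

# A minimal stand-in for simple_biped.urdf with the two sensors attached:
sample = """<robot name="simple_biped">
  <gazebo reference="torso_link">
    <sensor name="imu_sensor" type="imu"/>
  </gazebo>
  <gazebo reference="head_link">
    <sensor name="lidar" type="ray"/>
  </gazebo>
</robot>"""

print(list_gazebo_sensors(sample))
# [('torso_link', 'imu_sensor', 'imu'), ('head_link', 'lidar', 'ray')]
```

In practice you would pass the contents of `simple_biped.urdf` instead of the inline `sample` string; an empty result means the `<gazebo>` blocks were added in the wrong place (they must be direct children of `<robot>`).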
Success Criteria: The robot spawns in the enclosed room, and you can successfully visualize the LiDAR data in RViz2 and echo the IMU data on the console.
## Bonus Exercise: Create a Unity Scene & Link to ROS 2
Objective: To get a feel for the Unity side of the digital twin by creating a visual scene and preparing it for ROS 2 communication.
Steps:
1. Set up Unity Robotics Hub: Create a new Unity project and import the official `Unity-Robotics-Hub` package.

2. Create a Simple Scene: In Unity, create a new scene. Add a "Plane" GameObject to act as the ground. Add a few "Cube" or "Sphere" GameObjects to act as simple props. Apply materials and colors to make the scene visually distinct.

3. Import Robot Model: Import the visual meshes (`.dae` or `.fbx` files) of your humanoid robot into Unity and assemble them into a prefab.

4. Add ROS 2 Connection: Add the `ROSConnection` prefab to your scene and configure it to connect to your ROS 2 network (you will need a bridge running on the ROS side, such as the `ROS-TCP-Endpoint` node that accompanies the Unity Robotics Hub).

5. Next Steps (Conceptual):
   - Think about how you would write a C# script to subscribe to the `/joint_states` topic.
   - This script would need a reference to each of the visual links of your robot model in Unity.
   - In the script's `Update` function, you would parse the `JointState` message and apply the received joint angles to the `transform.rotation` of each corresponding link.
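The conceptual C# script boils down to one operation per joint: turning a joint angle about a joint axis into a rotation for the link's transform. The sketch below shows that math in Python for clarity (Unity's `Quaternion.AngleAxis` performs the same computation, but takes degrees); the function name is ours, invented for illustration.

```python
# Convert a revolute joint angle into a unit quaternion, the rotation a
# Unity script would apply to the corresponding link's transform.
import math

def angle_axis_quaternion(angle_rad, axis):
    """Quaternion (x, y, z, w) for a rotation of angle_rad about axis."""
    x, y, z = axis
    norm = math.sqrt(x * x + y * y + z * z)  # normalize the joint axis
    half = angle_rad / 2.0
    s = math.sin(half) / norm
    return (x * s, y * s, z * s, math.cos(half))

# A 90-degree joint angle about the z-axis:
q = angle_axis_quaternion(math.pi / 2, (0.0, 0.0, 1.0))
print(tuple(round(c, 4) for c in q))
# (0.0, 0.0, 0.7071, 0.7071)
```

Each `JointState` message carries `name` and `position` arrays; the script would look up each joint's angle by name, compute this rotation about the joint's axis, and assign it to the matching link's transform every frame.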
Success Criteria: You have a Unity scene that can successfully connect to your ROS 2 network. When you run the scene, the Unity console shows a "Connected to ROS" message.