Building Cheapest ROS2 Robot using ESP32 — Part 3 — Simulation

8 min readNov 23, 2023

In Part 2 of this article series, we built and tested the various sensors of the 2WD robot and made sure that the wheels, encoders, and IMU sensor are all working properly and that PID control of the motors is able to keep the wheel speed constant. In this article, we will take the next step: uploading the main firmware of Linorobot2_hardware, which has been modified to work with the ESP32.

Uploading the firmware

The steps for doing this are almost identical to what we did during our calibration tests, with the exception of changing the working directory to firmware instead of calibration.

Just a reminder of the prep steps for making sure the robot doesn’t fall off a table if you are testing on one:

  1. Make sure your robot is propped up and the front wheels are not touching the ground
  2. The ESP32 is plugged into your laptop/computer using a USB to micro-USB cable. Also, make sure you have followed the steps in this article to add a udev rules file for the ESP32 so you can flash to ESP32 memory; without it you’ll have to change the write permission every time it is connected.
  3. Battery power for the motors is turned on and the L298N LEDs are lit
  4. Make sure you have installed the command-line version of PlatformIO following these instructions
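For reference, a udev rule for a typical ESP32 dev board might look like the sketch below. The vendor/product IDs shown are assumptions based on the two most common ESP32 USB-UART chips (CP2102 and CH340); check your board’s actual IDs with `lsusb` before copying this.

```
# /etc/udev/rules.d/99-esp32.rules -- IDs below assume CP2102 or CH340; verify with lsusb
SUBSYSTEM=="tty", ATTRS{idVendor}=="10c4", ATTRS{idProduct}=="ea60", MODE="0666"
SUBSYSTEM=="tty", ATTRS{idVendor}=="1a86", ATTRS{idProduct}=="7523", MODE="0666"
```

After saving the file, reload the rules with `sudo udevadm control --reload-rules && sudo udevadm trigger` and reconnect the board.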

The assumption here is that you have also followed the two steps below to get the code downloaded and myrobot_config.h customized to suit your specific robot setup.

  1. Clone the github repo [git clone]
  2. Make sure to update myrobot_config.h file with values appropriate for you

OK, we are ready to upload the robot firmware. Follow these steps:

Run these commands to upload the firmware:

cd ~/linorobot2_hardware
cd firmware
# upload the firmware to the ESP32; make sure the ESP32 is connected to your computer
pio run --target upload -e myrobot

You’ll see the progress percentage as the program is flashed to the ESP32, finally showing 100% complete. At this point the firmware is on the ESP32 and running.

  • At this point you can disconnect the ESP32 board’s USB cable from your laptop and power it with a separate battery, untethered from your laptop. As soon as the ESP32 boots, it will start broadcasting to the micro-ROS agent, which we start next.

We need to start the micro-ROS agent using Docker on our laptop. If you have not installed Docker yet, make sure you follow the instructions here. If you want to read more about how the whole process of running micro-ROS on the ESP32 works, or need help troubleshooting the micro-ROS connection, check out this article. The command below assumes you are using ROS2 Humble.

docker run -it --rm --net=host microros/micro-ros-agent:humble udp4 --port 8888 -v6

If the firmware is already running on your robot and the micro-ROS agent on your laptop is able to connect to the robot over WiFi/UDP, you will see the agent’s connection log output.

In a separate terminal, run the following ROS2 command to make sure the linorobot topics are being published over WiFi from the ESP32:

ros2 topic list

You should be able to see the following topics being broadcast by the ESP32 over WiFi to your laptop/computer.
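For reference, with the stock Linorobot2 firmware the topic list typically looks something like the sample below; the exact names depend on your myrobot_config.h.

```
/battery
/cmd_vel
/imu/data
/odom/unfiltered
/parameter_events
/rosout
```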


You can also launch the Rqt app from the command line on your laptop; the node graph should look like this:

ROS2 topics published from esp32 micro-ROS

As you can see above, the robot is listening on the /cmd_vel topic and publishing all the other topics. We are only concerned with the imu/data and odom topics for now; for the others we don’t yet have hardware such as a battery monitor or ultrasound sensor.

At this point we are ready to move our robot using a cheap generic joystick controller like this:

If you are interested in something more mainstream (and more expensive), you can use the Logitech controller suggested by Linorobot.

I haven’t used the Logitech one yet since the cheaper one has been working great for me for a few years now; I haven’t even changed the batteries yet, as it is very efficient with auto power-off.

Setting up and Testing Joystick Controller

Honestly, I would love to describe the details here again, but @ArticulatedRobo has done such a great job describing them that I would just recommend following his article to get this set up. He also explains in detail how the joystick hardware input is converted first to ROS2 Joy messages and then to Twist messages published on the cmd_vel topic, so that our robot can listen on cmd_vel and move as we move the joystick.

If you remember, during our calibration tests we already drove the robot wheels with a hard-coded cmd_vel message, so we can be confident that if the robot receives cmd_vel messages in real time it will act on them the same way.

The basic idea is that these joystick controllers come with a small USB dongle that goes into your laptop’s USB port, after which you can run all the hardware tests using jstest or jstest-gtk.

On some controllers, after you have turned the power switch on, you may have to press the home button or hold the X button down while moving the joystick to make sure Joy messages are being transmitted.

If you don’t want to shell out for a joystick controller, ROS2 provides teleop_twist_keyboard, which you can use to move your robot from your laptop keyboard. Another option is an off-the-shelf Android or iOS app like the Dabble app. However, going deep into that app is beyond the scope of this article, so I’ll leave you to do your own investigation or follow one of the other methods recommended in @ArticulatedRobo’s article or YouTube video.
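If you go the keyboard route, the standard invocation is the one below (assuming the teleop_twist_keyboard package is installed on your laptop); run it in a terminal that keeps keyboard focus, and the key bindings are printed on startup.

```
## drive the robot from the keyboard; publishes Twist messages on /cmd_vel
ros2 run teleop_twist_keyboard teleop_twist_keyboard
```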

You can also use other software apps like foxglove studio teleop panel or rqt_robot_steering [comes installed with ROS2 Humble desktop packages] to send the cmd_vel directly to the robot using the UI for those apps.

The final test that your joystick is working should look like this:

## start the teleop_twist_joy node in ROS2 [this will automatically start the joy node as well]
ros2 launch teleop_twist_joy teleop-launch.py joy_config:='xbox'

## in separate terminal window monitor the cmd_vel topic to see messages as we move joystick
ros2 topic echo /cmd_vel

### in a third terminal window launch the Rqt node graph
rqt_graph

If everything went well, you’ll see a node graph like this in the Rqt node graph window, and in the second terminal where we are monitoring the /cmd_vel topic you’ll see the linear and angular values changing as you move the joystick.
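For reference, each message echoed on /cmd_vel is a geometry_msgs/Twist; pushing the left stick forward might print something like the sample below (the actual values depend on your stick position and scale settings).

```
linear:
  x: 0.5
  y: 0.0
  z: 0.0
angular:
  x: 0.0
  y: 0.0
  z: 0.0
---
```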

Node graph for joy and cmd_vel topics


At this point we have tested the robot in standalone mode with its wheels and sensors, and we have also tested the joystick controller. Now it’s time to put everything together and drive the robot. To do that, we are going to follow these main steps:

  1. Download and test the Linorobot2 ROS2 packages on the laptop and make sure we can run the out-of-the-box Linorobot2 simulations
  2. Create a modified launch file for our custom robot so we can launch all the necessary nodes, including the micro-ROS Docker agent and the teleop twist nodes, from a single launch file.
  3. Test the robot in Rviz2 and Gazebo simulations to make sure things work properly before we run the real robot.
  4. Run everything together in the next section: the real/physical ESP32 robot and the Linorobot2 packages, moving the robot around using the joystick.
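As a preview of the combined launch file in step 2, a ROS2 launch file can start the micro-ROS agent (as a Docker process) alongside the joystick teleop nodes. The sketch below is illustrative only: the joy and teleop_twist_joy package names are the standard ROS2 ones, but the file name is a placeholder and you would adapt the parameters to your setup.

```python
# my_robot_bringup.launch.py -- illustrative sketch; adapt names and params to your robot
from launch import LaunchDescription
from launch.actions import ExecuteProcess
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # micro-ROS agent in Docker, listening for the ESP32 over UDP port 8888
        ExecuteProcess(
            cmd=['docker', 'run', '--rm', '--net=host',
                 'microros/micro-ros-agent:humble', 'udp4', '--port', '8888'],
            output='screen',
        ),
        # joystick driver: reads the USB joystick and publishes sensor_msgs/Joy
        Node(package='joy', executable='joy_node'),
        # converts Joy messages to geometry_msgs/Twist on /cmd_vel
        Node(package='teleop_twist_joy', executable='teleop_node'),
    ])
```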

Before running the simulations, make sure you power off the robot and the ESP32 so it is not publishing any topics; this avoids the confusion of seeing topics from Gazebo and the real robot together. Also, make sure you have shut down the micro-ROS agent Docker container [you can just abort it using CTRL+C].

Now let’s prepare to run the simulation by downloading the Linorobot2 packages and then running the various simulations in Rviz and Gazebo.

## clone the esp32 linorobot github repo into your home directory
git clone <esp32_linorobot2 repo URL> ~/esp32_linorobot2

## build the packages and source
cd ~/esp32_linorobot2
colcon build
source install/setup.bash

## run the rviz to make sure our robot urdf looks good
ros2 launch linorobot2_description description.launch.py rviz:=true

Rviz2 will show a robot that looks like this:

Now we are ready to actually run the Gazebo simulation for our robot and drive around in simulator to make sure the simulation works.

ros2 launch linorobot2_gazebo gazebo.launch.py

The Gazebo simulation will look something like this:

At this point, if you examine the ROS2 node graph in Rqt, it will look like this:

Gazebo Topics

As you can see, Gazebo is listening on the /cmd_vel topic and also generating simulated IMU and diff-drive odometry messages, which are fed into the ekf_filter_node to localize the robot within the simulator world.

Let’s launch rqt_robot_steering to drive the robot around the playground world in Gazebo.
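rqt_robot_steering ships with the ROS2 Humble desktop packages and can be started standalone from a terminal:

```
## opens the steering GUI, which publishes Twist messages (default topic: /cmd_vel)
ros2 run rqt_robot_steering rqt_robot_steering
```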

rqt_robot_steering GUI

You can simply keep the Gazebo window and the rqt_robot_steering window in view and start clicking the +/- and left/right arrows to set the linear and angular velocities published on the cmd_vel topic (shown at the top of the window). You can also increase or decrease the speed, or stop everything with the stop button, and you’ll see the robot moving around in the simulator as shown below:

Gazebo — Driving robot using rqt_robot_steering

Now all the basics have been tested in simulation, and we are ready to run the real robot in the real world; we will dig into that in the next article.

Hope you had fun building and tinkering with the software and hardware so far, and learned some good stuff about ROS2 along the way!