Using BMI160 IMU with ROS2

RoboFoundry
Apr 3, 2022

In this article we will explore how to use the Bosch Sensortec BMI160 IMU sensor. The BMI160 is a 6-DOF IMU with both a gyroscope and an accelerometer, sufficient to provide the inputs for robot odometry in combination with wheel odometry from encoders on the robot wheels. It can also act as a step counter if you were to use it in a pedometer type of application, but we won't be using it for that purpose. We will only discuss using the sensor with a Raspberry Pi [RPi], not an Arduino. It is assumed that the RPi is running Ubuntu 20.04 and that ROS2 Foxy is already installed on both the RPi and a host computer. We will also refer to the RPi as the robot computer; the host computer can be a laptop or desktop.

This sensor is also on the component list for the NVIDIA Kaya, and it was used in the NASA Mars 2021 mission as well. So I decided to add it to my robot project and recently purchased a BMI160 from Digi-Key Electronics. The version I purchased is made by DFRobot and comes with Gravity connectors. Here is the direct link to the Digi-Key product page [I don't get any benefit from linking it here, it's just for quick reference and you can purchase it anywhere else]. I believe cheaper versions of this sensor are also available from Amazon, but those will not come with the Gravity connector and will need some soldering to install the headers for the pins. In general, my experience has been that DFRobot parts are made very well and are of fairly good quality, and their wiki provides good software support and instructions on how to connect the hardware, with good examples. This was another reason for going with the DFRobot BMI160.

Software support for using the BMI160

I researched a lot before purchasing the sensor to make sure there was good support for reading the sensor data in different programming languages. However, everything was not as simple and easy as you would hope for, and unfortunately that is true of not just this IMU sensor but most of the IMUs I have used so far. There is a C driver library supplied by Bosch Sensortec, but its GitHub repo has very little instruction on how to use it. After doing further digging I realized that you actually have to download their COINES SDK before you can compile and run the examples. This sounded to me like a lot of bloatware and unnecessary jumping through hoops just to read the sensor data.

On the other hand, I found two other options that are much more straightforward for using the BMI160 sensor: one was the Python and C GitHub repo from DFRobot, and the second was the generic bmi160-i2c Python package. Both of these packages have their own strengths. I liked the DFRobot examples for their simplicity and used them to quickly test the sensor, to make sure the hardware connection was working and raw data was coming through. However, the DFRobot code does not come with any methods/functions to do calibration. The bmi160-i2c Python package, on the other hand, is a fairly complete package and comes with auto-calibration methods, which are very important for getting the IMU working with the robot. I also found this GitHub repo, an IMU node for the AWS DeepRacer project, which was very useful for getting the BMI160 working with ROS2. I have created a fork of this repo and added some additional functionality to visualize the BMI160 in RVIZ; more on that in later steps of the article.

Enough talk about the background on how everything came together. We will go over the following steps to get the BMI160 up and running, publish its sensor data over a ROS2 topic, and visualize it in RVIZ2.

  1. Hardware connections and wiring the sensor with RPi
  2. Preparing RPi to connect to sensor via i2c bus
  3. Testing with simple python script to make sure we can capture raw sensor data
  4. Creating a ROS2 workspace and packages that will read the sensor data, apply a filter to remove noise from the sensor readings, and publish it to a ROS2 topic
  5. Run the ROS2 nodes and visualize the sensor in RVIZ2

Hardware Connection and Wiring

As you can see in the diagram below, the wiring with the RPi is very straightforward. The DFRobot BMI160 comes with a Gravity connector that plugs into the sensor board; the other ends are female DuPont connectors which can easily be plugged onto the RPi 40-pin male header.

  • BMI160 +ve pin → RPi pin 4 [5v] or RPi pin 1 [3.3v]
  • BMI160 -ve pin → RPi pin 6 [GND]
  • BMI160 D pin → RPi pin 3 [SDA]
  • BMI160 C pin → RPi pin 5 [SCL]

The actual mounted sensor should look something like this.

Preparing Raspberry Pi

Setting up RPi for i2c connection involves two main steps:

  • Configuring hardware permissions in boot firmware config.txt file and adding the udev rules to allow permissions to access hardware
  • Installing the Linux and python packages to list the i2c devices

Configuring Hardware

Follow the instructions here to set up the RPi. The main steps are:

  1. Ensure the line dtparam=i2c_arm=on is uncommented in the /boot/firmware/config.txt file
  2. Ensure your user is added to i2c group [though this may be optional]
  3. Ensure that you have added the following line to the /etc/udev/rules.d/local.rules file, as suggested in the article linked above:
ACTION=="add", KERNEL=="i2c-[0-1]*", MODE="0666"
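
If you want the exact commands for the group and udev steps, something along these lines should work [these are standard Linux admin commands, not specific to this sensor; log out and back in for the group change to take effect]:

# add the current user to the i2c group
sudo usermod -aG i2c $USER

# reload the udev rules so the new local.rules file is picked up
sudo udevadm control --reload-rules
sudo udevadm trigger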

You may need to reboot the RPi for all the changes to take effect.

Installing Packages

Make sure you have run the following commands on your RPi to install the necessary Linux and Python packages.

sudo apt update 
sudo apt upgrade -y
sudo apt install -y i2c-tools python3-pip

Python packages:

sudo pip3 install BMI160-i2c smbus2

If everything has gone well, when you run the following command:

sudo i2cdetect -r -y 1

You should see the BMI160 sensor show up at address 0x69, like this:
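
For reference, a typical i2cdetect grid with the BMI160 detected looks roughly like this [representative output; your grid may also show other attached i2c devices]:

     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- 69 -- -- -- -- -- --
70: -- -- -- -- -- -- -- --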

If you don't see the sensor showing up in the output of i2cdetect, most likely your wiring is not correct. Make sure you have connected everything to the correct pins, that the connections are not loose, and that the sensor is getting power from the RPi. If everything else fails, it is possible that you have a defective unit.

Testing the Sensor

At this point we are ready to test the sensor with the sample code from the DFRobot examples.

Clone the sample code repository and run the demo script with the following commands:

git clone https://github.com/DFRobot/DFRobot_BMI160.git
cd DFRobot_BMI160/python/raspberrypi/examples
python3 demo_get_accel_gyro.py

You should see the python script print raw gyroscope and accelerometer readings that look something like this:
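
If you would rather sanity-check the sensor with the BMI160-i2c package we installed earlier, a minimal sketch looks like this [based on that library's README; the value order of getMotion6 is gyro first, then accel, per the README, so double-check against your installed version]:

from BMI160_i2c import Driver
import time

# 0x69 is the address we saw in i2cdetect; the library defaults to 0x68
sensor = Driver(0x69)

while True:
    # getMotion6 returns six raw integers: gyro x/y/z, then accel x/y/z
    gx, gy, gz, ax, ay, az = sensor.getMotion6()
    print('gyro:', gx, gy, gz, ' accel:', ax, ay, az)
    time.sleep(0.1)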

Creating a ROS2 workspace and packages

Getting the raw data is the first step, but in order for the data to be usable for odometry we need to ensure:

  • The data is translated to quaternion coordinates. If you are new to quaternion geometry, I highly recommend this phenomenal website that has an interactive video series on the topic
  • We are passing it through a filter to remove the noise, so the robot is not jumping around all over the map
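
If you are wondering how roll/pitch/yaw relate to the quaternion fields you will see in the ROS messages, the standard conversion looks like this [an illustrative, self-contained sketch; the filter package has its own internal implementation]:

import math

def euler_to_quaternion(roll, pitch, yaw):
    # standard yaw-pitch-roll [ZYX] Euler angles to quaternion
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return (
        sr * cp * cy - cr * sp * sy,  # x
        cr * sp * cy + sr * cp * sy,  # y
        cr * cp * sy - sr * sp * cy,  # z
        cr * cp * cy + sr * sp * sy,  # w
    )

# a sensor yawed 90 degrees to the left, level otherwise
print(euler_to_quaternion(0.0, 0.0, math.pi / 2))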

First, clone this git repository on both the RPi/robot and your host machine.

# clone git repo
git clone https://github.com/robofoundry/aws-deepracer-imu-pkg.git

# download the ros2_imu_tools repo into your workspace
cd aws-deepracer-imu-pkg

# execute the shell script in the scripts directory to import the
# git repo and install ros dependencies
imu_pkg/scripts/load_ros2_imu_tools_repo.sh

Build the ROS2 workspace [make sure to be in the root dir of the workspace, i.e. aws-deepracer-imu-pkg]:

# source the main ros2 foxy setup bash script
source /opt/ros/foxy/setup.bash
cd aws-deepracer-imu-pkg
colcon build
source install/setup.bash

At this point we should be able to examine the nodes that are part of this workspace:

ros2 pkg executables imu_pkg

It should show the ROS2 node in the package. The imu_node is the main one that reads and publishes the raw sensor data. We will also be using the transform node from the imu_tf package [part of ros2-imu-tools] to read the filtered and corrected /imu/data and publish the TF between imu_link and its parent link in the URDF.

imu_pkg imu_node

Let's launch the node and make sure the IMU data is being published on the ROS2 topic:

ros2 launch imu_pkg imu_pkg_launch.py

I ran into a roadblock after installing the BMI160-i2c python package using pip install: it complained about missing constants while executing the code. After digging into it, I realized that for some weird reason pip was not installing the right version of the code. It showed that it had the correct 0.5 version of the library, but when I looked into the installed files, the definition.py file was indeed missing the constant the error was referring to [ACC_OFFSET_EN]. So I finally removed the previously installed packages via pip and reinstalled by first cloning the git repository locally and then installing with the local package install commands. You can follow the instructions here on how to install from code if you need to do that. Again, you may not face this error; it may have been some leftover state in my particular case, but now you know what to do if it happens. This definitely set me back during my initial research, and I tried looking for other alternatives, which were not as great, before finally coming back to this library. Once it got past that runtime error, it was all smooth sailing from that point on.

Update 7/11/22: no need to do the workaround above to install from GitHub, as the issues with the BMI160-i2c library are fixed and a new v0.6 has been published. Simply run sudo pip install BMI160-i2c to install from the python package repo.

You might also receive an error the first time you try to launch it, saying:

Failed to create IMU monitor: [Errno 121] Remote I/O error

This is mainly because the i2c address of your sensor does not match the address configured for the imu_node. This is a fairly easy fix [I separated out the parameters into a config file instead of having them in a constants file in my fork]: open the file imu_pkg/config/imu_params.yaml and edit the line with device_address, making sure it has the same address that you got from your i2cdetect command. The default value is 105, which is 0x69 in hex.

/imu_node:
  ros__parameters:
    device_address: 105
    bus_id: 1
    imu_frame: 'imu_link'
    imu_topic: '/imu/data_raw'
    publish_rate: 25

Recompile the workspace by running the colcon build commands above, source your workspace with "source install/setup.bash", and relaunch the node by running the ros2 launch command above.

You should see output like this:

We can now check whether the IMU sensor data is being published on the ROS2 topic by running the following command in another terminal on your RPi:

ros2 topic list

# you will see something like this in the output
/imu/data_raw
/parameter_events
/rosout

The /imu/data_raw is the topic we are looking for, so let's echo it to see the data being captured and published to the ROS2 topic:

ros2 topic echo /imu/data_raw
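
The output is a stream of sensor_msgs/msg/Imu messages that will look something like this [representative values; yours will differ]:

header:
  stamp:
    sec: 1648998000
    nanosec: 123456789
  frame_id: imu_link
orientation:
  x: 0.0
  y: 0.0
  z: 0.0
  w: 1.0
orientation_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
  x: 0.001
  y: -0.002
  z: 0.0004
angular_velocity_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration:
  x: 0.05
  y: -0.02
  z: 9.81
linear_acceleration_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
---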

Notice the frame_id is imu_link. A cool thing about ROS2 messages is that you should be able to run the above command from your main desktop/laptop [not just from your RPi]: as long as both machines are on the same wifi network and have the same ROS_DOMAIN_ID set in their environment variables, you will be able to see the messages from the RPi on your other computer as well.

From your other computer you should also be able to see the ROS2 node graph by running rqt [assuming your other computer has the full desktop version of ROS2 Foxy installed and is on the same wifi network].

rqt &

Visualize IMU movements in RVIZ2

In order to visualize the IMU in RVIZ we need to do the following.

On RPi [robot computer]

  • Run a node that reads the raw sensor data and publishes it on the topic /imu/data_raw
  • Run the imu_fusion_madgwick filter node to remove noise from the sensor data and apply the quaternion conversion to it. It listens on /imu/data_raw and publishes back on /imu/data
  • Run the imu_tf node, which listens on /imu/data and publishes the TF transform between imu_link and its parent link in the URDF, which is plane

On Host machine [your laptop or desktop]

  • Run the robot_state_publisher node to publish the URDF on the robot_description topic
  • Run rviz with a parameter to load the rviz config file, so it loads all the correct topics and visualizations we need to see the sensor moving on the screen

Don't worry, you don't need to do all of this manually. I have two launch files in the launch directory just for this purpose.
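
To give a feel for what these launch files do, here is a stripped-down sketch of the robot-side launch file [the package/executable names for the filter and TF nodes are illustrative guesses; see imu_rviz_robot.launch.py in the fork for the actual file]:

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # reads the BMI160 over i2c and publishes /imu/data_raw
        Node(package='imu_pkg', executable='imu_node'),
        # filters the raw data and publishes the fused /imu/data
        Node(package='imu_fusion_madgwick', executable='imu_fusion_madgwick'),
        # listens on /imu/data and broadcasts the plane -> imu_link TF
        Node(package='imu_tf', executable='imu_tf'),
    ])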

Here are the steps to follow to launch both scripts correctly.

  1. Ensure that you have run the sourcing and compiling commands from the root directory of your workspace
# always ensure that you have the global ros2 foxy sourced
source /opt/ros/foxy/setup.bash

# from the root directory of your workspace compile the workspace
colcon build

# source your workspace
source install/setup.bash

2. Execute the following from the root directory of your workspace on the RPi/robot and the host machine

# for RPi
ros2 launch imu_pkg imu_rviz_robot.launch.py

# for host machine
ros2 launch imu_pkg imu_rviz_host.launch.py

When we run all the nodes, the final node graph should look like this in rqt.

Our URDF node hierarchy is:

world → plane → imu_link
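
The fixed world → plane part of that tree presumably comes from the URDF via robot_state_publisher, while imu_tf handles plane → imu_link. If you are experimenting outside the provided launch files, you could also stand up the fixed part ad hoc with a static transform publisher [optional, just for tinkering]:

ros2 run tf2_ros static_transform_publisher 0 0 0 0 0 0 world plane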

You can examine your TF tree either in RVIZ2 or by running the following command, which will spit out a PDF file describing your TF tree:

# install the pkg if not already there
sudo apt install ros-foxy-tf2-tools

# run the command to dump the TF tree in current directory
ros2 run tf2_tools view_frames.py

It should look something like this:

When you run the launch command for the host machine, it will start the RVIZ2 application, which should show the sensor in the simulator. You can move your robot or sensor [if you don't have it mounted on a robot yet] in any direction, and RVIZ2 should show movements corresponding to the real world movements.

The RVIZ2 simulation should look something like this:

There is a little bit of annoying flicker; based on my research it is caused by differences between when the messages arrive and when RVIZ updates the simulation, so functionally it should not affect the robot. However, I'm going to continue to tinker with various parameters and frequencies to see if it can be eliminated.

On a real robot you will typically have additional sensors, like wheel encoders, that you will need to fuse with the IMU sensor using the robot_localization node, which provides an EKF filter to do the sensor fusion. Together, wheel encoder and IMU data will provide solid odometry for the robot to take the next step, which is either mapping or navigation using the ROS2 Nav2 stack.
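
To make that next step a bit more concrete, a robot_localization EKF config fusing wheel odometry with this IMU typically looks something like the sketch below [the topic names and per-variable fusion flags are illustrative; consult the robot_localization docs for the full parameter reference]:

ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    # wheel encoders: fuse x/y velocity and yaw velocity
    # [flag order: x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az]
    odom0: /wheel/odometry
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    # IMU: fuse yaw orientation and yaw velocity
    imu0: /imu/data
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  false, false, false]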

Hope this helps folks understand several things regarding the IMU sensor and how to simulate it, and also sets up some foundational concepts for how this can be extended to run an actual robot, which will have many more nodes but the same pattern between the RPi and host machine.

That’s it. Enjoy and have fun building!!

References and Acknowledgements

As you can imagine, a complex sensor project cannot be successful without the knowledge and contributions of many people, and as such there are many references and acknowledgements below if you care to follow them.

A big thanks to Lars Lorenz Ludwig, who created the AWS DeepRacer GitHub repository which provided a good foundation for getting the BMI160 sensor working with auto calibration. My GitHub fork is an extension of his work.

My fork with additional capabilities for simulation

Very grateful for all the knowledge, concepts, GitHub code examples and YouTube videos from bandasaikrishna, particularly his git repo orientations_from_IMU_MPU_6050. I hope you don't mind that I borrowed your plane STL file to use in the rendering of the simulation. In addition, thanks for the wonderful robot project and explanations of ROS control on your website; your work is very inspiring and I learned a lot when trying to understand those concepts.

Thanks to Marcus M. Scheunemann for the port of the imu_tools repo to ROS2; this was very helpful in converting some of the ROS1 concepts to ROS2. I struggled a lot to find a ROS2 version of this nice package until I found your repository.

Thanks to this python module repo, on which the AWS DeepRacer project is based.
