Note: This article applies to Gazebo Harmonic 8.6.0. The same or similar principles generally apply to other versions of Gazebo, though specific implementation details may vary.
Note: Partially relevant official documentation can also be found in the Gazebo Classic tutorials.
Gazebo provides a range of virtual sensors (e.g., Lidar, IMU, camera) that generate simulated sensor data and make it accessible through Gazebo's internal transport system or via ROS using ros_gz_bridge. By default, this simulated data is deterministic: if, for example, your Lidar sensor remains stationary, the generated scans will be identical down to the least significant bit. While precise, this can lead to unrealistic results or numerical instability in algorithms that expect naturally varying sensor input.
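The difference can be illustrated outside the simulator. The following Python sketch (a hypothetical stand-in, not Gazebo code) contrasts a deterministic range reading with one perturbed by Gaussian noise:

```python
import random

TRUE_RANGE = 4.25  # metres; a hypothetical return from a stationary Lidar


def read_range_deterministic():
    # Without noise, every scan of a static scene is bit-identical.
    return TRUE_RANGE


def read_range_noisy(mean=0.0, stddev=0.005):
    # With a Gaussian noise model, each reading varies naturally.
    return TRUE_RANGE + random.gauss(mean, stddev)


deterministic = {read_range_deterministic() for _ in range(100)}
noisy = {read_range_noisy() for _ in range(100)}

print(len(deterministic))  # 1 -> all 100 readings are identical
print(len(noisy))          # readings vary from sample to sample
```

Algorithms that, for instance, estimate covariance from such readings can misbehave on the perfectly repeated values in the first set.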
To address this, you can intentionally introduce sensor noise into the data by including a <noise> tag in the sensor's SDF description. Gazebo currently supports a Gaussian noise model. The following example demonstrates all available configuration options for a sensor noise model:
<noise type='gaussian'>
  <mean>0.0</mean>                     <!-- mean of the Gaussian noise distribution -->
  <stddev>2e-4</stddev>                <!-- standard deviation of the noise -->
  <bias_mean>0.0000075</bias_mean>     <!-- mean of the bias, sampled once at startup -->
  <bias_stddev>0.0000008</bias_stddev> <!-- standard deviation of the sampled bias -->
  <precision>0.2</precision>           <!-- granularity to which the output is quantized -->
</noise>
A detailed explanation of all configuration options is available in sdformat/noise.sdf.
For insight into how these parameters are applied to simulate noise, you can refer to the implementation in gz-sensors/GaussianNoiseModel.cc.
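As a rough orientation, the logic can be sketched in Python. This is a simplified model following the parameter descriptions above, not a translation of the actual C++ in gz-sensors; details such as the sign handling of the bias and the quantization step are assumptions:

```python
import random


class SketchGaussianNoiseModel:
    """Simplified sketch of a Gaussian sensor-noise model (hypothetical)."""

    def __init__(self, mean=0.0, stddev=0.0,
                 bias_mean=0.0, bias_stddev=0.0, precision=None):
        self.mean = mean
        self.stddev = stddev
        self.precision = precision
        # The bias is drawn once when the model is created and then
        # added to every subsequent sample; its sign is chosen randomly.
        self.bias = random.gauss(bias_mean, bias_stddev)
        if random.random() < 0.5:
            self.bias = -self.bias

    def apply(self, value):
        # Per-sample Gaussian noise plus the fixed bias.
        noisy = value + random.gauss(self.mean, self.stddev) + self.bias
        if self.precision:
            # Quantize the output to multiples of the configured precision.
            noisy = round(noisy / self.precision) * self.precision
        return noisy


model = SketchGaussianNoiseModel(mean=0.0, stddev=2e-4, precision=0.2)
print(model.apply(4.25))  # quantized to a multiple of 0.2
```

With a precision of 0.2, any reading near 4.25 m collapses to the nearest 0.2 m step, which is why precision should be chosen to match the resolution of the real device being modeled.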
Below is an example demonstrating how to add sensor noise to the output of a virtual Lidar sensor by including a <noise> tag with the desired parameters. Note that for Lidar sensors the noise type is given as a child <type> element rather than as a type attribute:
<sensor name="gpu_lidar" type="gpu_lidar">
  <topic>/model/l4xz/scan</topic>
  <update_rate>10</update_rate>
  <ray>
    <!-- ... -->
    <noise>
      <type>gaussian</type>
      <mean>0.0</mean>
      <stddev>0.005</stddev>
    </noise>
  </ray>
  <!-- ... -->
</sensor>
In the short video below, the difference in sensor output with and without noise is clearly visible: