Introduction
In mobile robot navigation, information from multiple sensors needs to be combined to reliably determine the position and orientation, i.e. the pose, of the robot (Thrun & Burgard, 2005; Dudek & Jenkin, 2010; Gustafsson, 2010). In outdoor robotics applications, GPS or DGPS measurements are usually used to compensate for the drift in the odometry-based position estimate, see e.g. (Ohno et al., 2004). For autonomous driving on highways and in urban environments, other sensors such as Lidar, Radar and/or cameras are used as well, see e.g. (Thrun & Burgard, 2005). For navigation indoors or near tall buildings, where GPS and other GNSS have weak or no coverage, other beacons can be used, e.g. based on ultrasound transmission time (Wijk & Christensen, 2000), received signal strength (RSS) of visible light (Plets et al., 2017) or transmission time of ultra-wideband (UWB) signals (Dabove et al., 2018). Beacon-less methods are also used for indoor navigation; these rely on simultaneous localisation and mapping (SLAM) (Thrun & Burgard, 2005) and often make use of optical sensors observing the robot's environment, such as Lidar and/or (RGBd or stereo) cameras.
In dirty and/or dusty working environments, optical sensors may not be effective, and ultrasonic or UWB beacon-based techniques are needed to compensate for drift in (semi-)indoor mobile robot navigation. Several approaches can be taken:
- Odometry based: the pose of the robot is determined only by odometry, optionally including compass and/or IMU sensors;
- Beacon based: odometry information is not used for navigation, only triangulation based on two or more time of arrival (TOA) or three or more time difference of arrival (TDOA) measurements;
- Beacon-based resetting of odometry: the position of the robot determined by odometry is reset to the position determined by a beacon-based (TOA or TDOA) method; the resetting usually happens at a lower rate than the odometry updates;
- Sensor fusion of beacon- and odometry-based measurements: the measurements from the odometry, the optional compass and IMU sensors and the beacon-based sensors are fused by a sensor fusion algorithm to provide an estimate of the robot's pose.
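To illustrate the purely beacon-based approach, the TOA case with known beacon positions admits a simple closed-form least-squares solution: squaring each range equation and subtracting the first one yields a linear system in the robot position. The sketch below (NumPy; all names and the 2D setting are our illustrative assumptions, not taken from the paper) shows this linearization:

```python
import numpy as np

def toa_trilateration(beacons, ranges):
    """Least-squares planar position from >= 3 TOA range measurements.

    Each range satisfies r_i^2 = |p - b_i|^2; subtracting the i = 0
    equation cancels |p|^2 and leaves a linear system in p:
        2 (b_i - b_0)^T p = r_0^2 - r_i^2 + |b_i|^2 - |b_0|^2.

    beacons: (n, 2) array of known beacon positions.
    ranges:  (n,) array of measured distances to each beacon.
    """
    b0, r0 = beacons[0], ranges[0]
    A = 2.0 * (beacons[1:] - b0)
    c = (r0**2 - ranges[1:]**2
         + np.sum(beacons[1:]**2, axis=1) - np.sum(b0**2))
    pos, *_ = np.linalg.lstsq(A, c, rcond=None)
    return pos
```

With noisy ranges the least-squares solve spreads the error over all beacons; the TDOA case is analogous but eliminates the common clock offset instead.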
In this paper, the latter approach, fusing the beacon and odometry sensor measurements together with a compass sensor, is used to estimate the robot's pose. Because the intended application is mobile robots in dirty and/or dusty environments, the focus is on sensor fusion of odometry, compass and UWB beacon distance measurements. Several sensor fusion algorithms are evaluated: a heuristic approach, the extended Kalman filter and the particle filter, see e.g. (Thrun & Burgard, 2005; Dudek & Jenkin, 2010; Gustafsson, 2010). The algorithms are compared in a simulation experiment. Parts of this paper, especially the algorithm presentation, have been published as a conference paper (Fraanje et al., 2019). In addition, the current paper discusses various implementation issues, gives directions for extensions such as the multi-beacon case, and extends and fully revises the section on the simulation experiments and their discussion.
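As a rough sketch of what the extended Kalman filter variant of such a fusion looks like, the snippet below runs one predict/update cycle for a planar robot: an odometry-driven unicycle motion model for the prediction, and a single UWB range to a known beacon for the update (the compass update, a linear heading measurement, is omitted for brevity). State layout, noise covariances and all names are illustrative assumptions, not the paper's values:

```python
import numpy as np

def ekf_step(x, P, u, z_range, beacon, Q, r_var):
    """One EKF cycle: odometry prediction + UWB range update.

    x       : state [px, py, theta] (position and heading).
    u       : (dtrans, drot) odometry increments over one step.
    z_range : measured distance to the beacon.
    beacon  : known 2D beacon position.
    Q, r_var: process covariance (3x3) and range noise variance.
    """
    px, py, th = x
    dtrans, drot = u
    # Predict with a unicycle motion model.
    x_pred = np.array([px + dtrans * np.cos(th),
                       py + dtrans * np.sin(th),
                       th + drot])
    F = np.array([[1.0, 0.0, -dtrans * np.sin(th)],
                  [0.0, 1.0,  dtrans * np.cos(th)],
                  [0.0, 0.0, 1.0]])
    P_pred = F @ P @ F.T + Q
    # Update with the nonlinear range measurement, linearized at x_pred.
    dx, dy = x_pred[0] - beacon[0], x_pred[1] - beacon[1]
    r_pred = np.hypot(dx, dy)
    H = np.array([[dx / r_pred, dy / r_pred, 0.0]])
    S = H @ P_pred @ H.T + r_var          # innovation variance (1x1)
    K = P_pred @ H.T / S                  # Kalman gain (3x1)
    x_new = x_pred + (K * (z_range - r_pred)).ravel()
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```

A single range measurement only constrains the robot to a circle around the beacon, so the correction acts along the beacon direction; the odometry prediction (and, in the paper's setting, the compass) supplies the missing heading and tangential information between beacon updates.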