Development of an EKF-based Localization Algorithm Using Compass Sensor and LRF

T. T. Hoang, P. M. Duong, N. T. T. Van, D. A. Viet and T. Q. Vinh
VNU University of Engineering and Technology, Hanoi, Vietnam
thuanhoang@donga.edu.vn

Abstract— This paper presents the implementation of a perceptual system for a mobile robot. The system is designed and installed with modern sensors and multi-point communication channels. The goal is to equip the robot with a high level of perception to support a wide range of navigation applications, including Internet-based telecontrol, semi-autonomy, and autonomy. Because the acquired data are uncertain, a sensor fusion model is developed in which heterogeneous measurements, including odometry, compass heading and laser range data, are combined to obtain an estimate that is optimal in a statistical sense. The combination is carried out by an extended Kalman filter. Experimental results indicate that, with this system, robot localization is enhanced and is sufficient for navigation tasks.

Index Terms— sensor fusion; Kalman filter; localization

I. INTRODUCTION

The perceptual system is the key to the intelligence of a mobile robot. When a mobile robot travels in an unknown or partially known environment, it must understand the environment for obstacle avoidance or path planning. The robot perceives by taking measurements with various sensors and then extracting meaningful information from those measurements. Thanks to advances in material technologies, modern sensors such as optical incremental encoders, heading sensors, ultrasonic sensors, infrared sensors, laser range finders and vision systems can nowadays be fitted to a mobile robot. These sensors are selected to match the goal of the application, the specific constraints of the working environment, and the individual properties of the sensors themselves. Nevertheless, no single sensor can adequately capture all relevant features of a real environment. It is therefore necessary to combine the data from different sensors in a process known as sensor fusion, with the expectation that the fused data are more informative and complete than the original measurements. Several methods have been reported for this purpose. Durrant-Whyte developed a multi-Bayesian estimation technique for combining touch and stereo sensing [1], [2]. Tang and Lee proposed a generic framework that employed a sensor-independent, feature-based relational model to represent information acquired by various sensors [3]. In [4], a Kalman filter update equation was developed to obtain the correspondence of a line segment to a model, and this correspondence was then used to correct the position estimate. In [5], an extended Kalman filter was employed to handle image and spatial uncertainties.

In this work, we develop a multi-sensor perceptual system for a mobile robot. The sensors include, but are not limited to, optical quadrature encoders, a compass sensor, ultrasonic sensors, a laser range finder, a global positioning system (GPS) and a vision system. The goal is to equip the robot with diverse levels of perception to support a wide range of navigation applications, including Internet-based telecontrol, semi-autonomy, and autonomy. At this stage of the research, we use the optical quadrature encoders for position measurement, the compass sensor for deflection-angle calculation, and the laser range finder (LRF) for object-boundary detection.
These types of data have individual strengths and are often used in robotic applications, but combining them provides even more useful information, and this combined approach deserves further investigation. In our system, an extended Kalman filter (EKF) has been designed. It fuses the raw measurements of the optical quadrature encoders, the compass sensor and the LRF to obtain an optimal estimate of the robot position while reducing measurement uncertainties. The proposed fusion algorithm also provides constraints that filter out unreliable sensor readings. The output of the EKF, combined with the object boundaries detected by the LRF, ensures successful navigation of the mobile robot in indoor environments.

The paper is arranged as follows. Details of the perceptual system are described in Section II. The algorithm for sensor fusion using the EKF is explained in Section III. Section IV presents experimental results. The paper concludes with an evaluation of the system with respect to its strengths and weaknesses, and with suggestions for possible future developments.

II. SENSOR SYSTEM DESIGN

The sensor system consists of an omni-directional camera, a GPS, a compass sensor, an LRF, three quadrature encoders and eight ultrasonic sensors. Fig. 1 shows the sensors in relation to the actuators and communication channels of the mobile robot. Communication is performed via low-rate and high-rate channels. The low-rate channel follows the RS-485 standard and is implemented by an on-board 60 MHz Microchip dsPIC30F4011 microcontroller using the MODBUS protocol for multi-point interfacing. The high-rate channels use commercially available USB-to-COM and IEEE-1394/FireWire ports.

Figure 1. Sensors in relation to actuators and communication channels in our mobile robot

In the system, optical quadrature encoders are used. An optical encoder is basically a mechanical light chopper that produces a certain number of sine or square wave pulses for each shaft revolution. As the wheel diameter and the gear coefficient are known, the angular position and speed of the wheel can be calculated. In the quadrature encoder, a second light chopper is placed 90 degrees out of phase with respect to the original, resulting in twin square waves as shown in Fig. 2. The phase relationship observed between the two waves is used to determine the direction of rotation. In the system, measurements from the encoders are used as input data for positioning and as feedback for a closed-loop motor speed controller.

Figure 2. Optical encoder structure and output pulses
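To illustrate how the encoder pulses described above are converted into wheel motion, a minimal Python sketch is given here; the pulse count, quadrature factor and gear ratio are hypothetical placeholders for illustration only, not values taken from the paper.

import math

# Hypothetical drive parameters (placeholders, not from the paper).
TICKS_PER_REV = 500       # encoder pulses per motor-shaft revolution
QUAD_FACTOR = 4           # quadrature decoding yields 4 counts per pulse
GEAR_RATIO = 30.0         # motor-shaft revolutions per wheel revolution
WHEEL_DIAMETER = 0.10     # m (the 10 cm wheel reported in Section IV)

def wheel_motion(delta_counts, dt):
    """Convert the encoder count change over one sampling period dt into the
    wheel's angular increment (rad), travelled distance (m) and speed (m/s)."""
    wheel_revs = delta_counts / (TICKS_PER_REV * QUAD_FACTOR * GEAR_RATIO)
    d_angle = 2.0 * math.pi * wheel_revs
    d_dist = math.pi * WHEEL_DIAMETER * wheel_revs
    return d_angle, d_dist, d_dist / dt

The sign of delta_counts, and hence the direction of rotation, is obtained by the quadrature decoder from the phase relationship between the two pulse trains.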
The heading sensor is used to determine the robot orientation. This sensory module contains a CMPS03 compass sensor, operating on the Hall effect, with a heading resolution of 0.1° (Fig. 3). The module has two axes, x and y. Each axis reports the strength of the magnetic field component parallel to it. The microcontroller connected to the module uses synchronous serial communication to read the axis measurements [6].

Figure 3. Heading module and output data

The GPS is mainly used for positioning in outdoor environments. A HOLUX GPS UB-93 module is used [7]. Because our system is networked, Assisted GPS (A-GPS) can also be used to obtain satellite information over the network under poor signal conditions.

The system provides eight SFR-05 ultrasonic sensors split into four arrays of two, arranged on the four sides of the robot. The measuring range is from 0.04 m to 4 m.

The vision system is detachable and mounted on the top of the robot. It mainly consists of a Hyper-Omni Vision SOIOS 55 omni-directional digital camera with a high-rate IEEE-1394 data transfer line. With a 360-degree field of view, the omni-directional camera is a good vision sensor for landmark-based navigation [8].

Last but not least, a 3D laser range finder (LRF) with a range from 0.04 m to 80 m is developed for the system. Its operation is based on the time-of-flight measurement principle. A single laser pulse is emitted and reflected by an object surface within the range of the sensor. The elapsed time between emission and reception of the laser pulse is used to calculate the distance between the object and the sensor. An integrated rotating mirror sweeps the laser pulses across a radial range in front of the sensor, so that a 2D scanning plane is defined (Fig. 4).

Figure 4. Two-dimensional laser scanning plane [10].

Because the pitch angle of the scanning plane is fixed, the 2D information may in certain cases be insufficient for obstacle detection (Fig. 4). In those situations, a 3D image is necessary. As 2D scanners are popular and low-cost, building a 3D LRF from a 2D one is usually the preferred choice [9][10]. In our system, a 3D LRF is developed based on the 2D SICK LMS 221 [9]. It has a view angle of 180° in the horizontal direction and 25° in the vertical direction. During the measurement, a set of deflection angles β and distances ρ is received. The set (β, ρ) is then combined with the pitch angle α to create the 3D data. Based on these data, the Cartesian coordinates of a point in space are defined as:

x = ρ cos α cos β
y = ρ cos α sin β
z = ρ sin α

Figure 5. Determining the coordinates of one point in 3D space [10].

Fig. 6 shows the proposed sensor system implemented on a mobile robot developed in our laboratory.

Figure 6. The implemented sensor system
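To make the geometry above concrete, the following minimal Python sketch converts a single LRF reading (ρ, β) taken at pitch angle α into a 3D point, and a whole scan into a point cloud; the function and variable names are ours, not from the paper.

import numpy as np

def lrf_point_to_cartesian(rho, beta, alpha):
    """Convert one range reading to Cartesian coordinates (Fig. 5):
    x = rho*cos(alpha)*cos(beta), y = rho*cos(alpha)*sin(beta), z = rho*sin(alpha).
    rho: distance (m); beta: deflection angle in the scanning plane (rad);
    alpha: pitch angle of the scanning plane (rad)."""
    x = rho * np.cos(alpha) * np.cos(beta)
    y = rho * np.cos(alpha) * np.sin(beta)
    z = rho * np.sin(alpha)
    return np.array([x, y, z])

def scan_to_points(rhos, betas, alpha):
    """Convert a full 2D scan taken at one pitch angle into an N x 3 point array."""
    return np.stack([lrf_point_to_cartesian(r, b, alpha)
                     for r, b in zip(rhos, betas)])

Sweeping α over the 25° vertical range and stacking the resulting point arrays yields the 3D image used for obstacle detection.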
III. SENSOR FUSION

The proposed sensor platform equips the robot with the ability to perceive many parameters of the environment; combining them, however, yields even more useful information. In this work, the raw data of three different types of sensors, namely the compass sensor, the quadrature encoders and the LRF, are fused by an EKF. The aim is to determine the robot position during operation as accurately as possible.

Figure 7. The robot's pose and parameters

We start with the kinematic model of the mobile robot. A two-wheeled, differential-drive mobile robot with non-slipping, purely rolling wheels is considered. Fig. 7 shows the coordinate systems and notation for the robot, where (XG, YG) is the global coordinate frame and (XR, YR) is the local coordinate frame attached to the robot chassis. R denotes the radius of the driven wheels, and L denotes the distance between the wheels.

During one sampling period ∆t, the rotational speeds ωL and ωR of the left and right wheels create corresponding incremental distances ∆sL and ∆sR traveled by the left and right wheels, respectively:

∆s_L = ω_L ∆t R,  ∆s_R = ω_R ∆t R   (1)

These can be translated into the incremental linear displacement ∆s of the robot's center and the incremental orientation angle ∆θ:

∆s = (∆s_L + ∆s_R)/2,  ∆θ = (∆s_R − ∆s_L)/L   (2)

The coordinates of the robot at time k+1 in the global coordinate frame can then be updated by:

[ x_{k+1} ]   [ x_k ]   [ ∆s_k cos(θ_k + ∆θ_k/2) ]
[ y_{k+1} ] = [ y_k ] + [ ∆s_k sin(θ_k + ∆θ_k/2) ]   (3)
[ θ_{k+1} ]   [ θ_k ]   [ ∆θ_k                   ]

In practice, (3) is not entirely accurate because of unavoidable errors in the system. These errors can be systematic, such as imperfections of the robot model, or non-systematic, such as wheel slip. They accumulate over time and can break the stability of the system if appropriate compensation is not applied. In our system, the compensation is carried out by the EKF.

Let x = [x y θ]^T be the state vector. This state can be observed by absolute measurements z. These measurements are described by a nonlinear function h of the robot coordinates and an independent Gaussian noise process v. Denoting the function (3) as f, with input vector u, the robot can be described by:

x_{k+1} = f(x_k, u_k, w_k)
z_k = h(x_k, v_k)   (4)

where the random variables w_k and v_k represent the process and measurement noise, respectively. They are assumed to be independent of each other, white, and normally distributed:

w_k ~ N(0, Q_k),  v_k ~ N(0, R_k),  E(w_i v_j^T) = 0

The steps of the EKF are then realized as follows.

1. Prediction step with time update equations:

x̂_k^- = f(x̂_{k-1}, u_{k-1}, 0)   (5)

P_k^- = A_k P_{k-1} A_k^T + W_k Q_{k-1} W_k^T   (6)

where x̂_k^- ∈ R^n is the a priori state estimate at step k given knowledge of the process prior to step k, P_k^- denotes the covariance matrix of the state-prediction error, and A_k is the Jacobian matrix of partial derivatives of f with respect to x, evaluated at (x̂_{k-1}, u_{k-1}, 0):

      [ 1  0  -∆s_k sin(θ_k + ∆θ_k/2) ]
A_k = [ 0  1   ∆s_k cos(θ_k + ∆θ_k/2) ]   (7)
      [ 0  0   1                      ]

W_k is the Jacobian matrix of partial derivatives of f with respect to w, evaluated at the same point:

            [ (1/2)cos(θ_k + ∆θ_k/2) - (∆s_k/2L)sin(θ_k + ∆θ_k/2)   (1/2)cos(θ_k + ∆θ_k/2) + (∆s_k/2L)sin(θ_k + ∆θ_k/2) ]
W_k = R∆t · [ (1/2)sin(θ_k + ∆θ_k/2) + (∆s_k/2L)cos(θ_k + ∆θ_k/2)   (1/2)sin(θ_k + ∆θ_k/2) - (∆s_k/2L)cos(θ_k + ∆θ_k/2) ]   (8)
            [ 1/L                                                   -1/L                                                  ]

and Q_{k-1} is the input-noise covariance matrix, which depends on the standard deviations of the noise of the right-wheel and left-wheel rotational speeds. These are modeled as being proportional to the rotational speeds ω_{R,k} and ω_{L,k} of the wheels at step k, which results in variances equal to δω²_{R,k} and δω²_{L,k}, where δ is a constant determined by experiment. The input-noise covariance matrix Q_k is defined as:

Q_k = [ δω²_{R,k}   0         ]
      [ 0           δω²_{L,k} ]   (9)
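Before turning to the correction step, a minimal Python sketch of the prediction step (1)-(9) is given below. The function and constant names are ours; the wheel radius, axle length and δ simply reuse the values reported in Section IV.

import numpy as np

R_WHEEL = 0.05   # wheel radius R in m (10 cm diameter, Section IV)
L_AXLE = 0.60    # distance between the wheels L in m (Section IV)
DELTA = 0.01     # noise proportionality constant δ (Section IV)

def ekf_predict(x, P, omega_R, omega_L, dt):
    """One EKF time update. x = np.array([x, y, theta]), P = 3x3 covariance,
    omega_R / omega_L = wheel rotational speeds (rad/s), dt = sampling period (s)."""
    ds_R = omega_R * dt * R_WHEEL                    # (1)
    ds_L = omega_L * dt * R_WHEEL
    ds = (ds_R + ds_L) / 2.0                         # (2)
    dth = (ds_R - ds_L) / L_AXLE
    th = x[2] + dth / 2.0

    x_pred = x + np.array([ds * np.cos(th), ds * np.sin(th), dth])   # (3), (5)

    A = np.array([[1.0, 0.0, -ds * np.sin(th)],
                  [0.0, 1.0,  ds * np.cos(th)],
                  [0.0, 0.0,  1.0]])                 # (7)

    c, s = np.cos(th), np.sin(th)
    W = R_WHEEL * dt * np.array(
        [[0.5 * c - ds / (2 * L_AXLE) * s, 0.5 * c + ds / (2 * L_AXLE) * s],
         [0.5 * s + ds / (2 * L_AXLE) * c, 0.5 * s - ds / (2 * L_AXLE) * c],
         [1.0 / L_AXLE, -1.0 / L_AXLE]])             # (8)

    Q = np.diag([DELTA * omega_R**2, DELTA * omega_L**2])   # (9)
    P_pred = A @ P @ A.T + W @ Q @ W.T                      # (6)
    return x_pred, P_pred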
2. Correction step with measurement update equations:

K_k = P_k^- H_k^T (H_k P_k^- H_k^T + R_k)^{-1}   (10)

x̂_k = x̂_k^- + K_k (z_k − h(x̂_k^-))   (11)

P_k = (I − K_k H_k) P_k^-   (12)

where x̂_k ∈ R^n is the a posteriori state estimate at step k given the measurement z_k, K_k is the Kalman gain, and H_k is the Jacobian matrix of partial derivatives of h with respect to x, evaluated at x̂_k^-:

      [ -C_1 cos β_1   -C_1 sin β_1    0 ]
      [  0              0             -1 ]
H_k = [  ...            ...           .. ]   (13)
      [ -C_N cos β_N   -C_N sin β_N    0 ]
      [  0              0             -1 ]
      [  0              0              1 ]

R_k is the covariance matrix of the measurement noise, estimated from the measurements of the compass sensor and the LRF as follows.

Figure 8. The line parameters (ρj, βj) according to the global coordinates, and the line parameters (ri, ψi) according to the robot coordinates

At the first scan of the LRF, a global map of the environment is constructed, composed of a set of line segments described by the parameters βj and ρj. The line equation in normal form is:

xG cos βj + yG sin βj = ρj   (14)

When the robot moves, a new LRF scan is performed and a new map of the environment, namely the local map, is constructed. It also consists of a set of line segments, described by the equation:

xR cos ψi + yR sin ψi = ri   (15)

where ψi and ri are the line parameters (Fig. 8). The line segments of the global and local maps are then matched using the weighted line fitting algorithm [11]. The matched line parameters r_i and ψ_i from the local map are collected in the vector z_k, which is used as the input for the correction step of the EKF to update the robot's state:

z_k = [r_1, ψ_1, ..., r_N, ψ_N, ϕ_k]^T   (16)

where ϕ_k is the signal measured by the compass sensor. From the robot pose estimated by odometry, the parameters ρ_j and β_j of the line segments in the global map (in global coordinates) are transformed into the predicted parameters r̂_i and ψ̂_i (in robot coordinates) by:

r̂_i = C_j (ρ_j − x̂_k cos β_j − ŷ_k sin β_j)   (17)

ψ̂_i = β_j − θ̂_k + π (0.5 − 0.5 sgn(C_j)),  ϕ̂_k = θ̂_k   (18)

where C_j = ±1 is chosen so that r̂_i is non-negative. The covariance matrix R_k of the measurement noise has a diagonal structure, in which the block for the ith matched line contains the variances of r_i and ψ_i and the last entry is the variance of the compass measurement:

R_k ≅ diag( var(r_1), var(ψ_1), ..., var(r_N), var(ψ_N), var(ϕ_k) )   (19)

From (16) and (19), the data of the compass sensor is used to correct the robot orientation. At step k, this reading is employed as the absolute measurement of the orientation element θ_k. The noise of this measurement is taken from the manufacturer's specification: the accuracy of 0.1° corresponds to a noise variance of 0.01. Collected into the covariance matrix (19), this gives R_k for the correction step of the EKF.
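To complement the equations above, the following minimal Python sketch carries out the correction step (10)-(12) for the measurement vector (16). It assumes the weighted line fitting of [11] has already paired each observed local line with a global line (ρ_j, β_j) and determined its sign C_j; the names, and the sign and angle-offset convention of (17)-(18), follow our reconstruction above and are illustrative.

import numpy as np

def ekf_correct(x_pred, P_pred, z, lines, var_r, var_psi, var_phi):
    """x_pred = np.array([x, y, theta]); P_pred = 3x3 covariance;
    z = np.array([r_1, psi_1, ..., r_N, psi_N, phi]) as in (16);
    lines = list of (rho_j, beta_j, C_j) for the matched global-map lines;
    var_r, var_psi, var_phi = measurement noise variances used in (19)."""
    N = len(lines)
    h = np.zeros(2 * N + 1)
    H = np.zeros((2 * N + 1, 3))
    xk, yk, th = x_pred
    for i, (rho, beta, C) in enumerate(lines):
        h[2 * i] = C * (rho - xk * np.cos(beta) - yk * np.sin(beta))   # (17)
        h[2 * i + 1] = beta - th + np.pi * (0.5 - 0.5 * np.sign(C))    # (18)
        H[2 * i] = [-C * np.cos(beta), -C * np.sin(beta), 0.0]         # (13)
        H[2 * i + 1] = [0.0, 0.0, -1.0]
    h[-1] = th                    # predicted compass reading
    H[-1] = [0.0, 0.0, 1.0]

    R = np.diag([var_r, var_psi] * N + [var_phi])                      # (19)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)                                # (10)
    # In practice the angle components of (z - h) should also be wrapped to [-pi, pi].
    x_upd = x_pred + K @ (z - h)                                       # (11)
    P_upd = (np.eye(3) - K @ H) @ P_pred                               # (12)
    return x_upd, P_upd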
IV. EXPERIMENTS

The setup of the sensor system was described in Section II. In this section, experiments are conducted to evaluate the efficiency of the fusion algorithm.

A. Experimental Setup

A rectangular, flat-walled environment constructed from several wooden plates, surrounded by a cement wall, is set up. The robot is a two-wheeled, differential-drive mobile robot. Its wheel diameter is 10 cm and the distance between the two drive wheels is 60 cm. The speed stability of the motors during each sampling time step is an important factor in maintaining the efficiency of the EKF. For that reason, the motors are controlled by microprocessor-based electronic circuits which implement a PID algorithm. The speed stability, checked by a measuring program written in LabVIEW, is ±5%. The compass sensor has an accuracy of 0.1°. The LRF has an accuracy of 30 mm in distance and 0.25° in deflection angle. The sampling time ∆t of the EKF is 100 ms.

The proportionality factor δ of the input-noise covariance matrix Q_k is estimated experimentally as follows. The deviation between the true position and the position estimated by the kinematic model is determined when driving the robot straight forward several times (from the minimum to the maximum tangential speed of the robot). The deviation between the true robot orientation and the orientation estimated by the kinematic model is also determined when rotating the robot around its own axis several times (from the minimum to the maximum angular speed of the robot). From these position and orientation errors, the parameter δ is calculated. In our system, δ is estimated to be 0.01.

Figure 9. Estimated robot trajectories under different EKF configurations

Figure 10. The deviation in the X and Y directions between the estimated positions and the real one

Figure 11. Trajectories of the robot moving along predefined paths: a) rounded rectangular path; b) rounded triangular path

B. Sensor Fusion Algorithm Evaluation

In order to evaluate the efficiency of the fusion algorithm, different configurations of the EKF were implemented. Fig. 9 shows the trajectories of a robot movement estimated by four methods: odometry, the EKF with the compass sensor, the EKF with the LRF, and the EKF with the combination of LRF and compass sensor. The deviations between each trajectory and the real one are shown in Fig. 10.

In another experiment, the robot is programmed to follow predefined paths under two different scenarios: with and without the EKF. Fig. 11a shows the trajectories of the robot moving along a rounded rectangular path, in which the trajectory marked with dots corresponds to operation without the EKF and the one marked with pluses corresponds to operation with the EKF. The trajectories for the case in which the robot follows a rounded triangular path are shown in Fig. 11b. It is concluded that the EKF algorithm improves the robot localization, and that its combination of LRF and compass sensor gives the best result.

C. Autonomous Navigation

In this experiment, we evaluate the applicability of the proposed fusion approach in a real autonomous navigation application. The goal is to navigate the mobile robot in a closed loop along the line segments of the global map extracted from the first scan of the LRF. Fig. 12 shows the extracted map and the successful trajectory of the robot during the navigation. A sequence of images showing the motion of the robot in the experiment is given in Fig. 13.

Figure 12. Line segments of a global map and the trajectory of the robot

Figure 13. A sequence of images showing the motion of the robot in an experimental environment during the autonomous navigation operation

V. CONCLUSION

It is necessary to develop a perceptual system for the mobile robot. The system is required not only to work well but also to support various levels of perception. In this work, many types of sensors, including position/speed encoders, integrated sonar ranging sensors, a compass sensor, a laser range finder, the global positioning system (GPS) and a vision system, have been implemented on a real mobile robot platform.
An EKF has been designed to fuse the raw data of the compass sensor and the LRF. Experiments show that this combination significantly improves the accuracy of robot localization and is sufficient to ensure successful robot navigation. Further investigation will continue with more sensor combinations to better support localization in outdoor environments.

ACKNOWLEDGMENT

This work was supported by the Vietnam National Foundation for Science and Technology Development (NAFOSTED).

REFERENCES

[1] H. F. Durrant-Whyte, "Consistent integration and propagation of disparate sensor observations," Int. J. Robot. Res., vol. 6, no. 3, pp. 3-24, 1987.
[2] H. F. Durrant-Whyte, "Sensor models and multi-sensor integration," Int. J. Robot. Res., vol. 7, no. 6, pp. 97-113, 1988.
[3] Y. C. Tang and C. S. G. Lee, "A geometric feature relation graph formulation for consistent sensor fusion," in Proc. IEEE Int. Conf. Syst., Man, Cybern., Los Angeles, CA, 1990, pp. 188-193.
[4] J. L. Crowley, "World modeling and position estimation for a mobile robot using ultrasonic ranging," in Proc. IEEE Int. Conf. Robot. Automat., 1989, pp. 674-680.
[5] T. Skordas, P. Puget, R. Zigmann, and N. Ayache, "Building 3-D edge-lines tracked in an image sequence," in Proc. Intell. Autonomous Systems-2, Amsterdam, 1989, pp. 907-917.
[6] [Online] http://www.robot-electronics.co.uk/htm/cmps3tech.htm
[7] [Online] http://www.holux.com
[8] N. Winters et al., "Omni-directional vision for robot navigation," in Proc. IEEE Workshop on Omnidirectional Vision, Hilton Head Island, USA, Jun. 2000.
[9] Sick AG, Telegrams for Operating/Configuring the LMS 2xx (Firmware Version V2.30/X1.27), www.sick.com, Germany, 2006.
[10] T. T. Hoang, D. A. Viet, and T. Q. Vinh, "A 3D image capture system using a laser range finder," in Proc. 2nd IEICE International Conference on Integrated Circuit Design (ICDV), Vietnam, Oct. 2011.
[11] S. T. Pfister, S. I. Roumeliotis, and J. W. Burdick, "Weighted line fitting algorithms for mobile robot map building and efficient data representation," in Proc. IEEE Int. Conf. Robotics and Automation (ICRA '03), vol. 1, pp. 1304-1311, 2003.
[12] S. Shoval, A. Mishan, and J. Dayan, "Odometry and triangulation data fusion for mobile-robots environment recognition," Control Engineering Practice, vol. 6, pp. 1383-1388, 1998.
[13] L. Jetto, S. Longhi, and D. Vitali, "Localization of a wheeled mobile robot by sensor data fusion based on a fuzzy logic adapted Kalman filter," Control Engineering Practice, vol. 7, pp. 763-771, 1999.
[14] L. Jetto, S. Longhi, and G. Venturini, "Development and experimental validation of an adaptive extended Kalman filter for the localization of mobile robots," IEEE Transactions on Robotics and Automation, vol. 15, no. 2, pp. 219-229, 1999.