Proposal of Algorithms for Navigation and Obstacle Avoidance of an Autonomous Mobile Robot

T. T. Hoang, D. T. Hiep, P. M. Duong, N. T. T. Van, B. G. Duong and T. Q. Vinh
Department of Electronics and Computer Engineering
University of Engineering and Technology
Vietnam National University, Hanoi
thuanhoang@donga.edu.vn

Abstract— This paper presents algorithms for the navigation and obstacle avoidance of an indoor autonomous mobile robot. A laser range finder is used to obtain 3D images of the environment. A new algorithm, named 3D-to-2D image pressure and boundary detection (IPaBD), is proposed to create a 2D global map from the 3D images. This map is the basis for trajectory design. A tracking controller is developed to control the robot to follow the trajectory. Obstacle avoidance is addressed with sonar sensors. An improved vector field histogram (Improved-VFH) algorithm is presented, with improvements to overcome some limitations of the original VFH. Experiments have been conducted and the results are encouraging.

Keywords: 3D laser image; 2D laser scanner; localization; navigation; obstacle avoidance.

I. INTRODUCTION

The navigation process of an autonomous mobile robot can be divided into four steps: environment mapping, trajectory design, motion control, and obstacle avoidance. To obtain good navigation performance, it is necessary to have two separate parts: global and local navigation. The global process uses information from sensors that have sufficient visibility of the environment, such as a 3D laser scanner or a camera, to generate an overall path plan and control strategy. In an unstructured environment such as an indoor space, the robot must survey the whole environment by itself to create a map and design a trajectory that leads it to the destination. When high precision is required, camera images are often insufficient because they depend on lighting conditions and on the surface structure of objects. In addition, a vision system cannot directly measure geometric parameters such as the distances between objects. A stereo camera can partly overcome this problem, but the computation is expensive and the accuracy is low [1, 2]. The laser range finder is a more appropriate choice: it measures distances at a high rate and with high precision while being insensitive to environmental conditions [3]. Under the constraint that the robot always moves on the floor plane, the navigation task can be executed directly on the pixel cloud of the 3D environment [4, 5]. However, such algorithms are complicated and computationally expensive.

In this paper, we propose an algorithm that builds a 2D global map from 3D images for the navigation problem. The algorithm, based on 3D-to-2D image compression, exploits the idea that pixels at the same (x, y) position but at different heights can be represented by a single pixel describing the occupancy or vacancy of the space the robot moves through. Based on the created 2D map, the A* algorithm is employed to design the global trajectory [6]. The motion controller that follows the designed trajectory is derived from the Lyapunov stability criterion [7-9]. In local navigation, the robot needs to react to changes in the environment, such as the sudden appearance of obstacles. Sonar sensors have been proven effective for this task [10-13]. A well-known algorithm is the virtual force field (VFF) method [11, 13].
Nevertheless, this method has a weak point: when the robot moves from one cell to another in the grid, the sum of the repulsive forces changes considerably in both magnitude and direction, so there are cases in which the robot cannot pass through free space such as an open doorway, because the repulsive forces from the two sides of the door combine into a resultant force that pushes the robot away from it [12]. To overcome this problem, we applied the vector field histogram (VFH) method of J. Borenstein [13] and tuned it to suit our robot's configuration. This method, called Improved-VFH, uses polar coordinates to control the robot so that it avoids obstacles in a region from 0.3 m to 4 m.

II. CONSTRUCTING THE 2D MAP FROM THE CLOUD OF 3D PIXELS OF THE ENVIRONMENT

A. Collecting data from the laser sensor

Figure 1 shows the laser range finder LMS-211 (LRF) together with the other sensors of a mobile robot designed at our laboratory. The distance to a point on an obstacle is measured by determining the round-trip time of a laser pulse reflected from the obstacle. To obtain a 3D image of the environment, the LRF is mounted in a configuration that allows it to pitch up and down around a horizontal axis, as described in [3]. In this work, the LRF parameters are adjusted so that in every upward pitching cycle the sensor captures 94 2D frames, with pitch angles in the range from -5° to +20° in steps of 0.266°. Each 2D frame contains a data set corresponding to a horizontal scan of the laser beam, with scanning angle βk in the range from 40° to 140° at a resolution of 1°. Consequently, the obtained 3D image is a cloud of 9,494 pixels. The image is linear in height due to the constant rate of the pitching rotation.

Figure 1. The laser range finder and sonar sensors, together with other sensors mounted on the mobile robot [14, 15].

Since the sensor is mounted 0.4 m above the floor and its minimum pitch angle is -5°, it can only detect objects at distances of at least 4.58 m. Obstacles closer to the robot are detected by eight sonar sensors. The combination of data from the laser range finder and the sonar sensors gives the robot an image of the whole environment.

B. Constructing the 2D global map from 3D laser images

At a pitch angle αj, the robot obtains a 2D image frame consisting of pixels determined by the horizontal angles βk and the distances Rk to the objects. From a set of 2D image frames, the Cartesian coordinates of the pixels that form the 3D image of the environment are determined as follows [3]:

x_{k,j} = R_k \cos\alpha_j \cos\beta_k, \quad y_{k,j} = R_k \cos\alpha_j \sin\beta_k, \quad z_{k,j} = R_k \sin\alpha_j \qquad (1)

The projection of the object pixels onto a plane x-y parallel to the floor can be used to construct the 2D map. Taking the union (∪) of all pixels that have the same coordinates (x, y) along the z direction creates a unique 2D map on the x-y plane, as shown in Figure 2.

Figure 2. Pressing the frames of the 3D image to obtain the 2D map.

The result is a matrix of pixels in which every pixel is expressed by a pair of parameters (βk, Rk), where βk is the scanning angle of the k-th laser ray and Rk is the distance measured to the pixel. The relation between these two parameters is not single-valued in each 2D frame: one horizontal scanning angle βk may correspond to one or several points with different Rk (associated with different heights z).
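As an illustration, the following is a minimal sketch of equation (1) and of the "pressing" (union along z) step, assuming the pixel cloud is given as flat NumPy arrays of measurements (Rk, αj, βk) in meters and radians; the 5 cm cell size is an assumption for the sketch, not a value from the paper.

```python
import numpy as np

def cloud_to_xyz(R, alpha, beta):
    """Equation (1): convert LRF readings (distance R, pitch angle alpha,
    horizontal scanning angle beta, both in radians) to Cartesian points."""
    x = R * np.cos(alpha) * np.cos(beta)
    y = R * np.cos(alpha) * np.sin(beta)
    z = R * np.sin(alpha)
    return np.stack([x, y, z], axis=1)

def press_to_2d(points, cell=0.05):
    """Press the 3D cloud onto the x-y plane: all points falling into the
    same (x, y) cell are merged (union along z) into one occupied pixel."""
    ij = np.floor(points[:, :2] / cell).astype(int)  # 2D grid indices
    return np.unique(ij, axis=0) * cell              # occupied cells (m)
```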
Figure 3. Detecting the pixels with the minimum distance (circular points on the solid lines).

The generated 2D image represents the surface of the obstacles in front of the robot. It is necessary to determine the boundaries of the image regions, as they will be used for path planning and obstacle avoidance. As shown in Figure 3, this can be done by selecting, among the pixels received at the same laser scanning angle, the pixel (circular point) with the smallest distance to the robot; the remaining pixels (square points) are discarded. After this processing step, the relation between β and R becomes single-valued. A segmentation algorithm is then applied; the PDBS method is employed [16]. To further simplify the image, there is no need to process pixels higher than the robot, as they do not affect the robot's navigation. Let z_limit be the robot height; all pixels with height greater than z_limit can be removed.

In summary, the steps to construct the 2D map from a 3D image are as follows:

Step 1: Starting from the cloud of 3D pixels, take the union of all pixels along the vertical direction z.
Step 2: For each horizontal scanning angle βk, find and select the value Rk = Rmin.
Step 3: Remove points with z > z_limit and with z = 0 (corresponding to the floor).
Step 4: Perform the image segmentation algorithm (if necessary).

We call this algorithm 3D-to-2D image pressure and boundary detection (IPaBD).

III. TRAJECTORY DESIGN AND MOTION CONTROL

A. Trajectory design

The next step is to find the optimal path for the robot to reach the destination from the initial position (path planning). In the IPaBD algorithm, the scanned pixels representing the robot's surroundings can be described as a set of line segments as in [17], where the segments correspond to obstacles such as walls, doors, or a lobby. Based on these line segments, the floor space is discretized into a grid-cell map in which each cell has size (a × a) cm², where a equals one third of the diameter of the robot chassis. This grid map can be represented as a matrix of 0s and 1s, where 1 indicates an occupied cell. A "dilation" process extends each occupied region by two more cells to ensure that the robot cannot collide with it. Each line segment extracted from the scanned pixels is characterized in Cartesian coordinates by two nodal points, from which segment parameters such as length, line equation, and slope are easily determined. In the grid map, the cells of the line segment created by the two nodal points have the value 1. Let (x_{1,L}^i, y_{1,L}^i) and (x_{2,L}^i, y_{2,L}^i) be the coordinates of the two nodal points of the i-th obstacle (i = 1…n). The coordinates of these two points in the grid map are calculated as:

x_{j,G}^{i} = \frac{x_{j,L}^{i} - x_{\min}}{cellsize}, \qquad y_{j,G}^{i} = \frac{y_{j,L}^{i} - y_{\min}}{cellsize}, \qquad i = 1 \ldots n, \; j = 1, 2

where cellsize is the size of a grid cell. The cells occupied by the line segment defined by x_{j,G}^i and y_{j,G}^i are determined by the following condition:

\left| \frac{m - x_{1,G}^{i}}{x_{2,G}^{i} - x_{1,G}^{i}} - \frac{n - y_{1,G}^{i}}{y_{2,G}^{i} - y_{1,G}^{i}} \right| \le T \quad \text{if } x_{1,G}^{i} \ne x_{2,G}^{i}, \qquad \left| m - x_{1,G}^{i} \right| \le T \quad \text{if } x_{1,G}^{i} = x_{2,G}^{i}

where (m, n) is the coordinate of a cell in the grid map and T is the size of the robot. From the occupied-cell data, we employ the A* algorithm [6] to find the shortest path from the starting cell S(xS, yS) to the destination D(xD, yD) (Figure 9); a sketch of this search is given below.
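For concreteness, the following is a compact sketch of A* over the binary occupancy grid. The 4-connected neighborhood and the Manhattan-distance heuristic are assumptions for the sketch; the paper does not specify the connectivity or the heuristic used.

```python
import heapq

def astar(grid, start, goal):
    """A* over a binary occupancy grid (1 = occupied), 4-connected moves,
    Manhattan heuristic. start/goal are (row, col); returns a path or None."""
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    frontier = [(h(start), 0, start, None)]   # entries: (f, g, cell, parent)
    parent, g_cost = {}, {start: 0}
    while frontier:
        _, g, cur, par = heapq.heappop(frontier)
        if cur in parent:                     # already expanded
            continue
        parent[cur] = par
        if cur == goal:                       # walk back to the start
            path = [cur]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
        m, n = cur
        for nb in ((m + 1, n), (m - 1, n), (m, n + 1), (m, n - 1)):
            if (0 <= nb[0] < len(grid) and 0 <= nb[1] < len(grid[0])
                    and grid[nb[0]][nb[1]] == 0
                    and g + 1 < g_cost.get(nb, float("inf"))):
                g_cost[nb] = g + 1
                heapq.heappush(frontier, (g + 1 + h(nb), g + 1, nb, cur))
    return None
```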
B. Motion control

After path planning, a control algorithm must be developed to navigate the robot along the designed trajectory. In this paper we introduce a new control algorithm as follows.

Figure 4. A nonholonomic mobile robot platform.

Let v and ω be the tangential and angular velocities of the robot, respectively. We have:

\dot{\mathbf{q}} = \begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta} \end{bmatrix} = \begin{bmatrix} \cos\theta & 0 \\ \sin\theta & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} v \\ \omega \end{bmatrix}; \qquad \mathbf{v} = \begin{bmatrix} v \\ \omega \end{bmatrix} = \begin{bmatrix} \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2R} & -\frac{1}{2R} \end{bmatrix} \begin{bmatrix} v_r \\ v_l \end{bmatrix} \qquad (2)

where v_r and v_l are the velocities of the right and left wheels, and 2R is the distance between the two wheels (Figure 4). The objective of the control problem is to design an adaptive control law so that the position vector q and the velocity vector q̇ track the desired position vector q_r(t) and velocity vector q̇_r(t), under the condition that the robot parameters are not exactly known. The desired position and velocity vectors are represented by:

\mathbf{q}_r = [x_r \;\; y_r \;\; \theta_r]^T, \qquad \dot{x}_r = v_r\cos\theta_r, \quad \dot{y}_r = v_r\sin\theta_r, \quad \dot{\theta}_r = \omega_r, \qquad \text{with } v_r > 0 \text{ for all } t \qquad (3)

The position tracking error between the reference robot and the actual robot can be expressed in the robot's body frame as [18]:

\mathbf{e}_p = \begin{bmatrix} e_1 \\ e_2 \\ e_3 \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_r - x \\ y_r - y \\ \theta_r - \theta \end{bmatrix} \qquad (4)

The derivative of the position tracking error is:

\dot{\mathbf{e}}_p = \begin{bmatrix} \dot{e}_1 \\ \dot{e}_2 \\ \dot{e}_3 \end{bmatrix} = \begin{bmatrix} \omega e_2 - v + v_r\cos e_3 \\ -\omega e_1 + v_r\sin e_3 \\ \omega_r - \omega \end{bmatrix} \qquad (5)

There are several methods in the literature for selecting a smooth velocity input. In this research, we choose a new control law for v and ω:

\begin{bmatrix} v \\ \omega \end{bmatrix} = \begin{bmatrix} v_r\cos e_3 + k_1 e_1 \\ \omega_r + k_3 e_3 + v_r \dfrac{\sin e_3}{e_3}\, e_2 \end{bmatrix} \qquad (6)

where k_1, k_3 > 0. Note that as e_3 → 0, sin(e_3)/e_3 → 1, so ω is always bounded. With the control law (6), the asymptotic stability of the system is easy to prove. Choose the positive definite function:

V_p = \frac{1}{2}\mathbf{e}_p^T\mathbf{e}_p = \frac{1}{2}\left(e_1^2 + e_2^2 + e_3^2\right) \qquad (7)

The derivative of V_p with respect to time is:

\dot{V}_p = \mathbf{e}_p^T\dot{\mathbf{e}}_p = e_1\dot{e}_1 + e_2\dot{e}_2 + e_3\dot{e}_3 = e_1(\omega e_2 - v + v_r\cos e_3) + e_2(-\omega e_1 + v_r\sin e_3) + e_3(\omega_r - \omega) = -e_1 v + e_1 v_r\cos e_3 + e_2 v_r\sin e_3 + e_3(\omega_r - \omega) \qquad (8)

Substituting (6) into (8) gives:

\dot{V}_p = -e_1\left(v_r\cos e_3 + k_1 e_1\right) + e_1 v_r\cos e_3 + e_2 v_r\sin e_3 - e_3\left(k_3 e_3 + v_r\frac{\sin e_3}{e_3}\, e_2\right) = -k_1 e_1^2 - k_3 e_3^2 \qquad (9)

Since V̇_p is uniformly continuous and V_p is bounded, Barbalat's lemma gives V̇_p → 0 as t → ∞, and consequently e_1 → 0 and e_3 → 0 as t → ∞. Applying Barbalat's lemma again to the derivatives, we obtain:

\dot{e}_1 \to 0, \quad \dot{e}_3 \to 0 \qquad (10)

and equation (6) then yields:

v \to v_r \qquad (11)
\omega \to \omega_r \qquad (12)

Combining (5), (10), (11) and (12) gives e_2 → 0 and ė_2 → 0. Thus the control law (6) guarantees that the tracking error e_p → 0 as t → ∞.

C. Obstacle avoidance

As discussed in Section I, we developed an obstacle avoidance method named Improved-VFH. This method uses a grid (net chart) as in VFH, but the updating law is changed as described in Figure 5. Each time information is read from the ultrasonic sensors, the algorithm assigns the value 2 or 3 to the cells shown in the figure; all other cells are assigned the value 0. The conversion from the grid to the polar histogram is inherited from VFH, and a sketch is given below.
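The grid-to-polar-histogram conversion follows the original VFH [13]: each occupied cell contributes, to the sector containing its bearing, a magnitude proportional to the square of its certainty value and decreasing with distance. The sector width ALPHA and the constants A and B below are assumptions for illustration, not values from the paper.

```python
import numpy as np

ALPHA = 5.0        # assumed sector width in degrees (72 sectors)
A, B = 1.0, 0.25   # assumed magnitude constants: m = c^2 * (A - B*d)

def polar_histogram(grid, cell_size, robot_rc):
    """Convert the certainty grid (values 0, 2 or 3) into a polar obstacle
    density histogram around the robot cell robot_rc = (row, col)."""
    hist = np.zeros(int(360 / ALPHA))
    for i, j in zip(*np.nonzero(grid)):
        dy = (i - robot_rc[0]) * cell_size
        dx = (j - robot_rc[1]) * cell_size
        d = np.hypot(dx, dy)
        if d == 0.0:
            continue
        k = int(np.degrees(np.arctan2(dy, dx)) % 360.0 // ALPHA)
        hist[k] += grid[i, j] ** 2 * max(A - B * d, 0.0)  # VFH magnitude
    return hist
```

Candidate "slots" (openings) are then the runs of consecutive sectors whose density stays below a threshold.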
After that, the process of calculating the directional angle θd from these slots is divided into two steps:

Step 1: Find the candidate slots restricted to the range of sectors satisfying |θd − θ| < 100°. If a slot satisfying this condition exists, the robot steers its heading so that θ = θd.

Step 2: If no slot satisfies the above condition, the robot performs a left or right rotation of 100°. The robot rotates left if θ < θt and right otherwise, where θt = atan2(yt − y, xt − x), with (xt, yt) the coordinates of the target and (x, y) the current coordinates of the robot. The algorithm then searches for a new slot in the range θ ± 100°, i.e., it returns to Step 1.

If the range θ ± 100° contains more than one slot, the algorithm selects a slot depending on the case:

Case 1: If |θt − θ| < 90°, the algorithm selects the slot whose directional angle θd minimizes |θd − θt|.

Case 2: If |θt − θ| ≥ 90°, the algorithm selects the slot whose directional angle θd minimizes |θd − θ|.

The angular velocity control law is chosen in a simple way: ω = 10(θd − θ), with |ω| ≤ 25°/s. The tangential velocity control law is chosen as follows:

• If the robot is far from the obstacle, V = Vmax = 0.5 m/s.
• If the robot is near the obstacle, V = 5(d30 − 0.4) m/s, where d30 is the distance from the robot to the nearest obstacle within the range −30° to +30° in the polar coordinate system attached to the robot.
• If |ω| ≥ 10°/s, the tangential velocity is replaced by V* = V/2.5.

Figure 5. Structure for updating environment information in the grid of Improved-VFH.

IV. EXPERIMENT AND DISCUSSION

A. The result of constructing the map by the IPaBD algorithm

Figure 6 shows a 3D image obtained by the laser range finder, with starting point S(0, 0) and destination D(0, 7.6). In the image there is a gate (A) whose girder is lower than the robot height and a gate (B) whose girder is higher; the robot height in the experiment is 1.2 m. There is also a corridor next to the gates whose height allows the robot to pass. The robot has to choose by itself the shortest path from the starting position S to the destination D without colliding with obstacles.

Figure 6. The global 3D image of the environment.

Figure 7 shows the compressed 2D map obtained by applying the union (∪) operator to the 9,494 pixels of the 3D image. Figure 8 presents the map constructed by the IPaBD algorithm.

Figure 7. The 2D map with all 3D pixels pressed onto the x-y plane.

Figure 8. The map obtained by the IPaBD algorithm.

B. Path planning with the A* algorithm

From the 2D map, the A* algorithm is applied to find the path from the starting position S to the destination D. As shown in Figure 9, the map is divided into cells of size 40 × 40 cm². Since the size of the robot is 80 × 80 cm², the occupied regions (shown as grey cells) must be extended to avoid collisions. The resulting path shows that the robot does not pass through gate (A), since it detects the girder as an obstacle. It also does not take the corridor (C), as that path is longer. Instead, it goes through gate (B), as expected.

Figure 9. Robot navigation by the A* algorithm.

C. Controlling the robot to track the trajectory

With the designed trajectory, we used the control law (6) to control the robot to follow it; a minimal sketch of one iteration of the controller is given below. The result is shown in Figure 10.
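The sketch combines the error transformation (4) with the velocity commands (6). The gains K1 and K3 are assumed values; the paper does not report the gains used in the experiment.

```python
import math

K1, K3 = 1.0, 1.5   # assumed gains k1, k3 > 0 (not reported in the paper)

def tracking_control(pose, ref_pose, v_r, w_r):
    """One iteration of control law (6). pose and ref_pose are (x, y, theta);
    v_r, w_r are the reference velocities. Returns the commands (v, omega)."""
    x, y, th = pose
    xr, yr, thr = ref_pose
    # Tracking error expressed in the robot frame, equation (4)
    e1 = math.cos(th) * (xr - x) + math.sin(th) * (yr - y)
    e2 = -math.sin(th) * (xr - x) + math.cos(th) * (yr - y)
    e3 = thr - th
    # sin(e3)/e3 -> 1 as e3 -> 0, which keeps omega bounded
    sinc_e3 = math.sin(e3) / e3 if abs(e3) > 1e-9 else 1.0
    v = v_r * math.cos(e3) + K1 * e1
    w = w_r + K3 * e3 + v_r * sinc_e3 * e2
    return v, w
```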
In the experiment, the tangential velocity of the robot was between 0.2 m/s and 0.4 m/s.

Figure 10. The real trajectory of the robot (red curve) tracks the designed trajectory (blue curve).

D. Avoiding obstacles using ultrasonic sensors

The experiment was carried out by placing an unexpected obstacle O on the designed trajectory. Figure 11 shows that the robot avoids the obstacle and then continues to track the global trajectory to arrive at the destination. Figure 12 is a sequence of images showing the robot motion during the experiments.

Figure 11. Avoiding an unexpected obstacle on the robot's trajectory.

Figure 12. A sequence of images showing the robot motion during the experiments.

V. CONCLUSION

This paper presents new algorithms for the navigation and obstacle avoidance of an autonomous mobile robot. The navigation pipeline is developed from environment data extraction to mapping, path planning, and trajectory tracking. Obstacle avoidance is accomplished for both known and unknown obstacles. The main contributions of the paper are the IPaBD mapping algorithm for path planning, the tracking control law for a robot system with partially unknown parameters, and the Improved-VFH algorithm for obstacle avoidance. Experiments on a real robot system have been conducted, and the results confirm the effectiveness of the proposed methods.

ACKNOWLEDGEMENT

This work was supported by the Vietnam National Foundation for Science and Technology Development (NAFOSTED).

REFERENCES

[1] Zi-xing Cai, Jin-xia Yu, Zhuo-hua Duan, and Xiao-bing Zou, "Design of the 3D perceptive system based on laser scanner for a mobile robot," IJCSNS International Journal of Computer Science and Network Security, vol. 6, no. 3B, March 2006.
[2] Andrzej Typiak, "Use of laser rangefinder to detecting in surroundings of mobile robot the obstacles," The 25th International Symposium on Automation and Robotics in Construction, June 26-29, 2008.
[3] T. T. Hoang, D. A. Viet, and T. Q. Vinh, "A 3D image capture system using a laser range finder," IEICE Proceedings of the 2nd International Conference on Integrated Circuit Design (ICDV), Vietnam, October 2011.
[4] P. de la Puente et al., "Extraction of geometrical features in 3D environments for service robotic applications," Springer-Verlag Berlin Heidelberg, 2008.
[5] Johannes Strom, Andrew Richardson, and Edwin Olson, "Graph-based segmentation for colored 3D laser point clouds," Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010.
[6] Robin R. Murphy, Introduction to AI Robotics, A Bradford Book, The MIT Press, Cambridge, London, England, 2000.
[7] Augie Widyotriatmo, Keum-Shik Hong, and Lafin H. Prayudhi, "Robust stabilization of a wheeled vehicle: hybrid feedback control design and experimental validation," Journal of Mechanical Science and Technology, vol. 24, no. 2, pp. 513-520, 2010.
[8] Thuan Hoang Tran, Manh Duong Phung, Van Tinh Nguyen, and Quang Vinh Tran, "A path following algorithm for wheeled mobile robot using extended Kalman filter," IEICE Proceedings of the 3rd International Conference on Integrated Circuit Design (ICDV), Vietnam, August 2012.
[9] Tua Agustinus Tamba, Bonghee Hong, and Keum-Shik Hong, "A path following control of an unmanned autonomous forklift," International Journal of Control, Automation, and Systems, vol. 7, no. 1, pp. 113-122, 2009.
[10] O. Khatib, "Real-time obstacle avoidance for manipulators and mobile robots," 1985 IEEE International Conference on Robotics and Automation, St. Louis, Missouri, March 25-28, 1985, pp. 500-505.
[11] http://www.users.isr.ist.utl.pt/~mir/pub/ObstacleAvoidance.pdf
[12] Y. Koren and J. Borenstein, "Potential field methods and their inherent limitations for mobile robot navigation," Proceedings of the IEEE Conference on Robotics and Automation, Sacramento, California, April 7-12, 1991, pp. 1398-1404.
[13] J. Borenstein and Y. Koren, "The vector field histogram - fast obstacle avoidance for mobile robots," IEEE Transactions on Robotics and Automation, vol. 7, no. 3, June 1991, pp. 278-288.
[14] T. T. Hoang, P. M. Duong, N. T. T. Van, D. A. Viet, and T. Q. Vinh, "Multi-sensor perceptual system for mobile robot and sensor fusion-based localization," IEEE International Conference on Control, Automation and Information Sciences (ICCAIS 2012), Ho Chi Minh City, Vietnam, December 26-28, 2012, pp. 259-264.
[15] T. T. Hoang, P. M. Duong, N. T. T. Van, D. A. Viet, and T. Q. Vinh, "Development of an EKF-based localization algorithm using compass sensor and LRF," 12th IEEE International Conference on Control, Automation, Robotics & Vision (ICARCV 2012), Guangzhou, China, December 5-7, 2012.
[16] Cristiano Premebida and Urbano Nunes, "Segmentation and geometric primitives extraction from 2D laser range data for mobile robot applications," Robótica 2005 - Actas do Encontro Científico, Coimbra, Portugal, April 29, 2005.
[17] Tran Hiep Dinh, Manh Duong Phung, Thuan Hoang Tran, and Quang Vinh Tran, "Localization of a unicycle-like mobile robot using LRF and omni-directional camera," 2012 IEEE International Conference on Control System, Computing and Engineering, Penang, Malaysia, November 23-25, 2012.
[18] Y. Kanayama, Y. Kimura, F. Miyazaki, and T. Noguchi, "A stable tracking control method for an autonomous mobile robot," Proceedings of the IEEE International Conference on Robotics and Automation, 1990, pp. 1722-1727.