
Expanded Guide Circle-based Obstacle Avoidance for the Remotely Operated Mobile Robot

  • Park, Seunghwan (Intelligent Cognitive Technology Research Department, Electronics and Telecommunications Research Institute (ETRI)) ;
  • Kim, Gon-Woo (School of Electronics Engineering, Chungbuk National University)
  • Received : 2013.06.27
  • Accepted : 2013.12.30
  • Published : 2014.05.01

Abstract

For the remote operation of a mobile robot, the human operator depends entirely on the sensory information, which covers only part of the robot's workspace. It is usually very hard to operate the mobile robot fully manually in this situation. We propose an efficient guidance navigation method that improves the efficiency of remote operation by using an expanded guide circle built from the sensory information. The guidance command is generated by the proposed algorithm using the expanded guide circle. We evaluated the performance of the proposed algorithm through experiments.


1. Introduction

Recently, the applications of intelligent mobile robots have increased drastically in various fields, such as vacuum cleaning, telepresence, surveillance, rescue, exploration, guidance, military purposes, etc. [1-4]. In many applications, the mobile robot must be partially or fully operated by a human operator because of the technical limitations of autonomous mobility. In particular, remotely operated mobile robots are well suited to applications in hazardous environments such as space, nuclear plants, battlefields, underwater, and so forth.

Telerobotics is the area of robotics concerned with a human operator in control from a distance. The remotely operated mobile robot is one application of telerobotics. In this application, the mobile robot is generally operated by a human operator at a distance, which is called teleoperation. This approach is typically applied to robotic applications to keep the operator out of dangerous situations.

Remote operation is a method of operating a mobile robot remotely with a joystick or another control device, which may be plugged directly into the robot, connected wirelessly, or be a component of the remote controller. The remotely operated mobile robot system can be conceptually separated into two parts: the master, with the human operator and the operational components such as joysticks, monitors, keyboards, or other I/O devices, and the slave, with the robot, sensors, and control components.

For remote operation, the human operator relies entirely on the sensory information from the mobile robot. However, it is hard for the operator to acquire complete environmental information from the robot. In general, the operator controls the robot using limited information such as partially observable visual information, some range information, etc. It is very hard to operate the mobile robot fully manually in this situation. For efficient remote operation, several functions for assisting the operator are needed, such as obstacle avoidance, visualization of the environment, shared autonomy, etc. [7, 8]. In particular, it is necessary to actively intervene in the operation to avoid obstacles.

Obstacle avoidance methods have been widely developed by many researchers [9-18]. Borenstein et al. proposed the Vector Field Histogram (VFH) [9, 10], a popular local path planner based on the Virtual Force Field (VFF) computed in local polar histograms. Another well-known local path planner is the Potential Field Method (PFM) [11, 17]. In this method, the workspace of the mobile robot is covered with an artificial potential field composed of attractive and repulsive potentials. Besides these methods, many algorithms have been proposed using fuzzy algorithms, geometric transforms, virtual forces, etc. [12-16, 18]. Most of this research on obstacle avoidance has been developed as part of autonomous navigation.

Rönnbäck et al. [5, 6] proposed a method for path planning with obstacle avoidance amid circular objects. This method finds a sequence of circles in the workspace of the mobile robot, taking the circular data of the obstacles as input. The sequence represents a possible path for a holonomic mobile robot. In this paper, we adopt the circle-expansion concept from these previous works and apply it to the safe operation of the remotely operated mobile robot.

The purpose of this study is to improve operational convenience for remote operation through an efficient method of avoiding obstacles. The proposed method guides the remotely operated mobile robot to a safe area based on the remote control commands, which is a form of shared autonomy. The major contribution of this paper is a simple, new, and efficient method that guides the operator safely through the expansion of a circle based on the remote control commands. In this paper, we propose this novel and simple approach to obstacle avoidance using the expansion of the Guide Circle (hereinafter called “GC”).

This paper is organized as follows: In Section II, we introduce the proposed remote operation system. In Section III, we propose the guidance operation algorithm using the expanded Guide Circle. In Section IV, we verify the validity and effectiveness of the proposed algorithm through experiments. Finally, we conclude this study in Section V.

 

2. Remote Operation System

2.1 Configuration of the remote operation system

The proposed remote operation system is split into two components, the remote operator and the mobile robot, as shown in Fig. 1. The remote operator side consists of a control PC with a joystick as the user interface to get the user's command. The mobile robot is equipped with a control PC and a sensor. In the proposed system, an RGB-D camera is utilized to acquire visual and range information about the robot's environment. In particular, the RGB-D camera is used for measuring the distance to obstacles and building a 2D local grid map.

Fig. 1. Configuration of the remotely operated mobile robot system

The operation command from the user is transmitted to the control PC on the mobile robot, and the visual and range information is transmitted back to the control PC on the remote operator side. This information is visualized on the control PC to assist the user in the remote operation.

2.2 Kinematics of the mobile robot

To derive the kinematic model of the mobile robot, we define the reference frames as shown in Fig. 2. The pose of the mobile robot in the global reference frame can be defined as:

Fig. 2. Definition of the reference frames: global reference frame {G} and robot reference frame {R}

where xR, yR, and θR denote the position and orientation of a mobile robot, respectively.

The kinematic model of a differential-drive robot with wheel radius r can be expressed using the Jacobian matrix as:

where the input vector consists of the rotating speeds of the two wheels, and W is the distance between the two wheels.

Using the linear and angular velocity of a mobile robot, the speed of a mobile robot can be acquired as:
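The original Eqs. (1)-(3) are not reproduced above. For reference, a standard differential-drive formulation consistent with these definitions (wheel radius r, wheel separation W, and wheel angular speeds ω_r and ω_l, the latter two symbols being labels introduced here) is sketched as:

$$
q_R=\begin{bmatrix}x_R & y_R & \theta_R\end{bmatrix}^{T},\qquad
\begin{bmatrix}v_R\\ \omega_R\end{bmatrix}=
\begin{bmatrix}\tfrac{r}{2} & \tfrac{r}{2}\\ \tfrac{r}{W} & -\tfrac{r}{W}\end{bmatrix}
\begin{bmatrix}\omega_r\\ \omega_l\end{bmatrix},\qquad
\dot{q}_R=\begin{bmatrix}\cos\theta_R & 0\\ \sin\theta_R & 0\\ 0 & 1\end{bmatrix}
\begin{bmatrix}v_R\\ \omega_R\end{bmatrix}
$$

Here the first relation is the pose vector defined above, the second maps the wheel speeds to the linear and angular velocity of the robot body, and the third expresses the resulting motion in the global frame {G}.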

2.3 Remote operation of the mobile robot

For the remote operation, we implement two control modes which can be manually selected by the user: the direct control mode and the shared autonomy mode. In the direct control mode, the mobile robot is directly controlled using the joystick command (xo, yo). A suitable conversion of the joystick command is needed because the robot is controlled using the velocity command (vR, ωR). The coordinate transformation between the joystick command space and the velocity space is shown in Fig. 3, and its equation is given in (4).

Fig. 3. Coordinate transformation between the joystick command space and the velocity space

where γv and γω are the scale factors for the linear and the angular velocity, respectively. The maximum values of the linear and the angular velocity of the mobile robot are denoted as vmax and ωmax.
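The exact form of Eq. (4) is not reproduced here. As a minimal sketch only, assuming the joystick axes (xo, yo) are normalized to [-1, 1] and mapped linearly onto the velocity limits (the axis-to-velocity assignment below is an assumption), the conversion could look like:

def joystick_to_velocity(x_o, y_o, v_max=0.14, w_max=0.5, gamma_v=1.0, gamma_w=1.0):
    """Hypothetical joystick-to-velocity conversion (illustrative sketch of Eq. (4)).

    x_o, y_o are joystick commands normalized to [-1, 1]; gamma_v and gamma_w
    are the scale factors for the linear and the angular velocity.
    The result is clamped to the robot's maximum velocities v_max and w_max.
    """
    v_R = max(-v_max, min(v_max, gamma_v * y_o * v_max))  # forward/backward axis
    w_R = max(-w_max, min(w_max, gamma_w * x_o * w_max))  # left/right axis
    return v_R, w_R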

In the direct control mode, all movements of the mobile robot are controlled entirely manually by the operation command. Therefore, it is hard to guarantee the safety of the mobile robot while it is moving.

 

3. Guidance Operation using the Expanded Guide Circle

In this section, we propose a new, simple but efficient guidance operation algorithm using the expanded Guide Circle (GC) for the shared autonomy mode, motivated by the algorithm in [5, 6]. The proposed algorithm produces the guidance operation command based on the sensory information to avoid obstacles while preserving the user's operation command as much as possible.

The concept of the proposed algorithm is to evaluate, in advance, the degree of safety of the robot at the position to be reached by the velocity command (vR, ωR), and to modify the velocity command suitably using the expanded GC when the degree of safety falls below the reference value.

3.1 Definition of the safety index circle

When we operate the remote mobile robot, the operation command is not transmitted directly; it is modified by the proposed algorithm and then delivered in order to ensure safety. Before generating the modified operation command, we check in advance, using the expanded GC, the safety of the location the robot is predicted to reach with the modified command, and regulate the velocity of the mobile robot accordingly. According to the radius of the expanded GC, we can determine the safety of that area. For evaluating the degree of safety, the Safety Index Circle (hereinafter called “SIC”) can be defined (see Fig. 4).

Fig. 4. The definition of Safety Index Circle (SIC)

According to the radius of the SIC, the area around the robot can be divided into three regions: the unsafe, conditionally safe, and safe regions. Assume that the initial or the expanded GC has been found. Through the expansion of the GC, we obtain its range, which can be represented by the radius rc of the GC.

1) Safe Region (rc > rmax): If the radius of the GC is greater than rmax, the mobile robot can move safely because the SIC is included in the Safe Region. In this case, the robot is operated directly using the user's operation command.

2) Conditionally Safe Region (rmin < rc ≤ rmax): If the radius of the GC is less than or equal to rmax and greater than rmin, it is hard to guarantee the safety of the mobile robot. In this case, the degree of safety increases in proportion to the radius rc. If the initial GC falls within this region, we have to find the expanded GC using the algorithm proposed in the next subsection. If the expanded GC also falls within this region, the velocity of the mobile robot should be regulated for safe navigation.

3) Unsafe Region (rc ≤ rmin): If the radius of the GC is less than or equal to rmin, the risk of collision is very high when the mobile robot is moving. Therefore, it is better not to move in this situation, regardless of the operator's command.

The degree of safety can be defined using the safety index wSI. The safety index is utilized for regulating the linear and angular velocity of the mobile robot in the guidance operation proposed in the next subsection. When the expanded GC is found using the proposed algorithm, its safety can be evaluated using this index, which in turn affects the control command according to the degree of safety.
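Eq. (16) is not reproduced here. A simple realization of the safety index that is consistent with the three regions above, assuming wSI grows linearly with rc inside the conditionally safe band, is sketched below (the default bounds follow the experimental values in Section 4.1):

def safety_index(r_c, r_min=0.455, r_max=0.595):
    """Illustrative safety index w_SI computed from a guide-circle radius r_c.

    Unsafe region (r_c <= r_min): 0.0, the robot should not move.
    Safe region (r_c > r_max): 1.0, the user's command passes through.
    Conditionally safe region: a value increasing linearly with r_c.
    The exact form of Eq. (16) in the paper may differ from this sketch.
    """
    if r_c <= r_min:
        return 0.0
    if r_c > r_max:
        return 1.0
    return (r_c - r_min) / (r_max - r_min)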

3.2 Guide circle expansion algorithm

In this subsection, we present the guide circle expansion algorithm in order to produce the modified operation command for avoiding the obstacles.

Step 1) Define the Initial Guide Circle (Lines 1-6 in Table 1): When the velocity command (vR, ωR) from the operator is transmitted to the remote mobile robot, the robot position after a specific period of time can be predicted. For evaluating the safety of this position, the initial GC is expanded with this position at its center until the boundary of the initial GC meets an obstacle (green circle shown in Fig. 5). The predicted position is set as the center of the initial GC as shown below (Line 1 in Table 1).

Fig. 5. Define the initial Guide Circle (GC) in the robot local reference frame {R}: Process to find the center and the radius of the initial GC (top) and the method to select one point as the center of the auxiliary GC between two points on the intersection of two circles which are the initial GC and the reachable circle (bottom)

Table 1. Note: ← means “insert or replace the values”

The reachable region can be determined using rmax in (5), which represents the boundary that can be reached using the maximum velocities. Within the reachable region, we can define the reachable circle of radius rRC, which represents the reachable distance using the current velocity command (vR, ωR) at the robot pose. The equation of the reachable circle shown in Fig. 5 can be calculated as shown below.

In order to find the radius of the initial GC, the initial GC is expanded from its center toward the nearest scan point in the scan data set, M = {mi : ith scan data at the point Pmi=(xi, yi) in the frame {R} for i=1,…,n}. This expansion proceeds as long as the distance from the center to the nearest scan point is less than or equal to rmax, and the radius of the initial GC is then set to the shortest distance from the center to the nearest obstacle. If this distance exceeds rmax, the expansion is stopped and the radius is set to rmax.

If rIGC is equal to rmax, the user's operation command is valid as is. Therefore, Steps 2-4 are not necessary, and the mobile robot is controlled using the original operation command in (4). If rIGC is less than rmax, the mobile robot will be guided through the fusion of the operator's command and the guiding command produced by the expanded GC.
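A minimal sketch of Step 1, assuming the predicted position is obtained by integrating the constant command (vR, ωR) over a fixed look-ahead time dt (the actual prediction and radius update in Table 1 may differ), is shown below:

import math

def predict_position(v_R, w_R, dt):
    """Predicted robot position in the robot frame {R} after dt seconds,
    assuming the command (v_R, w_R) is held constant (unicycle model)."""
    if abs(w_R) < 1e-6:                        # (almost) straight-line motion
        return v_R * dt, 0.0
    R = v_R / w_R                              # radius of the circular arc
    return R * math.sin(w_R * dt), R * (1.0 - math.cos(w_R * dt))

def expand_circle(center, scan_points, r_max):
    """Expand a guide circle at 'center' until it touches the nearest scan
    point in M; the radius is capped at r_max, as described for the initial GC."""
    cx, cy = center
    if not scan_points:                        # no obstacle detected in range
        return r_max
    d_min = min(math.hypot(x - cx, y - cy) for (x, y) in scan_points)
    return min(d_min, r_max)

# Initial GC: center at the predicted position, radius from the expansion, e.g.
#   p_igc = predict_position(v_R, w_R, dt)
#   r_igc = expand_circle(p_igc, M, r_max)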

Step 2) Define the Auxiliary Guide Circle (Lines 7-12 in Table 1): If the GC needs to be expanded, the direction of the expansion should be selected. Because the expansion of the GC should proceed in the direction that avoids the obstacle, the direction is determined by the reference line shown in Fig. 5. In Fig. 5, the obstacle is detected on the right side of the reference line, so the left side is set as the direction of expansion. According to this direction, the suitable point is selected between the two points PCL and PCR at the intersection of the initial GC and the reachable circle as the center of the auxiliary GC, PAGC=(xAGC, yAGC).

Next, the auxiliary GC is expanded from the selected center toward the nearest scan point in the scan data set M, in the same fashion as the expansion of the initial GC, as shown in Fig. 6. The auxiliary GC has no special meaning by itself; it is only an intermediate step used to find the expanded GC. The nearest scan point from the center and the radius of the maximally expanded auxiliary GC can be defined as:

Fig. 6. Define the auxiliary Guide Circle (GC): Process to find the center and the radius of the auxiliary GC
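Continuing the sketch from Step 1 (it reuses expand_circle and the math module), the selection of the auxiliary GC center between PCL and PCR and its subsequent expansion could be implemented as follows; the choice of reference line (the line from the robot through the initial GC center) and the sign convention are assumptions:

def circle_intersections(p_igc, r_igc, r_rc):
    """Intersection points of the initial GC (center p_igc, radius r_igc) and
    the reachable circle (centered at the robot origin, radius r_rc)."""
    bx, by = p_igc
    d = math.hypot(bx, by)
    if d == 0.0 or d > r_rc + r_igc or d < abs(r_rc - r_igc):
        return None                                    # circles do not intersect
    a = (d * d + r_rc * r_rc - r_igc * r_igc) / (2.0 * d)
    h = math.sqrt(max(r_rc * r_rc - a * a, 0.0))
    mx, my = a * bx / d, a * by / d                    # midpoint on the center line
    p_cl = (mx - h * by / d, my + h * bx / d)          # point left of the reference line
    p_cr = (mx + h * by / d, my - h * bx / d)          # point right of the reference line
    return p_cl, p_cr

def auxiliary_gc(p_igc, r_igc, r_rc, nearest_obstacle, scan_points, r_max):
    """Pick the intersection point on the side opposite the nearest obstacle and
    expand the auxiliary GC from it (expand_circle from the Step 1 sketch)."""
    points = circle_intersections(p_igc, r_igc, r_rc)
    if points is None:
        return p_igc, r_igc                            # fall back to the initial GC
    p_cl, p_cr = points
    ox, oy = nearest_obstacle
    obstacle_on_left = (p_igc[0] * oy - p_igc[1] * ox) > 0.0   # cross-product side test
    p_agc = p_cr if obstacle_on_left else p_cl
    return p_agc, expand_circle(p_agc, scan_points, r_max)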

Step 3) Define the Expanded Guide Circle (Lines 13-17 in Table 1): Through Steps 1 and 2, the initial and the auxiliary GC are acquired. Then, the expanded GC is found in order to drive the robot safely in its environment. The center of the expanded GC is determined as the weighted sum of the centers of the initial and the auxiliary GC, as shown in Fig. 7. It is reasonable to employ each radius as the weight factor because the radius reflects the degree of safety. The expanded GC is then also expanded from this center toward the closest scan point in the same fashion, with the radius again limited to rmax.

Fig. 7. Define the expanded Guide Circle (GC): Process to find the center and the radius of the expanded GC
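Using each radius as the weight factor, the center of the expanded GC can be written as below; the exact normalization used in Eq. (12) is not reproduced here, so the normalized weighted mean is an assumption, and the radius is again limited by the nearest scan point and rmax as in Eq. (13):

$$
P_{EGC}=\frac{r_{IGC}\,P_{IGC}+r_{AGC}\,P_{AGC}}{r_{IGC}+r_{AGC}},\qquad
r_{EGC}=\min\Big(\min_{i}\,\big\lVert P_{EGC}-P_{m_i}\big\rVert,\; r_{max}\Big)
$$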

Step 4) Produce the Modified Operation Command (Lines 18-21 in Table 1): Using the center and the radius of the expanded GC given in (12) and (13), the modified operation command is produced in order to avoid the obstacles near the mobile robot in advance. In detail, the radius of the expanded GC indicates the safety of the area that the robot will reach. Using this radius, the safety index wSI is determined from (16) according to the SIC in (5), and the velocity of the mobile robot is regulated using the safety index.

Using (14)-(16), the mobile robot can be properly controlled to avoid obstacles while preserving the user's operational command as much as possible.
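Eqs. (14)-(16) are not reproduced here. As an illustration only, a Step 4 sketch that scales the translational velocity by the safety index and steers toward the expanded GC center (the steering term and the gain k are assumptions, and safety_index is the SIC sketch from Section 3.1) could be:

def modified_command(v_R, w_R, p_egc, r_egc, r_min=0.455, r_max=0.595, k=1.0):
    """Illustrative Step 4: blend the user's command with the guidance derived
    from the expanded GC, whose center p_egc is given in the robot frame {R}."""
    w_si = safety_index(r_egc, r_min, r_max)
    if w_si == 0.0:                            # unsafe region: stop the robot
        return 0.0, 0.0
    heading = math.atan2(p_egc[1], p_egc[0])   # direction toward the EGC center
    v_mod = w_si * v_R                         # regulate speed by the safety index
    w_mod = w_R + k * (1.0 - w_si) * heading   # steer toward the safer area
    return v_mod, w_mod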

 

4. Experimental Results

4.1 Experimental setup

The experimental setup is the same as shown in Fig. 1. The remote operator side consists of a laptop computer with a Logitech Extreme 3D Pro joystick. Fig. 8 shows the GUI S/W implemented on the remote operator side. We used a Pioneer 3DX from Adept MobileRobots as the mobile robot platform. The mobile robot is equipped with a laptop computer and the range sensor. For acquiring the range data and building the local map, we used an Asus Xtion Pro Live, one of the inexpensive RGB-D cameras. The laptop computer has an Ivy Bridge i5 2.6 GHz processor and 4 GB of DDR3 memory.

Fig. 8. GUI S/W for the guidance navigation with the expanded guide circle

Some parameters are determined from the datasheet of the mobile robot. The width of the mobile robot, W, is 0.381 m and the length, L, is 0.455 m. The maximum linear and angular velocities of the robot are set to 0.14 m/s and 0.5 rad/s, respectively. Then, we can determine drobot = 0.455 m and dsafe = 0.14 m as mentioned in Section 3.1, so rmin and rmax are defined as 0.455 m and 0.595 m, respectively. Using these parameters, the safety index can be determined using (16).
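These bounds are consistent with the simple relation below; the relation itself is an inference from the values above, with the formal definitions given in Section 3.1:

$$
r_{min}=d_{robot}=0.455\ \mathrm{m},\qquad
r_{max}=d_{robot}+d_{safe}=0.455\ \mathrm{m}+0.14\ \mathrm{m}=0.595\ \mathrm{m}
$$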

4.2 Performance of the obstacle avoidance

For evaluating the validity of the proposed algorithm, we performed experiments using the remotely operated mobile robot system described in the previous section. In this experiment, the human operator produced only a forward velocity command with the joystick in order to measure the performance of the obstacle avoidance. Fig. 9 shows the experimental results for avoiding obstacles in a hallway. The blue boxes in Fig. 9 represent the trajectory of the mobile robot controlled by the modified command produced by the proposed expanded GC algorithm, while the original command was only a constant forward velocity. The ground truth of the robot pose was measured using an indoor localization system, the StarGazer manufactured by Hagisonic.

The experimental results show the feasibility of the proposed algorithm. In particular, we could verify that the proposed algorithm guides the operator to avoid collisions at the four different regions indicated as areas A, B, C, and D in Fig. 9. The expanded guide circle produced at each area is shown in Fig. 10, where the blue circle is the initial GC produced by the direct operational command from the operator and the red circle is the expanded GC produced by the modified operational command for avoiding the obstacle. The blue points represent the local grid map built from the scan data obtained by the RGB-D camera. When the RGB-D camera acquires the 3D points of obstacles, each point is projected onto the 2D plane; the projected 2D points are used as the scan data and for building the 2D local grid map.

Fig. 9. Trajectory of the remotely operated mobile robot controlled using the modified command produced by the proposed expanded GC algorithm

Fig. 10. Experimental results for avoiding the obstacles using the expanded GC: the initial GC (blue) and the expanded GC (red) at the area A (top, left), at the area B (top, right), at the area C (bottom, left) and at the area D (bottom, right)

At area A in Fig. 10, the GC expanded toward the left side of the frontal obstacle. This produced the modified operational command and thus generated the avoidance motion. At area B in Fig. 10, the GC expanded toward the right side of the frontal obstacle. At areas C and D in Fig. 10, the GC expanded toward the left and the right side, respectively. Fig. 11 shows the reference operation commands produced by the operator and the modified operation commands produced by the proposed algorithm. At the specific areas (denoted A, B, C, and D above), the proper translational and rotational commands for avoiding the obstacles were generated efficiently by the proposed algorithm. According to the experimental results, the mobile robot is guided and navigated to the safe region efficiently by the proposed expanded GC algorithm.

Fig. 11. The reference operation commands as an input of the operator using the joystick (blue line) and the modified operation commands produced by the expanded GC (red line)

Fig. 12 shows another experimental result for an arbitrary operation command reflecting the operator's intention. In this experiment, the human operator manipulated the mobile robot from the remote location as in a real operation. The operator intentionally produced operation commands to avoid the obstacles and move toward free space based on the visual information. However, the visual information is not enough to recognize the whole environment around the mobile robot. As shown in Fig. 12, the modified operation commands produced by the proposed expanded GC algorithm were first generated at area A, where the original operational command appears to drive the mobile robot toward the obstacle. As the robot approaches the obstacle, the translational velocity is regulated to avoid the collision, and the rotational velocity is modified by the proposed algorithm to steer around the obstacle. The red lines at the bottom of Fig. 12 show the modified operation commands along the path.

Fig. 12. Experimental results for the arbitrary operation command with the operator's intention: the trajectory of the robot controlled using the modified command (top), and the reference operation commands (blue line) and the modified operation commands (red line) (bottom)

 

5. Conclusions

In this paper, we proposed an efficient guidance navigation method that improves the efficiency of remote operation using the expanded guide circle. The proposed algorithm, a new approach to obstacle avoidance, is simple and well-defined. Using the proposed algorithm, the operator can easily and safely operate the mobile robot from a remote site, and we evaluated its validity through experiments. In the future, we will employ this algorithm for local path planning in autonomous navigation.

References

  1. R. R. Murphy, and J. L. Burke, "From remote tool to shared roles," IEEE Robotics & Automation Mag., vol.15, no.4, pp.39-49, 2008
  2. D. J. Bruemmer, D. A. Few, R. L. Boring, J. L. Marble, M. C. Walton, and C. W. Nielsen, "Shared understanding for collaborative control," IEEE Trans. Systems, Man and Cybernetics, Part A: Systems and Humans, vol.35, no.4, pp.494-504, 2005 https://doi.org/10.1109/TSMCA.2005.850599
  3. Y. Horiguchi, and T. Sawaragi, "Effects of probing behaviors to adapt machine autonomy in shared control systems," in Proc. 2005 IEEE Int'l Conf. Systems, Man and Cybernetics; pp.317-323, 2005
  4. B. Hamner, S. Singh, S. Roth, and T. Takahashi, "An efficient system for combined route traversal and collision avoidance," Auton. Robot., vol. 24, pp. 365-385, 2008 https://doi.org/10.1007/s10514-007-9082-3
  5. S. Ronnback, T. Berglund, and H. Freriksson, "Circle sector expansions for on-line exploration," in Proc. Int'l Conf. Robotics and Biomimetics, pp. 1227-1232, 2006
  6. S. Ronnback, S. Westerberg, and K. Prorok, "CSE+: Path planning amid circles," in Proc. 4th Int'l Conf. Auton. Robots and Agents, pp. 447-452, 2009
  7. J. H. Lee, H. J. Lee, and S. Jung, "Teleoperation control of omni-directional mobile robot with force feedback," in Proc. Information and Control Symposium, pp. 243-245, 2007 (in Korean)
  8. H. Mano, K. Kon, N. Sato, M. Ito, H. Mizumoto, K. Goto, R. Chatterjee, and F. Matsuno, "Treaded control system for rescue robots in indoor environment." in Proc. IEEE Int'l Conf. Robotics and Biomimetics, pp. 1836-1843, 2008
  9. J. Borenstein, and Y. Koren, "The vector field histogram - fast obstacle avoidance for mobile robots," IEEE J. Robotics and Automation, vol. 7, no. 3, pp. 278-288, 1991 https://doi.org/10.1109/70.88137
  10. I. Ulrich, and J. Borenstein, "VFH+: Reliable obstacle avoidance for fast mobile robots," in Proc. IEEE Int'l Conf. Robotics and Automation, pp. 1572-1577, 1998
  11. S. S. Ge, and Y. J. Cui, "New potential functions for mobile robot path planning," IEEE Trans. Robotics and Automation, vol. 16, no. 5, pp. 615-620, 2000 https://doi.org/10.1109/70.880813
  12. J. Minguez, "The obstacle-restriction method (ORM) for robot obstacle avoidance in difficult environments," in Proc. IEEE/RSJ Int'l Conf. Intelligent Robots and Systems, pp. 2284-2290, 2005
  13. F. Wen, Z. Qu, C. Wang, and B. Hu, "Study on realtime obstacle avoidance of mobile robot based on vision sensor," in Proc. IEEE Int'l Conf. Automation and Logistics, pp. 2438-2442, 2008
  14. X. T. Le, E. Z. Hong, H. S. Kim, Y. R. Cheon, S. H. Lee, S. H. Han, and Y. G. An, "Real-time obstacle avoidance of mobile robots," in Proc. Int'l Conf. Control, Automation and Systems, pp. 2294-2298, 2007
  15. Y. S. Chen, and J. G. Juang, "Intelligent obstacle avoidance control strategy for wheeled mobile robot," in Proc. ICROS/SICE Int'l Joint Conf., pp. 3199-3204, 2009
  16. L. C. McNinch, R. A. Soltan, K. R. Muske, H. Ashrafiuon, and J. C. P. Jones, "Application of a coordinated trajectory planning and real-time obstacle avoidance algorithm," in Proc. American Control Conference (ACC), pp. 3824-3829, 2010
  17. L. Tang, S. Dian, G. Gu, K. Zhou, S. Wang, and X. Feng, "A novel potential field method for obstacle avoidance and path planning of mobile robot," in Proc. IEEE Int'l Conf. Computer Science and Information Technology (ICCSIT), pp. 633-637, 2010
  18. J. Ren, K. A. McIsaac, and R. V. Patel, "A fast algorithm for moving obstacle avoidance for vision-based mobile robots," in Proc. IEEE Conf. Control Applications, pp. 209-214, 2005
