This application claims the benefit of Korean Patent Application No. 10-2004-0011013, filed on Feb. 19, 2004, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
1. Field of the Invention
The present invention relates to a method and/or apparatus for improving the navigation performance of a robot, and more particularly, to a sensor used to detect obstacles.
2. Description of the Related Art
Various kinds of methods are used for robot navigation.
Map-based navigation methods, such as the vector field histogram (VFH) ("Real-time obstacle avoidance for fast mobile robots in cluttered environments", IEEE International Conference on Robotics and Automation, vol. 1, pp. 572-577, 1990) and U.S. Pat. No. 5,006,988 entitled "Obstacle avoiding navigation system", represent the presence or absence of obstacles using a histogram indexed by angle. Since the histogram contains no information on the actual distances to the obstacles, these methods can avoid an obstacle but have difficulty navigating the robot close to the obstacles.
Sensor-based navigation methods, such as Bug, Bug2, DistBug and VisBug, have a disadvantage in that they cannot detect obstacles located in directions in which no sensors are installed. In other words, a large number of sensors are required in order to detect the positions of the obstacles correctly.
In order to solve these problems, rotary sensor based navigation methods, such as U.S. Pat. No. 5,309,212 entitled "Scanning rangefinder with range to frequency conversion" and Japanese Patent No. 2002-215238 entitled "Obstacle detecting sensor of automatic guided vehicle", rotate the sensors, or additional devices (mirrors, prisms, etc.) attached to the sensors, in order to decrease the number of sensors while obtaining the same effect as installing a large number of sensors. However, the rotation of the sensors introduces a delay, and complex apparatuses, such as mirror/prism rotating devices, must be designed.
Additional aspects and/or advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
According to an aspect of the present invention, there are provided a navigation method and apparatus that can overcome limitations in the number and arrangement of previously installed physical sensors by controlling a movement of a robot based on virtual sensors that are virtually present.
According to an aspect of the present invention, a navigation method using a virtual sensor includes: generating information on positions of obstacles, which is estimated to be generated by virtual sensors that are virtually present, based on information on the positions of the obstacles generated by physical sensors; and controlling a movement of a robot according to the generated information on the positions of the obstacles.
According to another aspect of the present invention, a navigation apparatus using a virtual sensor includes: a virtual sensor information generating part which generates information on positions of obstacles, which is estimated to be generated by virtual sensors that are virtually present, based on information on the positions of the obstacles generated by physical sensors; and a control part which controls a movement of a robot according to the generated information on the positions of the obstacles.
According to still another aspect of the present invention, there is provided a computer-readable recording medium encoded with processing instructions for implementing a navigation method using a virtual sensor, the method including: generating information on positions of obstacles, which is estimated to be generated by virtual sensors that are virtually present, based on information on the positions of the obstacles generated by physical sensors; and controlling a movement of a robot according to the generated information on the positions of the obstacles.
These and/or other aspects and advantages of the invention will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
The present invention will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. The invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the invention to those skilled in the art. In the drawings, the thicknesses of layers and regions are exaggerated for clarity. Like reference numerals in the drawings denote like elements, and thus their description will be omitted.
Referring to the drawings, the structure of a robot according to an embodiment of the present invention will now be described.
The physical sensors (PS0, PS1 and PS2) 11, 12 and 13 detect obstacles existing within detectable ranges and generate information on the positions of the detected obstacles. Ultrasonic sensors, infrared sensors, laser sensors and so on are used as the physical sensors.
An operation of the ultrasonic sensor, which is widely used, will now be described. The ultrasonic sensor transmits ultrasonic waves in a forward direction. When an obstacle is present in the path of an ultrasonic wave, the ultrasonic wave is reflected by the obstacle and the ultrasonic sensor receives the reflected ultrasonic wave. At this time, the distance between the robot and the obstacle can be measured using the speed of the ultrasonic wave and the difference between the transmission time and reception time. Also, since the direction of the ultrasonic sensor is that of the obstacle, the position of the obstacle can be determined. An operation principle of the other sensors is similar to that of the ultrasonic sensor.
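As an illustration of this time-of-flight calculation, a minimal C sketch is given below; the speed-of-sound constant, the function name and the example times are assumptions made for illustration only and are not part of the original disclosure.

    #include <stdio.h>

    #define SPEED_OF_SOUND 343.0 /* m/s, approximate speed in air at room temperature */

    /* The wave travels to the obstacle and back, so only half of the
     * round-trip time corresponds to the robot-obstacle distance. */
    double echo_distance(double transmit_time, double receive_time)
    {
        return SPEED_OF_SOUND * (receive_time - transmit_time) / 2.0;
    }

    int main(void)
    {
        /* A round trip of 10 ms corresponds to roughly 1.7 m. */
        printf("distance = %.2f m\n", echo_distance(0.0, 0.010));
        return 0;
    }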
The virtual sensor arrangement type selecting part 1 selects one of the sensor arrangement types stored in a sensor arrangement type database.
Referring to the drawings, three virtual sensor arrangement types, type I, type II and type III, are illustrated.
For type I, a total of seven sensors are arranged at 0°, 30°, 60°, 90°, 120°, 150°, and 180°, respectively. Since the sensors are arranged regularly, it is easy to control the movement of the robot using the position information of the obstacles generated from the sensors. However, the type I arrangement cannot detect the obstacles as accurately as type II or type III, which will be described below.
As shown in the drawings, the sensors of type II and type III are arranged irregularly; such arrangements can detect the obstacles more accurately than type I, but it is correspondingly more difficult to control the movement of the robot using the position information generated from the sensors.
Accordingly, designers who design robots need to select a proper type in consideration of the performance required of the robot and their design capability. According to an embodiment of the present invention, type I is selected for ease of understanding and explanation; the cases of type II and type III can be designed according to the same principle. Of course, other types of arrangements may be designed according to a user's taste or purpose in order to enhance the performance of a robot.
Referring again to the drawings, the map value updating part 2 updates the probability values of the cells of a map, based on the information on the positions of the obstacles generated by the physical sensors (PS0, PS1 and PS2) 11, 12 and 13.
Referring to the drawings, a regular grid map is illustrated.
A principle of the map-based navigation method will now be described using the regular grid map, which is widely used. A detectable region of the ultrasonic sensor is illustrated on the regular grid map; due to the characteristics of the ultrasonic wave, the detectable region has a cone shape. If an obstacle is present around a region including a cell 31 within the cone-shaped region, the ultrasonic sensor receives a reflected signal, and the obstacle is determined to be present in the cell and all the adjacent cells. Accordingly, information on an accurate position of the obstacle cannot be generated by a single measurement. Considering the movement of the robot, the position of the obstacle is therefore measured at several positions and several times, and the measured values are reflected in the probability values of the cells, thus generating more accurate information on the position of the obstacle.
According to an embodiment of the present invention, the map is updated using an equation below:
grid(Gx,Gy)=grid(Gx,Gy)+I (Eq. 1)
where grid(Gx,Gy) represents a probability value of the cell present at coordinate (Gx,Gy), and I represents an increment or decrement of the probability value. Although a general probability value is represented by a real number in the range of 0 to 1, the probability value of Eq. 1 will be represented by an integer between 0 and 15 for convenience. If the computed value is smaller than 0, it is clamped to 0; if it is greater than 15, it is clamped to 15. For example, if the obstacle is present within the cell, I=3, and if no obstacle is present within the cell, I=−1. All the probability values are set to 0 when the map is initially established, and the probability values are updated while the robot is moving. Based on the probability values of the current map, the robot avoids the cells in which probability values higher than a threshold value are recorded.
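The update of Eq. 1 can be sketched in C as follows; the grid dimensions and the function name are assumptions, while the clamping range and the increments I = 3 and I = −1 are taken from the example above.

    #define GRID_W 64 /* assumed map width in cells */
    #define GRID_H 64 /* assumed map height in cells */
    #define P_MIN 0
    #define P_MAX 15

    static int grid[GRID_W][GRID_H]; /* every cell starts at 0 when the map is established */

    /* Eq. 1: grid(Gx,Gy) = grid(Gx,Gy) + I, with the result clamped to [0, 15]. */
    void update_cell(int gx, int gy, int increment)
    {
        int p = grid[gx][gy] + increment;
        if (p < P_MIN) p = P_MIN;
        if (p > P_MAX) p = P_MAX;
        grid[gx][gy] = p;
    }

    /* Usage: update_cell(gx, gy, 3) for a cell in which the obstacle is
     * estimated to be present; update_cell(gx, gy, -1) for an observed
     * empty cell. */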
If the ultrasonic sensor receives a signal reflected from within the illustrated cone-shaped region, the probability values of the cells of that region are updated according to Eq. 1; that is, the probability values of the cells in which the obstacle is estimated to be present are increased, and the probability values of the cells estimated to be empty are decreased.
Referring again to the drawings, the operation of the virtual sensor information generating part 3 will now be described.
According to an aspect of the present invention, the map may be used to store the information on the positions of the obstacles generated by the physical sensors (PS0, PS1 and PS2) 11, 12 and 13. In other words, the virtual sensor information generating part 3 generates the information on the distances between the obstacles and the robot, which is estimated to be generated by the virtual sensors (VS0 to VS6) 21 to 27, using the difference between the positions of the cells whose updated probability values are greater than a threshold value and the current position of the robot. Here, the positions of the cells having probability values greater than the threshold value correspond to the positions of the obstacles. In the above example, the threshold value may be set to 4, so that the cells having probability values greater than 4 are regarded as containing obstacles.
Referring to the drawings, the virtual sensor information generating part 3 includes a virtual sensor ID generating unit 31 and a virtual sensor distance generating unit 32.
The virtual sensor ID generating unit 31 generates the IDs of the virtual sensors based on the direction angles of the virtual sensors, which are virtually present at the positions determined according to the sensor arrangement type selected by the virtual sensor arrangement type selecting part 1, and on the current direction angle of the robot, which is generated by the robot position information generating part 6. Since the virtual sensors are only virtually present, they need to be distinguished from one another; the virtual sensor ID generating unit 31 assigns the IDs to the virtual sensors for this purpose. According to an embodiment of the present invention, the virtual sensor ID generating unit 31 generates the IDs of the virtual sensors from the difference between the direction angles of the positions determined according to type I (that is, 0°, 30°, 60°, 90°, 120°, 150° and 180°) and the current direction angle of the robot, using the equation below:
SensorID = round((α−θ)·(Nvs−1)/π) (Eq. 2)

where α denotes the direction angle of a virtual sensor, θ denotes the current direction angle of the robot, and Nvs denotes the number of virtual sensors.
The absolute value of (α−θ) will be equal to or less than π, and SensorID will be a value ranging from 0 to (Nvs−1). In this embodiment, Nvs is 7. Therefore, when Nvs=7 is substituted into Eq. 2, the IDs of the virtual sensors (VS0 to VS6) 21 to 27 become 0, 1, 2, 3, 4, 5 and 6, respectively, regardless of the movement direction of the robot.
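Under the reconstruction of Eq. 2 given above, the ID computation can be sketched in C as follows; the function name and the clamping at the boundaries are assumptions.

    #include <math.h>

    #define NVS 7 /* number of virtual sensors in the type I arrangement */

    /* Eq. 2: map the angle difference (alpha - theta), whose absolute value
     * is at most pi, to a sensor index between 0 and NVS-1. For type I the
     * sensors are spaced pi/(NVS-1), i.e. 30 degrees, apart. Angles are in
     * radians. */
    int virtual_sensor_id(double alpha, double theta)
    {
        int id = (int)lround((alpha - theta) * (NVS - 1) / M_PI);
        if (id < 0) id = 0;             /* clamp at the boundaries of the half plane */
        if (id > NVS - 1) id = NVS - 1;
        return id;
    }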
The virtual sensor distance generating unit 32 generates the information on the distances between the obstacles and the robot, which is estimated to be generated by the virtual sensors having the IDs generated by the virtual sensor ID generating unit 31, using the difference between the positions of the obstacles present at the direction angles of the virtual sensors and the current position of the robot, which is generated by the robot position information generating part 6. Here, the information on the positions of the obstacles is generated by the physical sensors (PS0, PS1 and PS2) 11, 12 and 13. In this embodiment, using the equation below, the virtual sensor distance generating unit 32 generates this distance information from the difference between the positions of the cells whose updated probability values are greater than 4 and the current position of the robot:
range[SensorID] = √((Gx−Rx)² + (Gy−Ry)²) (Eq. 3)
where (Gx,Gy) is the position of a cell having a probability value greater than the threshold value of 4, and (Rx,Ry) is the current position of the robot. The distances between the robot and all the cells having probability values greater than the threshold value of 4 are calculated, and the calculated values constitute the information on the distances between the obstacles and the robot.
Referring to the drawings, pseudo code of the operation of the virtual sensor distance generating unit 32 is illustrated; according to its outer loops, every cell of the updated map is examined in turn.
Then, according to "if (grid[Gx][Gy] ≥ Threshold){ }", the operations within { } are performed only when the probability value of the cell is equal to or greater than the threshold value of 4. Thereafter, according to

range[SensorID] = √((Gx−Rx)² + (Gy−Ry)²) · meter/grid,

the ID of the corresponding virtual sensor and the distance between the cell and the robot are calculated, where meter/grid denotes the size of one cell in meters. Then, according to "if (range < range[SensorID]) then range[SensorID] = range", in case there are a plurality of cells having probability values greater than 4 with respect to the same virtual sensor, that is, the same direction angle, the distance between the robot and the cell disposed closest to the robot becomes the information on the distance between the obstacle and the robot, which is estimated to be generated by that virtual sensor. In case the obstacle extends over several cells, the position of the cell closest to the robot corresponds to an edge of the obstacle.
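Putting Eq. 2 and Eq. 3 together, the per-cell scan that the pseudo code describes can be sketched in C as follows; the grid dimensions, the cell size meter/grid, and all identifiers other than grid, range, SensorID and Threshold are assumptions.

    #include <math.h>
    #include <float.h>

    #define GRID_W 64
    #define GRID_H 64
    #define NVS 7
    #define THRESHOLD 4
    #define METER_PER_GRID 0.05 /* assumed cell size: 5 cm per cell */

    extern int grid[GRID_W][GRID_H]; /* the map updated according to Eq. 1 */

    /* Fill range[0..NVS-1] with the distance, in meters, from the robot at
     * grid position (rx, ry) with heading theta to the closest cell whose
     * probability value reaches the threshold in each virtual sensor
     * direction. */
    void scan_virtual_sensors(double rx, double ry, double theta, double range[NVS])
    {
        int gx, gy, id;

        for (id = 0; id < NVS; id++)
            range[id] = DBL_MAX; /* no obstacle detected yet */

        for (gx = 0; gx < GRID_W; gx++) {
            for (gy = 0; gy < GRID_H; gy++) {
                if (grid[gx][gy] < THRESHOLD)
                    continue; /* "if (grid[Gx][Gy] >= Threshold){ }" */

                /* Direction angle of the cell as seen from the robot,
                 * fed into Eq. 2 to choose the virtual sensor. */
                double alpha = atan2(gy - ry, gx - rx);
                id = (int)lround((alpha - theta) * (NVS - 1) / M_PI);
                if (id < 0 || id >= NVS)
                    continue; /* cell lies outside the covered half plane */

                /* Eq. 3, scaled by the cell size. */
                double r = sqrt((gx - rx) * (gx - rx) + (gy - ry) * (gy - ry))
                           * METER_PER_GRID;

                /* Keep only the closest cell per direction; its position
                 * corresponds to the visible edge of the obstacle. */
                if (r < range[id])
                    range[id] = r;
            }
        }
    }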
Referring again to the drawings, the control part controls the movement of the robot according to the information on the positions of the obstacles, which is generated by the virtual sensor information generating part 3, and the robot moves on the two wheels 14 and 15 according to this control.
The robot position information generating part 6 generates the information on the current position of the robot using an encoder, which can measure the numbers of rotations of the two wheels 14 and 15. The information on the current position of the robot, which is generated by the robot position information generating part 6, is fed back to the map value updating part 2 and the virtual sensor information generating part 3.
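The disclosure above does not specify the encoder computation. A common differential-drive odometry update of this kind can be sketched in C as follows; the wheel radius, axle length, encoder resolution and all identifiers are assumptions.

    #include <math.h>

    #define WHEEL_RADIUS  0.05  /* m, assumed */
    #define AXLE_LENGTH   0.30  /* m, assumed distance between the two wheels */
    #define TICKS_PER_REV 512.0 /* assumed encoder resolution */

    typedef struct { double x, y, theta; } Pose;

    /* Update the pose estimate from the encoder ticks accumulated by the
     * left and right wheels since the previous update. */
    void odometry_update(Pose *p, long left_ticks, long right_ticks)
    {
        double dl = 2.0 * M_PI * WHEEL_RADIUS * (double)left_ticks / TICKS_PER_REV;
        double dr = 2.0 * M_PI * WHEEL_RADIUS * (double)right_ticks / TICKS_PER_REV;
        double d = (dl + dr) / 2.0;              /* distance moved by the robot center */
        double dtheta = (dr - dl) / AXLE_LENGTH; /* change of the direction angle */

        p->x += d * cos(p->theta + dtheta / 2.0);
        p->y += d * sin(p->theta + dtheta / 2.0);
        p->theta += dtheta;
    }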
The navigation method using the virtual sensors, which will be described below, is performed in the robot described above.
Referring to the drawings, the navigation method will now be described with reference to a flowchart. First, one of the sensor arrangement types is selected (operation 71). Next, the physical sensors generate the information on the positions of the obstacles (operation 72), and the probability values of the cells of the map are updated based on the generated information (operation 73).
Then, information on the positions of the obstacles, which is estimated to be generated by the virtual sensors, is generated based on the information on the positions of the obstacles generated by the physical sensors (operation 74). Here, the virtual sensors are virtually present at the positions determined according to the selected sensor arrangement type. In other words, the information on the distances between the obstacles and the robot is generated using the difference between the positions of the obstacles present at the direction angles of the virtual sensors and the current position of the robot. In more detail, operation 74 includes the following procedures. First, the IDs of the virtual sensors are generated from the difference between the direction angles of the virtual sensors and the current direction angle of the robot. Then, the information on the distances between the obstacles and the robot, which is estimated to be generated by the virtual sensors having the generated IDs, is generated using the difference between the positions of the cells whose updated probability values are greater than a threshold value and the current position of the robot.
Then, the movement of the robot is controlled according to the information on the positions of the obstacles, that is, according to the information provided by the virtual sensors (operation 75). In more detail, the movement of the robot is controlled using a sensor-based control method such as a fuzzy rule-based control method, a neural network-based control method, or a proportional integral derivative (PID) control method. Thereafter, the robot moves according to the control (operation 76), and the information on the current position of the robot according to the movement is generated (operation 77).
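None of these controllers is specified here in detail. Purely as an illustration of how the virtual sensor distances could drive a sensor-based controller, a simple proportional steering rule is sketched below in C; the gain, the assumption that the type I sensors span 0° to 180° with 90° straight ahead, and all identifiers are illustrative choices, not the control method of the disclosure.

    #include <math.h>

    #define NVS 7

    /* Steer toward the virtual sensor direction with the largest free
     * distance, turning at a rate proportional to the angular error. */
    double steering_command(const double range[NVS])
    {
        const double KP = 1.5; /* proportional gain, assumed */
        int i, best = 0;

        for (i = 1; i < NVS; i++)
            if (range[i] > range[best])
                best = i;

        /* Type I places the sensors at 0, 30, ..., 180 degrees; taking
         * 90 degrees as straight ahead, convert the chosen index to an
         * angular error relative to the current heading. */
        double error = best * M_PI / (NVS - 1) - M_PI / 2.0;
        return KP * error; /* commanded turn rate */
    }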
According to an aspect of the present invention, it is possible to overcome the limitations in the number and arrangement of the previously installed physical sensors by controlling the movement of the robot based on the virtual sensors that are virtually present. Thus, the present invention can improve the navigation performance of a physically completed robot. Also, even if the number and arrangement of the physical sensors are changed, the number and arrangement of the virtual sensors can be kept constant; therefore, the control algorithm used prior to the change can be reused without any modification.
Further, since the detectable region of a virtual sensor can be set freely, robot system designers or users can navigate the robot to move as close to an obstacle as desired. In particular, the present invention can be applied to a cleaning robot or the like, which has to move close to obstacles.
Although a few embodiments of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.
Foreign Application Priority Data
10-2004-0011013 (KR), Feb. 2004, national
U.S. Patent Documents
5,124,918 A, Beer et al., Jun. 1992
5,309,212 A, Clark, May 1994
5,341,459 A, Backes, Aug. 1994
5,400,244 A, Watanabe et al., Mar. 1995
5,525,882 A, Asaka et al., Jun. 1996
5,545,960 A, Ishikawa, Aug. 1996
5,758,298 A, Guldner, May 1998
2005/0049788 A1, Haider, Mar. 2005
2006/0064202 A1, Gutmann et al., Mar. 2006
Foreign Patent Documents
JP 2002-215238, Jul. 2002
KR 1995-0012987, Oct. 1995
Other Publications
Dima et al., "Sensor and Classifier Fusion for Outdoor Obstacle Detection: An Application of Data Fusion to Autonomous Off-Road Navigation," Proceedings of the 32nd Applied Imagery Pattern Recognition Workshop (AIPR'03), IEEE Computer Society, 2003.
Simmons et al., "Probabilistic Robot Navigation in Partially Observable Environments," Proceedings of the International Joint Conference on Artificial Intelligence, Jul. 1995, pp. 1080-1087.
Pauly et al., "Real-Time Object Detection for Autonomous Robots," Autonome Mobile Systeme, 1998.
Elfes, "Using Occupancy Grids for Mobile Robot Perception and Navigation," Carnegie Mellon University, Jun. 1989.
Kirman et al., "Sensor Abstractions for Control of Navigation," International Conference on Robotics and Automation, Department of Computer Science, Brown University, Apr. 1991.
J. Borenstein et al., "Real-time Obstacle Avoidance for Fast Mobile Robots in Cluttered Environments," Reprint of the Proceedings of the 1990 IEEE International Conference on Robotics and Automation, May 1990, pp. 572-577.
Publication
US 2005/0187678 A1, Aug. 2005