These and/or other aspects, features, and advantages will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. Exemplary embodiments are described below referring to the figures.
Exemplary embodiments provide a method of localizing a moving robot (relative to a fixed location) or a user (relative to the moving robot) using a receiving sensor (e.g., a supersonic sensor) and a unit that measures the rotational angle of the moving robot, such as an encoder or a gyroscope generally mounted in the moving robot. Localizing the moving robot and localizing the user differ only in which of the two is taken as the reference; the technical objects are the same. Accordingly, a relative location between the moving robot and a transmitter herein means the location of the robot and the transmitter relative to each other.
The transmitter may be a movable device, such as a remote control, which transmits synchronous signals and supersonic waves to a moving robot 100, or a fixed device such as a beacon. A remote control is described herein as an example of the transmitter.
The technical configuration of an exemplary embodiment may be, as shown in
The moving robot 100 is provided with an IR/RF receiver 111 and a supersonic wave receiving sensor 112. The moving robot 100 is also provided with a motion controller that controls the motion of the robot and an encoder or a gyroscope that measures the rotational angle of the robot. The motion controller controls driving wheels to move or rotate the moving robot 100. A mechanical unit well known in the art may be used as the mechanism for the movement or rotation in exemplary embodiments. The sensor 112 is disposed near the edge of the moving robot 100 rather than at its center. The sensor 112 is disposed at the front of the robot 100 in the following exemplary embodiments, but the sensor 112 may be placed at another location.
The driving unit 160 provides driving force to the moving robot 100 under the control of the motion controller 120 so that the moving robot 100 can move. The driving unit 160 typically includes several driving wheels and a steering system, but any other common driving mechanism that allows the robot 100 to move may be used. Further, the driving unit 160 can rotate the moving robot 100 by a specific angle about the center of the moving robot 100 by steering the driving wheels under the control of the motion controller 120. However, a mechanism known in the art other than the driving unit 160 may be used to rotate the moving robot 100 as described above.
Because three measurement points on the moving robot 100 are needed to localize the moving robot 100 or the remote control 200, as shown in
The rotational angle measuring unit 130 measures the rotational angle of the moving robot 100 rotated by the driving unit 160. The gyroscope 132, the encoder 131, or a combination thereof may be used to measure the rotational angle.
The encoder 131 senses the rotational velocities of the driving wheels in the driving unit 160. The linear velocities of the wheels are calculated by multiplying the rotational velocities by the radius of the wheels.
Encoders 131a and 131b are provided to the two wheels 101 and 102 of the moving robot 100, respectively. In
The rotational angular velocity of the moving robot 100 may be expressed by the following Numerical Formula 1, according to the mechanical relationships.
The rotational angle θ is determined by integrating the angular velocity ω with respect to time.
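Numerical Formula 1 itself is not reproduced in this excerpt. A plausible reconstruction, based on the encoder description above and in the paragraph on operation S50 below (the symbols v_L, v_R, and D are assumptions introduced here for illustration), is

$$\omega = \frac{v_R - v_L}{D}, \qquad \theta(t) = \int_0^{t} \omega \, d\tau,$$

where v_L and v_R are the linear velocities of the left and right driving wheels (each wheel's rotational velocity multiplied by its radius) and D is the distance between the driving wheels.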
The gyroscope 132 measures the rotational angular velocity of the moving robot 100 using the rotational inertia of a mass capable of rotating about one or more axes, and calculates a rotational angle by integrating the rotational angular velocity. The gyroscope 132 may be uniaxial, biaxial, or multiaxial, as long as it can measure the rotational angular velocity of the moving robot 100 on a plane.
The distance measuring unit 110 measures the distance between the remote control 200 and the moving robot 100. An exemplary method of measuring the distance is described herein using an electromagnetic wave, such as an IR or RF signal, as a synchronous signal and a supersonic wave as the signal for measuring the distance.
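Because the IR or RF synchronous signal propagates at the speed of light, its travel time is negligible compared with that of the supersonic wave, so the distance is essentially proportional to the delay between receiving the two signals. The following is a minimal sketch of this relationship only; the 343 m/s figure assumes sound in air at roughly 20 °C, and the names are illustrative rather than taken from the specification.

```python
# Minimal time-of-flight sketch: the synchronous signal marks t = 0, and the
# distance follows from the delay until the supersonic wave is detected.
SPEED_OF_SOUND_M_PER_S = 343.0  # assumption: air at about 20 degrees C

def distance_from_delay(delay_s: float) -> float:
    """Distance (m) between transmitter and sensor for a given delay (s)."""
    return SPEED_OF_SOUND_M_PER_S * delay_s

# Example: a delay of about 5.8 ms corresponds to roughly 2 m.
print(distance_from_delay(0.0058))  # ~1.99
```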
When a user presses a specific button of the remote control 200, the measuring of the distance starts.
Returning to
As shown in
As seen from
The location calculator 150 calculates the location of the remote control 200 relative to the moving robot 100 using the rotational angles and the distances at the measurement points stored in the data storage 140. The calculation process in the location calculator 150 is described below with reference to
Therefore, the coordinates of the center of the moving robot 100 may be expressed by (x0, 0, 0) and the initial directional angle of the sensor 112 is represented by θ0. As the moving robot 100 rotates about the center 105, the sensor 112 moves along a circle 90 having a predetermined radius r. As the moving robot 100 rotates, when the sensor 112 reaches each of the measurement points pa, pb, pc, the distance measuring unit 110 receives a synchronous signal and a supersonic wave and measures the distance between the moving robot 100 and the remote control 200.
xa = x0 + r·cos(θ0 + θa)
ya = r·sin(θ0 + θa)
xb = x0 + r·cos(θ0 + θb)
yb = r·sin(θ0 + θb)
xc = x0 + r·cos(θ0 + θc)
yc = r·sin(θ0 + θc)   [Numerical Formula 2]
The measurement points pa, pb, and pc and the distances La, Lb, and Lc may be expressed as the following Numerical Formula 3,
La² = xa² + ya² + h²
Lb² = xb² + yb² + h²
Lc² = xc² + yc² + h²   [Numerical Formula 3]
where h is the height of the remote control 200, i.e., its z-axis coordinate.
Substituting the coordinates xa, ya, xb, yb, xc, and yc of Numerical Formula 2 into Numerical Formula 3 yields three simultaneous equations in the three variables x0, θ0, and h, where x0 is constrained to be negative and h positive. The terms θa, θb, and θc and La, Lb, and Lc are constants in the simultaneous equations, because they are measured by the rotational angle measuring unit 130 and the distance measuring unit 110, respectively. As a result, the location calculator 150 can determine a unique value for each of the variables x0, θ0, and h from the equations.
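As one concrete illustration, the three simultaneous equations can be solved numerically. The following is a minimal sketch only, not the claimed implementation; the function name locate_transmitter, the use of scipy.optimize.fsolve, and the initial guess are assumptions introduced here.

```python
import numpy as np
from scipy.optimize import fsolve

def locate_transmitter(r, angles, distances):
    """Solve Numerical Formulas 2 and 3 for (x0, theta0, h).

    r:         radius of the circle traced by the sensor (m)
    angles:    measured rotational angles (theta_a, theta_b, theta_c) in radians
    distances: measured distances (L_a, L_b, L_c) in meters
    """
    def residuals(v):
        x0, theta0, h = v
        res = []
        for theta_i, L_i in zip(angles, distances):
            x_i = x0 + r * np.cos(theta0 + theta_i)      # Numerical Formula 2
            y_i = r * np.sin(theta0 + theta_i)
            res.append(x_i**2 + y_i**2 + h**2 - L_i**2)  # Numerical Formula 3
        return res

    # Initial guess reflecting the constraints x0 < 0 and h > 0.
    x0, theta0, h = fsolve(residuals, [-1.0, 0.0, 0.5])
    return x0, theta0, abs(h)
```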
The location of the remote control 200 relative to the moving robot 100 is obtained from the determined values x0 and θ0 (h is unnecessary because the moving robot travels on the ground), so that the motion controller 120 can control the driving unit 160 such that the moving robot 100 returns to the remote control 200 and the user. For example, the motion controller 120 may control the driving unit 160 such that the moving robot 100 turns toward the remote control 200 and travels the distance |x0|.
According to the above exemplary embodiment, the location of the remote control 200 is calculated from three measurement points. However, more measurement points may be used for a more exact calculation. For example, as the moving robot 100 rotates, if the distance and the rotational angle are measured at n measurement points, Numerical Formulas 2 and 3 may be evaluated for every set of three of the n measurement points, and then representative values (e.g., averages or intermediate values) of the resulting plurality of values of x0 and θ0 may be obtained.
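A minimal sketch of this extension, reusing the hypothetical locate_transmitter function from the sketch above and using the median as one possible choice of representative value:

```python
from itertools import combinations
import numpy as np

def locate_from_n_points(r, angles, distances):
    """Solve the three-point system for every triple of the n measurement
    points and return representative values of x0 and theta0."""
    estimates = []
    for idx in combinations(range(len(angles)), 3):
        triple_angles = [angles[i] for i in idx]
        triple_dists = [distances[i] for i in idx]
        x0, theta0, _h = locate_transmitter(r, triple_angles, triple_dists)
        estimates.append((x0, theta0))
    x0s, theta0s = zip(*estimates)
    return float(np.median(x0s)), float(np.median(theta0s))
```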
As the moving robot 100 is rotated by the motion controller 120 (S20), each time the sensor 112 reaches one of a plurality of measurement points (at least three) (Yes in S30), the distance measuring unit 110 measures the distance between the sensor 112 and a predetermined signal generator at that measurement point, using the sensor 112, which senses a predetermined wave (e.g., a supersonic wave) generated by the signal generator (S40).
In order to measure the distance, the distance measuring unit 110 may further include the receiver 111, which receives a synchronous signal (transmitted in the form of IR or RF) from the signal generator, and the distance calculator 113, which calculates the distance from the delay time between the synchronous signal and the wave.
Meanwhile, the rotational angle measuring unit 130 measures the rotational angle at the measurement point that the sensor 112 reaches (S50). For this measurement, the rotational angle measuring unit 130 may include at least one of the encoder 131 and the gyroscope 132. The gyroscope 132 measures an angular velocity using a rotating inertial mass, and the encoder 131 measures an angular velocity by dividing the difference between the linear velocities of the two driving wheels by the distance between the driving wheels.
Further, the location calculator 150 calculates the relative location using the input values, i.e., the measured distances and rotational angles, and the radius of the circle traced by the sensor 112 as the moving robot 100 rotates (S60).
In order to calculate the relative location, the location calculator 150 sequentially performs the following operations: finding numerical formulas that express the coordinates of the plurality of measurement points in terms of the distance between the center of the moving robot and the signal generator, the initial directional angle of the moving robot, and the rotational angles at those points; finding numerical formulas that express the distances between the measurement points and the signal generator in terms of those coordinates; and calculating the distance between the center of the moving robot and the signal generator and the initial directional angle of the moving robot by solving the simultaneous equations obtained from those formulas.
Finally, the motion controller 120 may control the moving robot 100 such that the moving robot 100 approaches the signal generator based on the calculated relative location (S70).
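The overall flow of operations S20 through S70 might be orchestrated roughly as follows. This is a sketch rather than the claimed control logic; rotate_to_next_point, measure_distance, measure_angle, and approach are hypothetical stand-ins for the motion controller, the distance measuring unit, the rotational angle measuring unit, and the driving unit, and locate_transmitter is the sketch given earlier.

```python
def localize_and_approach(r, num_points, rotate_to_next_point,
                          measure_distance, measure_angle, approach):
    """Rotate through num_points measurement points, localize the signal
    generator, and approach it (operations S20-S70)."""
    angles, distances = [], []
    for _ in range(num_points):                  # S20/S30: rotate to each point
        rotate_to_next_point()
        distances.append(measure_distance())     # S40: sync signal + ultrasonic delay
        angles.append(measure_angle())           # S50: encoder and/or gyroscope
    x0, theta0, _h = locate_transmitter(r, angles[:3], distances[:3])  # S60
    approach(x0, theta0)                         # S70: turn toward the generator, travel |x0|
    return x0, theta0
```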
The following Table 1 shows the results of a test of an exemplary embodiment, where the height h of the remote control 200 was 60 cm, the horizontal distance |x0| between the remote control 200 and the moving robot 100 was 1 m, 2 m, and 3 m, respectively, the rotational velocity of the moving robot 100 was 5 cm/s, and the rotational radius r of the moving robot 100 was 10 cm.
As seen from Table 1, the range error and the angular error are not more than 10 cm and 1 degree, respectively, so that an exemplary embodiment may be sufficiently practical when used indoors, where the distance between the moving robot 100 and the remote control 200 is not excessive.
In the above exemplary embodiment, it was assumed that the sensor 112 is an omni-directional sensor whose reception does not depend on direction. However, if the sensor 112 is a directional sensor that can receive a supersonic wave only within a fixed angle of view, additional algorithms are needed for the motion control performed by the motion controller 120, because, for example, in
First, the receiver 111 receives an order code from the remote control 200 (Yes in S1), and then the motion controller 120 controls the driving unit 160 to rotate the moving robot 100 in a predetermined direction (clockwise or counterclockwise) (S2).
The distance measuring unit 110 checks, during the rotation, whether the sensor 112 can receive a supersonic wave transmitted from the remote control 200; if not (No in S3), the moving robot 100 keeps rotating in the predetermined direction, and the data storage 140 continually stores distances and rotational angles from the time the distance becomes measurable (S4).
If it is possible to measure the distance in S3 (Yes in S3), whether the distance decreases during the rotation is checked. When it is determined that the distance decreases (Yes in S8), the motion controller 120 controls the moving robot 100 such that it rotates reversely to the predetermined rotational direction (S9); thereafter, a point where it becomes impossible to measure the distance (S10) is set as a first end point of the angle of view, and the rotational angle at that point is stored in the data storage 140. The motion controller 120 then controls the moving robot 100 such that it rotates reversely to the predetermined direction (S11).
When it is determined that the distance does not decrease in S8 (No in S8), the motion controller 120 controls the moving robot 100 such that it keeps rotating in the predetermined direction. A point where it becomes impossible to measure the distance is set as a first end point of the angle of view, and the rotational angle at that point is stored in the data storage 140. The motion controller 120 then controls the moving robot 100 such that it rotates reversely to the predetermined direction (S11).
The data storage 140 continually stores distances and rotational angles during the rotation in S11. The distances and the rotational angles are provided from the distance measuring unit 110 and the rotational angle measuring unit 130, respectively.
If the minimum distance is reached during the rotation in S2 or S11 (Yes in S5), it implies that the front of the moving robot 100 faces the remote control 200, so the motion controller 120 controls the moving robot 100 such that it stops after rotating by half of the angle of view from the minimum-distance point, maintaining the previous rotational direction (S6). After the moving robot 100 stops, the location calculator 150 calculates the location of the remote control 200 relative to the moving robot 100 from the relationships between the stored distances and rotational angles, using at least three of the stored points as measurement points.
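One schematic way to organize this search is sketched below. It is a simplified sketch: it omits the branch in S8 through S10 that handles the case where the measured distance is already decreasing, and rotate_step, can_measure, distance, and angle are hypothetical stand-ins for the motion controller, the distance measuring unit, and the rotational angle measuring unit.

```python
def face_remote(angle_of_view, rotate_step, can_measure, distance, angle):
    """Sweep across the sensor's angle of view, storing (angle, distance)
    samples, and return the stop heading (half the angle of view past the
    minimum-distance point, in the current direction) and the samples."""
    samples = []                    # stored (rotational angle, distance) pairs
    direction = +1                  # predetermined direction (S2)
    while not can_measure():        # S3/S4: rotate until the supersonic wave is received
        rotate_step(direction)
    while can_measure():            # sweep until the wave is lost: one end of the view
        samples.append((angle(), distance()))
        rotate_step(direction)
    direction = -direction          # reverse and sweep back across the whole view (S11)
    while not can_measure():        # rotate back into the angle of view
        rotate_step(direction)
    while can_measure():            # distances and angles stored during the rotation
        samples.append((angle(), distance()))
        rotate_step(direction)
    # S5/S6: the minimum measured distance marks where the front faces the remote;
    # stop half an angle of view past that point, keeping the current direction.
    min_angle = min(samples, key=lambda s: s[1])[0]
    return min_angle + direction * angle_of_view / 2.0, samples
```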
In the above exemplary embodiment, a method of calculating the locations of a remote control and a user relative to a moving robot was described. However, a method of calculating the location of a moving robot relative to a beacon placed at a fixed location instead of a user is also contemplated, because the methods differ only in which is taken as the reference; the technical configurations are similar or the same.
In addition to the above-described exemplary embodiments, exemplary embodiments can also be implemented by executing computer readable code/instructions in/on a medium/media, e.g., a computer readable medium/media. The medium/media can correspond to any medium/media permitting the storing and/or transmission of the computer readable code/instructions. The medium/media may also include, alone or in combination with the computer readable code/instructions, data files, data structures, and the like. Examples of code/instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by a computing device and the like using an interpreter. In addition, code/instructions may include functional programs and code segments.
The computer readable code/instructions can be recorded/transferred in/on a medium/media in a variety of ways, with examples of the medium/media including magnetic storage media (e.g., floppy disks, hard disks, magnetic tapes, etc.), optical media (e.g., CD-ROMs, DVDs, etc.), magneto-optical media (e.g., floptical disks), hardware storage devices (e.g., read only memory media, random access memory media, flash memories, etc.) and storage/transmission media such as carrier waves transmitting signals, which may include computer readable code/instructions, data files, data structures, etc. Examples of storage/transmission media may include wired and/or wireless transmission media. The medium/media may also be a distributed network, so that the computer readable code/instructions are stored/transferred and executed in a distributed fashion. The computer readable code/instructions may be executed by one or more processors. The computer readable code/instructions may also be executed and/or embodied in at least one application specific integrated circuit (ASIC) or Field Programmable Gate Array (FPGA).
In addition, one or more software modules or one or more hardware modules may be configured in order to perform the operations of the above-described exemplary embodiments.
The term “module”, as used herein, denotes, but is not limited to, a software component, a hardware component, a plurality of software components, a plurality of hardware components, a combination of a software component and a hardware component, a combination of a plurality of software components and a hardware component, a combination of a software component and a plurality of hardware components, or a combination of a plurality of software components and a plurality of hardware components, which performs certain tasks. A module may advantageously be configured to reside on the addressable storage medium/media and configured to execute on one or more processors. Thus, a module may include, by way of example, components, such as software components, application specific software components, object-oriented software components, class components and task components, processes, functions, operations, execution threads, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided for in the components or modules may be combined into fewer components or modules or may be further separated into additional components or modules. Further, the components or modules can operate via at least one processor (e.g., a central processing unit (CPU)) provided in a device. In addition, examples of hardware components include an application specific integrated circuit (ASIC) and a Field Programmable Gate Array (FPGA). As indicated above, a module can also denote a combination of a software component(s) and a hardware component(s). These hardware components may also be one or more processors.
The computer readable code/instructions and computer readable medium/media may be those specially designed and constructed for the purposes of embodiments, or they may be of the kind well-known and available to those skilled in the art of computer hardware and/or computer software.
According to exemplary embodiments of the invention, a robot can localize a user and move thereto using a single supersonic sensor. Accordingly, the manufacturing cost of the robot can be reduced, and errors due to the different sensitivities of several supersonic sensors can be eliminated.
Although a few exemplary embodiments have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these exemplary embodiments without departing from the principles and spirit of the embodiments, the scope of which is defined in the claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
10-2006-0064054 | Jul 2006 | KR | national