The present disclosure relates to subject matter contained in priority Korean Application No. 10-2011-0094803, filed on Sep. 20, 2011, which is herein expressly incorporated by reference in its entirety.
1. Field of the Disclosure
This specification relates to a mobile robot capable of recognizing its own position using image information and detecting a position of a recharging station, and a controlling method thereof.
2. Background of the Disclosure
Generally, robots have been developed for industrial use and have taken charge of parts of factory automation. As robots have recently been applied to a variety of fields, medical robots, space robots, home robots, and the like are being developed.
A representative example of the home robots is a robot cleaner, a kind of home appliance capable of performing a cleaning operation by sucking in dust or foreign materials while autonomously moving about a predetermined region. Such a robot cleaner is typically provided with a rechargeable battery and an obstacle sensor for avoiding obstacles while moving.
Recently, application technologies using mobile robots, especially robot cleaners, have been under development. For example, with the development of mobile robots having a networking function, it has become possible to issue a cleaning command from a remote place or to monitor indoor conditions. Also, mobile robots having self-position recognition and map building functions using cameras or various types of sensors are being developed.
Therefore, an aspect of the detailed description is to provide a mobile robot capable of detecting image information using cameras and recognizing its own position using the image information, and a controlling method thereof.
Another aspect of the detailed description is to provide a mobile robot capable of quickly detecting a position of a charging station using image information, and a controlling method thereof.
Another aspect of the detailed description is to provide a mobile robot capable of quickly moving to a charging station based on image information and a guideline signal sent from the charging station, and a controlling method thereof.
To achieve these and other advantages and in accordance with the purpose of this specification, as embodied and broadly described herein, there is provided a mobile robot including an image detection unit configured to detect image information by capturing images of surroundings, a storage unit configured to store the image information, and a controller configured to recognize an absolute position by comparing currently detected image information with stored image information, and detect a position of a charging station based on the recognition result of the absolute position. The storage unit may further store position information corresponding to the image information. The controller may compare image information related to a region where the charging station is located with the currently detected image information, and detect the position of the charging station based on the comparison result.
The mobile robot may further include an object sensing unit having a charging signal sensor to receive a guideline signal transmitted from the charging station.
To achieve these and other advantages and in accordance with the purpose of this specification, as embodied and broadly described herein, there is provided a controlling method for a mobile robot including an image detecting step of detecting image information by capturing images of surroundings, an image comparing step of comparing the currently detected image information and pre-stored image information, a position recognizing step of recognizing an absolute position of the mobile robot based on the comparison result from the image comparing step, and a charging station detecting step of detecting a position of the charging station based on the recognition result of the absolute position.
In accordance with another exemplary embodiment of this specification, a mobile robot may include an image detection unit configured to detect image information by capturing images of surroundings, a storage unit configured to store the image information, an object sensing unit having a charging signal sensor to receive a guideline signal transmitted by the charging station, and a controller configured to recognize an absolute position by comparing currently detected image information with stored image information, detect a position of a charging station based on the recognition result of the absolute position, and dock the mobile robot with the charging station according to the guideline signal.
In a mobile robot and a controlling method of the same according to those exemplary embodiments, the mobile robot may recognize a precise position thereof by detecting a plurality of images through an image detection unit, extracting one or more feature points from the plurality of images, and comparing and matching information related to the feature points.
In accordance with the exemplary embodiments, the position of a charging station can be easily detected based on image information, and the mobile robot can quickly move to the charging station when the remaining battery capacity is low, resulting in improved stability and operation efficiency of the system.
In accordance with the exemplary embodiments, the position of the charging station can be detected based on the image information and a guideline signal can be received within a signal reception range, thereby facilitating docking with the charging station.
In accordance with the exemplary embodiments, the precisely recognized position may be linked to an indoor map to perform cleaning or moving, resulting in improved efficiency of the system.
In accordance with the exemplary embodiments, in recognizing a position of the mobile robot based on images, errors which may be caused when the mobile robot is placed at an arbitrary position or when a position change occurs can be reduced, and the mobile robot can precisely recognize its own position within a short time.
Further scope of applicability of the present application will become more apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the disclosure, are given by way of illustration only, since various changes and modifications within the spirit and scope of the disclosure will become apparent to those skilled in the art from the detailed description.
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments and together with the description serve to explain the principles of the disclosure.
Description will now be given in detail of the exemplary embodiments, with reference to the accompanying drawings. For the sake of brief description with reference to the drawings, the same or equivalent components will be provided with the same reference numbers, and description thereof will not be repeated.
The storage unit 200 may further store position information corresponding to the image information in addition to the image information. The controller 300 may compare image information related to a region where a charging station is located with currently detected image information, and detect the position of the charging station using the comparison result. For example, the storage unit 200 may previously store image information relating to a region where the charging station exists and a position of the charging station, and the controller 300 may compare the stored image information with detected image information to detect the position of the charging station.
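By way of illustration only, the following minimal sketch shows one way the association between stored image information and positions might be organized; the record layout, the field names, and the image_similarity helper are hypothetical and not part of the disclosure.

```python
# Hypothetical layout of the storage unit's records: each stored image
# carries the position at which it was captured and a flag marking
# whether it belongs to the charging-station region.
stored_records = [
    {"descriptor": [0.1, 0.4, 0.5], "position": (0.0, 0.0), "charger_region": True},
    {"descriptor": [0.7, 0.2, 0.1], "position": (3.2, 1.5), "charger_region": False},
]

def locate_charging_station(current_descriptor, records, image_similarity):
    """Return the stored position whose charging-station-region image
    best matches the currently detected image."""
    candidates = [r for r in records if r["charger_region"]]
    # Assumes at least one charging-station record exists.
    best = max(candidates,
               key=lambda r: image_similarity(current_descriptor, r["descriptor"]))
    return best["position"]
```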
The storage unit 200 may further store information relating to at least one of an obstacle, an indoor map (cleaning map), a region, and a path, in addition to the image information and the position information. The storage unit 200 may store a control program for controlling the mobile robot and data associated therewith, and may further store a cleaning mode, a traveling method, and the position of the charging station.
The image detection unit 100 may detect image information by capturing images of peripheral regions. The image detection unit 100 may be implemented, for example, as a camera installed at an upper surface or a front surface of the main body.
The controller 300 may extract one or more feature points from the image information detected by the image detection unit 100 and from the image information stored in the storage unit 200. The controller 300 may extract one or more feature points, each having coordinate information, with respect to each of a plurality of images.
The controller 300 may calculate a similarity between the feature points based on the feature point information, and recognize an absolute position using the similarity. Here, the controller 300 may match the feature points based on the image information pre-stored in the storage unit 200, namely, the images or the feature points, and the image information related to the images detected by the image detection unit 100, and thereby recognize the position of the mobile robot. The feature points have a distance therebetween in a feature point space, which is used for determining the similarity: the similarity is high when the distance is short, and low when the distance is long. The feature points may be expressed, for example, as (x1,i, y1,i) and (x2,i, y2,i), or alternatively as points on a three-dimensional (3D) coordinate system. Here, the distance Δ between two feature points may be represented by the following Equation 1.
Δ = √((x1,i − x2,i)² + (y1,i − y2,i)²) [Equation 1]
For example, when the distance between feature points obtained by Equation 1 is less than a predetermined value, the controller 300 determines that the feature points are the same feature point, and matches the feature points with each other.
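As a minimal sketch of the distance-based matching described by Equation 1, the code below treats feature points as 2D coordinates; the function names and the threshold value are hypothetical choices, not values given in the disclosure.

```python
import math

def feature_distance(p1, p2):
    """Distance between two feature points per Equation 1."""
    (x1, y1), (x2, y2) = p1, p2
    return math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2)

def match_feature_points(points_a, points_b, threshold=5.0):
    """Treat a pair of points as the same feature point when their
    distance falls below the predetermined threshold."""
    matches = []
    for pa in points_a:
        # Find the closest candidate in the other image.
        pb = min(points_b, key=lambda p: feature_distance(pa, p))
        if feature_distance(pa, pb) < threshold:
            matches.append((pa, pb))
    return matches
```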
The controller 300 may include a feature point extraction module 310, a similarity calculation module 320, and a position recognition module 330.
The feature point extraction module 310, the similarity calculation module 320 and the position recognition module 330 of the controller 300 may be configured as different types of units or be combined into a single microprocessor.
The feature point extraction module 310 may be configured to extract one or more feature points from the image information detected by the image detection unit 100 and from the image information stored in the storage unit 200. The feature point extraction module 310 may extract one or more feature points, each having coordinate information, with respect to each of a plurality of images.
The similarity calculation module 320 may calculate the similarity between the feature points based on the feature point information, and the position recognition module 330 may recognize the absolute position based on the similarity.
The image detection unit 100 may detect peripheral image information while the mobile robot performs a cleaning operation or moves, and the storage unit 200 may store the detected image information. Here, the storage unit 200 may also store, in the form of a database, image information previously acquired within the region. When the mobile robot is moved to another region by a user or for other reasons, the image detection unit 100 detects image information at the position to which the mobile robot has moved, and the controller 300 recognizes the position of the mobile robot based on a comparison between the newly detected image information and the image information stored in the storage unit 200. Here, the recognized position is not a relative position, such as one recognized through a wheel sensor, a gyro sensor, an acceleration sensor or the like, but an absolute position indicating where within the region the mobile robot is located.
The mobile robot may be provided with left and right main wheels movably disposed at both lower sides thereof. Handles may be installed at both side surfaces of the main wheels to facilitate a user's grip. The driving unit 500 may be connected to the left and right main wheels and provided with wheel motors for rotating the wheels; as the driving unit 500 drives the wheel motors, the main body of the mobile robot moves. The wheel motors are connected to the main wheels, respectively, so as to rotate them, and may operate independently of each other and rotate in both directions. The mobile robot may further be provided with one or more auxiliary wheels at a rear surface thereof so as to support the main body and minimize friction between a lower surface of the main body and the floor (the surface to be cleaned), thereby allowing smooth movement. Wheel sensors may be connected to the left and right main wheels, respectively, so as to sense the number of turns of each main wheel. Here, each wheel sensor may be a rotary encoder. The rotary encoders sense and output the number of turns of the left and right wheels while the mobile robot moves in a traveling mode or a cleaning mode. The controller 300 may calculate the rotation speed of each wheel based on the number of turns, and may calculate a rotation angle of the mobile robot based on the difference in the number of turns between the left and right wheels. Accordingly, the controller 300 may recognize a relative position using the wheel sensors. As another example, the mobile robot may be provided with an acceleration sensor for recognizing its speed and position, or a gyro sensor for detecting its rotation speed, so as to detect a relative position.
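The relative-position calculation from the wheel sensors may be sketched as the differential-drive dead reckoning below; the wheel radius, wheel base, and encoder resolution are hypothetical values, since the disclosure does not specify them.

```python
import math

WHEEL_RADIUS = 0.035   # meters (hypothetical)
WHEEL_BASE = 0.23      # spacing between left and right wheels, meters (hypothetical)
TICKS_PER_REV = 360    # rotary-encoder resolution (hypothetical)

def update_pose(x, y, heading, left_ticks, right_ticks):
    """Dead-reckon a new (x, y, heading) from the encoder tick counts
    accumulated since the previous update."""
    dist_l = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    dist_r = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    dist = (dist_l + dist_r) / 2.0            # average forward travel
    dtheta = (dist_r - dist_l) / WHEEL_BASE   # rotation from the wheel difference
    x += dist * math.cos(heading + dtheta / 2.0)
    y += dist * math.sin(heading + dtheta / 2.0)
    return x, y, heading + dtheta
```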
The controller 300 may further include a feature point matching module 370 to match the feature points within images whose calculated similarity is greater than a predetermined reference value with the feature points within the newly detected images.
The feature point cluster generation module 340 may divide a plurality of feature points into a predetermined number (for example, 16, 32 or 64) of clusters to generate feature point clusters. The central point extraction module 350 may extract a central point from each cluster, using, for example, a K-means algorithm. Each central point encodes properties of the feature points within its cluster. The relationship between the feature points and the central points may be represented in the form of a histogram, and each image may be represented using this relationship. The image information represented using the central points may be stored in the storage unit 200.
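A loose sketch of this clustering and histogram representation is given below, using scikit-learn's KMeans as one possible implementation of the K-means algorithm mentioned above; the function name, the per-image normalization step, and the default cluster count are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def build_histograms(features_per_image, num_clusters=32):
    """Cluster all feature points into central points and represent each
    image as a histogram of its feature points over the clusters.
    `features_per_image` is a list of (n_i, d) coordinate arrays."""
    all_features = np.vstack(features_per_image)
    kmeans = KMeans(n_clusters=num_clusters).fit(all_features)
    histograms = []
    for features in features_per_image:
        # Assign each feature point of this image to its nearest central point.
        labels = kmeans.predict(features)
        hist = np.bincount(labels, minlength=num_clusters).astype(float)
        histograms.append(hist / max(hist.sum(), 1.0))  # normalize per image
    return kmeans.cluster_centers_, histograms
```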
The similarity calculation module 320 may calculate the similarity of a plurality of images using the central points, and the central point matching module 360 may match the central points with each other based on the similarity calculation result. The similarity of the images represented using the central points may be calculated from the relationship between their histograms, for example, as a running total of a Euclidean distance, by the following Equation 2.
α = Σi=1..K √((H1(i) − H2(i))²) [Equation 2]
where α denotes the similarity, H1 and H2 denote the histograms, and K denotes the number of clusters. Generally, images obtained by consecutively capturing a specific region exhibit a high similarity, whereas images obtained from different regions exhibit a low similarity. Nevertheless, images from different regions may occasionally exhibit a high similarity. To distinguish those cases, the feature point matching module 370 may perform feature point matching.
The controller 300 may select a specific number of images showing a high similarity calculated between the central points, and match the feature points within the selected images with the feature points within the newly detected images. The controller 300 may then recognize the absolute position based on the matching results. Various matching algorithms for matching the central points or the feature points exist, so a detailed description thereof will be omitted.
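The selection of candidate images by central-point similarity might look like the following sketch, which implements the histogram comparison of Equation 2 as reconstructed above; the top_n cutoff is a hypothetical tuning value, and the selected candidates would then go through a feature-point matching step such as match_feature_points shown earlier.

```python
import math

def histogram_distance(h1, h2):
    """Running total of the per-cluster Euclidean distance between two
    histograms (Equation 2); a smaller value means a higher similarity."""
    return sum(math.sqrt((a - b) ** 2) for a, b in zip(h1, h2))

def select_candidate_images(new_hist, stored_hists, top_n=5):
    """Pick the indices of the stored images most similar to the newly
    detected image, for subsequent feature-point matching."""
    order = sorted(range(len(stored_hists)),
                   key=lambda i: histogram_distance(new_hist, stored_hists[i]))
    return order[:top_n]
```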
The feature point extraction module 310, the similarity calculation module 320, the position recognition module 330, the feature point cluster generation module 340, the central point extraction module 350, the central point matching module 360 and the feature point matching module 370 may be configured as separate units; they are divided here for the sake of explanation and may alternatively be combined into a single microprocessor.
The mobile robot may further include an object sensing unit 400 having a charging signal sensor 410 configured to receive a guideline signal transmitted from the charging station.
The charging signal sensor 410 may be installed at one side of the inside or outside of the mobile robot.
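As a hedged sketch of docking guided by the charging-station signal, the two-sensor steering loop below is one plausible scheme; the disclosure does not specify the homing logic, so the sensor arrangement and the drive commands are assumptions.

```python
def docking_step(left_signal_seen, right_signal_seen, drive):
    """Issue one steering command based on which charging-signal sensors
    currently receive the guideline signal."""
    if left_signal_seen and right_signal_seen:
        drive("forward")      # centered on the guide beam; approach
    elif left_signal_seen:
        drive("turn_left")    # beam lies to the left; steer toward it
    elif right_signal_seen:
        drive("turn_right")   # beam lies to the right; steer toward it
    else:
        drive("search")       # rotate in place until the beam is reacquired
```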
The object sensing unit 400 may include one or more obstacle sensors installed at a front or side surface of the main body so as to sense an obstacle around the mobile robot.
The object sensing unit 400 may further include a cliff sensor installed at a lower surface of the main body so as to sense an obstacle on the floor, for example, a cliff (precipice). The cliff sensor may be configured to obtain a stable measurement value irrespective of the reflectivity or color of the floor, and may be implemented in the form of an infrared module, such as a PSD sensor.
The mobile robot may further include an output unit 700 configured to output information, such as an operating state of the mobile robot, to a user.
When the mobile robot is a robot cleaner, it may further include a cleaning unit. The cleaning unit may include a dust bin for storing collected dust, a suction fan for providing a driving force to suck in dust from a region to be cleaned, and a suction motor for rotating the suction fan to draw in air. With this configuration, the cleaning unit may suck in dust or foreign materials from the surroundings.
In the controlling method according to an exemplary embodiment, the mobile robot detects image information by capturing images of surroundings through the image detection unit, and extracts one or more feature points from the detected images. The mobile robot then calculates a similarity between the feature points based on the feature point information. The mobile robot matches the feature points with each other based on the pre-stored image information, namely, the images or the feature points, and the image information related to the images detected through the cameras, and recognizes the position of the mobile robot (S400). Since the position information corresponding to the image information is stored in addition to the image information, the mobile robot compares the image information at the position of the charging station with the currently detected image information, and detects the position of the charging station based on the comparison result (S500).
The mobile robot divides the plurality of feature points into a predetermined number (for example, 16, 32 or 64) of clusters to generate feature point clusters, and extracts a central point from each cluster (S311). The relationship between the feature points and the central points may be represented in the form of a histogram, and each image may be represented based on this relationship; the mobile robot may store each piece of image information represented based on the central points. The mobile robot calculates a similarity of the plurality of images based on the central points, and matches the central points with each other based on the similarity calculation result (S321). The similarity of the images represented using the central points may be calculated from the relationship between the histograms, for example, as a running total of a Euclidean distance. Generally, images obtained by consecutively capturing a specific region exhibit a high similarity, whereas images obtained from different regions exhibit a low similarity; nevertheless, images from different regions may occasionally exhibit a high similarity. To distinguish those cases, the mobile robot performs feature point matching between selected images (S323). For example, the mobile robot selects a predetermined number of images having a high similarity calculated between the central points, and matches the feature points within the selected images with the feature points within the newly detected images (S323). Next, the mobile robot recognizes the absolute position based on the matching result of the feature points (S400). Various matching algorithms for matching the central points or the feature points exist, so a detailed description thereof will be omitted.
As described above, in accordance with the exemplary embodiments, a mobile robot may recognize its precise position by detecting a plurality of images through an image detection unit, such as a camera located at an upper or front surface, extracting two or more feature points from the plurality of images, and comparing and matching information related to each of the feature points. Also, the mobile robot may use the image information to easily detect the position of a charging station and quickly move to the charging station when the remaining battery capacity is low. In addition, the mobile robot may detect the position of the charging station based on the image information and receive a guideline signal within the signal reception range, so as to easily dock with the charging station.