The present invention relates to sensors, particularly to sensors for robots.
A number of sensors have been developed for use on robots. Infrared (IR) or sonar sensors have been used on robots to detect walls or other objects. The information from the sensors can be used to map the environment of the robot. Mapping the environment of a robot using such sensors can be difficult. Often, information from different times or from multiple sensors needs to be combined in order to determine environmental features. Such relatively complex calculations can be difficult to perform in real time, especially when other software processes need to be executed on the robot.
It is desirable to provide a robot with an improved infrared sensor.
One embodiment of the present invention is a robot including a motion unit, an infrared sensor, and a processor. The infrared sensor includes an infrared light source to produce pulses of infrared light and optics to focus reflections of an infrared light pulse from different portions of the environment of the robot onto different detectors in a 2D array of detectors. The detectors of the 2D array are adapted to produce an indication of the distance to the closest object in an associated portion of the environment. The processor receives the indications from the infrared sensor, determines a feature in the environment using the indications, and controls the motion unit to avoid the feature.
Another embodiment of the present invention is a method including producing pulses of infrared light; focusing reflections of an infrared light pulse from different portions of the environment of a robot onto different detectors in a 2D array of detectors, the detectors producing indications of the distances to the closest objects in associated portions of the environment; and using the indications from the infrared sensor to determine a feature in the environment so that the robot can be controlled to avoid the feature.
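The pulse-frame-avoid cycle of the method above can be sketched in simplified form. The following Python sketch is illustrative only; the function names, the frame dimensions, and the simulated wall are assumptions for illustration, not details taken from the specification.

```python
def read_frame(emit_pulse, read_detector, rows=8, cols=8):
    """Emit one infrared pulse and collect one frame: each detector
    reports an indication of the distance to the closest object in
    its associated portion of the environment."""
    emit_pulse()
    return [[read_detector(r, c) for c in range(cols)] for r in range(rows)]

def nearest_obstacle(frame):
    """Smallest distance indication anywhere in the frame."""
    return min(min(row) for row in frame)

# Simulated hardware (assumed): a wall 40 cm away covers the top
# two detector rows; everything else reads as far away.
pulses = []
def emit_pulse():
    pulses.append(1)

def read_detector(r, c):
    return 40.0 if r < 2 else 120.0  # distance indications in cm

frame = read_frame(emit_pulse, read_detector)
```

A motion controller could compare `nearest_obstacle(frame)` against a safety threshold before advancing the robot.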
An infrared sensor 102 includes an infrared light source 104. The infrared light source 104 can produce pulses of infrared light. The infrared sensor 102 also includes optics 106 to focus reflections of an infrared light source pulse from different portions of the environment onto different detectors in a two-dimensional (2D) array of detectors 108. The optics 106 can include a single optical element or multiple optical elements. In one embodiment, the optics 106 focus light reflected from different regions of the environment onto the detectors in the 2D array 108. The detectors produce indications of the distances to the closest objects in associated portions of the environment. In the example of
In one embodiment, each detector produces an indication of the distance to the closest object in the associated portion of the environment. Such indications can be sent from the 2D detector array 108 to a memory, such as the Frame Buffer RAM 114, that stores frames of the indications. A frame can contain the distance-indication data of the pixel detectors for a single pulse.
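A minimal model of such a frame store might look like the following sketch. The class name, capacity, and ring-buffer behavior are assumptions for illustration, not the actual design of the Frame Buffer RAM 114.

```python
from collections import deque

class FrameBuffer:
    """Holds the most recent frames of distance indications, one
    frame per infrared pulse (illustrative sketch only)."""

    def __init__(self, capacity=4):
        # deque with maxlen silently drops the oldest frame when full
        self._frames = deque(maxlen=capacity)

    def store(self, frame):
        self._frames.append(frame)

    def latest(self):
        return self._frames[-1]

    def __len__(self):
        return len(self._frames)
```

The processor would then read `latest()` to obtain the most recent full frame for feature detection.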
Controller 105 can be used to initiate the operation of the IR pulse source 104 as well as to control the counters in the 2D detector array 108.
An exemplary infrared sensor for use in the present invention is available from Canesta, Inc. of San Jose, Calif. Details of such infrared sensors are described in U.S. Pat. No. 6,323,942 and published patent applications US 2002/0140633 A1, US 2002/0063775 A1, and US 2003/0076484 A1, each of which is incorporated herein by reference.
The processor 118 in one embodiment is adapted to receive the indications from the IR sensor 102. In one embodiment, the indications are stored in the frame buffer Random Access Memory (RAM) 114. The indications are used by the processor to determine a feature in the environment and to control the motion unit to avoid the feature. Examples of features include steps, walls, and objects such as chair legs. The advantage of the above-described IR sensor with a two-dimensional array of detectors is that a full frame of distance indications can be created. Full frames of distance indications simplify feature detection. The burden on the processor 118 is also reduced. In one embodiment, feature detection software 122 receives frames of indications and uses the frames to detect features. Once the features are determined, the features can be added to an internal environment map with feature mapping software 124. The motion control software 120 can be used to track the position of the robot 100. Alternately, other elements can be used for positioning the robot. In one embodiment, the robot uses the indications from the detector to determine how to move the robot so that the robot avoids falling down stairs and bumping into walls and other objects.
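One way the feature detection software 122 might flag a wall-like feature from a single frame is sketched below. The heuristic and its thresholds are assumptions for illustration only and are not taken from the specification.

```python
def detect_wall(frame, max_dist=50.0, tol=5.0):
    """Flag a wall-like feature: some row of detectors reports nearly
    equal, close distance indications (illustrative heuristic).

    frame    -- 2D list of distance indications, e.g. in cm
    max_dist -- only distances at or under this count as "close"
    tol      -- maximum spread within a row to count as "flat"
    """
    for row in frame:
        if max(row) - min(row) <= tol and max(row) <= max_dist:
            return True
    return False
```

A detected feature could then be written into the internal environment map by the feature mapping software 124.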
In one embodiment, the robot 100 is a robot cleaner that has an optional cleaning unit 126. Cleaning control software 128 can be used to control the operation of the cleaning unit 126.
In one embodiment, other sensors 130 are used. The other sensors 130 can include sonar sensors or simple infrared sensors positioned around the perimeter of the robot. The other sensors 130 can be used to improve the mapping or other operations of the robot 100.
One embodiment of the present invention is a robot, such as robot cleaner 100, that includes a sensor 102 producing multiple indications of the distances to the closest objects in associated portions of the environment. A processor 118 receives the indications from the sensor, determines a feature in the environment, and controls a motion unit to avoid the feature. A determined feature can be indicated in an internal map. The determined feature can be a step, an object in a room, or another element.
Depending on the orientation of the objects, the detected reflections can be diffuse or specular. Specular reflections are mirror-like reflections that result in high reflected intensities over a narrow angle range. Diffuse reflections have relatively low intensities over a wider angle range. In the orientations of
Alternately, the IR sensor can be oriented to point down at the floor at the edge of the robot. This orientation results in stronger specular reflections when the robot is over the floor. When the robot is over a descending stairway, the reflections will be reduced. Based on this information, the descending stairway can be detected. Using the IR sensor 102 with a 2D array 108 has the additional advantage that the orientation of the edge of the descending stairway can be determined. This can often give hints as to the orientation of the robot with respect to the room.
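With the sensor pointed down past the edge of the robot, the drop-off row in each detector column can be located and compared across columns to estimate the stair edge's orientation. The sketch below is a hypothetical illustration; the floor-range threshold and frame layout are assumed.

```python
def stair_edge_rows(frame, floor_max=30.0):
    """For each detector column, return the first row whose distance
    indication exceeds the floor range (the drop-off), or None if the
    column sees only floor. Equal row indices across columns suggest a
    level edge; indices that change across columns suggest a slanted
    edge, hinting at the robot's orientation relative to the stairway."""
    rows, cols = len(frame), len(frame[0])
    edges = []
    for c in range(cols):
        edge = next((r for r in range(rows) if frame[r][c] > floor_max), None)
        edges.append(edge)
    return edges
```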
In one embodiment, the transmitted pulse is modulated. The modulated pulse can be used to detect low-energy diffuse reflections.
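One common way to recover a weak diffuse echo from a modulated pulse is to correlate the received samples against the transmitted modulation code; the correlation peak then marks the echo's delay. The following is a generic sketch of that technique, not the specific modulation scheme of the specification.

```python
def correlate(received, code):
    """Slide the transmitted code over the received samples; the
    offset with the largest correlation value locates the echo,
    even when the echo's amplitude is low."""
    n = len(code)
    return [sum(received[i + j] * code[j] for j in range(n))
            for i in range(len(received) - n + 1)]
```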
In one embodiment, a cutoff time, tcutoff, can be used. The total detected energy up to the cutoff time gives some indication of distance. The more reflected energy received before the cutoff time, the closer the object is to the infrared sensor. Such an embodiment is especially useful for specular reflections.
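The cutoff-time measurement can be modeled as integrating the detected signal up to tcutoff: an echo from a nearby object returns earlier, so more of its energy falls before the cutoff than for a distant object. The sketch below is illustrative only; the sample spacing and waveforms are assumed.

```python
def energy_before_cutoff(samples, dt, t_cutoff):
    """Total detected energy received before the cutoff time: the sum
    of the samples whose timestamps fall before t_cutoff, scaled by
    the sample spacing dt."""
    n = int(t_cutoff / dt)
    return sum(samples[:n]) * dt
```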
Examples of the operation of an infrared detector of one embodiment are shown in
The regions are three-dimensional but form a footprint on the floor, shown by the dotted lines of
The foregoing description of preferred embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention for various embodiments and with various modifications that are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims and their equivalents.
This application claims priority to U.S. Provisional Application No. 60/454,934, filed Mar. 14, 2003; U.S. Provisional Application No. 60/518,756, filed Nov. 10, 2003; U.S. Provisional Application No. 60/518,763, filed Nov. 10, 2003; U.S. Provisional Application No. 60/526,868, filed Dec. 4, 2003; U.S. Provisional Application No. 60/527,021, filed Dec. 4, 2003; and U.S. Provisional Application No. 60/526,805, filed Dec. 4, 2003. This application incorporates by reference U.S. application Ser. No. 10/798,232, entitled “Robot Vacuum” by Taylor et al., filed concurrently and published as US Pub. 2004/0244138, which has been abandoned. This application is related to the following commonly owned, co-pending applications:

Ser. No. | Filed
---|---
10/798,232 | Mar. 11, 2004
10/799,916 | Mar. 11, 2004
10/798,732 | Mar. 11, 2004
10/798,716 | Mar. 11, 2004
10/798,231 | Mar. 11, 2004
10/798,228 | Mar. 11, 2004
11/104,890 | Apr. 13, 2004
11/171,031 | Jun. 6, 2005
11/574,290 | Feb. 26, 2007
Number | Name | Date | Kind |
---|---|---|---|
4674048 | Okumura | Jun 1987 | A |
4700427 | Knepper | Oct 1987 | A |
4706327 | Getz et al. | Nov 1987 | A |
4782550 | Jacobs | Nov 1988 | A |
4962453 | Pong et al. | Oct 1990 | A |
4977639 | Takahashi et al. | Dec 1990 | A |
5012886 | Jonas et al. | May 1991 | A |
5023444 | Ohman | Jun 1991 | A |
5095577 | Jonas et al. | Mar 1992 | A |
5109566 | Kobayashi et al. | May 1992 | A |
5111401 | Everett, Jr. et al. | May 1992 | A |
5148573 | Killian et al. | Sep 1992 | A |
5208521 | Aoyama | May 1993 | A |
5220263 | Onishi et al. | Jun 1993 | A |
5276618 | Everett, Jr. | Jan 1994 | A |
5279972 | Heckenberg et al. | Jan 1994 | A |
5284522 | Kobayashi et al. | Feb 1994 | A |
5293955 | Lee | Mar 1994 | A |
5307273 | Oh et al. | Apr 1994 | A |
5309592 | Hiratsuka | May 1994 | A |
5321614 | Ashworth | Jun 1994 | A |
5341540 | Soupert et al. | Aug 1994 | A |
5402051 | Fujiwara et al. | Mar 1995 | A |
5440216 | Kim | Aug 1995 | A |
5446356 | Kim | Aug 1995 | A |
5498940 | Kim et al. | Mar 1996 | A |
5534762 | Kim | Jul 1996 | A |
5554917 | Kurz et al. | Sep 1996 | A |
5568589 | Hwang | Oct 1996 | A |
5613261 | Kawakami et al. | Mar 1997 | A |
5621291 | Lee | Apr 1997 | A |
5622236 | Azumi et al. | Apr 1997 | A |
5634237 | Paranjpe | Jun 1997 | A |
5664285 | Melito et al. | Sep 1997 | A |
5677836 | Bauer | Oct 1997 | A |
5682640 | Han | Nov 1997 | A |
5720077 | Nakamura et al. | Feb 1998 | A |
5787545 | Colens | Aug 1998 | A |
5815880 | Nakanishi | Oct 1998 | A |
5841259 | Kim et al. | Nov 1998 | A |
5894621 | Kubo | Apr 1999 | A |
5940927 | Haegermarck et al. | Aug 1999 | A |
5940930 | Oh et al. | Aug 1999 | A |
5942869 | Katou et al. | Aug 1999 | A |
5974347 | Nelson | Oct 1999 | A |
5995883 | Nishikado | Nov 1999 | A |
5995884 | Allen et al. | Nov 1999 | A |
6023064 | Burgin | Feb 2000 | A |
6042656 | Knutson | Mar 2000 | A |
6076025 | Ueno et al. | Jun 2000 | A |
6076226 | Reed | Jun 2000 | A |
6119057 | Kawagoe | Sep 2000 | A |
6255793 | Peless et al. | Jul 2001 | B1 |
6263989 | Won | Jul 2001 | B1 |
6323932 | Zhang et al. | Nov 2001 | B1 |
6323942 | Bamji | Nov 2001 | B1 |
6327741 | Reed | Dec 2001 | B1 |
6338013 | Ruffner | Jan 2002 | B1 |
6339735 | Peless et al. | Jan 2002 | B1 |
6370453 | Sommer | Apr 2002 | B2 |
6389329 | Colens | May 2002 | B1 |
6417641 | Peless et al. | Jul 2002 | B2 |
6431296 | Won | Aug 2002 | B1 |
6457206 | Judson | Oct 2002 | B1 |
6459955 | Bartsch et al. | Oct 2002 | B1 |
6480265 | Maimon et al. | Nov 2002 | B2 |
6493612 | Bisset et al. | Dec 2002 | B1 |
6493613 | Peless et al. | Dec 2002 | B2 |
6507773 | Parker et al. | Jan 2003 | B2 |
6508867 | Schoenewald et al. | Jan 2003 | B2 |
6519804 | Vujik | Feb 2003 | B1 |
D471243 | Cioffi et al. | Mar 2003 | S |
6532404 | Colens | Mar 2003 | B2 |
6535793 | Allard | Mar 2003 | B2 |
6553612 | Dyson et al. | Apr 2003 | B1 |
6574536 | Kawagoe et al. | Jun 2003 | B1 |
6586908 | Petersson et al. | Jul 2003 | B2 |
6590222 | Bisset et al. | Jul 2003 | B1 |
6594844 | Jones | Jul 2003 | B2 |
6597143 | Song et al. | Jul 2003 | B2 |
6601265 | Burlington | Aug 2003 | B1 |
6604022 | Parker et al. | Aug 2003 | B2 |
6605156 | Clark et al. | Aug 2003 | B1 |
6611120 | Song et al. | Aug 2003 | B2 |
6615108 | Peless et al. | Sep 2003 | B1 |
6615885 | Ohm | Sep 2003 | B1 |
6657705 | Sano et al. | Dec 2003 | B2 |
6661239 | Ozick | Dec 2003 | B1 |
6671592 | Bisset et al. | Dec 2003 | B1 |
7155308 | Jones | Dec 2006 | B2 |
20010022506 | Peless et al. | Sep 2001 | A1 |
20010047895 | De Fazio et al. | Dec 2001 | A1 |
20020025472 | Komori et al. | Feb 2002 | A1 |
20020060542 | Song et al. | May 2002 | A1 |
20020063775 | Taylor | May 2002 | A1 |
20020091466 | Song et al. | Jul 2002 | A1 |
20020112899 | Dijksman et al. | Aug 2002 | A1 |
20020120364 | Colens | Aug 2002 | A1 |
20020140633 | Rafii et al. | Oct 2002 | A1 |
20020153855 | Song et al. | Oct 2002 | A1 |
20030030398 | Jacobs et al. | Feb 2003 | A1 |
20030039171 | Chiapetta | Feb 2003 | A1 |
20030060928 | Abramson et al. | Mar 2003 | A1 |
20030076484 | Bamji et al. | Apr 2003 | A1 |
20030120389 | Abramson et al. | Jun 2003 | A1 |
20030192144 | Song et al. | Oct 2003 | A1 |
20030208304 | Peless et al. | Nov 2003 | A1 |
20030229421 | Chmura et al. | Dec 2003 | A1 |
20040088079 | Lavarec et al. | May 2004 | A1 |
Number | Date | Country |
---|---|---|
1 133 537 | Jul 2003 | EP |
2 344 747 | Jun 2000 | GB |
2 344 748 | Jun 2000 | GB |
2 352 486 | Jan 2001 | GB |
2 355 523 | Apr 2001 | GB |
2 369 511 | May 2002 | GB |
05046246 | Feb 1993 | JP |
WO 9113319 | Sep 1991 | WO |
WO 0036968 | Jun 2000 | WO |
WO 0036970 | Jun 2000 | WO |
WO 0038255 | Jun 2000 | WO |
WO 0073868 | Dec 2000 | WO |
WO 0101208 | Jan 2001 | WO |
WO 0128400 | Apr 2001 | WO |
WO 02067744 | Sep 2002 | WO |
WO 02075469 | Sep 2002 | WO |
WO 03031285 | Apr 2003 | WO |
WO 03062937 | Jul 2003 | WO |
WO 03104909 | Dec 2003 | WO |
Number | Date | Country | |
---|---|---|---|
20040220698 A1 | Nov 2004 | US |
Number | Date | Country | |
---|---|---|---|
60454934 | Mar 2003 | US | |
60518756 | Nov 2003 | US | |
60518763 | Nov 2003 | US | |
60526868 | Dec 2003 | US | |
60527021 | Dec 2003 | US | |
60526805 | Dec 2003 | US |