Robotic vacuum cleaner with edge and object detection system

Information

  • Patent Grant
  • Patent Number
    7,801,645
  • Date Filed
    Thursday, March 11, 2004
  • Date Issued
    Tuesday, September 21, 2010
Abstract
A robot uses an infrared sensor including an infrared light source which produces pulses of infrared light. Optics focus reflections of the infrared light pulses from different portions of the environment of the robot to different detectors in a 2D array of detectors. The detectors produce an indication of the distance to the closest object in an associated portion of the environment. The robot can use the indications to determine features in the environment. The robot can be controlled to avoid these features.
Description
FIELD OF THE INVENTION

The present invention relates to sensors, particularly to sensors for robots.


BACKGROUND

A number of sensors have been developed for use on robots. Infrared (IR) and sonar sensors have been used on robots to detect walls and other objects. The information from the sensors can be used to map the environment of the robot. Mapping the environment of a robot using such sensors can be difficult. Often, information from different times or from multiple sensors needs to be combined in order to determine environmental features. Such relatively complex calculations can be difficult to do in real time, especially when other software processes need to be executed on the robot.


It is desired to have a robot using an improved infrared sensor.


BRIEF SUMMARY

One embodiment of the present invention is a robot including a motion unit, an infrared sensor, and a processor. The infrared sensor includes an infrared light source to produce pulses of infrared light and optics to focus reflections of the infrared light pulses from different portions of the environment of the robot to different detectors in a 2D array of detectors. The detectors of the 2D array are adapted to produce an indication of the distance to the closest object in an associated portion of the environment. The processor receives the indications from the infrared sensor, determines a feature in the environment using the indications, and controls the motion unit to avoid the feature.


Another embodiment of the present invention is a method including producing pulses of infrared light; focusing reflections of the infrared light pulses from different portions of the environment of a robot to different detectors in a 2D array of detectors, the detectors producing indications of the distances to the closest objects in associated portions of the environment; and using the indications from the infrared sensor to determine a feature in the environment so that the robot can be controlled to avoid the feature.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a functional diagram of a robot including an infrared sensor of one embodiment of the present invention.



FIG. 2 illustrates an example showing the operation of the infrared sensor of one embodiment of the present invention.



FIG. 3 is a diagram that illustrates the use of reflected pulses for the example of FIG. 2.



FIG. 4 illustrates another embodiment of the operation of the infrared sensor for a robot.



FIG. 5 illustrates the use of reflected pulses for the example of FIG. 4.



FIGS. 6A-6D illustrate the operation of the infrared sensor to detect features in the environment of a robot.





DETAILED DESCRIPTION


FIG. 1 illustrates one example of a robot 100. In the example of FIG. 1, the robot 100 includes a motion unit 116. The motion unit 116 can include wheels, tracks, wings, legs, or any other means of locomotion. In one embodiment, the processor 118 uses motion control software 120 to control the motion unit 116. The robot 100 also includes an IR sensor 102.


The infrared sensor 102 includes an infrared light source 104. The infrared light source 104 can produce pulses of infrared light. The infrared sensor 102 also includes optics 106 to focus reflections of an infrared light pulse from different portions of the environment to different detectors in a two-dimensional (2D) array of detectors 108. The optics 106 can include one or more optical elements. In one embodiment, the optics 106 focus light reflected from different regions of the environment onto the detectors in the 2D array 108. The detectors produce indications of the distances to the closest objects in associated portions of the environment. In the example of FIG. 1, the 2D array includes pixel detectors 110 and associated detector logic 112. In one embodiment, the 2D array of detectors is constructed using CMOS technology on a semiconductor substrate. The pixel detectors can be photodiodes. The detector logic 112 can include counters. In one embodiment, a counter for a pixel detector runs until a reflected pulse is received. The counter value thus indicates the time for the pulse to travel from the IR sensor to an object in the environment and be reflected back to the pixel detector. Different portions of the environment with different objects will have different pulse transit times.
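
As a concrete illustration of the counter-based ranging just described, the sketch below converts a per-pixel counter value into a distance. This is not from the patent: the counter clock frequency and the encoding of the counter value are assumptions made for illustration only.

```python
# Minimal sketch of counter-based time-of-flight ranging (hypothetical
# clock rate; the patent does not specify counter parameters).
C = 299_792_458.0      # speed of light in m/s
CLOCK_HZ = 500e6       # assumed counter clock frequency

def counter_to_distance_m(ticks: int) -> float:
    """Convert a counter value (ticks elapsed until the reflected pulse
    arrives) into a one-way distance estimate in meters."""
    round_trip_s = ticks / CLOCK_HZ      # time for the pulse out and back
    return C * round_trip_s / 2.0        # halve for the one-way distance

# Example: a counter value of 10 ticks corresponds to roughly 3 m.
print(counter_to_distance_m(10))
```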


In one embodiment, each detector produces an indication of the distance to the closest object in the associated portion of the environment. Such indications can be sent from the 2D detector array 108 to a memory such as the Frame Buffer RAM 114 that stores frames of the indications. A frame can contain distance indication data of the pixel detectors for a single pulse.


Controller 105 can be used to initiate the operation of the IR pulse source 104 as well as to control the counters in the 2D detector array 108.


An exemplary infrared sensor for use in the present invention is available from Canesta, Inc. of San Jose, Calif. Details of such infrared sensors are described in U.S. Pat. No. 6,323,942 and published patent applications US 2002/0140633 A1, US 2002/0063775 A1, and US 2003/0076484 A1, each of which is incorporated herein by reference.


The processor 118 in one embodiment is adapted to receive the indications from the IR sensor 102. In one embodiment, the indications are stored in the frame buffer Random Access Memory (RAM) 114. The indications are used by the processor to determine a feature in the environment and to control the motion unit to avoid the feature. Examples of features include steps, walls, and objects such as chair legs. The advantage of the above-described IR sensor with a two-dimensional array of detectors is that a full frame of distance indications can be created. Full frames of distance indications simplify feature detection. The burden on the processor 118 is also reduced. In one embodiment, feature detection software 122 receives frames of indications and uses the frames to detect features. Once the features are determined, they can be added to an internal environment map with feature mapping software 124. The motion control software 120 can be used to track the position of the robot 100. Alternately, other elements can be used for positioning the robot. In one embodiment, the robot uses the indications from the detector to determine how to move so that it avoids falling down stairs and bumping into walls and other objects.
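
To make the detect-and-avoid flow concrete, here is a hedged sketch of the kind of loop the paragraph describes. The frame layout, the thresholds, and the set_velocity callback are hypothetical; the patent does not prescribe a particular control interface.

```python
# Hypothetical avoidance loop: read a frame of distance indications,
# detect a nearby feature, and steer away from it.
from typing import Callable, List

NEAR_THRESHOLD_M = 0.3   # assumed "too close" distance

def has_near_feature(frame: List[List[float]]) -> bool:
    """True if any region reports an object closer than the threshold."""
    return any(d < NEAR_THRESHOLD_M for row in frame for d in row)

def control_step(frame: List[List[float]],
                 set_velocity: Callable[[float, float], None]) -> None:
    """One control iteration: turn away from a near feature, else go forward."""
    if has_near_feature(frame):
        set_velocity(0.0, 0.5)   # stop forward motion and turn away
    else:
        set_velocity(0.2, 0.0)   # continue forward
```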


In one embodiment, the robot 100 is a robot cleaner that has an optional cleaning unit 126. Cleaning control software 128 can be used to control the operation of the cleaning unit 126.


In one embodiment, other sensors 130 are used. The other sensors 130 can include sonar sensors or simple infrared sensors positioned around the perimeter of the robot. The other sensors 130 can be used to improve the mapping or other operations of the robot 100.


One embodiment of the present invention is a robot, such as the robot cleaner 100, that includes a sensor 102 producing multiple indications of the distances to the closest objects in associated portions of the environment. A processor 118 receives the indications from the sensor, determines a feature in the environment, and controls a motion unit to avoid the feature. A determined feature can be indicated in an internal map. The determined feature can be a step, an object in a room, or another element.



FIG. 2 illustrates the operation of a two-dimensional array of detectors. In the example of FIG. 2, a simplified one-dimensional slice of the two-dimensional array is shown. In this example, regions 1, 2, 3, 4 and 5 extend over the edge of the step and thus do not return substantial reflections to the two-dimensional array. Region 6 receives some reflections, and regions 7 and 8 receive even more reflections.



FIG. 3 illustrates an exemplary timing diagram for the reflections of the example of FIG. 2. Reflections from objects located closer to the robot will return to the sensor sooner than reflections from objects that are further away. In one embodiment, the time from the start of the pulse to the start of the received reflection indicates the distance to an object.


Depending on the orientation of the objects, the detected reflections can be diffuse or specular. Specular reflections are mirror-like reflections that result in high reflected intensities over a narrow angle range. Diffuse reflections have relatively low intensities over a wider angle range. In the orientations of FIG. 3, floors and carpets tend to result in diffuse reflections at the sensor, while perpendicular objects such as walls tend to result in specular reflections at the sensor. The intensity of the reflected energy can be useful in obtaining information concerning the type of reflection. Reflective devices can be placed in the environment to increase the specular reflections at stair edges or other locations. In one example, the reflective devices can use retroreflective material such as a retroreflective tape or ribbon.
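
The intensity cue mentioned above can be reduced to a simple rule of thumb. The sketch below is an assumption-laden illustration, not the patent's method: the threshold separating specular from diffuse returns would in practice depend on the emitter power, the optics, and the surface materials.

```python
# Hypothetical intensity-based reflection classifier.
SPECULAR_THRESHOLD = 0.7   # assumed normalized peak-intensity threshold

def classify_reflection(peak_intensity: float) -> str:
    """Classify a return as specular (strong, narrow-angle) or diffuse
    (weak, wide-angle) from its normalized peak intensity."""
    return "specular" if peak_intensity >= SPECULAR_THRESHOLD else "diffuse"
```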


Alternatively, the IR sensor can be oriented pointing down onto the floor at the edge of the robot. This results in greater specular reflections when the robot is over the floor. When the robot is over a descending stairway, the reflections will be reduced. Based on this information, the descending stairway can be detected. Using the IR sensor 102 with a 2D array 108 has the additional advantage that the orientation of the edge of the descending stairway can be determined. This can often give hints as to the orientation of the robot with respect to the room.
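
One way to estimate the stair-edge orientation the paragraph mentions is to locate, in each column of the frame, the first region with no return and fit a line through those boundary points. The sketch below assumes no-return regions are encoded as infinity; both that encoding and the line-fitting approach are illustrative assumptions, not the patent's stated method.

```python
# Hypothetical stair-edge orientation estimate from a 2D distance frame.
from typing import Optional
import numpy as np

def edge_angle_deg(frame: np.ndarray) -> Optional[float]:
    """Fit a line through the first no-return region of each column and
    return the edge angle in degrees, or None if no edge is visible."""
    cols, rows = [], []
    for c in range(frame.shape[1]):
        missing = np.flatnonzero(np.isinf(frame[:, c]))
        if missing.size:
            cols.append(c)
            rows.append(missing[0])   # first region past the edge
    if len(cols) < 2:
        return None                   # no edge (or too little of it) in view
    slope, _ = np.polyfit(cols, rows, 1)
    return float(np.degrees(np.arctan(slope)))
```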


In one embodiment, the transmitted pulse is modulated. The modulated pulse can be used to detect low-energy diffuse reflections.
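
A common way to pull a weak modulated return out of noise is to correlate the received samples against the known modulation code. The patent does not detail its modulation scheme, so the binary code and the correlation approach below are assumptions for illustration.

```python
# Hypothetical correlation detector for a weak modulated reflection.
import numpy as np

CODE = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=float)  # assumed code

def reflection_lag(received: np.ndarray) -> int:
    """Return the sample lag of the strongest correlation peak, which
    indicates the arrival time of the modulated reflection."""
    corr = np.correlate(received - received.mean(),
                        CODE - CODE.mean(), mode="valid")
    return int(np.argmax(corr))
```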


In one embodiment, a cutoff time, t_cutoff, can be used. The total detected energy up to the cutoff time gives some indication of distance: the more reflected energy received before the cutoff time, the closer the object is to the infrared sensor. Such an embodiment is especially useful for specular reflections.
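
The cutoff-time idea maps directly to a short calculation: integrate the detected energy over the samples received before t_cutoff and compare pixels by that total. The sample rate and signal encoding below are assumptions for illustration.

```python
# Hypothetical energy-up-to-cutoff distance indication.
import numpy as np

def energy_before_cutoff(samples: np.ndarray, sample_rate_hz: float,
                         t_cutoff_s: float) -> float:
    """Total detected energy up to the cutoff time; a larger value
    suggests a closer (or more strongly specular) reflector."""
    n = int(t_cutoff_s * sample_rate_hz)      # samples before the cutoff
    return float(np.sum(np.square(samples[:n])))
```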



FIG. 4 illustrates an example in which an object 404 is detected by the 2D array 400 and optics 402. The regions that detect the object 404 (regions 1, 2, 3, 4, and 5) will typically have relatively strong reflections. FIG. 5 illustrates an exemplary timing diagram for the reflections of the example of FIG. 4.


Examples of the operation of an infrared detector of one embodiment are shown in FIGS. 6A to 6D. The example of FIG. 6A shows a simplified embodiment in which a 2D array produces a frame of indications of the distances to the closest objects in the environment. In the example of FIG. 6A, an exemplary 4×4 frame of indications is used, but the frame size and the number of regions examined can be much larger.


The regions are three dimensional but form a footprint on the floor shown by the dotted lines of FIG. 6A. The example of FIG. 6A illustrates the distance indications for a flat floor. FIG. 6B illustrates a case in which an object 602 is in the detected region for the IR sensor. FIG. 6C shows an example when the object 602 is closer to the robot. The changes in the distance indications can be used to determine the size and location of the object. Walls can be detected due to their characteristic patterns.
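
A hedged sketch of the comparison the paragraph implies: subtract the flat-floor baseline of FIG. 6A from the current frame and flag regions whose distance indication shortened noticeably. The margin and array encoding are illustrative assumptions.

```python
# Hypothetical object localization by comparing a frame to a floor baseline.
import numpy as np

def object_regions(frame: np.ndarray, floor_baseline: np.ndarray,
                   margin_m: float = 0.1) -> np.ndarray:
    """Indices of regions reporting distances noticeably shorter than
    the expected flat-floor distances (an object in view)."""
    return np.argwhere(frame < floor_baseline - margin_m)
```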



FIG. 6D illustrates an example in which the robot 600 approaches a step 604. In this example, some regions indicate greatly increased distances or have no detected reflections due to the step. The robot 600 can thus determine the location of the step. Steps and objects can be avoided by the robot 600. The positions of the objects and steps can be stored in an internal map maintained by the robot 600.
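
The step case of FIG. 6D is the mirror image of the object case: flag regions whose indications jumped well beyond the floor baseline or returned nothing at all. As before, the inf encoding of "no return" and the margin are assumptions for illustration.

```python
# Hypothetical step detection from a frame of distance indications.
import numpy as np

def step_regions(frame: np.ndarray, floor_baseline: np.ndarray,
                 margin_m: float = 0.2) -> np.ndarray:
    """Indices of regions whose distance indication greatly increased or
    produced no return (encoded here as inf), marking a descending step."""
    far_or_missing = (frame > floor_baseline + margin_m) | np.isinf(frame)
    return np.argwhere(far_or_missing)
```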


The foregoing description of preferred embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention, its various embodiments, and the various modifications that are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims and their equivalents.

Claims
  • 1. A robot comprising: a motion unit; a two-dimensional (2D) array of detectors supported by the motion unit, each detector having a counter associated therewith, the 2D array operable to generate a frame of distance indications to one or more features in an environment in which the robot operates; an infrared sensor including: (a) an infrared light source configured to produce a plurality of modulated pulses of infrared light directed toward the environment of the robot; and (b) at least one optic element configured to focus a plurality of reflections of the infrared light pulses from the environment of the robot to the 2D array of detectors, causing the detection of the 2D array of detectors, wherein the modulated pulses enable detection of low-energy diffuse reflections; and at least one processor operatively coupled to the 2D array of detectors, the processor operable: (a) to determine one or more features of the environment based at least in part on one or more frames of distance indications; and (b) to control the motion unit of the robot to avoid the one or more detected features.
  • 2. The robot of claim 1, wherein the distance indication is produced by the counter measuring a period of time to receive a reflected pulse.
  • 3. The robot of claim 2, wherein the feature is indicated in an internal map of the environment.
  • 4. The robot of claim 2, wherein the feature is a step.
  • 5. The robot of claim 2, wherein the feature is an object in a room.
  • 6. The robot of claim 1, wherein the distance indication is produced by measuring an energy of a reflected pulse up to a cutoff time.
  • 7. The robot of claim 1, wherein the robot is a robot cleaner.
  • 8. The robot of claim 1, wherein the processor is further operable to add the one or more determined features to an internal map of the environment.
  • 9. The robot of claim 1, further comprising a memory device for storing the one or more distance indications.
  • 10. A method for controlling a robot comprising: producing a plurality of modulated pulses of infrared light directed toward an environment of the robot; focusing with at least one optic element a plurality of reflections of the infrared light pulses from the environment of the robot to a two-dimensional (2D) array of detectors; detecting by the 2D array of detectors the plurality of reflections of the infrared light pulses, wherein the modulated pulses enable detection of low-energy diffuse reflections; generating a frame of distance indications to one or more features in the environment; processing the generated one or more frames of distance indications to determine one or more features of the environment; and controlling the motion of the robot to avoid the one or more features of the environment.
  • 11. The method of claim 10, wherein the distance indication is produced by a counter measuring the time to receive a reflected pulse.
  • 12. The method of claim 11, wherein the feature is indicated in an internal map of the environment.
  • 13. The method of claim 11, wherein the feature is a step.
  • 14. The method of claim 11, wherein the feature is an object in a room.
  • 15. The method of claim 10, wherein the distance indication is produced by measuring the energy of a reflected pulse up to a cutoff time.
  • 16. The method of claim 10, wherein the robot is a robot cleaner.
  • 17. A robot comprising: an infrared light source configured to produce a plurality of modulated pulses of infrared light directed toward an environment of the robot; a two-dimensional (2D) array of detectors, each detector having a counter associated therewith, the 2D array operable: (a) to detect a plurality of reflections of the infrared light pulses from the environment, including low-energy diffuse reflections; and (b) to generate a frame of distance indications to one or more features of the environment; at least one processor operatively coupled to the 2D array of detectors, the processor operable: (a) to determine one or more features of the environment based at least in part on one or more frames of distance indications; and (b) to control the motion of the robot to avoid the one or more detected features.
  • 18. The robot of claim 17, wherein the distance indication is produced by the counter measuring a period of time to receive a reflected pulse.
  • 19. The robot of claim 18, wherein the feature is indicated in an internal map of the environment.
  • 20. The robot of claim 18, wherein the feature is a step.
  • 21. The robot of claim 18, wherein the feature is an object in a room.
  • 22. The robot of claim 17, wherein the distance indication is produced by measuring an energy of a reflected pulse up to a cutoff time.
  • 23. The robot of claim 17, wherein the robot is a robot cleaner.
  • 24. The robot of claim 17, wherein the processor is further operable to add the one or more determined features to an internal map of the environment.
  • 25. The robot of claim 17, further comprising a memory device for storing the one or more distance indications.
  • 26. A method for controlling a robot comprising: transmitting a plurality of modulated pulses of infrared light toward an environment of the robot; detecting by an infrared detector a plurality of low-energy diffuse reflections of the infrared light pulses indicating distances to one or more features of an environment; processing one or more distance indications to determine one or more features of the environment; adding the one or more determined features to an internal map of the environment; and controlling the motion of the robot to avoid the one or more features of the environment.
  • 27. The method of claim 26, wherein the distance indication is produced by measuring the time to receive a reflected pulse from one or more features of the environment.
  • 28. The method of claim 26, wherein the distance indication is produced by measuring the energy of a reflected pulse from one or more features of the environment.
  • 29. The method of claim 26, wherein one or more features are indicated in the internal map of the environment.
  • 30. The method of claim 26, wherein the feature is a step.
  • 31. The method of claim 26, wherein the feature is an object in a room.
  • 32. The method of claim 26, wherein the robot is a robot cleaner.
CLAIM OF PRIORITY

This application claims priority to U.S. Provisional Application No. 60/454,934, filed Mar. 14, 2003; U.S. Provisional Application No. 60/518,756, filed Nov. 10, 2003; U.S. Provisional Application No. 60/518,763, filed Nov. 10, 2003; U.S. Provisional Application No. 60/526,868, filed Dec. 4, 2003; U.S. Provisional Application No. 60/527,021, filed Dec. 4, 2003; and U.S. Provisional Application No. 60/526,805, filed Dec. 4, 2003. This application incorporates by reference U.S. application Ser. No. 10/798,232, entitled "Robot Vacuum" by Taylor et al., filed concurrently and published as US Pub. No. 2004/0244138, which has been abandoned. This application is related to the following commonly owned, co-pending applications:

  • Ser. No. 10/798,232, filed Mar. 11, 2004
  • Ser. No. 10/799,916, filed Mar. 11, 2004
  • Ser. No. 10/798,732, filed Mar. 11, 2004
  • Ser. No. 10/798,716, filed Mar. 11, 2004
  • Ser. No. 10/798,231, filed Mar. 11, 2004
  • Ser. No. 10/798,228, filed Mar. 11, 2004
  • Ser. No. 11/104,890, filed Apr. 13, 2004
  • Ser. No. 11/171,031, filed Jun. 6, 2005
  • Ser. No. 11/574,290, filed Feb. 26, 2007

US Referenced Citations (109)
Number Name Date Kind
4674048 Okumura Jun 1987 A
4700427 Knepper Oct 1987 A
4706327 Getz et al. Nov 1987 A
4782550 Jacobs Nov 1988 A
4962453 Pong et al. Oct 1990 A
4977639 Takahashi et al. Dec 1990 A
5012886 Jonas et al. May 1991 A
5023444 Ohman Jun 1991 A
5095577 Jonas et al. Mar 1992 A
5109566 Kobayashi et al. May 1992 A
5111401 Everett, Jr. et al. May 1992 A
5148573 Killian et al. Sep 1992 A
5208521 Aoyama May 1993 A
5220263 Onishi et al. Jun 1993 A
5276618 Everett, Jr. Jan 1994 A
5279972 Heckenberg et al. Jan 1994 A
5284522 Kobayashi et al. Feb 1994 A
5293955 Lee Mar 1994 A
5307273 Oh et al. Apr 1994 A
5309592 Hiratsuka May 1994 A
5321614 Ashworth Jun 1994 A
5341540 Soupert et al. Aug 1994 A
5402051 Fujiwara et al. Mar 1995 A
5440216 Kim Aug 1995 A
5446356 Kim Aug 1995 A
5498940 Kim et al. Mar 1996 A
5534762 Kim Jul 1996 A
5554917 Kurz et al. Sep 1996 A
5568589 Hwang Oct 1996 A
5613261 Kawakami et al. Mar 1997 A
5621291 Lee Apr 1997 A
5622236 Azumi et al. Apr 1997 A
5634237 Paranjpe Jun 1997 A
5664285 Melito et al. Sep 1997 A
5677836 Bauer Oct 1997 A
5682640 Han Nov 1997 A
5720077 Nakamura et al. Feb 1998 A
5787545 Colens Aug 1998 A
5815880 Nakanishi Oct 1998 A
5841259 Kim et al. Nov 1998 A
5894621 Kubo Apr 1999 A
5940927 Haegermarck et al. Aug 1999 A
5940930 Oh et al. Aug 1999 A
5942869 Katou et al. Aug 1999 A
5974347 Nelson Oct 1999 A
5995883 Nishikado Nov 1999 A
5995884 Allen et al. Nov 1999 A
6023064 Burgin Feb 2000 A
6042656 Knutson Mar 2000 A
6076025 Ueno et al. Jun 2000 A
6076226 Reed Jun 2000 A
6119057 Kawagoe Sep 2000 A
6255793 Peless et al. Jul 2001 B1
6263989 Won Jul 2001 B1
6323932 Zhang et al. Nov 2001 B1
6323942 Bamji Nov 2001 B1
6327741 Reed Dec 2001 B1
6338013 Ruffner Jan 2002 B1
6339735 Peless et al. Jan 2002 B1
6370453 Sommer Apr 2002 B2
6389329 Colens May 2002 B1
6417641 Peless et al. Jul 2002 B2
6431296 Won Aug 2002 B1
6457206 Judson Oct 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6480265 Maimon et al. Nov 2002 B2
6493612 Bisset et al. Dec 2002 B1
6493613 Peless et al. Dec 2002 B2
6507773 Parker et al. Jan 2003 B2
6508867 Schoenewald et al. Jan 2003 B2
6519804 Vujik Feb 2003 B1
D471243 Cioffi et al. Mar 2003 S
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6553612 Dyson et al. Apr 2003 B1
6574536 Kawagoe et al. Jun 2003 B1
6586908 Petersson et al. Jul 2003 B2
6590222 Bisset et al. Jul 2003 B1
6594844 Jones Jul 2003 B2
6597143 Song et al. Jul 2003 B2
6601265 Burlington Aug 2003 B1
6604022 Parker et al. Aug 2003 B2
6605156 Clark et al. Aug 2003 B1
6611120 Song et al. Aug 2003 B2
6615108 Peless et al. Sep 2003 B1
6615885 Ohm Sep 2003 B1
6657705 Sano et al. Dec 2003 B2
6661239 Ozick Dec 2003 B1
6671592 Bisset et al. Dec 2003 B1
7155308 Jones Dec 2006 B2
20010022506 Peless et al. Sep 2001 A1
20010047895 De Fazio et al. Dec 2001 A1
20020025472 Komori et al. Feb 2002 A1
20020060542 Song et al. May 2002 A1
20020063775 Taylor May 2002 A1
20020091466 Song et al. Jul 2002 A1
20020112899 Dijksman et al. Aug 2002 A1
20020120364 Colens Aug 2002 A1
20020140633 Rafii et al. Oct 2002 A1
20020153855 Song et al. Oct 2002 A1
20030030398 Jacobs et al. Feb 2003 A1
20030039171 Chiapetta Feb 2003 A1
20030060928 Abramson et al. Mar 2003 A1
20030076484 Bamji et al. Apr 2003 A1
20030120389 Abramson et al. Jun 2003 A1
20030192144 Song et al. Oct 2003 A1
20030208304 Peless et al. Nov 2003 A1
20030229421 Chmura et al. Dec 2003 A1
20040088079 Lavarec et al. May 2004 A1
Foreign Referenced Citations (19)
Number Date Country
1 133 537 Jul 2003 EP
2 344 747 Jun 2000 GB
2 344 748 Jun 2000 GB
2 352 486 Jan 2001 GB
2 355 523 Apr 2001 GB
2 369 511 May 2002 GB
05046246 Feb 1993 JP
WO 9113319 Sep 1991 WO
WO 0036968 Jun 2000 WO
WO 0036970 Jun 2000 WO
WO 0038255 Jun 2000 WO
WO 0073868 Dec 2000 WO
WO 0101208 Jan 2001 WO
WO 0128400 Apr 2001 WO
WO 02067744 Sep 2002 WO
WO 02075469 Sep 2002 WO
WO 03031285 Apr 2003 WO
WO 03062937 Jul 2003 WO
WO 03104909 Dec 2003 WO
Related Publications (1)
Number Date Country
20040220698 A1 Nov 2004 US
Provisional Applications (6)
Number Date Country
60454934 Mar 2003 US
60518756 Nov 2003 US
60518763 Nov 2003 US
60526868 Dec 2003 US
60527021 Dec 2003 US
60526805 Dec 2003 US