Robot confinement

Abstract
A robot confinement system includes a portable housing and a mobile robot. The portable housing includes a first detector operable to detect a presence of the mobile robot in a field of detection, and an emitter operable to emit a first signal when the first detector detects the presence of the mobile robot in the field of detection. The mobile robot is operable to move on a surface to clean the surface and includes a controller operable to control a movement path of the mobile robot on the surface. The mobile robot further includes a second detector operable to detect the first signal emitted by the portable housing. The controller of the mobile robot is operable to change the movement path of the mobile robot in response to detection of the first signal.
Description
BACKGROUND OF THE INVENTION

The invention relates to a method and system for robot localization and confinement.


There have been many systems proposed in the prior art for confining a robot to a specific physical space for the purpose of performing work. These systems are typically designed for any number of robotic applications such as lawn care, floor cleaning, inspection, transportation, and entertainment, where it is desired to have a robot operate in a confined area for performing work over time.


By way of example, a vacuuming robot working in one room may unintentionally wander from one room to another room before satisfactorily completing the vacuuming of the first room. One solution is to confine the robot to the first room by closing all doors and physically preventing the robot from leaving the first room. In many houses, however, open passageways often separate rooms, and doors or other physical barriers cannot easily be placed in the robot's exit path. Likewise, a user may desire to only have the robot operate in a portion of a single open space and, therefore, letting the robot work in the entire room decreases efficiency.


It is therefore advantageous to have a means for confining the area in which a robot works.


One approach in the prior art is to provide sophisticated systems for navigation and orientation such that the robot travels along a predetermined path and/or monitors its current location against a map stored in memory. These systems require sophisticated hardware, such as precision sensors and significant computer memory and computational power, and typically do not adapt well to changes in the area in which the robot is working. Likewise, the robot cannot simply be taken from one building to another, or even from room to room, without significant reprogramming or training.


For example, the method disclosed in U.S. Pat. No. 4,700,427 (Knepper) requires a means for generating a path for the robot to travel, which can be either a manually-controlled teaching of the path or an automatic mapping function. If “the place of use is frequently changed” or the “rooms are modified,” large amounts of data memory are required in order to store information related to each location. Similarly, the method and system disclosed in U.S. Pat. No. 4,119,900 (Kremnitz) requires powerful computation and sensors to constantly ascertain the orientation of the robot in a given space. Other examples of robotic systems requiring inputted information about the space in which the robot is working include methods and systems shown in U.S. Pat. No. 5,109,566 (Kobayashi et al.) and U.S. Pat. No. 5,284,522 (Kobayashi et al.).


Similarly, certain prior art systems not only require the training or programming of the robot to the specifics of a particular space, but also require some preparation or alteration to the space in which the robot is to work. For example, U.S. Pat. No. 5,341,540 (Soupert et al.) discloses a system that, in a preferred embodiment, requires the robot to include a positioning system and requires that the area for the robot be set up with “marking beacons . . . placed at fixed reference points.” While this system can avoid an unknown obstacle and return to its preprogrammed path through signals from the beacons, the system requires both significant user set-up and on-board computational power.


Similar systems and methods containing one or more of the above-described disadvantages are disclosed in U.S. Pat. No. 5,353,224 (Lee et al.), U.S. Pat. No. 5,537,017 (Feiten et al.), U.S. Pat. No. 5,548,511 (Bancroft), and U.S. Pat. No. 5,634,237 (Paranjpe).


Yet another approach for confining a robot to a specified area involves providing a device defining the entire boundary of the area. For example, U.S. Pat. No. 6,300,737 (Bergvall et al.) discloses an electronic bordering system in which a cable is placed on or under the ground to separate the inner area from the outer area. Likewise, the system disclosed in U.S. Pat. No. 6,255,793 (Peless et al.) requires installation of a metallic wire through which electricity flows to define a border. While these systems provide an effective means for confinement, they are difficult to install, are not portable from room-to-room, and can be unsightly or a tripping hazard if not placed under ground or beneath carpeting. Equally important, such systems can be difficult to repair if the wire or other confinement device breaks, as the location of such breaks can be difficult to determine when the system is placed underground or under carpet.


The present invention provides a modified and improved system for confining a robot to a given space without the drawbacks of the prior art.


SUMMARY OF THE INVENTION

In accordance with the present invention a robot confinement system is disclosed comprising: a portable barrier signal transmitter, wherein said barrier signal is transmitted primarily along an axis, said axis defining a barrier; a mobile robot, where said mobile robot comprises means for turning in at least one direction, a barrier signal detector, and a control unit controlling said means for turning; whereby the control unit runs an algorithm for avoiding said barrier signal upon detection of said barrier signal, said algorithm comprising the step of turning the robot until said barrier signal is no longer detected.


Accordingly, the present invention has several objects and advantages.


It is an object of the invention to provide a simplified and portable system and method for confining a robot to a given area.


It is an object of the invention to provide a confinement system that does not require installation.


It is an object of the invention to provide a barrier system that can be set up intuitively and includes a means for visually indicating the barrier.


It is an additional object of the invention to provide a system such that a robot approaching the barrier from either side of the barrier will turn in such a way as to avoid crossing the barrier.


It is an object of the invention to provide a robot confinement system that operates regardless of the angle at which the robot approaches the barrier.


It is an additional object of a preferred embodiment of the invention to provide a system that is substantially impervious to the effects of sunlight, will not cause interference with other devices, and will not be subject to interference from other devices.


The preferred embodiment of the present invention is for a robotic, indoor cleaning device similar to the types disclosed in U.S. Pat. No. 4,306,329 (Yokoi), U.S. Pat. No. 5,293,955 (Lee), U.S. Pat. No. 5,369,347 (Yoo), U.S. Pat. No. 5,440,216 (Kim), U.S. Pat. No. 5,613,261 (Kawakami et al.), U.S. Pat. No. 5,787,545 (Colens), U.S. Pat. No. 5,815,880 (Nakanishi), and U.S. Pat. No. 6,076,226 (Reed). One of skill in the art will recognize that the present invention can be used in any number of robotic applications where confinement is desired. In addition, while the preferred embodiments described herein are for a robot without a navigation system, one of skill in the art will recognize the utility of the invention in applications using more sophisticated robots.


Other features and advantages of the invention will be apparent from the following detailed description, including the associated drawings, and from the claims.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A shows an embodiment of the robot confinement system according to the invention with the barrier signal transmitter in an unpowered state;



FIG. 1B shows an embodiment of the robot confinement system according to the invention with the barrier signal transmitter in a powered state;



FIG. 2A shows a schematic representation of a preferred embodiment of the barrier signal transmitter;



FIG. 2B shows a circuit diagram of a specific embodiment of the barrier signal transmitter;



FIG. 3A shows a side-view schematic representation of a mobile robot used in a preferred embodiment of the invention;



FIG. 3B shows a top-view schematic representation of a mobile robot used in a preferred embodiment of the invention;



FIG. 4 shows a side-view of a preferred embodiment of an omni-directional barrier signal detector;



FIG. 5 demonstrates a hardware block diagram of the robot shown in FIGS. 3A & 3B;



FIG. 6 shows a schematic representation of an alternative embodiment of the robot employing multiple barrier signal detectors;



FIGS. 7A & 7B are flow-chart illustrations of the barrier avoidance algorithm of a preferred embodiment of the invention;



FIGS. 8A-C are schematic illustrations of the system and method of a preferred embodiment of the present invention;



FIGS. 9A-B are schematic illustrations of the system and method of an alternative embodiment of the present invention.





DETAILED DESCRIPTION

Referring to FIGS. 1A & 1B, living room 10 is shown separated from dining room 12 by interior walls 14 & 15. The living room and/or dining room may contain various furnishings, for example, couch 16, television 17, buffet 18 and table and chairs 19.


The rooms also contain a mobile robot 20 and a barrier signal transmitting device 30, which for purposes of this specification is also called a robot confinement (or RCON) transmitter 30. In FIGS. 1A & 1B, the robot is placed in the living room 10, and the RCON transmitter 30 is placed in the area dividing the living room 10 from the dining room 12, against interior wall 14 and pointing toward interior wall 15.


As described in more detail herein, FIG. 1B shows the same configuration of rooms with the RCON transmitter 30 in a powered state emitting, e.g., an infrared beam 42 from the RCON transmitter 30 toward interior wall 15. The beam 42 is directed primarily along an axis to create a boundary or barrier between living room 10 and dining room 12.


The system and method described herein each rely on a portable RCON transmitting unit 30 and a mobile robot 20. Each of these elements is first described independently, then the operation of a preferred embodiment of the invention is discussed.


RCON Transmitter



FIG. 2A illustrates a preferred embodiment of the RCON transmitter 30. The RCON transmitter 30 includes a first infrared emitter 32, a second infrared emitter 34, a power switch 36, and variable power-setting knob 38. The RCON transmitter enclosure 31 also houses the batteries (not shown) and necessary electronics for the various components. FIG. 2B shows a circuit diagram for the necessary electronics for an embodiment of the RCON transmitter 30. Other embodiments may use other conventional power sources.


In the embodiment shown in FIG. 2A, a user would turn on the RCON transmitter 30 using power switch 36 at the same time as the robot 20 begins operation. The user can also select a variable power using knob 38. In other embodiments, any number of known input devices can be used to turn on the unit and/or select a power setting, such as keypads, toggle switches, etc. A higher power can be used to provide a longer barrier useful for dividing a single room, while a lower power setting can be used to provide a barrier for a single doorway. Because of the reflective properties of various materials such as walls painted white, it is preferable to limit the power of the RCON transmitter 30 to the minimum necessary to provide the desired barrier.


In alternative embodiments, the RCON transmitter's power may be automatically turned off after a predetermined amount of time in order to preserve battery life.


In alternative embodiments, a control system can be used to turn on and turn off one or more RCON transmitters and/or robots in order to allow automatic cleaning of multiple rooms or spaces in a controlled manner. For example, a “smart house” control system might communicate directly with one or more RCON transmitters allowing a cycling of work spaces. In the alternative, the robot 20 might send a signal to the RCON to turn it on.


In the preferred embodiment, two infrared emitters 32 & 34 are used. The first IR emitter 32—the primary emitter—is powered to provide a directed barrier 42 of a given length from the RCON transmitter 30. In this embodiment, the beam 42 is a modulated, narrow IR beam. In the preferred embodiment, a collimated IR emitter is used such as Waitrony p/n IE-320H. The specifics of the emitter(s) are left to one of skill in the art; however, as explained in detail below, the beam 42 must have sufficient width. It is preferred that the minimum beam width be greater than the turning radius of the detector on a particular robot.


The second IR emitter 34—the secondary emitter—is powered to provide a diffuse region 44 near the RCON transmitter 30 to prevent robot 20 from crossing the beam 42 in its most narrow region closest to the RCON transmitter 30 and, in addition, prevents robot 20 from coming into direct contact with the RCON transmitter 30. In the preferred embodiment, a lens identical to the lens portion of the RCON detector, described below, is used for the secondary emitter 34. In other embodiments, a single active emitter operatively connected to appropriate optics can be used to create multiple emission points, including the two emitter system disclosed herein.


Because of potential interference from sunlight and other IR sources, most IR devices, such as remote controls, personal digital assistants and other IR communication devices, modulate the emitted signal. Herein, the emitters 32 & 34 modulate the beam at 38 kHz. In addition, IR devices modulate the beam to provide a serial bit stream to the unit being controlled to tell it what to do. In an embodiment of the present invention, additional modulation of the beam at a frequency, for example 500 Hz, different from the frequency of common IR bit streams prevents interference with other IR equipment.
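By way of illustration only, the double-modulation scheme described above can be sketched in Python as follows. The 38 kHz carrier and 500 Hz envelope come from the specification; the function names, 50% duty cycles, and integer-microsecond timing are assumptions made for the sketch, not details of the disclosed circuitry.

```python
# Sketch: a 38 kHz IR carrier gated by a slower 500 Hz "signature" envelope,
# so a receiver can distinguish the confinement beam from ordinary
# remote-control bit streams. Times are integer microseconds to keep the
# arithmetic exact. Duty cycles are illustrative assumptions.

CARRIER_PERIOD_US = 26      # ~38 kHz carrier (1/38000 s is about 26 us)
ENVELOPE_PERIOD_US = 2000   # 500 Hz signature envelope

def emitter_on(t_us):
    """1 if the IR LED is lit at time t_us: carrier gated by the envelope."""
    carrier = t_us % CARRIER_PERIOD_US < CARRIER_PERIOD_US // 2
    envelope = t_us % ENVELOPE_PERIOD_US < ENVELOPE_PERIOD_US // 2
    return int(carrier and envelope)

def receiver_output(t_us):
    """A standard 38 kHz IR receiver module strips the carrier, leaving the envelope."""
    return int(t_us % ENVELOPE_PERIOD_US < ENVELOPE_PERIOD_US // 2)

def envelope_periods(window_us, step_us=10):
    """Time the rising edges of the receiver output; constant 2000 us gaps
    reveal the 500 Hz confinement signature."""
    edges, prev = [], 0
    for t in range(0, window_us, step_us):
        cur = receiver_output(t)
        if cur and not prev:
            edges.append(t)
        prev = cur
    return [b - a for a, b in zip(edges, edges[1:])]
```

A detector firmware checking for steady 2000 us edge spacing would reject remote-control traffic, whose burst timing varies with the bits being sent.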


While the preferred embodiment uses an infrared signal, the system and method of the present invention can use other forms of electromagnetic energy to accomplish the same goals, including radio waves, X-rays, microwaves, etc. Many of these types of waves have significant drawbacks. For example, radio waves are more difficult and expensive to make directional, and visible light suffers from interference from many sources and may be distracting to users. Sound waves could also be used, but they are similarly difficult to make purely directional and tend to scatter and reflect more.


Robot


As shown in FIGS. 3A & 3B, in the preferred embodiment, the robot 20 comprises a substantially circular shell 21 mounted to a chassis containing two wheels 22 & 23 mounted on opposite sides of a center line, wherein each of the wheels 22 & 23 can be independently driven to allow the robot to turn. In the preferred embodiment, the wheels are mounted in such a manner as to allow the robot to turn substantially in place. The preferred embodiment of the robot 20 also comprises motors 24, cleaning mechanism 25, rechargeable battery 26, microprocessor 27, and various tactile and optical sensors 28.


FIG. 5 illustrates a hardware block diagram of a robot similar to the one shown in FIGS. 3A & 3B. The hardware is built around a Winbond W78XXX Series 8-bit processor. The processor is controlled by software stored in ROM. The system shown in FIG. 5 includes various control functions and motor drivers, along with various sensors (e.g. physical bump sensors, cliff sensors, the RCON detector/sensor).


For the instant invention, the robot also has an RCON detector 50, which in the preferred embodiment is a standard IR receiver module, which comprises a photodiode and related amplification and detection circuitry, mounted below an omni-directional lens, where omni-directional refers to a single plane. In a preferred embodiment, the IR receiver module is East Dynamic Corporation p/n IRM-8601S. However, any IR receiver module, regardless of modulation or peak detection wavelength, can be used as long as the RCON emitter is also changed to match the receiver. As shown in FIGS. 3A & 3B, the RCON detector is mounted at the highest point on the robot 20 and toward the front of the robot as defined by the primary traveling direction of the robot, as indicated by an arrow in FIG. 3B.


While the RCON detector should be mounted at the highest point of the robot in order to avoid shadows, it is desirable in certain applications to minimize the height of the robot 20 and/or the RCON detector 50 to prevent operational difficulties and to allow the robot 20 to pass under furniture or other obstacles. In certain embodiments, the RCON detector 50 can be spring mounted to allow the detector to collapse into the body of the robot when the robot runs under a solid overhanging object.



FIG. 4 shows in detail the preferred embodiment of the RCON detector 50. The RCON detector 50 includes a lens 52 that admits the barrier signal (or rays) 42 from all directions through the outer lens wall 54 and focuses the rays at IR detector 55. At the same time, the method and systems of the present invention are likely to be used in the presence of sunlight. Because direct sunlight can easily saturate the IR detector 55, efforts may be made to exclude sunlight from the RCON detector 50. Therefore, in the preferred embodiment, opaque plastic horizontal plate 57 is used, which is supported by post 58.


The lens 52 used in the preferred embodiment is a primarily cylindrical device designed to accept rays perpendicular to the axis of the lens and to reject rays substantially above or substantially below the plane perpendicular to the axis of the lens. The lens focuses horizontal rays primarily on IR detector 55 mounted below the lens.


In the preferred embodiment, the geometry of the lens is determined by rotating a parabola about its focus, where the focus is collocated with the active element of the receiver 55. The inner lens wall 53 is thereby defined by the swept parabola. The rays are reflected by the phenomenon called total internal reflection, defined here by the discontinuity between the lens material and the material internal to the inner lens wall 53. The preferred embodiment is constructed of clear polycarbonate chosen for its low cost and index of refraction.


The omni-directional nature of the RCON detector 50 allows a system with only a single RCON detector 50 to function equally well regardless of the angle of incident radiation from the RCON transmitter. If the RCON detector 50 is insensitive to the beams 42 & 44 from certain angles, then the robot 20 can break through the confining beams 42 & 44 when the robot 20 approaches the beam(s) such that the beam(s) occupies the RCON detector 50 blind spot.


In addition, in the preferred embodiment, the RCON transmitter 30 is battery powered. This imposes a high sensitivity requirement on the robot-mounted detector 50 in order to promote long battery life in the emitter 30. As such, the RCON detection system should be designed to gather as much IR as possible from the emitter(s).


The RCON detector of the preferred embodiment is designed to be triggered by modulated IR above a certain intensity threshold. If the IR levels are below the given threshold, the RCON detector registers no detection whatsoever and therefore triggers no specific control commands.


One of skill in the art will recognize that in alternative embodiments multiple RCON detectors 50 can be used. FIG. 6 illustrates such an embodiment using six side-mounted sensors 50. Each of the sensors should be oriented in a manner to have its field of view correspond to that of the single, top mounted sensor. Because a single, omni-directional RCON detector should be mounted at the highest point of the robot for optimal performance, it is possible to lower the profile of the robot by incorporating multiple detectors.


As disclosed above, the system and method of the present invention can be used with any number of robots existing in the prior art, including those designed for indoor cleaning applications.


Operation of System & Method


As shown in FIGS. 8A-C, an IR beam is used to divide the space (living room 10 and dining room 12) into two distinct areas. The robot has a sensor for detecting this beam 42 mounted at the robot's top front. As seen in FIG. 8B, whenever a measurable level of IR radiation strikes the detector the robot's IR avoidance behavior is triggered. In a preferred embodiment, this behavior causes the robot to spin in place to the left until the IR signal falls below detectable levels (FIG. 8C). The robot then resumes its previous motion. Spinning left is desired in certain systems because, by convention, the robot attempts to keep all objects to its right during following operations. The robot's confinement behavior is consistent with its other behaviors if it spins left on detecting the confining beam 42. In this embodiment, the IR sensor acts as a gradient detector. When the robot encounters a region of higher IR intensity the robot spins in place. Because the IR sensor is mounted at the front of the robot and because the robot does not move backward, the sensor always sees the increasing IR intensity before other parts of the robot. Thus spinning in place causes the sensor to translate to a region of decreased intensity. When the robot next moves forward, following the sensor, the robot necessarily moves to a region of decreased IR intensity—away from the beam.


In another preferred embodiment, the room confinement behavior works as a single behavior in a strictly priority based behavior system which controls the robot's motion. Each of the behaviors is assigned a priority, and the behavior with the highest priority requests control of the robot at any given time and has full control of the robot. These behaviors may include driving forward, turning when bumped, spiraling, etc. The confinement behavior is one of the highest priority behaviors. It requests control of the robot when the room confinement IR sensor has detected a signal from a room confinement transmitter.
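By way of illustration only, the strictly priority-based arbitration described above can be sketched in Python as follows. The behavior names follow the examples given in the specification, but the class structure, method names, and priority values are invented for the sketch.

```python
# Sketch: strictly priority-based behavior arbitration. Each behavior may
# request control each cycle; the highest-priority requester gets full
# control of the robot. Priorities and names are illustrative.

class Behavior:
    def __init__(self, name, priority):
        self.name = name
        self.priority = priority

    def wants_control(self, sensors):
        raise NotImplementedError
    # an act() method would command the motors; omitted from this sketch

class DriveForward(Behavior):
    def wants_control(self, sensors):
        return True  # default behavior: always willing to run

class TurnWhenBumped(Behavior):
    def wants_control(self, sensors):
        return sensors.get("bump", False)

class Confinement(Behavior):
    def wants_control(self, sensors):
        return sensors.get("rcon_detect", False)  # RCON beam detected

def arbitrate(behaviors, sensors):
    """Return the highest-priority behavior currently requesting control."""
    requesting = [b for b in behaviors if b.wants_control(sensors)]
    return max(requesting, key=lambda b: b.priority)

# Confinement sits near the top of the priority ordering, per the text.
behaviors = [DriveForward("drive", 0), TurnWhenBumped("bump", 5),
             Confinement("confine", 9)]
```

Because exactly one behavior wins each cycle, the winner has full control of the robot, matching the "strictly priority based" description above.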


A flow-chart of a preferred embodiment of the control logic of the confinement behavior is shown in FIG. 7A. The robot determines whether the RCON detector detects a signal (step 110). If a signal is detected, the robot chooses a turning direction (step 120). The robot then begins to turn in the chosen direction until the signal is no longer detected (step 130). Once the signal is no longer detected, the robot continues turning for an additional distance (step 140).
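The control flow of FIG. 7A can be sketched, for illustration, as the following Python loop. `FakeRobot` and its method names are invented stand-ins for the robot's actual sensing and motor primitives; only the four numbered steps come from the specification.

```python
# Sketch of FIG. 7A: detect the beam (step 110), choose a turning
# direction (step 120), turn until the beam is lost (step 130), then
# continue turning an additional amount (step 140).

class FakeRobot:
    """Test double standing in for the robot's motion/sensing primitives."""
    def __init__(self, steps_in_beam):
        self.steps_in_beam = steps_in_beam  # turn steps before the beam is lost
        self.turn_steps = 0
        self.extra_turn = 0

    def detect_signal(self):            # step 110: RCON detector output
        return self.turn_steps < self.steps_in_beam

    def choose_direction(self):         # step 120 (detailed in FIG. 7B)
        return +1                       # e.g. spin left

    def turn_step(self, direction):
        self.turn_steps += 1

    def turn_degrees(self, direction, degrees):
        self.extra_turn = degrees

def confinement_behavior(robot, extra_degrees=20):
    """Run steps 110-140; return True if the behavior took control."""
    if not robot.detect_signal():                    # step 110
        return False
    direction = robot.choose_direction()             # step 120
    while robot.detect_signal():                     # step 130: turn until lost
        robot.turn_step(direction)
    robot.turn_degrees(direction, extra_degrees)     # step 140: overshoot
    return True
```

The 20-degree default mirrors the additional turn the specification describes for the preferred embodiment; in practice the amount is left to the particular robot and application.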


In the preferred embodiment of step 120, the direction is chosen through the algorithm illustrated in the flow chart shown in FIG. 7B. The robot's control logic keeps track of the robot's discrete interactions with the beam. The robot first increments the counter by one (step 122). On odd numbered interactions, the robot chooses a new turning direction randomly (steps 124 & 126); on even numbered interactions, the robot again uses its most recent turning direction.
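The FIG. 7B direction-choice algorithm can be sketched as follows. The counter and the odd/even rule come from the specification; the class, the +1/-1 direction encoding, and the injectable random source are assumptions made so the sketch is self-contained and testable.

```python
# Sketch of FIG. 7B: count discrete interactions with the beam; on
# odd-numbered interactions pick a fresh random direction, on even-numbered
# interactions reuse the most recent direction.

import random

class DirectionChooser:
    def __init__(self, rng=random.random):
        self.count = 0
        self.last = +1      # +1 = left, -1 = right (arbitrary initial value)
        self.rng = rng      # injectable for deterministic testing

    def choose(self):
        self.count += 1                          # step 122: increment counter
        if self.count % 2 == 1:                  # steps 124 & 126: odd -> random
            self.last = +1 if self.rng() < 0.5 else -1
        return self.last                         # even -> reuse last direction
```

Reusing the previous direction on even interactions damps the back-and-forth dithering that a purely random choice can produce when the robot meets the beam repeatedly, while still avoiding the loops and end-of-beam bias of a fixed direction.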


In other embodiments, the robot can always turn a single direction or choose a direction randomly. When the robot always turns one direction, the robot may get stuck in a loop by turning away from the beam, bumping into another obstacle in a room, turning back toward the beam, seeing the beam again, turning away, bumping again, ad infinitum. Moreover, when the robot only turns in a single direction, it preferentially ends up at one end of the beam. Where the robot's task is to complete work evenly throughout a room, such as cleaning, a single turning direction is not optimal. If the direction is chosen purely randomly, the robot may turn back and forth quite a bit as it encounters the beam more than once.


In the preferred embodiment of step 140, the robot turns an additional 20 degrees from the point at which the signal is lost. The amount of the turn, which was selected arbitrarily in the preferred embodiment, is left to the particular robot and application. The additional turn prevents the robot from re-encountering the confinement beam immediately after exiting the beam. For various applications, the amount of additional movement (linear or turning) can be a predetermined distance or time, or in the alternative may include a random component.


In still other embodiments, the robot's avoidance behavior may include reversing the robot's direction until the beam 42 is no longer detected.


In other embodiments, the RCON detector is able to determine the gradient levels of the beam. This information can be used to send the robot in the direction of the lowest level of detection and prevent the situation where the robot is situated entirely within the beam and therefore turns in 360 degrees without the detector exiting the beam. In these embodiments, if the robot turns 360 degrees without exiting the beam, the control logic may give a higher priority to a “gradient behavior.” The gradient behavior divides the possible robot headings into a fixed number of angular bins, each bin covering an equal sweep of the angular area around the robot. The robot then turns at a constant rate while sampling the number of detections in each angular bin. (For a system using infrared signals, detection counts are monotonically related to the signal strength.) After the robot has rotated more than 360 degrees, the gradient behavior commands the robot to turn toward the angular bin with the lowest detection count. When the robot achieves the correct heading, the gradient behavior commands the robot to move forward a predetermined distance, for example one-half of the width of the robot, then control is released from the gradient behavior. If necessary, this process repeats until the robot has moved into a region where IR intensity is below the detection threshold.
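The angular-binning step of the gradient behavior can be sketched as follows. The full-rotation sweep, equal angular bins, and "lowest detection count" rule come from the text above; the bin count of eight and the function signature are illustrative assumptions.

```python
# Sketch of the gradient behavior's binning step: during one full rotation
# the robot records the heading at each IR detection, tallies detections
# into equal angular bins (detection count standing in for signal strength),
# and then heads for the center of the emptiest bin.

def lowest_intensity_heading(detection_angles, n_bins=8):
    """detection_angles: headings (degrees) at which IR was detected during
    one full rotation. Returns the center heading of the bin with the
    fewest detections."""
    bin_width = 360 / n_bins
    counts = [0] * n_bins
    for a in detection_angles:
        counts[int(a % 360 // bin_width)] += 1
    best = min(range(n_bins), key=lambda i: counts[i])
    return best * bin_width + bin_width / 2
```

After turning to the returned heading, the robot would move forward a predetermined distance (the text suggests one-half the robot's width) and repeat until the IR intensity falls below the detection threshold.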


One of skill in the art will recognize that the emitter/detector system can also be used to guide the robot in any number of ways. For example, the beam 42 could be used to allow the robot to perform work parallel to the edge of the beam, allowing, for example, the floor right up to the edge of the room confinement beam to be cleaned.


In an alternative embodiment of the present invention, the RCON transmitter may comprise both a signal emitter and a signal detector. As shown in FIG. 9A, the RCON transmitter 210 includes both a primary emitter 212 and a detector 214. The RCON transmitter 210 is placed at one end of the desired barrier and a retroreflector 230 is placed at the opposite end of the desired barrier. The retroreflector, which reflects the beam back toward the emitter regardless of the orientation of the retroreflector relative to the beam, can be constructed from, for example, standard bicycle reflectors. As shown in FIG. 9A, primary emitter 212 produces beam 242. A portion of beam 242 reflects from retroreflector 230 and is detected by detector 214.


In the embodiment shown in FIGS. 9A & 9B, the IR radiation emitted by the primary emitter 212 can be modulated in either of two ways constituting signal A or signal B. During normal operation, the beam 242 emitted from the primary emitter 212 is reflected by the retro-reflective material 230 back into the detector 214. When this is true the RCON transmitter broadcasts signal A, which is received by robot 220. As shown in FIG. 9B, if the robot or other object comes between the emitter 212 and the retro-reflective material 230 then no signal is returned to the receiver 214 and the RCON transmitter 210 broadcasts signal B, which is received by robot 220. The robot 220 then uses this information to improve its performance. The robot turns away from the beam as described previously only when the robot detects signal B. When the robot detects signal A no action is taken.
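The two-signal protocol of FIGS. 9A & 9B reduces to a small decision table, sketched below. The signal-A/signal-B roles come from the specification; the string constants and function names are invented for the sketch.

```python
# Sketch of the retroreflector embodiment's signaling: the transmitter
# broadcasts signal A while its own beam returns via the retroreflector,
# and signal B when the return path is blocked (i.e. something is on the
# barrier line). The robot turns away only on signal B.

SIGNAL_A = "A"  # beam path clear: reflection reaches detector 214
SIGNAL_B = "B"  # beam path blocked: an object is crossing the barrier

def transmitter_broadcast(reflection_detected):
    return SIGNAL_A if reflection_detected else SIGNAL_B

def robot_should_turn(received_signal):
    """The robot avoids the barrier only when it is actually on the line."""
    return received_signal == SIGNAL_B
```

Because the robot itself is usually what blocks the beam, it reacts only when it is actually astride the barrier line, which is why this embodiment cleans closer to the line and tolerates partial occlusion by furniture.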


For certain applications, the embodiment shown in FIGS. 9A & 9B provides improved performance. For example, in cleaning application, the completeness of cleaning is improved because the robot tends to clean up to the line connecting the confinement device and the retro-reflective material. Also, this embodiment is more resistant to beam blockage. If furniture or other obstacles partially occlude the beam, the robot tends to turn away when it is further from crossing the beam. Finally, an indicator, such as an LED, can be added to the RCON transmitter to indicate when the device is functioning and correctly aimed.


In other embodiments, the RCON transmitter can be used to define an annular confinement region. For example, an RCON transmitter with two omni-directional emitters may be employed, wherein the first emitter would broadcast the standard modulated beam and the second emitter would emit radiation 180 degrees out of phase with the output of the first emitter, but with less power. The robot would be programmed to turn when the IR was not detected. As the robot gets further from the emitter, it would eventually lose the beam and turn back into it. As it gets closer, the radiation from the second emitter would jam the radiation from the first emitter, creating essentially unmodulated IR. The detector would fail to detect this, and the robot would again turn back into the annulus.


In yet another embodiment, the RCON transmitter can be used as a “home base.” For example, once the voltage of the robot's battery drops below a predetermined level, the robot can use the gradient detection behavior to home in on the RCON transmitter. This allows the user to easily find the robot when it has finished cleaning instead of it randomly ending up in corners, under furniture, etc.


Although the description above contains many specificities, these should not be construed as limiting the scope of the invention but as merely providing illustrations of some of the presently preferred embodiments of this invention.


Other embodiments of the invention are within the scope of the following claims.

Claims
  • 1. A robot confinement system, comprising: a portable housing, including: a first detector operable to detect a presence of a mobile robot in a field of detection; and an emitter operable to emit a first signal when the first detector detects the presence of the mobile robot in the field of detection; and the mobile robot, including: a shell; a chassis including at least two wheels; at least one motor connected to the at least two wheels for moving the mobile robot on a surface; a cleaner operable to clean the surface as the mobile robot moves on the surface; a controller operable to control the at least one motor to control a movement path of the mobile robot on the surface; and a second detector operable to detect the first signal, wherein the controller is operable to change the movement path of the mobile robot in response to detection of the first signal, and wherein the mobile robot is operable to generate an indication that is detectable by the first detector, and wherein the first detector is operable to detect the indication for detecting the presence of the mobile robot in the field of detection.
  • 2. The robot confinement system as set forth in claim 1, wherein the first detector is operable to detect a modulated signal for detecting the presence of the mobile robot in the field of detection.
  • 3. The robot confinement system as set forth in claim 1, wherein the field of detection extends generally linearly from the portable housing on the surface.
  • 4. The robot confinement system as set forth in claim 3, wherein the portable housing includes a power selector operable to variably set a length of the field of detection.
  • 5. The robot confinement system as set forth in claim 3, wherein the controller is operable to change the movement path of the mobile robot to prevent the mobile robot from crossing the field of detection in response to detection of the first signal.
  • 6. The robot confinement system as set forth in claim 1, wherein the portable housing is operable to emit a second signal that is detectable by the mobile robot when the mobile robot is within a predetermined distance of the portable housing, the second detector is operable to detect the second signal, and the controller is operable to change the movement path of the mobile robot to prevent the mobile robot from physically contacting the portable housing in response to detection of the second signal.
  • 7. The robot confinement system as set forth in claim 6, wherein the portable housing includes a second emitter that emits the second signal.
  • 8. The robot confinement system as set forth in claim 1, wherein the emitter is operable to automatically turn off after a predetermined time period.
  • 9. The robot confinement system as set forth in claim 1, wherein the portable housing includes a receiver operable to receive a smart house signal that changes an on/off status of the portable housing.
  • 10. The robot confinement system as set forth in claim 1, wherein the mobile robot further includes: a bump sensor operable to detect a physical contact with the shell as the mobile robot moves on the surface; and a cliff sensor operable to detect a falling edge of the surface as the mobile robot moves toward the falling edge, wherein the controller is operable to change the movement path of the mobile robot in response to detection of the physical contact and in response to detection of the falling edge.
  • 11. A method for confining a mobile robot with a portable housing, the method comprising: moving the mobile robot along a movement path on a surface and cleaning the surface with the mobile robot; detecting, with the portable housing, a presence of the mobile robot in a field of detection; emitting, with the portable housing, a first signal in response to detection of the presence of the mobile robot in the field of detection; detecting, with the mobile robot, the first signal emitted by the portable housing; changing, with the mobile robot, the movement path of the mobile robot in response to detection of the first signal; and generating, with the mobile robot, an indication that is detectable by the portable housing when the mobile robot is in the field of detection, wherein the portable housing detects the indication to detect the presence of the mobile robot in the field of detection.
  • 12. The method as set forth in claim 11, wherein the robot detects a modulated signal to detect the presence of the mobile robot in the field of detection.
  • 13. The method as set forth in claim 11, wherein the field of detection extends generally linearly from the portable housing on the surface.
  • 14. The method as set forth in claim 13, further comprising: variably setting, with the portable housing, a length of the field of detection.
  • 15. The method as set forth in claim 13, wherein the mobile robot changes the movement path to prevent the mobile robot from crossing the field of detection in response to detection of the first signal.
  • 16. The method as set forth in claim 11, further comprising: emitting, with the portable housing, a second signal that is detectable by the mobile robot when the mobile robot moves within a predetermined distance of the portable housing; detecting, with the mobile robot, the second signal; and changing, with the mobile robot, the movement path of the mobile robot to prevent the mobile robot from physically contacting the portable housing in response to detection of the second signal.
  • 17. The method as set forth in claim 13, further comprising: changing, with the portable housing, an on/off status of the portable housing in response to a predetermined condition, wherein the predetermined condition occurs when at least one of a predetermined time period lapses and a smart house signal is received by the portable housing.
  • 18. The method as set forth in claim 13, further comprising: detecting, with the mobile robot, a physical contact with a shell of the mobile robot; detecting, with the mobile robot, a falling edge of the surface as the mobile robot moves toward the falling edge; and changing, with the mobile robot, the movement path in response to detection of one of the physical contact and the falling edge.
  • 19. A method for confining a mobile robot with a portable housing, the method comprising: placing the mobile robot in a first area for cleaning the first area; placing the portable housing between the first area and a second area; setting a field of detection of the portable housing to extend generally linearly between the first area and the second area; variably setting a length of the field of detection to separate the first area from the second area; controlling the portable housing to detect a presence of the mobile robot in the field of detection; and controlling the mobile robot to automatically move in the first area and to prevent movement into the second area.
  • 20. The method as set forth in claim 19, wherein the portable housing emits a signal in response to detection of the presence of the mobile robot in the field of detection, and the mobile robot detects the signal and changes a movement path to prevent the movement into the second area.
  • 21. The method as set forth in claim 19, wherein the portable housing is operable to detect a modulated signal to detect the presence of the mobile robot in the field of detection.
  • 22. The method as set forth in claim 19, wherein the mobile robot is operable to generate an indication that is detectable by the portable housing when the mobile robot moves in the field of detection, and the portable housing is operable to detect the indication for detecting the presence of the mobile robot in the field of detection.
  • 23. The method as set forth in claim 19, further comprising: controlling the portable housing to emit a signal that is detectable by the mobile robot when the mobile robot moves within a predetermined distance of the portable housing, wherein the mobile robot is operable to detect the signal and change a movement path of the mobile robot to prevent the mobile robot from physically contacting the portable housing in response to detection of the signal.
  • 24. The method as set forth in claim 19, further comprising: controlling the portable housing to automatically turn off after a predetermined time period.
  • 25. The method as set forth in claim 19, further comprising: controlling the portable housing to receive a smart house signal that changes an on/off status of the portable housing.
  • 26. The method as set forth in claim 19, wherein the mobile robot includes a bump sensor operable to detect a physical contact, the mobile robot includes a cliff sensor operable to detect a falling edge in the first area, and the mobile robot is operable to change a movement path of the mobile robot in response to detection of one of the physical contact and the falling edge.
  • 27. A robot confinement system, comprising: a portable housing, including: a first detector operable to detect a presence of a mobile robot in a field of detection; an emitter operable to emit a first signal when the first detector detects the presence of the mobile robot in the field of detection; and a receiver operable to receive a smart house signal that changes an on/off status of the portable housing; the mobile robot, including: a shell; a chassis including at least two wheels; at least one motor connected to the at least two wheels for moving the mobile robot on a surface; a cleaner operable to clean the surface as the mobile robot moves on the surface; a controller operable to control the at least one motor to control a movement path of the mobile robot on the surface; and a second detector operable to detect the first signal, wherein the controller is operable to change the movement path of the mobile robot in response to detection of the first signal.
  • 28. A method for confining a mobile robot with a portable housing, the method comprising: moving the mobile robot along a movement path on a surface and cleaning the surface with the mobile robot; detecting, with the portable housing, a presence of the mobile robot in a field of detection; emitting, with the portable housing, a first signal in response to detection of the presence of the mobile robot in the field of detection; detecting, with the mobile robot, the first signal emitted by the portable housing; changing, with the mobile robot, the movement path of the mobile robot in response to detection of the first signal; and changing, with the portable housing, an on/off status of the portable housing in response to a predetermined condition, wherein the predetermined condition occurs when at least one of a predetermined time period lapses and a smart house signal is received by the portable housing.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application for U.S. patent is a continuation of U.S. patent application Ser. No. 12/540,564 filed Aug. 13, 2009, which is a continuation of U.S. patent application Ser. No. 11/929,558 filed Oct. 30, 2007, now U.S. Pat. No. 7,579,803, which is a continuation of U.S. patent application Ser. No. 11/691,735 filed Mar. 27, 2007, which is a continuation of U.S. patent application Ser. No. 11/221,392 filed Sep. 8, 2005, now U.S. Pat. No. 7,196,487, which is a continuation of U.S. patent application Ser. No. 10/921,775 filed Aug. 19, 2004, now U.S. Pat. No. 6,965,209, which is a continuation of U.S. patent application Ser. No. 10/696,456 filed Oct. 29, 2003, now U.S. Pat. No. 6,781,338, which is a divisional of U.S. patent application Ser. No. 10/056,804 filed Jan. 24, 2002, now U.S. Pat. No. 6,690,134, which claims the benefit of U.S. Provisional Application No. 60/263,692 filed Jan. 24, 2001, the contents of all of which are expressly incorporated by reference herein in their entireties.

US Referenced Citations (345)
Number Name Date Kind
2136324 Louis Nov 1938 A
3457575 Bienek Jul 1969 A
3550714 Bellinger Dec 1970 A
3674316 De Brey Jul 1972 A
3937174 Haaga Feb 1976 A
4099284 Shinozaki et al. Jul 1978 A
4119900 Kremnitz Oct 1978 A
4306329 Yokoi Dec 1981 A
4369543 Chen et al. Jan 1983 A
4513469 Godfrey et al. Apr 1985 A
4556313 Miller, Jr. et al. Dec 1985 A
4626995 Lofgren et al. Dec 1986 A
4674048 Okumura Jun 1987 A
4679152 Perdue Jul 1987 A
4696074 Cavalli Sep 1987 A
4700301 Dyke Oct 1987 A
4700427 Knepper Oct 1987 A
4716621 Zoni Jan 1988 A
4733430 Westergren Mar 1988 A
4733431 Martin Mar 1988 A
4756049 Uehara Jul 1988 A
4777416 George, II et al. Oct 1988 A
4782550 Jacobs Nov 1988 A
4811228 Hyyppa Mar 1989 A
4815157 Tsuchiya Mar 1989 A
4854000 Takimoto Aug 1989 A
4887415 Martin Dec 1989 A
4893025 Lee Jan 1990 A
4901394 Nakamura et al. Feb 1990 A
4912643 Beirxe Mar 1990 A
4918441 Bohman Apr 1990 A
4919224 Shyu et al. Apr 1990 A
4920605 Takashima May 1990 A
4933864 Evans, Jr. et al. Jun 1990 A
4956891 Wulff Sep 1990 A
4962453 Pong et al. Oct 1990 A
4974283 Holsten et al. Dec 1990 A
5002145 Waqkaumi et al. Mar 1991 A
5032775 Mizuno et al. Jul 1991 A
5086535 Grossmeyer et al. Feb 1992 A
5093955 Blehert et al. Mar 1992 A
5105502 Takashima Apr 1992 A
5109566 Kobayashi et al. May 1992 A
5115538 Cochran et al. May 1992 A
5136750 Takashima et al. Aug 1992 A
5142985 Stearns et al. Sep 1992 A
5144715 Matsuyo et al. Sep 1992 A
5152028 Hirano Oct 1992 A
5163202 Kawakami et al. Nov 1992 A
5165064 Mattaboni Nov 1992 A
5182833 Yamaguchi et al. Feb 1993 A
5204814 Noonan et al. Apr 1993 A
5208521 Aoyama May 1993 A
5216777 Moro et al. Jun 1993 A
5233682 Abe et al. Aug 1993 A
5239720 Wood et al. Aug 1993 A
5251358 Moro et al. Oct 1993 A
5261139 Lewis Nov 1993 A
5279672 Betker et al. Jan 1994 A
5284522 Kobayashi et al. Feb 1994 A
5293955 Lee Mar 1994 A
5303448 Hennessey et al. Apr 1994 A
5309592 Hiratsuka May 1994 A
5319828 Waldhauser et al. Jun 1994 A
5321614 Ashworth Jun 1994 A
5324948 Dudar et al. Jun 1994 A
5341540 Soupert et al. Aug 1994 A
5353224 Lee et al. Oct 1994 A
5369347 Yoo Nov 1994 A
5410479 Coker Apr 1995 A
5440216 Kim Aug 1995 A
5444965 Colens Aug 1995 A
5446356 Kim Aug 1995 A
5454129 Kell Oct 1995 A
5455982 Armstrong et al. Oct 1995 A
5465525 Mifune et al. Nov 1995 A
5467273 Faibish et al. Nov 1995 A
5471391 Gudat et al. Nov 1995 A
5497529 Boesi Mar 1996 A
5507067 Hoekstra et al. Apr 1996 A
5515572 Hoekstra et al. May 1996 A
5534762 Kim Jul 1996 A
5537017 Feiten et al. Jul 1996 A
5539953 Kurz Jul 1996 A
5542146 Hoekstra et al. Aug 1996 A
5548511 Bancroft Aug 1996 A
5553349 Kilstrom et al. Sep 1996 A
5555587 Guha Sep 1996 A
5560077 Crotchett Oct 1996 A
5568589 Hwang Oct 1996 A
5611106 Wulff Mar 1997 A
5611108 Knowlton et al. Mar 1997 A
5613261 Kawakami et al. Mar 1997 A
5621291 Lee Apr 1997 A
5622236 Azumi et al. Apr 1997 A
5634237 Paranjpe Jun 1997 A
5634239 Tuvin et al. Jun 1997 A
5636402 Kubo et al. Jun 1997 A
5650702 Azumi Jul 1997 A
5652489 Kawakami Jul 1997 A
5682313 Edlund et al. Oct 1997 A
5682839 Grimsley et al. Nov 1997 A
5696675 Nakamura et al. Dec 1997 A
5709007 Chiang Jan 1998 A
5714119 Kawagoe et al. Feb 1998 A
5717484 Hamaguchi et al. Feb 1998 A
5720077 Nakamura et al. Feb 1998 A
5735959 Kubo et al. Apr 1998 A
5761762 Kubo Jun 1998 A
5781960 Kilstrom et al. Jul 1998 A
5787545 Colens Aug 1998 A
5794297 Muta Aug 1998 A
5812267 Everett, Jr. et al. Sep 1998 A
5815880 Nakanishi Oct 1998 A
5819008 Asama et al. Oct 1998 A
5820821 Kawagoe et al. Oct 1998 A
5825981 Matsuda Oct 1998 A
5839156 Park et al. Nov 1998 A
5841259 Kim et al. Nov 1998 A
5867800 Leif Feb 1999 A
5869910 Colens Feb 1999 A
5903124 Kawakami May 1999 A
5926909 McGee Jul 1999 A
5935179 Kleiner et al. Aug 1999 A
5940927 Haegermarck et al. Aug 1999 A
5940930 Oh et al. Aug 1999 A
5942869 Katou et al. Aug 1999 A
5943730 Boomgaarden Aug 1999 A
5943733 Tagliaferri Aug 1999 A
5947225 Kawakami et al. Sep 1999 A
5959423 Nakanishi et al. Sep 1999 A
5974348 Rocks Oct 1999 A
5991951 Kubo et al. Nov 1999 A
5995884 Allen et al. Nov 1999 A
5998953 Nakamura et al. Dec 1999 A
6025687 Himeda et al. Feb 2000 A
6038501 Kawakami Mar 2000 A
6041471 Charky et al. Mar 2000 A
6076025 Ueno et al. Jun 2000 A
6108076 Hanseder Aug 2000 A
6112143 Allen et al. Aug 2000 A
6112996 Matsuo Sep 2000 A
6119057 Kawagoe Sep 2000 A
6124694 Bancroft et al. Sep 2000 A
6138063 Himeda Oct 2000 A
6142252 Kinto et al. Nov 2000 A
6226830 Hendriks et al. May 2001 B1
6240342 Fiegert et al. May 2001 B1
6255793 Peless et al. Jul 2001 B1
6259979 Holmquist Jul 2001 B1
6285930 Dickson et al. Sep 2001 B1
6300737 Bergvall et al. Oct 2001 B1
6321515 Colens Nov 2001 B1
6339735 Peless et al. Jan 2002 B1
6374155 Wallach et al. Apr 2002 B1
6385515 Dickson et al. May 2002 B1
6389329 Colens May 2002 B1
6408226 Byrne et al. Jun 2002 B1
6430471 Kintou et al. Aug 2002 B1
6438456 Feddema et al. Aug 2002 B1
6442476 Poropat Aug 2002 B1
6443509 Levin et al. Sep 2002 B1
6444003 Sutcliffe Sep 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6463368 Feiten et al. Oct 2002 B1
6465982 Bergvall et al. Oct 2002 B1
6493612 Bisset et al. Dec 2002 B1
6493613 Peless et al. Dec 2002 B2
6496754 Song et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6507773 Parker et al. Jan 2003 B2
6525509 Petersson et al. Feb 2003 B1
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6548982 Papanikolopoulos et al. Apr 2003 B1
6553612 Dyson et al. Apr 2003 B1
6571415 Gerber et al. Jun 2003 B2
6574536 Kawagoe et al. Jun 2003 B1
6580246 Jacobs Jun 2003 B2
6584376 Van Kommer Jun 2003 B1
6594844 Jones Jul 2003 B2
6601265 Burlington Aug 2003 B1
6604021 Imai et al. Aug 2003 B2
6604022 Parker et al. Aug 2003 B2
6605156 Clark et al. Aug 2003 B1
6611120 Song et al. Aug 2003 B2
6611734 Parker et al. Aug 2003 B2
6611738 Ruffner Aug 2003 B2
6615108 Peless et al. Sep 2003 B1
6658693 Reed, Jr. Dec 2003 B1
6661239 Ozick Dec 2003 B1
6671592 Bisset et al. Dec 2003 B1
6690134 Jones et al. Feb 2004 B1
6741054 Koselka et al. May 2004 B2
6748297 Song et al. Jun 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6764373 Osawa et al. Jul 2004 B1
6774596 Bisset Aug 2004 B1
6781338 Jones et al. Aug 2004 B2
6809490 Jones et al. Oct 2004 B2
6830120 Yashima et al. Dec 2004 B1
6841963 Song et al. Jan 2005 B2
6845297 Allard Jan 2005 B2
6865447 Lau et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6883201 Jones et al. Apr 2005 B2
6901624 Mori et al. Jun 2005 B2
D510066 Hickey et al. Sep 2005 S
6938298 Aasen Sep 2005 B2
6940291 Ozick Sep 2005 B1
6956348 Landry et al. Oct 2005 B2
6965209 Jones et al. Nov 2005 B2
6971140 Kim Dec 2005 B2
6999850 McDonald Feb 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7024280 Parker et al. Apr 2006 B2
7053578 Diehl et al. May 2006 B2
7055210 Keppler et al. Jun 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7079923 Abramson et al. Jul 2006 B2
7085624 Aldred et al. Aug 2006 B2
7133746 Abramson et al. Nov 2006 B2
7155308 Jones Dec 2006 B2
7196487 Jones et al. Mar 2007 B2
7201786 Wegelin et al. Apr 2007 B2
7206677 Hulden Apr 2007 B2
7225500 Diehl et al. Jun 2007 B2
7246405 Yan Jul 2007 B2
7248951 Hulden Jul 2007 B2
7318248 Yan Jan 2008 B1
7324870 Lee Jan 2008 B2
7352153 Yan Apr 2008 B2
7359766 Jeon et al. Apr 2008 B2
7408157 Yan Aug 2008 B2
7418762 Arai et al. Sep 2008 B2
7515991 Egawa et al. Apr 2009 B2
7567052 Jones et al. Jul 2009 B2
7568259 Yan Aug 2009 B2
7579803 Jones et al. Aug 2009 B2
7600521 Woo Oct 2009 B2
7647144 Haegermarck Jan 2010 B2
7650666 Jang Jan 2010 B2
7660650 Kawagoe et al. Feb 2010 B2
7693605 Park Apr 2010 B2
7720554 DiBernardo et al. May 2010 B2
7801645 Taylor et al. Sep 2010 B2
7805220 Taylor et al. Sep 2010 B2
7920941 Park et al. Apr 2011 B2
7937800 Yan May 2011 B2
7957836 Myeong et al. Jun 2011 B2
8368339 Jones et al. Feb 2013 B2
20010004719 Sommer Jun 2001 A1
20010013929 Torsten Aug 2001 A1
20010020200 Das et al. Sep 2001 A1
20010025183 Shahidi Sep 2001 A1
20010037163 Allard Nov 2001 A1
20010043509 Green et al. Nov 2001 A1
20010045883 Holdaway et al. Nov 2001 A1
20010047231 Peless et al. Nov 2001 A1
20010047895 De Fazio et al. Dec 2001 A1
20020011367 Kolesnik Jan 2002 A1
20020011813 Koselka et al. Jan 2002 A1
20020016649 Jones Feb 2002 A1
20020021219 Edwards Feb 2002 A1
20020027652 Paromtchik et al. Mar 2002 A1
20020036779 Kiyoi et al. Mar 2002 A1
20020081937 Yamada et al. Jun 2002 A1
20020095239 Wallach et al. Jul 2002 A1
20020097400 Jung et al. Jul 2002 A1
20020104963 Mancevski Aug 2002 A1
20020108209 Peterson Aug 2002 A1
20020112742 Bredo et al. Aug 2002 A1
20020113973 Ge Aug 2002 A1
20020116089 Kirkpatrick, Jr. Aug 2002 A1
20020120364 Colens Aug 2002 A1
20020124343 Reed Sep 2002 A1
20020153185 Song et al. Oct 2002 A1
20020156556 Ruffner Oct 2002 A1
20020159051 Guo Oct 2002 A1
20020166193 Kasper Nov 2002 A1
20020169521 Goodman et al. Nov 2002 A1
20020173877 Zweig Nov 2002 A1
20020189871 Won Dec 2002 A1
20030019071 Field et al. Jan 2003 A1
20030023356 Keable Jan 2003 A1
20030025472 Jones et al. Feb 2003 A1
20030060928 Abramson et al. Mar 2003 A1
20030120389 Abramson et al. Jun 2003 A1
20030137268 Papanikolopoulos et al. Jul 2003 A1
20030192144 Song et al. Oct 2003 A1
20030216834 Allard Nov 2003 A1
20030233177 Johnson et al. Dec 2003 A1
20040020000 Jones Feb 2004 A1
20040030448 Solomon Feb 2004 A1
20040030449 Solomon Feb 2004 A1
20040030450 Solomon Feb 2004 A1
20040030571 Solomon Feb 2004 A1
20040031113 Wosewick et al. Feb 2004 A1
20040049877 Jones et al. Mar 2004 A1
20040068351 Solomon Apr 2004 A1
20040068415 Solomon Apr 2004 A1
20040068416 Solomon Apr 2004 A1
20040074044 Diehl et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040088079 Lavarec et al. May 2004 A1
20040111184 Chiappetta et al. Jun 2004 A1
20040134336 Solomon Jul 2004 A1
20040134337 Solomon Jul 2004 A1
20040156541 Jeon et al. Aug 2004 A1
20040158357 Lee et al. Aug 2004 A1
20040187457 Colens Sep 2004 A1
20040200505 Taylor et al. Oct 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040211444 Taylor et al. Oct 2004 A1
20040236468 Taylor et al. Nov 2004 A1
20040244138 Taylor et al. Dec 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050010331 Taylor et al. Jan 2005 A1
20050150519 Keppler et al. Jul 2005 A1
20050156562 Cohen et al. Jul 2005 A1
20050187678 Myeong et al. Aug 2005 A1
20050204717 Colens Sep 2005 A1
20050209736 Kawagoe Sep 2005 A1
20050213082 DiBernardo et al. Sep 2005 A1
20050235451 Yan Oct 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20060020369 Taylor et al. Jan 2006 A1
20060020370 Abramson Jan 2006 A1
20060061657 Rew et al. Mar 2006 A1
20060100741 Jung May 2006 A1
20060259194 Chiu Nov 2006 A1
20070017061 Yan Jan 2007 A1
20070028574 Yan Feb 2007 A1
20070032904 Kawagoe et al. Feb 2007 A1
20070179670 Chiappetta et al. Aug 2007 A1
20070213892 Jones et al. Sep 2007 A1
20070250212 Halloran et al. Oct 2007 A1
20070290649 Jones et al. Dec 2007 A1
20080015738 Casey et al. Jan 2008 A1
20080039974 Sandin et al. Feb 2008 A1
20080184518 Taylor et al. Aug 2008 A1
20080302586 Yan Dec 2008 A1
20090319083 Jones et al. Dec 2009 A1
20100063628 Landry et al. Mar 2010 A1
20100312429 Jones et al. Dec 2010 A1
Foreign Referenced Citations (199)
Number Date Country
10242257 Apr 2003 DE
102004038074 Jun 2005 DE
10357636 Jul 2005 DE
102004046813 Apr 2007 DE
0 792 726 Sep 1997 EP
1149333 Nov 2002 EP
1 331 537 Jul 2003 EP
1 380 245 Jan 2004 EP
1380246 Jan 2004 EP
1 557 730 Jul 2005 EP
1642522 Apr 2006 EP
1672455 Jun 2006 EP
2 828 589 Feb 2003 FR
2225221 May 1990 GB
2 283 838 May 1995 GB
2409966 Jul 2005 GB
59-33511 Mar 1984 JP
62-120510 Jun 1987 JP
62-154008 Jul 1987 JP
62-292126 Dec 1987 JP
63-183032 Jul 1988 JP
63-241610 Oct 1988 JP
02-006312 Jan 1990 JP
2-283343 Nov 1990 JP
03-051023 Mar 1991 JP
5-46246 Feb 1993 JP
5-84200 Apr 1993 JP
06-327598 Nov 1994 JP
07-129239 May 1995 JP
7-222705 Aug 1995 JP
7-281742 Oct 1995 JP
07-281752 Oct 1995 JP
07-295636 Nov 1995 JP
7-295638 Nov 1995 JP
07-313417 Dec 1995 JP
07-319542 Dec 1995 JP
7-334242 Dec 1995 JP
08-016241 Jan 1996 JP
08-016776 Jan 1996 JP
08-063229 Mar 1996 JP
08-089451 Apr 1996 JP
08-123548 May 1996 JP
8-152916 Jun 1996 JP
08-152916 Jun 1996 JP
08-256960 Oct 1996 JP
08-263137 Oct 1996 JP
08-286741 Nov 1996 JP
08-286744 Nov 1996 JP
08-286745 Nov 1996 JP
08-286747 Nov 1996 JP
08-322774 Dec 1996 JP
08-335112 Dec 1996 JP
09-047413 Feb 1997 JP
09-066855 Mar 1997 JP
9-160644 Jun 1997 JP
09-179625 Jul 1997 JP
9-179625 Jul 1997 JP
09-185410 Jul 1997 JP
09-204223 Aug 1997 JP
09-204224 Aug 1997 JP
9-206258 Aug 1997 JP
9-251318 Sep 1997 JP
09-265319 Oct 1997 JP
09-269807 Oct 1997 JP
09-269810 Oct 1997 JP
09-269824 Oct 1997 JP
09-319431 Dec 1997 JP
09-319432 Dec 1997 JP
09-319434 Dec 1997 JP
09-325812 Dec 1997 JP
10-027020 Jan 1998 JP
10-055215 Feb 1998 JP
10-105233 Apr 1998 JP
10-117973 May 1998 JP
10-118963 May 1998 JP
10-228316 Aug 1998 JP
10-240342 Sep 1998 JP
10-240343 Sep 1998 JP
10-260727 Sep 1998 JP
10-295595 Nov 1998 JP
11-065655 Mar 1999 JP
11-065657 Mar 1999 JP
11-085269 Mar 1999 JP
11-102219 Apr 1999 JP
11-102220 Apr 1999 JP
11-174145 Jul 1999 JP
11-175149 Jul 1999 JP
11-212642 Aug 1999 JP
11-213157 Aug 1999 JP
11-508810 Aug 1999 JP
11-510935 Sep 1999 JP
11-295412 Oct 1999 JP
2000-056006 Feb 2000 JP
2000-056831 Feb 2000 JP
2000-066722 Mar 2000 JP
2000-075925 Mar 2000 JP
2000-353014 Dec 2000 JP
2001-022443 Jan 2001 JP
2001-87182 Apr 2001 JP
2001-258807 Sep 2001 JP
2001-265437 Sep 2001 JP
2001-275908 Oct 2001 JP
2001-525567 Dec 2001 JP
2002-078650 Mar 2002 JP
2002-204768 Jul 2002 JP
2002-532178 Oct 2002 JP
2002-533797 Oct 2002 JP
2002-323925 Nov 2002 JP
2002-333920 Nov 2002 JP
2002-355206 Dec 2002 JP
2002-360471 Dec 2002 JP
2002-360482 Dec 2002 JP
3356170 Dec 2002 JP
2003-010076 Jan 2003 JP
2003-010088 Jan 2003 JP
2003-015740 Jan 2003 JP
2003-05296 Feb 2003 JP
2003-036116 Feb 2003 JP
2003-038401 Feb 2003 JP
2003-038402 Feb 2003 JP
2003-505127 Feb 2003 JP
3375843 Feb 2003 JP
2003-061882 Mar 2003 JP
2003-167628 Jun 2003 JP
2003180586 Jul 2003 JP
2003262520 Sep 2003 JP
2003304992 Oct 2003 JP
2003-310489 Nov 2003 JP
2003310509 Nov 2003 JP
2004-123040 Apr 2004 JP
2004148021 May 2004 JP
2004-160102 Jun 2004 JP
2004-174228 Jun 2004 JP
2005-135400 May 2005 JP
2005-245916 Sep 2005 JP
2005352707 Oct 2005 JP
2005-346700 Dec 2005 JP
2006043071 Feb 2006 JP
2006-079145 Mar 2006 JP
2006-079157 Mar 2006 JP
2006155274 Jun 2006 JP
2006-247467 Sep 2006 JP
2006-260161 Sep 2006 JP
2006-293662 Oct 2006 JP
2006-296697 Nov 2006 JP
2007034866 Feb 2007 JP
2007-213180 Aug 2007 JP
2009-015611 Jan 2009 JP
2010-198552 Sep 2010 JP
9526512 Oct 1995 WO
9715224 May 1997 WO
9740734 Nov 1997 WO
9741451 Nov 1997 WO
9853456 Nov 1998 WO
9916078 Apr 1999 WO
9928800 Jun 1999 WO
9938056 Jul 1999 WO
9938237 Jul 1999 WO
9943250 Sep 1999 WO
9959042 Nov 1999 WO
0004430 Jan 2000 WO
0036962 Jun 2000 WO
0038026 Jun 2000 WO
0038028 Jun 2000 WO
0038029 Jun 2000 WO
0078410 Dec 2000 WO
0106904 Feb 2001 WO
0106905 Feb 2001 WO
0239864 May 2002 WO
0239868 May 2002 WO
02058527 Aug 2002 WO
02062194 Aug 2002 WO
02067744 Sep 2002 WO
02067745 Sep 2002 WO
02069775 Sep 2002 WO
02071175 Sep 2002 WO
02074150 Sep 2002 WO
02075356 Sep 2002 WO
02075469 Sep 2002 WO
02075470 Sep 2002 WO
02101477 Dec 2002 WO
2003026474 Apr 2003 WO
2003040845 May 2003 WO
2003040846 May 2003 WO
2004004533 Jan 2004 WO
2004006034 Jan 2004 WO
2004043215 May 2004 WO
2004058028 Jul 2004 WO
2004059409 Jul 2004 WO
2005006935 Jan 2005 WO
2005036292 Apr 2005 WO
2005055795 Jun 2005 WO
2005055796 Jun 2005 WO
2005077244 Aug 2005 WO
2005082223 Sep 2005 WO
2006061133 Jun 2006 WO
2006068403 Jun 2006 WO
2006073248 Jul 2006 WO
2007036490 Apr 2007 WO
Non-Patent Literature Citations (121)
Entry
Morland, “Autonomous Lawnmower Control,” Downloaded from the internet at: http://cns.bu.edu/˜cjmorlan/robotics/lawnmower/report.pdf, 10 pages, Jul. 24, 2002.
Doty, Keith L et al., “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent” AAAI 1993 Fall Symposium Series Instantiating Real-World Agents Research Triangle Park, Raleigh, NC, Oct. 22-24, 1993, pp. 1-6.
Electrolux, “Facts on the Trilobite,” http://trilobiteelectroluxse/presskit—en/node1335.asp?print=yes&pressID=, accessed Dec. 12, 2003 (2 pages).
Electrolux designed for the well-lived home, website: http://www.electroluxusa.com/node57.as?currentURL=node142.asp%3F, accessed Mar. 18, 2005, 5 pgs.
eVac Robotic Vacuum S1727 Instruction Manual, Sharper Image Corp, Copyright 2004, 16 pgs.
Evolution Robotics, “NorthStar- Low-cost Indoor Localization, How it Works,” E Evolution robotics , 2 pages, 2005.
Everyday Robots, website: http://www.everydayrobots.com/index.php?option=content&task=view&id=9, accessed Apr. 20, 2005, 7 pgs.
Friendly Robotics Robotic Vacuum RV400—The Robot Store website: http://www.therobotstore.com/s.nl/sc.9/category,-109/it.A/id.43/.f, accessed Apr. 20, 2005, 5 pgs.
Gat, Erann, “Robust Low-computation Sensor-driven Control for Task-Directed Navigation,” Proceedings of the 1991 IEEE, International Conference on Robotics and Automation, Sacramento, California, Apr. 1991, pp. 2484-2489.
Hitachi, News release, The home cleaning robot of the autonomous movement type (experimental machine) is developed, website: http://www.i4u.com/japanreleases/hitachirobot.htm, accessed Mar. 18, 2005, 5 pgs.
Kahney, “Wired News: Robot Vacs are in the House,” website: http://www.wired.com/news/print/0,1294,59237,00.html, accessed Mar. 18, 2005, 6 pgs.
Karcher Product Manual Download webpage: “http://wwwkarchercom/bta/downloadenshtml?ACTION=SELECTTEILENR&ID=rc3000&-submitButtonName=Select+Product+Manual” and associated pdf file “5959-915enpdf (47 MB) English/English” accessed Jan. 21, 2004.
Karcher RC 3000 Cleaning Robot—user manual Manufacturer: Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred Karcher—Str 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002.
Karcher, “Karcher RoboCleaner RC 3000 Product Details,” http://www.robocleaner.de/english/screen3.html, 4 pages, accessed Dec. 12, 2003.
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-usa.com/showproducts.php?op=view—prod&Param1=143&param2=&param3=, accessed Mar. 18, 2005, 6 pgs.
Koolvac Robotic Vacuum Cleaner Owner's Manual, Koolatron, Undated, 26 pgs.
Put Your Roomba . . . On “Automatic” Roomba Timer> Timed Cleaning—Floorvac Robotic Vacuum webpages: http://cgi.ebay. com/ws/eBayISAPI.d11?ViewItem&category=43575198387&rd=1, accessed Apr. 20, 2005, 5 pgs.
Put Your Roomba . . . On “Automatic” webpages: “http://www.acomputeredge.com/roomba,” accessed Apr. 20, 2005, 5 pgs.
The Robo Maid, “RoboMaid Sweeps Your Floors So You Won't Have To,” the Official Website, http://www.robomaid.com, 2 pages, accessed Mar. 15, 2005.
Robot Review Samsung Robot Vacuum (VC-RP30W), website: http://www.onrobo.com/reviews/At—Home/Vacuum—Cleaners/on00vcrp30rosam/index.htm, accessed Mar. 18, 2005, 11 pgs.
Robotic Vacuum Cleaner—Blue, website: http://www.sharperimage.com/us/en/catalog/productview.jhtml?sku=S1727BLU, accessed Mar. 18, 2005, 3 pgs.
Schofield, Monica, “Neither Master nor Slave, A Practical Study in the Development and Employment of Cleaning Robots,”, 1999 Proceedings EFA '99 1999 7th IEEE International Conference on Emerging Technologies and Factory Automation, vol. 2, Barcelona, Spain Oct. 18-21, 1999, pp. 1427-1434, 1999.
Sebastian Thrun, Learning Occupancy Grip Maps With Forward Sensor Models, School of Computer Science, Carnegie Mellon University, pp. 1-28.
Zoombot Remote Controlled Vacuum—RV-500 New Roomba 2, website: http://egi.ebay.com/ws/eBay|SAP|.d11?ViewItem&category=43526&item=4373497618&rd=1, accessed Apr. 20, 2005, 7 pgs.
Office Action in Japanese Patent Application No. 2003-008478, drafting date of Jan. 8, 2004, and English language translation thereof.
Prassler, et al., “A Short History of Cleaning Robots”, Autonomous Robots 9, 211-226, Kluwer Academic Publishers, 2000.
Jarosiewicz, Eugenio, “EEL 5666 Intelligent Machine Design Laboratory”, University of Florida, Department of Electrical and Computer Engineering, Aug. 4, 1999, 50 pages.
Office Action in U.S. Appl. No. 10/056,804, dated Apr. 21, 2003.
Notice of Allowance in U.S. Appl. No. 10/056,804, dated Oct. 21, 2003.
Notice of Allowance in U.S. Appl. No. 10/696,456, dated Apr. 13, 2004.
Office Action in U.S. Appl. No. 10/921,775, dated Mar. 10, 2005.
Notice of Allowance in U.S. Appl. No. 10/921,775, dated Jun. 16, 2005.
Office Action in U.S. Appl. No. 11/221,392, dated Nov. 30, 2005.
Office Action in U.S. Appl. No. 11/221,392, dated Jun. 6, 2006.
Notice of Allowance in U.S. Appl. No. 11/221,392, dated Jan. 8, 2007.
Office Action in U.S. Appl. No. 11/691,735, dated Oct. 17, 2007.
Office Action in U.S. Appl. No. 11/929,558, dated Aug. 11, 2008.
Notice of Allowance in U.S. Appl. No. 1 1/929,558, dated Mar. 26, 2009.
Notice of Allowance in U.S. Appl. No. 1 1/929,558, dated May 29, 2009.
Notice of Allowance in U.S. Appl. No. 11/929,608, dated Mar. 26, 2009.
Office Action in U.S. Appl. No. 12/540,564, dated Oct. 28, 2010.
Notice of Allowance in U.S. Appl. No. 12/540,564, dated Apr. 20, 2011.
Notice of Allowance in U.S. Appl. No. 12/827,126, dated Nov. 30, 2010.
Notice of Allowance in U.S. Appl. No. 12/827,126, dated Mar. 23, 2011.
English language translation of EP 1380245, published Jan. 2004.
English language translation of EP 1557730, published Jul. 2005.
English language translation of JP 2003/061882, published Mar. 2003.
English language translation of WO 02/071175, published Sep. 2002.
English language translation of WO 2004/058028, published Jul. 2004.
English language translation of WO 2004/059409, published Jul. 2004.
English language translation of WO 2005/055795, published Jun. 2005.
English language translation of WO 2006/061133, published Jun. 2006.
English language translation of WO 2006/068403, published Jun. 2006.
LG RoboKing V-R4000, http://www.popco.net/zboard/view.php?id=tr—review&no=40, Aug. 5, 2005, 15 pages, copyright date 1999-2011.
Dome Style Robot Room Cleaner, http://www.rakuten.co.jp/matsucame/587179/711512/, 7 pages.
Dyson's Robot Vacuum Cleaner—the DC06, http://www.gizmag.com/go/1282/ 3 pages, dated May 2, 2004.
Electrolux Trilobite ZA1, http://www.electrolux-ui.com:8080/2002%5C822%5C833102EN.pdf 10 pages, dated Jan. 12, 2001.
Electrolux Trilobite, http://www.robocon.co.kr/trilobite/Presentation—Trilobite—Kor—030104.ppt 19 pages, undated.
Electrolux web site Sep. 2002, http://www.frc.ri.cmu.edu/˜hpm/talks/Extras/trilobite.desc.html 2 pages, dated Sep. 2002.
Euroflex Intelligente Monster manual, English language excerpt, cover and pp. 17-30, undated.
Euroflex Monster, http://www.euroflex.tv/novita—dett.php?id=15 1 page, dated Jan. 1, 2006.
Floorbotics VR-8 Floor Cleaning Robot, http://www.consensus.com.au/SoftwareAwards/CSAarchive/CSA2004/CSAart04/FloorBot/FX1%20Product%20Description%2020%20January%202004.pdf, (2004), 11 pages.
Friendly Robotics RV Manual, http://www.robotsandrelax.com/PDFs/RV400Manual.pdf pp. 1-18. dated 2004.
Hitachi Robot Cleaner, It's eye, www.hitachi.co.jp/rd/pdf/topics/hitac2003—10.pdf, Oct. 2003, 2 pages, copyright date 2003.
Hitachi Robot Cleaner, http://www.hitachi.co.jp/New/cnews/hl—030529—hl—030529.pdf, 8 pages, dated May 29, 2003.
LG Announces the First Robotic Vacuum Cleaner of Korea, Robot Buying Guide, http://robotbg.com/news/2003/04/22/lg—announces—the—first—robotic—vacuum—cleaner—of korea, 1 page, Apr. 21, 2003.
RoboKing—Not Just a Vacuum Cleaner, a Robot!, http://infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, Jan. 21, 2004, foreign language version, 7 pages.
RoboKing—Not Just a Vacuum Cleaner, a Robot!, http://infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, Jan. 21, 2004, English version, 5 pages.
Clean Mate 365 Intelligent Automatic Vacuum Cleaner Model QQ-1 User Manual, www.metapo.com/support/user—manual.pdf 3 pages, undated.
Microrobot UBot MR-UBO1K, http://us.aving.net/news/view.php?articleId=23031 5 pages, dated Aug. 25, 2006.
Robotic Vacuum by Matsushita about to undergo Field Testing, http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338 2 pages, dated Mar. 26, 2002, copyright date 1999-2011.
Matsushita robotic cleaner, http://techon.nikkeibp.co.jp/members/01db/200203/1006501/ 3 pages, dated Mar. 25, 2002, copyright date 1995-2011.
Matsushita robotic cleaner, http://ascii.jp/elem/000/000/330/330024/ 9 pages, dated Mar. 25, 2002.
Sanyo Robot Cleaner http://www.itmedia.co.jp/news/0111/16/robofesta—m.html, 4 pages, dated Nov. 16, 2001.
Sanyo Robot Cleaner http://www.itmedia.co.jp/news/0111/16/robofesta—m2.html, 3 pages, dated Nov. 16, 2001.
Yujin Robotics, An Intelligent Cleaning Robot “Iclebo Q”, http://us.aving.net/news/view.php?articleId=7257 8 pages, dated Sep. 2, 2005.
Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone, http://www.toshiba.co.jp/tech/review/2004/09/59—09pdf/a13.pdf pp. 53-55, dated 2004.
Toshiba prototype, http://warp.ndl.go.jp/info:ndljp/pid/258151/www.soumu.go.jp/joho—tsusin/policyreports/chousa/netrobot/pdf/030214—1—33—a.pdf, pp. 1-16, dated 2003.
SVET Kompjutera Robot usisivac [Robot Vacuum Cleaner], http://www.sk.rs/1999/10/sknt01.html, foreign language version, 1 page, dated Oct. 1999, copyright date 1984-2011.
SVET Kompjutera Robot Vacuum Cleaner, SKWeb 2:54, English version, dated Oct. 1999, 1 page, copyright date 1984-2011.
Robo Vac, “Arbeitet ohne Aufsicht” [Works Without Supervision], Maschinenmarkt, Würzburg 105 (1999) 27, 3 pages, dated Jul. 5, 1999.
U.S. Appl. No. 60/605,066, filed Aug. 27, 2004.
U.S. Appl. No. 60/605,181, filed Aug. 27, 2004.
Hitachi, “Feature,” http://kadenfan.hitachi.co.jp/robot/feature/feature.html, 1 page, accessed Nov. 19, 2008, dated May 29, 2003.
Microrobot, “Home Robot—UBOT,” http://www.microrobotusa.com/product—1—1—.html, 2 pages, accessed Dec. 2, 2008, copyright date 2007.
InMach, “Intelligent Machines,” http://www.inmach.de/inside.html, 1 page, accessed Nov. 19, 2008.
Hammacher Schlemmer, “Electrolux Trilobite Robotic Vacuum at Hammacher Schlemmer,” www.hammacher.com/publish/71579.asp?promo=xsells, 3 pages, accessed Mar. 18, 2005, copyright date 2004.
TotalVac.com, “Karcher RC3000 RoboCleaner Robot Vacuum at TotalVac,” www.totalvac.com/robot—vacuum.htm, 3 pages, accessed Mar. 18, 2005, copyright date 2004.
MobileMag, Samsung unveils high-tech robot vacuum cleaner, http://www.mobilemag.com/content/100/102/C2261/, 4 pages, accessed Mar. 18, 2005, dated Nov. 25, 2003, copyright date 2002-2004.
Iirobotics.com, Samsung unveils its multifunction robot vacuum,Samsung Robot Vacuum (VV-RP30W), http://www.iirobotics.com/webpages/hotstuff.php?ubre=111, 3 pages, accessed Mar. 18, 2005, dated Aug. 31, 2004.
OnRobo, Samsung unveils it multifunction robot vacuum, http://www.onrobo.com/enews/0210/samsung—vacuum.shtml, 3 pages, accessed Mar. 18, 2005, copyright date 2004.
Gregg, M. et al., “Autonomous Lawn Care Applications”, 2006 Florida Conference on Recent Advances in Robotics, FCRAR May 25-26, 2006, pp. 1-5.
UAMA (Asia) Industrial Co. Ltd., “Robot Family,” 1 page, indicates available in 2005.
Matsutek Enterprises Co. Ltd., “Automatic Rechargeable Vacuum Cleaner,” http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10, 3 pages, accessed Apr. 23, 2007, copyright date 2007.
LG, RoboKing, 4 pages. Undated.
Collection of pictures of robotic cleaners, devices AA-BF, 50 pages. Undated.
Braunstingl et al., “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception,” Sep. 1995. ICAR '95, 7th Int'l Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376.
Yata et al., “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer,” Proceedings of the 1998 IEEE, International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998.
Tse et al., “Design of a Navigation System for a Household Mobile Robot Using Neural Networks” Dept. of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
Wolf, J. et al., “Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization”, IEEE Transactions on Robotics, vol. 21, No. 2 pp. 208-216, Apr. 2005.
Eren et al., “Accuracy in Position Estimation of Mobile Robots Based on Coded Infrared Signal Transmission,” Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, IMTC/95, pp. 548-551, 1995.
Karlsson, N. et al. “Core Technologies for Service Robotics”, Proceedings of 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sep. 28-Oct. 2, 2004, Sendai Japan, pp. 2979-2984.
Leonard et al., “Mobile Robot Localization by Tracking Geometric Beacons,” IEEE Transactions on Robotics and Automation, vol. 7, No. 3, pp. 376-382, Jun. 1991.
Paromtchik, “Toward Optical Guidance of Mobile Robots.” Proceedings of the Fourth World Multiconference on Systemics, Cybernetics and Informatics, Orlando, FL, USA, Jul. 23-26, 2000, vol. IX, six pages.
Wong, EIED Online>>Robot Business, ED Online ID# 13114, 17 pages, Jul. 26, 2006, copyright date 2006.
Facchinetti et al., “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation,” ICARCV '94, The Third International Conference on Automation, Robotics and Computer Vision, Singapore, vol. 3 pp. 1694-1698, Nov. 1994.
Facchinetti et al., “Self-Positioning Robot Navigation Using Ceiling Images Sequences,” ACCV'95, pp. 1-5, Dec. 5-8, 1995.
King et al., “Helpmate-TM-Autonomous Mobile Robot Navigation System,” SPIE, vol. 1388, Mobile Robots V, pp. 190-198, 1990.
Fairfield et al., “Mobile Robot Localization with Sparse Landmarks,” Proceedings of SPIE, vol. 4573, pp. 148-155, 2002.
Benayad-Cherif et al., “Mobile Robot Navigation Sensors,” SPIE, vol. 1831, Mobile Robots VII pp. 378-387, 1992.
The Sharper Image, e-Vac Robotic Vacuum, S1727 Instructions, www.sharperimage.com , 18 pages, copyright 2004.
Ebay, Roomba Timer—Timed Cleaning—Floorvac Robotic Vacuum, http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4375198387&rd=1, 5 pages, Apr. 20, 2005.
Friendly Robotics, “Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner,” http://www.friendlyrobotics.com/vac.htm, 4 pages, accessed Apr. 20, 2005.
Ebay, Zoombot Remote Controlled Vacuum—RV-500 New Roomba 2, http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4373497618&rd=1, 7 pages, Apr. 20, 2005.
The Sharper Image, E Vac Robotic Vacuum, http://www.sharperimage.com/us/en/templates/products/pipmorework1printable.jhtml, 1 page, accessed Mar. 18, 2005.
Office Action from U.S. Appl. No. 11/671,305, dated Aug. 22, 2007.
Notice of Allowance in U.S. Appl. No. 12/540,564, dated Dec. 26, 2012.
Notice of Allowance in U.S. Appl. No. 12/540,564, dated Sep. 12, 2012.
Notice of Allowance in U.S. Appl. No. 12/540,564, dated Apr. 17, 2012.
Notice of Allowance in U.S. Appl. No. 12/540,564, dated Feb. 13, 2012.
Notice of Allowance in U.S. Appl. No. 12/540,564, dated Oct. 11, 2011.
Related Publications (1)
Number Date Country
20100268384 A1 Oct 2010 US
Provisional Applications (1)
Number Date Country
60263692 Jan 2001 US
Divisions (1)
Number Date Country
Parent 10056804 Jan 2002 US
Child 10696456 US
Continuations (6)
Number Date Country
Parent 12540564 Aug 2009 US
Child 12827016 US
Parent 11929558 Oct 2007 US
Child 12540564 US
Parent 11691735 Mar 2007 US
Child 11929558 US
Parent 11221392 Sep 2005 US
Child 11691735 US
Parent 10921775 Aug 2004 US
Child 11221392 US
Parent 10696456 Oct 2003 US
Child 10921775 US