Method and system for multi-mode coverage for an autonomous robot

Information

  • Patent Grant
  • Patent Number
    8,838,274
  • Date Filed
    Wednesday, June 30, 2010
  • Date Issued
    Tuesday, September 16, 2014
Abstract
A mobile robot operable to move on a surface in a room is provided. The mobile robot includes a shell and a chassis including at least two wheels. At least one motor is connected to the wheels for moving the mobile robot on the surface. A cleaner is operable to clean the surface as the mobile robot moves on the surface. A wall sensor is operable to detect a wall in the room as the mobile robot moves on the surface. A controller is operable to control the motor to move the mobile robot on the surface in accordance with a wall following mode and a bounce mode. In the wall following mode, the mobile robot moves generally adjacent to and along the wall in response to detection of the wall by the wall sensor. In the bounce mode, the mobile robot moves away from the wall.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


This invention relates generally to autonomous vehicles or robots, and more specifically to methods and mobile robotic devices for covering a specific area as might be required of, or used as, robotic cleaners or lawn mowers.


2. Description of Prior Art


For purposes of this description, examples will focus on the problems faced in the prior art as related to robotic cleaning (e.g., dusting, buffing, sweeping, scrubbing, dry mopping or vacuuming). The claimed invention, however, is limited only by the claims themselves, and one of skill in the art will recognize the myriad of uses for the present invention beyond indoor, domestic cleaning.


Robotic engineers have long worked on developing an effective method of autonomous cleaning. By way of introduction, evaluation of cleaning-robot performance should concentrate on three measures of success: coverage, cleaning rate and perceived effectiveness. Coverage is the percentage of the available space visited by the robot during a fixed cleaning time, and ideally, a robot cleaner would provide 100 percent coverage given an infinite run time. Unfortunately, designs in the prior art often leave portions of the area uncovered regardless of the amount of time the device is allowed to complete its tasks. Failure to achieve complete coverage can result from mechanical limitations—e.g., the size and shape of the robot may prevent it from reaching certain areas—or the robot may become trapped, unable to vary its control to escape. Failure to achieve complete coverage can also result from an inadequate coverage algorithm. The coverage algorithm is the set of instructions used by the robot to control its movement. For the purposes of the present invention, coverage is discussed as a percentage of the available area visited by the robot during a finite cleaning time. Due to mechanical and/or algorithmic limitations, certain areas within the available space may be systematically neglected. Such systematic neglect is a significant limitation in the prior art.


A second measure of a cleaning robot's performance is the cleaning rate given in units of area cleaned per unit time. Cleaning rate refers to the rate at which the area of cleaned floor increases; coverage rate refers to the rate at which the robot covers the floor regardless of whether the floor was previously clean or dirty. If the velocity of the robot is v and the width of the robot's cleaning mechanism (also called work width) is w, then the robot's coverage rate is simply wv, but its cleaning rate may be drastically lower.


A robot that moves in a purely random fashion in a closed environment has a cleaning rate that decreases relative to the robot's coverage rate as a function of time. This is because the longer the robot operates the more likely it is to revisit already cleaned areas. The optimal design has a cleaning rate equivalent to the coverage rate, thus minimizing unnecessary repeated cleanings of the same spot. In other words, the ratio of cleaning rate to coverage rate is a measure of efficiency, and an optimal cleaning rate would mean coverage of the greatest percentage of the designated area with the minimum number of cumulative or redundant passes over an area already cleaned.
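The efficiency ratio described above can be sketched numerically. The function names and the sample figures below are illustrative assumptions, not values from the patent:

```python
def coverage_rate(v, w):
    """Coverage rate wv: area swept per unit time, clean or dirty (m^2/s)."""
    return w * v

def efficiency(cleaned_area, elapsed_time, v, w):
    """Ratio of cleaning rate to coverage rate; 1.0 means no redundant passes."""
    cleaning_rate = cleaned_area / elapsed_time
    return cleaning_rate / coverage_rate(v, w)

# Example (hypothetical numbers): a robot moving at 0.306 m/s with a
# 0.2 m work width that cleaned 11 m^2 of previously dirty floor in
# 300 s operated at roughly 60% efficiency.
print(round(efficiency(11.0, 300.0, 0.306, 0.2), 2))  # 0.6
```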


A third metric of cleaning robot performance is the perceived effectiveness of the robot. This measure is ignored in the prior art. Deliberate movement and certain patterned movements are favored, as users will perceive a robot that moves deliberately as more effective.


While coverage, cleaning rate and perceived effectiveness are the performance criteria discussed herein, a preferred embodiment of the present invention also takes into account the ease of use in rooms of a variety of shapes and sizes (containing a variety of unknown obstacles) and the cost of the robotic components. Other design criteria may also influence the design, for example the need for collision avoidance and appropriate response to other hazards.


As described in detail in Jones, Flynn & Seiger, Mobile Robots: Inspiration to Implementation second edition, 1999, A K Peters, Ltd., and elsewhere, numerous attempts have been made to build vacuuming and cleaning robots. Each of these robots has faced a similar challenge: how to efficiently cover the designated area given limited energy reserves.


We refer to maximally efficient cleaning, where the cleaning rate equals the coverage rate, as deterministic cleaning. As shown in FIG. 1A, a robot 1 following a deterministic path moves in such a way as to completely cover the area 2 while avoiding all redundant cleaning. Deterministic cleaning requires that the robot know both where it is and where it has been; this in turn requires a positioning system. Such a positioning system, accurate enough to enable deterministic cleaning, might rely on scanning laser rangefinders, ultrasonic transducers, carrier-phase differential GPS, or other methods; all of these can be prohibitively expensive and involve user set-up specific to the particular room geometry. Also, methods that rely on global positioning are typically incapacitated by the failure of any part of the positioning system.


One example of using highly sophisticated (and expensive) sensor technologies to create deterministic cleaning is the RoboScrub device built by Denning Mobile Robotics and Windsor Industries, which used sonar, infrared detectors, bump sensors and high-precision laser navigation. RoboScrub's navigation system required attaching large bar code targets at various positions in the room. The requirement that RoboScrub be able to see at least four targets simultaneously was a significant operational problem. RoboScrub, therefore, was limited to cleaning large open areas.


Another example, RoboKent, a robot built by the Kent Corporation, follows a global positioning strategy similar to RoboScrub's. RoboKent dispenses with RoboScrub's more expensive laser positioning system, but as a result it must restrict itself to areas with a simple rectangular geometry, e.g., long hallways. In these more constrained regions, position correction by sonar ranging measurements is sufficient. Other deterministic cleaning systems are described, for example, in U.S. Pat. No. 4,119,900 (Kremnitz), U.S. Pat. No. 4,700,427 (Knepper), U.S. Pat. No. 5,353,224 (Lee et al.), U.S. Pat. No. 5,537,017 (Feiten et al.), U.S. Pat. No. 5,548,511 (Bancroft), and U.S. Pat. No. 5,650,702 (Azumi).


Because of the limitations and difficulties of deterministic cleaning some robots have relied on pseudo-deterministic schemes. One method of providing pseudo-deterministic cleaning is an autonomous navigation method known as dead reckoning. Dead reckoning consists of measuring the precise rotation of each robot drive wheel (using for example optical shaft encoders). The robot can then calculate its expected position in the environment given a known starting point and orientation. One problem with this technique is wheel slippage. If slippage occurs, the encoder on that wheel registers a wheel rotation even though that wheel is not driving the robot relative to the ground. As shown in FIG. 1B, as the robot 1 navigates about the room these drive wheel slippage errors accumulate making this type of system unreliable for runs of any substantial duration. (The path no longer consists of tightly packed rows, as compared to the deterministic coverage shown in FIG. 1A.) The result of reliance on dead reckoning is intractable systematic neglect; in other words, areas of the floor are not cleaned.
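The dead-reckoning scheme described above can be sketched as the standard odometry update for a differential-drive robot. The function and the wheelbase value are illustrative assumptions, not taken from the patent:

```python
from math import cos, sin

def dead_reckon(x, y, theta, d_left, d_right, wheelbase):
    """One pose update from incremental wheel travel (encoder counts already
    converted to meters). Wheel slip corrupts d_left/d_right, and the
    resulting pose error accumulates without bound over a long run."""
    d_center = (d_left + d_right) / 2.0          # distance of robot center
    d_theta = (d_right - d_left) / wheelbase     # change in heading (rad)
    # Advance along the average heading over the step.
    x += d_center * cos(theta + d_theta / 2.0)
    y += d_center * sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Straight segment: both wheels advance 0.1 m, heading unchanged.
print(dead_reckon(0.0, 0.0, 0.0, 0.1, 0.1, 0.25))  # (0.1, 0.0, 0.0)
```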


One example of a pseudo-deterministic system is the Cye robot from Probotics, Inc. Cye depends exclusively on dead reckoning and therefore takes heroic measures to maximize the performance of its dead reckoning system. Cye must begin at a user-installed physical registration spot in a known location where the robot fixes its position and orientation. Cye then keeps track of position as it moves away from that spot. As Cye moves, uncertainty in its position and orientation increases. Cye must make certain to return to a calibration spot before this error grows so large that it will be unlikely to locate a calibration spot. If a calibration spot is moved or blocked, or if excessive wheel slippage occurs, then Cye can become lost (possibly without realizing that it is lost). Thus Cye is suitable for use only in relatively small, benign environments. Other examples of this approach are disclosed in U.S. Pat. No. 5,109,566 (Kobayashi et al.) and U.S. Pat. No. 6,255,793 (Peless et al.).


Another approach to robotic cleaning is purely random motion. As shown in FIG. 1C, in a typical room without obstacles, a random movement algorithm will provide acceptable coverage given significant cleaning time. Compared to a robot with a deterministic algorithm, a random cleaning robot must operate for a longer time to achieve acceptable coverage. To have high confidence that the random-motion robot has cleaned 98% of an obstacle-free room, the random motion robot must run approximately five times as long as a deterministic robot with the same cleaning mechanism, moving at the same speed.


The coverage limitations of a random algorithm can be seen in FIG. 1D. An obstacle 5 in the room can create the effect of segmenting the room into a collection of chambers. The coverage over time of a random algorithm robot in such a room is analogous to the time density of gas released in one chamber of a confined volume. Initially, the density of gas is highest in the chamber where it is released and lowest in more distant chambers. Similarly, early in the process the robot is most likely to thoroughly clean the chamber where it starts, rather than more distant chambers. Given enough time, a gas reaches equilibrium with equal density in all chambers. Likewise, given enough time, the robot would clean all areas thoroughly. The limitations of practical power supplies, however, usually guarantee that the robot will have insufficient time to clean all areas of a space cluttered with obstacles. We refer to this phenomenon as the robot diffusion problem.
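The diffusion analogy can be made concrete with the standard memoryless random-coverage approximation, under which each instant's swept strip lands uniformly at random in the room. This model and its sample parameters are illustrative assumptions, not part of the patent, but it reproduces the roughly five-fold run-time penalty noted above:

```python
from math import exp, log

def expected_coverage(t, v, w, area):
    """Expected fraction of an obstacle-free room covered after time t,
    assuming each swept strip lands uniformly at random:
    coverage = 1 - exp(-w*v*t/area)."""
    return 1.0 - exp(-w * v * t / area)

# Deterministic coverage of the whole room takes t_det = area/(w*v).
# Reaching 98% coverage randomly takes ln(50) ~ 3.9 times longer,
# consistent with the roughly five-fold factor for a real robot.
area, v, w = 20.0, 0.306, 0.2        # hypothetical room and robot
t_det = area / (w * v)
t_98 = t_det * log(50)
print(round(t_98 / t_det, 1))  # 3.9
```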


As discussed, the commercially available prior art has not been able to produce an effective coverage algorithm for an area of unknown geometry. As noted above, the prior art either has relied on sophisticated systems of markers or beacons or has limited the utility of the robot to rooms with simple rectangular geometries. Attempts to use pseudo-deterministic control algorithms can leave areas of the space systematically neglected.


OBJECTS AND ADVANTAGES

It is an object of the present invention to provide a system and method to allow a mobile robot to operate in a plurality of modes in order to effectively cover an area.


It is an object of the present invention to provide a mobile robot, with at least one sensor, to operate in a number of modes including spot-coverage, obstacle following and bounce.


It is a further object of the invention to provide a mobile robot that alternates between obstacle following and bounce mode to ensure coverage.


It is an object of the invention to return to spot-coverage after the robot has traveled a pre-determined distance.


It is an object of the invention to provide a mobile robot able to track the average distance between obstacles and use the average distance as an input to alternate between operational modes.


It is yet another object of the invention to optimize the distance the robot travels in an obstacle following mode as a function of the frequency of obstacle following and the work width of the robot, and to provide a minimum and maximum distance for operating in obstacle following mode.


It is an object of a preferred embodiment of the invention to use a control system for a mobile robot with an operational system program able to run a plurality of behaviors and using an arbiter to select which behavior is given control over the robot.


It is still another object of the invention to incorporate various escape programs or behaviors to allow the robot to avoid becoming stuck.


Finally, it is an object of the invention to provide one or more methods for controlling a mobile robot to benefit from the various objects and advantages disclosed herein.





BRIEF DESCRIPTION OF THE DRAWINGS

These and further features of the present invention will be apparent with reference to the accompanying drawings, wherein:



FIGS. 1A-D illustrate coverage patterns of various robots in the prior art;



FIG. 2 is a top-view schematic representation of the basic components of a mobile robot used in a preferred embodiment of the invention;



FIG. 3 demonstrates a hardware block diagram of the robot shown in FIG. 2;



FIG. 4A is a diagram showing a method of determining the angle at which the robot encounters an obstacle; FIG. 4B is a diagram showing the orientation of a preferred embodiment of the robot control system;



FIG. 5 is a schematic representation of the operational modes of the instant invention;



FIG. 6A is a schematic representation of the coverage pattern for a preferred embodiment of SPIRAL behavior;



FIG. 6B is a schematic representation of the coverage pattern for an alternative embodiment of SPIRAL behavior;



FIG. 6C is a schematic representation of the coverage pattern for yet another alternative embodiment of SPIRAL behavior;



FIG. 7 is a flow-chart illustration of the spot-coverage algorithm of a preferred embodiment of the invention;



FIGS. 8A & 8B are schematic representations of the coverage pattern for a preferred embodiment of operation in obstacle following mode;



FIG. 8C is a schematic illustration of the termination of the obstacle following mode when an obstacle is encountered after the mobile robot has traveled a minimum distance.



FIG. 8D is a schematic illustration of the termination of the obstacle following mode after the mobile robot has traveled a maximum distance.



FIG. 9A is a flow-chart illustration of the obstacle following algorithm of a preferred embodiment of the invention;



FIG. 9B is a flow-chart illustration of a preferred algorithm for determining when to exit obstacle following mode.



FIG. 10 is a schematic representation of the coverage pattern for a preferred embodiment of BOUNCE behavior;



FIG. 11 is a flowchart illustration of the room coverage algorithm of a preferred embodiment of the invention;



FIGS. 12A & 12B are flow-chart illustrations of an exemplary escape behavior;



FIG. 13A is a schematic representation of the coverage pattern of a mobile robot with only a single operational mode;



FIG. 13B is a schematic representation of the coverage pattern for a preferred embodiment of the instant invention using obstacle following and room coverage modes; and



FIG. 14 is a schematic representation of the coverage pattern for a preferred embodiment of the instant invention using spot-coverage, obstacle following and room coverage modes.





DETAILED DESCRIPTION

In the present invention, a mobile robot is designed to provide maximum coverage at an effective coverage rate in a room of unknown geometry. In addition, the perceived effectiveness of the robot is enhanced by the inclusion of patterned or deliberate motion. In addition, in a preferred embodiment, effective coverage requires a control system able to prevent the robot from becoming immobilized in an unknown environment.


While the physical structures of mobile robots are known in the art, the components of a preferred, exemplary embodiment of the present invention are described herein. A preferred embodiment of the present invention is a substantially circular robotic sweeper containing certain features. As shown in FIG. 2, for example, the mobile robot 10 of a preferred embodiment includes a chassis 11 supporting mechanical and electrical components. These components include various sensors, including two bump sensors 12 & 13 located in the forward portion of the robot, four cliff sensors 14 located on the robot shell 15, and a wall following sensor 16 mounted on the robot shell 15. In other embodiments, as few as one sensor may be used in the robot. One of skill in the art will recognize that the sensor(s) may be of a variety of types including sonar, tactile, electromagnetic, capacitive, etc. Because of cost constraints, a preferred embodiment of the present invention uses bump (tactile) sensors 12 & 13 and reflective IR proximity sensors for the cliff sensors 14 and the wall-following sensor 16. Details of the IR sensors are described in U.S. patent application U.S. Ser. No. 09/768,773, which disclosure is hereby incorporated by reference.


A preferred embodiment of the robot also contains two wheels 20, motors 21 for driving the wheels independently, an inexpensive low-end microcontroller 22, and a rechargeable battery 23 or other power source known in the art. These components are well known in the art and are not discussed in detail herein. The robotic cleaning device 10 further includes one or more cleaning heads 30. The cleaning head might contain a vacuum cleaner, various brushes, sponges, mops, electrostatic cloths or a combination of various cleaning elements. The embodiment shown in FIG. 2 also includes a side brush 32.


As mentioned above, a preferred embodiment of the robotic cleaning device 10 comprises an outer shell 15 defining a dominant side, non-dominant side, and a front portion of the robot 10. The dominant side of the robot is the side that is kept near or in contact with an object (or obstacle) when the robot cleans the area adjacent to that object (or obstacle). In a preferred embodiment, as shown in FIG. 2, the dominant side of the robot 10 is the right-hand side relative to the primary direction of travel, although in other embodiments the dominant side may be the left-hand side. In still other embodiments the robot may be symmetric and thereby does not need a dominant side; however, in a preferred embodiment, a dominant side is chosen for reasons of cost. The primary direction of travel is as shown in FIG. 2 by arrow 40.


In a preferred embodiment, two bump sensors 12 & 13 are located forward of the wheels 20 relative to the direction of forward movement, shown by arrow 40. One bump sensor 13 is located on the dominant side of the robot 10 and the other bump sensor 12 is located on the non-dominant side of the robot 10. When both of these bump sensors 12 & 13 are activated simultaneously, the robot 10 recognizes an obstacle in the front position. In other embodiments, more or fewer individual bump sensors can be used. Likewise, any number of bump sensors can be used to divide the device into any number of radial segments. While in a preferred embodiment the bump sensors 12 & 13 are IR break beam sensors activated by contact between the robot 10 and an obstacle, other types of sensors can be used, including mechanical switches and capacitive sensors that detect the capacitance of objects touching the robot or between two metal plates in the bumper that are compressed on contact. Non-contact sensors, which allow the robot to sense proximity to objects without physically touching the object, such as capacitive sensors or a curtain of IR light, can also be used.


It is useful to have a sensor or sensors that are not only able to tell if a surface has been contacted (or is nearby), but also the angle relative to the robot at which the contact was made. In the case of a preferred embodiment, the robot is able to calculate the time between the activation of the right and left bump switches 12 & 13, if both are activated. The robot is then able to estimate the angle at which contact was made. In a preferred embodiment shown in FIG. 4A, the bump sensor comprises a single mechanical bumper 44 at the front of the robot with sensors 42 & 43 substantially at the two ends of the bumper that sense the movement of the bumper. When the bumper is compressed, the timing between the sensor events is used to calculate the approximate angle at which the robot contacted the obstacle. When the bumper is compressed from the right side, the right bump sensor detects the bump first, followed by the left bump sensor, due to the compliance of the bumper and the bump detector geometry. This way, the bump angle can be approximated with only two bump sensors.


For example, in FIG. 4A, bump sensors 42 & 43 are able to divide the forward portion of the robot into six regions (I-VI). When a bump sensor is activated, the robot calculates the time before the other sensor is activated (if at all). For example, when the right bump sensor 43 is activated, the robot measures the time (t) before the left bump sensor 42 is activated. If t is less than t.sub.1, then the robot assumes contact occurred in region IV. If t is greater than or equal to t.sub.1 and less than t.sub.2, then the robot assumes contact was made in region V. If t is greater than or equal to t.sub.2 (including the case where the left bump sensor 42 is not activated at all within the time monitored), then the robot assumes the contact occurred in region VI. If the bump sensors are activated simultaneously, the robot assumes the contact was made from straight ahead. This method can be used to divide the bumper into an arbitrarily large number of regions (for greater precision) depending on the timing used and the geometry of the bumper. As an extension, three sensors can be used to calculate the bump angle in three dimensions instead of just two dimensions as in the preceding example.
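The timing-based classification above can be sketched as follows. The threshold values and the mirrored assignment of regions I-III for left-first contact are assumptions for illustration; the patent specifies only the right-first case (regions IV-VI):

```python
def bump_region(first_side, dt, t1, t2):
    """Classify the contact region from which sensor fired first and the
    delay dt (seconds) before the other fired; dt is None if the second
    sensor never fired within the monitored window. t1 < t2 are tuned
    timing thresholds, per FIG. 4A."""
    if dt == 0:
        return "front"                                   # simultaneous: head-on
    if dt is None or dt >= t2:
        return "VI" if first_side == "right" else "I"    # glancing contact
    if dt >= t1:
        return "V" if first_side == "right" else "II"
    return "IV" if first_side == "right" else "III"      # near-frontal

# Right sensor fires first, left follows 4 ms later (hypothetical thresholds).
print(bump_region("right", 0.004, 0.005, 0.015))  # IV
```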


A preferred embodiment also contains a wall-following or wall-detecting sensor 16 mounted on the dominant side of the robot 10. In a preferred embodiment, the wall following sensor is an IR sensor composed of an emitter and detector pair collimated so that a finite volume of intersection occurs at the expected position of the wall. This focus point is approximately three inches ahead of the drive wheel in the direction of robot forward motion. The radial range of wall detection is about 0.75 inches.


A preferred embodiment also contains any number of IR cliff sensors 14 that prevent the device from tumbling over stairs or other vertical drops. These cliff sensors are of a construction similar to that of the wall following sensor but directed to observe the floor rather than a wall. As an additional safety and sensing measure, the robot 10 includes a wheel-drop sensor that is able to detect if one or more wheels is unsupported by the floor. This wheel-drop sensor can therefore detect not only cliffs but also various obstacles upon which the robot is able to drive, such as lamp bases, high floor transitions, piles of cords, etc.


Other embodiments may use other known sensors or combinations of sensors.



FIG. 3 shows a hardware block diagram of the controller and robot of a preferred embodiment of the invention. In a preferred embodiment, a Winbond W78XXX series processor is used. It is a microcontroller compatible with the MCS-51 family with 36 general purpose I/O ports, 256 bytes of RAM and 16K of ROM. It is clocked at 40 MHz which is divided down for a processor speed of 3.3 MHz. It has two timers which are used for triggering interrupts used to process sensors and generate output signals as well as a watchdog timer. The lowest bits of the fast timer are also used as approximate random numbers where needed in the behaviors. There are also two external interrupts which are used to capture the encoder inputs from the two drive wheels. The processor also has a UART which is used for testing and debugging the robot control program.


The I/O ports of the microprocessor are connected to the sensors and motors of the robot and are the interface connecting it to the internal state of the robot and its environment. For example, the wheel drop sensors are connected to an input port and the brush motor PWM signal is generated on an output port. The ROM on the microprocessor is used to store the coverage and control program for the robot. This includes the behaviors (discussed below), sensor processing algorithms and signal generation. The RAM is used to store the active state of the robot, such as the average bump distance, run time and distance, and the ID of the behavior in control and its current motor commands.


For purposes of understanding the movement of the robotic device, FIG. 4B shows the orientation of the robot 10 centered about the x and y axes in a coordinate plane; this coordinate system is attached to the robot. The directional movement of the robot 10 can be understood to be the radius at which the robot 10 will move. In order to rapidly turn away from the wall 100, the robot 10 should set a positive, small value of r (r.sub.3 in FIG. 4B); in order to rapidly turn toward the wall, the robot should set a negative, small value of r (r.sub.1 in FIG. 4B). On the other hand, to make slight turns, the robot should set larger absolute values for r—positive values to move left (i.e. away from the wall, r.sub.4 in FIG. 4B) and negative values to move right (i.e. toward the wall, r.sub.2 in FIG. 4B). This coordinate scheme is used in the examples of control discussed below. The microcontroller 22 determines the turning radius by controlling the differential speed at which the individual wheel motors 21 are run.
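The mapping from a signed turning radius r to differential wheel speeds can be sketched with the standard differential-drive kinematics. The wheelbase and speed values below are illustrative assumptions, not values from the patent:

```python
def wheel_speeds(v, r, wheelbase):
    """Wheel speeds (m/s) for forward speed v and signed turning radius r,
    using the sign convention of FIG. 4B: positive r turns left (away from
    the wall), negative r turns right (toward the wall). Small |r| gives a
    sharp turn; large |r| approaches straight-line motion."""
    omega = v / r                                # angular rate about turn center
    v_left = omega * (r - wheelbase / 2.0)       # inner wheel on a left turn
    v_right = omega * (r + wheelbase / 2.0)      # outer wheel on a left turn
    return v_left, v_right

# Gentle left turn: 2 m radius at 0.306 m/s with a hypothetical 0.25 m wheelbase.
vl, vr = wheel_speeds(0.306, 2.0, 0.25)
print(round(vl, 3), round(vr, 3))  # 0.287 0.325
```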


Also, in certain embodiments, the robot may include one or more user inputs. For example, as shown in FIG. 2, a preferred embodiment includes three simple buttons 33 that allow the user to input the approximate size of the surface to be covered. In a preferred embodiment, these buttons, labeled “small,” “medium,” and “large,” correspond respectively to rooms of 11.1, 20.8 and 27.9 square meters.


As mentioned above, the exemplary robot is a preferred embodiment for practicing the instant invention, and one of skill in the art is able to choose from elements known in the art to design a robot for a particular purpose. Examples of suitable designs include those described in the following: U.S. Pat. No. 4,306,329 (Yokoi), U.S. Pat. No. 5,109,566 (Kobayashi et al.), U.S. Pat. No. 5,203,955 (Lee), U.S. Pat. No. 5,369,347 (Yoo), U.S. Pat. No. 5,440,216 (Kim), U.S. Pat. No. 5,534,762 (Kim), U.S. Pat. No. 5,613,261 (Kawakami et al.), U.S. Pat. No. 5,634,237 (Paranjpe), U.S. Pat. No. 5,781,960 (Kilstrom et al.), U.S. Pat. No. 5,787,545 (Colens), U.S. Pat. No. 5,815,880 (Nakanishi), U.S. Pat. No. 5,839,156 (Park et al.), U.S. Pat. No. 5,926,909 (McGee), U.S. Pat. No. 6,038,501 (Kawakami), and U.S. Pat. No. 6,076,226 (Reed), all of which are hereby incorporated by reference.



FIG. 5 shows a simple block representation of the various operational modes of a device. In a preferred embodiment, and by way of example only, operational modes may include spot cleaning (where the user or robot designates a specific region for cleaning), edge cleaning, and room cleaning. Each operational mode comprises complex combinations of instructions and/or internal behaviors, discussed below. These complexities, however, are generally hidden from the user. In one embodiment, the user can select the particular operational mode by using an input element, for example, a selector switch or push button. In other preferred embodiments, as described below, the robot is able to autonomously cycle through the operational modes.


The coverage robot of the instant invention uses these various operational modes to effectively cover the area. While one of skill in the art may implement these various operational modes in a variety of known architectures, a preferred embodiment relies on behavior control. Here, behaviors are simply layers of control systems that all run in parallel. The microcontroller 22 then runs a prioritized arbitration scheme to resolve the dominant behavior for a given scenario. A description of behavior control can be found in Mobile Robots, supra, the text of which is hereby incorporated by reference.


In other words, in a preferred embodiment, the robot's microprocessor and control software run a number of behaviors simultaneously. Depending on the situation, control of the robot will be given to one or more various behaviors. For purposes of detailing the preferred operation of the present invention, the behaviors will be described as (1) coverage behaviors, (2) escape behaviors or (3) user/safety behaviors. Coverage behaviors are primarily designed to allow the robot to perform its coverage operation in an efficient manner. Escape behaviors are special behaviors that are given priority when one or more sensor inputs suggest that the robot may not be operating freely. As a convention for this specification, behaviors discussed below are written in all capital letters.
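A fixed-priority arbiter of the kind described can be sketched as follows. The behavior names and activation conditions are simplified illustrations, not the patent's actual control program:

```python
# Behaviors run in parallel and propose commands; each control cycle a
# prioritized arbiter gives control to the highest-priority active one.
BEHAVIORS = [                       # ordered highest priority first
    ("ESCAPE",        lambda s: s.get("stuck", False)),   # robot may be trapped
    ("BOUNCE",        lambda s: s.get("bump", False)),    # obstacle contacted
    ("WALL_FOLLOW",   lambda s: s.get("wall", False)),    # wall sensor active
    ("STRAIGHT_LINE", lambda s: True),                    # low-priority default
]

def arbitrate(sensors):
    """Return the name of the behavior given control this cycle."""
    for name, is_active in BEHAVIORS:
        if is_active(sensors):
            return name

print(arbitrate({"bump": True, "wall": True}))  # BOUNCE
```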


1. Coverage Behaviors



FIGS. 6-14 show the details of each of the preferred operational modes: Spot Coverage, Wall Follow (or Obstacle Follow) and Room Coverage.


Operational Mode: Spot Coverage


Spot coverage or, for example, spot cleaning allows the user to clean an isolated dirty area. The user places the robot 10 on the floor near the center of the area (see reference numeral 40 in FIGS. 6A, 6B) that requires cleaning and selects the spot-cleaning operational mode. The robot then moves in such a way that the immediate area within, for example, a defined radius, is brought into contact with the cleaning head 30 or side brush 32 of the robot.


In a preferred embodiment, the method of achieving spot cleaning is a control algorithm providing outward spiral movement, or SPIRAL behavior, as shown in FIG. 6A. In general, spiral movement is generated by increasing the turning radius as a function of time. In a preferred embodiment, the robot 10 begins its spiral in a counter-clockwise direction, marked in FIG. 6A by movement line 45, in order to keep the dominant side on the outward, leading-edge of the spiral. In another embodiment, shown in FIG. 6B, spiral movement of the robot 10 is generated inward such that the radius of the turns continues to decrease. The inward spiral is shown as movement line 45 in FIG. 6B. It is not necessary, however, to keep the dominant side of the robot on the outside during spiral motion.


The method of spot cleaning used in a preferred embodiment—outward spiraling—is set forth in FIG. 7. Once the spiraling is initiated (step 201) and the value of r is set at its minimum, positive value (which will produce the tightest possible counterclockwise turn), the spiraling behavior recalculates the value of r as a function of θ, where θ represents the angular turning since the initiation of the spiraling behavior (step 210). By using the equation r = aθ, where a is a constant coefficient, the tightness or desired overlap of the spiral can be controlled. (Note that θ is not normalized to 2π.) The value of a can be chosen by the equation a = d/(2π), where d is the distance between two consecutive passes of the spiral. For effective cleaning, a value for d should be chosen that is less than the width of the cleaning mechanism 30. In a preferred embodiment, a value of d is selected that is between one-half and two-thirds of the width of the cleaning head 30.
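The outward spiral r = aθ with a = d/(2π) can be sketched as follows; with this coefficient, successive passes of the spiral are exactly d apart. The step count and sample values are illustrative assumptions:

```python
from math import pi, cos, sin

def spiral_points(d, cleaning_width, revolutions, steps_per_rev=100):
    """Points on an Archimedean spiral r = a*theta with a = d/(2*pi), so
    consecutive passes are d apart. For effective cleaning d should be
    less than the cleaning-head width (one-half to two-thirds of it in
    a preferred embodiment)."""
    assert d < cleaning_width
    a = d / (2.0 * pi)
    pts = []
    for i in range(revolutions * steps_per_rev + 1):
        theta = 2.0 * pi * i / steps_per_rev   # note: not normalized to 2*pi
        r = a * theta
        pts.append((r * cos(theta), r * sin(theta)))
    return pts

# With d = 0.1 m, each full revolution moves the robot 0.1 m farther out.
pts = spiral_points(0.1, 0.2, revolutions=3)
print(round(pts[100][0] - pts[0][0], 3))  # 0.1
```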


In other embodiments, the robot tracks its total distance traveled in spiral mode. The spiral will deteriorate after some distance, i.e., the centerpoint of the spiral motion will tend to drift over time due to surface-dependent wheel slippage and/or inaccuracies in the spiral approximation algorithm and calculation precision. In certain embodiments, therefore, the robot may exit spiral mode after the robot has traveled a specific distance (“maximum spiral distance”), such as 6.3 or 18.5 meters (step 240). In a preferred embodiment, the robot uses multiple maximum spiral distances depending on whether the robot is performing an initial spiral or a later spiral. If the maximum spiral distance is reached without a bump, the robot gives control to a different behavior, and the robot, for example, then continues to move in a predominately straight line. (In a preferred embodiment, a STRAIGHT LINE behavior is a low-priority, default behavior that propels the robot in an approximately straight line at a preset velocity of approximately 0.306 m/s when no other behaviors are active.)


In spiral mode, various actions can be taken when an obstacle is encountered. For example, the robot could (a) seek to avoid the obstacle and continue the spiral in the counter-clockwise direction, (b) seek to avoid the obstacle and continue the spiral in the opposite direction (e.g. changing from counter-clockwise to clockwise), or (c) change operational modes. Continuing the spiral in the opposite direction is known as reflective spiraling and is represented in FIG. 6C, where the robot 10 reverses its movement path 45 when it comes into contact with obstacle 101. In a preferred embodiment, as detailed in step 220, the robot 10 exits spot cleaning mode upon the first obstacle encountered by a bump sensor 12 or 13.


While a preferred embodiment describes a spiral motion for spot coverage, any self-bounded area can be used, including but not limited to regular polygon shapes such as squares, hexagons, ellipses, etc.


Operational Mode: Wall/Obstacle Following


Wall following or, in the case of a cleaning robot, edge cleaning, allows the user to clean only the edges of a room or the edges of objects within a room. The user places the robot 10 on the floor near an edge to be cleaned and selects the edge-cleaning operational mode. The robot 10 then moves in such a way that it follows the edge and cleans all areas brought into contact with the cleaning head 30 of the robot.


The movement of the robot 10 in a room 110 is shown in FIGS. 8A, 8B. In FIG. 8A, the robot 10 is placed along wall 100, with the robot's dominant side next to the wall. The robot then runs along the wall indefinitely following movement path 46. Similarly, in FIG. 8B, the robot 10 is placed in the proximity of an obstacle 101. The robot then follows the edge of the obstacle 101 indefinitely following movement path 47.


In a preferred embodiment, in the wall-following mode, the robot uses the wall-following sensor 16 to position itself a set distance from the wall. The robot then proceeds to travel along the perimeter of the wall. As shown in FIGS. 8A & 8B, in a preferred embodiment, the robot 10 is not able to distinguish between a wall 100 and another solid obstacle 101.


The method used in a preferred embodiment for following the wall is detailed in FIG. 9A and provides a smooth wall following operation even with a one-bit sensor. (Here the one-bit sensor detects only the presence or absence of the wall within a particular volume rather than the distance between wall and sensor.) Other methods of detecting a wall or object can be used such as bump sensing or sonar sensors.


Once the wall-following operational mode, or WALL-FOLLOWING behavior of a preferred embodiment, is initiated (step 301), the robot first sets its initial value for the steering at r₀. The WALL-FOLLOWING behavior then initiates the emit-detect routine in the wall-follower sensor 16 (step 310). The existence of a reflection from the IR transmitter portion of the sensor 16 translates into the existence of an object within a predetermined distance from the sensor 16. The WALL-FOLLOWING behavior then determines whether there has been a transition from a reflection (object within range) to a non-reflection (object outside of range) (step 320). If there has been a transition (in other words, the wall is now out of range), the value of r is set to its most negative value and the robot will veer slightly to the right (step 325). The robot then begins the emit-detect sequence again (step 310). If there has not been a transition from a reflection to a non-reflection, the WALL-FOLLOWING behavior then determines whether there has been a transition from non-reflection to reflection (step 330). If there has been such a transition, the value of r is set to its most positive value and the robot will veer slightly left (step 335).


In the absence of either type of transition event, the wall-following behavior reduces the absolute value of r (step 340) and begins the emit-detect sequence (step 310) anew. By decreasing the absolute value of r, the robot 10 begins to turn more sharply in whatever direction it is currently heading. In a preferred embodiment, the rate of decreasing the absolute value of r is a constant rate dependent on the distance traveled.
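The emit-detect loop of FIG. 9A can be sketched as a single update function; this is a minimal illustration assuming particular magnitudes for the steering value and its decay, which the text leaves unspecified:

```python
def wall_follow_step(r, reflection, prev_reflection,
                     r_max=1.0, decay=0.1):
    """One emit-detect cycle of one-bit wall following (FIG. 9A sketch).

    A positive r veers the robot toward the wall side (slightly left);
    a negative r veers it away (slightly right). r_max and decay are
    assumed, illustrative magnitudes.
    """
    if prev_reflection and not reflection:
        # Wall just went out of range: set r to its most negative value.
        return -r_max
    if reflection and not prev_reflection:
        # Wall just came back into range: set r to its most positive value.
        return r_max
    # No transition: reduce |r| so the current turn tightens.
    if r > 0:
        return max(r - decay, 0.0)
    return min(r + decay, 0.0)
```

Calling this once per sensor cycle reproduces the smooth oscillation along the wall described in the text, even though the sensor reports only presence or absence.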


The wall follower mode can be continued for a predetermined or random time, a predetermined or random distance, or until some additional criteria are met (e.g. bump sensor is activated, etc.). In one embodiment, the robot continues to follow the wall indefinitely. In a preferred embodiment, as shown in FIGS. 8C & 8D, wherein reference numeral 46 identifies the movement of the robot, minimum and maximum travel distances are determined, whereby the robot will remain in WALL-FOLLOWING behavior until the robot has either traveled the maximum distance (FIG. 8D) or traveled at least the minimum distance and encountered an obstacle 101 (FIG. 8C). This implementation of WALL-FOLLOWING behavior ensures the robot spends an appropriate amount of time in WALL-FOLLOWING behavior as compared to its other operational modes, thereby decreasing systematic neglect and distributing coverage to all areas. By increasing wall following, the robot is able to move in more spaces, but the robot is less efficient at cleaning any one space. In addition, by tending to exit WALL-FOLLOWING behavior after obstacle detection, the robot increases its perceived effectiveness.



FIG. 9B is a flow-chart illustration showing this embodiment of determining when to exit WALL-FOLLOWING (WF) behavior. The robot first determines the minimum distance to follow the wall (d_min) and the maximum distance to follow the wall (d_max). While in wall (or obstacle) following mode, the control system tracks the distance the robot has traveled in that mode (d_WF). If d_WF is greater than d_max (step 350), then the robot exits wall-following mode (step 380). If, however, d_WF is less than d_max (step 350) and d_WF is less than d_min (step 360), the robot remains in wall-following mode (step 385). If d_WF is greater than d_min (step 360) and an obstacle is encountered (step 370), the robot exits wall-following mode (step 380).
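The exit decision of FIG. 9B reduces to a small predicate; the following is a direct sketch of the three tests described above:

```python
def should_exit_wall_following(d_wf, d_min, d_max, obstacle_hit):
    """Exit test from FIG. 9B: leave wall-following mode once the
    maximum distance d_max has been covered, or once at least the
    minimum distance d_min has been covered and an obstacle is
    encountered; otherwise remain in wall-following mode."""
    if d_wf > d_max:
        return True                 # step 350 -> step 380
    if d_wf > d_min and obstacle_hit:
        return True                 # steps 360 & 370 -> step 380
    return False                    # step 385: keep following
```

Values for d_min and d_max would come from the randomized ranges given in the next paragraph.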


Theoretically, the optimal distance for the robot to travel in WALL-FOLLOWING behavior is a function of room size and configuration and robot size. In a preferred embodiment, the minimum and maximum distances to remain in WALL-FOLLOWING are set based upon the approximate room size, the robot's width and a random component, whereby the average minimum travel distance is 2w/p, where w is the width of the work element of the robot and p is the probability that the robot will enter WALL-FOLLOWING behavior in a given interaction with an obstacle. By way of example, in a preferred embodiment, w is approximately between 15 cm and 25 cm, and p is 0.095 (where the robot encounters 6 to 15 obstacles, or an average of 10.5 obstacles, before entering an obstacle-following mode). The minimum distance is then set randomly as a distance between approximately 115 cm and 350 cm; the maximum distance is then set randomly as a distance between approximately 170 cm and 520 cm. In certain embodiments the ratio between the minimum distance and the maximum distance is 2:3. For the sake of perceived efficiency, the robot's initial operation in an obstacle-following mode can be set to be longer than its later operations in obstacle-following mode. In addition, users may place the robot along the longest wall when starting the robot, which improves actual as well as perceived coverage.
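A brief sketch of the 2w/p formula and the randomized distance selection described above; the selection procedure shown is an assumed implementation of the stated ranges, not the patented one:

```python
import random

def average_min_follow_distance(w, p):
    """Average minimum wall-following travel distance, 2*w/p, where w is
    the work-element width (meters) and p is the probability of entering
    wall following on a given obstacle interaction."""
    return 2.0 * w / p

def pick_follow_distances(rng=random):
    """Draw per-run minimum and maximum distances from the preferred
    ranges (minimum 115-350 cm, maximum 170-520 cm), in meters."""
    d_min = rng.uniform(1.15, 3.50)
    d_max = rng.uniform(1.70, 5.20)
    return d_min, max(d_max, d_min)   # keep max >= min
```

With w = 0.2 m and p = 0.095, the formula gives an average minimum travel distance of about 4.2 m.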


The distance that the robot travels in wall following mode can also be set by the robot depending on the number and frequency of objects encountered (as determined by other sensors), which is a measure of room “clutter.” If more objects are encountered, the robot would wall follow for a greater distance in order to get into all the areas of the floor. Conversely, if few obstacles are encountered, the robot would wall follow less in order to not over-cover the edges of the space in favor of passes through the center of the space. An initial wall-following distance can also be included to allow the robot to follow the wall a longer or shorter distance during its initial period where the WALL-FOLLOWING behavior has control.


In a preferred embodiment, the robot may also leave wall-following mode if the robot turns more than, for example, 270 degrees and is unable to locate the wall (or object) or if the robot has turned a total of 360 degrees since entering wall-following mode.


In certain embodiments, when the WALL-FOLLOWING behavior is active and there is a bump, the ALIGN behavior becomes active. The ALIGN behavior turns the robot counter-clockwise to align the robot with the wall. The robot always turns a minimum angle to avoid getting into cycles of many small turns. After it has turned through its minimum angle, the robot monitors its wall sensor; if it detects a wall and then the wall detection goes away, the robot stops turning, because at the end of the wall-follower range the robot is well aligned to start WALL-FOLLOWING. If the robot has not seen its wall detector go on and then off by the time it reaches its maximum angle, it stops anyway. This prevents the robot from turning around in circles when the wall is out of range of its wall sensor. When the most recent bump is within the side 60 degrees of the bumper on the dominant side, the minimum angle is set to 14 degrees and the maximum angle is 19 degrees. Otherwise, if the bump is within 30 degrees of the front of the bumper on the dominant side or on the non-dominant side, the minimum angle is 20 degrees and the maximum angle is 44 degrees. When the ALIGN behavior has completed turning, it cedes control to the WALL-FOLLOWING behavior.
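The ALIGN termination rule above can be sketched as follows; the bump-region labels are assumed names for the two cases described in the text:

```python
def align_turn_angles(bump_region):
    """Minimum and maximum ALIGN turn angles in degrees, keyed by where
    the most recent bump landed on the bumper. 'side_dominant' covers a
    bump within the side 60 degrees of the bumper on the dominant side;
    anything else (front of the dominant side, or the non-dominant
    side) uses the wider range."""
    if bump_region == "side_dominant":
        return 14, 19
    return 20, 44

def align_should_stop(turned_deg, saw_wall_then_lost, bump_region):
    """Stop turning once past the minimum angle if the wall sensor has
    gone on and then off, or unconditionally at the maximum angle."""
    lo, hi = align_turn_angles(bump_region)
    if turned_deg >= hi:
        return True
    return turned_deg >= lo and saw_wall_then_lost
```

The maximum-angle cutoff is what prevents the robot from spinning in circles when the wall is out of sensor range.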


Operational Mode: Room Coverage


The third operational mode is here called room-coverage or room cleaning mode, which allows the user to clean any area bounded by walls, stairs, obstacles or other barriers. To exercise this option, the user places the robot on the floor and selects room-cleaning mode. The robot then moves about the room cleaning all areas that it is able to reach.


In a preferred embodiment, the method of performing the room cleaning behavior is a BOUNCE behavior in combination with the STRAIGHT LINE behavior. As shown in FIG. 10, the robot 10 travels until a bump sensor 12 and/or 13 is activated by contact with an obstacle 101 or a wall 100 (see FIG. 11). The robot 10 then turns and continues to travel. A sample movement path is shown in FIG. 11 as line 48.


The algorithm for random bounce behavior is set forth in FIG. 10. The robot 10 continues its forward movement (step 401) until a bump sensor 12 and/or 13 is activated (step 410). The robot 10 then calculates an acceptable range of new directions based on a determination of which bump sensor or sensors have been activated (step 420). The new heading within that acceptable range, such as 90 to 270 degrees relative to the object the robot encountered, is then chosen by a random calculation. The angle of the object the robot has bumped is determined as described above using the timing between the right and left bump sensors. The robot then turns to its new heading. In a preferred embodiment, the turn is either clockwise or counterclockwise depending on which direction requires the least movement to achieve the new heading. In other embodiments, the turn is accompanied by movement forward in order to increase the robot's coverage efficiency.


The statistics of the heading choice made by the robot can be distributed uniformly across the allowed headings, i.e., there is an equal chance of any heading within the acceptable range. Alternatively, the statistics can be based on a Gaussian or other distribution designed to preferentially drive the robot perpendicularly away from a wall.
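The two heading distributions can be sketched together; the 40-degree standard deviation for the Gaussian case is an assumed value chosen to bias headings toward 180 degrees (straight back off the wall):

```python
import random

def pick_bounce_heading(obstacle_angle, rng=random, gaussian=False):
    """Choose a new heading after a bump (FIG. 10 sketch).

    The offset from the encountered object is drawn from 90-270
    degrees: uniformly by default, or from a Gaussian centered at
    180 degrees (clamped to the acceptable range) when gaussian=True.
    """
    if gaussian:
        offset = min(max(rng.gauss(180.0, 40.0), 90.0), 270.0)
    else:
        offset = rng.uniform(90.0, 270.0)
    return (obstacle_angle + offset) % 360.0
```

Centering the Gaussian at 180 degrees preferentially drives the robot perpendicularly away from the wall, as the text describes.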


In other embodiments, the robot could change directions at random or predetermined times and not based upon external sensor activity. Alternatively, the robot could continuously make small angle corrections based on long-range sensors to avoid even contacting an object and thereby cover the surface area with curved paths.


In a preferred embodiment, the robot stays in room-cleaning mode until a certain number of bounce interactions are reached, usually between 6 and 13.


2. Escape Behaviors


There are several situations the robot may encounter while trying to cover an area that prevent or impede it from covering all of the area efficiently. A general class of sensors and behaviors called escape behaviors are designed to get the robot out of these situations, or in extreme cases to shut the robot off if it is determined it cannot escape. In order to decide whether to give an escape behavior priority among the various behaviors on the robot, the robot determines the following: (1) is an escape behavior needed; (2) if yes, which escape behavior is warranted?


By way of example, the following situations illustrate situations where an escape behavior is needed for an indoor cleaning robot and an appropriate behavior to run:


(i) Situation 1. The robot detects a situation where it might get stuck—for example, a high spot in a carpet or near a lamp base that acts like a ramp for the robot. The robot performs small “panic” turn behaviors to get out of the situation;


(ii) Situation 2. The robot is physically stuck—for example, the robot is wedged under a couch or against a wall, tangled in cords or carpet tassels, or stuck on a pile of electrical cords with its wheels spinning. The robot performs large panic turn behaviors and turns off relevant motors to escape from the obstruction;


(iii) Situation 3. The robot is in a small, confined area—for example, the robot is between the legs of a chair or in the open area under a dresser, or in a small area created by placing a lamp close to the corner of a room. The robot edge follows using its bumper and/or performs panic turn behaviors to escape from the area; and


(iv) Situation 4. The robot has been stuck and cannot free itself—for example, the robot is in one of the cases in category (ii), above, and has not been able to free itself with any of its panic behaviors. In this case, the robot stops operation and signals to the user for help. This preserves battery life and prevents damage to floors or furniture.


In order to detect the need for each escape situation, various sensors are used. For example:


(i) Situation 1. (a) When the brush or side-brush current rises above a threshold, the voltage applied to the relevant motor is reduced. Whenever this is happening, a stall rate variable is increased. When the current is below the threshold, the stall rate is reduced. If the stall level rises above a low threshold and the slope of the rate is positive, the robot performs small panic turn behaviors. It only repeats these small panic turn behaviors when the level has returned to zero and risen to the threshold again. (b) Likewise, there is a wheel drop level variable which is increased when a wheel drop event is detected and is reduced steadily over time. When a wheel drop event is detected and the wheel drop level is above a threshold (meaning there have been several wheel drops recently), the robot performs small or large panic turn behaviors depending on the wheel drop level.
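The stall-rate bookkeeping described in (a) can be sketched as a single update; the step size is illustrative, since the text does not specify increment or decay amounts:

```python
def update_stall_rate(rate, current, threshold, step=1):
    """One update of the stall rate variable from Situation 1(a):
    increase while the motor current is over its threshold, decay
    toward zero (never below it) otherwise."""
    if current > threshold:
        return rate + step
    return max(rate - step, 0)
```

Comparing the resulting rate against low and high thresholds, together with its slope, is what later drives the small and large panic turns.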


(ii) Situation 2. (a) When the brush stall rate rises above a high threshold and the slope is positive, the robot turns off the brush for 13 seconds and performs large panic turn behaviors at 1, 4, and 7 seconds. At the end of the 13 seconds, the brush is turned back on. (b) When the drive stall rate rises above a medium threshold and the slope is positive, the robot performs large panic turn behaviors continuously. (c) When the drive stall rate rises above a high threshold, the robot turns off all of the motors for 15 seconds. At the end of the 15 seconds, the motors are turned back on. (d) When the bumper of the robot is held in constantly for 5 seconds (as in a side wedging situation), the robot performs a large panic turn behavior. It repeats the panic turn behavior every 5 seconds until the bumper is released. (e) When the robot has gotten no bumps for a distance of 20 feet, it assumes that it might be stuck with its wheels spinning. To free itself, it performs a spiral. If it has still not gotten a bump within 10 feet after the end of the spiral, it performs a large panic turn behavior. It continues this every 10 feet until it gets a bump.


(iii) Situation 3. (a) When the average distance between bumps falls below a low threshold, the robot performs edge following using its bumper to try to escape from the confined area. (b) When the average distance between bumps falls below a very low threshold, the robot performs large panic turn behaviors to orient it so that it may better be able to escape from the confined area.


(iv) Situation 4. (a) When the brush has stalled and been turned off several times recently and the brush stall rate is high and the slope is positive, the robot shuts off. (b) When the drive has stalled and the motors turned off several times recently and the drive stall rate is high and the slope is positive, the robot shuts off. (c) When any of the wheels are dropped continuously for greater than 2 seconds, the robot shuts off. (d) When many wheel drop events occur in a short time, the robot shuts off. (e) When any of the cliff sensors sense a cliff continuously for 10 seconds, the robot shuts off. (f) When the bump sensor is constantly depressed for a certain amount of time, for example 10 seconds, it is likely that the robot is wedged, and the robot shuts off.


As a descriptive example, FIGS. 12A & 12B illustrate the analysis used in a preferred embodiment for identifying the need for an escape behavior relative to a stalled brush motor, as described above in Situations 1, 2 and 4. Each time the brush current exceeds a given limit for the brush motor (step 402), a rate register is incremented by 1 (step 404); if the current does not exceed the limit, the rate register is decremented by 1 (step 406). A separate slope register stores the rate values from a recent time period, such as the last 120 cycles. If the rate is above 600 (where 600 corresponds to one second of constant stall) (step 414) and the slope is positive (step 416), then the robot will run an escape behavior (step 420) if the escape behavior is enabled (step 418). The escape behaviors are disabled after running (step 428) until the rate has returned to zero (step 422), been re-enabled (step 424), and risen to 600 again. This is done to avoid the escape behavior being triggered constantly at rates above 600.
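The rate-register logic of FIGS. 12A & 12B can be sketched as a small state machine. The 600 and 2400 thresholds come from the text; the slope test here is a simple cycle-to-cycle difference, an assumed simplification of the 120-cycle slope register:

```python
class BrushStallMonitor:
    """Sketch of the brush-stall escape logic of FIGS. 12A & 12B."""

    LOW, HIGH = 600, 2400   # ~1 s and ~4 s of constant stall at 600 cycles/s

    def __init__(self):
        self.rate = 0
        self.prev_rate = 0
        self.escape_enabled = True

    def step(self, over_limit):
        """One control cycle. Returns 'special' (brush off plus timed
        panic turns), 'escape' (ordinary escape behavior), or None."""
        self.prev_rate = self.rate
        self.rate = self.rate + 1 if over_limit else max(self.rate - 1, 0)
        slope_positive = self.rate > self.prev_rate
        if self.rate == 0:
            self.escape_enabled = True       # re-arm once fully recovered
        if self.rate > self.HIGH and slope_positive:
            return "special"
        if self.rate > self.LOW and slope_positive and self.escape_enabled:
            self.escape_enabled = False      # fire once per excursion
            return "escape"
        return None
```

The re-arm-at-zero rule is what keeps the escape behavior from being retriggered on every cycle while the rate sits above 600.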


If, however, the rate is above 2400 (step 410) and the slope is positive (step 412), the robot will run a special set of escape behaviors shown in FIG. 12B. In a preferred embodiment, the brush motor will shut off (step 430), the “level” is incremented by a predetermined amount (50 to 90) (step 430), the stall time is set (step 430), and a panic behavior (step 452) is performed at 1 second (step 445), 4 seconds (step 450) and 7 seconds (step 455) since the brush shut off. The control system then restarts the brush at 13 seconds (steps 440 & 442). Level is decremented by 1 every second (step 444). If level reaches a maximum threshold (step 435), the robot ceases all operation (step 437). In addition, the robot may take additional actions when certain stalls are detected, such as limiting the voltage to the motor to prevent damage to the motor.


A preferred embodiment of the robot has four escape behaviors: TURN, EDGE, WHEEL DROP and SLOW.


TURN. The robot turns in place in a random direction, starting at a higher velocity (approximately twice its normal turning velocity) and decreasing to a lower velocity (approximately one-half of its normal turning velocity). Varying the velocity may aid the robot in escaping from various situations. The angle that the robot should turn can be random or a function of the degree of escape needed or both. In a preferred embodiment, in low panic situations the robot turns anywhere from 45 to 90 degrees, and in high panic situations the robot turns anywhere from 90 to 270 degrees.
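A minimal sketch of the TURN angle selection; the direction encoding is an assumed convention, and only the angle ranges are taken from the text:

```python
import random

def turn_escape(panic_high, rng=random):
    """TURN escape sketch: pick a random direction (+1 counterclockwise,
    -1 clockwise, an assumed encoding) and a random angle in degrees.
    Low-panic situations use 45-90 degrees; high-panic use 90-270."""
    direction = rng.choice([-1, 1])
    angle = rng.uniform(90, 270) if panic_high else rng.uniform(45, 90)
    return direction, angle
```

The velocity ramp (roughly 2x normal turning speed decaying to 0.5x) would be applied while executing the chosen turn.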


EDGE. The robot follows the edge using its bump sensor until (a) the robot turns 60 degrees without a bump or (b) the robot cumulatively has turned more than 170 degrees since the EDGE behavior initiated. The EDGE behavior may be useful if the average bump distance is low (but not so low as to cause a panic behavior). The EDGE behavior allows the robot to fit through the smallest openings physically possible for the robot and so can allow the robot to escape from confined areas.


WHEEL DROP. The robot back-drives its wheels briefly, then stops them. The back-driving of the wheels helps to minimize false positive wheel drops by giving the wheels a small kick in the opposite direction. If the wheel drop is gone within 2 seconds, the robot continues normal operation.


SLOW. If a wheel drop or a cliff detector goes off, the robot slows down to a speed of 0.235 m/s (or 77% of its normal speed) for a distance of 0.5 m and then ramps back up to its normal speed.


In addition to the coverage behaviors and the escape behaviors, the robot also might contain additional behaviors related to safety or usability. For example, if a cliff is detected for more than a predetermined amount of time, the robot may shut off. When a cliff is first detected, a cliff avoidance response behavior takes immediate precedence over all other behaviors, rotating the robot away from the cliff until the robot no longer senses the cliff. In a preferred embodiment, the cliff detection event does not cause a change in operational modes. In other embodiments, the robot could use an algorithm similar to the wall-following behavior to allow for cliff following.


The individual operation of the three operational modes has been described above; we now turn to the preferred mode of switching between the various modes.


In order to achieve optimal coverage and cleaning efficiency, a preferred embodiment uses a control program that gives priority to various coverage behaviors. (Escape behaviors, if needed, are always given a higher priority.) For example, the robot 10 may use the wall following mode for a specified or random time period and then switch operational modes to room cleaning. By switching between operational modes, the robotic device of the present invention is able to increase coverage, cleaning efficiency and perceived effectiveness.


By way of example, FIGS. 13A & 13B show a mobile robot 10 in a “dog bone” shaped environment in which two rooms 115 & 116 of roughly equal dimensions are connected by a narrow passageway 105. (This example illustrates the robot diffusion problem discussed earlier.) This arrangement is a simplified version of typical domestic environments, where the “dog bone” may be generated by the arrangements of obstacles within the room. In FIG. 13A, the path of robot 10 is traced as line 54 as robot 10 operates in random bounce mode. The robot 10 is unable to move from room 116 into room 115 during the limited run because the robot's random behavior did not happen to lead the robot through passageway 105. This method leaves the coverage far less than optimal and the cleaning rate decreased due to the number of times the robot 10 crosses its own path.



FIG. 13B shows the movement of a preferred embodiment of robot 10, whereby the robot cycles between BOUNCE and WALL FOLLOWING behaviors. As the robot follows path 99, each time the robot 10 encounters a wall 100, the robot follows the wall for a distance equal to twice the robot's diameter. The portions of the path 99 in which the robot 10 operates in wall following mode are labeled 51. This method provides greatly increased coverage, along with attendant increases in cleaning rate and perceived effectiveness.


Finally, a preferred embodiment of the present invention is detailed in FIG. 14, in which all three operational modes are used. In a preferred embodiment, the device 10 begins in spiral mode (movement line 45). If a reflective spiral pattern is used, the device continues in spiral mode until a predetermined or random number of reflective events has occurred. If a standard spiral is used (as shown in FIG. 14), the device should continue until any bump sensor event. In a preferred embodiment, the device immediately enters wall following mode after the triggering event.


In a preferred embodiment, the device then switches between wall following mode (movement lines 51) and random bounce modes (movement lines 48) based on bump sensor events or the completion of the wall following algorithm. In one embodiment, the device does not return to spiral mode; in other embodiments, however, the device can enter spiral mode based on a predetermined or random event.


In a preferred embodiment, the robot keeps a record of the average distance traveled between bumps. The robot then calculates an average bump distance (ABD) using the following formula: ABD = (¾ × ABD) + (¼ × most recent distance between bumps). If the ABD is above a predetermined threshold, the robot will again give priority to the SPIRAL behavior. In still other embodiments, the robot may have a minimum number of bump events before the SPIRAL behavior will again be given priority. In other embodiments, the robot may enter SPIRAL behavior if it travels a maximum distance, for example 20 feet, without a bump event.
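The ABD formula above is an exponentially weighted moving average and can be sketched in one line:

```python
def update_abd(abd, latest_gap):
    """Average bump distance update from the text:
    ABD <- 3/4 * previous ABD + 1/4 * most recent distance between bumps.
    New samples therefore carry one-quarter weight, so the estimate
    responds to clutter changes while smoothing out single outliers."""
    return 0.75 * abd + 0.25 * latest_gap
```

For example, with a previous ABD of 4.0 m, a new 8.0 m gap between bumps raises the average to 5.0 m, which might push the ABD over the threshold for re-entering SPIRAL behavior.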


In addition, the robot can also have conditions upon which to stop all operations. For example, for a given room size, which can be manually selected, a minimum and maximum run time are set and a minimum total distance is selected. When the minimum time and the minimum distance have been reached the robot shuts off. Likewise, if the maximum time has been reached, the robot shuts off.


Of course, a manual control for selecting between operational modes can also be used. For example, a remote control could be used to change or influence operational modes or behaviors. Likewise, a switch mounted on the shell itself could be used to set the operation mode or the switching between modes. For instance, a switch could be used to set the level of clutter in a room, to allow the robot a more appropriate coverage algorithm with limited sensing ability.


One of skill in the art will recognize that portions of the instant invention can be used in autonomous vehicles for a variety of purposes besides cleaning. The scope of the invention should be determined by the appended claims and their legal equivalents, rather than by the examples given.

Claims
  • 1. A mobile robot operable to move on a surface in a room, the mobile robot comprising: a shell;a chassis including at least two wheels;at least one motor connected to the at least two wheels for moving the mobile robot on the surface;a controller operable to control the at least one motor to move the mobile robot on the surface in accordance with a first mode and a second mode;a cleaner operable to clean the surface as the mobile robot moves on the surface;a forward obstacle sensor approximately as wide as the shell and operable to detect obstacles forward of the shell along a direction of travel; anda following sensor directed at a lateral position outside the shell and forward of the wheels, and operable to detect a wall in the room as the mobile robot moves on the surface,wherein, in the first mode, the mobile robot moves generally adjacent to and along a portion of the wall in response to detection of the wall by the following sensor, the portion of the wall comprising at least one of an inside corner defining an angle less than 180° and an outside corner defining an angle between 180° and 360°, wherein to negotiate the inside corner, the controller is operable to control the mobile robot according to a corner-handling routine comprising: detecting a forward obstacle by the forward obstacle sensor,turning the mobile robot in place to scan the following sensor away from a followed obstacle;while turning, with the following sensor, losing detection of the followed obstacle, acquiring detection of the forward obstacle, and subsequently losing detection of the forward obstacle, andin response to the following sensor losing detection of the forward obstacle, ceasing the turning of the mobile robot;in the second mode, the mobile robot moves away from the wall, andthe controller is operable to control the mobile robot to move in accordance with the second mode after controlling the mobile robot to move in accordance with the first mode for a distance.
  • 2. The mobile robot according to claim 1, wherein the controller is operable to control the mobile robot to move in accordance with the second mode after controlling the mobile robot to move in accordance with the first mode for the distance and encountering an obstacle.
  • 3. The mobile robot according to claim 1, wherein the controller is further operable to control the mobile robot to change directions on the surface in accordance with one of predetermined times and random times during the second mode.
  • 4. The mobile robot according to claim 3, wherein the controller is operable to control the mobile robot to move in accordance with the second mode until a number of second interactions is reached.
  • 5. The mobile robot according to claim 3, wherein the controller is operable to control the mobile robot to cycle between the first mode and the second mode.
  • 6. The mobile robot according to claim 1, wherein the controller is further operable to control the mobile robot to move in accordance with a spot cleaning mode in which the mobile robot moves only within a predetermined area of the room.
  • 7. The mobile robot according to claim 6, wherein, in the spot cleaning mode, the mobile robot moves from a center of the predetermined area to a periphery of the predetermined area by successively moving around the center in increasing paths.
  • 8. The mobile robot according to claim 7, further comprising: a side brush disposed on a brush side of the mobile robot,wherein, in the spot cleaning mode, the mobile robot moves around the center of the predetermined area with the brush side toward the periphery of the predetermined area.
  • 9. The mobile robot according to claim 7, further comprising: a side brush disposed on a brush side of the mobile robot,wherein, in the spot cleaning mode, the mobile robot moves around the center of the predetermined area with the brush side toward the center of the predetermined area.
  • 10. The mobile robot according to claim 1, further comprising: wherein the controller is further operable to control the mobile robot to move in accordance with an obstacle mode in which the mobile robot moves to avoid physically contacting the obstacle in response to detection of the presence of the obstacle.
  • 11. The mobile robot according to claim 10, wherein, in the obstacle mode, the mobile robot moves generally adjacent to and along a periphery of the obstacle.
  • 12. The mobile robot according to claim 1, further comprising: a cliff sensor operable to detect a cliff in the room, wherein the controller is further operable to control the mobile robot to move in accordance with a cliff mode in which the mobile robot moves away from the cliff in response to detection of the cliff by the cliff sensor.
  • 13. The mobile robot according to claim 1, wherein detecting a forward obstacle by the obstacle sensor comprises detecting contact between the robot and the forward obstacle.
  • 14. The mobile robot according to claim 1, wherein turning the mobile robot in place comprises turning the mobile robot through at least a minimum angle based on an angle-of-encounter between the forward obstacle and the mobile robot.
  • 15. The mobile robot according to claim 14, wherein the minimum angle is at least one of 14° and 20°.
  • 16. The mobile robot according to claim 1, wherein turning the mobile robot in place comprises turning the mobile robot through at most a maximum angle based on an angle-of-encounter between the forward obstacle and the mobile robot.
  • 17. The mobile robot according to claim 16, wherein the maximum angle is at least one of 19° and 44°.
  • 18. The mobile robot according to claim 1, wherein the corner-handling routine further comprises: after the turning has ceased, controlling the mobile robot to move generally adjacent to and along a portion of the forward obstacle.
  • 19. A mobile robot operable to move on a surface in a room, the mobile robot comprising: a shell; a chassis including at least two wheels; at least one motor connected to the at least two wheels for moving the mobile robot on the surface; a controller operable to control the at least one motor to move the mobile robot on the surface in accordance with a plurality of modes; a cleaner operable to clean the surface as the mobile robot moves on the surface; a following sensor directed at a lateral position outside the shell and forward of the wheels, and operable to detect a wall in the room as the mobile robot moves on the surface; a cliff sensor operable to detect a cliff in the room as the mobile robot moves on the surface; and a forward obstacle sensor approximately as wide as the shell and operable to detect obstacles forward of the shell along a direction of travel, wherein the controller is operable to control the mobile robot to move in accordance with one of the plurality of modes after controlling the mobile robot to move generally adjacent to and along a portion of the wall including at least one of an inside corner defining an angle less than 180° and an outside corner defining an angle between 180° and 360°; and wherein to negotiate the inside corner, the controller is operable to control the mobile robot according to a corner-handling routine comprising: detecting a forward obstacle by the forward obstacle sensor; turning the mobile robot in place to scan the following sensor away from a followed obstacle; while turning, with the following sensor, losing detection of the followed obstacle, acquiring detection of the forward obstacle, and subsequently losing detection of the forward obstacle; and in response to the following sensor losing detection of the forward obstacle, ceasing the turning of the mobile robot.
  • 20. The mobile robot according to claim 19, wherein the controller is operable to control the mobile robot to move in accordance with the plurality of modes in response to detection of the wall by the following sensor, detection of the cliff by the cliff sensor, and detection of the presence of the obstacle by the obstacle sensor.
  • 21. The mobile robot according to claim 19, wherein the plurality of modes includes a wall following mode in which the mobile robot moves generally adjacent to and along the wall in response to detection of the wall by the following sensor.
  • 22. The mobile robot according to claim 21, wherein the plurality of modes includes a bounce mode in which the mobile robot moves away from the wall after the mobile robot moves in accordance with the wall following mode for one of a predetermined time and distance.
  • 23. The mobile robot according to claim 22, wherein the controller is operable to control the mobile robot to change directions on the surface in accordance with one of predetermined times and random times during the bounce mode.
  • 24. The mobile robot according to claim 23, wherein the controller is operable to control the mobile robot to cycle between the wall following mode and the bounce mode.
  • 25. The mobile robot according to claim 19, wherein the plurality of modes includes a spot cleaning mode in which the mobile robot moves only within a predetermined area of the room from a center of the predetermined area to a periphery of the predetermined area by successively moving around the center in increasing paths.
  • 26. The mobile robot according to claim 19, wherein the plurality of modes includes an obstacle mode in which the mobile robot moves to avoid physically contacting the obstacle in response to detection of the presence of the obstacle by the obstacle sensor.
  • 27. The mobile robot according to claim 26, wherein, in the obstacle mode, the mobile robot moves generally adjacent to and along a periphery of the obstacle.
  • 28. The mobile robot according to claim 19, wherein the plurality of modes includes a cliff mode in which the mobile robot moves away from the cliff in response to detection of the cliff by the cliff sensor.
  • 29. The mobile robot according to claim 19, wherein detecting a forward obstacle by the obstacle sensor comprises detecting contact between the robot and the forward obstacle.
  • 30. The mobile robot according to claim 19, wherein turning the mobile robot in place comprises turning the mobile robot through at least a minimum angle based on an angle-of-encounter between the forward obstacle and the mobile robot.
  • 31. The mobile robot according to claim 30, wherein the minimum angle is at least one of 14° and 20°.
  • 32. The mobile robot according to claim 19, wherein turning the mobile robot in place comprises turning the mobile robot through at most a maximum angle based on an angle-of-encounter between the forward obstacle and the mobile robot.
  • 33. The mobile robot according to claim 32, wherein the maximum angle is at least one of 19° and 44°.
  • 34. The mobile robot according to claim 19, wherein the corner-handling routine further comprises: after the turning has ceased, controlling the mobile robot to move generally adjacent to and along a portion of the forward obstacle.
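The inside-corner-handling routine recited in claims 1 and 19 — turn in place until the following sensor loses the followed wall, acquires the forward obstacle, and then loses it again — can be sketched as a small state machine. This is an illustrative reconstruction only, not the patented implementation; the sensor-reading representation and all names are hypothetical:

```python
from enum import Enum, auto

class CornerPhase(Enum):
    """Phases of the inside-corner routine from claims 1 and 19."""
    TRACKING_FOLLOWED = auto()  # following sensor still sees the followed wall
    TRACKING_FORWARD = auto()   # sensor has swept onto the forward obstacle
    DONE = auto()               # forward obstacle lost -> cease turning

def corner_turn_steps(sensor_readings):
    """Simulate turning in place, one rotation increment per reading.

    Each reading is a set of labels currently detected by the following
    sensor (e.g. {"followed"}, {"forward"}, or an empty set).  Returns
    the number of turn increments taken and the final phase.
    """
    phase = CornerPhase.TRACKING_FOLLOWED
    steps = 0
    for seen in sensor_readings:
        if phase is CornerPhase.TRACKING_FOLLOWED:
            # Wait until the followed wall is lost and the forward
            # obstacle is acquired by the scanning sensor.
            if "followed" not in seen and "forward" in seen:
                phase = CornerPhase.TRACKING_FORWARD
        elif phase is CornerPhase.TRACKING_FORWARD:
            # Losing the forward obstacle means the sensor has swept
            # past the corner: cease the turn.
            if "forward" not in seen:
                phase = CornerPhase.DONE
                break
        steps += 1  # one increment of in-place rotation
    return steps, phase
```

The key design point in the claimed routine is that the terminating condition is a sensor event (losing the forward obstacle) rather than a fixed rotation angle, so the turn adapts to the actual corner geometry.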
CROSS-REFERENCE TO RELATED APPLICATIONS

This application for U.S. patent is a continuation of U.S. patent application Ser. No. 11/671,305 filed Feb. 5, 2007, which is a continuation of U.S. patent application Ser. No. 10/839,374 filed May 5, 2004, now U.S. Pat. No. 7,173,391, which is a continuation of U.S. patent application Ser. No. 10/167,851 filed Jun. 12, 2002, now U.S. Pat. No. 6,809,490, which claims the benefit of U.S. Provisional Application No. 60/297,718 filed Jun. 12, 2001, the contents of all of which are expressly incorporated by reference herein in their entireties.

US Referenced Citations (409)
Number Name Date Kind
3457575 Bienek Jul 1969 A
3550714 Bellinger Dec 1970 A
3674316 De Brey Jul 1972 A
3744586 Leinauer Jul 1973 A
3937174 Haaga Feb 1976 A
4099284 Shinozaki et al. Jul 1978 A
4119900 Kremnitz Oct 1978 A
4196727 Verkaart et al. Apr 1980 A
4306329 Yokoi Dec 1981 A
4328545 Halsall et al. May 1982 A
4369543 Chen et al. Jan 1983 A
4513469 Godfrey et al. Apr 1985 A
4518437 Sommer May 1985 A
4556313 Miller et al. Dec 1985 A
4626995 Lofgren et al. Dec 1986 A
4628454 Ito Dec 1986 A
4662854 Fang May 1987 A
4674048 Okumura Jun 1987 A
4679152 Perdue Jul 1987 A
4696074 Cavalli Sep 1987 A
4700301 Dyke Oct 1987 A
4700427 Knepper Oct 1987 A
4716621 Zoni Jan 1988 A
4733430 Westergren Mar 1988 A
4733431 Martin Mar 1988 A
4756049 Uehara Jul 1988 A
4777416 George, II et al. Oct 1988 A
4782550 Jacobs Nov 1988 A
4796198 Boultinghouse et al. Jan 1989 A
4811228 Hyyppa Mar 1989 A
4815157 Tsuchiya Mar 1989 A
4851661 Everett, Jr. Jul 1989 A
4854000 Takimoto Aug 1989 A
4854006 Nishimura et al. Aug 1989 A
4887415 Martin Dec 1989 A
4893025 Lee Jan 1990 A
4901394 Nakamura et al. Feb 1990 A
4912643 Beirne Mar 1990 A
4918441 Bohman Apr 1990 A
4919224 Shyu et al. Apr 1990 A
4920605 Takashima May 1990 A
4933864 Evans et al. Jun 1990 A
4956891 Wulff Sep 1990 A
4962453 Pong et al. Oct 1990 A
4967862 Pong et al. Nov 1990 A
4974283 Holsten et al. Dec 1990 A
5002145 Wakaumi et al. Mar 1991 A
5012886 Jonas et al. May 1991 A
5023788 Kitazume et al. Jun 1991 A
5032775 Mizuno et al. Jul 1991 A
5086535 Grossmeyer et al. Feb 1992 A
5093955 Blehert et al. Mar 1992 A
5105502 Takashima Apr 1992 A
5109566 Kobayashi et al. May 1992 A
5115538 Cochran et al. May 1992 A
5136750 Takashima et al. Aug 1992 A
5142985 Stearns et al. Sep 1992 A
5144715 Matsuyo et al. Sep 1992 A
5152028 Hirano Oct 1992 A
5163202 Kawakami et al. Nov 1992 A
5165064 Mattaboni Nov 1992 A
5182833 Yamaguchi et al. Feb 1993 A
5204814 Noonan et al. Apr 1993 A
5208521 Aoyama May 1993 A
5216777 Moro et al. Jun 1993 A
5233682 Abe et al. Aug 1993 A
5239720 Wood et al. Aug 1993 A
5251358 Moro et al. Oct 1993 A
5261139 Lewis Nov 1993 A
5276618 Everett, Jr. Jan 1994 A
5279672 Betker et al. Jan 1994 A
5284522 Kobayashi et al. Feb 1994 A
5293955 Lee Mar 1994 A
5303448 Hennessey et al. Apr 1994 A
5307273 Oh et al. Apr 1994 A
5309592 Hiratsuka May 1994 A
5319828 Waldhauser et al. Jun 1994 A
5321614 Ashworth Jun 1994 A
5324948 Dudar et al. Jun 1994 A
5341540 Soupert et al. Aug 1994 A
5353224 Lee et al. Oct 1994 A
5369347 Yoo Nov 1994 A
5399951 Lavallee et al. Mar 1995 A
5400244 Watanabe et al. Mar 1995 A
5410479 Coker Apr 1995 A
5440216 Kim Aug 1995 A
5444965 Colens Aug 1995 A
5446356 Kim Aug 1995 A
5454129 Kell Oct 1995 A
5455982 Armstrong et al. Oct 1995 A
5465525 Mifune et al. Nov 1995 A
5467273 Faibish et al. Nov 1995 A
5497529 Boesi Mar 1996 A
5507067 Hoekstra et al. Apr 1996 A
5515572 Hoekstra et al. May 1996 A
5534762 Kim Jul 1996 A
5537017 Feiten et al. Jul 1996 A
5539953 Kurz Jul 1996 A
5542146 Hoekstra et al. Aug 1996 A
5548511 Bancroft Aug 1996 A
5553349 Kilstrom et al. Sep 1996 A
5555587 Guha Sep 1996 A
5560077 Crotchett Oct 1996 A
5568589 Hwang Oct 1996 A
5610488 Miyazawa Mar 1997 A
5611106 Wulff Mar 1997 A
5611108 Knowlton et al. Mar 1997 A
5613261 Kawakami et al. Mar 1997 A
5621291 Lee Apr 1997 A
5622236 Azumi et al. Apr 1997 A
5634237 Paranjpe Jun 1997 A
5634239 Tuvin et al. Jun 1997 A
5636402 Kubo et al. Jun 1997 A
5650702 Azumi Jul 1997 A
5652489 Kawakami Jul 1997 A
5682313 Edlund et al. Oct 1997 A
5682839 Grimsley et al. Nov 1997 A
5696675 Nakamura et al. Dec 1997 A
5709007 Chiang Jan 1998 A
5714119 Kawagoe et al. Feb 1998 A
5717484 Hamaguchi et al. Feb 1998 A
5720077 Nakamura et al. Feb 1998 A
5735959 Kubo et al. Apr 1998 A
5761762 Kubo et al. Jun 1998 A
5781960 Kilstrom et al. Jul 1998 A
5787545 Colens Aug 1998 A
5794297 Muta Aug 1998 A
5812267 Everett, Jr. et al. Sep 1998 A
5815880 Nakanishi Oct 1998 A
5819008 Asama et al. Oct 1998 A
5820821 Kawagoe et al. Oct 1998 A
5825981 Matsuda Oct 1998 A
5839156 Park et al. Nov 1998 A
5841259 Kim et al. Nov 1998 A
5867800 Leif Feb 1999 A
5869910 Colens Feb 1999 A
5894621 Kubo Apr 1999 A
5903124 Kawakami May 1999 A
5926909 McGee Jul 1999 A
5935179 Kleiner et al. Aug 1999 A
5940927 Haegermarck et al. Aug 1999 A
5940930 Oh et al. Aug 1999 A
5942869 Katou et al. Aug 1999 A
5943730 Boomgaarden Aug 1999 A
5943733 Tagliaferri Aug 1999 A
5947225 Kawakami et al. Sep 1999 A
5959423 Nakanishi et al. Sep 1999 A
5974348 Rocks Oct 1999 A
5987383 Keller et al. Nov 1999 A
5991951 Kubo et al. Nov 1999 A
5995884 Allen et al. Nov 1999 A
5998953 Nakamura et al. Dec 1999 A
6023814 Imamura Feb 2000 A
6025687 Himeda et al. Feb 2000 A
6038501 Kawakami Mar 2000 A
6041471 Charky et al. Mar 2000 A
6070290 Schwarze et al. Jun 2000 A
6076025 Ueno et al. Jun 2000 A
6076226 Reed Jun 2000 A
6108076 Hanseder Aug 2000 A
6112143 Allen et al. Aug 2000 A
6112996 Matsuo Sep 2000 A
6119057 Kawagoe Sep 2000 A
6124694 Bancroft et al. Sep 2000 A
6138063 Himeda Oct 2000 A
6142252 Kinto et al. Nov 2000 A
6145145 Besel Nov 2000 A
6226830 Hendriks et al. May 2001 B1
6240342 Fiegert et al. May 2001 B1
6255793 Peless et al. Jul 2001 B1
6259979 Holmquist Jul 2001 B1
6285930 Dickson et al. Sep 2001 B1
6300737 Bergvall et al. Oct 2001 B1
6321515 Colens Nov 2001 B1
6327741 Reed Dec 2001 B1
6338013 Ruffner Jan 2002 B1
6339735 Peless et al. Jan 2002 B1
6370453 Sommer Apr 2002 B2
6374155 Wallach et al. Apr 2002 B1
6385515 Dickson et al. May 2002 B1
6389329 Colens May 2002 B1
6408226 Byrne et al. Jun 2002 B1
6430471 Kintou et al. Aug 2002 B1
6438456 Feddema et al. Aug 2002 B1
6442476 Poropat Aug 2002 B1
6443509 Levin et al. Sep 2002 B1
6444003 Sutcliffe Sep 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6463368 Feiten et al. Oct 2002 B1
6465982 Bergvall et al. Oct 2002 B1
6481515 Kirkpatrick et al. Nov 2002 B1
6493612 Bisset et al. Dec 2002 B1
6493613 Peless et al. Dec 2002 B2
6496754 Song et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6507773 Parker et al. Jan 2003 B2
6525509 Petersson et al. Feb 2003 B1
6530102 Pierce et al. Mar 2003 B1
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6540607 Mokris et al. Apr 2003 B2
6548982 Papanikolopoulos et al. Apr 2003 B1
6571415 Gerber et al. Jun 2003 B2
6574536 Kawagoe et al. Jun 2003 B1
6580246 Jacobs Jun 2003 B2
6584376 Van Kommer Jun 2003 B1
6586908 Petersson et al. Jul 2003 B2
6590222 Bisset et al. Jul 2003 B1
6594551 McKinney, Jr. Jul 2003 B2
6594844 Jones Jul 2003 B2
6601265 Burlington Aug 2003 B1
6604022 Parker Aug 2003 B2
6605156 Clark et al. Aug 2003 B1
6611120 Song et al. Aug 2003 B2
6611734 Parker et al. Aug 2003 B2
6611738 Ruffner Aug 2003 B2
6615108 Peless et al. Sep 2003 B1
6625843 Kim et al. Sep 2003 B2
6629028 Paromtchik et al. Sep 2003 B2
6633150 Wallach et al. Oct 2003 B1
6637546 Wang Oct 2003 B1
6658693 Reed, Jr. Dec 2003 B1
6661239 Ozick Dec 2003 B1
6671592 Bisset et al. Dec 2003 B1
6690134 Jones et al. Feb 2004 B1
6741054 Koselka et al. May 2004 B2
6741364 Lange et al. May 2004 B2
6748297 Song et al. Jun 2004 B2
6764373 Osawa et al. Jul 2004 B1
6774596 Bisset Aug 2004 B1
6781338 Jones et al. Aug 2004 B2
6809490 Jones et al. Oct 2004 B2
6810305 Kirkpatrick, Jr. Oct 2004 B2
6830120 Yashima et al. Dec 2004 B1
6841963 Song et al. Jan 2005 B2
6845297 Allard Jan 2005 B2
6859010 Jeon et al. Feb 2005 B2
6865447 Lau et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6883201 Jones et al. Apr 2005 B2
6901624 Mori et al. Jun 2005 B2
6925679 Wallach et al. Aug 2005 B2
6929548 Wang Aug 2005 B2
D510066 Hickey et al. Sep 2005 S
6938298 Aasen Sep 2005 B2
6940291 Ozick Sep 2005 B1
6941199 Bottomley et al. Sep 2005 B1
6956348 Landry et al. Oct 2005 B2
6957712 Song et al. Oct 2005 B2
6965209 Jones et al. Nov 2005 B2
6971140 Kim Dec 2005 B2
6999850 McDonald Feb 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7024280 Parker et al. Apr 2006 B2
7031805 Lee et al. Apr 2006 B2
7053578 Diehl et al. May 2006 B2
7055210 Keppler et al. Jun 2006 B2
7059012 Song et al. Jun 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7079923 Abramson et al. Jul 2006 B2
7085624 Aldred et al. Aug 2006 B2
7113847 Chmura et al. Sep 2006 B2
7133746 Abramson et al. Nov 2006 B2
7155308 Jones Dec 2006 B2
7171285 Kim et al. Jan 2007 B2
7173391 Jones et al. Feb 2007 B2
7188000 Chiappetta et al. Mar 2007 B2
7201786 Wegelin et al. Apr 2007 B2
7206677 Hulden Apr 2007 B2
7225500 Diehl et al. Jun 2007 B2
7246405 Yan Jul 2007 B2
7248951 Hulden Jul 2007 B2
7288912 Landry et al. Oct 2007 B2
7318248 Yan Jan 2008 B1
7324870 Lee Jan 2008 B2
7346428 Huffman et al. Mar 2008 B1
7352153 Yan Apr 2008 B2
7359766 Jeon et al. Apr 2008 B2
7360277 Moshenrose et al. Apr 2008 B2
7388343 Jones et al. Jun 2008 B2
7389156 Ziegler et al. Jun 2008 B2
7389166 Harwig et al. Jun 2008 B2
7408157 Yan Aug 2008 B2
7418762 Arai et al. Sep 2008 B2
7429843 Jones et al. Sep 2008 B2
7430455 Casey et al. Sep 2008 B2
7444206 Abramson et al. Oct 2008 B2
7459871 Landry et al. Dec 2008 B2
7474941 Kim et al. Jan 2009 B2
7503096 Lin Mar 2009 B2
7515991 Egawa et al. Apr 2009 B2
7555363 Augenbraun et al. Jun 2009 B2
7568259 Yan Aug 2009 B2
7578020 Jaworski et al. Aug 2009 B2
7600521 Woo Oct 2009 B2
7603744 Reindle Oct 2009 B2
7617557 Reindle Nov 2009 B2
7647144 Haegermarck Jan 2010 B2
7650666 Jang Jan 2010 B2
7660650 Kawagoe et al. Feb 2010 B2
7663333 Jones et al. Feb 2010 B2
7693605 Park Apr 2010 B2
7720554 DiBernardo et al. May 2010 B2
7801645 Taylor et al. Sep 2010 B2
7805220 Taylor et al. Sep 2010 B2
7849555 Hahm et al. Dec 2010 B2
7920941 Park et al. Apr 2011 B2
7937800 Yan May 2011 B2
7957836 Myeong et al. Jun 2011 B2
20010004719 Sommer Jun 2001 A1
20010047231 Peless et al. Nov 2001 A1
20020011813 Koselka et al. Jan 2002 A1
20020016649 Jones Feb 2002 A1
20020049530 Poropat Apr 2002 A1
20020120364 Colens Aug 2002 A1
20020156556 Ruffner Oct 2002 A1
20020173877 Zweig Nov 2002 A1
20030019071 Field et al. Jan 2003 A1
20030023356 Keable Jan 2003 A1
20030025472 Jones et al. Feb 2003 A1
20030030399 Jacobs Feb 2003 A1
20030060928 Abramson et al. Mar 2003 A1
20030120389 Abramson et al. Jun 2003 A1
20030137268 Papanikolopoulos et al. Jul 2003 A1
20030192144 Song et al. Oct 2003 A1
20030216834 Allard Nov 2003 A1
20030229421 Chmura et al. Dec 2003 A1
20030233177 Johnson et al. Dec 2003 A1
20040020000 Jones Feb 2004 A1
20040030448 Solomon Feb 2004 A1
20040030449 Solomon Feb 2004 A1
20040030450 Solomon Feb 2004 A1
20040030571 Solomon Feb 2004 A1
20040031113 Wosewick et al. Feb 2004 A1
20040049877 Jones et al. Mar 2004 A1
20040068351 Solomon Apr 2004 A1
20040068415 Solomon Apr 2004 A1
20040068416 Solomon Apr 2004 A1
20040074044 Diehl et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040083570 Song et al. May 2004 A1
20040088079 Lavarec et al. May 2004 A1
20040111184 Chiappetta et al. Jun 2004 A1
20040134336 Solomon Jul 2004 A1
20040134337 Solomon Jul 2004 A1
20040143919 Wilder Jul 2004 A1
20040156541 Jeon et al. Aug 2004 A1
20040158357 Lee et al. Aug 2004 A1
20040187249 Jones et al. Sep 2004 A1
20040187457 Colens Sep 2004 A1
20040200505 Taylor et al. Oct 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040207355 Jones et al. Oct 2004 A1
20040211444 Taylor et al. Oct 2004 A1
20040236468 Taylor et al. Nov 2004 A1
20040244138 Taylor et al. Dec 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050010331 Taylor et al. Jan 2005 A1
20050028316 Thomas et al. Feb 2005 A1
20050085947 Aldred et al. Apr 2005 A1
20050137749 Jeon et al. Jun 2005 A1
20050150519 Keppler et al. Jul 2005 A1
20050156562 Cohen et al. Jul 2005 A1
20050166355 Tani Aug 2005 A1
20050172445 Diehl et al. Aug 2005 A1
20050183230 Uehigashi Aug 2005 A1
20050187678 Myeong et al. Aug 2005 A1
20050192707 Park et al. Sep 2005 A1
20050204717 Colens Sep 2005 A1
20050209736 Kawagoe Sep 2005 A1
20050213082 DiBernardo et al. Sep 2005 A1
20050217042 Reindle Oct 2005 A1
20050218852 Landry et al. Oct 2005 A1
20050235451 Yan Oct 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20060020369 Taylor et al. Jan 2006 A1
20060020370 Abramson Jan 2006 A1
20060021168 Nishikawa Feb 2006 A1
20060037170 Shimizu Feb 2006 A1
20060061657 Rew et al. Mar 2006 A1
20060100741 Jung May 2006 A1
20060119839 Bertin et al. Jun 2006 A1
20060150361 Aldred et al. Jul 2006 A1
20060196003 Song et al. Sep 2006 A1
20060259194 Chiu Nov 2006 A1
20060288519 Jaworski et al. Dec 2006 A1
20070006404 Cheng et al. Jan 2007 A1
20070017061 Yan Jan 2007 A1
20070028574 Yan Feb 2007 A1
20070032904 Kawagoe et al. Feb 2007 A1
20070114975 Cohen et al. May 2007 A1
20070142964 Abramson Jun 2007 A1
20070150096 Yeh et al. Jun 2007 A1
20070157420 Lee et al. Jul 2007 A1
20070213892 Jones et al. Sep 2007 A1
20070285041 Jones et al. Dec 2007 A1
20080001566 Jones et al. Jan 2008 A1
20080007193 Jones et al. Jan 2008 A1
20080007203 Cohen et al. Jan 2008 A1
20080015738 Casey et al. Jan 2008 A1
20080052846 Kapoor et al. Mar 2008 A1
20080184518 Taylor et al. Aug 2008 A1
20080302586 Yan Dec 2008 A1
20090038089 Landry et al. Feb 2009 A1
20090049640 Lee et al. Feb 2009 A1
20090055022 Casey et al. Feb 2009 A1
20100049362 Hatuka Feb 2010 A1
20100049365 Jones et al. Feb 2010 A1
20100063628 Landry et al. Mar 2010 A1
Foreign Referenced Citations (353)
Number Date Country
3536907 Apr 1986 DE
9311014 Oct 1993 DE
4338841 May 1995 DE
4414683 Oct 1995 DE
19849978 Feb 2001 DE
10242257 Apr 2003 DE
102004038074 Jun 2005 DE
10357636 Jul 2005 DE
102005046813 Apr 2007 DE
338988 Dec 1988 DK
265542 May 1988 EP
281085 Sep 1988 EP
286328 Oct 1988 EP
294101 Dec 1988 EP
307381 Mar 1989 EP
352045 Jan 1990 EP
358628 Mar 1990 EP
389459 Sep 1990 EP
433697 Jun 1991 EP
479273 Apr 1992 EP
554978 Aug 1993 EP
615719 Sep 1994 EP
0792726 Sep 1997 EP
845237 Jun 1998 EP
1018315 Jul 2000 EP
0963173 Jan 2002 EP
1172719 Jan 2002 EP
1331537 Jul 2003 EP
1380245 Jan 2004 EP
1380246 Jan 2004 EP
1380245 Apr 2004 EP
1557730 Jul 2005 EP
1642522 Apr 2006 EP
1672455 Jun 2006 EP
2238196 Aug 2005 ES
722755 Mar 1932 FR
2601443 Jan 1988 FR
2828589 Aug 2001 FR
2225221 May 1990 GB
2267360 Dec 1993 GB
2283838 May 1995 GB
2284957 Jun 1995 GB
2300082 Oct 1996 GB
2344747 Jun 2000 GB
2409966 Jul 2005 GB
53110257 Sep 1978 JP
57064217 Apr 1982 JP
59033511 Feb 1984 JP
59094005 May 1984 JP
59099308 Jun 1984 JP
59112311 Jun 1984 JP
59-120124 Jul 1984 JP
59131668 Jul 1984 JP
59164973 Sep 1984 JP
59184917 Oct 1984 JP
59212924 Dec 1984 JP
59226909 Dec 1984 JP
60089213 May 1985 JP
60211510 Oct 1985 JP
61023221 Jan 1986 JP
61097712 Jun 1986 JP
61160366 Jul 1986 JP
62070709 May 1987 JP
60259895 Jun 1987 JP
62120510 Jun 1987 JP
60293095 Jul 1987 JP
62154008 Jul 1987 JP
62189057 Aug 1987 JP
62263507 Nov 1987 JP
62263508 Nov 1987 JP
63-079623 Apr 1988 JP
63-158032 Jul 1988 JP
63183032 Jul 1988 JP
63203483 Aug 1988 JP
62074018 Oct 1988 JP
63241610 Oct 1988 JP
1162454 Jun 1989 JP
02006312 Jan 1990 JP
02555263 Apr 1990 JP
03051023 Mar 1991 JP
3197758 Aug 1991 JP
3201903 Sep 1991 JP
4019586 Jan 1992 JP
4074285 Mar 1992 JP
4084921 Mar 1992 JP
5-042076 Feb 1993 JP
5023269 Feb 1993 JP
5040519 Feb 1993 JP
05046239 Feb 1993 JP
05046246 Feb 1993 JP
5060049 Mar 1993 JP
5091604 Apr 1993 JP
5150827 Jun 1993 JP
5150829 Jun 1993 JP
5-54620 Jul 1993 JP
5257527 Oct 1993 JP
5285861 Nov 1993 JP
06003251 Jan 1994 JP
6026312 Feb 1994 JP
6-105781 Apr 1994 JP
6137828 May 1994 JP
06-154143 Jun 1994 JP
06327898 Nov 1994 JP
7059702 Mar 1995 JP
07129239 May 1995 JP
07222705 Aug 1995 JP
07-281752 Oct 1995 JP
7270518 Oct 1995 JP
7311041 Nov 1995 JP
07-313417 Dec 1995 JP
07-319542 Dec 1995 JP
08-016241 Jan 1996 JP
08016776 Jan 1996 JP
08-063229 Mar 1996 JP
8083125 Mar 1996 JP
08-84696 Apr 1996 JP
8-089449 Apr 1996 JP
08089451 Apr 1996 JP
08-123548 May 1996 JP
08152916 Jun 1996 JP
08-256960 Oct 1996 JP
08-263137 Oct 1996 JP
08-286741 Nov 1996 JP
08-286744 Nov 1996 JP
08-286745 Nov 1996 JP
08-286747 Nov 1996 JP
08-322774 Dec 1996 JP
08-335112 Dec 1996 JP
8339297 Dec 1996 JP
09-047413 Feb 1997 JP
9044240 Feb 1997 JP
09-066855 Mar 1997 JP
9145309 Jun 1997 JP
09160644 Jun 1997 JP
07338573 Jul 1997 JP
08000393 Jul 1997 JP
09179625 Jul 1997 JP
09185410 Jul 1997 JP
9192069 Jul 1997 JP
09-204223 Aug 1997 JP
09-204224 Aug 1997 JP
9206258 Aug 1997 JP
9233712 Sep 1997 JP
9251318 Sep 1997 JP
09-265319 Oct 1997 JP
09-269807 Oct 1997 JP
09-269810 Oct 1997 JP
09-269824 Oct 1997 JP
09-319431 Dec 1997 JP
09-319432 Dec 1997 JP
09-319434 Dec 1997 JP
09-325812 Dec 1997 JP
10-027020 Jan 1998 JP
10-055215 Feb 1998 JP
10-105233 Apr 1998 JP
10-117973 May 1998 JP
10-118963 May 1998 JP
10177414 Jun 1998 JP
10-240342 Sep 1998 JP
10-240343 Sep 1998 JP
10-260727 Sep 1998 JP
09043901 Sep 1998 JP
10-295595 Nov 1998 JP
10214114 Nov 1998 JP
10-314088 Dec 1998 JP
11-065655 Mar 1999 JP
11-065657 Mar 1999 JP
11-102219 Apr 1999 JP
11-102220 Apr 1999 JP
11162454 Jun 1999 JP
11-174145 Jul 1999 JP
11-175149 Jul 1999 JP
11178764 Jul 1999 JP
11178765 Jul 1999 JP
11-213157 Aug 1999 JP
11212642 Aug 1999 JP
11508810 Aug 1999 JP
11248806 Sep 1999 JP
11510935 Sep 1999 JP
11-295412 Oct 1999 JP
11282532 Oct 1999 JP
11282533 Oct 1999 JP
07295636 Nov 1999 JP
11346964 Dec 1999 JP
2000-056006 Feb 2000 JP
2000-056831 Feb 2000 JP
2000-060782 Feb 2000 JP
2000047728 Feb 2000 JP
2000-066722 Mar 2000 JP
2000-075925 Mar 2000 JP
2000-510750 Aug 2000 JP
2000275321 Oct 2000 JP
2000353014 Dec 2000 JP
2001-022443 Jan 2001 JP
2001067588 Mar 2001 JP
2001121455 May 2001 JP
2001125641 May 2001 JP
2001216482 Aug 2001 JP
2001-265437 Sep 2001 JP
2001258807 Sep 2001 JP
2001275908 Oct 2001 JP
2001289939 Oct 2001 JP
2001306170 Nov 2001 JP
2001320781 Nov 2001 JP
2001525567 Dec 2001 JP
2002078650 Mar 2002 JP
2002204768 Jul 2002 JP
2002204769 Jul 2002 JP
2002247510 Aug 2002 JP
2002532178 Oct 2002 JP
2002-333920 Nov 2002 JP
2002323925 Nov 2002 JP
3356170 Dec 2002 JP
2002355206 Dec 2002 JP
2002360471 Dec 2002 JP
2002360479 Dec 2002 JP
2002360482 Dec 2002 JP
2002366227 Dec 2002 JP
2002369778 Dec 2002 JP
2003-010088 Jan 2003 JP
2003-015740 Jan 2003 JP
2003005296 Jan 2003 JP
2003010076 Jan 2003 JP
3375843 Feb 2003 JP
2003036116 Feb 2003 JP
2003038401 Feb 2003 JP
2003038402 Feb 2003 JP
2003052596 Feb 2003 JP
2003505127 Feb 2003 JP
2003061882 Mar 2003 JP
2003084994 Mar 2003 JP
2003-167628 Jun 2003 JP
2003180586 Jul 2003 JP
2003180587 Jul 2003 JP
2003186539 Jul 2003 JP
2003190064 Jul 2003 JP
2003241836 Aug 2003 JP
2003262520 Sep 2003 JP
2003304992 Oct 2003 JP
2003285288 Oct 2003 JP
2003310509 Nov 2003 JP
2003310489 Nov 2003 JP
2003330543 Nov 2003 JP
2004-123040 Apr 2004 JP
2004148021 May 2004 JP
2004219185 May 2004 JP
2004-160102 Jun 2004 JP
2004-166968 Jun 2004 JP
2004-174228 Jun 2004 JP
2004198330 Jul 2004 JP
2004351234 Dec 2004 JP
2005-135400 May 2005 JP
2005118354 May 2005 JP
2005-224265 Aug 2005 JP
2005-245916 Sep 2005 JP
2005230032 Sep 2005 JP
2005352707 Oct 2005 JP
2005296511 Oct 2005 JP
2005-346700 Dec 2005 JP
2006043071 Feb 2006 JP
2006-079145 Mar 2006 JP
2006-079157 Mar 2006 JP
2006155274 Jun 2006 JP
2006-164223 Jun 2006 JP
2006089307 Aug 2006 JP
2006227673 Aug 2006 JP
2006-247467 Sep 2006 JP
2006-260161 Sep 2006 JP
2006-293662 Oct 2006 JP
2006-296697 Nov 2006 JP
2007034866 Feb 2007 JP
2007-213180 Aug 2007 JP
2009-015611 Jan 2009 JP
2010-198552 Sep 2010 JP
WO 9526512 Oct 1995 WO
WO9530887 Nov 1995 WO
WO9617258 Jun 1996 WO
WO 9715224 May 1997 WO
WO 9740734 Nov 1997 WO
WO 9741451 Nov 1997 WO
WO 9853456 Nov 1998 WO
WO9905580 Feb 1999 WO
WO 9916078 Apr 1999 WO
WO 9928800 Jun 1999 WO
WO 9938056 Jul 1999 WO
WO 9938237 Jul 1999 WO
WO 9943250 Sep 1999 WO
WO 9959042 Nov 1999 WO
WO 0004430 Jan 2000 WO
WO 0036962 Jan 2000 WO
0038028 Jun 2000 WO
WO 0038026 Jun 2000 WO
WO 0038029 Jun 2000 WO
WO 0078410 Dec 2000 WO
WO 0106904 Feb 2001 WO
WO 0106905 Feb 2001 WO
WO0180703 Nov 2001 WO
WO0191623 Dec 2001 WO
WO0224292 Mar 2002 WO
WO 0239864 May 2002 WO
WO 0239868 May 2002 WO
WO 02058527 Aug 2002 WO
WO 02062194 Aug 2002 WO
02069775 Sep 2002 WO
02071175 Sep 2002 WO
WO 02067744 Sep 2002 WO
WO 02067745 Sep 2002 WO
WO 02074150 Sep 2002 WO
WO 02075356 Sep 2002 WO
WO 02075469 Sep 2002 WO
WO 02075470 Sep 2002 WO
WO02067752 Sep 2002 WO
WO02069774 Sep 2002 WO
WO02075350 Sep 2002 WO
WO02081074 Oct 2002 WO
WO 02101477 Dec 2002 WO
WO 03026474 Apr 2003 WO
WO 03040845 May 2003 WO
WO 03040846 May 2003 WO
WO 2004004533 Jan 2004 WO
WO 2004006034 Jan 2004 WO
WO2004004534 Jan 2004 WO
2004043215 May 2004 WO
2004058028 Jul 2004 WO
2004059409 Jul 2004 WO
2005006935 Jan 2005 WO
2005036292 Apr 2005 WO
2005055795 Jun 2005 WO
2005055796 Jun 2005 WO
WO 2005077244 Aug 2005 WO
WO2005076545 Aug 2005 WO
WO2005077243 Aug 2005 WO
2005082223 Sep 2005 WO
WO2005083541 Sep 2005 WO
WO2005098475 Oct 2005 WO
WO2005098476 Oct 2005 WO
WO2006046400 May 2006 WO
2006061133 Jun 2006 WO
2006068403 Jun 2006 WO
2006073248 Jul 2006 WO
2007028049 Mar 2007 WO
2007036490 Apr 2007 WO
WO2007065033 Jun 2007 WO
Non-Patent Literature Citations (156)
Entry
Doty, Keith L., and Reid Harrison, “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent,” AAAI 1993 Fall Symposium Series, Instantiating Real-World Agents, Research Triangle Park, Raleigh, NC, 1993, pp. 42-50.
Foreign Office Action in corresponding JP 2010-507685, dated Jun. 9, 2011, and English language translation thereof.
Foreign Office Action in corresponding JP 2010-507698, dated Jun. 28, 2011, and English language translation thereof.
Cameron Morland, Autonomous Lawn Mower Control, Jul. 24, 2002.
Electrolux designed for the well-lived home, website: http://www.electroluxusa.com/node57.as[?currentURL=node142.asp%3F, accessed Mar. 18, 2005, 5 pgs.
eVac Robotic Vacuum S1727 Instruction Manual, Sharper Image Corp, Copyright 2004, 16 pgs.
Everyday Robots, website: http://www.everydayrobots.com/index.php?option=content&task=view&id=9, accessed Apr. 20, 2005, 7 pgs.
Facts on the Trilobite webpage: “http://trilobiteelectroluxse/presskit—en/nodel1335asp?print=yes&pressID=” accessed Dec. 12, 2003 (2 pages).
Friendly Robotics Robotic Vacuum RV400—The Robot Store website: http://www.therobotstore.com/s.nl/sc.9/category,-109/it.A/id.43/.f, accessed Apr. 20, 2005, 5 pgs.
Gat, Erann, Robust Low-computation Sensor-driven Control for Task-Directed Navigation, Proceedings of the 1991 IEEE, International Conference on Robotics and Automation, Sacramento, California, Apr. 1991, pp. 2484-2489.
Hitachi: News release: The home cleaning robot of the autonomous movement type (experimental machine) is developed, website: http://www.i4u.com/japanreleases/hitachirobot.htm., accessed Mar. 18, 2005, 5 pgs.
Kärcher Product Manual Download webpage: “http://wwwkarchercom/bta/downloadenshtml?ACTION=SELECTTEILENR&ID=rc3000&submitButtonName=Select+Product+Manual” and associated pdf file “5959-915enpdf (47 MB) English/English” accessed Jan. 21, 2004 (16 pages).
Karcher RC 3000 Cleaning Robot—user manual Manufacturer: Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred Karcher-Str 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002.
Karcher RoboCleaner RC 3000 Product Details webpages: “http://wwwrobocleanerde/english/screen3html” through “. . . screen6html” accessed Dec. 12, 2003 (4 pages).
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-usa.com/showproducts.php?op=view—prod¶m1=143¶m2=¶m3=, accessed Mar. 18, 2005, 6 pgs.
Koolvac Robotic Vacuum Cleaner Owner's Manual, Koolatron, Undated; 26 pgs.
NorthStar Low-Cost, Indoor Localization, Evolution Robotics, Powering Intelligent Products, 2 pgs.
Put Your Roomba . . . On “Automatic” Roomba Timer> Timed Cleaning—Floorvac Robotic Vacuum webpages: http://cgi.ebay.com/ws/eBayISAPI.d11?Viewhem&category=43575198387&rd= 1 , accessed Apr. 20, 2005, 5 pgs.
Put Your Roomba . . . On “Automatic” webpages: “http://www.acomputeredge.com/roomba,” accessed Apr. 20, 2005, 5 pgs.
RoboMaid Sweeps Your Floors So You Won't Have to, the Official Site, website: http://www.thereobomaid.com/, accessed Mar. 18, 2005, 2 pgs.
Robot Review Samsung Robot Vacuum (VC-RP3OW), website: http://www.onrobo.com/reviews/At—Home/Vacuum—Cleaners/on00vcrp30rosam/index.htm, accessed Mar. 18, 2005, 11 pgs.
Robotic Vacuum Cleaner-Blue, website: http://www.sharperimage.com/us/en/catalog/productview.jhtml?sku=S1727BLU, accessed Mar. 18, 2005, 3 pgs.
Thrun, Sebastian, Learning Occupancy Grid Maps With Forward Sensor Models, School of Computer Science, Carnegie Mellon University, pp. 1-28.
Schofield, Monica, "Neither Master nor Slave" A Practical Study in the Development and Employment of Cleaning Robots, Emerging Technologies and Factory Automation, 1999, Proceedings ETFA '99, 7th IEEE International Conference on, Barcelona, Spain, Oct. 18-21, 1999, pp. 1427-1434.
Wired News: Robot Vacs Are in the House, website: http://www.wired.com/news/print/0,1294,59237,00.html, accessed Mar. 18, 2005, 6 pgs.
Zoombot Remote Controlled Vacuum—RV-500 New Roomba 2, website: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4373497618&rd=1, accessed Apr. 20, 2005, 7 pgs.
Prassler, et al., A Short History of Cleaning Robots, Autonomous Robots 9, 211-226, 2000, 16 pages.
Examination report in Japanese Application No. JP 2008-246310, dated Jun. 14, 2011.
Examination report in European Application No. EP 02734767.3, dated Apr. 11, 2004.
Examination report in European Application No. EP 02734767.3, dated Apr. 29, 2005.
Examination report in European Application No. EP 02734767.3, dated Nov. 8, 2005.
Notice of Allowance in U.S. Appl. No. 10/167,851, dated Jun. 17, 2004.
Notice of Allowance in U.S. Appl. No. 10/167,851, dated Mar. 22, 2004.
Office Action in U.S. Appl. No. 10/167,851, dated Dec. 29, 2003.
Office Action in U.S. Appl. No. 10/167,851, dated Sep. 3, 2003.
Notice of Allowance in U.S. Appl. No. 10/839,374, dated Oct. 19, 2006.
Office Action in U.S. Appl. No. 10/839,374, dated Mar. 27, 2006.
Office Action in U.S. Appl. No. 10/839,374, dated May 4, 2005.
Office Action in U.S. Appl. No. 10/839,374, dated Aug. 9, 2004.
Notice of Allowance in U.S. Appl. No. 11/671,305, dated Apr. 18, 2011.
Notice of Allowance in U.S. Appl. No. 11/671,305, dated Dec. 29, 2010.
Notice of Allowance in U.S. Appl. No. 11/671,305, dated Sep. 30, 2010.
Notice of Allowance in U.S. Appl. No. 11/671,305, dated Jun. 17, 2010.
Notice of Allowance in U.S. Appl. No. 11/671,305, dated Feb. 22, 2010.
Notice of Allowance in U.S. Appl. No. 11/671,305, dated Oct. 26, 2009.
Office Action in U.S. Appl. No. 11/671,305, dated Apr. 17, 2009.
Office Action in U.S. Appl. No. 11/671,305, dated Feb. 26, 2008.
Office Action in U.S. Appl. No. 11/671,305, dated Aug. 22, 2007.
Notice of Allowance in U.S. Appl. No. 11/771,356, dated Jun. 18, 2008.
Office Action in U.S. Appl. No. 11/771,356, dated Mar. 14, 2008.
Notice of Allowance in U.S. Appl. No. 11/771,433, dated Dec. 17, 2009.
Notice of Allowance in U.S. Appl. No. 11/771,433, dated Oct. 6, 2009.
Office Action in U.S. Appl. No. 11/771,433, dated Jul. 24, 2009.
Office Action in U.S. Appl. No. 11/771,433, dated May 14, 2009.
Notice of Allowance in U.S. Appl. No. 11/777,085, dated Mar. 11, 2008.
Notice of Allowance in U.S. Appl. No. 12/609,124, dated Jun. 17, 2011.
Office Action in U.S. Appl. No. 12/609,124, dated Oct. 1, 2010.
Official Action in Japanese Application No. JP 2003-504174, dated Mar. 9, 2004, and an English language translation thereof.
Official Action in Japanese Application No. JP 2003-504174, dated Apr. 5, 2005, and an English language translation thereof.
Official Action in Japanese Application No. JP 2003-504174, dated Nov. 15, 2005, and an English language translation thereof.
Official Action in Japanese Application No. JP 2003-504174, dated Aug. 15, 2006, and an English language translation thereof.
Official Action in Japanese Application No. JP 2003-504174, dated Mar. 25, 2008, and an English language translation thereof.
Official Action in Japanese Application No. JP 2008-246310, dated Jun. 14, 2011, and an English language translation thereof.
Jarosiewicz, Eugenio, "EEL 5666 Intelligent Machine Design Laboratory", University of Florida, Department of Electrical and Computer Engineering, Aug. 4, 1999, 50 pages.
LG RoboKing V-R4000, http://www.popco.net/zboard/view.php?id=tr—review&no=40, Aug. 5, 2005, 15 pages, copyright date 1999-2011.
Dome Style Robot Room Cleaner, http://www.rakuten.co.jp/matsucame/587179/711512/, 7 pages.
Dyson's Robot Vacuum Cleaner—the DC06, http://www.gizmag.com/go/1282/, 3 pages, dated May 2, 2004.
Electrolux Trilobite ZA1, http://www.electrolux-ui.com:8080/2002%5C822%5C833102EN.pdf, 10 pages, dated Jan. 12, 2001.
Electrolux Trilobite, http://www.robocon.co.kr/trilobite/Presentation_Trilobite_Kor_030104.ppt, 19 pages, undated.
Electrolux web site Sep. 2002, http://www.frc.ri.cmu.edu/˜hpm/talks/Extras/trilobite.desc.html, 2 pages, dated Sep. 2002.
Euroflex Intelligente Monster manual, English language excerpt, cover and pp. 17-30, undated.
Euroflex Monster, http://www.euroflex.tv/novita_dett.php?id=15, 1 page, dated Jan. 1, 2006.
Floorbotics VR-8 Floor Cleaning Robot, http://www.consensus.com.au/SoftwareAwards/CSAarchive/CSA2004/CSAart04/FloorBot/FX1%20Product%20Description%2020%20January%202004.pdf, (2004), 11 pages.
Friendly Robotics RV Manual, http://www.robotsandrelax.com/PDFs/RV400Manual.pdf, pp. 1-18, dated 2004.
Hitachi Robot Cleaner, It's eye, www.hitachi.co.jp/rd/pdf/topics/hitac2003_10.pdf, Oct. 2003, 2 pages, copyright date 2003.
Hitachi Robot Cleaner, http://www.hitachi.co.jp/New/cnews/h1_030529_h1_030529.pdf, 8 pages, dated May 29, 2003.
LG Announces the First Robotic Vacuum Cleaner of Korea, Robot Buying Guide, http://robotbg.com/news/2003/04/22/lg_announces_the_first_robotic_vacuum_cleaner_of_korea, 1 page, Apr. 21, 2003.
Roboking - Not Just a Vacuum Cleaner, a Robot!, http://infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, Jan. 21, 2004, foreign language version, 7 pages.
Roboking - Not Just a Vacuum Cleaner, a Robot!, http://infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, Jan. 21, 2004, English version, 5 pages.
Clean Mate 365 Intelligent Automatic Vacuum Cleaner Model QQ-1 User Manual, www.metapo.com/support/user_manual.pdf, 3 pages, undated.
Microrobot UBot MR-UBO1K, http://us.aving.net/news/view.php?articleId=23031, 5 pages, dated Aug. 25, 2006.
Robotic Vacuum by Matsushita about to undergo Field Testing, http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338, 2 pages, dated Mar. 26, 2002, copyright date 1999-2011.
Matsushita robotic cleaner, http://techon.nikkeibp.co.jp/members/0ldb/200203/1006501/, 3 pages, dated Mar. 25, 2002, copyright date 1995-2011.
Matsushita robotic cleaner, http://ascii.jp/elem/000/000/330/330024/, 9 pages, dated Mar. 25, 2002.
Sanyo Robot Cleaner, http://www.itmedia.co.jp/news/0111/16/robofesta_m.html, 4 pages, dated Nov. 16, 2001.
Sanyo Robot Cleaner, http://www.itmedia.co.jp/news/0111/16/robofesta_m2.html, 3 pages, dated Nov. 16, 2001.
Yujin Robotics, An Intelligent Cleaning Robot "Iclebo Q", http://us.aving.net/news/view.php?articleId=7257, 8 pages, dated Sep. 2, 2005.
Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone, http://www.toshiba.co.jp/tech/review/2004/09/59_09pdf/a13.pdf, pp. 53-55, dated 2004.
Toshiba prototype, http://warp.ndl.go.jp/info:ndljp/pid/258151/www.soumu.go.jp/joho_tsusin/policyreports/chousa/netrobot/pdf/030214_1_33_a.pdf, pp. 1-16, dated 2003.
Svet Kompjutera, Robot usisivac [Robot Vacuum Cleaner], http://www.sk.rs/1999/10/sknt01.html, foreign language version, 1 page, dated Oct. 1999, copyright date 1984-2011.
Svet Kompjutera Robot Vacuum Cleaner, SKWeb 2:54, English version, dated Oct. 1999, 1 page, copyright date 1984-2011.
Robo Vac, Arbeitet ohne Aufsicht [Works Without Supervision], Maschinenmarkt, Würzburg 105 (1999) 27, 3 pages, dated Jul. 5, 1999.
U.S. Appl. No. 60/605,066, filed Aug. 27, 2004.
U.S. Appl. No. 60/605,181, filed Aug. 27, 2004.
Hitachi, “Feature,” http://kadenfan.hitachi.co.jp/robot/feature/feature.html, 1 page, accessed Nov. 19, 2008, dated May 29, 2003.
Microrobot, "Home Robot—UBOT," http://www.microrobotusa.com/product_1_.html, 2 pages, accessed Dec. 2, 2008, copyright date 2007.
InMach, “Intelligent Machines,” http://www.inmach.de/inside.html, 1 page, accessed Nov. 19, 2008.
Hammacher Schlemmer, “Electrolux Trilobite Robotic Vacuum at Hammacher Schlemmer,” www.hammacher.com/publish/71579.asp?promo=xsells, 3 pages, accessed Mar. 18, 2005, copyright date 2004.
TotalVac.com, “Karcher RC3000 RoboCleaner Robot Vacuum at TotalVac,” www.totalvac.com/robot—vacuum.htm, 3 pages, accessed Mar. 18, 2005, copyright date 2004.
MobileMag, Samsung unveils high-tech robot vacuum cleaner, http://www.mobilemag.com/content/100/102/C2261/, 4 pages, accessed Mar. 18, 2005, dated Nov. 25, 2003, copyright date 2002-2004.
iirobotics.com, Samsung unveils its multifunction robot vacuum, Samsung Robot Vacuum (VC-RP30W), http://www.iirobotics.com/webpages/hotstuff.php?ubre=111, 3 pages, accessed Mar. 18, 2005, dated Aug. 31, 2004.
OnRobo, Samsung unveils its multifunction robot vacuum, http://www.onrobo.com/enews/0210/samsung_vacuum.shtml, 3 pages, accessed Mar. 18, 2005, copyright date 2004.
Gregg, M. et al., “Autonomous Lawn Care Applications”, 2006 Florida Conference on Recent Advances in Robotics, FCRAR May 25-26, 2006, pp. 1-5.
UAMA (Asia) Industrial Co. Ltd., “Robot Family,” 1 page, indicates available in 2005.
Matsutek Enterprises Co. Ltd., “Automatic Rechargeable Vacuum Cleaner,” http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10, 3 pages, accessed Apr. 23, 2007, copyright date 2007.
LG, RoboKing, 4 pages. Undated.
Collection of pictures of robotic cleaners, devices AA-BF, 50 pages. Undated.
Braunstingl et al., “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception,” Sep. 1995, ICAR '95, 7th Int'l Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376.
Yata et al., "Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer," Proceedings of the 1998 IEEE International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998.
Tse et al., “Design of a Navigation System for a Household Mobile Robot Using Neural Networks” Dept. of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
Wolf, J. et al., "Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization", IEEE Transactions on Robotics, vol. 21, No. 2, pp. 208-216, Apr. 2005.
Eren et al., “Accuracy in Position Estimation of Mobile Robots Based on Coded Infrared Signal Transmission,” Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, IMTC/95, pp. 548-551, 1995.
Karlsson, N. et al., "Core Technologies for Service Robotics", Proceedings of 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sep. 28-Oct. 2, 2004, Sendai, Japan, pp. 2979-2984.
Leonard et al., “Mobile Robot Localization by Tracking Geometric Beacons,” IEEE Transactions on Robotics and Automation, vol. 7, No. 3, pp. 376-382, Jun. 1991.
Paromtchik, “Toward Optical Guidance of Mobile Robots.” Proceedings of the Fourth World Multiconference on Systemics, Cybernetics and Informatics, Orlando, FL, USA, Jul. 23-26, 2000, vol. IX, six pages.
Wong, EIED Online>>Robot Business, ED Online ID# 13114, 17 pages, Jul. 26, 2006, copyright date 2006.
Facchinetti et al., “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation,” ICARCV '94, The Third International Conference on Automation, Robotics and Computer Vision, Singapore, vol. 3 pp. 1694-1698, Nov. 1994.
Facchinetti et al., “Self-Positioning Robot Navigation Using Ceiling Images Sequences,” ACCV'95, pp. 1-5, Dec. 5-8, 1995.
King et al., "Helpmate-TM- Autonomous Mobile Robot Navigation System," SPIE, vol. 1388, Mobile Robots V, pp. 190-198, 1990.
Fairfield et al., “Mobile Robot Localization with Sparse Landmarks,” Proceedings of SPIE, vol. 4573, pp. 148-155, 2002.
Benayad-Cherif et al., "Mobile Robot Navigation Sensors," SPIE, vol. 1831, Mobile Robots VII, pp. 378-387, 1992.
The Sharper Image, e-Vac Robotic Vacuum, S1727 Instructions, www.sharperimage.com, 18 pages, copyright 2004.
Friendly Robotics, "Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner," http://www.friendlyrobotics.com/vac.htm, 4 pages, accessed Apr. 20, 2005.
The Sharper Image, E Vac Robotic Vacuum, http://www.sharperimage.com/us/en/templates/products/pipmoreworklprintable.jhtml, 1 page, accessed Mar. 18, 2005.
Japan Office Action in JP 2011-230659, dated Aug. 29, 2012, along with an English language translation thereof.
Japan Office Action in JP 2010-507698, dated Feb. 24, 2012, along with an English language translation thereof.
Japan Office Action in JP 2008-246310, dated Apr. 12, 2012, along with an English language translation thereof.
English language Abstract and English language translation of JP 10-314088, JP 10-314088 being published on Dec. 2, 1998.
English language Abstract and English language translation of JP 06-154143, JP 06-154143 being published on Jun. 3, 1994.
English language Abstract and English language translation of JP 2000-060782, JP 2000-060782 being published on Feb. 29, 2000.
English language Abstract and English language translation of JP 08-263137, JP 08-263137 being published on Oct. 11, 1996.
Japan Office Action in JP 2010-507685, dated Jul. 2, 2012, along with an English language translation thereof.
Facts on the Trilobite, http://www.frc.ri.cmu.edu/˜hpm/talks/Extras/trilobite.desc.html, 2 pages, accessed Nov. 1, 2011.
Florbot GE Plastics, 1989-1990, 2 pages, available at http://www.fuseid.com/, accessed Sep. 27, 2012.
Fuentes, et al. “Mobile Robotics 1994”, University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 7, 1994.
Grumet “Robots Clean House”, Popular Mechanics, Nov. 2003.
http://www.karcher.de/versions/int/assets/video/2_4_robo_en.swf, accessed Sep. 25, 2009.
Chae, et al. “StarLITE: A new artificial landmark for the navigation of mobile robots”, http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005.
Dorfmüller-Ulhaas “Optical Tracking From User Motion to 3D Interaction”, http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002.
Eren, et al. “Operation of Mobile Robots in a Structured Infrared Environment”, Proceedings. ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 19-21, 1997.
Linde, Dissertation: "On Aspects of Indoor Localization," https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 28, 2006.
Ma “Thesis: Documentation on Northstar”, California Institute of Technology, 14 pages, May 17, 2006.
McGillem, et al. "Infra-red Location System for Navigation and Autonomous Vehicles", 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 24-29, 1988.
McGillem, et al. "A Beacon Navigation Method for Autonomous Vehicles", IEEE Transactions on Vehicular Technology, vol. 38, No. 3, pp. 132-139, Aug. 1989.
McLurkin "Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots", Paper submitted for requirements of BSEE at MIT, May 2004.
McLurkin "The Ants: A Community of Microrobots", Paper submitted for requirements of BSEE at MIT, May 12, 1995.
Paromtchik, et al. "Optical Guidance System for Multiple Mobile Robots", Proceedings 2001 ICRA, IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940 (May 21-26, 2001).
Ronnback “On Methods for Assistive Mobile Robots”, http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html, 218 pages, Jan. 1, 2006.
Related Publications (1)
Number Date Country
20100263142 A1 Oct 2010 US
Provisional Applications (1)
Number Date Country
60297718 Jun 2001 US
Continuations (3)
Number Date Country
Parent 11671305 Feb 2007 US
Child 12826909 US
Parent 10839374 May 2004 US
Child 11671305 US
Parent 10167851 Jun 2002 US
Child 10839374 US