Method and system for multi-mode coverage for an autonomous robot

Information

  • Patent Grant
  • Patent Number
    9,104,204
  • Date Filed
    Tuesday, May 14, 2013
  • Date Issued
    Tuesday, August 11, 2015
Abstract
A control system for a mobile robot (10) is provided to effectively cover a given area by operating in a plurality of modes, including an obstacle following mode (51) and a random bounce mode (49). In other embodiments, spot coverage, such as spiraling (45), or other modes are also used to increase effectiveness. In addition, a behavior based architecture is used to implement the control system, and various escape behaviors are used to ensure full coverage.
Description
FIELD OF THE INVENTION

This invention relates generally to autonomous vehicles or robots, and more specifically to methods and mobile robotic devices for covering a specific area as might be required of, or used as, robotic cleaners or lawn mowers.


DESCRIPTION OF PRIOR ART

For purposes of this description, examples will focus on the problems faced in the prior art as related to robotic cleaning (e.g., dusting, buffing, sweeping, scrubbing, dry mopping or vacuuming). The claimed invention, however, is limited only by the claims themselves, and one of skill in the art will recognize the myriad of uses for the present invention beyond indoor, domestic cleaning.


Robotic engineers have long worked on developing an effective method of autonomous cleaning. By way of introduction, the performance of cleaning robots should concentrate on three measures of success: coverage, cleaning rate and perceived effectiveness. Coverage is the percentage of the available space visited by the robot during a fixed cleaning time, and ideally, a robot cleaner would provide 100 percent coverage given an infinite run time. Unfortunately, designs in the prior art often leave portions of the area uncovered regardless of the amount of time the device is allowed to complete its tasks. Failure to achieve complete coverage can result from mechanical limitations—e.g., the size and shape of the robot may prevent it from reaching certain areas—or the robot may become trapped, unable to vary its control to escape. Failure to achieve complete coverage can also result from an inadequate coverage algorithm. The coverage algorithm is the set of instructions used by the robot to control its movement. For the purposes of the present invention, coverage is discussed as a percentage of the available area visited by the robot during a finite cleaning time. Due to mechanical and/or algorithmic limitations, certain areas within the available space may be systematically neglected. Such systematic neglect is a significant limitation in the prior art.


A second measure of a cleaning robot's performance is the cleaning rate given in units of area cleaned per unit time. Cleaning rate refers to the rate at which the area of cleaned floor increases; coverage rate refers to the rate at which the robot covers the floor regardless of whether the floor was previously clean or dirty. If the velocity of the robot is v and the width of the robot's cleaning mechanism (also called work width) is w then the robot's coverage rate is simply wv, but its cleaning rate may be drastically lower.


A robot that moves in a purely random fashion in a closed environment has a cleaning rate that decreases relative to the robot's coverage rate as a function of time. This is because the longer the robot operates, the more likely it is to revisit already cleaned areas. The optimal design has a cleaning rate equal to the coverage rate, thus minimizing unnecessary repeated cleanings of the same spot. In other words, the ratio of cleaning rate to coverage rate is a measure of efficiency, and an optimal cleaning rate would mean covering the greatest percentage of the designated area with the minimum number of cumulative or redundant passes over areas already cleaned.


A third metric of cleaning robot performance is the perceived effectiveness of the robot. This measure is ignored in the prior art. Deliberate movement and certain patterned movement are favored, as users will perceive a robot that exhibits deliberate movement as more effective.


While coverage, cleaning rate and perceived effectiveness are the performance criteria discussed herein, a preferred embodiment of the present invention also takes into account the ease of use in rooms of a variety of shapes and sizes (containing a variety of unknown obstacles) and the cost of the robotic components. Other design criteria may also influence the design, for example the need for collision avoidance and appropriate response to other hazards.


As described in detail in Jones, Flynn & Seiger, Mobile Robots: Inspiration to Implementation, second edition, 1999, A K Peters, Ltd., and elsewhere, numerous attempts have been made to build vacuuming and cleaning robots. Each of these robots has faced a similar challenge: how to efficiently cover the designated area given limited energy reserves.


We refer to maximally efficient cleaning, where the cleaning rate equals the coverage rate, as deterministic cleaning. As shown in FIG. 1A, a robot 1 following a deterministic path moves in such a way as to completely cover the area 2 while avoiding all redundant cleaning. Deterministic cleaning requires that the robot know both where it is and where it has been; this in turn requires a positioning system. Such a positioning system—which might rely on scanning laser rangers, ultrasonic transducers, carrier phase differential GPS, or other methods to achieve the accuracy needed for deterministic cleaning—can be prohibitively expensive and can involve user set-up specific to the particular room geometries. Also, methods that rely on global positioning are typically incapacitated by the failure of any part of the positioning system.


One example of using highly sophisticated (and expensive) sensor technologies to create deterministic cleaning is the RoboScrub device built by Denning Mobile Robotics and Windsor Industries, which used sonar, infrared detectors, bump sensors and high-precision laser navigation. RoboScrub's navigation system required attaching large bar code targets at various positions in the room. The requirement that RoboScrub be able to see at least four targets simultaneously was a significant operational problem. RoboScrub, therefore, was limited to cleaning large open areas.


Another example, RoboKent, a robot built by the Kent Corporation, follows a global positioning strategy similar to RoboScrub's. RoboKent dispenses with RoboScrub's more expensive laser positioning system, but having done so it must restrict itself to areas with a simple rectangular geometry, e.g. long hallways. In these more constrained regions, position correction by sonar ranging measurements is sufficient. Other deterministic cleaning systems are described, for example, in U.S. Pat. No. 4,119,900 (Kremnitz), U.S. Pat. No. 4,700,427 (Knepper), U.S. Pat. No. 5,353,224 (Lee et al.), U.S. Pat. No. 5,537,017 (Feiten et al.), U.S. Pat. No. 5,548,511 (Bancroft), and U.S. Pat. No. 5,650,702 (Azumi).


Because of the limitations and difficulties of deterministic cleaning, some robots have relied on pseudo-deterministic schemes. One method of providing pseudo-deterministic cleaning is an autonomous navigation method known as dead reckoning. Dead reckoning consists of measuring the precise rotation of each robot drive wheel (using for example optical shaft encoders). The robot can then calculate its expected position in the environment given a known starting point and orientation. One problem with this technique is wheel slippage. If slippage occurs, the encoder on that wheel registers a wheel rotation even though that wheel is not driving the robot relative to the ground. As shown in FIG. 1B, as the robot 1 navigates about the room, these drive wheel slippage errors accumulate making this type of system unreliable for runs of any substantial duration. (The path no longer consists of tightly packed rows, as compared to the deterministic coverage shown in FIG. 1A.) The result of reliance on dead reckoning is intractable systematic neglect; in other words, areas of the floor are not cleaned.


One example of a pseudo-deterministic system is the Cye robot from Probotics, Inc. Cye depends exclusively on dead reckoning and therefore takes heroic measures to maximize the performance of its dead reckoning system. Cye must begin at a user-installed physical registration spot in a known location, where the robot fixes its position and orientation. Cye then keeps track of its position as it moves away from that spot. As Cye moves, uncertainty in its position and orientation increases. Cye must make certain to return to a calibration spot before this error grows so large that it is unlikely to locate one. If a calibration spot is moved or blocked, or if excessive wheel slippage occurs, then Cye can become lost (possibly without realizing that it is lost). Thus Cye is suitable for use only in relatively small, benign environments. Other examples of this approach are disclosed in U.S. Pat. No. 5,109,566 (Kobayashi et al.) and U.S. Pat. No. 6,255,793 (Peless et al.).


Another approach to robotic cleaning is purely random motion. As shown in FIG. 1C, in a typical room without obstacles, a random movement algorithm will provide acceptable coverage given significant cleaning time. Compared to a robot with a deterministic algorithm, a random cleaning robot must operate for a longer time to achieve acceptable coverage. To have high confidence that the random-motion robot has cleaned 98% of an obstacle-free room, the random motion robot must run approximately five times as long as a deterministic robot with the same cleaning mechanism moving at the same speed.


The coverage limitations of a random algorithm can be seen in FIG. 1D. An obstacle 5 in the room can create the effect of segmenting the room into a collection of chambers. The coverage over time of a random algorithm robot in such a room is analogous to the time density of gas released in one chamber of a confined volume. Initially, the density of gas is highest in the chamber where it is released and lowest in more distant chambers. Similarly the robot is most likely to thoroughly clean the chamber where it starts, rather than more distant chambers, early in the process. Given enough time a gas reaches equilibrium with equal density in all chambers. Likewise given time, the robot would clean all areas thoroughly. The limitations of practical power supplies, however, usually guarantee that the robot will have insufficient time to clean all areas of a space cluttered with obstacles. We refer to this phenomenon as the robot diffusion problem.


As discussed, the commercially available prior art has not been able to produce an effective coverage algorithm for an area of unknown geometry. As noted above, the prior art either has relied on sophisticated systems of markers or beacons or has limited the utility of the robot to rooms with simple rectangular geometries. Attempts to use pseudo-deterministic control algorithms can leave areas of the space systematically neglected.


OBJECTS AND ADVANTAGES

It is an object of the present invention to provide a system and method to allow a mobile robot to operate in a plurality of modes in order to effectively cover an area.


It is an object of the present invention to provide a mobile robot, with at least one sensor, to operate in a number of modes including spot-coverage, obstacle following and bounce.


It is a further object of the invention to provide a mobile robot that alternates between obstacle following and bounce mode to ensure coverage.


It is an object of the invention to return to spot-coverage after the robot has traveled a pre-determined distance.


It is an object of the invention to provide a mobile robot able to track the average distance between obstacles and use the average distance as an input to alternate between operational modes.


It is yet another object of the invention to optimize the distance the robot travels in an obstacle following mode as a function of the frequency of obstacle following and the work width of the robot, and to provide a minimum and maximum distance for operating in obstacle following mode.


It is an object of a preferred embodiment of the invention to use a control system for a mobile robot with an operational system program able to run a plurality of behaviors and using an arbiter to select which behavior is given control over the robot.


It is still another object of the invention to incorporate various escape programs or behavior to allow the robot to avoid becoming stuck.


Finally, it is an object of the invention to provide one or more methods for controlling a mobile robot to benefit from the various objects and advantages disclosed herein.





BRIEF DESCRIPTION OF THE DRAWINGS

These and further features of the present invention will be apparent with reference to the accompanying drawings, wherein:



FIGS. 1A-D illustrate coverage patterns of various robots in the prior art;



FIG. 2 is a top-view schematic representation of the basic components of a mobile robot used in a preferred embodiment of the invention;



FIG. 3 demonstrates a hardware block diagram of the robot shown in FIG. 2;



FIG. 4A is a diagram showing a method of determining the angle at which the robot encounters an obstacle; FIG. 4B is a diagram showing the orientation of a preferred embodiment of the robot control system;



FIG. 5 is a schematic representation of the operational modes of the instant invention;



FIG. 6A is a schematic representation of the coverage pattern for a preferred embodiment of SPIRAL behavior; FIG. 6B is a schematic representation of the coverage pattern for an alternative embodiment of SPIRAL behavior; FIG. 6C is a schematic representation of the coverage pattern for yet another alternative embodiment of SPIRAL behavior;



FIG. 7 is a flow-chart illustration of the spot-coverage algorithm of a preferred embodiment of the invention;



FIGS. 8A & 8B are schematic representations of the coverage pattern for a preferred embodiment of operation in obstacle following mode;



FIG. 8C is a schematic illustration of the termination of the obstacle following mode when an obstacle is encountered after the mobile robot has traveled a minimum distance. FIG. 8D is a schematic illustration of the termination of the obstacle following mode after the mobile robot has traveled a maximum distance.



FIG. 9A is a flow-chart illustration of the obstacle following algorithm of a preferred embodiment of the invention; FIG. 9B is a flow-chart illustration of a preferred algorithm for determining when to exit obstacle following mode.



FIG. 10 is a schematic representation of the coverage pattern for a preferred embodiment of BOUNCE behavior;



FIG. 11 is a flow-chart illustration of the room coverage algorithm of a preferred embodiment of the invention;



FIGS. 12A & 12B are flow-chart illustrations of an exemplary escape behavior;



FIG. 13A is a schematic representation of the coverage pattern of a mobile robot with only a single operational mode; FIG. 13B is a schematic representation of the coverage pattern for a preferred embodiment of the instant invention using obstacle following and room coverage modes; and



FIG. 14 is a schematic representation of the coverage pattern for a preferred embodiment of the instant invention using spot-coverage, obstacle following and room coverage modes.





DESCRIPTION OF INVENTION

In the present invention, a mobile robot is designed to provide maximum coverage at an effective coverage rate in a room of unknown geometry. In addition, the perceived effectiveness of the robot is enhanced by the inclusion of patterned or deliberate motion. In addition, in a preferred embodiment, effective coverage requires a control system able to prevent the robot from becoming immobilized in an unknown environment.


While the physical structures of mobile robots are known in the art, the components of a preferred, exemplary embodiment of the present invention are described herein. A preferred embodiment of the present invention is a substantially circular robotic sweeper containing certain features. As shown in FIG. 2, for example, the mobile robot 10 of a preferred embodiment includes a chassis 11 supporting mechanical and electrical components. These components include various sensors, including two bump sensors 12 & 13 located in the forward portion of the robot, four cliff sensors 14 located on the robot shell 15, and a wall following sensor 16 mounted on the robot shell 15. In other embodiments, as few as one sensor may be used in the robot. One of skill in the art will recognize that the sensor(s) may be of a variety of types including sonar, tactile, electromagnetic, capacitive, etc. Because of cost constraints, a preferred embodiment of the present invention uses bump (tactile) sensors 12 & 13 and reflective IR proximity sensors for the cliff sensors 14 and the wall-following sensor 16. Details of the IR sensors are described in U.S. patent application Ser. No. 09/768,773, which disclosure is hereby incorporated by reference.


A preferred embodiment of the robot also contains two wheels 20, motors 21 for driving the wheels independently, an inexpensive low-end microcontroller 22, and a rechargeable battery 23 or other power source known in the art. These components are well known in the art and are not discussed in detail herein. The robotic cleaning device 10 further includes one or more cleaning heads 30. The cleaning head might contain a vacuum cleaner, various brushes, sponges, mops, electrostatic cloths or a combination of various cleaning elements. The embodiment shown in FIG. 2 also includes a side brush 32.


As mentioned above, a preferred embodiment of the robotic cleaning device 10 comprises an outer shell 15 defining a dominant side, a non-dominant side, and a front portion of the robot 10. The dominant side of the robot is the side that is kept near or in contact with an object (or obstacle) when the robot cleans the area adjacent to that object (or obstacle). In a preferred embodiment, as shown in FIG. 2, the dominant side of the robot 10 is the right-hand side relative to the primary direction of travel, although in other embodiments the dominant side may be the left-hand side. In still other embodiments, the robot may be symmetric and thereby does not need a dominant side; however, in a preferred embodiment, a dominant side is chosen for reasons of cost. The primary direction of travel is shown in FIG. 2 by arrow 40.


In a preferred embodiment, two bump sensors 12 & 13 are located forward of the wheels 20 relative to the direction of forward movement, shown by arrow 40. One bump sensor 13 is located on the dominant side of the robot 10 and the other bump sensor 12 is located on the non-dominant side of the robot 10. When both of these bump sensors 12 & 13 are activated simultaneously, the robot 10 recognizes an obstacle in the front position. In other embodiments, more or fewer individual bump sensors can be used. Likewise, any number of bump sensors can be used to divide the device into any number of radial segments. While in a preferred embodiment the bump sensors 12 & 13 are IR break beam sensors activated by contact between the robot 10 and an obstacle, other types of sensors can be used, including mechanical switches and capacitive sensors that detect the capacitance of objects touching the robot or between two metal plates in the bumper that are compressed on contact. Non-contact sensors, which allow the robot to sense proximity to objects without physically touching the object, such as capacitive sensors or a curtain of IR light, can also be used.


It is useful to have a sensor or sensors that are not only able to tell if a surface has been contacted (or is nearby), but also the angle relative to the robot at which the contact was made. In the case of a preferred embodiment, the robot is able to calculate the time between the activation of the right and left bump switches 12 & 13, if both are activated. The robot is then able to estimate the angle at which contact was made. In a preferred embodiment shown in FIG. 4A, the bump sensor comprises a single mechanical bumper 44 at the front of the robot with sensors 42 & 43 substantially at the two ends of the bumper that sense the movement of the bumper. When the bumper is compressed, the timing between the sensor events is used to calculate the approximate angle at which the robot contacted the obstacle. When the bumper is compressed from the right side, the right bump sensor detects the bump first, followed by the left bump sensor, due to the compliance of the bumper and the bump detector geometry. This way, the bump angle can be approximated with only two bump sensors.


For example, in FIG. 4A, bump sensors 42 & 43 are able to divide the forward portion of the robot into six regions (I-VI). When a bump sensor is activated, the robot calculates the time before the other sensor is activated (if at all). For example, when the right bump sensor 43 is activated, the robot measures the time (t) before the left bump sensor 42 is activated. If t is less than t1, then the robot assumes contact occurred in region IV. If t is greater than or equal to t1 and less than t2, then the robot assumes contact was made in region V. If t is greater than or equal to t2 (including the case where the left bump sensor 42 is not activated at all within the time monitored), then the robot assumes the contact occurred in region VI. If the bump sensors are activated simultaneously, the robot assumes the contact was made from straight ahead. This method can be used to divide the bumper into an arbitrarily large number of regions (for greater precision) depending on the timing used and the geometry of the bumper. As an extension, three sensors can be used to calculate the bump angle in three dimensions instead of just two dimensions as in the preceding example.
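By way of illustration only, the following Python sketch shows how such a timing-based classification might be implemented. The threshold values T1 and T2, the sensor interface, and the mirroring of regions I-III for left-side contacts are illustrative assumptions rather than values taken from the specification.

```python
# Hypothetical sketch of the two-sensor bump-angle estimate described above.
# T1 and T2 (seconds) are illustrative placeholders; the patent does not
# specify their values.

T1 = 0.010  # assumed boundary between the outermost and middle regions
T2 = 0.025  # assumed boundary between the middle and frontmost regions

def bump_region(first_sensor, delay):
    """Return the bumper region (I-VI) for a contact event.

    first_sensor: 'right', 'left', or 'both' -- which bump sensor fired first.
    delay: seconds until the other sensor fired (None if it never fired).
    """
    if first_sensor == 'both':
        return 'front'                     # simultaneous activation: head-on contact
    t = float('inf') if delay is None else delay
    if first_sensor == 'right':            # contact on the dominant side
        if t < T1:
            return 'IV'
        elif t < T2:
            return 'V'
        return 'VI'
    else:                                  # left sensor first: assumed mirror image
        if t < T1:
            return 'III'
        elif t < T2:
            return 'II'
        return 'I'
```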


A preferred embodiment also contains a wall-following or wall-detecting sensor 16 mounted on the dominant side of the robot 10. In a preferred embodiment, the wall following sensor is an IR sensor composed of an emitter and detector pair collimated so that a finite volume of intersection occurs at the expected position of the wall. This focus point is approximately three inches ahead of the drive wheel in the direction of robot forward motion. The radial range of wall detection is about 0.75 inches.


A preferred embodiment also contains any number of IR cliff sensors 14 that prevent the device from tumbling over stairs or other vertical drops. These cliff sensors are of a construction similar to that of the wall following sensor but directed to observe the floor rather than a wall. As an additional safety and sensing measure, the robot 10 includes a wheel-drop sensor that is able to detect if one or more wheels is unsupported by the floor. This wheel-drop sensor can therefore detect not only cliffs but also various obstacles upon which the robot is able to drive, such as lamp bases, high floor transitions, piles of cords, etc.


Other embodiments may use other known sensors or combinations of sensors.



FIG. 3 shows a hardware block diagram of the controller and robot of a preferred embodiment of the invention. In a preferred embodiment, a Winbond W78XXX series processor is used. It is a microcontroller compatible with the MCS-51 family with 36 general purpose I/O ports, 256 bytes of RAM and 16K of ROM. It is clocked at 40 MHz which is divided down for a processor speed of 3.3 MHz. It has two timers which are used for triggering interrupts used to process sensors and generate output signals as well as a watchdog timer. The lowest bits of the fast timer are also used as approximate random numbers where needed in the behaviors. There are also two external interrupts which are used to capture the encoder inputs from the two drive wheels. The processor also has a UART which is used for testing and debugging the robot control program.


The I/O ports of the microprocessor are connected to the sensors and motors of the robot and are the interface connecting it to the internal state of the robot and its environment. For example, the wheel drop sensors are connected to an input port and the brush motor PWM signal is generated on an output port. The ROM on the microprocessor is used to store the coverage and control program for the robot. This includes the behaviors (discussed below), sensor processing algorithms and signal generation. The RAM is used to store the active state of the robot, such as the average bump distance, run time and distance, and the ID of the behavior in control and its current motor commands.


For purposes of understanding the movement of the robotic device, FIG. 4B shows the orientation of the robot 10 centered about the x and y axes in a coordinate plane; this coordinate system is attached to the robot. The directional movement of the robot 10 can be understood to be the radius at which the robot 10 will move. In order to rapidly turn away from the wall 100, the robot 10 should set a positive, small value of r (r3 in FIG. 4B); in order to rapidly turn toward the wall, the robot should set a negative, small value of r (r1 in FIG. 4B). On the other hand, to make slight turns, the robot should set larger absolute values for r—positive values to move left (i.e. away from the wall, r4 in FIG. 4B) and negative values to move right (i.e. toward the wall, r2 in FIG. 4B). This coordinate scheme is used in the examples of control discussed below. The microcontroller 22 determines the turning radius by controlling the differential speed at which the individual wheel motors 21 are run.
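The relationship between the steering value r and the speeds of the two independently driven wheels can be sketched as follows. This is a generic differential-drive calculation offered only for illustration: the track width is an assumed value, and the forward speed reuses the approximately 0.306 m/s STRAIGHT LINE velocity mentioned later in this description.

```python
# Illustrative mapping from the signed steering radius r to differential
# wheel speeds for two independently driven wheels. TRACK_WIDTH is assumed.

TRACK_WIDTH = 0.26     # meters between the two drive wheels (assumed)
FORWARD_SPEED = 0.306  # m/s, the STRAIGHT LINE velocity cited later in the text

def wheel_speeds(r, v=FORWARD_SPEED, b=TRACK_WIDTH):
    """Return (left, right) wheel speeds for a turn of signed radius r.

    Positive r turns the robot left (away from a wall kept on the dominant,
    right-hand side); negative r turns it right (toward the wall). Larger
    |r| gives a gentler turn, matching the convention of FIG. 4B.
    """
    if r == 0:
        return -v, v                       # degenerate case: spin in place
    inner = v * (abs(r) - b / 2) / abs(r)  # wheel on the inside of the turn
    outer = v * (abs(r) + b / 2) / abs(r)  # wheel on the outside of the turn
    return (inner, outer) if r > 0 else (outer, inner)
```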


Also, in certain embodiments, the robot may include one or more user inputs. For example, as shown in FIG. 2, a preferred embodiment includes three simple buttons 33 that allow the user to input the approximate size of the surface to be covered. In a preferred embodiment, these buttons labeled “small,” “medium,” and “large” correspond respectively to rooms of 11.1, 20.8 and 27.9 square meters.


As mentioned above, the exemplary robot is a preferred embodiment for practicing the instant invention, and one of skill in the art is able to choose from elements known in the art to design a robot for a particular purpose. Examples of suitable designs include those described in the following U.S. Pat. No. 4,306,329 (Yokoi), U.S. Pat. No. 5,109,566 (Kobayashi et al.), U.S. Pat. No. 5,293,955 (Lee), U.S. Pat. No. 5,369,347 (Yoo), U.S. Pat. No. 5,440,216 (Kim), U.S. Pat. No. 5,534,762 (Kim), U.S. Pat. No. 5,613,261 (Kawakami et al), U.S. Pat. No. 5,634,237 (Paranjpe), U.S. Pat. No. 5,781,960 (Kilstrom et al.), U.S. Pat. No. 5,787,545 (Colens), U.S. Pat. No. 5,815,880 (Nakanishi), U.S. Pat. No. 5,839,156 (Park et al.), U.S. Pat. No. 5,926,909 (McGee), U.S. Pat. No. 6,038,501 (Kawakami), U.S. Pat. No. 6,076,226 (Reed), all of which are hereby incorporated by reference.



FIG. 5 shows a simple block representation of the various operational modes of a device. In a preferred embodiment, and by way of example only, operational modes may include spot cleaning (where the user or robot designates a specific region for cleaning), edge cleaning, and room cleaning. Each operational mode comprises complex combinations of instructions and/or internal behaviors, discussed below. These complexities, however, are generally hidden from the user. In one embodiment, the user can select the particular operational mode by using an input element, for example, a selector switch or push button. In other preferred embodiments, as described below, the robot is able to autonomously cycle through the operational modes.


The coverage robot of the instant invention uses these various operational modes to effectively cover the area. While one of skill in the art may implement these various operational modes in a variety of known architectures, a preferred embodiment relies on behavior control. Here, behaviors are simply layers of control systems that all run in parallel. The microcontroller 22 then runs a prioritized arbitration scheme to resolve the dominant behavior for a given scenario. A description of behavior control can be found in Mobile Robots, supra, the text of which is hereby incorporated by reference.


In other words, in a preferred embodiment, the robot's microprocessor and control software run a number of behaviors simultaneously. Depending on the situation, control of the robot will be given to one or more various behaviors. For purposes of detailing the preferred operation of the present invention, the behaviors will be described as (1) coverage behaviors, (2) escape behaviors or (3) user/safety behaviors. Coverage behaviors are primarily designed to allow the robot to perform its coverage operation in an efficient manner. Escape behaviors are special behaviors that are given priority when one or more sensor inputs suggest that the robot may not be operating freely. As a convention for this specification, behaviors discussed below are written in all capital letters.
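A minimal sketch of such a prioritized arbitration scheme is given below. The class layout, method names and priority values are illustrative assumptions; only the general structure (parallel behaviors polled each cycle, with the highest-priority active behavior given control of the robot) follows the description above.

```python
# Minimal sketch of prioritized behavior arbitration: all behaviors are
# polled every control cycle and the highest-priority behavior that requests
# control drives the motors.

class Behavior:
    def __init__(self, name, priority):
        self.name = name
        self.priority = priority          # larger number = higher priority

    def wants_control(self, sensors):
        raise NotImplementedError         # each behavior inspects the sensor state

    def command(self, sensors):
        raise NotImplementedError         # returns motor commands (e.g. radius, speed)

def arbitrate(behaviors, sensors):
    """Return the name and command of the highest-priority active behavior.

    A low-priority default such as STRAIGHT LINE is assumed to always want
    control, so at least one behavior is active every cycle.
    """
    winner = max((b for b in behaviors if b.wants_control(sensors)),
                 key=lambda b: b.priority)
    return winner.name, winner.command(sensors)
```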


1. Coverage Behaviors



FIGS. 6-14 show the details of each of the preferred operational modes: Spot Coverage, Wall Follow (or Obstacle Follow) and Room Coverage.


Operational Mode: Spot Coverage


Spot coverage or, for example, spot cleaning allows the user to clean an isolated dirty area. The user places the robot 10 on the floor near the center of the area (see reference numeral 40 in FIGS. 6A, 6B) that requires cleaning and selects the spot-cleaning operational mode. The robot then moves in such a way that the immediate area within, for example, a defined radius, is brought into contact with the cleaning head 30 or side brush 32 of the robot.


In a preferred embodiment, the method of achieving spot cleaning is a control algorithm providing outward spiral movement, or SPIRAL behavior, as shown in FIG. 6A. In general, spiral movement is generated by increasing the turning radius as a function of time. In a preferred embodiment, the robot 10 begins its spiral in a counter-clockwise direction, marked in FIG. 6A by movement line 45, in order to keep the dominant side on the outward, leading-edge of the spiral. In another embodiment, shown in FIG. 6B, spiral movement of the robot 10 is generated inward such that the radius of the turns continues to decrease. The inward spiral is shown as movement line 45 in FIG. 6B. It is not necessary, however, to keep the dominant side of the robot on the outside during spiral motion.


The method of spot cleaning used in a preferred embodiment—outward spiraling—is set forth in FIG. 7. Once the spiraling is initiated (step 201) and the value of r is set at its minimum, positive value (which will produce the tightest possible counterclockwise turn), the spiraling behavior recalculates the value of r as a function of θ, where θ represents the angular turning since the initiation of the spiraling behavior (step 210). By using the equation r = aθ, where a is a constant coefficient, the tightness or desired overlap of the spiral can be controlled. (Note that θ is not normalized to 2π.) The value of a can be chosen by the equation


a = d/(2π);

where d is the distance between two consecutive passes of the spiral. For effective cleaning, a value for d should be chosen that is less than the width of the cleaning mechanism 30. In a preferred embodiment, a value of d is selected that is between one-half and two-thirds of the width of the cleaning head 30.
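For illustration, the spiral update can be sketched as follows. The cleaning-head width is an assumed value; the coefficient a = d/(2π) and the choice of d between one-half and two-thirds of the head width follow the text.

```python
import math

# Sketch of the SPIRAL radius update r = a*theta with a = d/(2*pi).
# HEAD_WIDTH is an assumed cleaning-head width; d is chosen between one-half
# and two-thirds of that width, per the text.

HEAD_WIDTH = 0.25            # meters (assumed)
d = 0.6 * HEAD_WIDTH         # desired spacing between consecutive spiral passes
a = d / (2 * math.pi)        # coefficient from a = d/(2*pi)

def spiral_radius(theta):
    """Turning radius after the robot has turned through theta radians.

    theta accumulates from the start of the spiral and is not normalized to
    2*pi, so the commanded radius grows by d with every full revolution.
    """
    return a * theta

# Example: after five full revolutions the commanded radius is 5*d.
print(spiral_radius(10 * math.pi))   # 0.75 m when d = 0.15 m
```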


In other embodiments, the robot tracks its total distance traveled in spiral mode. The spiral will deteriorate after some distance, i.e. the centerpoint of the spiral motion will tend to drift over time due to surface-dependent wheel slippage and/or inaccuracies in the spiral approximation algorithm and calculation precision. In certain embodiments, therefore, the robot may exit spiral mode after it has traveled a specific distance (“maximum spiral distance”), such as 6.3 or 18.5 meters (step 240). In a preferred embodiment, the robot uses multiple maximum spiral distances depending on whether the robot is performing an initial spiral or a later spiral. If the maximum spiral distance is reached without a bump, the robot gives control to a different behavior and, for example, then continues to move in a predominantly straight line. (In a preferred embodiment, a STRAIGHT LINE behavior is a low-priority, default behavior that propels the robot in an approximately straight line at a preset velocity of approximately 0.306 m/s when no other behaviors are active.)


In spiral mode, various actions can be taken when an obstacle is encountered. For example, the robot could (a) seek to avoid the obstacle and continue the spiral in the counter-clockwise direction, (b) seek to avoid the obstacle and continue the spiral in the opposite direction (e.g. changing from counter-clockwise to clockwise), or (c) change operational modes. Continuing the spiral in the opposite direction is known as reflective spiraling and is represented in FIG. 6C, where the robot 10 reverses its movement path 45 when it comes into contact with obstacle 101. In a preferred embodiment, as detailed in step 220, the robot 10 exits spot cleaning mode upon the first obstacle encountered by a bump sensor 12 or 13.


While a preferred embodiment describes a spiral motion for spot coverage, any self-bounded area can be used, including but not limited to regular polygons such as squares and hexagons, ellipses, etc.


Operational Mode: Wall/Obstacle Following


Wall following or, in the case of a cleaning robot, edge cleaning, allows the user to clean only the edges of a room or the edges of objects within a room. The user places the robot 10 on the floor near an edge to be cleaned and selects the edge-cleaning operational mode. The robot 10 then moves in such a way that it follows the edge and cleans all areas brought into contact with the cleaning head 30 of the robot.


The movement of the robot 10 in a room 110 is shown in FIG. 8A, 8B. In FIG. 8A, the robot 10 is placed along wall 100, with the robot's dominant side next to the wall. The robot then runs along the wall indefinitely following movement path 46. Similarly, in FIG. 8B, the robot 10 is placed in the proximity of an obstacle 101. The robot then follows the edge of the obstacle 101 indefinitely following movement path 47.


In a preferred embodiment, in the wall-following mode, the robot uses the wall-following sensor 16 to position itself a set distance from the wall. The robot then proceeds to travel along the perimeter of the wall. As shown in FIGS. 8A & 8B, in a preferred embodiment, the robot 10 is not able to distinguish between a wall 100 and another solid obstacle 101.


The method used in a preferred embodiment for following the wall is detailed in FIG. 9A and provides a smooth wall following operation even with a one-bit sensor. (Here the one-bit sensor detects only the presence or absence of the wall within a particular volume rather than the distance between wall and sensor.) Other methods of detecting a wall or object can be used such as bump sensing or sonar sensors.


Once the wall-following operational mode, or WALL-FOLLOWING behavior of a preferred embodiment, is initiated (step 301), the robot first sets its initial value for the steering at r0. The WALL-FOLLOWING behavior then initiates the emit-detect routine in the wall-follower sensor 16 (step 310). The existence of a reflection from the IR transmitter portion of the sensor 16 translates into the existence of an object within a predetermined distance from the sensor 16. The WALL-FOLLOWING behavior then determines whether there has been a transition from a reflection (object within range) to a non-reflection (object outside of range) (step 320). If there has been a transition (in other words, the wall is now out of range), the value of r is set to its most negative value and the robot will veer slightly to the right (step 325). The robot then begins the emit-detect sequence again (step 310). If there has not been a transition from a reflection to a non-reflection, the WALL-FOLLOWING behavior then determines whether there has been a transition from non-reflection to reflection (step 330). If there has been such a transition, the value of r is set to its most positive value and the robot will veer slightly left (step 335).


In the absence of either type of transition event, the WALL-FOLLOWING behavior reduces the absolute value of r (step 340) and begins the emit-detect sequence (step 310) anew. By decreasing the absolute value of r, the robot 10 begins to turn more sharply in whatever direction it is currently heading. In a preferred embodiment, the rate of decreasing the absolute value of r is a constant rate dependent on the distance traveled.
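The emit-detect steering loop of FIG. 9A can be sketched as follows. The numeric steering limits and the decay step are illustrative assumptions; only the structure (veer toward the wall on a reflection-to-non-reflection transition, veer away on the opposite transition, otherwise tighten the current turn) comes from the description above.

```python
# Sketch of the one-bit wall-following steering update of FIG. 9A. The
# numeric steering limits and the decay step are illustrative placeholders;
# r follows the sign convention of FIG. 4B (negative = toward the wall).

R_MOST_POSITIVE = +1.0   # veer slightly left, away from the wall (step 335)
R_MOST_NEGATIVE = -1.0   # veer slightly right, toward the wall (step 325)
DECAY = 0.05             # reduction of |r| per cycle with no transition (step 340)

def wall_follow_step(r, saw_wall_before, sees_wall_now):
    """Return the updated steering value r after one emit-detect cycle."""
    if saw_wall_before and not sees_wall_now:
        return R_MOST_NEGATIVE            # wall went out of range: turn toward it
    if not saw_wall_before and sees_wall_now:
        return R_MOST_POSITIVE            # wall came back into range: turn away
    # No transition: reduce |r| so the robot turns more sharply in whatever
    # direction it is already heading.
    magnitude = max(abs(r) - DECAY, 0.0)
    return magnitude if r >= 0 else -magnitude
```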


The wall follower mode can be continued for a predetermined or random time, a predetermined or random distance, or until some additional criteria are met (e.g. the bump sensor is activated, etc.). In one embodiment, the robot continues to follow the wall indefinitely. In a preferred embodiment, as shown in FIGS. 8C & 8D wherein reference numeral 46 identifies the movement of the robot, minimum and maximum travel distances are determined, whereby the robot will remain in WALL-FOLLOWING behavior until the robot has either traveled the maximum distance (FIG. 8D) or traveled at least the minimum distance and encountered an obstacle 101 (FIG. 8C). This implementation of WALL-FOLLOWING behavior ensures the robot spends an appropriate amount of time in WALL-FOLLOWING behavior as compared to its other operational modes, thereby decreasing systematic neglect and distributing coverage to all areas. By increasing wall following, the robot is able to move into more spaces, but the robot is less efficient at cleaning any one space. In addition, by tending to exit WALL-FOLLOWING behavior after obstacle detection, the robot increases its perceived effectiveness.



FIG. 9B is a flow-chart illustration showing this embodiment of determining when to exit WALL-FOLLOWING (WF) behavior. The robot first determines the minimum distance to follow the wall (dmin) and the maximum distance to follow the wall (dmax). While in wall (or obstacle) following mode, the control system tracks the distance the robot has traveled in that mode (dWF). If dWF is greater than dmax (step 350), then the robot exits wall-following mode (step 380). If, however, dWF is less than dmax (step 350) and dWF is less than dmin (step 360), the robot remains in wall-following mode (step 385). If dWF is greater than dmin (step 360) and an obstacle is encountered (step 370), the robot exits wall-following mode (step 380).
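A compact sketch of this exit test, together with the randomized choice of minimum and maximum follow distances described in the next paragraph, might look like the following. The distance ranges are taken from the text; the function names and the uniform sampling are illustrative assumptions.

```python
import random

# Sketch of the FIG. 9B exit test, plus the randomized choice of minimum and
# maximum follow distances. The 115-350 cm and 170-520 cm ranges are from the
# text; uniform sampling is an illustrative assumption (certain embodiments
# instead fix a 2:3 ratio between the two distances).

def pick_follow_distances():
    d_min = random.uniform(1.15, 3.50)    # meters
    d_max = random.uniform(1.70, 5.20)    # meters
    return d_min, d_max

def should_exit_wall_following(d_wf, d_min, d_max, obstacle_encountered):
    """Return True when the robot should leave WALL-FOLLOWING."""
    if d_wf > d_max:                      # step 350: maximum distance reached
        return True
    if d_wf < d_min:                      # step 360: minimum distance not yet reached
        return False
    return obstacle_encountered           # step 370: past d_min and bumped something
```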


Theoretically, the optimal distance for the robot to travel in WALL-FOLLOWING behavior is a function of room size and configuration and robot size. In a preferred embodiment, the minimum and maximum distances to remain in WALL-FOLLOWING are set based upon the approximate room size, the robot's width and a random component, whereby the average minimum travel distance is 2 w/p, where w is the width of the work element of the robot and p is the probability that the robot will enter WALL-FOLLOWING behavior in a given interaction with an obstacle. By way of example, in a preferred embodiment, w is approximately between 15 cm and 25 cm, and p is 0.095 (where the robot encounters 6 to 15 obstacles, or an average of 10.5 obstacles, before entering an obstacle following mode). The minimum distance is then set randomly as a distance between approximately 115 cm and 350 cm; the maximum distance is then set randomly as a distance between approximately 170 cm and 520 cm. In certain embodiments the ratio between the minimum distance and the maximum distance is 2:3. For the sake of perceived efficiency, the robot's initial operation in an obstacle following mode can be set to be longer than its later operations in obstacle following mode. In addition, users may place the robot along the longest wall when starting the robot, which improves actual as well as perceived coverage.


The distance that the robot travels in wall following mode can also be set by the robot depending on the number and frequency of objects encountered (as determined by other sensors), which is a measure of room “clutter.” If more objects are encountered, the robot would wall follow for a greater distance in order to get into all the areas of the floor. Conversely, if few obstacles are encountered, the robot would wall follow less in order to not over-cover the edges of the space in favor of passes through the center of the space. An initial wall-following distance can also be included to allow the robot to follow the wall a longer or shorter distance during its initial period where the WALL-FOLLOWING behavior has control.


In a preferred embodiment, the robot may also leave wall-following mode if the robot turns more than, for example, 270 degrees and is unable to locate the wall (or object) or if the robot has turned a total of 360 degrees since entering wall-following mode.


In certain embodiments, when the WALL-FOLLOWING behavior is active and there is a bump, the ALIGN behavior becomes active. The ALIGN behavior turns the robot counter-clockwise to align the robot with the wall. The robot always turns a minimum angle to avoid getting into cycles of many small turns. After it has turned through its minimum angle, the robot monitors its wall sensor; if it detects a wall and then the wall detection goes away, the robot stops turning. This is because at the end of the wall-follower range, the robot is well aligned to start WALL-FOLLOWING. If the robot has not seen its wall detector go on and then off by the time it reaches its maximum angle, it stops anyway. This prevents the robot from turning around in circles when the wall is out of range of its wall sensor. When the most recent bump is within the side 60 degrees of the bumper on the dominant side, the minimum angle is set to 14 degrees and the maximum angle is 19 degrees. Otherwise, if the bump is within 30 degrees of the front of the bumper on the dominant side or on the non-dominant side, the minimum angle is 20 degrees and the maximum angle is 44 degrees. When the ALIGN behavior has completed turning, it cedes control to the WALL-FOLLOWING behavior.
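For illustration, the ALIGN angle selection and stopping test might be sketched as follows. The angle values come from the text; the decomposition into helper functions is an assumption.

```python
# Sketch of the ALIGN turn described above: turn counter-clockwise at least a
# minimum angle, then stop as soon as the wall sensor has gone on and then
# off again, or at the maximum angle.

def align_angles(bump_in_side_60_dominant):
    """Return (min_angle, max_angle) in degrees for the counter-clockwise turn."""
    if bump_in_side_60_dominant:          # bump within the side 60 degrees, dominant side
        return 14, 19
    return 20, 44                         # bump near the front or on the non-dominant side

def align_done(turned, min_angle, max_angle, wall_seen_then_lost):
    """True when ALIGN should stop and cede control to WALL-FOLLOWING."""
    if turned < min_angle:
        return False                      # always turn at least the minimum angle
    if wall_seen_then_lost:
        return True                       # at the edge of the wall-sensor range: aligned
    return turned >= max_angle            # stop anyway at the maximum angle
```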


Operational Mode: Room Coverage


The third operational mode is here called room-coverage or room cleaning mode, which allows the user to clean any area bounded by walls, stairs, obstacles or other barriers. To exercise this option, the user places the robot on the floor and selects room-cleaning mode. The robot then moves about the room, cleaning all areas that it is able to reach.


In a preferred embodiment, the method of performing the room cleaning behavior is a BOUNCE behavior in combination with the STRAIGHT LINE behavior. As shown in FIG. 10, the robot 10 travels until a bump sensor 12 and/or 13 is activated by contact with an obstacle 101 or a wall 100. The robot 10 then turns and continues to travel. A sample movement path is shown in FIG. 10 as line 48.


The algorithm for random bounce behavior is set forth in FIG. 11. The robot 10 continues its forward movement (step 401) until a bump sensor 12 and/or 13 is activated (step 410). The robot 10 then calculates an acceptable range of new directions based on a determination of which bump sensor or sensors have been activated (step 420). A determination is then made, with some random calculation, to choose the new heading within that acceptable range, such as 90 to 270 degrees relative to the object the robot encountered. The angle of the object the robot has bumped is determined as described above using the timing between the right and left bump sensors. The robot then turns to its new heading. In a preferred embodiment, the turn is either clockwise or counterclockwise depending on which direction requires the least movement to achieve the new heading. In other embodiments, the turn is accompanied by movement forward in order to increase the robot's coverage efficiency.


The statistics of the heading choice made by the robot can be distributed uniformly across the allowed headings, i.e. there is an equivalent chance for any heading within the acceptable range. Alternately we can choose statistics based on a Gaussian or other distribution designed to preferentially drive the robot perpendicularly away from a wall.
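A sketch of the heading choice is given below. The uniform sampling over 90 to 270 degrees and the shortest-rotation turn follow the description above; the angle bookkeeping is an illustrative assumption, and a Gaussian distribution could be substituted as noted.

```python
import random

# Sketch of the BOUNCE turn: choose a new heading uniformly within an
# acceptable range (for example 90 to 270 degrees measured from the bumped
# object) and turn whichever way needs the smaller rotation.

def choose_bounce_turn(current_heading, obstacle_bearing):
    """Return (new_heading, signed_turn) in degrees; positive turn = counter-clockwise."""
    # A Gaussian centered on 180 degrees could be substituted here to
    # preferentially drive the robot perpendicularly away from a wall.
    new_heading = (obstacle_bearing + random.uniform(90, 270)) % 360
    signed_turn = (new_heading - current_heading + 180) % 360 - 180  # shortest rotation
    return new_heading, signed_turn
```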


In other embodiments, the robot could change directions at random or predetermined times and not based upon external sensor activity. Alternatively, the robot could continuously make small angle corrections based on long-range sensors to avoid even contacting an object and thereby cover the surface area with curved paths.


In a preferred embodiment, the robot stays in room-cleaning mode until a certain number of bounce interactions are reached, usually between 6 and 13.


2. Escape Behaviors


There are several situations the robot may encounter while trying to cover an area that prevent or impede it from covering all of the area efficiently. A general class of sensors and behaviors called escape behaviors are designed to get the robot out of these situations, or in extreme cases to shut the robot off if it is determined it cannot escape. In order to decide whether to give an escape behavior priority among the various behaviors on the robot, the robot determines the following: (1) is an escape behavior needed; (2) if yes, which escape behavior is warranted?


By way of example, the following illustrate situations where an escape behavior is needed for an indoor cleaning robot, and the appropriate behavior to run:

    • (i) Situation 1. The robot detects a situation where it might get stuck—for example, a high spot in a carpet or near a lamp base that acts like a ramp for the robot. The robot performs small “panic” turn behaviors to get out of the situation;
    • (ii) Situation 2. The robot is physically stuck—for example, the robot is wedged under a couch or against a wall, tangled in cords or carpet tassels, or stuck on a pile of electrical cords with its wheels spinning. The robot performs large panic turn behaviors and turns off relevant motors to escape from the obstruction;
    • (iii) Situation 3. The robot is in a small, confined area—for example, the robot is between the legs of a chair or in the open area under a dresser, or in a small area created by placing a lamp close to the corner of a room. The robot edge follows using its bumper and/or performs panic turn behaviors to escape from the area; and
    • (iv) Situation 4. The robot has been stuck and cannot free itself—for example, the robot is in one of the cases in category (ii), above, and has not been able to free itself with any of its panic behaviors. In this case, the robot stops operation and signals to the user for help. This preserves battery life and prevents damage to floors or furniture.


In order to detect the need for each escape situation, various sensors are used. For example:

    • (i) Situation 1. (a) When the brush or side brush current rise above a threshold, the voltage applied to the relevant motor is reduced. Whenever this is happening, a stall rate variable is increased. When the current is below the threshold, the stall rate is reduced. If the stall level rises above a low threshold and the slope of the rate is positive, the robot performs small panic turn behaviors. It only repeats these small panic turn behaviors when the level has returned to zero and risen to the threshold again. (b) Likewise, there is a wheel drop level variable which is increased when a wheel drop event is detected and is reduced steadily over time. When a wheel drop event is detected and the wheel drop level is above a threshold (meaning there have been several wheel drops recently), the robot performs small or large panic turn behaviors depending on the wheel drop level.
    • (ii) Situation 2. (a) When the brush stall rate rises above a high threshold and the slope is positive, the robot turns off the brush for 13 seconds and performs large panic turn behaviors at 1, 4, and 7 seconds. At the end of the 13 seconds, the brush is turned back on. (b) When the drive stall rate rises above a medium threshold and the slope is positive, the robot performs large panic turn behaviors continuously. (c) When the drive stall rate rises above a high threshold, the robot turns off all of the motors for 15 seconds. At the end of the 15 seconds, the motors are turned back on. (d) When the bumper of the robot is held in constantly for 5 seconds (as in a side wedging situation), the robot performs a large panic turn behavior. It repeats the panic turn behavior every 5 seconds until the bumper is released. (e) When the robot has gotten no bumps for a distance of 20 feet, it assumes that it might be stuck with its wheels spinning. To free itself, it performs a spiral. If it has still not gotten a bump for 10 feet after the end of the spiral, it performs a large panic turn behavior. It continues this every 10 feet until it gets a bump.
    • (iii) Situation 3. (a) When the average distance between bumps falls below a low threshold, the robot performs edge following using its bumper to try to escape from the confined area. (b) When the average distance between bumps falls below a very low threshold, the robot performs large panic turn behaviors to orient it so that it may better be able to escape from the confined area.
    • (iv) Situation 4. (a) When the brush has stalled and been turned off several times recently and the brush stall rate is high and the slope is positive, the robot shuts off. (b) When the drive has stalled and the motors turned off several times recently and the drive stall rate is high and the slope is positive, the robot shuts off. (c) When any of the wheels are dropped continuously for greater than 2 seconds, the robot shuts off. (d) When many wheel drop events occur in a short time, the robot shuts off. (e) When any of the cliff sensors sense a cliff continuously for 10 seconds, the robot shuts off. (f) When the bump sensor is constantly depressed for a certain amount of time, for example 10 seconds, it is likely that the robot is wedged, and the robot shuts off.


As a descriptive example, FIGS. 12A & 12B illustrate the analysis used in a preferred embodiment for identifying the need for an escape behavior relative to a stalled brush motor, as described above in Situations 1, 2 and 4. Each time the brush current exceeds a given limit for the brush motor (step 402), a rate register is incremented by 1 (step 404); if no limit is detected, the rate register is decremented by 1 (step 406). A separate slope register stores the recent values for a recent time period such as 120 cycles. If the rate is above 600 (where 600 corresponds to one second of constant stall) (step 414) and the slope is positive (step 416), then the robot will run an escape behavior (step 420) if the escape behavior is enabled (step 418). The escape behaviors are disabled after running (step 428) until the rate has returned to zero (step 422), re-enabled (step 424) and risen to 600 again. This is done to avoid the escape behavior being triggered constantly at rates above 600.
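For illustration, the stall-rate bookkeeping might be sketched as follows. The threshold of 600 and the 120-cycle slope window come from the text; clamping the rate at zero and the class structure are illustrative assumptions.

```python
from collections import deque

# Sketch of the brush-stall bookkeeping of FIG. 12A: the rate register counts
# up while the brush is over-current and down otherwise, a 120-cycle history
# provides the slope, and an escape fires when the rate passes 600 (about one
# second of constant stall) with a positive slope. The trigger is re-armed
# only after the rate returns to zero.

class BrushStallMonitor:
    def __init__(self, threshold=600, window=120):
        self.rate = 0
        self.threshold = threshold
        self.history = deque(maxlen=window)
        self.escape_enabled = True

    def update(self, over_current):
        """Feed one control cycle; return True when an escape behavior should run."""
        self.rate = max(self.rate + (1 if over_current else -1), 0)
        self.history.append(self.rate)
        slope_positive = len(self.history) > 1 and self.history[-1] > self.history[0]
        if self.rate == 0:
            self.escape_enabled = True            # re-arm once the rate returns to zero
        if self.escape_enabled and self.rate > self.threshold and slope_positive:
            self.escape_enabled = False           # disable until the rate falls to zero
            return True
        return False
```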


If, however, the rate is above 2400 (step 410) and the slope is positive (step 412), the robot will run a special set of escape behaviors, shown in FIG. 12B. In a preferred embodiment, the brush motor will shut off (step 430), the “level” is incremented by a predetermined amount (50 to 90) (step 430), the stall time is set (step 430), and a panic behavior (step 452) is performed at 1 second (step 445), 4 seconds (step 450) and 7 seconds (step 455) after the brush shut off. The control system then restarts the brush at 13 seconds (steps 440 & 442). Level is decremented by 1 every second (step 444). If level reaches a maximum threshold (step 435), the robot ceases all operation (step 437). In addition, the robot may take additional actions when certain stalls are detected, such as limiting the voltage to the motor to prevent damage to the motor.


A preferred embodiment of the robot has four escape behaviors: TURN, EDGE, WHEEL DROP and SLOW.

    • TURN. The robot turns in place in a random direction, starting at a higher velocity (approximately twice of its normal turning velocity) and decreasing to a lower velocity (approximately one-half of its normal turning velocity). Varying the velocity may aid the robot in escaping from various situations. The angle that the robot should turn can be random or a function of the degree of escape needed or both. In a preferred embodiment, in low panic situations the robot turns anywhere from 45 to 90 degrees, and in high panic situations the robot turns anywhere from 90 to 270 degrees.
    • EDGE. The robot follows the edge using its bump sensor until (a) the robot turns 60 degrees without a bump or (b) the robot cumulatively has turned more than 170 degrees since the EDGE behavior initiated. The EDGE behavior may be useful if the average bump distance is low (but not so low as to cause a panic behavior). The EDGE behavior allows the robot to fit through the smallest openings physically possible for the robot and so can allow the robot to escape from confined areas.
    • WHEEL DROP. The robot briefly back-drives its wheels, then stops them. The back driving of the wheels helps to minimize false positive wheel drops by giving the wheels a small kick in the opposite direction. If the wheel drop is gone within 2 seconds, the robot continues normal operation.
    • SLOW. If a wheel drop or a cliff detector goes off, the robot slows down to a speed of 0.235 m/s (or 77% of its normal speed) for a distance of 0.5 m and then ramps back up to its normal speed.


In addition to the coverage behaviors and the escape behaviors, the robot also might contain additional behaviors related to safety or usability. For example, if a cliff is detected for more than a predetermined amount of time, the robot may shut off. When a cliff is first detected, a cliff avoidance response behavior takes immediate precedence over all other behaviors, rotating the robot away from the cliff until the robot no longer senses the cliff. In a preferred embodiment, the cliff detection event does not cause a change in operational modes. In other embodiments, the robot could use an algorithm similar to the wall-following behavior to allow for cliff following.


The individual operation of the three operational modes has been described above; we now turn to the preferred mode of switching between the various modes.


In order to achieve the optimal coverage and cleaning efficiency, a preferred embodiment uses a control program that gives priority to various coverage behaviors. (Escape behaviors, if needed, are always given a higher priority.) For example, the robot 10 may use the wall following mode for a specified or random time period and then switch operational modes to the room cleaning. By switching between operational modes, the robotic device of the present invention is able to increase coverage, cleaning efficiency and perceived effectiveness.


By way of example, FIGS. 13A & 13B show a mobile robot 10 in a “dog bone” shaped environment in which two rooms 115 & 116 of roughly equal dimensions are connected by a narrow passageway 105. (This example illustrates the robot diffusion problem discussed earlier.) This arrangement is a simplified version of typical domestic environments, where the “dog bone” may be generated by the arrangements of obstacles within the room. In FIG. 13A, the path of robot 10 is traced as line 54 as robot 10 operates in random bounce mode. The robot 10 is unable to move from room 116 into room 115 during the limited run because the robot's random behavior did not happen to lead it through passageway 105. This method leaves the coverage far less than optimal, and the cleaning rate is decreased due to the number of times the robot 10 crosses its own path.



FIG. 13B shows the movement of a preferred embodiment of robot 10, whereby the robot cycles between BOUNCE and WALL FOLLOWING behaviors. As the robot follows path 99, each time the robot 10 encounters a wall 100, the robot follows the wall for a distance equal to twice the robot's diameter. The portions of the path 99 in which the robot 10 operates in wall following mode are labeled 51. This method provides greatly increased coverage, along with attendant increases in cleaning rate and perceived effectiveness.


Finally, a preferred embodiment of the present invention is detailed in FIG. 14, in which all three operational modes are used. In a preferred embodiment, the device 10 begins in spiral mode (movement line 45). If a reflective spiral pattern is used, the device continues in spiral mode until a predetermined or random number of reflective events has occurred. If a standard spiral is used (as shown in FIG. 14), the device should continue until any bump sensor event. In a preferred embodiment, the device immediately enters wall following mode after the triggering event.


In a preferred embodiment, the device then switches between wall following mode (movement lines 51) and random bounce mode (movement lines 48) based on bump sensor events or the completion of the wall following algorithm. In one embodiment, the device does not return to spiral mode; in other embodiments, however, the device can enter spiral mode based on a predetermined or random event.
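The mode switching just described reduces to a small state machine. The sketch below paraphrases that cycle under the transitions stated in the text (spiral until a bump, then alternating wall following and bounce); it is illustrative only, and the events that re-enter spiral mode are omitted.

```python
from enum import Enum, auto

class Mode(Enum):
    SPIRAL = auto()
    WALL_FOLLOW = auto()
    BOUNCE = auto()

def next_mode(mode, bump_event, wall_follow_complete):
    """Simplified transition rules for the three coverage modes of FIG. 14."""
    if mode is Mode.SPIRAL and bump_event:
        return Mode.WALL_FOLLOW          # a bump ends the spiral and starts wall following
    if mode is Mode.WALL_FOLLOW and wall_follow_complete:
        return Mode.BOUNCE               # e.g. after following for twice the robot's diameter
    if mode is Mode.BOUNCE and bump_event:
        return Mode.WALL_FOLLOW          # each wall encounter triggers another wall-follow leg
    return mode
```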


In a preferred embodiment, the robot keeps a record of the average distance traveled between bumps. The robot then calculates an average bump distance (ABD) using the following formula: (¾ × ABD) + (¼ × most recent distance between bumps). If the ABD is above a predetermined threshold, the robot will again give priority to the SPIRAL behavior. In still other embodiments, the robot may require a minimum number of bump events before the SPIRAL behavior will again be given priority. In other embodiments, the robot may enter SPIRAL behavior if it travels a maximum distance, for example 20 feet, without a bump event.
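The average bump distance update is a simple exponentially weighted average. The sketch below applies the stated ¾/¼ weighting; the threshold values are illustrative assumptions, since the text specifies only "a predetermined threshold" and gives 20 feet as one example distance.

```python
TWENTY_FEET_M = 6.1   # approximate metric equivalent of the 20-foot example in the text

def update_abd(abd, latest_bump_distance):
    """New ABD = (3/4) x previous ABD + (1/4) x most recent distance between bumps."""
    return 0.75 * abd + 0.25 * latest_bump_distance

def should_prioritize_spiral(abd, distance_since_last_bump,
                             abd_threshold=2.0,            # illustrative value only
                             max_bump_free_distance=TWENTY_FEET_M):
    """Give SPIRAL priority again when the area ahead appears open."""
    return abd > abd_threshold or distance_since_last_bump >= max_bump_free_distance
```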


In addition, the robot can also have conditions upon which it stops all operations. For example, for a given room size, which can be manually selected, a minimum and a maximum run time are set and a minimum total distance is selected. When the minimum time and the minimum distance have both been reached, the robot shuts off. Likewise, if the maximum time has been reached, the robot shuts off.
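These stopping conditions can be checked with a single predicate. The sketch below assumes the minimum and maximum run times and the minimum distance have already been selected for the room size.

```python
def should_shut_off(elapsed_s, distance_m, min_time_s, max_time_s, min_distance_m):
    """Shut off when both minimums are satisfied, or unconditionally at the maximum run time."""
    if elapsed_s >= max_time_s:
        return True
    return elapsed_s >= min_time_s and distance_m >= min_distance_m
```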


Of course, a manual control for selecting between operational modes can also be used. For example, a remote control could be used to change or influence operational modes or behaviors. Likewise, a switch mounted on the shell itself could be used to set the operational mode or the switching between modes. For instance, a switch could be used to set the level of clutter in a room, allowing the robot to select a coverage algorithm better suited to its limited sensing ability.


One of skill in the art will recognize that portions of the instant invention can be used in autonomous vehicles for a variety of purposes besides cleaning. The scope of the invention should be determined by the appended claims and their legal equivalents, rather than by the examples given.

Claims
  • 1. A mobile coverage robot, comprising:
    a drive mechanism comprising drive wheels that both drive the robot forward across a surface in a drive direction and turn the robot to change the drive direction;
    a floor cleaner disposed on a lateral side of the robot;
    a proximity sensor aimed forward of the drive wheels in the drive direction, the proximity sensor responsive to an object proximate the lateral side of the robot;
    a tactile sensor responsive to a bump event between the robot and an object, the tactile sensor comprising a bumper switch; and
    a plurality of floor level sensors, each floor level sensor responsive to a condition of an area below the robot, the floor level sensors comprising: a cliff sensor aimed forward of the drive wheels in the drive direction and responsive to a presence of a cliff in the drive direction of the robot; and a wheel drop sensor responsive to a wheel drop event; and
    a drive controller in communication with the proximity sensor, the tactile sensor, and the plurality of floor level sensors, the drive controller configured to: operate the robot to follow a sensed object on the lateral side of the robot; operate the robot to travel in an altered direction in response to a bump event between the robot and the object, and to shut off the robot in response to determining that the bumper switch has been constantly depressed for a predetermined amount of time; operate the robot to avoid a cliff; and reduce the velocity of the robot in response to a wheel drop event.
  • 2. The mobile robot of claim 1, wherein the floor cleaner comprises a rotating brush, and wherein the drive controller is further configured to: change a moving direction of the robot when a stall rate of the brush exceeds a predetermined threshold and a slope of the stall rate is positive, and shut the robot off when the stall rate of the brush continues to exceed the predetermined threshold, and the slope of the stall rate continues to be positive, after the drive controller changes the moving direction multiple times.
  • 3. The mobile robot of claim 1, wherein the drive controller operates the robot to avoid a cliff by rotating the robot away from the cliff.
  • 4. The mobile robot of claim 1, wherein the proximity sensor is aimed at a finite volume of detection space at an expected object.
  • 5. The mobile robot of claim 1, wherein the tactile sensor further comprises a second mechanical bumper switch.
  • 6. The mobile robot of claim 5, wherein the first mechanical bumper switch and the second mechanical bumper switch are located on opposing lateral sides of the robot.
  • 7. The mobile robot of claim 5, wherein each of the first and second bumper switches is located forward of the drive mechanism in the drive direction.
  • 8. The mobile robot of claim 5, wherein the drive controller is configured to determine an angle relative to the robot at which the bump event occurred by calculating an elapsed time between activation of the first and second mechanical bumper switches.
  • 9. The mobile robot of claim 8, wherein the drive controller is configured to determine that the bump event occurred directly in front of the robot in the drive direction when the first and second mechanical bumper switches are activated simultaneously.
  • 10. The mobile robot of claim 1, wherein the tactile sensor further comprises a plurality of sensors responsive to movement of the bumper switch.
  • 11. The mobile robot of claim 10, wherein the drive controller is configured to determine an angle relative to the robot at which the bump event occurred by calculating an elapsed time between activation of the sensors.
  • 12. The mobile robot of claim 10, wherein the sensors divide a forward portion of the robot into a plurality of regions, and wherein the drive controller is configured to determine the region in which the bump event occurred.
  • 13. The mobile robot of claim 1, wherein the drive controller is configured to monitor a number of bump events that have occurred within a period of time, and to alter operation of the robot when the number of bump events has surpassed a predetermined threshold within the period of time.
  • 14. The mobile robot of claim 1, wherein the drive controller is configured to monitor an average distance between bump events, and to alter operation of the robot when the average distance has surpassed a predetermined threshold.
  • 15. The mobile robot of claim 1, wherein the drive controller is configured to monitor a number of wheel drop events that have occurred within a period of time, and to shut the robot off when the number of wheel drop events has passed a selected threshold.
  • 16. The mobile robot of claim 1, wherein the drive controller operates the robot to avoid a cliff by monitoring the cliff sensor to detect when a cliff is present proximate the drive direction of the robot; and in response to detecting that a cliff is present proximate the drive direction of the robot, turning the robot away from the cliff.
  • 17. The mobile robot of claim 16, wherein the drive controller is further configured to reduce the velocity of the robot in response to detecting the cliff.
  • 18. The mobile robot of claim 16, wherein the drive controller is further configured to monitor a duration of cliff detection and shut the robot off when the duration has passed a selected threshold.
  • 19. The mobile robot of claim 1, wherein the drive controller is further configured to operate the robot to align the lateral side with an object by monitoring the tactile sensor to detect when a bump event has occurred between the robot and the object; and in response to detecting that a bump event has occurred, turning the robot such that the object is located on the lateral side of the robot.
  • 20. The mobile robot of claim 1, wherein the drive controller operates the robot to travel in a direction away from an object in response to a bump event between the robot and the object by monitoring the tactile sensor to detect when the bump event has occurred; and in response to detecting that the bump event has occurred, turning the robot away from a surface of the object.
  • 21. The mobile robot of claim 20, wherein the drive controller is further configured to drive the robot forward while turning the robot.
  • 22. The mobile robot of claim 20, wherein the drive controller is further configured to determine a selected angle and direction by which to turn the robot based on an angle relative to the robot at which the bump event occurred.
  • 23. The mobile robot of claim 1, wherein the drive controller is further configured to shut the robot off in response to continued detection of a wheel drop event by the wheel drop sensor for a predetermined amount of time.
  • 24. A mobile coverage robot, comprising:
    a drive mechanism comprising drive wheels that both drive the robot forward across a surface in a drive direction and turn the robot to change the drive direction;
    a floor cleaner disposed on a lateral side of the robot;
    a proximity sensor aimed forward of the drive wheels in the drive direction, the proximity sensor responsive to an object proximate the lateral side of the robot;
    a tactile sensor responsive to a bump event between the robot and an object, the tactile sensor comprising a bumper switch; and
    a plurality of floor level sensors, each floor level sensor responsive to a condition of an area below the robot, the floor level sensors comprising: a cliff sensor aimed forward of the drive wheels in the drive direction and responsive to a presence of a cliff in the drive direction of the robot; and a wheel drop sensor responsive to a wheel drop event; and
    a drive controller in communication with the proximity sensor, the tactile sensor, and the plurality of floor level sensors, the drive controller configured to: operate the robot to follow a sensed object on the lateral side of the robot; operate the robot to travel in an altered direction in response to a bump event between the robot and the object; and in response to one or more predetermined signals from a sensor, operate the robot according to an escape sequence, and to shut off the robot in response to determining that the bumper switch has been constantly depressed for a predetermined amount of time during the escape sequence.
  • 25. The mobile robot of claim 24, wherein the floor cleaner comprises a motorized rotating brush and a brush sensor responsive to a current of a brush motor, the brush sensor in communication with the drive controller, and wherein the one or more predetermined signals comprise a signal from the brush sensor indicating a current rise of the brush motor above a predetermined threshold.
  • 26. The mobile robot of claim 24, wherein the escape sequence comprises turning the robot in place at an initial velocity, and progressively decreasing the velocity over time.
  • 27. The mobile robot of claim 24, wherein the one or more predetermined signals comprise at least one of: a detection signal from the cliff sensor; a detection signal from the wheel drop sensor; a plurality of detection signals from the tactile sensor within a predetermined distance of travel by the robot.
CROSS REFERENCE TO RELATED APPLICATIONS

This U.S. patent application is a continuation of, and claims priority under 35 U.S.C. §120 from, U.S. patent application Ser. No. 12/609,124, filed on Oct. 30, 2009, which is a continuation of U.S. patent application Ser. No. 11/671,305, filed on Feb. 5, 2007, which is a continuation of U.S. patent application Ser. No. 10/839,374, filed on May 5, 2004, which is a continuation of U.S. patent application Ser. No. 10/167,851, filed on Jun. 12, 2002, which claims priority under 35 U.S.C. §119(e) to U.S. Provisional Application 60/297,718, filed on Jun. 12, 2001. The disclosures of these prior applications are considered part of the disclosure of this application and are hereby incorporated by reference in their entireties.

Non-Patent Literature Citations (223)
Entry
Andersen et al., “Landmark based navigation strategies,” SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, Jan. 8, 1999.
Ascii, Mar. 25, 2002, http://ascii.jp/elem/000/000/330/330024/, accessed Nov. 2011, 15 pages (with English translation).
Barker, “Navigation by the Stars—Ben Barker 4th Year Project,” Nov. 2004, 20 pages.
Becker et al., “Reliable Navigation Using Landmarks,” IEEE International Conference on Robotics and Automation, 0-7803-1965-6, pp. 401-406, 1995.
Benayad-Cherif et al., “Mobile Robot Navigation Sensors,” SPIE vol. 1831 Mobile Robots, VII, pp. 378-387, 1992.
Betke et al. “Mobile robot localization using landmarks,” Proceedings of the IEEE/RSJ/GI International Conference on Intelligent Robots and Systems ‘94 Advanced Robotic Systems and the Real World’ (IROS '94), Accessed via IEEE Xplore, 1994, 8 pages.
Bison et al., “Using a structured beacon for cooperative position estimation,” Robotics and Autonomous Systems, 29(1):33-40, Oct. 1999.
Blaasvaer et al., “AMOR—An Autonomous Mobile Robot Navigation System,” Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994.
Borges et al., “Optimal Mobile Robot Pose Estimation Using Geometrical Maps,” IEEE Transactions on Robotics and Automation, 18(1): 87-94, Feb. 2002.
Braunstingl et al., “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception,” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995.
Bulusu et al., “Self Configuring Localization systems: Design and Experimental Evaluation,”ACM Transactions on Embedded Computing Systems, 3(1):24-60, 2003.
Caccia et al., “Bottom-Following for Remotely Operated Vehicles,” 5th IFAC Conference, Alaborg, Denmark, pp. 245-250, Aug. 2000.
U.S. Appl. No. 60/605,066 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. National Stage Entry U.S. Appl. No. 11/574,290, U.S.publication 2008/0184518, filing date Aug. 27, 2004.
U.S. Appl. No. 60/605,181 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. National Stage Entry U.S. Appl. No. 11/574,290, U.S.publication 2008/0184518, filing date Aug. 27, 2004.
Chae et al., “StarLITE: A new artificial landmark for the navigation of mobile robots,” http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005.
Chamberlin et al., “Team 1: Robot Locator Beacon System, ” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 2006.
Champy, “Physical management of IT assets in Data Centers using RFID technologies,” RFID 2005 University, Oct. 12-14, 2005 , 19 pages.
Chin, “Joystick Control for Tiny OS Robot,” http://www.eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf. 12 pages, Aug. 2002.
Christensen et al. “Theoretical Methods for Planning and Control in Mobile Robotics,” 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 1997.
CleanMate 365, Intelligent Automatic Vacuum Cleaner, Model No. QQ-1, User Manual www.metapo.com/support/user—manual.pdf, Dec. 2005, 11 pages.
Clerentin et al., “A localization method based on two omnidirectional perception systems cooperation,” Proc of IEEE International Conference on Robotics & Automation, San Francisco, CA vol. 2, pp. 1219-1224, Apr. 2000.
Corke, “High Performance Visual serving for robots end-point control,” SPIE vol. 2056, Intelligent Robots and Computer Vision, 1993, 10 pages.
Cozman et al., “Robot Localization using a Computer Vision Sextant,” IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995.
D'Orazio et al., “Model based Vision System for mobile robot position estimation”, SPIE, vol. 2058 Mobile Robots VIII, pp. 38-49, 1992.
De Bakker et al., “Smart PSD-array for sheet of light range imaging”, Proc. of SPIE, vol. 3965, pp. 1-12, May 2000.
Denning Roboscrub image (1989), 1 page.
Desaulniers et al., “An Efficient Algorithm to find a shortest path for a car-like Robot,” IEEE Transactions on robotics and Automation, 11(6):819-828, Dec. 1995.
Dorfmüller-Ulhaas, “Optical Tracking From User Motion to 3D Interaction,” http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002.
Dorsch et al., “Laser Triangulation: Fundamental uncertainty in distance measurement,” Applied Optics, 33(7):1306-1314, Mar. 1994.
Doty et al., “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent,” AAAI 1993 Fall Symposium Series, Instantiating Real-World Agents, pp. 1-6, Oct. 22-24, 1993.
Dudek et al., “Localizing a Robot with Minimum Travel” Proceedings of the sixth annual ACM-SIAM symposium on Discrete Algorithms, 27(2):583-604, Apr. 1998.
Dulimarta et al., “Mobile Robot Localization in Indoor Environment”, Pattern Recognition, 30(1):99-111, 1997.
Dyson's Robot Vacuum Cleaner—the DC06, May 2004, Retrieved from the Internet: URL<http://www.gizmag.com/go/1282/>. Accessed Nov. 2011, 3 pages.
EBay, “Roomba Timer -> Timed Cleaning - Floorvac Robotic Vacuum,” Retrieved from the Internet: URL<cgi.ebay.com/ws/eBayISAPI.dll?viewitem&category=43526&item=4375198387&rd=1>. 5 pages, Apr. 2005.
Electrolux Trilobite, “Time to enjoy life,” Retrieved from the Internet: URL<http://www.robocon.co.kr/trilobite/Presentation_Trilobite_Kor_030104.ppt>. 26 pages, accessed Dec. 2011.
Electrolux Trilobite, Jan. 12, 2001, http://www.electroluxui.com:8080/2002%5C822%5C833102EN.pdf, accessed Jul. 2, 2012, 10 pages.
Electrolux, “Designed for the well-lived home,” Retrieved from the Internet: URL<http://www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F>. Accessed Mar. 2005, 2 pages.
Eren et al., “Accuracy in position estimation of mobile robots based on coded infrared signal transmission,” Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995, IMTC/95. pp. 548-551, 1995.
Eren et al., “Operation of Mobile Robots in a Structured Infrared Environment,” Proceedings ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 1997.
Euroflex Intelligente Monstre, (English excerpt only), 2006, 15 pages.
Euroflex, Jan. 2006, Retrieved from the Internet: URL<http://www.euroflex.tv/novitadett.php?id=15>. accessed Nov. 2011, 1 page.
eVac Robotic Vacuum S1727 Instruction Manual, Sharper Image Corp, Copyright 2004, 16 pages.
Everyday Robots, “Everyday Robots: Reviews, Discussion and News for Consumers,” Aug. 2004, Retrieved from the Internet: URL<www.everydayrobots.com/index.php?option=content&task=view&id=9> (Sep. 2012), 4 pages.
Evolution Robotics, “NorthStar—Low-cost Indoor Localization—How it Works,” Evolution Robotics, 2 pages, 2005.
Facchinetti Claudio et al., “Self-Positioning Robot Navigation Using Ceiling Images Sequences,” ACCV '95, 5 pages, Dec. 1995.
Facchinetti Claudio et al., “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation,” ICARCV '94, vol. 3, pp. 1694-1698, 1994.
Facts on Trilobite, webpage, Retrieved from the Internet: URL<http://trilobite.electrolux.se/presskit_en/model11335.asp?print=yes&pressID=>. 2 pages, accessed Dec. 2003.
Fairfield et al., “Mobile Robot Localization with Sparse Landmarks,” SPIE vol. 4573, pp. 148-155, 2002.
Favre-Bulle, “Efficient tracking of 3D—Robot Position by Dynamic Triangulation,” IEEE Instrumentation and Measurement Technology Conference IMTC 98 Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 1998.
Fayman, “Exploiting Process Integration and Composition in the context of Active Vision,” IEEE Transactions on Systems, Man, and Cybernetics—Part C: Application and reviews, vol. 29, No. 1, pp. 73-86, Feb. 1999.
Floorbot GE Plastics—IMAGE, available at http://www.fuseid.com/, 1989-1990, Accessed Sep. 2012, 1 page.
Floorbotics, VR8 Floor Cleaning Robot, Product Description for Manufacturing, URL: <http://www.consensus.sem.au/SoftwareAwards/CSAarchive/CSA2004/CSAart04/FloorBot/F>. Mar. 2004, 11 pages.
Franz et al., “Biomimetic robot navigation”, Robotics and Autonomous Systems, vol. 30, pp. 133-153, 2000.
Friendly Robotics, “Friendly Robotics - Friendly Vac, Robotic Vacuum Cleaner,” Retrieved from the Internet: URL<www.friendlyrobotics.com/vac.htm>. 5 pages, Apr. 2005.
Friendly Robotics, Retrieved from the Internet: URL<http://www.robotsandrelax.com/PDFs/RV400Manual.pdf>. 18 pages, accessed Dec. 2011.
Fuentes et al., “Mobile Robotics 1994,” University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 1994.
Fukuda et al., “Navigation System based on Ceiling Landmark Recognition for Autonomous mobile robot,” 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems 95. ‘Human Robot Interaction and Cooperative Robots’, Pittsburgh, PA, pp. 1466-1471, Aug. 1995.
Gat, “Robust Low-Computation Sensor-driven Control for Task-Directed Navigation,” Proc of IEEE International Conference on Robotics and Automation, Sacramento, CA, pp. 2484-2489, Apr. 1991.
Gionis, “A hand-held optical surface scanner for environmental Modeling and Virtual Reality,” Virtual Reality World, 16 pages, 1996.
Goncalves et al., “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005.
Gregg et al., “Autonomous Lawn Care Applications,” 2006 Florida Conference on Recent Advances in Robotics, Miami, Florida, May 25-26, 2006, Florida International University, 5 pages.
Grumet, “Robots Clean House,” Popular Mechanics, Nov. 2003, 3 pages.
Hamamatsu “Si PIN Diode S5980, S5981, S5870—Multi-element photodiodes for surface mounting,” Hamamatsu Photonics, 2 pages, Apr. 2004.
Haralick et al. “Pose Estimation from Corresponding Point Data”, IEEE Transactions on Systems, Man, and Cybernetics, 19(6):1426-1446, Nov. 1989.
Hausler, “About the Scaling Behaviour of Optical Range Sensors,” Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 1997.
Hitachi, accessed at http://www.hitachi.co.jp/New/cnews/hi_030529_hi_030529.pdf, May 29, 2003, 15 pages (with English translation).
Hitachi: News release: “The home cleaning robot of the autonomous movement type (experimental machine),” Retrieved from the Internet: URL<www.i4u.com/japanreleases/hitachirobot.htm>. 5 pages, Mar. 2005.
Hoag et al., “Navigation and Guidance in interstellar space,” ACTA Astronautica, vol. 2, pp. 513-533, Feb. 1975.
Home Robot—UBOT; Microbotusa.com, retrieved from the WWW at www.microrobotusa.com, accessed Dec. 2, 2008, 2 pages.
Huntsberger et al., “CAMPOUT: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, 33(5):550-559, Sep. 2003.
Iirobotics.com, “Samsung Unveils Its Multifunction Robot Vacuum,” Retrieved from the Internet: URL<www.iirobotics.com/webpages/hotstuff.php?ubre=111>. 3 pages, Mar. 2005.
InMach “Intelligent Machines,” Retrieved from the Internet: URL<www.inmach.de/inside.html>. 1 page, Nov. 2008.
Innovation First, “2004 EDU Robot Controller Reference Guide,” Retrieved from the Internet: URL<http://www.ifirobotics.com>. 13 pages, Mar. 2004.
IT media, Retrieved from the Internet: URL<http://www.itmedia.co.jp/news/0111/16/robofesta_m.html>. Accessed Nov. 1, 2011, 8 pages (with English translation).
It's eye, Retrieved from the Internet: URL<www.hitachi.co.jp/rd/pdf/topics/hitac2003_10.pdf>. 11 pages, 2003 (with English translation).
Jarosiewicz et al., “Final Report—Lucid,” University of Florida, Department of Electrical and Computer Engineering, EEL 5666—Intelligent Machine Design Laboratory, 50 pages, Aug. 1999.
Jensfelt et al., “Active Global Localization for a mobile robot using multiple hypothesis tracking,” IEEE Transactions on Robotics and Automation, 17(5): 748-760, Oct. 2001.
Jeong et al., “An intelligent map-building system for indoor mobile robot using low cost photo sensors,” SPIE, vol. 6042, 6 pages, 2005.
Kahney, “Robot Vacs are in the House,” Retrieved from the Internet: URL<www.wired.com/news/technology/0,1282,59237,00.html>. 6 pages, Jun. 2003.
Karcher “Karcher RoboCleaner RC 3000,” Retrieved from the Internet: URL<www.robocleaner.de/english/screen3.html>. 4 pages, Dec. 2003.
Karcher RC 3000 Cleaning Robot-user manual Manufacturer: Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred Karcher-Str 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002, 8 pages.
Karcher RC3000 RoboCleaner,—Image, Accessed at <http://www.karcher.de/versions/int/assets/video/2—4—robo—en.swf>. Accessed Sep. 2009, 1 page.
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-usa.com/showproducts.php?op=view_prod&param1=143&param2=&param3=, 3 pages, accessed Mar. 2005.
Karcher, “Product Manual Download Karch”, available at www.karcher.com, 16 pages, 2004.
Karlsson et al, “Core Technologies for service Robotics,” IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 3, pp. 2979-2984, Sep. 2004.
Karlsson et al., The vSLAM Algorithm for Robust Localization and Mapping, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005.
King and Weiman, “HelpmateTM Autonomous Mobile Robots Navigation Systems,” SPIE vol. 1388 Mobile Robots, pp. 190-198, 1990.
Kleinberg, “The Localization Problem for Mobile Robots,” Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994.
Knights, et al., “Localization and Identification of Visual Landmarks,” Journal of Computing Sciences in Colleges, 16(4):312-313, May 2001.
Kolodko et al., “Experimental System for Real-Time Motion Estimation,” Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003.
Komoriya et al., “Planning of Landmark Measurement for the Navigation of a Mobile Robot,” Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC pp. 1476-1481, Jul. 1992.
KOOLVAC Robotic Vacuum Cleaner Owner's Manual, Koolatron, 2004, 13 pages.
Krotkov et al., “Digital Sextant,” Downloaded from the internet at: http://www.cs.cmu.edu/~epk/, 1 page, 1995.
Krupa et al., “Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoin,” IEEE Transactions on Robotics and Automation, 19(5):842-853, Oct. 2003.
Kuhl et al., “Self Localization in Environments using Visual Angles,” VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004.
Kurs et al., “Wireless Power Transfer via Strongly Coupled Magnetic Resonances,” downloaded from www.sciencemag.org, Aug. 2007, 5 pages.
Kurth, “Range-Only Robot Localization and SLAM with Radio”, http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf, 60 pages, May 2004, accessed Jul. 27, 2012.
Kwon et al., “Table Recognition through Range-based Candidate Generation and Vision based Candidate Evaluation,” ICAR 2007, The 13th International Conference on Advanced Robotics Aug. 21-24, 2007, Jeju, Korea, pp. 918-923, 2007.
Lambrinos et al., “A mobile robot employing insect strategies for navigation,” Retrieved from the Internet: URL<http://www8.cs.umu.se/kurser/TDBD17/VT04/dl/Assignment%20Papers/lambrinos-RAS-2000.pdf>. 38 pages, Feb. 1999.
Lang et al., “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994.
Lapin, “Adaptive position estimation for an automated guided vehicle,” SPIE, vol. 1831 Mobile Robots VII, pp. 82-94, 1992.
LaValle et al., “Robot Motion Planning in a Changing, Partially Predictable Environment,” 1994 IEEE International Symposium on Intelligent Control, Columbus, OH, pp. 261-266, Aug. 1994.
Lee et al., “Development of Indoor Navigation system for Humanoid Robot Using Multi-sensors Integration”, ION NTM, San Diego, CA pp. 798-805, Jan 2007.
Lee et al., “Localization of a Mobile Robot Using the Image of a Moving Object,” IEEE Transaction on Industrial Electronics, 50(3):612-619, Jun. 2003.
Leonard et al., “Mobile Robot Localization by tracking Geometric Beacons,” IEEE Transaction on Robotics and Automation, 7(3):376-382, Jun. 1991.
Li et al. “Robust Statistical Methods for Securing Wireless Localization in Sensor Networks,” Information Processing in Sensor Networks, 2005, Fourth International Symposium on, pp. 91-98, Apr. 2005.
Li et al., “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar,” Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999.
Lin et al., “Mobile Robot Navigation Using Artificial Landmarks,” Journal of Robotic Systems, 14(2): 93-106, 1997.
Linde, Dissertation—“On Aspects of Indoor Localization,” Available at: https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 2006.
Lumelsky et al., “An Algorithm for Maze Searching with Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA vol. 1, pp. 111-116, 1994.
Luo et al., “Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots,” IEEE, pp. 2359-2364, 2002.
Ma, Thesis—“Documentation on Northstar,” California Institute of Technology, 14 pages, May 2006.
Madsen et al., “Optimal landmark selection for triangulation of robot position,” Journal of Robotics and Autonomous Systems, vol. 13 pp. 277-292, 1998.
Malik et al., “Virtual Prototyping for Conceptual Design of a Tracked Mobile Robot,” Electrical and Computer Engineering, Canadian Conference on, IEEE, PI. pp. 2349-2352, May 2006.
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591, pp. 25-30, Oct. 23, 2005.
Maschinenmarkt, Würzburg 105, No. 27, pp. 3, 30, Jul. 5, 1999 (with English translation).
Matsumura Camera Online Shop: Retrieved from the Internet: URL<http://www.rakuten.co.jp/matsucame/587179/711512/>. Accessed Nov. 2011, 15 pages (with English translation).
Matsutek Enterprises Co. Ltd, “Automatic Rechargeable Vacuum Cleaner,” http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10 . . . , Apr. 2007, 3 pages.
McGillem et al., “Infra-red Location System for Navigation and Autonomous Vehicles,” 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 1988.
McGillem et al., “A Beacon Navigation Method for Autonomous Vehicles,” IEEE Transactions on Vehicular Technology, 38(3):132-139, Aug. 1989.
McLurkin “Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots,” Paper submitted for requirements of BSEE at MIT, May 2004, 127 pages.
McLurkin, “The Ants: A community of Microrobots,” Paper submitted for requirements of BSEE at MIT, May 1995, 60 pages.
Michelson, “Autonomous navigation,” McGraw-Hill—Access Science, Encyclopedia of Science and Technology Online, 2007, 4 pages.
Miro et al., “Towards Vision Based Navigation in Large Indoor Environments,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 2096-2102, Oct. 2006.
MobileMag, “Samsung Unveils High-tech Robot Vacuum Cleaner,” Retrieved from the Internet: URL<http://www.mobilemag.com/content/100/102/C2261/>. 4 pages, Mar. 2005.
Monteiro et al., “Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters,” Proceedings of the IECON '93., International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 1993.
Moore et al., “A simple Map-based Localization strategy using range measurements,” SPIE, vol. 5804, pp. 612-620, 2005.
Morland, “Autonomous Lawnmower Control”, Downloaded from the internet at: http://cns.bu.edu/~cjmorlan/robotics/lawnmower/report.pdf, 10 pages, Jul. 2002.
Munich et al., “ERSP: A Software Platform and Architecture for the Service Robotics Industry,” Intelligent Robots and Systems, 2005. (IROS 2005), pp. 460-467, Aug. 2005.
Munich et al., “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006.
Nam et al., “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999.
Nitu et al., “Optomechatronic System for Position Detection of a Mobile Mini-Robot,” IEEE Transactions on Industrial Electronics, 52(4):969-973, Aug. 2005.
On Robo, “Robot Reviews Samsung Robot Vacuum (VC-RP30W),” Retrieved from the Internet: URL<www.onrobo.com/reviews/AT_Home/vacuum_cleaners/on00vcrb30rosam/index.htm>. 2 pages, 2005.
OnRobo “Samsung Unveils Its Multifunction Robot Vacuum,” Retrieved from the Internet: URL<www.onrobo.com/enews/0210/samsung_vacuum.shtml>. 3 pages, Mar. 2005.
Pages et al., “A camera-projector system for robot positioning by visual servoing,” Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW06), 8 pages, Jun. 2006.
Pages et al., “Optimizing Plane-to-Plane Positioning Tasks by Image-Based Visual Servoing and Structured Light,” IEEE Transactions on Robotics, 22(5):1000-1010, Oct. 2006.
Pages et al., “Robust decoupled visual servoing based on structured light,” 2005 IEEE/RSJ, Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005.
Park et al., “A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors,” IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on Neural Networks, Orlando, Florida, pp. 2754-2758, Jun./Jul. 1994.
Park et al., “Dynamic Visual Servo Control of Robot Manipulators using Neural Networks,” The Korean Institute of Telematics and Electronics, 29-B(10):771-779, Oct. 1992.
Paromtchik, “Toward Optical Guidance of Mobile Robots,” Proceedings of the Fourth World Multiconference on Systemics, Cybernetics and Informatics, Orlando, FL, USA, Jul. 23, 2000, vol. IX, pp. 44-49, available at http://emotion.inrialpes.fr/~paromt/infos/papers/paromtchik:asama:sci:2000.ps.gz, accessed Jul. 3, 2012, 6 pages.
Paromtchik et al., “Optical Guidance System for Multiple mobile Robots,” Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940, May 2001.
Penna et al., “Models for Map Building and Navigation”, IEEE Transactions on Systems, Man, and Cybernetics, 23(5):1276-1301, Sep./Oct. 1993.
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 2001.
Pirjanian et al., “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106, Nov. 1999.
Pirjanian et al., “Distributed Control for a Modular, Reconfigurable Cliff Robot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002.
Pirjanian et al., “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes,” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430, Apr. 1997.
Pirjanian et al., “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination,” Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000.
Pirjanian, “Challenges for Standards for consumer Robotics,” IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 2005.
Pirjanian, “Reliable Reaction,” Proceedings of the 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 158-165, 1996.
Prassler et al., “A Short History of Cleaning Robots,” Autonomous Robots 9, 211-226, 2000, 16 pages.
Put Your Roomba . . . On Automatic, webpages: http://www.acomputeredge.com/roomba, 5 pages, accessed Apr. 2005.
Remazeilles et al., “Image based robot navigation in 3D environments,” Proc. of SPIE, vol. 6052, pp. 1-14, Dec. 2005.
Rives et al., “Visual servoing based on ellipse features,” SPIE, vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993.
Roboking—not just a vacuum cleaner, a robot!, Jan. 21, 2004, infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, accessed Oct. 10, 2011, 5 pages.
RoboMaid Sweeps Your Floors So You Won't Have to, the Official Site, website: Retrieved from the Internet: URL<http://therobomaid.com>. 2 pages, accessed Mar. 2005.
Robot Buying Guide, “LG announces the first robotic vacuum cleaner for Korea,” Retrieved from the Internet: URL<http://robotbg.com/news/2003/04/22/lg_announces_the_first_robotic_vacu>. 1 page, Apr. 2003.
Robotics World, “A Clean Sweep,” 5 pages, Jan. 2001.
Ronnback, “On Methods for Assistive Mobile Robots,” Retrieved from the Internet: URL<http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html>. 218 pages, Jan. 2006.
Roth-Tabak et al., “Environment Model for mobile Robots Indoor Navigation,” SPIE, vol. 1388 Mobile Robots, pp. 453-463, 1990.
Sahin et al., “Development of a Visual Object Localization Module for Mobile Robots,” 1999 Third European Workshop on Advanced Mobile Robots, (Eurobot '99), pp. 65-72, 1999.
Salomon et al., “Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing,” IEEE Conference on Emerging Technologies and Factory Automation, 2006. (ETFA '06), pp. 629-632, Sep. 2006.
Sato, “Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter,” Proceedings International Conference on Image Processing, vol. 1., Lausanne, Switzerland, pp. 33-36, Sep. 1996.
Schenker et al., “Lightweight rovers for Mars science exploration and sample return,” Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997.
Schlemmer, “Electrolux Trilobite Robotic Vacuum,” Retrieved from the Internet: URL<www.hammacher.com/publish/71579.asp?promo=xsells>. 3 pages, Mar. 2005.
Schofield, “Neither Master nor slave—A Practical Study in the Development and Employment of Cleaning Robots, Emerging Technologies and Factory Automation,” 1999 Proceedings ETFA '99 1999 7th IEEE International Conference on Barcelona, Spain, pp. 1427-1434, Oct. 1999.
Shimoga et al., “Touch and Force Reflection for Telepresence Surgery,” Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994.
Sim et al, “Learning Visual Landmarks for Pose Estimation,” IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 1999.
Sobh et al., “Case Studies in Web-Controlled Devices and Remote Manipulation,” Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 2002.
Special Reports, “Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone,” 59(9): 3 pages, Retrieved from the Internet: URL<http://www.toshiba.co.jp/tech/review/2004/09/59_0>. 2004.
Stella et al., “Self-Location for Indoor Navigation of Autonomous Vehicles,” Part of the SPIE conference on Enhanced and Synthetic Vision SPIE vol. 3364, pp. 298-302, 1998.
Summet, “Tracking Locations of Moving Hand-held Displays Using Projected Light,” Pervasive 2005, LNCS 3468, pp. 37-46, 2005.
Svedman et al., “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping,” 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2993-2998, 2005.
Svet Computers—New Technologies—Robot Vacuum Cleaner, Oct. 1999, available at http://www.sk.rs/1999/10/sknt01.html, 1 page, accessed Nov. 1, 2011.
Taipei Times, “Robotic vacuum by Matsushita about to undergo testing,” Retrieved from the Internet: URL<http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338>. accessed Mar. 2002, 2 pages.
Takio et al., “Real-Time Position and Pose Tracking Method of Moving Object Using Visual Servo System,” 47th IEEE International Symposium on Circuits and Systems, pp. 167-170, 2004.
“Tech-on!,” Retrieved from the Internet: URL http://techon.nikkeibp.co.jp/members/01db/200203/1006501/, accessed Nov. 2011, 7 pages (with English translation).
Teller, “Pervasive pose awareness for people, Objects and Robots,” http://www.ai.mit.edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 6 pages, Apr. 2003.
Terada et al., “An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning,” 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 1998.
The Sharper Image, eVac Robotic Vacuum—Product Details, www.sharperimage.com/us/en/templates/products/pipmoreworklprintable.jhtml, 1 page, Accessed Mar. 2005.
TheRobotStore.com, “Friendly Robotics Robotic Vacuum RV400—The Robot Store,” www.therobotstore.com/s.nl/sc.9/category.-109/it.A/id.43/.f, 1 page, Apr. 2005.
Thrun, Sebastian, “Learning Occupancy Grid Maps With Forward Sensor Models,” Autonomous Robots 15, 28 pages, Sep. 1, 2003.
TotalVac.com, RC3000 RoboCleaner website, 2004, Accessed at http://www.totalvac.com/robot_vacuum.htm (Mar. 2005), 3 pages.
Trebi-Ollennu et al., “Mars Rover Pair Cooperatively Transporting a Long Payload,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C., pp. 3136-3141, May 2002.
Tribelhorn et al., “Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education,” IEEE, pp. 1393-1399, 2007.
Tse et al., “Design of a Navigation System for a Household Mobile Robot Using Neural Networks,” Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
UAMA (Asia) Industrial Co., Ltd., “RobotFamily,” 2005, 1 page.
UBOT, cleaning robot capable of wiping with a wet duster, Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=23031>. 4 pages, accessed Nov. 2011.
Watanabe et al., “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique,” 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 1990.
Watts, “Robot, boldly goes where no man can,” The Times, p. 20, Jan. 1985.
Wijk et al., “Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking,” IEEE Transactions on Robotics and Automation, 16(6):740-752, Dec. 2000.
Wolf et al., “Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization,” IEEE Transactions on Robotics, 21(2):208-216, Apr. 2005.
Wolf et al., “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C., pp. 359-365, May 2002.
Wong, “EIED Online>> Robot Business”, ED Online ID# 13114, 17 pages, Jul. 2006.
Yamamoto et al., “Optical Sensing for Robot Perception and Localization,” 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 14-17, 2005.
Yata et al., “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer,” Proceedings of the 1998 IEEE, International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998.
Yujin Robotics, “An intelligent cleaning robot,” Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=7257>. 8 pages, accessed Nov. 2011.
Yun et al., “Image-Based Absolute Positioning System for Mobile Robot Navigation,” IAPR International Workshops SSPR, Hong Kong, pp. 261-269, Aug. 2006.
Yun et al., “Robust Positioning a Mobile Robot with Active Beacon Sensors,” Lecture Notes in Computer Science, 2006, vol. 4251, pp. 890-897, 2006.
Yuta et al., “Implementation of an Active Optical Range sensor Using Laser Slit for In-Door Intelligent Mobile Robot,” IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS 91), vol. 1, Osaka, Japan, pp. 415-420, Nov. 3-5, 1991.
Zha et al., “Mobile Robot Localization Using Incomplete Maps for Change Detection in a Dynamic Environment,” Advanced Intelligent Mechatronics '97. Final Program and Abstracts., IEEE/ASME International Conference, pp. 110, Jun. 1997.
Zhang et al., “A Novel Mobile Robot Localization Based on Vision,” SPIE vol. 6279, 6 pages, Jan. 2007.
Zoombot Remote Controlled Vacuum RV-500, New Roomba 2, website: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4373497618&rd=1, accessed Apr. 20, 2005, 7 pages.
Written Opinion of the International Searching Authority, PCT/US2004/001504, Aug. 20, 2012, 9 pages.
Examination report dated Oct. 18, 2011 for corresponding application No. 10183299.6.
Examination report dated Oct. 18, 2011 for corresponding application No. 10183338.2.
Examination report dated Oct. 18, 2011 for corresponding application No. 10183328.8.
EP Search report dated Sep. 5, 2011 for corresponding application No. 10183299.6.
EP Search report dated Sep. 5, 2011 for corresponding application No. 10183338.2.
EP Search report dated Sep. 5, 2011 for corresponding application No. 10183328.8.
Examination report dated Mar. 14, 2008 for U.S. Appl. No. 11/771,356.
Examination report dated Mar. 27, 2006 for U.S. Appl. No. 10/839,374.
Examination report dated Apr. 11, 2004 for corresponding EP application No. 02734767.3.
Examination report dated Apr. 29, 2005 for corresponding EP Application No. 02734767.3.
Examination report dated May 4, 2005 for U.S. Appl. No. 10/839,374.
Examination report dated May 14, 2009 for U.S. Appl. No. 11/771,433.
Examination report dated Jul. 24, 2009 for U.S. Appl. No. 11/771,433.
Examination report dated Aug. 9, 2004 for U.S. Appl. No. 10/839,374.
Examination report dated Sep. 3, 2003 for U.S. Appl. No. 10/167,851.
Examination report dated Oct. 1, 2010 for U.S. Appl. No. 12/609,124.
Examination report dated Nov. 8, 2005 for corresponding EP application No. 02734767.3.
Examination report dated Nov. 30, 2010 for U.S. Appl. No. 12/826,909.
Examination report dated Dec. 29, 2003 for U.S. Appl. No. 10/167,851.
Examination report dated Jun. 14, 2011 for corresponding application No. JP 2008-246310.
Related Publications (1)
Number Date Country
20130325178 A1 Dec 2013 US
Provisional Applications (1)
Number Date Country
60297718 Jun 2001 US
Continuations (4)
Number Date Country
Parent 12609124 Oct 2009 US
Child 13893762 US
Parent 11671305 Feb 2007 US
Child 12609124 US
Parent 10839374 May 2004 US
Child 11671305 US
Parent 10167851 Jun 2002 US
Child 10839374 US