Compact autonomous coverage robot

Information

  • Patent Number
    8,839,477
  • Date Filed
    Wednesday, December 19, 2012
  • Date Issued
    Tuesday, September 23, 2014
Abstract
An autonomous coverage robot includes a chassis having forward and rearward portions and a drive system carried by the chassis. The forward portion of the chassis defines a substantially rectangular shape. The robot includes a cleaning assembly mounted on the forward portion of the chassis and a bin disposed adjacent the cleaning assembly and configured to receive debris agitated by the cleaning assembly. A bin cover is pivotally attached to a lower portion of the chassis and configured to rotate between a first, closed position providing closure of an opening defined by the bin and a second, open position providing access to the bin opening. The robot includes a body attached to the chassis and a handle disposed on an upper portion of the body. A bin cover release is actuatable from substantially near the handle.
Description
TECHNICAL FIELD

This disclosure relates to autonomous coverage robots for cleaning floors or other surfaces.


BACKGROUND

Autonomous robots are robots which can perform desired tasks in unstructured environments without continuous human guidance. Many kinds of robots are autonomous to some degree. Different robots can be autonomous in different ways. An autonomous coverage robot traverses a work surface without continuous human guidance to perform one or more tasks. In the field of home, office and/or consumer-oriented robotics, mobile robots that perform household functions such as vacuum cleaning, floor washing, patrolling, lawn cutting and other such tasks have been widely adopted.


Mobile robots for cleaning floors have been described in, for example, U.S. Pat. No. 6,883,201 to JONES et al. (“JONES”), which discloses an autonomous floor-cleaning robot that traverses a floor while removing debris using rotating brushes, vacuums, or other cleaning mechanisms. JONES further describes a robot having a generally round form factor supported by three wheels, which can rotate freely to maneuver around obstacles, inter alia.


SUMMARY

Presently disclosed is a compact mobile robot for cleaning floors, countertops, and other related surfaces, such as tile, hardwood or carpeted flooring. The robot has a rectangular front form factor that facilitates cleaning along wall edges or in corners. In one example, the robot includes both a rounded section and a rectangular section, in which a cleaning mechanism within the rectangular section is disposed proximally to opposite side corners of the rectangular section. As an advantage, the robot can maneuver so as to bring the rectangular section flush with a wall corner or wall edge, with the cleaning mechanism extending into the wall corner or wall edge.


In one aspect, an autonomous coverage robot includes a chassis having forward and rearward portions and a drive system carried by the chassis. The forward portion of the chassis defines a substantially rectangular shape. The robot includes a cleaning assembly mounted on the forward portion of the chassis and a bin disposed adjacent the cleaning assembly. The bin is configured to receive debris agitated by the cleaning assembly. A bin cover is pivotally attached to a lower portion of the chassis and configured to rotate between a first, closed position providing closure of an opening defined by the bin and a second, open position providing access to the bin opening. The robot includes a body attached to the chassis and a handle disposed on an upper portion of the body. A bin cover release is configured to control movement of the bin cover between its first and second positions. The bin cover release is actuatable from substantially near the handle.


Implementations of this aspect of the disclosure may include one or more of the following features. In some implementations, the bin cover release is configured to move between a first, locking position which locks the bin cover in its first, closed position and a second, disengaged position which allows the bin cover to move to its second, open position. The bin cover release may be a spring biased latch. In some examples, the bin cover release includes a button disposed on the handle configured to actuate the latch, thereby allowing actuation of the bin cover release while holding the handle. In some implementations, the drive system includes right and left differentially driven drive wheels rotatably mounted to the rearward portion of the chassis. The drive system is capable of maneuvering the robot to pivot in place.


In another aspect, an autonomous coverage robot includes a chassis having forward and rearward portions, and a drive system carried by the rearward portion of the chassis. The drive system is configured to maneuver the robot over a cleaning surface. The robot includes a controller in communication with the drive system. The controller is configured to maneuver the robot to pivot in place. The robot includes a cleaning assembly mounted on the forward portion of the chassis. The robot includes a bump sensor in communication with the controller which is configured to detect movement in multiple directions. A body is flexibly attached to the chassis and substantially covers the chassis. Contact with the body is translated to the bump sensor for detection. The controller is configured to alter a drive direction of the robot in response to a signal received from the bump sensor. The bump sensor includes a sensor base, a sensor shroud positioned adjacent the sensor base and connected to the body, an emitter housed by the sensor shroud, and at least three detectors carried by the sensor base. The emitter emits a signal onto the sensor base, and the detectors are configured to detect the emitted signal. Movement of the sensor shroud causes movement of the emitted signal over the detectors. In some implementations, the robot includes a bin disposed adjacent the cleaning assembly and configured to receive debris agitated by the cleaning assembly.


Implementations of this aspect of the disclosure may include one or more of the following features. In some implementations, the bump sensor detects 360 degrees of movement of the body about the bump sensor. Preferably, the bump sensor includes four detectors arranged in a rectangular configuration with respect to each other. The sensor shroud defines an orifice through which the emitter emits its signal onto the sensor base. The emitter comprises an infrared light emitter and the detectors comprise infrared light detectors, the orifice collimating the emitted signal onto the sensor base. In some examples, the robot includes a bumper guide configured to confine body movements along two directions. The bumper guide may include two orthogonal grooves defined by the body and configured to receive a guide pin disposed on the chassis. The forward portion defines a substantially rectangular shape, in some examples. The drive system, in some examples, includes right and left drive wheels differentially driven by corresponding right and left motors.
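The bump direction can be recovered from the relative intensities of the detectors as the emitted spot moves over the sensor base. A minimal sketch of this idea, assuming four detectors at the corners of a unit rectangle; the coordinates, function name, and centroid method are illustrative assumptions, not taken from the disclosure:

```python
import math

# Hypothetical detector positions for a four-detector rectangular layout
# (the disclosure does not specify coordinates; units are arbitrary).
DETECTORS = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]

def bump_direction(intensities):
    """Estimate the direction the emitted spot moved over the sensor base
    as the intensity-weighted centroid of the four detector readings,
    returned as a bearing in degrees (None if no signal is detected)."""
    total = sum(intensities)
    if total == 0:
        return None  # spot not over any detector
    x = sum(px * i for (px, _), i in zip(DETECTORS, intensities)) / total
    y = sum(py * i for (_, py), i in zip(DETECTORS, intensities)) / total
    return math.degrees(math.atan2(y, x)) % 360.0
```

For example, a bump that pushes the spot entirely onto the detector at (1, 1) yields a 45° bearing, so any direction of body contact maps to a continuous angle, consistent with 360-degree bump detection.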


In yet another aspect, an autonomous coverage robot includes a chassis having forward and rearward portions, and a drive system carried by the rearward portion of the chassis. The forward portion defines a substantially rectangular shape and the rearward portion of the chassis defines an arcuate shape. The drive system is configured to maneuver the robot over a cleaning surface and includes right and left drive wheels differentially driven by corresponding right and left motors. The robot includes a controller in communication with the drive system. The controller is configured to maneuver the robot to pivot in place. The robot includes a cleaning assembly mounted on the forward portion of the chassis. The robot includes an accelerometer in communication with the controller, which controls the drive system in response to a signal received from the accelerometer. In some implementations, the robot includes a bin disposed adjacent the cleaning assembly and configured to receive debris agitated by the cleaning assembly.


Implementations of this aspect of the disclosure may include one or more of the following features. In some implementations, the controller alters a drive direction of the robot in response to a signal received from the accelerometer indicating an abrupt speed change. The controller alters a drive direction of the robot in response to a signal received from the accelerometer indicating stasis of the robot. The controller reduces a drive speed of the robot in response to a signal received from the accelerometer indicating a maximum speed. In some examples, the maximum speed is between about 200 mm/s and about 400 mm/s.
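The accelerometer responses described above amount to a few threshold rules on speed and acceleration. A hedged sketch of such a policy; the function name and the specific threshold values (other than the 200-400 mm/s maximum-speed range stated in the disclosure) are illustrative assumptions:

```python
MAX_SPEED_MM_S = 306.0  # within the ~200-400 mm/s range given in the disclosure

def accelerometer_policy(speed_mm_s, accel_mm_s2, drive_commanded):
    """Return a drive action from accelerometer-derived state.
    Threshold values are illustrative assumptions."""
    ABRUPT_ACCEL = 500.0  # mm/s^2; larger changes treated as abrupt
    if abs(accel_mm_s2) > ABRUPT_ACCEL:
        return "alter_direction"  # abrupt speed change (e.g., collision)
    if drive_commanded and abs(speed_mm_s) < 1.0:
        return "alter_direction"  # stasis: drive commanded but no motion
    if abs(speed_mm_s) > MAX_SPEED_MM_S:
        return "reduce_speed"     # maximum speed exceeded
    return "continue"
```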


Implementations of the above aspects of the disclosure may include one or more of the following features. In some implementations, the cleaning assembly includes a first roller brush rotatably mounted substantially near a front edge of the chassis. The cleaning assembly may include a second roller brush rotatably mounted substantially parallel to and rearward of the first roller brush, wherein the first and second roller brushes rotate in opposite directions. The bin is disposed rearward of the first and second roller brushes and forward of the drive system. Each roller brush includes right and left end brushes extending from respective ends of the roller brush beyond a lateral extent of the body, each end brush extending at an angle φ of between 0° and about 90° from a longitudinal axis defined by the roller brush. In other implementations, the cleaning assembly includes a front roller brush rotatably mounted substantially near the front edge of the chassis, and right and left side roller brushes rotatably mounted orthogonal to the front brush substantially near the respective right and left side edges of the chassis. The bin is disposed rearward of the front roller brush and substantially between the right and left side roller brushes and forward of the drive system.


In another aspect, an autonomous coverage robot includes a chassis having forward and rearward portions, and a drive system carried by the rearward portion of the chassis. The forward portion defines a substantially rectangular shape and the rearward portion defines an arcuate shape. The drive system is configured to maneuver the robot over a cleaning surface and includes right and left drive wheels differentially driven by corresponding right and left motors. The robot includes a controller in communication with the drive system. The controller is configured to maneuver the robot to pivot in place. The robot includes a cleaning assembly mounted on the forward portion of the chassis and includes a first roller brush rotatably mounted substantially near a front edge of the chassis and a second roller brush rotatably mounted substantially parallel to and rearward of the first roller brush. The first and second roller brushes rotate in opposite directions. A bin is disposed rearward of the cleaning assembly and is configured to receive debris agitated by the cleaning assembly. A bin cover is pivotally attached to a lower portion of the chassis and is configured to rotate between a first, closed position providing closure of an opening defined by the bin and a second, open position providing access to the bin opening. The robot includes a bin cover release configured to control movement of the bin cover between its first and second positions. A handle is disposed on the chassis. The bin cover release is actuatable from substantially near the handle. A body is flexibly attached to the chassis and substantially covers the chassis. The body is movable in relation to the handle and the chassis. The robot includes a bump sensor in communication with the controller and configured to detect movement in multiple directions. Contact with the body is translated to the bump sensor for detection. 
The controller is configured to alter a drive direction of the robot in response to a signal received from the bump sensor.


Implementations of this aspect of the disclosure may include one or more of the following features. In some implementations, the bump sensor includes a sensor base, a sensor shroud positioned adjacent the sensor base and connected to the body, an emitter housed by the sensor shroud, and at least three detectors carried by the sensor base. The emitter emits a signal onto the sensor base and the detectors detect the emitted signal. Movement of the sensor shroud causes movement of the emitted signal over the detectors.


In some implementations, the robot includes a bumper guide configured to confine body movements along two directions. The bumper guide may include two orthogonal grooves defined by the body and configured to receive a guide pin disposed on the chassis.


The robot may include an idler wheel disposed on the bin cover. In some examples, the rearward portion of the chassis defines a substantially semi-circular shape and the idler wheel is positioned at least ⅓ of the radius of the substantially semi-circular shaped rearward portion forward of the drive wheels.


In specific examples, the drive wheels are disposed less than 9 cm rearward of the cleaning assembly. The robot may include a power source disposed in the rearward portion of the chassis substantially between the right and left wheels. The power source is disposed adjacent and rearward of the bin. The cleaning assembly further comprises a brush motor configured to drive the first and second roller brushes. In some examples, the brush motor is disposed substantially near a forward edge of the chassis. The first roller brush may be disposed substantially near a forward edge of the chassis.


Implementations of the disclosure may include one or more of the following features. In some implementations, the right and left drive wheels are rotatably mounted to the rearward portion of the chassis, and the drive system is capable of maneuvering the robot to pivot in place. Preferably, the rearward portion of the chassis defines an arcuate shape; however other shapes are possible as well, such as rectangular or polygonal. In some examples, the rearward portion of the chassis defines a substantially semi-circular shape and the axes of the right and left drive wheels are disposed on or rearward of a center axis defined by the rearward portion of the chassis. In some implementations, the chassis and the body together have a length of less than 23 cm and a width of less than 19 cm.


In some implementations, the robot includes at least one proximity sensor carried by a dominant side of the robot. The at least one proximity sensor responds to an obstacle substantially near the body. The controller alters a drive direction in response to a signal received from the at least one proximity sensor.


In some implementations, the robot includes at least one cliff sensor carried by a forward portion of the body and arranged substantially near a front edge of the body. The at least one cliff sensor responds to a potential cliff forward of the robot. The drive system alters a drive direction in response to a signal received from the cliff sensor indicating a potential cliff. In some examples, right and left front cliff sensors are disposed at the respective right and left corners of a forward portion of the robot. This allows the robot to detect when either of the front corners swings over a cliff edge, so as to avoid moving the drive wheels any closer to the cliff edge. In some implementations, the robot includes at least one cliff sensor carried by a rearward portion of the body and arranged substantially near the rear edge of the body. The at least one cliff sensor responds to a potential cliff rearward of the robot. The drive system alters a drive direction in response to a signal received from the cliff sensor indicating a potential cliff. In some examples, right and left rear cliff sensors are disposed directly rearward of the respective right and left drive wheels. This allows the robot to detect a cliff edge while driving in reverse at an angle or in an arc, in which a drive wheel may encounter the cliff edge before the rear center portion of the robot.
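The corner cliff-sensor layout described above suggests a simple avoidance rule: the side on which a cliff is detected determines the escape turn. A sketch under that assumption; the function name and the specific turn choices are illustrative, not from the disclosure:

```python
def cliff_response(front_left, front_right, rear_left, rear_right, driving_reverse):
    """Pick an avoidance action from four corner cliff-sensor flags.
    The corner sensor layout follows the disclosure; the specific
    responses chosen here are illustrative assumptions."""
    if driving_reverse:
        # Rear sensors sit directly behind the drive wheels, so a rear
        # detection means a wheel is approaching the edge while reversing.
        if rear_left and rear_right:
            return "drive_forward"
        if rear_left:
            return "arc_forward_right"
        if rear_right:
            return "arc_forward_left"
    else:
        if front_left and front_right:
            return "reverse"
        if front_left:
            return "turn_right"   # left front corner swung over the edge
        if front_right:
            return "turn_left"
    return "continue"
```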


In some implementations, the robot includes an idler wheel disposed on the bin cover. Preferably, the rearward portion of the chassis defines a substantially semi-circular shape, which allows the robot to spin in place without catching any portion of the rearward portion of the chassis on a detected obstacle. The idler wheel is positioned at least ⅓ of the radius of the substantially semi-circular shaped rearward portion forward of the drive wheels. In some examples, the idler wheel is a stasis detector including a magnet disposed in or on the idler wheel, and a magnet detector disposed adjacent the wheel for detecting the magnet as the idler wheel rotates.
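With a magnet on the idler wheel and a magnet detector beside it, each wheel revolution produces a detector pulse; an absence of pulses while the robot is commanded to drive indicates the wheel is not turning, i.e., the robot is stuck. A minimal sketch of that inference; the function name and the timeout value are illustrative assumptions:

```python
def is_stalled(pulse_timestamps, now, drive_commanded, timeout_s=2.0):
    """Infer stasis from the idler-wheel magnet detector: if the robot is
    commanded to drive but no magnet pulse has arrived within timeout_s,
    the idler wheel is not rotating and the robot is likely stuck.
    The timeout value is an illustrative assumption."""
    if not drive_commanded:
        return False  # stationary by command, not stuck
    if not pulse_timestamps:
        return True   # driving commanded but wheel never pulsed
    return (now - pulse_timestamps[-1]) > timeout_s
```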


In other more general aspects that are combinable with any of the above implementations, an autonomous coverage robot includes a chassis and a drive system carried by the chassis. The drive system is configured to maneuver the robot over a cleaning surface. In some examples, the drive system includes right and left differentially driven drive wheels; however other means of driving the robot are applicable as well, such as skid steer tracks. In some examples, the chassis has forward and rearward portions with the forward portion defining a substantially rectangular shape. Optionally, the rearward portion can define an arcuate shape.


In some implementations, the robot includes a cleaning assembly mounted on the forward portion of the chassis (e.g. substantially near a forward edge of the chassis). A bin is disposed adjacent the cleaning assembly and configured to receive debris agitated by the cleaning assembly. In some examples, a bin cover is pivotally attached to the robot and is configured to rotate between a first, closed position providing closure of an opening defined by the bin and a second, open position providing access to the bin opening. In other examples, the bin cover is slidably attached to the robot and slides between the first, closed position and the second, open position.


In some implementations, a body is attached to the chassis. The body may conform to the profile of the chassis. In some examples, the body is flexibly or movably attached to the chassis. The robot may include a handle for carrying the robot. The handle can be disposed on the body or on the chassis. If the handle is disposed on the chassis, the body is allowed to move in relation to the handle and/or the chassis. The robot may include a bin cover release configured to control movement of the bin cover between its first and second positions. Preferably, the bin cover release is actuatable from substantially near the handle. However, the bin cover release may be actuatable from substantially near or on the bin cover.


In some implementations, the robot includes a bump sensor, which may be configured to detect movement in multiple directions. In some examples, contact with the body is translated to the bump sensor for detection. The robot may include a controller configured to alter a drive direction of the robot in response to a signal received from the bump sensor. In some examples, the robot includes an accelerometer in communication with the controller, such that the controller controls the drive system in response to a signal received from the accelerometer.


The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.





DESCRIPTION OF DRAWINGS


FIG. 1 is a top perspective view of a compact autonomous coverage robot.



FIG. 2 is a bottom perspective view of the robot shown in FIG. 1.



FIG. 3 is a top view of the robot shown in FIG. 1.



FIG. 4 is a bottom view of the robot shown in FIG. 1.



FIG. 5 is an exploded view of the top aspect shown in FIG. 1.



FIG. 6 is a front view of the robot shown in FIG. 1.



FIG. 7 is a rear view of the robot shown in FIG. 1.



FIG. 8 is a left side view of the robot shown in FIG. 1 with a bin cover in its open position.



FIG. 9 is a right side view of the robot shown in FIG. 1.



FIG. 10 is a top perspective view of a compact autonomous coverage robot.



FIG. 11A is a side view of a stasis detector.



FIG. 11B is a top schematic view of a compact autonomous coverage robot.



FIG. 11C is a side schematic view of a compact autonomous coverage robot.



FIG. 12A is a top view of a compact autonomous coverage robot scraping along a wall.



FIG. 12B is a top view of a compact autonomous coverage robot bumping a wall.



FIG. 13A is a top schematic view of a compact autonomous coverage robot with a bumper guide.



FIG. 13B is a side section view of a bump sensor.



FIG. 13C is a top schematic view of a bump sensor system with a bumper guide.



FIG. 13D is a perspective view of a bump sensor system.



FIG. 14 is a contour shaded diagram of the view of the compact cleaning robot shown in FIG. 3.



FIG. 15 is a perspective exploded view of an omni-directional sensor.



FIG. 16 is a side view of the omni-directional sensor shown in FIG. 15.



FIG. 17 is a top perspective view of a compact autonomous coverage robot.



FIG. 18 is a bottom perspective view of the robot shown in FIG. 17.



FIG. 19 is a top view of the robot shown in FIG. 17.



FIG. 20 is a bottom view of the robot shown in FIG. 17.



FIG. 21 is an exploded view of the top aspect shown in FIG. 17.



FIG. 22 is a front view of the robot shown in FIG. 17.



FIG. 23 is a rear view of the robot shown in FIG. 17.



FIG. 24 is a left side view of the robot shown in FIG. 17 with a bin cover in its open position.



FIG. 25 is right side view of the robot shown in FIG. 17.



FIG. 26 is an oblique view of a compact cleaning robot having rectangular form traversing along a wall edge.



FIG. 27 is a plan view of a compact cleaning robot navigating flush into a wall corner.



FIG. 28 is a plan view of a round robot navigating into a wall corner, illustrating a gap that the round robot cannot traverse.



FIGS. 29-32 collectively provide a schematic view of a control circuit for an autonomous coverage robot.



FIG. 33 is a schematic view of a software architecture for a behavioral system of an autonomous coverage robot.





Like reference symbols in the various drawings indicate like elements.


DETAILED DESCRIPTION

Referring to FIGS. 1-3, an autonomous coverage robot 100 includes a chassis 200 having a forward portion 210 and a rearward portion 220. The forward portion 210 of the chassis 200 defines a substantially rectangular shape. In the example shown, the rearward portion 220 of the chassis 200 defines an arcuate shape (e.g., in the example shown the rearward portion 220 is rounded); however, the rearward portion 220 may define other shapes as well, such as, but not limited to, rectangular, triangular, pointed, or wavy shapes.


Referring to FIGS. 1 and 5, the robot 100 includes a body 300 configured to substantially follow the contours of the chassis 200. The body 300 may be flexibly connected to the chassis 200 (e.g., by a spring or elastic element), so as to move over the chassis 200. In some examples, a handle 330 is disposed on or defined by an upper portion of the body 300. In other examples, the handle 330 is secured to or extends from a mounting piece 332, which is secured to an upper portion 205 of the chassis 200. The mounting piece 332 can be removable and interchangeable with other mounting pieces 332 that have different arrangements or carry other components (e.g., different handles 330 and/or sensors). The body 300 moves with respect to the mounting piece 332 and the chassis 200. In the example shown, the body 300 floats below the mounting piece 332. The mounting piece 332 can be circular and sized to be offset from a respective opening defined by an upper portion 305 of the body 300, so as to provide a 360° displacement limit for body movement (e.g., 2-4 mm of bumper movement) due to contact with the body (e.g., along a lower portion 303 of the body 300; see FIG. 8). The robot 100 (including the chassis 200 and the body 300) has a compact footprint with a length of less than 23 cm and a width of less than 19 cm.


Referring to FIGS. 4 and 5, the robot 100 includes a drive system 400 carried by the chassis 200 and configured to maneuver the robot 100 over a cleaning surface. In the example shown, the drive system 400 includes right and left drive wheels 410 and 420, respectively, which are differentially driven by corresponding right and left drive motors 412 and 422, respectively. The drive motors 412, 422 are mounted above their respective drive wheels 410, 420, in the example shown, to help maintain the compact footprint of the robot 100. However, other implementations include having the drive motors 412, 422 mounted adjacent (e.g., co-axially with) their respective drive wheels 410, 420. In some examples, the robot includes a gear box 414, 424 coupled between the drive wheel 410, 420 and its respective drive motor 412, 422. The gear boxes 414, 424 and the drive motors 412, 422 are configured to propel the robot at a maximum velocity of between about 200 mm/s and about 400 mm/s (preferably 306 mm/s) and a maximum acceleration of about 500 mm/s². In some implementations, the center axles of the drive wheels 410, 420 are disposed less than 9 cm (preferably 8 cm) rearward of a cleaning assembly 500, which will be described below. The robot 100 includes a controller 450 in communication with the drive system 400. The controller 450 is configured to maneuver the robot 100 to pivot in place.
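Pivoting in place with a differential drive falls out of the standard inverse kinematics: zero forward velocity with nonzero angular velocity gives equal and opposite wheel speeds. A sketch of that relationship; the function name and the wheel-separation value are illustrative assumptions (the disclosure does not state the track width):

```python
def wheel_speeds(v_mm_s, omega_rad_s, track_mm=150.0):
    """Differential-drive inverse kinematics: convert a body velocity
    (forward v, angular omega) into (left, right) wheel speeds in mm/s.
    track_mm (wheel separation) is an illustrative assumption."""
    left = v_mm_s - omega_rad_s * track_mm / 2.0
    right = v_mm_s + omega_rad_s * track_mm / 2.0
    return left, right

# Pivot in place: v = 0 yields equal and opposite wheel speeds.
# Straight drive: omega = 0 yields equal wheel speeds.
```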


The advantage of the conventional cylindrical robot with drive wheels disposed on the diameter of the robot is that it is not hindered from turning in the presence of obstacles. This enables a simple and effective escape strategy: spin in place until no objects are detected forward of the robot. If the robot is non-cylindrical, or the axes of wheel rotation are not on a diameter of the robot, then the normal and tangential forces on the robot change as the robot rotates while in contact with an object. To ensure that such a non-conventional robot is able to escape an arbitrary collision, the forces and torques applied to the robot by the environment must not combine with the robot-generated forces and torques to halt robot motion. In practice this means that the robot shape should be of constant width (to within the shell compliance distance) and that the robot's wheels should be capable of lateral motion. Particular shapes then yield different requirements for maximum lateral wheel forces and maximum allowable environmental coefficient of friction. However, the robot 100 presently disclosed, in some examples, has a rectangular forward portion 210 to allow cleaning fully into corners.


Referring again to the example shown in FIG. 4, a profile circle 221 defining the substantially semi-circular profile of the rearward portion 220 of the chassis 200 extends into the forward portion 210 of the chassis 200 and has a center axis 223. The drive wheels 410, 420 are positioned on or substantially near the center axis 223 of the profile circle 221. In the example shown, the drive wheels 410, 420 are positioned slightly rearward of the center axis 223 of the profile circle 221. By positioning the drive wheels 410, 420 on or rearward of the center axis 223 of the profile circle 221, the robot 100 can turn in place without catching the rearward portion 220 of the chassis 200 on an obstacle.


Referring to FIGS. 2, 4 and 5-9, the robot 100 includes a cleaning assembly 500 mounted on the front portion 210 of the chassis 200 substantially near a front edge 202 of the chassis 200. In the examples shown, the cleaning assembly 500 includes first and second roller brushes 510, 520 rotatably mounted substantially parallel to each other. The roller brushes 510, 520 are driven by a cleaning motor 530 coupled to a middle portion of the roller brushes 510, 520 by a gear box 532. The cleaning motor 530 is positioned above the roller brushes 510, 520 to confine the cleaning assembly 500 to the forward portion 210 of the chassis 200 and to help maintain a compact robot with a relatively small footprint. Each roller brush 510, 520 may include an end brush 540 disposed at each longitudinal end 512, 514, 522, 524 of the roller brush 510, 520. Each end brush 540 is disposed at an angle φ with a longitudinal axis 513, 523 defined by the roller brush 510, 520 of between 0° and about 90° (preferably 45°). The end brush 540 extends beyond the chassis 200 and the body 300 (e.g., beyond respective right and left side edges 306, 308) to agitate debris on or along objects adjacent the robot 100 (e.g., to clean up against walls). Other implementations of the cleaning assembly 500 will be discussed later with reference to another implementation of the robot 100.


Referring to FIGS. 1-5, 8, and 10, the robot 100 includes a bin assembly 600 disposed adjacent the cleaning assembly 500 and configured to receive debris agitated by the cleaning assembly 500. In some examples, the chassis 200 defines a debris chamber or bin 610 (see FIG. 10). In other examples, a bin 610 is disposed below the chassis and positioned to receive debris agitated by the cleaning assembly 500. In the examples shown, the bin 610 is positioned substantially between the cleaning assembly 500 and the drive system 400. Specifically, the bin 610 is forward of the drive wheels 410, 420 and rearward of the roller brushes 510, 520.


Preferably, the debris chamber/bin 610 is defined by, and thus formed integrally with, the chassis 200. In an alternative configuration, the robot 100 may include a modular, removable cartridge or bag serving as the debris chamber/bin 610, such that the user can remove the debris by removing and emptying the cartridge or bag. The cartridge or bag 610 is removably secured to the chassis 200.


A bin cover 620 is pivotally attached to a lower portion 203 of the chassis 200 and configured to rotate between a first, closed position providing closure of an opening 612 defined by the bin 610 and a second, open position providing access to the bin opening 612. In some examples, the bin cover 620 is releasably connected to the chassis 200 by one or more hinges 622. The bin assembly 600 includes a bin-cover release 630 configured to control movement of the bin cover 620 between its first and second positions. The bin-cover release 630 is configured to move between a first, locking position which locks the bin cover 620 in its first, closed position and a second, disengaged position which allows the bin cover 620 to move to its second, open position (see FIG. 8). The bin-cover release 630 is actuatable from substantially near or at the handle 330, thereby allowing actuation of the bin-cover release 630 while holding the handle 330. This allows a user to pick up the robot 100 via the handle 330 with one hand, hold the robot 100 over a trash bin (not shown), and actuate the bin-cover release 630 with the same hand holding the handle 330 to release the bin cover 620 and empty the contents of the bin 610 into the trash bin. In some implementations, the bin-cover release 630 is a spring-biased latch or latching button actuatable by pressing downwardly (e.g., a button) or pulling upwardly (e.g., a trigger).


The robot 100 includes a power source 160 (e.g., battery) in communication with the drive system 400 and/or the controller 450, and removably secured to the chassis 200. In the examples shown in FIGS. 2, 4, 5, and 7-8, the power source 160 is received by a power receptacle 260 defined by the rearward portion 220 of the chassis 200. In some examples, the power source 160 is positioned substantially under the controller 450 and between the right and left drive wheels 410, 420, while extending forward to a distance sufficient to place a center of gravity of the robot 100 substantially at the center of the chassis 200 or substantially between a first transverse axis 415 defined by the drive wheels 410, 420 and a second transverse axis 425 defined by a free-wheel 722 (e.g., stasis wheel 722) (see FIG. 4). If the weight of the power source 160 is positioned too far rearward, there will not be enough weight over the cleaning assembly 500, allowing the forward portion 210 of the chassis 200 to tip upward. Because the robot 100 is compact with a relatively small footprint, the arrangement of components on and within the chassis 200 is important to achieving the compact size of the robot 100 while remaining functional. Referring to FIGS. 5 and 8, the debris chamber/bin 610 impedes the forward placement of the power source 160 (e.g., the power source 160 is limited to positioning in the rearward portion 220 of the chassis 200). Nevertheless, the power source 160 is positioned between the drive wheels 410, 420 and as far forward as possible, substantially abutting the bin 610, so as to place the center of gravity of the robot forward of the first transverse axis 415 defined by the drive wheels 410, 420. By placing the center of gravity forward of the drive wheels 410, 420, the robot 100 is less likely to tip up and backwards (e.g., when going over thresholds).


Referring to FIGS. 1-11, the robot 100 includes a navigational sensor system 700 in communication with the controller 450 that allows the robot 100 to be aware of its surroundings/environment and react in prescribed manners or behaviors according to its sensed perception of its surroundings/environment. A description of behavior control can be found in detail in Jones, Flynn & Seiger, Mobile Robots: Inspiration to Implementation, Second Edition, 1999, A K Peters, Ltd., the text of which is hereby incorporated by reference in its entirety. The navigational sensor system 700 includes one or more cliff sensors 710, a stasis detector 720, a proximity sensor 730, at least one bump sensor 800, and/or an omni-directional receiver 900. Using input from the navigational sensor system 700, the controller 450 generates commands to be carried out by the robot 100. As a result, the robot 100 is capable of cleaning surfaces in an autonomous fashion.


The cliff sensors 710 may be used to sense when the robot 100 has encountered the edge of the floor or work surface, such as when it encounters a set of stairs. The robot 100 may have behaviors that cause it to take an action, such as changing its direction of travel, when an edge is detected. In the examples shown in FIGS. 2, 4, 5, and 10, the body 300 of the robot 100 houses four cliff sensors 710 along a perimeter of the body 300, with two cliff sensors 710 substantially along a front edge 302 of a forward portion 310 of the body 300 (preferably near forward outer corners or lateral edges) and two cliff sensors 710 substantially along a rearward edge 304 of a rearward portion 320 of the body 300 (preferably near rearward outer corners or lateral edges) (see FIG. 4). Each cliff sensor 710 includes an emitter 712 that sends a signal and a receiver 714 configured to detect a reflected signal. In some implementations, the cliff sensors 710 may be installed within a mounting apparatus that stabilizes and protects the sensor and which positions the sensor to point towards a window installed onto the bottom of the mounting apparatus. Together the sensor, the mounting apparatus and the window comprise a cliff sensor unit. Reliability of the cliff sensor 710 may be increased by reducing dust buildup. In some implementations, a window may be installed on the bottom of the mounting apparatus which includes a shield mounted within a slanted molding composed of a material which prevents dust buildup, such as an antistatic material. The shield component and the molding may be welded together. To further facilitate the reduction in dust and dirt buildup, the shield may be mounted on a slant to allow dirt to more easily slide off. In some implementations, a secondary cliff sensor 710 may be present behind existing cliff sensors 710 to detect floor edges in the event that a primary cliff sensor 710 fails.
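The emitter/receiver pairs above lend themselves to a simple threshold test: when a corner passes over an edge, the reflected signal at that receiver collapses. The following sketch assumes normalized intensity readings and an illustrative threshold, neither of which is specified by the disclosure:

```python
# Sketch: threshold test over the four cliff sensors 710. Readings and
# the threshold are hypothetical normalized intensities; a real unit
# would use the receiver 714's calibrated analog value.

CLIFF_THRESHOLD = 0.2  # assumed normalized reflection intensity

def cliff_detected(readings):
    """True if any emitter/receiver pair sees too little reflection,
    i.e. the floor has dropped away under that corner."""
    return any(r < CLIFF_THRESHOLD for r in readings)

# front-left, front-right, rear-left, rear-right (normalized intensities)
assert not cliff_detected([0.8, 0.7, 0.9, 0.85])  # floor under all corners
assert cliff_detected([0.8, 0.05, 0.9, 0.85])     # front-right over an edge
```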


Robots defining shapes of constant width can turn in place about the centroid of the respective shape. A shape of constant width is a convex planar shape whose width, measured as the perpendicular distance between two opposite parallel lines touching its boundary, is the same regardless of the direction of those two parallel lines. The Reuleaux triangle is the simplest example (after the circle) of a shape of constant width. However, in the examples shown, the robot 100 has a rectangular shaped forward portion 210 of the chassis 200 and is thus not a robot of constant width, which can prevent the robot from spinning in place to escape from various stuck positions, such as canyoning situations, inter alia. Canyoning situations arise when the robot 100 drives down a narrow corridor (with side walls) or plank (with side cliffs) that is slightly wider than the robot 100. When the robot 100 reaches the end of the corridor or plank, it can only escape by driving in reverse back out of the corridor or off of the plank. If the robot 100 tries to spin in place (e.g., to rotate 180°), one of the robot's corners will hit a wall or go off a cliff. In the case of cliffs, the placement of cliff sensors 710 substantially along a rearward edge 304 of the body 300 or a rearward edge 204 of the chassis 200 allows the robot 100 to back up intelligently and escape without backing off a cliff. Similarly, the bump sensor 800, which will be described below, detects rearward bumps, allowing the robot 100 to back out of narrow corridors.


Referring to FIGS. 2, 4, 5 and 11A, the stasis detector 720 indicates when the robot 100 is moving or stationary. In the examples shown, the stasis detector 720 includes a stasis wheel 722 with a magnet 724 either embedded in or disposed on the wheel 722. A magnetic receiver 726 (e.g., inductor) is positioned adjacent the wheel 722 to detect the magnet 724 moving past. The magnetic receiver 726 provides an output signal to the controller 450 that indicates when the magnet 724 moves past the magnetic receiver 726. The controller 450 can be configured to determine how fast and far the robot 100 is traveling based on the output signal of the magnetic receiver 726 and the circumference of the stasis wheel 722. In other implementations, the stasis detector 720 includes a stasis wheel 722 with a circumferential surface having at least two different reflective characteristics (e.g., white and black). A stasis emitter and receiver pair (e.g., infrared) is disposed adjacent the stasis wheel 722. The stasis emitter is configured to emit a signal onto the circumferential surface of the stasis wheel 722, and the stasis receiver is configured to detect or receive a reflected signal off of the circumferential surface of the stasis wheel 722. The stasis detector 720 monitors the transitions between reflection states and non-reflection states to determine if the robot 100 is moving, and perhaps even the rate of movement.
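The distance-and-speed determination described above reduces to pulse counting. The sketch below assumes a single magnet 724 per revolution (so each pulse from the magnetic receiver 726 equals one circumference of travel) and uses a hypothetical wheel diameter:

```python
import math

# Sketch: odometry from the stasis detector 720, assuming one magnet 724
# per revolution of the stasis wheel 722. The wheel diameter is a
# hypothetical illustration value.

WHEEL_DIAMETER_CM = 3.0
CIRCUMFERENCE_CM = math.pi * WHEEL_DIAMETER_CM

def distance_traveled(pulse_count):
    """Distance in cm implied by pulses from the magnetic receiver 726."""
    return pulse_count * CIRCUMFERENCE_CM

def speed(pulse_count, elapsed_s):
    """Average speed in cm/s over the sampling window."""
    return distance_traveled(pulse_count) / elapsed_s
```

A striped-wheel variant with an emitter/receiver pair would count reflection-state transitions instead of magnet passes, with each transition corresponding to a known arc of the circumference.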


Again due to the compact nature of the robot 100 and the compact positioning of components, the stasis wheel 722 acts as a third wheel for stable ground contact. If the stasis wheel 722 were placed forward of the cleaning assembly 500, it would need to be a caster wheel, rather than a directional wheel, and it would drag in an arc when the robot 100 turns. However, the need for a rectangular forward portion 210 of the chassis 200, so as to fully clean in corners, prohibits placement of the stasis wheel 722 forward of the cleaning assembly 500 (e.g., which would result in a shape other than rectangular). A wheel is needed forward of the drive wheels 410, 420 to lift the forward portion 210 of the chassis 200 to an appropriate height for cleaning and brush rotation.


Referring again to FIG. 4, the stasis wheel 722 is disposed in the bin cover 620, just rearward of the cleaning assembly 500 and forward of the drive system 400 and the power source 160. The stasis/idler wheel 722 is positioned forward of the drive wheels 410, 420, forward of the center axis 223 of the profile circle 221, and within the profile circle 221. This positioning of the stasis wheel 722 allows the robot 100 to turn in place without substantially dragging the stasis wheel 722 across its rolling direction, while also providing support and stability to the forward portion 210 of the chassis 200. Preferably, the stasis wheel 722 is positioned forward of the center axis 223 by at least ⅓ of the radius of the profile circle 221. The forward positioning of the stasis wheel 722 and the power source 160 is obstructed by the cleaning assembly 500. As a result, decreasing the size of the cleaning assembly 500 would allow further forward placement of the stasis wheel 722 and the power source 160 or a decrease in the overall length of the robot 100.


The examples shown in FIGS. 11B-11C illustrate the placement of components in the robot 100 to achieve a compact morphology as well as stability for movement. Where LD=flat cliff detector 710A, 710B thickness, CH=cleaning head 500 front-to-back length, WB=wheelbase, RD=angled cliff detector 710C, 710D front-to-back length, WT=wheel track, and CR=circular radius (>½ wheel track), the tombstone shaped robot 100 has a length that is: 1) greater than LD+CH+WB+CR and 2) equal to or less than 1.4 CR, where 3) RD<½ CR, WB>⅓ CR, and the center of gravity CG is within WB. The placement of the components to satisfy the above relationship places the center of gravity 105 of the robot forward of the drive wheels 410, 420 and within the circular radius CR. The figures also illustrate the placement of two of the heaviest components, which include the power source 160 having a center of gravity 165 and the brush motor 515 having a center of gravity 517. The brush motor 515 is positioned as far forward as possible to place its center of gravity 517 as far forward as possible, so as to offset the weight of the power source 160. Similarly, the power source 160 is positioned as far forward as possible to place its center of gravity 165 as far forward as possible as well. However, forward placement of the power source 160 is generally obstructed by the cleaning assembly 500 and the bin 610.
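The dimensional relationships above can be encoded as a single feasibility predicate. The dimension set below (in cm) is hypothetical, chosen only to show that the stated inequalities can be satisfied simultaneously:

```python
# Sketch: feasibility check of the stated length relationships.
# LD, CH, WB, RD, WT, CR are as defined in the text; all values used
# here are hypothetical illustration values, not disclosed dimensions.

def length_constraints_ok(length, LD, CH, WB, RD, WT, CR):
    return (
        length > LD + CH + WB + CR      # 1) lower bound on overall length
        and length <= 1.4 * CR          # 2) upper bound keeps the robot compact
        and RD < 0.5 * CR               # 3) angled cliff-detector depth limit
        and WB > CR / 3.0               #    wheelbase lower bound
        and CR > 0.5 * WT               #    CR exceeds half the wheel track
    )

# a hypothetical dimension set satisfying every relationship at once
assert length_constraints_ok(length=13.8, LD=0.1, CH=0.2, WB=3.4,
                             RD=4.0, WT=12.0, CR=10.0)
```

Note that bounds 1) and 2) together force LD+CH+WB to stay under 0.4 CR, which is why the components must be packed so tightly front to back.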


Referring to FIGS. 1, 5 and 9, the proximity sensor 730 may be used to determine when an obstacle is close to or proximate the robot 100. The proximity sensor 730 may, for example, be an infrared light or ultrasonic sensor that provides a signal when an object is within a given range of the robot 100. In the examples shown, the proximity sensor 730 is disposed on a side (e.g., right side) of the robot 100 for detecting when an object, such as a wall, is proximate that side.


In a preferred implementation, as shown, the side of the robot 100 having the proximity sensor 730 is the dominant side of the robot 100, which in this case is the right-hand side relative to a primary direction of travel 105. In some examples, the wall proximity sensor 730 is an infrared light sensor composed of an emitter and detector pair collimated so that a finite volume of intersection occurs at the expected position of a wall. This focus point is approximately three inches ahead of the drive wheels 410, 420 in the direction of robot forward motion. The radial range of wall detection is about 0.75 inches. The proximity sensor 730 may be used to execute wall following behaviors, examples of which are described in U.S. Pat. No. 6,809,490, the entire contents of which are hereby incorporated by reference.


In some implementations, the proximity sensor 730 includes an emitter and a detector disposed substantially parallel to one another. The emitter has an emission field projected substantially parallel to a detection field of the detector. The proximity sensor 730 provides a signal to the controller 450, which determines a distance to a detected object (e.g., a wall). The proximity sensor 730 must be calibrated to detect accurately and allow the controller 450 to determine an object distance. To calibrate the proximity sensor 730 to the albedo (e.g., color or reflectivity) of an adjacent object, the robot 100 bumps into the object on its dominant side and records a reflection characteristic. In the example of an infrared emitter and detector, the controller 450 records a reflection intensity at the moment of contact with the object, which is assumed to be a wall. Based on the recorded reflection intensity at the known calibration distance between the edge of the body 300 and the proximity sensor 730, the controller 450 can determine a distance to the wall thereafter while driving alongside the wall. The controller 450 can implement servo control on the drive motors 412, 422 to drive at a certain distance from the wall, and hence wall follow. The robot 100 may periodically turn into the wall to side-bump the wall and re-calibrate the proximity sensor 730. If the proximity sensor 730 senses an absence of the wall, the robot 100 may decide to re-calibrate the proximity sensor 730 upon recognition of the wall again.
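The bump-to-calibrate scheme above can be sketched with a simple falloff model. The inverse-square intensity model here is an assumption for illustration only; the disclosure does not specify how intensity maps to distance:

```python
import math

# Sketch: albedo calibration for the side proximity sensor 730. At the
# moment of a side bump the wall sits at a known calibration distance,
# so the recorded intensity anchors the model. The inverse-square
# falloff is an illustrative assumption, not the disclosed mapping.

class ProximityCalibration:
    def __init__(self, cal_distance_cm):
        self.cal_distance = cal_distance_cm   # known body-edge-to-sensor distance
        self.cal_intensity = None

    def record_bump(self, intensity):
        """Called when the bump sensor confirms contact with the wall."""
        self.cal_intensity = intensity

    def estimate_distance(self, intensity):
        """Distance estimate from a later reading, relative to calibration."""
        if self.cal_intensity is None:
            raise RuntimeError("sensor not yet calibrated against a wall")
        return self.cal_distance * math.sqrt(self.cal_intensity / intensity)

cal = ProximityCalibration(cal_distance_cm=2.0)
cal.record_bump(intensity=0.9)               # reflection at the known distance
d = cal.estimate_distance(intensity=0.225)   # quarter intensity -> double distance
assert abs(d - 4.0) < 1e-9
```

Re-calibrating on each side bump, as the text describes, keeps the model anchored to the current wall's albedo rather than a factory constant.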


The robot 100 can actively wall follow on its dominant side by using the proximity sensor 730. The robot 100 can passively wall follow on its non-dominant side (or the dominant side if the proximity sensor 730 is not present or active). After bumping into an object (e.g., sensed by the bump sensor 800), the robot 100 can assume that the object is a wall and turn to follow the wall. The robot 100 may back up before turning, so as to not catch a front corner of the body 300 on the object/wall, thus re-triggering the bump sensor 800 in a forward direction. After turning (e.g., about 90°), the robot 100 drives straight (e.g., along the wall) and slightly turns into the wall, so as to scrape along the wall. The robot 100 can sense that it is scraping along the wall by detecting a side-bump via the multi-directional bump sensor 800, which will be described below. The robot 100 can continue to passively wall follow until the bump sensor 800 no longer detects a side-bump on the current wall-following side of the robot 100 for a certain period of time.
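The passive wall-following sequence above amounts to a small state machine. The states, the turn angle, and the time-out below are illustrative assumptions consistent with, but not specified by, the text:

```python
# Sketch: the passive wall-following sequence as a minimal state machine.
# State names, the ~90-degree turn, and the quiet-period limit are
# illustrative assumptions.

BACK_UP, TURN, FOLLOW, DONE = "back_up", "turn", "follow", "done"

def next_state(state, front_bump, side_bump, quiet_ticks, quiet_limit=20):
    if state == BACK_UP:
        return TURN                      # reverse briefly, then turn ~90 deg
    if state == TURN:
        return FOLLOW                    # drive straight, biased into the wall
    if state == FOLLOW:
        if front_bump:
            return BACK_UP               # obstacle ahead; restart the sequence
        if not side_bump and quiet_ticks >= quiet_limit:
            return DONE                  # no side-bump for a while; wall lost
        return FOLLOW
    return DONE

state = BACK_UP
state = next_state(state, False, False, 0)   # back up complete -> turn
state = next_state(state, False, False, 0)   # turn complete -> follow
state = next_state(state, False, True, 0)    # side-bump: still scraping the wall
assert state == FOLLOW
```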


The robot 100 can passively wall follow due in part to the flat sides of the body 300 and the rear placement of the drive wheels 410, 420. The flat sides allow the robot 100 to scrape along the wall (e.g., substantially parallel to the wall). The positioning of the drive wheels 410, 420 in the rear portion 220 of the chassis 200 allows the robot 100 to swing its forward portion 210 of the chassis 200 into the wall, so as to scrape along the wall. Referring to FIG. 12A, the robot 100 moving forward while in contact with a wall 30 is subject to two forces—a force normal to the wall, Fn, and a force tangential to the wall, Ft. These forces create opposing torques about a point midway between the wheels, the natural center of rotation of the robot. It can be shown that the torque, τ, is:

τ = rF(cos θ sin θ − μ sin² θ)

where μ is the coefficient of friction between the wall and the robot. Given a value for μ, there is some critical angle θc where the torques are balanced. For θ < θc, the first term on the right of the equation is larger and the robot tends to align with the wall. If θ > θc, then the second term is larger and the robot 100 tends to turn into the wall.
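Setting the torque expression to zero and dividing through by sin θ gives cos θ = μ sin θ, so the critical angle is θc = atan(1/μ). A short sketch of this balance condition:

```python
import math

# Sketch: solving rF(cos(theta)*sin(theta) - mu*sin(theta)**2) = 0 for
# the non-trivial root gives tan(theta_c) = 1/mu.

def critical_angle(mu):
    """Approach angle (radians) at which the two wall torques balance."""
    return math.atan(1.0 / mu)

def aligns_with_wall(theta, mu):
    """True when the aligning term dominates (theta < theta_c)."""
    return theta < critical_angle(mu)

mu = 0.5                                   # illustrative friction coefficient
theta_c = critical_angle(mu)               # roughly 63 degrees for mu = 0.5
assert aligns_with_wall(math.radians(30), mu)        # shallow approach: aligns
assert not aligns_with_wall(math.radians(80), mu)    # steep approach: turns in
```

The formula also shows why lower-friction contact surfaces (see the contact elements discussed later) widen the range of approach angles from which the robot settles into alignment.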


Certain robot geometries, such as the tombstone shape of the robot disclosed, can achieve useful values for θc. Note that the standard cylindrical geometry has θc = π/2 regardless of the robot's approach angle to the wall. Thus, passive wall following cannot be achieved with this configuration. To successfully passively wall follow, the offset between the natural axis of robot rotation and the contact point with the wall should be as far forward as possible when robot motion is aligned with the wall. Also, the maximum wall step height that allows passive recovery is an important consideration and is affected by robot shape.


In some examples, the robot 100 can semi-passively wall follow. The robot 100 wall follows on its dominant side, which has the side proximity sensor 730. After detecting an object, assumed to be a wall, by either the bump sensor 800 or the proximity sensor 730, the robot 100 turns to align the dominant side of the robot 100 with the assumed wall. The robot 100 then proceeds to drive along the wall while turning slightly into the wall so as to scrape along the wall. The robot 100 maintains contact with the wall by sensing contact with the wall via the bump sensor 800 or the proximity sensor 730, and the controller 450 implements servo control of the drive motors 412, 422 to drive accordingly along the wall.


In some examples, as shown in FIG. 12B, the robot 100 includes a contact element 180 (e.g., a roller, bearing, bushing, or soft contact point) disposed at one or both of the front corners of the robot 100 to aid wall following. Preferably, the contact element 180 is at least disposed on the front corner of the dominant side of the robot 100. As the robot 100 moves along the wall, it contacts the wall with the contact element 180, instead of merely scraping along the wall. In some implementations, the contact element 180 is a side brush that rotates about a vertical axis and extends beyond the body 300. The side brush maintains a buffer space between a wall and the robot body 300.


The bump sensor 800 is used to determine when the robot 100 has physically encountered an object. Such sensors may use a physical property such as capacitance or physical displacement within the robot 100 to determine when it has encountered an obstacle. In some implementations, the bump sensor 800 includes contact sensors disposed about the periphery of the body 300. In preferred implementations, the bump sensor 800 is configured to detect movement of the body 300 over the chassis 200. Referring to FIGS. 5, 10 and 13A-13D, the body 300 of the robot 100 functions as a bumper and is flexibly coupled to the chassis 200 by one or more elastic elements 309 (e.g., springs, flexible pins, elastomeric pegs, etc.) (see FIG. 5). The elastic elements 309 allow the bumper style body 300 to move in at least two directions (preferably three directions). In some examples, the bump sensor 800 includes a bump sensor base 810 carrying at least three (preferably four) detectors 820 (e.g., light or infrared light detectors, such as a photo-detector) equally spaced on the bump sensor base 810. In the example shown, the bump sensor base 810 is a printed circuit board carrying the detectors 820. The printed circuit board-bump sensor base 810 is in communication with and may carry the controller 450. The bump sensor 800 includes a bump sensor shroud 830 defining a cavity 832 that is positioned over and covering the bump sensor base 810. The bump sensor shroud 830 houses an emitter 840 (e.g., light or infrared light emitter), which emits a signal 842 (e.g., light) through an orifice 834 defined by the bump sensor shroud 830 through a wall 836 of the cavity 832. The orifice 834 collimates the signal 842, so as to have a directed path. As the bump sensor shroud 830 moves over the bump sensor base 810, the signal 842 moves over the detectors 820, which provide corresponding signals to the controller 450 (e.g., proportional to signal intensity).
Based on the detector signals, the controller 450 is configured to determine the direction of movement of the body 300 over the chassis 200, and optionally the rate of movement. The bump sensor 800 may detect 360 degrees of movement of the bump sensor shroud 830 over the bump sensor base 810. The drive system 400 and/or the controller 450 are configured to alter a drive direction of the robot 100 in response to the detector signal(s) received from the detectors 820.
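The direction determination above can be sketched as an intensity-weighted vector sum over the detector positions. The detector bearings and intensity readings below are illustrative assumptions about one possible four-detector layout:

```python
import math

# Sketch: recovering the bump direction from the four detectors 820.
# Each detector sits at an assumed bearing on the base 810; as the
# shroud 830 shifts, the collimated spot 842 illuminates detectors in
# the displacement direction, so an intensity-weighted vector sum of
# the bearings estimates that direction. All values are illustrative.

DETECTOR_BEARINGS = [0.0, 90.0, 180.0, 270.0]   # degrees around the base 810

def bump_direction(intensities):
    """Estimated direction (degrees) of body displacement over the chassis."""
    x = sum(i * math.cos(math.radians(b))
            for i, b in zip(intensities, DETECTOR_BEARINGS))
    y = sum(i * math.sin(math.radians(b))
            for i, b in zip(intensities, DETECTOR_BEARINGS))
    return math.degrees(math.atan2(y, x)) % 360.0

# spot mostly over the 90-degree detector, spilling onto its neighbors
assert abs(bump_direction([0.2, 0.9, 0.2, 0.0]) - 90.0) < 1e-6
```

Interpolating between adjacent detectors in this way is what allows a small number of detectors to resolve a full 360 degrees of shroud movement.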


In the example shown in FIGS. 13A and 13C, the bump sensor 800 includes a bumper guide 850 that guides the body 300 along two directions of movement. As noted above, the body 300 is coupled to the chassis by elastic elements 309 that allow the body 300 to be displaced both by translation and rotation. The bumper guide 850 may be configured as a "T", cross shaped, or orthogonal groove(s) 852 formed in a member that moves with the bumper 300 (relative to the chassis 200), mated to at least one guide pin 854 on the chassis 200 that does not move (relative to the chassis 200). In other implementations, the bumper guide 850 is defined in a portion of the chassis 200 and the guide pin 854 is secured to the bumper body 300. When the bumper 300 is displaced, the bumper guide 850 tends to guide the bumper 300 in that area along an arm of the bumper guide 850, which permits translational bumps to register directly and otherwise tends to reduce rotational components or convert rotation into translation, improving the detection of the bump sensor 800.


In the examples shown in FIGS. 5, 10 and 13D, the bump sensor 800 includes a bumper connector arm 850 secured between the bump sensor shroud 830 and the bumper style body 300. The bumper connector arm 850 translates movement of the body 300 to the bump sensor shroud 830. The bump sensor shroud 830 can be secured to the bump sensor base 810 and be comprised of an elastic material such that the bump sensor shroud 830 can move by elastic deflection in relation to the bump sensor base 810. In other examples, the bump sensor shroud 830 is positioned over the bump sensor base 810 and allowed to move freely in relation to the bump sensor base 810.


The robot 100 has a forward drive direction and carries the omni-directional receiver 900 on an upper portion 305 of the body 300 above the forward portion 210 of the chassis 200. FIG. 1 illustrates an example position of the omni-directional receiver 900 on the robot 100, as being the highest part of the robot 100. The omni-directional receiver 900 may be used to sense when the robot 100 is in close proximity to a navigation beacon (not shown). For example, the omni-directional receiver 900 may relay a signal to a control system that indicates the strength of an emission, where a stronger signal indicates closer proximity to a navigation beacon.



FIGS. 14-16 show perspective, side, and cut-away views of the omni-directional receiver 900. The omni-directional receiver 900 includes a housing 910, a conical reflector 920 and an emission receiver 930. The housing 910 has an upper portion 912 and an inner cavity 916. The upper portion 912 may allow a transmission of an emission into the inner cavity 916. The conical reflector 920 is located on an upper surface of the cavity 916 to reflect emissions falling on the upper portion 912 of the housing 910 into the inner cavity 916. The emission receiver 930 is located in the inner cavity 916 below the conical reflector 920. In some implementations, the omni-directional receiver 900 is configured to receive transmissions of infrared light (IR). In such cases, a guide 940 (e.g., a light pipe) may guide emissions reflected off the conical reflector 920 and channel them to the emission receiver 930.


The controller 450 may be configured to propel the robot 100 according to a heading setting and a speed setting. Signals received from the navigational sensor system 700 may be used by a control system to issue commands that deal with obstacles, such as changing the commanded speed or heading of the robot 100. For instance, a signal from the proximity sensor 730 due to a nearby wall may result in the control system issuing a command to slow down. In another instance, a collision signal from the bump sensor 800 due to an encounter with an obstacle may cause the control system to issue a command to change heading. In other instances, the speed setting of the robot 100 may be reduced in response to the contact sensor and/or the heading setting of the robot 100 may be altered in response to the proximity sensor 730.


The controller 450 may include a first independent behavioral routine configured to adjust the speed setting of the robot 100; and a second independent behavioral routine configured to alter the heading setting of the robot 100, in which the first and second independent behavioral routines are configured to execute concurrently and mutually independently. The first independent behavioral routine may be configured to poll the proximity sensor 730, and the second independent behavioral routine may be configured to poll the bump sensor 800. While implementations of the robot 100 discussed herein may use behavior-based control only in part or not at all, behavior-based control is effective at keeping the robot robust (i.e., not getting stuck or failing) as well as safe.
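The two independent behavioral routines can be sketched as concurrent threads sharing the speed and heading settings. The thread structure, polling counts, and sensor stubs below are illustrative assumptions, not the disclosed control architecture:

```python
import threading
import time

# Sketch: two mutually independent behavioral routines, one polling a
# proximity-sensor stub to adjust speed and one polling a bump-sensor
# stub to alter heading. All values and structure are illustrative.

class Settings:
    def __init__(self):
        self.lock = threading.Lock()
        self.speed = 1.0       # normalized forward speed setting
        self.heading = 0.0     # heading setting in degrees

def speed_behavior(settings, wall_near, ticks):
    for _ in range(ticks):                 # poll the proximity sensor 730
        if wall_near():
            with settings.lock:
                settings.speed = 0.5       # slow down near a wall
        time.sleep(0.001)

def heading_behavior(settings, bumped, ticks):
    for _ in range(ticks):                 # poll the bump sensor 800
        if bumped():
            with settings.lock:
                settings.heading = (settings.heading + 90.0) % 360.0
        time.sleep(0.001)

s = Settings()
t1 = threading.Thread(target=speed_behavior, args=(s, lambda: True, 3))
t2 = threading.Thread(target=heading_behavior, args=(s, lambda: False, 3))
t1.start(); t2.start(); t1.join(); t2.join()
assert s.speed == 0.5 and s.heading == 0.0   # wall seen, no bump occurred
```

Each routine reacts only to its own sensor, so either can be disabled or replaced without touching the other, which is the practical appeal of running the behaviors concurrently and independently.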



FIGS. 17-25 illustrate another implementation of the autonomous coverage robot 101. The robot 101 includes a chassis 200 having a forward portion 210 and a rearward portion 220 and a body 300 having a forward portion 301 and a rearward portion 303 configured to substantially follow the contours of the chassis 200. The forward portion 210 of the chassis 200 defines a substantially rectangular shape and the rearward portion 220 defines an elliptical shape. The forward portion 301 of the body 300 may be flexibly connected to the chassis 200. A handle 330 is disposed on or defined by an upper portion 305 of the rearward portion 303 of the body 300.


In an example configuration, the form factor of the robot 101 is about 15 cm in diameter, about 7.5 cm in height, and functions on battery power to clean for about six hours before requiring recharge. Also, for example, the robot 101 may effectively clean the floor of a single average-size room in about 45 minutes, or several smaller areas.


Referring to FIGS. 18, 20 and 21, the robot 101 includes a drive system 400 carried by the chassis 200, as described above. In the implementation shown, the drive motors 412, 422 are disposed adjacent and in-line (e.g., co-axial) with their respective drive wheels 410 and 420. In some examples, the robot includes a gear box 414, 424 coupled between the drive wheel 410, 420 and its respective drive motor 412, 422. The robot 101 includes a controller 450 in communication with the drive system 400. The controller 450 is configured to maneuver the robot 101 to pivot in place.


The robot 101 includes a cleaning assembly 500 mounted on the front portion 210 of the chassis 200. The cleaning assembly 500 includes a first, front roller brush 510 rotatably mounted substantially near and substantially parallel to the front edge 202 of the chassis 200. The cleaning assembly 500 also includes second and third side roller brushes 550, 560 rotatably mounted orthogonally to the front roller brush 510 substantially near respective right and left side edges 306, 308 of the body 300. The roller brushes 510, 550, 560 are driven by a cleaning motor 530 coupled to the roller brushes 510, 550, 560 by a gear box 532. The cleaning motor 530 is positioned rearward of the front roller brush 510 and between the side roller brushes 550, 560.


The robot 101, in a preferred implementation, includes only one kind of cleaning mechanism. For example, the robot 101 shown in FIG. 18 includes bristle-brush rollers for the front roller brush 510 and side roller brushes 550, 560. The bristle-brush rollers may be similar to the brush rollers found in the SCOOBA® robot marketed by iRobot Corporation, for example; or it may be similar to the R2 or R3 brush types used in the ROOMBA® robot, as further examples. In one implementation, the brush does not pick up long hairs or fibers that would tend to become tightly wrapped around the brush, in order to minimize the frequency of maintenance required by the user for removing debris from the brush. Alternatively, the robot 101 may include two or more varieties of cleaning mechanism, such as both a vacuum and bristle brushes, inter alia.


In some examples, the front roller brush 510 and the side roller brushes 550, 560 each rotate about a horizontal axis parallel to the work surface, thereby providing a horizontal cleaning assembly 500, although the main work width of the coverage robot 100 may include vertically rotating brushes, no brushes in lieu of a vacuum, a reciprocating brush, a circulating belt member, and other known cleaning implements. Each roller brush 510, 520, 550, 560 may have a cylindrical body that defines a longitudinal axis of rotation. Bristles are attached radially to the cylindrical body, and, in some examples, flexible flaps are attached longitudinally along the cylindrical body. As the roller brush 510, 520, 550, 560 rotates, the bristles and the flexible flaps move debris on the work surface, directing it toward the bin 610 in the robot 100. In examples including a vacuum unit, the brushes 510, 520, 550, 560 may also direct debris or dirt toward a suction path under the cleaning robot 100. In the case of a wet cleaning robot, the brushes 510, 520, 550, 560 may have instead a scrubbing function, and a vacuum or other collector may collect waste fluid after scrubbing.


In the examples shown, the effective components of the cleaning assembly 500 such as the brushes 510, 550, 560 are disposed toward the extreme front corners of the forward portion 210 of the chassis 200. As a result, the area of floor that the rectangular forward portion 210 of the chassis 200 can cover is maximized, and portions of the floor that are not covered are minimized, as illustrated in FIG. 27.


By including only a single cleaning mechanism, such as the cleaning assembly 500, rather than a combination of two or more varieties of cleaning mechanisms (such as, for example, both a roller brush and a vacuum; or both wet and dry cleaning mechanisms, which may necessitate two or more storage chambers, inter alia), the robot 101 may be made more compact relative to otherwise.


Referring to FIGS. 18, 20, 21 and 24, the robot 101 includes a bin assembly 600, as described above. In the examples shown, the chassis 200 defines the debris chamber or bin 610, which is positioned between the cleaning assembly 500 and the drive system 400. In specific examples, the bin 610 is forward of the drive wheels 410, 420 and rearward of the front roller brush 510. As the front roller brush 510 and the side roller brushes 550, 560 spin against the floor, they agitate debris and sweep the debris into a debris chamber/bin 610 within the robot 101 via an intake slot or other suitable opening leading from the roller brushes 510, 550, 560 to the debris chamber 610.


The bin cover 620, in the example shown, is releasably connected to the chassis 200 by one or more hinges 622 (e.g., living hinge, peg and socket, etc.). In some implementations, the bin-cover release 630 is actuatable from substantially near or at the handle 330, thereby allowing actuation of the bin-cover release 630 while holding the handle 330. In other implementations, the bin-cover release 630 is actuatable near or on the bin cover 620, such that a user holds the handle 330 with one hand and opens the bin cover 620 via the bin-cover release 630 with another hand (see FIG. 24). In some implementations, the bin-cover release 630 is a spring-biased latch or latching button actuatable by pressing downwardly (e.g., button) or pulling upwardly (e.g., trigger).


In the examples shown, the robot 101 includes a handle 330 disposed on or defined by an upper portion 305 of the body 300. A user can grasp the handle 330 to lift the robot 101 and transport it manually. In addition, the robot 101 may include one or more buttons 632 proximal to the handle 330. The button 632 is preferably operable by one hand, while the user's hand grips the robot 101 by the handle 330. The button 632 is configured to actuate a bin-cover release 630, which is operable to control holding the bin cover 620 in its closed position and releasing the bin cover 620 to move to its open position. In one example, as illustrated in FIG. 24, when the user operates the button 632, the bin-cover release 630 disengages and the bin cover 620 swings open about the hinges 622. With the bin cover 620 in its open position, the contents of the debris chamber/bin 610 can drop out of the robot 101 under the force of gravity. The robot 101 may also include a spring to ensure that the bin cover 620 opens in case the weight of the debris in the debris chamber 610 is insufficient to swing the bin cover 620 open, for example.


The robot 101 includes a power source 160 (e.g., battery) in communication with the drive system 400 and/or the controller 450, and removably secured to the chassis 200. In the examples shown in FIGS. 20 and 21, the power source 160 is received by a power receptacle 260 defined by the rearward portion 220 of the chassis 200. A power cover 262 is releasably secured to the chassis 200 to hold and/or cover the power source 160 in the power receptacle 260. In the examples shown, the power source 160 is positioned in the rearward portion 220 of the chassis 200, rearward of the drive wheels 410, 420. In this position, the weight of the power source 160 offsets the weight of the cleaning assembly 500 to position a center of gravity of the robot 101 substantially about a center of the chassis 200.


The compact dimensions of the robot 101 allow the robot 101 to navigate under potential obstacles such as chairs, tables, sofas, or other household objects, and perform floor cleaning in these hard-to-reach areas. In addition, the robot 101 may include a clearance sensor disposed on a top surface thereof, such as a sonar range-finder or light-sensitive diode, that scans directly overhead. When the clearance sensor detects the presence of an object within a threshold distance—such as, for example, two feet—the robot 101 may continue moving until the overhead space is clear. Accordingly, the robot 101 may avoid becoming “lost” underneath furniture, out of view of the user, for example.


As the drive system 400 propels the robot 101 over the floor, the front roller brush 510 preferably rotates in the same direction as the drive wheels 410, 420 but at a rate faster than the rate of the robot 101 traversing over the floor, so as to sweep debris into the debris chamber 610. In addition, the side brushes 550, 560 also sweep debris inward at the same time. In one example, the bristles of the brushes 510, 550, 560 may extend downward by about 0.015 to 0.025 inches beyond the extent of the wheels 410, 420, while rotating at between about 600 and about 1600 RPM.


The form factor of the robot 101 may be made more compact by omitting a caster wheel or other support structure. Due to the width of the front brush roller 510, as well as the side brushes 550, 560 disposed at opposite lateral sides of the robot 101, the robot 101 may omit a third caster or free wheel aside from the drive wheels 410, 420 without significantly impacting the balance or stability of the robot 101. Alternatively, the robot 101 may further include support bearings 490, as shown in FIGS. 18, 20, and 22-25, disposed proximal to the extreme opposite corners of the forward portion 210 of the chassis 200. The support bearings 490 may include a single rigid member of a smooth and/or self-lubricating material, such as polytetrafluoroethylene or a polyoxymethylene polymer; or, the support bearings 490 may include a roller bearing or any other suitable mechanism for preventing the robot 101 from tipping or losing balance while providing a low frictional resistance as the robot 101 traverses the floor.


Referring to FIG. 21, the robot 101 includes a navigational sensor system 700 in communication with the controller 450 that allows the robot 101 to be aware of its surroundings/environment and react in prescribed manners or behaviors according to its sensed perception of its surroundings/environment. In the example shown, the navigational sensor system 700 includes one or more bump sensors 800 and/or a stasis detector 720. Using input from the navigational sensor system 700, the controller 450 generates commands to be carried out by the robot 101. As a result, the robot 101 is capable of cleaning surfaces in an autonomous fashion.


The bump sensor 800 is used to determine when the robot 100 has physically encountered an object. Such sensors may use a physical property such as capacitance or physical displacement within the robot 100 to determine when it has encountered an obstacle. In the example shown in FIG. 21, the bump sensor 800 is a contact switch disposed about the periphery of the front portion 210 of the chassis 200, between the chassis 200 and the forward portion 301 of the body 300. The forward portion 301 of the body 300 is flexibly or slidably attached to the chassis 200 in a manner that allows contact with an obstacle to be translated to the bump sensor(s) 800. In preferred implementations, the robot includes bump sensors 800 disposed at the forward corners of the chassis 200, with at least one bump sensor 800 disposed on each side of each corner, thus allowing the robot 100 to determine a direction and/or location of a collision. The forward portion 301 of the body 300 acts as a single mechanical bumper with sensors 800 substantially at the two ends of the bumper for sensing movement of the bumper. When the forward portion 301 of the body 300 is compressed, the timing between the sensor events is used to calculate the approximate angle at which the robot 101 contacted the obstacle. When the forward portion 301 of the body 300 is compressed from the right side, the right bump sensor detects the bump first, followed by the left bump sensor, due to the compliance of the bumper and the bump detector geometry. This way, the bump angle can be approximated with only two bump sensors.
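The two-sensor bump-angle approximation described above may be sketched as follows; the linear mapping from timing skew to angle, and the `max_skew` and `max_angle` constants, are illustrative assumptions rather than values specified herein:

```python
def approximate_bump_angle(t_left, t_right, max_skew=0.05, max_angle=45.0):
    """Estimate the angle of impact (degrees) from the trigger times
    (seconds) of the left and right bump sensors 800.  Zero means a
    head-on hit; a positive angle means the hit came from the right
    (the right sensor fired first).  The constants are illustrative."""
    skew = t_left - t_right                       # > 0: right sensor fired first
    skew = max(-max_skew, min(max_skew, skew))    # clamp to the expected range
    return (skew / max_skew) * max_angle
```

In practice, the mapping from timing skew to contact angle would be calibrated against the compliance of the bumper and the bump detector geometry noted above.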


Since the robot 101 preferably has a compact and lightweight form, the momentum carried by the robot 101 may be lighter than a standard-size robot. Accordingly, the robot 101 preferably includes “light touch” or contactless bump sensors. For example, the robot 101 may include one or more accelerometers 458 in communication with the controller 450 (see FIG. 21) for monitoring the robot's acceleration along at least one horizontal axis. When acceleration is detected that exceeds a pre-established threshold, the robot 101 may respond as though a bumper switch had been triggered. As a result, the robot 101 may omit a traditional contact-switch type bump sensor.
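Such contactless bump detection reduces to comparing horizontal acceleration against a pre-established threshold, as in this minimal sketch (the threshold value is an illustrative assumption):

```python
def detect_bump(accel_x, accel_y, threshold=2.0):
    """Treat a horizontal acceleration spike exceeding `threshold`
    (m/s^2, illustrative) as a bump event, emulating a triggered
    bumper switch without any contact hardware."""
    magnitude = (accel_x ** 2 + accel_y ** 2) ** 0.5
    return magnitude > threshold
```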


In some examples, the robot 101 may utilize the accelerometer 458 as a stasis detector 720. As a benefit, processing accelerometer data for stasis detection may require only a processing rate of about 30 hertz. For example, as the robot 101 is moving over a floor, vibrations cause the accelerometer 458 to detect acceleration of a particular amplitude profile. However, when the robot 101 stops moving, whether as part of normal operation or because it has been blocked by an obstacle, the amplitude of the vibrations detected by the accelerometer 458 decreases accordingly. Therefore, the robot 101 can respond to such decreased acceleration according to a stasis-escape behavior, for example. By monitoring a single accelerometer 458 for purposes of both bump detection and/or stasis detection, the robot 101 may omit bump switches and/or other stasis detection hardware, thus potentially requiring less space aboard the robot 101.
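The amplitude-based stasis detection described above may be sketched as follows; the sliding-window length and amplitude floor are illustrative assumptions, with samples arriving at roughly the 30 hertz rate noted above:

```python
from collections import deque

class StasisDetector:
    """Declare stasis when the recent vibration amplitude (peak-to-peak
    acceleration over a sliding window) falls below a floor.  The window
    size and floor are illustrative assumptions."""

    def __init__(self, window=30, amplitude_floor=0.1):
        self.samples = deque(maxlen=window)
        self.amplitude_floor = amplitude_floor

    def update(self, accel):
        self.samples.append(accel)
        if len(self.samples) < self.samples.maxlen:
            return False                      # not enough history yet
        amplitude = max(self.samples) - min(self.samples)
        return amplitude < self.amplitude_floor
```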


Referring to FIGS. 26-28, the robot 100, 101 can navigate over floor surfaces such as tile, hardwood or carpeting, while collecting debris from the floor within the debris chamber/bin 610. When the robot 100, 101 navigates into a corner, the front roller brush 510 and the end brushes 540 or the side roller brushes 550, 560, respectively, can effectively clean an area that is flush up against the sides of the corner. In comparison, a round-outline robot 10, such as illustrated in FIG. 28, can approach a corner 9220 but cannot move flush against the walls 9241, 9242 intersecting at the corner 9220. As a result, the round-outline robot 10 cannot effectively clean the wedge-shaped area 9290 abutting the corner 9220. As illustrated in FIG. 26, the robot 100, 101 can navigate along a straight path while remaining substantially flush against a wall edge 9210 where a wall 9241 intersects the floor 9250. The robot 100, 101 preferably includes one or more bump sensors 800, 1800 disposed or active within the front portion 210 of the chassis 200; and as the robot 100, 101 taps against the wall 9241, the robot 100, 101 can adjust its heading so as to travel substantially parallel to the wall 9241, for example.


The operation of the robot 101 is preferably controlled by a microcontroller 450, such as a FREESCALE™ QG8 or other microcontroller suitable to receive input from the robot's sensors and operate the motors or other output devices of the robot 101. As illustrated in FIGS. 29-32, for example, the microcontroller 450 receives input from bump sensor 800 and outputs control signals to the drive motors 412, 422 coupled to the right and left drive wheels 410, 420. Alternatively, a microprocessor or other control circuitry may be used. The robot 101 may execute behavior-based control software; or may operate according to simple, single-threaded control loops, inter alia.


Because the rectangular outline of the front portion 210 of the chassis 200 may cause the corners thereof to collide with obstacles that might not be detected by bump sensors or cliff sensors (in contrast to round-outline robots, which can rotate freely without such risk), the robot 101 preferably responds to bumps detected while rotating in place by halting the rotation and backing up directly in reverse. As a result, the robot 101 may be less likely to become inextricably wedged or stuck, notwithstanding the square corners of the front portion 210 of the chassis 200. Alternatively, the robot 101 may behave in accordance with control software generally similar to the ROOMBA™ or SCOOBA™ robots, as examples.


In accordance with a further example, the robot 100, 101 may automatically return to a cradle or base station for storage after completing a cleaning cycle. The robot 100, 101 may also include an electrical interface for recharging on-board batteries. Additionally, the cradle or base station may include a receptacle positioned below a “home” position of the robot 100, 101. When the robot 100, 101 interfaces the cradle and stops at the home position, the robot 100, 101 may automatically actuate the bin-cover release 630 and evacuate the debris from the debris chamber 610 into the cradle's receptacle positioned below the robot 100, 101.


In robot implementations using the omni-directional receiver 900, the base station may include an omni-directional beam emitter and two navigational field emitters. The robot 100 may maneuver toward the base station by detecting and advancing along one of the lateral field edges of the overlapping fields aligned with a docking direction until docked with the base station. The robot 100 may detect the emissions of the base station with the omni-directional receiver 900 and maneuver to detect an outer lateral field edge of at least one field emission. The robot 100 may then advance along the outer lateral field edge to the aligned lateral field edge of the overlapping fields. Upon detecting the aligned lateral field edge, the robot 100 advances along the aligned lateral field edge until docked with the base station.



FIG. 33 is a block diagram showing a behavioral software architecture within the controller 450. The behavioral software architecture includes goal-oriented behaviors. The robot 100, 101 employs a control and software architecture that has a number of behaviors that are executed by an arbiter 1005 in the controller 450. The arbiter 1005 executes commands on the motor drives 1010 in communication with each drive motor 412, 422. A behavior is entered into the arbiter 1005 in response to a sensor event. In one implementation, all behaviors have a fixed relative priority with respect to one another. The arbiter 1005 (in this case) recognizes enabling conditions, determines which behaviors have a full set of fulfilled enabling conditions, and selects the behavior having the highest priority among those that have fulfilled enabling conditions. The diagram shown in FIG. 33 does not necessarily reflect the (fixed) priority hierarchy of the robot 100, 101. In order of decreasing priority, the behaviors are generally categorized as escape and/or avoidance behaviors (such as avoiding a cliff or escaping a corner) and working behaviors (e.g., wall following, bouncing, or driving in a straight line). Movement of the robot 100, 101, if any, occurs while a behavior is arbitrated. If more than one behavior is in the arbiter 1005, the behavior with a higher priority is executed, as long as any corresponding required conditions are met. For example, a cliff avoiding behavior 1400 will not be executed unless a cliff has been detected by a cliff detection sensor, but execution of the cliff avoiding behavior 1400 always takes precedence over the execution of other behaviors that also have satisfied enabling conditions.
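The fixed-priority arbitration described above reduces to selecting the first behavior, in priority order, whose enabling conditions are fulfilled, as in this simplified stand-in for the arbiter 1005:

```python
def arbitrate(behaviors):
    """Select the highest-priority behavior whose enabling conditions are
    fulfilled.  `behaviors` is an ordered list (highest priority first) of
    (name, enabled) pairs; returns the winning behavior's name, or None
    if no behavior is enabled.  A simplified sketch of the arbiter 1005."""
    for name, enabled in behaviors:
        if enabled:
            return name
    return None
```

In the architecture of FIG. 33, the lowest-priority Drive behavior is effectively always enabled, so arbitration always yields a behavior to execute.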


The reactive behaviors have, as their enabling conditions or triggers, various sensors and detections of phenomena, but, in general, not (arbitrary) states of a sequence. As shown in FIG. 33, these include sensors for obstacle avoidance and detection, such as cliff sensors 710, stasis detector 720, side proximity sensor 730, bump sensor 800, and/or an omni-directional receiver 900 (e.g., for detection of a virtual wall signal (which may instead be considered a coverage trigger)). Sensors of these types are monitored and conditioned by filters, conditioning, and their drivers, which can generate the enabling conditions as well as record data that helps the behavior act predictably and on all available information (e.g., conversion to one-bit “true/false” signals, recording of likely angle of impact or incidence based on strength or time differences from a group of sensors, or historical, averaging, frequency, or variance information).


Actual physical sensors may be represented in the architecture by "virtual" sensors synthesized from the conditioning and drivers. Additional "virtual" sensors may be synthesized from detectable or interpreted physical properties, proprioceptive or interpreted upon the robot 100, 101, such as over-current of a motor, a stasis or stuck condition of the robot 100, 101, battery charge state via coulometry, and other virtual sensors ("virtual N").


In some implementations, the robot 100 includes the following behaviors listed in priority order from high to low: 1) User Interface Group 1100, 2) Factory Test Group 1200, 3) Reverse Bump Follow Group 1300, 4) Cliff Avoid Group 1400, 5) Bounce Rear 1500, 6) Bump Follow Group 1600, 7) Bounce 1700, and 8) Drive 1800. A behavior group refers to a set of behaviors that work together to implement an overall behavior. For example, the “User Interface Group” behavior is a set of three behaviors that handles the user interface while the robot is at rest.


The robot may include a user interface 370, which is a single clean/power button in the examples shown in FIGS. 1 and 17, for allowing a user to interact with the robot 100. The following sub-behaviors of the User Interface Group behavior 1100, prioritized from high to low, execute the user interface 370 implemented as a single clean/power button: 1) User Off 1110, 2) User Start 1120, and 3) User Do Nothing 1130. The following sub-behaviors of the Factory Test Group behavior 1200, prioritized from high to low, implement a factory test mode for quality control purposes: 1) Factory Test Complete 1210, 2) Factory Test Advance 1220, and 3) Factory Test 1230.


The following sub-behaviors, prioritized from high to low, implement the Reverse Bump Follow escape behavior 1300: 1) Reverse Bump Follow Escape Swing 1310, 2) Reverse Bump Follow Turn Out 1320, and 3) Reverse Bump Follow Arc In 1330. Due to the rectangular shape of the front portion 210 of the chassis 200, it is possible for the robot 100 to drive into a space that is too narrow to turn around in (e.g., a parking space). These confinement areas are referred to as canyons. The term "canyon" refers generically to any narrow confinement source. If a cliff is similarly confining the robot 100 to a narrow space, this is referred to as a plank. Since the strategy for escaping these confinement obstacles is the same, the directional cliff sensor and bumper sensor data is aggregated into a set of four "directional confinement" sensors which are the basis for the discussion below. The four sensors are front-left, front-right, rear-left and rear-right. The direction of a reverse bump follow is clockwise if the Reverse Bump Follow Arc In behavior 1330 is driving the robot 100 backward while rotating clockwise. The direction of a reverse bump follow is counterclockwise if the Reverse Bump Follow Arc In behavior 1330 is driving the robot 100 backward while rotating counterclockwise.


The Reverse Bump Follow Escape Swing behavior 1310 causes the robot 100 to turn in place with enough angular progress to escape the detected canyon. The activation condition for the Reverse Bump Follow Escape Swing behavior 1310 is evaluated at the end of the Reverse Bump Follow Turn Out behavior 1320. After the Reverse Bump Follow Escape Swing behavior 1310 is armed, it executes once and then disables itself until armed again by the Reverse Bump Follow Turn Out behavior 1320. At the start of the Reverse Bump Follow Escape Swing behavior 1310, an escape angle is set to a random number between 120 and 160 degrees. The robot 100 then turns in place in the opposite direction of the reverse bump follow direction until the escape angle is achieved. If any rear directional confinement sources appear while turning in place, the robot 100 moves forward to avoid them. If a front directional confinement source is encountered, the turn in place is aborted. After completion of the turn in place, the success of the escape is determined in the following order. First, if the turn in place was aborted due to detection of a front confinement source, the angular progress of the turn in place is compared to a minimum escape angle which is computed by generating a random number between 80 and 120 degrees. If the angular progress does not exceed this amount, a similar maneuver for the Reverse Bump Follow Turn Out behavior 1320 is performed. This is done to return the robot 100 back to an orientation conducive to continuing the reverse bump follow. Second, if the turn in place was aborted due to detection of a front confinement source, and the angular progress exceeded the minimum escape angle computed above but fell short of the escape angle computed at the beginning of the behavior, the following is done. The reverse bump follow activation is cancelled, and a forward bump follow is triggered if the confinement source that stopped the turn in place was a bump. 
This improves the chances that the robot 100 will find its way out of a tight spot without detecting a new canyon and retriggering the reverse bump follow. Third, if the turn in place completed due to achieving the escape angle computed at the start of the behavior, the reverse bump follow activation is cancelled.


The Reverse Bump Follow Turn Out behavior 1320 attempts to orient the robot 100 relative to an obstacle such that forward progress can be made while arcing toward the obstacle again. Simply turning in place as a circular robot would is not sufficient for the robot 100 since the rectangular forward portion 210 of the chassis 200 would, at some point, hit the obstacle and prevent the robot 100 from turning in place further. To avoid this problem, the robot 100 instead follows a tight arc to maintain space from the obstacle. The Reverse Bump Follow Turn Out behavior 1320 begins after the backing up along an arc that is performed in the Reverse Bump Follow Arc In behavior 1330 finishes as a result of the rear bumper getting activated. The first task of the Reverse Bump Follow Turn Out behavior 1320 is to release the bumper 300 from the rear hit. This is done by driving the robot 100 forward until the bumper 300 is released. In the course of doing this, front confinement sources are handled in the following way. A front-left confinement source causes the robot 100 to turn clockwise. A front-right confinement source causes the robot 100 to turn counterclockwise. After the bumper 300 is released, the robot 100 computes a constrained random arc radius and angular progress that it must travel in the forward direction in order to reorient the robot 100 for the next iteration of the Reverse Bump Follow Arc In behavior 1330. The robot 100 travels along this arc until the computed angular progress is achieved. While doing this, the robot 100 responds to the front confinement sensor 710, 730, 800 (e.g., cliff sensor 710, proximity sensor 730, and/or bump sensor 800) on the opposite side of the robot 100 to the obstacle being followed. When this is detected, the robot 100 turns in place in the same rotational direction as the arc it is following. 
The Reverse Bump Follow Turn Out behavior 1320 ends when the computed angular progress is achieved or the front confinement sensor 710, 730, 800 on the same side of the robot 100 as the obstacle being followed is triggered. At the end of the behavior, a random number generator is used to decide whether or not to trigger a Reverse Bump Follow Escape Swing behavior 1310. At a minimum, the probability of triggering the Reverse Bump Follow Escape Swing behavior 1310 will be about 20%. If the angular progress of the Reverse Bump Follow Turn Out behavior 1320 was between about 2 and about 5 degrees, the probability increases to about 50%. If the angular progress is less than 2 degrees, the probability is about 100%.
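The probability schedule described above for triggering the Reverse Bump Follow Escape Swing behavior 1310 may be sketched as follows, using the approximate figures given:

```python
import random

def escape_swing_probability(angular_progress_deg):
    """Probability of triggering the Reverse Bump Follow Escape Swing
    behavior 1310 at the end of the Turn Out behavior 1320, per the
    approximate thresholds described above."""
    if angular_progress_deg < 2.0:
        return 1.0      # about 100% when almost no angular progress
    if angular_progress_deg <= 5.0:
        return 0.5      # about 50% for roughly 2 to 5 degrees
    return 0.2          # about 20% minimum otherwise

def should_trigger_escape_swing(angular_progress_deg, rng=random.random):
    """Draw from `rng` (injectable for testing) against the schedule."""
    return rng() < escape_swing_probability(angular_progress_deg)
```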


The Reverse Bump Follow Arc In behavior 1330 attempts to make forward progress while keeping an obstacle close to one side of the robot 100 by driving backward in an arc that begins shallow and gets progressively more severe with elapsed time in the behavior. The Reverse Bump Follow Arc In behavior 1330 executes when the robot 100 is in the reverse bump following mode 1300 and none of the other reverse bump follow behaviors 1310, 1320 are activated. While traveling in the arc, the robot 100 will respond to the front confinement sensor 710, 730, 800 (e.g., cliff sensor 710, proximity sensor 730, and/or bump sensor 800) on the opposite side of the robot 100 to the obstacle. It does this by turning in place in the opposite rotational direction to the arc being followed. The Reverse Bump Follow Arc In behavior 1330 ends when a rear confinement sensor 710, 800 (e.g., cliff sensor 710 and/or bump sensor 800) is triggered or the arc has made over 120 degrees of angular progress.


The Cliff Avoid Group behavior 1400 is a group of escape behaviors that includes the following sub-behaviors, prioritized from high to low: 1) Cliff Avoid Rear 1410, and 2) Cliff Avoid 1420. Referring to FIG. 4, in preferred implementations, the robot 100 has four cliff sensors 710 positioned at the front-right, front-left, rear-right and rear-left extremes of the robot 100. The front-right and front-left cliff sensors 710A, 710B detect when either of the respective front corners of the robot 100 moves over a cliff. Since the drive system 400 is positioned rearward of the cleaning assembly 500, which is located near the front edge, the robot 100 can back up before an appreciable amount of the robot 100 moves over the cliff edge. The rear-right and rear-left cliff sensors 710C, 710D are positioned directly rearward of the respective right and left drive wheels 410, 420. As a result, the rear-right and rear-left cliff sensors 710C, 710D detect when a rearward portion of the robot 100 moves over a cliff edge before the drive wheels 410, 420 move over the cliff edge, so as to prevent driving in reverse at an angle off of a cliff. If the robot 100 included rear cliff sensors 710 only along a center portion of the rearward portion 220 of the chassis 200, the robot 100 could drive in reverse at an angle and move a drive wheel 410, 420 over a cliff edge before detecting the cliff edge.


The Cliff Avoid Rear behavior 1410 executes whenever the rear cliff sensors 710C, 710D are triggered. Front cliff sensors 710A, 710B are also handled in this behavior 1410, since it has a higher priority than Cliff Avoid 1420. At the beginning of the Cliff Avoid Rear behavior 1410, an escape direction of clockwise or counterclockwise is selected. The decision is made in the following order. 1) If front-left cliff sensor 710B is triggered, set to clockwise. 2) If front-right cliff sensor 710A is triggered, set to counterclockwise. 3) If rear-right cliff sensor 710C is triggered, set to clockwise. 4) If rear-left cliff sensor 710D is triggered, set to counterclockwise. After the direction is set, the robot 100 turns in the specified direction along an arc that is centered on a drive wheel 410, 420. While traveling, the front cliff sensors 710 are monitored and used to alter the direction of travel as follows. If the front-right cliff sensor 710A is triggered, the robot 100 turns in place counterclockwise. If the front-left cliff sensor 710B is triggered, the robot 100 turns in place clockwise. The robot 100 continues to travel as described above until both rear cliff sensors 710C, 710D are not triggering.
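The escape-direction decision order of the Cliff Avoid Rear behavior 1410 may be sketched as:

```python
def rear_cliff_escape_direction(front_left, front_right, rear_right, rear_left):
    """Choose the escape rotation for the Cliff Avoid Rear behavior 1410
    from the triggered cliff sensors, evaluated in the priority order
    described above.  Arguments are booleans for the front-left (710B),
    front-right (710A), rear-right (710C), and rear-left (710D) sensors;
    returns 'cw', 'ccw', or None if no sensor is triggered."""
    if front_left:
        return "cw"
    if front_right:
        return "ccw"
    if rear_right:
        return "cw"
    if rear_left:
        return "ccw"
    return None
```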


The Cliff Avoid behavior 1420 only handles the front cliff sensors 710A, 710B of the robot 100 and typically executes when the robot 100 is driving forward. At the beginning of the Cliff Avoid behavior 1420, an escape direction is chosen based on which front cliff sensors 710A, 710B have been triggered. If only the front-left cliff sensor 710B is triggered, the clockwise escape direction is chosen. If only the front-right cliff sensor 710A is triggered, the counterclockwise escape direction is chosen. If both front cliff sensors 710A, 710B are triggered, the escape direction is randomly selected. An escape angle is randomly chosen between about 25 and about 50 degrees. The Cliff Avoid behavior 1420 starts by backing up straight until both of the front cliff sensors 710A, 710B are not triggering. Then, the robot 100 turns in place until the escape angle is achieved. If either of the front cliff sensors 710A, 710B is retriggered as part of the turn in place, the entire Cliff Avoid behavior 1420 is retriggered and hence re-executed.


The Bounce Rear behavior 1500 runs when the bumper 300 is activated from the rear direction. This most commonly happens when the robot 100 drives backward to release the front part of the bumper 300 as part of the Bounce behavior 1700. The robot 100 drives forward until the bumper 300 is released, and then continues forward another 5 mm in order to reduce the chance that the turn in place about to be performed will retrigger a rear bump. A rotational direction for the turn in place is decided based on the direction of the original rear bumper hit. If the hit came from the rear-right side of the robot 100, counterclockwise is chosen. If the hit came from the rear-left side of the robot 100, clockwise is chosen. If the hit was in the center part of the rear, the direction is randomly chosen. An escape angle is randomly chosen between about 10 degrees and about 200 degrees. The robot 100 turns in the chosen direction until the escape angle is achieved.


The Bump Follow Group 1600 includes the following sub-behaviors prioritized from high to low: 1. Bump Follow Wall Align 1610, 2. Bump Follow Arc In 1620. Bump following is used to escape from and clean cluttered areas. It is also used to follow a wall with the goal of dispersing the robot 100 evenly through its floor space.


The Bump Follow Wall Align behavior 1610 is designed to align the side of the robot 100 with an obstacle such as a wall. If the bump-follow-direction is clockwise, the goal is to have the robot's left side against the wall. If the direction is counterclockwise, the goal is to have the robot's right side against the wall. When bump following is enabled, the Bump Follow Wall Align behavior 1610 begins when a front bump is triggered. The location of the bumper hit is used to decide how much the robot 100 should turn in place before performing another iteration of the Bump Follow Arc In behavior 1620. If the bumper 300 is triggered on the side of the bumper 300 that should not be near the obstacle, the robot 100 sets a turn in place goal of between about 25 and about 45 degrees. This larger increment saves time in the alignment process. If the bumper 300 is triggered on the side that should be near the obstacle, the robot 100 turns in place in the direction that swings the bumper 300 into the obstacle even more. The goal of this maneuver is to see if the bumper 300 tends to stay engaged or releases. If it releases, it suggests that the robot 100 is not yet at a very shallow angle to the wall, and a turn in place goal of between about 5 and about 25 degrees is selected. Otherwise, the robot 100 is probably at a shallow angle to the wall, and a turn in place goal of between about 1 and about 5 degrees is selected. If the turn in place goal was selected to be greater than 5 degrees, the robot 100 backs up until the bumper 300 is released. The robot 100 turns in place in the direction that swings the front of the robot 100 away from the obstacle until the target angle is achieved. If the bumper 300 is retriggered during the turn in place, the robot 100 backs up enough to release it.


The Bump Follow Arc In behavior 1620 runs when the bump following mode 1600 is enabled and Bump Follow Wall Align 1610 is not active. The robot 100 drives forward in a shallow arc, at first, in order to make forward progress. As more time elapses, the arc gradually tightens to bring the robot 100 back in contact with the obstacle. This allows the obstacle to be followed closely which can help the robot 100 find its way around it. If the bump follow mode 1600 was selected to maneuver through clutter, the robot 100 can continue arcing in without a bumper hit for up to about 100 degrees of angular progress. At that point, the bump follow 1600 is considered ended due to the robot escaping. If the bump follow mode 1600 was selected to help disperse the robot 100 through its space, it can continue arcing in without a bumper hit for up to about 210 degrees to allow for turning wall corners. At that point, the wall is considered lost and the bump follow behavior 1600 ends.


The Bounce behavior 1700 runs when the bumper 300 is activated from the front direction. The robot 100 drives backward until the bumper 300 is released. It then continues backward another 30 mm in order to reduce the chance that the turn in place about to be performed will retrigger the bumper 300 from the front. This large additional clearance is required due to the rectangular shape of the forward portion of the bumper 300 creating the potential for the corner of the bumper 300 to swing into contact with the obstacle when turning in place. A rotational direction for the turn in place is decided based on the direction of the original front hit on the bumper 300. If the hit came from the front-right side of the robot 100, counterclockwise is chosen. If the hit came from the front-left side of the robot 100, clockwise is chosen. If the hit was in the center part of the front, the direction is randomly chosen. An escape angle is randomly chosen between about 10 degrees and about 200 degrees. The robot 100 turns in the chosen direction until the escape angle is achieved.
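The direction and escape-angle selection of the Bounce behavior 1700 may be sketched as follows (the injectable random sources are for illustration and testing):

```python
import random

def bounce_turn_direction(hit_location, rng=random.random):
    """Pick the turn-in-place direction for the Bounce behavior 1700 from
    the location of the front bumper hit ('front-right', 'front-left', or
    'center').  Center hits pick a random direction."""
    if hit_location == "front-right":
        return "ccw"
    if hit_location == "front-left":
        return "cw"
    return "cw" if rng() < 0.5 else "ccw"

def bounce_escape_angle(rng=random.uniform):
    """Random escape angle between about 10 and about 200 degrees."""
    return rng(10.0, 200.0)
```

The Bounce Rear behavior 1500 mirrors this logic with the left/right senses swapped for a rear hit.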


The Drive behavior 1800 may run when no other behavior is active. The robot 100 drives straight until it experiences an event that triggers another behavior.


The robot 100 maintains concurrent processes 2000, "parallel" processes that are not generally considered reactive behaviors. As noted, the filters and conditioning 2400 and the drivers 2500 can interpret and translate raw signals. These processes are not considered reactive behaviors, and exercise no direct control over the motor drives or other actuators.


Some parallel processes 2000 are important in assisting the activation and execution of various behaviors. These processes are software finite state machines that are evaluated at a frequency of 64 Hertz, for example. The period is referred to as the processing interval.


In some implementations, the robot 100 includes a Canyon Detect process 2100, which assists in identifying canyons. A canyon is declared by monitoring four signals. Each of these signals is evaluated every processing interval. When the input signal is true, the output signal becomes true. The output signal becomes false after 100 consecutive processing intervals of the input signal being false. The four input signals are evaluated as follows: 1) The front-left cliff sensor 710B is active and the front-right cliff sensor 710A is inactive, or the rear-left cliff sensor 710D is active and the rear-right cliff sensor 710C is inactive. 2) The front-right cliff sensor 710A is active and the front-left cliff sensor 710B is inactive, or the rear-right cliff sensor 710C is active and the rear-left cliff sensor 710D is inactive. 3) The bumper 300 is depressed at the front-left side of the robot 100. 4) The bumper 300 is depressed at the front-right side of the robot 100. The processed versions of these signals are named, respectively, as follows: 1) cliff-left-held; 2) cliff-right-held; 3) bump-left-held; and 4) bump-right-held. A canyon is detected when cliff-left-held or bump-left-held are true while cliff-right-held or bump-right-held are true. When a canyon is detected, the Reverse Bump Following Group 1300 is enabled.
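The held-signal conditioning and canyon declaration described above may be sketched as:

```python
class HeldSignal:
    """A one-bit signal that becomes true as soon as its input is true and
    decays to false only after 100 consecutive processing intervals of a
    false input, per the Canyon Detect process 2100."""
    HOLD_INTERVALS = 100

    def __init__(self):
        self.false_count = self.HOLD_INTERVALS    # start fully released

    def update(self, raw):
        self.false_count = 0 if raw else self.false_count + 1
        return self.false_count < self.HOLD_INTERVALS

def canyon_detected(cliff_left_held, bump_left_held,
                    cliff_right_held, bump_right_held):
    """A canyon is declared when a left-side held signal and a right-side
    held signal are simultaneously true."""
    return ((cliff_left_held or bump_left_held)
            and (cliff_right_held or bump_right_held))
```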


In some implementations, the robot 100 includes a Forward Progress process 2200. In the Forward Progress process 2200, every processing interval, the forward progress of the robot 100 is added to an accumulator while a fixed distance quantity corresponding to 1 millimeter is subtracted. When this accumulator reaches 100 millimeters, forward progress is declared to be true. The accumulator is not allowed to exceed 200 millimeters. When forward progress remains false for 10 seconds, the Reverse Bump Following Group 1300 is enabled to escape the excessively cluttered environment in which the robot 100 is traveling.
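The leaky accumulator above can be sketched as follows. This is an illustration under stated assumptions, not the patent's implementation: whether the accumulator floors at zero and whether the progress flag clears when the accumulator drops back below the threshold are not specified in the text, so both are assumed here.

```python
class ForwardProgress:
    """Leaky accumulator: each processing interval the robot's forward
    travel is added and a fixed 1 mm leak is subtracted. Progress is
    declared once the accumulator reaches 100 mm; the accumulator is
    capped at 200 mm. Floor-at-zero and re-clearing are assumptions."""

    LEAK_MM = 1.0
    DECLARE_MM = 100.0
    CAP_MM = 200.0

    def __init__(self):
        self.accumulator = 0.0
        self.progress = False

    def update(self, travel_mm):
        # Add this interval's travel, subtract the leak, then clamp.
        self.accumulator += travel_mm - self.LEAK_MM
        self.accumulator = min(max(self.accumulator, 0.0), self.CAP_MM)
        self.progress = self.accumulator >= self.DECLARE_MM
        return self.progress
```

At 5 mm of travel per interval the accumulator gains a net 4 mm per interval, so progress is declared on the 25th interval; a stationary robot leaks back toward zero.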


In some implementations, the robot 100 includes a Reverse Bump Follow Arc In Progress process 2300. While the robot 100 is in the reverse bump following mode 1300, the forward progress of each iteration of the Reverse Bump Follow Arc In behavior 1330 is fed into a low-pass filter. At the beginning of a reverse bump follow, this filter is initialized to 60 millimeters. When the filter output falls below 50 millimeters, the arc-in progress is considered poor. This triggers a toggle in the reverse bump follow direction, i.e., the side of the robot 100 on which the primary obstacle is assumed to be.
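The filtering step above can be sketched with a first-order low-pass filter. The smoothing coefficient `alpha` is an assumption, as the patent specifies only the initial value (60 mm) and the poor-progress threshold (50 mm), not the filter coefficient:

```python
def arc_in_progress_poor(progress_samples_mm, alpha=0.25,
                         init_mm=60.0, poor_below_mm=50.0):
    """Feed each arc iteration's forward progress through a first-order
    low-pass filter initialized to 60 mm, and report whether the filtered
    value has fallen below 50 mm ('poor' progress, which would toggle the
    reverse bump follow direction). `alpha` is an assumed coefficient."""
    filtered = init_mm
    for sample in progress_samples_mm:
        filtered += alpha * (sample - filtered)
    return filtered < poor_below_mm
```

Consistently healthy arc progress keeps the filter near its input and the flag false; a run of near-zero-progress arcs decays the filter below 50 mm and trips the toggle.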


Other robot details and features combinable with those described herein may be found in the following U.S. patent applications, entitled “AUTONOMOUS COVERAGE ROBOT,” filed on May 9, 2008, having assigned Ser. No. 12/118,219, and published as U.S. Pat. App. Pub. 2008/0276408 A1; and “AUTONOMOUS COVERAGE ROBOT SENSING,” filed on May 9, 2008, having assigned Ser. No. 12/118,250, and published as U.S. Pat. App. Pub. 2008/0281470 A1; the entire contents of the aforementioned applications are hereby incorporated by reference.


A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims.

Claims
  • 1. An autonomous coverage robot comprising: a chassis having forward and rearward portions, when viewed from above an outer peripheral shape of the forward portion defining a substantially rectangular shape and an outer peripheral shape of the rearward portion defining an arcuate shape; a drive system carried by the chassis configured to maneuver the robot over a cleaning surface; right and left differentially driven drive wheels; a cleaning assembly mounted on the forward portion of the chassis; and bump sensors disposed at the forward corners of the chassis, with at least one bump sensor disposed on each side of each corner such that bump sensors on a lateral side of the robot detect bumps independently of bump sensors on a front side of the robot, thus allowing the robot to determine a direction and/or location of a collision.
  • 2. The autonomous coverage robot of claim 1, wherein the rearward portion has a semi-circular profile defined by a profile circle that extends into the forward portion and has a center axis.
  • 3. The autonomous coverage robot of claim 2, wherein the right and left drive wheels are positioned on or near the center axis of the profile circle such that the robot can turn in place without catching the rearward portion of the chassis on an obstacle.
  • 4. The autonomous coverage robot of claim 3, wherein sides of the bumper are flat to allow the robot to skim along a wall.
  • 5. The autonomous coverage robot of claim 1, comprising a bumper flexibly coupled to the chassis by one or more elastic elements, wherein the bump sensors are configured to detect movement of the bumper.
  • 6. The autonomous coverage robot of claim 5, wherein the bumper covers a front portion and side portions of the forward portion of the chassis, and the bump sensors are configured to detect movement of the bumper in a direction normal to the front portion and in a direction normal to the side portions.
  • 7. The autonomous coverage robot of claim 1, wherein the robot is configured to assume that an object it has bumped into is a wall and turn to follow the wall; and wherein the robot is configured to, after turning, drive straight along the wall and turn slightly into the wall so as to skim along the wall.
  • 8. The autonomous coverage robot of claim 1 comprising a proximity sensor, wherein the robot is configured to semi-passively wall follow including: detecting an object, assumed to be a wall, by either the bump sensor or the proximity sensor, turning to align the dominant side of the robot with the assumed wall, driving along the wall while turning slightly into the wall so as to skim along the wall, maintaining contact with the wall by sensing contact with the wall via the bump sensor or the proximity sensor.
  • 9. The autonomous coverage robot of claim 1, wherein a power source, preferably a battery, is positioned between the right and left drive wheels to place the center of gravity of the robot forward of a first transverse axis defined by the right and left drive wheels.
  • 10. The autonomous coverage robot of claim 1, wherein the robot has a navigational sensor system in communication with the controller that allows the robot to be aware of its surroundings and environment and react in prescribed manners or behaviors according to its sensed perception of its surroundings and environment, the navigational sensor system including one or more cliff sensors, a stasis detector, a proximity sensor, the bump sensors, and/or an omni-directional receiver.
  • 11. The autonomous coverage robot of claim 10, wherein the stasis detector indicates whether the robot is moving or stationary.
  • 12. The autonomous coverage robot of claim 10, wherein the stasis detector includes a stasis wheel with a magnet either embedded in or disposed on the wheel; and the stasis wheel acts as a third wheel.
  • 13. The autonomous coverage robot of claim 1, comprising at least one forward cliff sensor carried by the forward portion and arranged substantially near a front edge of the forward portion, the at least one forward cliff sensor configured to detect a potential cliff forward of the robot, wherein the drive system is configured to alter a drive direction in response to a signal received from the forward cliff sensor; and at least one rearward cliff sensor carried by a rearward portion and arranged substantially near the rear edge of the rearward portion, the at least one rearward cliff sensor responsive to a potential cliff rearward of the robot, the drive system configured to alter a drive direction in response to a signal received from the rearward cliff sensor.
  • 14. The autonomous coverage robot of claim 1, wherein the drive system is carried by the rearward portion of the chassis and includes right and left drive wheels differentially driven by corresponding right and left motors, and the autonomous coverage robot further includes a controller in communication with the drive system, the controller configured to maneuver the robot to pivot in place; and an accelerometer in communication with the controller, the controller controlling the drive system in response to a signal received from the accelerometer.
  • 15. The autonomous coverage robot of claim 14, wherein the cleaning assembly comprises a roller brush rotatably mounted substantially near a front edge of the chassis.
  • 16. The autonomous coverage robot of claim 14, wherein the controller is configured to alter a drive direction of the robot in response to a signal received from the accelerometer indicating an abrupt speed change.
  • 17. The autonomous coverage robot of claim 14, wherein the controller is configured to alter a drive direction of the robot in response to a signal received from the accelerometer indicating stasis of the robot.
  • 18. The autonomous coverage robot of claim 14, wherein the controller is configured to reduce a drive speed of the robot in response to a signal received from the accelerometer indicating a maximum speed.
  • 19. The autonomous coverage robot of claim 18, wherein the maximum speed is between about 200 mm/s and about 400 mm/s.
  • 20. The autonomous coverage robot of claim 1, wherein the bump sensor comprises four detectors arranged in a rectangular configuration with respect to each other.
  • 21. The autonomous coverage robot of claim 1, further comprising a bumper guide configured to confine body movements to along two directions.
  • 22. The autonomous coverage robot of claim 1, wherein the right and left drive wheels are disposed less than 9 cm rearward of the cleaning assembly.
  • 23. The autonomous coverage robot of claim 1, comprising at least one proximity sensor carried by a dominant side of the robot, the at least one proximity sensor responsive to an obstacle substantially near the chassis, the controller configured to alter a drive direction in response to a signal received from the at least one proximity sensor.
  • 24. The autonomous coverage robot of claim 1, comprising right and left front cliff sensors disposed at the respective right and left corners of the forward portion of the chassis, the right and left front cliff sensors responsive to a potential cliff forward of the robot, the drive system configured to alter a drive direction in response to a signal received from the right and left front cliff sensors indicating a potential cliff.
  • 25. The autonomous coverage robot of claim 1, wherein the cleaning assembly comprises a roller brush rotatably mounted within the chassis along a front edge of the chassis, and a side brush mounted adjacent to the front edge so as to extend beyond the chassis.
  • 26. The autonomous coverage robot of claim 1, wherein the cleaning assembly comprises a roller brush rotatably mounted within the chassis along a front edge of the chassis, and a side brush mounted adjacent to the front edge so as to extend beyond the chassis.
  • 27. The autonomous coverage robot of claim 1, comprising: a proximity sensor along a lateral side of the chassis, wherein the autonomous robot performs a first obstacle following based on the detection of a wall parallel to the lateral side of the chassis by the proximity sensor, and a second obstacle following based on detections of an obstacle by the bump sensor, in which the robot (i) in combination backs up and turns in place, to deactivate the bump sensor and to position to perform an obstacle following arc, and (ii) arc follows about the obstacle so that the lateral side of the chassis is arced about the obstacle.
CROSS REFERENCE TO RELATED APPLICATIONS

This U.S. patent application is a continuation, and claims priority under 35 U.S.C. §120, from U.S. patent application Ser. No. 13/245,118, filed Sep. 26, 2011, entitled Compact Autonomous Coverage Robot, which is a continuation of U.S. patent application Ser. No. 12/118,117, filed May 9, 2008 (now U.S. Pat. No. 8,239,992), which claims priority under 35 U.S.C. §119(e) to U.S. Provisional Application 60/938,699, filed on May 17, 2007 and U.S. Provisional Application 60/917,065, filed on May 9, 2007. The disclosure of each of these prior applications is considered to be part of the disclosure of this application and each of these prior applications is hereby incorporated by reference in its entirety. The contents of U.S. Pre-grant Publications 2003/0192144, 2006/0200281, and 2007/0016328, and also U.S. Pat. Nos. 6,748,297 and 6,883,201 are hereby incorporated herein by reference in their entireties.

US Referenced Citations (867)
Number Name Date Kind
1755054 Darst Apr 1930 A
1780221 Buchmann Nov 1930 A
1970302 Gerhardt Aug 1934 A
2136324 John Nov 1938 A
2302111 Dow et al. Nov 1942 A
2353621 Sav et al. Jul 1944 A
2770825 Pullen Nov 1956 A
3119369 Harland et al. Jan 1964 A
3166138 Dunn Jan 1965 A
3333564 Waters Aug 1967 A
3375375 Robert et al. Mar 1968 A
3381652 Schaefer et al. May 1968 A
3457575 Bienek Jul 1969 A
3550714 Bellinger Dec 1970 A
3569727 Aggarwal et al. Mar 1971 A
3674316 De Brey Jul 1972 A
3678882 Kinsella Jul 1972 A
3744586 Leinauer Jul 1973 A
3756667 Bombardier et al. Sep 1973 A
3809004 Leonheart May 1974 A
3816004 Bignardi Jun 1974 A
3845831 James Nov 1974 A
RE28268 Autrand Dec 1974 E
3853086 Asplund Dec 1974 A
3863285 Hukuba Feb 1975 A
3888181 Kups Jun 1975 A
3937174 Haaga Feb 1976 A
3952361 Wilkins Apr 1976 A
3989311 Debrey Nov 1976 A
3989931 Phillips Nov 1976 A
4004313 Capra Jan 1977 A
4012681 Finger et al. Mar 1977 A
4070170 Leinfelt Jan 1978 A
4099284 Shinozaki et al. Jul 1978 A
4119900 Kremnitz Oct 1978 A
4175589 Nakamura et al. Nov 1979 A
4175892 De brey Nov 1979 A
4196727 Verkaart et al. Apr 1980 A
4198727 Farmer Apr 1980 A
4199838 Simonsson Apr 1980 A
4209254 Reymond et al. Jun 1980 A
D258901 Keyworth Apr 1981 S
4297578 Carter Oct 1981 A
4306329 Yokoi Dec 1981 A
4309758 Halsall et al. Jan 1982 A
4328545 Halsall et al. May 1982 A
4367403 Miller Jan 1983 A
4369543 Chen et al. Jan 1983 A
4401909 Gorsek Aug 1983 A
4416033 Specht Nov 1983 A
4445245 Lu May 1984 A
4465370 Yuasa et al. Aug 1984 A
4477998 You Oct 1984 A
4481692 Kurz Nov 1984 A
4482960 Pryor Nov 1984 A
4492058 Goldfarb et al. Jan 1985 A
4513469 Godfrey et al. Apr 1985 A
D278732 Ohkado May 1985 S
4518437 Sommer May 1985 A
4534637 Suzuki et al. Aug 1985 A
4556313 Miller et al. Dec 1985 A
4575211 Matsumura et al. Mar 1986 A
4580311 Kurz Apr 1986 A
4601082 Kurz Jul 1986 A
4618213 Chen Oct 1986 A
4620285 Perdue Oct 1986 A
4624026 Olson et al. Nov 1986 A
4626995 Lofgren et al. Dec 1986 A
4628454 Ito Dec 1986 A
4638445 Mattaboni Jan 1987 A
4644156 Takahashi et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4652917 Miller Mar 1987 A
4654492 Koerner et al. Mar 1987 A
4654924 Getz et al. Apr 1987 A
4660969 Sorimachi et al. Apr 1987 A
4662854 Fang May 1987 A
4674048 Okumura Jun 1987 A
4679152 Perdue Jul 1987 A
4680827 Hummel Jul 1987 A
4696074 Cavalli Sep 1987 A
D292223 Trumbull Oct 1987 S
4700301 Dyke Oct 1987 A
4700427 Knepper Oct 1987 A
4703820 Reinaud Nov 1987 A
4710020 Maddox et al. Dec 1987 A
4716621 Zoni Jan 1988 A
4728801 O'Connor Mar 1988 A
4733343 Yoneda et al. Mar 1988 A
4733430 Westergren Mar 1988 A
4733431 Martin Mar 1988 A
4735136 Lee et al. Apr 1988 A
4735138 Gawler et al. Apr 1988 A
4748336 Fujie et al. May 1988 A
4748833 Nagasawa Jun 1988 A
4756049 Uehara Jul 1988 A
4767213 Hummel Aug 1988 A
4769700 Pryor Sep 1988 A
4777416 George et al. Oct 1988 A
D298766 Tanno et al. Nov 1988 S
4782550 Jacobs Nov 1988 A
4796198 Boultinghouse et al. Jan 1989 A
4806751 Abe et al. Feb 1989 A
4811228 Hyyppa Mar 1989 A
4813906 Matsuyama et al. Mar 1989 A
4815157 Tsuchiya Mar 1989 A
4817000 Eberhardt Mar 1989 A
4818875 Weiner Apr 1989 A
4829442 Kadonoff et al. May 1989 A
4829626 Harkonen et al. May 1989 A
4832098 Palinkas et al. May 1989 A
4851661 Everett Jul 1989 A
4854000 Takimoto Aug 1989 A
4854006 Nishimura et al. Aug 1989 A
4855915 Dallaire Aug 1989 A
4857912 Everett et al. Aug 1989 A
4858132 Holmquist Aug 1989 A
4867570 Sorimachi et al. Sep 1989 A
4880474 Koharagi et al. Nov 1989 A
4887415 Martin Dec 1989 A
4891762 Chotiros Jan 1990 A
4893025 Lee Jan 1990 A
4901394 Nakamura et al. Feb 1990 A
4905151 Weiman et al. Feb 1990 A
4912643 Beirne Mar 1990 A
4918441 Bohman Apr 1990 A
4919224 Shyu et al. Apr 1990 A
4919489 Kopsco Apr 1990 A
4920060 Parrent et al. Apr 1990 A
4920605 Takashima May 1990 A
4933864 Evans et al. Jun 1990 A
4937912 Kurz Jul 1990 A
4953253 Fukuda et al. Sep 1990 A
4954962 Evans et al. Sep 1990 A
4955714 Stotler et al. Sep 1990 A
4956891 Wulff Sep 1990 A
4961303 McCarty et al. Oct 1990 A
4961304 Ovsborn et al. Oct 1990 A
4962453 Pong et al. Oct 1990 A
4971591 Raviv et al. Nov 1990 A
4973912 Kaminski et al. Nov 1990 A
4974283 Holsten et al. Dec 1990 A
4977618 Allen Dec 1990 A
4977639 Takahashi et al. Dec 1990 A
4986663 Cecchi et al. Jan 1991 A
5001635 Yasutomi et al. Mar 1991 A
5002145 Wakaumi et al. Mar 1991 A
5012886 Jonas et al. May 1991 A
5018240 Holman May 1991 A
5020186 Lessig et al. Jun 1991 A
5022812 Coughlan et al. Jun 1991 A
5023788 Kitazume et al. Jun 1991 A
5024529 Svetkoff et al. Jun 1991 A
D318500 Malewicki et al. Jul 1991 S
5032775 Mizuno et al. Jul 1991 A
5033151 Kraft et al. Jul 1991 A
5033291 Podoloff et al. Jul 1991 A
5040116 Evans et al. Aug 1991 A
5045769 Everett Sep 1991 A
5049802 Mintus et al. Sep 1991 A
5051906 Evans et al. Sep 1991 A
5062819 Mallory Nov 1991 A
5070567 Holland Dec 1991 A
5084934 Lessig et al. Feb 1992 A
5086535 Grossmeyer et al. Feb 1992 A
5090321 Abouav Feb 1992 A
5093955 Blehert et al. Mar 1992 A
5094311 Akeel Mar 1992 A
5105502 Takashima Apr 1992 A
5105550 Shenoha Apr 1992 A
5109566 Kobayashi et al. May 1992 A
5115538 Cochran et al. May 1992 A
5127128 Lee Jul 1992 A
5136675 Hodson Aug 1992 A
5136750 Takashima et al. Aug 1992 A
5142985 Stearns et al. Sep 1992 A
5144471 Takanashi et al. Sep 1992 A
5144714 Mori et al. Sep 1992 A
5144715 Matsuyo et al. Sep 1992 A
5152028 Hirano Oct 1992 A
5152202 Strauss Oct 1992 A
5155684 Burke et al. Oct 1992 A
5163202 Kawakami et al. Nov 1992 A
5163320 Goshima et al. Nov 1992 A
5164579 Pryor et al. Nov 1992 A
5165064 Mattaboni Nov 1992 A
5170352 McTamaney et al. Dec 1992 A
5173881 Sindle Dec 1992 A
5182833 Yamaguchi et al. Feb 1993 A
5202742 Frank et al. Apr 1993 A
5204814 Noonan et al. Apr 1993 A
5206500 Decker et al. Apr 1993 A
5208521 Aoyama May 1993 A
5216777 Moro et al. Jun 1993 A
5227985 DeMenthon Jul 1993 A
5233682 Abe et al. Aug 1993 A
5239720 Wood et al. Aug 1993 A
5251358 Moro et al. Oct 1993 A
5261139 Lewis Nov 1993 A
5276618 Everett Jan 1994 A
5276939 Uenishi Jan 1994 A
5277064 Knigga et al. Jan 1994 A
5279672 Betker et al. Jan 1994 A
5284452 Corona Feb 1994 A
5284522 Kobayashi et al. Feb 1994 A
5293955 Lee Mar 1994 A
D345707 Alister Apr 1994 S
5303448 Hennessey et al. Apr 1994 A
5307273 Oh et al. Apr 1994 A
5309592 Hiratsuka May 1994 A
5310379 Hippely et al. May 1994 A
5315227 Pierson et al. May 1994 A
5319827 Yang Jun 1994 A
5319828 Waldhauser et al. Jun 1994 A
5321614 Ashworth Jun 1994 A
5323483 Baeg Jun 1994 A
5324948 Dudar et al. Jun 1994 A
5341186 Kato Aug 1994 A
5341540 Soupert et al. Aug 1994 A
5341549 Wirtz et al. Aug 1994 A
5345649 Whitlow Sep 1994 A
5353224 Lee et al. Oct 1994 A
5363305 Cox et al. Nov 1994 A
5363935 Schempf et al. Nov 1994 A
5369347 Yoo Nov 1994 A
5369838 Wood et al. Dec 1994 A
5386862 Glover et al. Feb 1995 A
5399951 Lavallee et al. Mar 1995 A
5400244 Watanabe et al. Mar 1995 A
5404612 Ishikawa Apr 1995 A
5410479 Coker Apr 1995 A
5435405 Schempf et al. Jul 1995 A
5440216 Kim Aug 1995 A
5442358 Keeler et al. Aug 1995 A
5444965 Colens Aug 1995 A
5446356 Kim Aug 1995 A
5446445 Bloomfield et al. Aug 1995 A
5451135 Schempf et al. Sep 1995 A
5454129 Kell Oct 1995 A
5455982 Armstrong et al. Oct 1995 A
5465525 Mifune et al. Nov 1995 A
5465619 Sotack et al. Nov 1995 A
5467273 Faibish et al. Nov 1995 A
5471560 Allard et al. Nov 1995 A
5491670 Weber Feb 1996 A
5497529 Boesi Mar 1996 A
5498948 Bruni et al. Mar 1996 A
5502638 Takenaka Mar 1996 A
5505072 Oreper Apr 1996 A
5507067 Hoekstra et al. Apr 1996 A
5510893 Suzuki Apr 1996 A
5511147 Abdel Apr 1996 A
5515572 Hoekstra et al. May 1996 A
5534762 Kim Jul 1996 A
5537017 Feiten et al. Jul 1996 A
5539953 Kurz Jul 1996 A
5542146 Hoekstra et al. Aug 1996 A
5542148 Young Aug 1996 A
5546631 Chambon Aug 1996 A
5548511 Bancroft Aug 1996 A
5551525 Pack et al. Sep 1996 A
5553349 Kilstrom et al. Sep 1996 A
5555587 Guha Sep 1996 A
5560077 Crotchett Oct 1996 A
5568589 Hwang Oct 1996 A
D375592 Ljunggren Nov 1996 S
5608306 Rybeck et al. Mar 1997 A
5608894 Kawakami et al. Mar 1997 A
5608944 Gordon Mar 1997 A
5610488 Miyazawa Mar 1997 A
5611106 Wulff Mar 1997 A
5611108 Knowlton et al. Mar 1997 A
5613261 Kawakami et al. Mar 1997 A
5613269 Miwa Mar 1997 A
5621291 Lee Apr 1997 A
5622236 Azumi et al. Apr 1997 A
5634237 Paranjpe Jun 1997 A
5634239 Tuvin et al. Jun 1997 A
5636402 Kubo et al. Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5646494 Han Jul 1997 A
5647554 Ikegami et al. Jul 1997 A
5650702 Azumi Jul 1997 A
5652489 Kawakami Jul 1997 A
5682313 Edlund et al. Oct 1997 A
5682839 Grimsley et al. Nov 1997 A
5696675 Nakamura et al. Dec 1997 A
5698861 Oh Dec 1997 A
5709007 Chiang Jan 1998 A
5710506 Broell et al. Jan 1998 A
5714119 Kawagoe et al. Feb 1998 A
5717169 Liang et al. Feb 1998 A
5717484 Hamaguchi et al. Feb 1998 A
5720077 Nakamura et al. Feb 1998 A
5732401 Conway Mar 1998 A
5735959 Kubo et al. Apr 1998 A
5745235 Vercammen et al. Apr 1998 A
5752871 Tsuzuki May 1998 A
5756904 Oreper et al. May 1998 A
5761762 Kubo Jun 1998 A
5764888 Bolan et al. Jun 1998 A
5767437 Rogers Jun 1998 A
5767960 Orman Jun 1998 A
5777596 Herbert Jul 1998 A
5778486 Kim Jul 1998 A
5781697 Jeong Jul 1998 A
5781960 Kilstrom et al. Jul 1998 A
5786602 Pryor et al. Jul 1998 A
5787545 Colens Aug 1998 A
5793900 Nourbakhsh et al. Aug 1998 A
5794297 Muta Aug 1998 A
5812267 Everett et al. Sep 1998 A
5814808 Takada et al. Sep 1998 A
5815880 Nakanishi Oct 1998 A
5815884 Imamura et al. Oct 1998 A
5819008 Asama et al. Oct 1998 A
5819360 Fujii Oct 1998 A
5819936 Saveliev et al. Oct 1998 A
5820821 Kawagoe et al. Oct 1998 A
5821730 Drapkin Oct 1998 A
5825981 Matsuda Oct 1998 A
5828770 Leis et al. Oct 1998 A
5831597 West et al. Nov 1998 A
5839156 Park et al. Nov 1998 A
5839532 Yoshiji et al. Nov 1998 A
5841259 Kim et al. Nov 1998 A
5867800 Leif Feb 1999 A
5869910 Colens Feb 1999 A
5896611 Haaga Apr 1999 A
5903124 Kawakami May 1999 A
5905209 Oreper May 1999 A
5907886 Buscher Jun 1999 A
5910700 Crotzer Jun 1999 A
5911260 Suzuki Jun 1999 A
5916008 Wong Jun 1999 A
5924167 Wright et al. Jul 1999 A
5926909 McGee Jul 1999 A
5933102 Miller et al. Aug 1999 A
5933913 Wright et al. Aug 1999 A
5935179 Kleiner et al. Aug 1999 A
5940346 Sadowsky et al. Aug 1999 A
5940927 Haegermarck et al. Aug 1999 A
5940930 Oh et al. Aug 1999 A
5942869 Katou et al. Aug 1999 A
5943730 Boomgaarden Aug 1999 A
5943733 Tagliaferri Aug 1999 A
5947225 Kawakami et al. Sep 1999 A
5950408 Schaedler Sep 1999 A
5959423 Nakanishi et al. Sep 1999 A
5968281 Wright et al. Oct 1999 A
5974348 Rocks Oct 1999 A
5974365 Mitchell Oct 1999 A
5983448 Wright et al. Nov 1999 A
5984880 Lander et al. Nov 1999 A
5987383 Keller et al. Nov 1999 A
5989700 Krivopal Nov 1999 A
5991951 Kubo et al. Nov 1999 A
5995883 Nishikado Nov 1999 A
5995884 Allen et al. Nov 1999 A
5996167 Close Dec 1999 A
5998953 Nakamura et al. Dec 1999 A
5998971 Corbridge Dec 1999 A
6000088 Wright et al. Dec 1999 A
6009358 Angott et al. Dec 1999 A
6021545 Delgado et al. Feb 2000 A
6023813 Thatcher et al. Feb 2000 A
6023814 Imamura Feb 2000 A
6025687 Himeda et al. Feb 2000 A
6026539 Mouw et al. Feb 2000 A
6030464 Azevedo Feb 2000 A
6030465 Marcussen et al. Feb 2000 A
6032542 Warnick et al. Mar 2000 A
6036572 Sze Mar 2000 A
6038501 Kawakami Mar 2000 A
6040669 Hog Mar 2000 A
6041471 Charky et al. Mar 2000 A
6041472 Kasen et al. Mar 2000 A
6046800 Ohtomo et al. Apr 2000 A
6049620 Dickinson et al. Apr 2000 A
6052821 Chouly et al. Apr 2000 A
6055042 Sarangapani Apr 2000 A
6055702 Imamura et al. May 2000 A
6061868 Moritsch et al. May 2000 A
6065182 Wright et al. May 2000 A
6073432 Schaedler Jun 2000 A
6076025 Ueno et al. Jun 2000 A
6076026 Jambhekar et al. Jun 2000 A
6076226 Reed Jun 2000 A
6076227 Schallig et al. Jun 2000 A
6081257 Zeller Jun 2000 A
6088020 Mor Jul 2000 A
6094775 Behmer Aug 2000 A
6099091 Campbell Aug 2000 A
6101670 Song Aug 2000 A
6101671 Wright et al. Aug 2000 A
6108031 King et al. Aug 2000 A
6108067 Okamoto Aug 2000 A
6108076 Hanseder Aug 2000 A
6108269 Kabel Aug 2000 A
6108597 Kirchner et al. Aug 2000 A
6112143 Allen et al. Aug 2000 A
6112996 Matsuo Sep 2000 A
6119057 Kawagoe Sep 2000 A
6122798 Kobayashi et al. Sep 2000 A
6124694 Bancroft et al. Sep 2000 A
6125498 Roberts et al. Oct 2000 A
6131237 Kasper et al. Oct 2000 A
6138063 Himeda Oct 2000 A
6142252 Kinto et al. Nov 2000 A
6146278 Kobayashi Nov 2000 A
6154279 Thayer Nov 2000 A
6154694 Aoki et al. Nov 2000 A
6160479 Ahlen et al. Dec 2000 A
6167332 Kurtzberg et al. Dec 2000 A
6167587 Kasper et al. Jan 2001 B1
6192548 Huffman Feb 2001 B1
6216307 Kaleta et al. Apr 2001 B1
6220865 Macri et al. Apr 2001 B1
6226830 Hendriks et al. May 2001 B1
6230362 Kasper et al. May 2001 B1
6237741 Guidetti May 2001 B1
6240342 Fiegert et al. May 2001 B1
6243913 Frank et al. Jun 2001 B1
6255793 Peless et al. Jul 2001 B1
6259979 Holmquist Jul 2001 B1
6261379 Conrad et al. Jul 2001 B1
6263539 Baig Jul 2001 B1
6263989 Won Jul 2001 B1
6272936 Oreper et al. Aug 2001 B1
6276478 Hopkins et al. Aug 2001 B1
6278918 Dickson et al. Aug 2001 B1
6282526 Ganesh Aug 2001 B1
6283034 Miles Sep 2001 B1
6285778 Nakajima et al. Sep 2001 B1
6285930 Dickson et al. Sep 2001 B1
6300737 Bergvall et al. Oct 2001 B1
6321337 Reshef et al. Nov 2001 B1
6321515 Colens Nov 2001 B1
6323570 Nishimura et al. Nov 2001 B1
6324714 Walz et al. Dec 2001 B1
6327741 Reed Dec 2001 B1
6332400 Meyer Dec 2001 B1
6339735 Peless et al. Jan 2002 B1
6362875 Burkley Mar 2002 B1
6370453 Sommer Apr 2002 B2
6374155 Wallach et al. Apr 2002 B1
6374157 Takamura Apr 2002 B1
6381802 Park May 2002 B2
6385515 Dickson et al. May 2002 B1
6388013 Saraf et al. May 2002 B1
6389329 Colens May 2002 B1
6400048 Nishimura et al. Jun 2002 B1
6401294 Kasper Jun 2002 B2
6408226 Byrne et al. Jun 2002 B1
6412141 Kasper et al. Jul 2002 B2
6415203 Inoue et al. Jul 2002 B1
6421870 Basham et al. Jul 2002 B1
6427285 Legatt et al. Aug 2002 B1
6430471 Kintou et al. Aug 2002 B1
6431296 Won Aug 2002 B1
6437227 Theimer Aug 2002 B1
6437465 Nishimura et al. Aug 2002 B1
6438456 Feddema et al. Aug 2002 B1
6438793 Miner et al. Aug 2002 B1
6442476 Poropat Aug 2002 B1
6443509 Levin et al. Sep 2002 B1
6444003 Sutcliffe Sep 2002 B1
6446302 Kasper et al. Sep 2002 B1
6454036 Airey et al. Sep 2002 B1
D464091 Christianson Oct 2002 S
6457206 Judson Oct 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6463368 Feiten et al. Oct 2002 B1
6465982 Bergvall et al. Oct 2002 B1
6473167 Odell Oct 2002 B1
6480762 Uchikubo et al. Nov 2002 B1
6481515 Kirkpatrick et al. Nov 2002 B1
6490539 Dickson et al. Dec 2002 B1
6491127 Holmberg et al. Dec 2002 B1
6493612 Bisset et al. Dec 2002 B1
6493613 Peless et al. Dec 2002 B2
6496754 Song et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6502657 Kerrebrock et al. Jan 2003 B2
6504610 Bauer et al. Jan 2003 B1
6507773 Parker et al. Jan 2003 B2
6525509 Petersson et al. Feb 2003 B1
D471243 Cioffi et al. Mar 2003 S
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6540607 Mokris et al. Apr 2003 B2
6548982 Papanikolopoulos et al. Apr 2003 B1
6553612 Dyson et al. Apr 2003 B1
6556722 Russell et al. Apr 2003 B1
6556892 Kuroki et al. Apr 2003 B2
6557104 Vu et al. Apr 2003 B2
D474312 Stephens et al. May 2003 S
6563130 Dworkowski et al. May 2003 B2
6571415 Gerber et al. Jun 2003 B2
6571422 Gordon et al. Jun 2003 B1
6572711 Sclafani et al. Jun 2003 B2
6574536 Kawagoe et al. Jun 2003 B1
6580246 Jacobs Jun 2003 B2
6584376 Van Kommer Jun 2003 B1
6586908 Petersson et al. Jul 2003 B2
6587573 Stam et al. Jul 2003 B1
6590222 Bisset et al. Jul 2003 B1
6594551 McKinney et al. Jul 2003 B2
6594844 Jones Jul 2003 B2
D478884 Slipy et al. Aug 2003 S
6601265 Burlington Aug 2003 B1
6604021 Imai et al. Aug 2003 B2
6604022 Parker et al. Aug 2003 B2
6611120 Song et al. Aug 2003 B2
6611734 Parker et al. Aug 2003 B2
6611738 Ruffner Aug 2003 B2
6615108 Peless et al. Sep 2003 B1
6615885 Ohm Sep 2003 B1
6622465 Jerome et al. Sep 2003 B2
6624744 Wilson et al. Sep 2003 B1
6629028 Paromtchik et al. Sep 2003 B2
6639659 Granger Oct 2003 B2
6658325 Zweig Dec 2003 B2
6658354 Lin Dec 2003 B2
6658692 Lenkiewicz et al. Dec 2003 B2
6658693 Reed Dec 2003 B1
6661239 Ozick Dec 2003 B1
6662889 De Fazio et al. Dec 2003 B2
6668951 Won Dec 2003 B2
6670817 Fournier et al. Dec 2003 B2
6671592 Bisset et al. Dec 2003 B1
6687571 Byrne et al. Feb 2004 B1
6690134 Jones et al. Feb 2004 B1
6690993 Foulke et al. Feb 2004 B2
6697147 Ko et al. Feb 2004 B2
6711280 Stafsudd et al. Mar 2004 B2
6732826 Song et al. May 2004 B2
6737591 Lapstun et al. May 2004 B1
6741054 Koselka et al. May 2004 B2
6741364 Lange et al. May 2004 B2
6748297 Song et al. Jun 2004 B2
6756703 Chang Jun 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6764373 Osawa et al. Jul 2004 B1
6769004 Barrett Jul 2004 B2
6774596 Bisset Aug 2004 B1
6779380 Nieuwkamp Aug 2004 B1
6781338 Jones et al. Aug 2004 B2
6809490 Jones et al. Oct 2004 B2
6810305 Kirkpatrick Oct 2004 B2
6830120 Yashima et al. Dec 2004 B1
6832407 Salem et al. Dec 2004 B2
6836701 McKee Dec 2004 B2
6845297 Allard Jan 2005 B2
6856811 Burdue et al. Feb 2005 B2
6859010 Jeon et al. Feb 2005 B2
6859682 Naka et al. Feb 2005 B2
6860206 Rudakevych et al. Mar 2005 B1
6865447 Lau et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6871115 Huang et al. Mar 2005 B2
6883201 Jones et al. Apr 2005 B2
6886651 Slocum et al. May 2005 B1
6888333 Laby May 2005 B2
6901624 Mori et al. Jun 2005 B2
6906702 Tanaka et al. Jun 2005 B1
6914403 Tsurumi Jul 2005 B2
6917854 Bayer Jul 2005 B2
6925357 Wang et al. Aug 2005 B2
6925679 Wallach et al. Aug 2005 B2
6929548 Wang Aug 2005 B2
D510066 Hickey et al. Sep 2005 S
6938298 Aasen Sep 2005 B2
6940291 Ozick Sep 2005 B1
6941199 Bottomley et al. Sep 2005 B1
6956348 Landry et al. Oct 2005 B2
6957712 Song et al. Oct 2005 B2
6960986 Asama et al. Nov 2005 B2
6965209 Jones et al. Nov 2005 B2
6965211 Tsurumi Nov 2005 B2
6968592 Takeuchi et al. Nov 2005 B2
6971140 Kim Dec 2005 B2
6975246 Trudeau Dec 2005 B1
6980229 Ebersole Dec 2005 B1
6985556 Shanmugavel et al. Jan 2006 B2
6993954 George et al. Feb 2006 B1
7013527 Thomas et al. Mar 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7024280 Parker et al. Apr 2006 B2
7027893 Perry et al. Apr 2006 B2
7030768 Wanie Apr 2006 B2
7031805 Lee et al. Apr 2006 B2
7032469 Bailey Apr 2006 B2
7054716 McKee et al. May 2006 B2
7057120 Ma et al. Jun 2006 B2
7057643 Iida et al. Jun 2006 B2
7065430 Naka et al. Jun 2006 B2
7066291 Martins et al. Jun 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7079923 Abramson et al. Jul 2006 B2
7085623 Siegers Aug 2006 B2
7085624 Aldred et al. Aug 2006 B2
7113847 Chmura et al. Sep 2006 B2
7133746 Abramson et al. Nov 2006 B2
7142198 Lee Nov 2006 B2
7148458 Schell et al. Dec 2006 B2
7155308 Jones Dec 2006 B2
7167775 Abramson et al. Jan 2007 B2
7171285 Kim et al. Jan 2007 B2
7173391 Jones et al. Feb 2007 B2
7174238 Zweig Feb 2007 B1
7188000 Chiappetta et al. Mar 2007 B2
7193384 Norman et al. Mar 2007 B1
7196487 Jones et al. Mar 2007 B2
7201786 Wegelin et al. Apr 2007 B2
7206677 Huldén Apr 2007 B2
7211980 Bruemmer et al. May 2007 B1
7246405 Yan Jul 2007 B2
7275280 Haegermarck et al. Oct 2007 B2
7283892 Boillot et al. Oct 2007 B1
7288912 Landry et al. Oct 2007 B2
7318248 Yan Jan 2008 B1
7320149 Huffman et al. Jan 2008 B1
7324870 Lee Jan 2008 B2
7328196 Peters Feb 2008 B2
7332890 Cohen et al. Feb 2008 B2
7352153 Yan Apr 2008 B2
7359766 Jeon et al. Apr 2008 B2
7363108 Noda et al. Apr 2008 B2
7388879 Sabe et al. Jun 2008 B2
7389166 Harwig et al. Jun 2008 B2
7408157 Yan Aug 2008 B2
7418762 Arai et al. Sep 2008 B2
7430455 Casey et al. Sep 2008 B2
7430462 Chiu et al. Sep 2008 B2
7441298 Svendsen et al. Oct 2008 B2
7444206 Abramson et al. Oct 2008 B2
7448113 Jones et al. Nov 2008 B2
7459871 Landry et al. Dec 2008 B2
7467026 Sakagami et al. Dec 2008 B2
7474941 Kim et al. Jan 2009 B2
7515991 Egawa et al. Apr 2009 B2
7557703 Yamada et al. Jul 2009 B2
7568259 Yan Aug 2009 B2
7571511 Jones et al. Aug 2009 B2
7578020 Jaworski et al. Aug 2009 B2
7600521 Woo Oct 2009 B2
7617557 Reindle Nov 2009 B2
7636982 Jones et al. Dec 2009 B2
7647144 Haegermarck Jan 2010 B2
7650666 Jang Jan 2010 B2
7660650 Kawagoe et al. Feb 2010 B2
7663333 Jones et al. Feb 2010 B2
7693605 Park Apr 2010 B2
7706917 Chiappetta et al. Apr 2010 B1
7765635 Park Aug 2010 B2
7784139 Sawalski et al. Aug 2010 B2
7801645 Taylor et al. Sep 2010 B2
7805220 Taylor et al. Sep 2010 B2
7809944 Kawamoto Oct 2010 B2
7853645 Brown et al. Dec 2010 B2
7920941 Park et al. Apr 2011 B2
7937800 Yan May 2011 B2
7953526 Durkos et al. May 2011 B2
7957836 Myeong et al. Jun 2011 B2
8392021 Konandreas et al. Mar 2013 B2
20010004719 Sommer Jun 2001 A1
20010013929 Torsten Aug 2001 A1
20010020200 Das et al. Sep 2001 A1
20010025183 Shahidi Sep 2001 A1
20010037163 Allard Nov 2001 A1
20010043509 Green et al. Nov 2001 A1
20010045883 Holdaway et al. Nov 2001 A1
20010047231 Peless et al. Nov 2001 A1
20010047895 De Fazio et al. Dec 2001 A1
20020011367 Kolesnik Jan 2002 A1
20020011813 Koselka et al. Jan 2002 A1
20020016649 Jones Feb 2002 A1
20020021219 Edwards Feb 2002 A1
20020027652 Paromtchik et al. Mar 2002 A1
20020036779 Kiyoi et al. Mar 2002 A1
20020081937 Yamada et al. Jun 2002 A1
20020095239 Wallach et al. Jul 2002 A1
20020097400 Jung et al. Jul 2002 A1
20020104963 Mancevski Aug 2002 A1
20020108209 Peterson Aug 2002 A1
20020112742 Bredo et al. Aug 2002 A1
20020113973 Ge Aug 2002 A1
20020116089 Kirkpatrick Aug 2002 A1
20020124343 Reed Sep 2002 A1
20020153185 Song et al. Oct 2002 A1
20020156556 Ruffner Oct 2002 A1
20020159051 Guo Oct 2002 A1
20020166193 Kasper Nov 2002 A1
20020169521 Goodman et al. Nov 2002 A1
20020173877 Zweig Nov 2002 A1
20020189871 Won Dec 2002 A1
20030009259 Hattori et al. Jan 2003 A1
20030019071 Field et al. Jan 2003 A1
20030023356 Keable Jan 2003 A1
20030024986 Mazz et al. Feb 2003 A1
20030025472 Jones et al. Feb 2003 A1
20030028286 Glenn et al. Feb 2003 A1
20030030399 Jacobs Feb 2003 A1
20030058262 Sato et al. Mar 2003 A1
20030060928 Abramson et al. Mar 2003 A1
20030067451 Tagg et al. Apr 2003 A1
20030097875 Lentz et al. May 2003 A1
20030124312 Autumn Jul 2003 A1
20030126352 Barrett Jul 2003 A1
20030137268 Papanikolopoulos et al. Jul 2003 A1
20030146384 Logsdon et al. Aug 2003 A1
20030192144 Song et al. Oct 2003 A1
20030193657 Uomori et al. Oct 2003 A1
20030216834 Allard Nov 2003 A1
20030221114 Hino et al. Nov 2003 A1
20030229421 Chmura et al. Dec 2003 A1
20030229474 Suzuki et al. Dec 2003 A1
20030233171 Heiligensetzer Dec 2003 A1
20030233177 Johnson et al. Dec 2003 A1
20030233870 Mancevski Dec 2003 A1
20030233930 Ozick Dec 2003 A1
20040016077 Song et al. Jan 2004 A1
20040020000 Jones Feb 2004 A1
20040030448 Solomon Feb 2004 A1
20040030449 Solomon Feb 2004 A1
20040030450 Solomon Feb 2004 A1
20040030451 Solomon Feb 2004 A1
20040030570 Solomon Feb 2004 A1
20040030571 Solomon Feb 2004 A1
20040031113 Wosewick et al. Feb 2004 A1
20040049877 Jones et al. Mar 2004 A1
20040055163 McCambridge et al. Mar 2004 A1
20040068351 Solomon Apr 2004 A1
20040068415 Solomon Apr 2004 A1
20040068416 Solomon Apr 2004 A1
20040074038 Im et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040083570 Song et al. May 2004 A1
20040085037 Jones et al. May 2004 A1
20040088079 Lavarec et al. May 2004 A1
20040093122 Galibraith May 2004 A1
20040098167 Yi et al. May 2004 A1
20040111184 Chiappetta et al. Jun 2004 A1
20040113777 Matsuhira et al. Jun 2004 A1
20040117064 McDonald Jun 2004 A1
20040117846 Karaoguz et al. Jun 2004 A1
20040118998 Wingett et al. Jun 2004 A1
20040128028 Miyamoto et al. Jul 2004 A1
20040133316 Dean Jul 2004 A1
20040134336 Solomon Jul 2004 A1
20040134337 Solomon Jul 2004 A1
20040143919 Wilder Jul 2004 A1
20040148419 Chen et al. Jul 2004 A1
20040148731 Damman et al. Aug 2004 A1
20040153212 Profio et al. Aug 2004 A1
20040156541 Jeon et al. Aug 2004 A1
20040158357 Lee et al. Aug 2004 A1
20040181706 Chen et al. Sep 2004 A1
20040187249 Jones et al. Sep 2004 A1
20040187457 Colens Sep 2004 A1
20040196451 Aoyama Oct 2004 A1
20040200505 Taylor et al. Oct 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040210345 Noda et al. Oct 2004 A1
20040210347 Sawada et al. Oct 2004 A1
20040211444 Taylor et al. Oct 2004 A1
20040221790 Sinclair et al. Nov 2004 A1
20040236468 Taylor et al. Nov 2004 A1
20040244138 Taylor et al. Dec 2004 A1
20040255425 Arai et al. Dec 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050010330 Abramson et al. Jan 2005 A1
20050015914 You et al. Jan 2005 A1
20050067994 Jones et al. Mar 2005 A1
20050076466 Yan Apr 2005 A1
20050085947 Aldred et al. Apr 2005 A1
20050137749 Jeon et al. Jun 2005 A1
20050144751 Kegg et al. Jul 2005 A1
20050150074 Diehl et al. Jul 2005 A1
20050154795 Kuz et al. Jul 2005 A1
20050156562 Cohen et al. Jul 2005 A1
20050165508 Kanda et al. Jul 2005 A1
20050166354 Uehigashi Aug 2005 A1
20050166355 Tani Aug 2005 A1
20050172445 Diehl et al. Aug 2005 A1
20050183229 Uehigashi Aug 2005 A1
20050183230 Uehigashi Aug 2005 A1
20050187678 Myeong et al. Aug 2005 A1
20050192707 Park et al. Sep 2005 A1
20050204717 Colens Sep 2005 A1
20050209736 Kawagoe Sep 2005 A1
20050211880 Schell et al. Sep 2005 A1
20050212929 Schell et al. Sep 2005 A1
20050213082 DiBernardo et al. Sep 2005 A1
20050213109 Schell et al. Sep 2005 A1
20050218852 Landry et al. Oct 2005 A1
20050222933 Wesby Oct 2005 A1
20050235451 Yan Oct 2005 A1
20050251292 Casey et al. Nov 2005 A1
20050255425 Pierson Nov 2005 A1
20050258154 Blankenship et al. Nov 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20050288819 de Guzman Dec 2005 A1
20060000050 Cipolla et al. Jan 2006 A1
20060010638 Shimizu et al. Jan 2006 A1
20060020369 Taylor et al. Jan 2006 A1
20060020370 Abramson Jan 2006 A1
20060025134 Cho et al. Feb 2006 A1
20060037170 Shimizu Feb 2006 A1
20060042042 Mertes et al. Mar 2006 A1
20060044546 Lewin et al. Mar 2006 A1
20060060216 Woo Mar 2006 A1
20060061657 Rew et al. Mar 2006 A1
20060087273 Ko et al. Apr 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060100741 Jung May 2006 A1
20060119839 Bertin et al. Jun 2006 A1
20060143295 Costa-Requena et al. Jun 2006 A1
20060146776 Kim Jul 2006 A1
20060190133 Konandreas et al. Aug 2006 A1
20060220900 Ceskutti et al. Oct 2006 A1
20060259194 Chiu Nov 2006 A1
20060259494 Watson et al. Nov 2006 A1
20060293787 Kanda et al. Dec 2006 A1
20070017061 Yan Jan 2007 A1
20070028574 Yan Feb 2007 A1
20070032904 Kawagoe et al. Feb 2007 A1
20070042716 Goodall et al. Feb 2007 A1
20070043459 Abbott et al. Feb 2007 A1
20070061041 Zweig Mar 2007 A1
20070114975 Cohen et al. May 2007 A1
20070136981 Dilger et al. Jun 2007 A1
20070150096 Yeh et al. Jun 2007 A1
20070157415 Lee et al. Jul 2007 A1
20070157420 Lee et al. Jul 2007 A1
20070179670 Chiappetta et al. Aug 2007 A1
20070226949 Hahm et al. Oct 2007 A1
20070234492 Svendsen et al. Oct 2007 A1
20070244610 Ozick et al. Oct 2007 A1
20070250212 Halloran et al. Oct 2007 A1
20070266508 Jones et al. Nov 2007 A1
20080007203 Cohen et al. Jan 2008 A1
20080039974 Sandin et al. Feb 2008 A1
20080052846 Kapoor et al. Mar 2008 A1
20080091304 Ozick et al. Apr 2008 A1
20080184518 Taylor Aug 2008 A1
20080276407 Schnittman et al. Nov 2008 A1
20080281470 Gilbert et al. Nov 2008 A1
20080294288 Yamauchi Nov 2008 A1
20080302586 Yan Dec 2008 A1
20080307590 Jones et al. Dec 2008 A1
20090007366 Svendsen et al. Jan 2009 A1
20090038089 Landry et al. Feb 2009 A1
20090049640 Lee et al. Feb 2009 A1
20090055022 Casey et al. Feb 2009 A1
20090102296 Greene et al. Apr 2009 A1
20090292393 Casey et al. Nov 2009 A1
20100011529 Won et al. Jan 2010 A1
20100049365 Jones et al. Feb 2010 A1
20100063628 Landry et al. Mar 2010 A1
20100107355 Won et al. May 2010 A1
20100257690 Jones et al. Oct 2010 A1
20100257691 Jones et al. Oct 2010 A1
20100263158 Jones et al. Oct 2010 A1
20100268384 Jones et al. Oct 2010 A1
20100312429 Jones et al. Dec 2010 A1
Foreign Referenced Citations (344)
Number Date Country
2003275566 Jun 2004 AU
2128842 Dec 1980 DE
3317376 Nov 1984 DE
3536907 Feb 1989 DE
3404202 Dec 1992 DE
199311014 Oct 1993 DE
4414683 Oct 1995 DE
4338841 Aug 1999 DE
19849978 Feb 2001 DE
10242257 Apr 2003 DE
10357636 Jul 2005 DE
102004041021 Aug 2005 DE
102005046813 Apr 2007 DE
198803389 Dec 1988 DK
265542 May 1988 EP
281085 Sep 1988 EP
307381 Jul 1990 EP
358628 May 1991 EP
437024 Jul 1991 EP
433697 Dec 1992 EP
479273 May 1993 EP
294101 Dec 1993 EP
554978 Mar 1994 EP
615719 Sep 1994 EP
861629 Sep 1998 EP
792726 Jun 1999 EP
930040 Oct 1999 EP
845237 Apr 2000 EP
1018315 Jul 2000 EP
1172719 Jan 2002 EP
1228734 Jun 2003 EP
1331537 Jul 2003 EP
1380246 Mar 2005 EP
1553472 Jul 2005 EP
1642522 Nov 2007 EP
2238196 Nov 2006 ES
2601443 Nov 1991 FR
2828589 Dec 2003 FR
702426 Jan 1954 GB
2128842 Apr 1986 GB
2213047 Aug 1989 GB
2225221 May 1990 GB
2284957 Jun 1995 GB
2267360 Dec 1995 GB
2283838 Dec 1997 GB
2300082 Sep 1999 GB
2404330 Jul 2005 GB
2417354 Feb 2006 GB
53021869 Feb 1978 JP
53110257 Sep 1978 JP
943901 Mar 1979 JP
57014726 Jan 1982 JP
57064217 Apr 1982 JP
59005315 Feb 1984 JP
59094005 May 1984 JP
59099308 Jul 1984 JP
59112311 Jul 1984 JP
59033511 Aug 1984 JP
59120124 Aug 1984 JP
59131668 Sep 1984 JP
59164973 Sep 1984 JP
59184917 Oct 1984 JP
59212924 Dec 1984 JP
59226909 Dec 1984 JP
60089213 Jun 1985 JP
60211510 Oct 1985 JP
60259895 Dec 1985 JP
61097712 May 1986 JP
61023221 Jun 1986 JP
62074018 Apr 1987 JP
62070709 May 1987 JP
62120510 Jul 1987 JP
62154008 Sep 1987 JP
62164431 Oct 1987 JP
62263507 Nov 1987 JP
62263508 Nov 1987 JP
62189057 Dec 1987 JP
63079623 Apr 1988 JP
63158032 Jul 1988 JP
63183032 Jul 1988 JP
63241610 Oct 1988 JP
1162454 Jun 1989 JP
2006312 Jan 1990 JP
2026312 Jun 1990 JP
2283343 Nov 1990 JP
3051023 Mar 1991 JP
3197758 Aug 1991 JP
3201903 Sep 1991 JP
4019586 Mar 1992 JP
4084921 Mar 1992 JP
5023269 Apr 1993 JP
5091604 Apr 1993 JP
5042076 Jun 1993 JP
5046246 Jun 1993 JP
5150827 Jun 1993 JP
5150829 Jun 1993 JP
5046239 Jul 1993 JP
5054620 Jul 1993 JP
5040519 Oct 1993 JP
5257527 Oct 1993 JP
5257533 Oct 1993 JP
5285861 Nov 1993 JP
6003251 Jan 1994 JP
6026312 Apr 1994 JP
6137828 May 1994 JP
6293095 Oct 1994 JP
6327598 Nov 1994 JP
6105781 Dec 1994 JP
7129239 May 1995 JP
7059702 Jun 1995 JP
7222705 Aug 1995 JP
7270518 Oct 1995 JP
7281742 Oct 1995 JP
7281752 Oct 1995 JP
7295636 Nov 1995 JP
7311041 Nov 1995 JP
7313417 Dec 1995 JP
7319542 Dec 1995 JP
8000393 Jan 1996 JP
8016241 Jan 1996 JP
8016776 Feb 1996 JP
8063229 Mar 1996 JP
8083125 Mar 1996 JP
8089449 Apr 1996 JP
8089451 Apr 1996 JP
2520732 May 1996 JP
8123548 May 1996 JP
8152916 Jun 1996 JP
8256960 Oct 1996 JP
8263137 Oct 1996 JP
8286741 Nov 1996 JP
8322774 Dec 1996 JP
8335112 Dec 1996 JP
9043901 Feb 1997 JP
9044240 Feb 1997 JP
9047413 Feb 1997 JP
9066855 Mar 1997 JP
9145309 Jun 1997 JP
9160644 Jun 1997 JP
9179625 Jul 1997 JP
9179685 Jul 1997 JP
9185410 Jul 1997 JP
9192069 Jul 1997 JP
9204223 Aug 1997 JP
9206258 Aug 1997 JP
9233712 Sep 1997 JP
9251318 Sep 1997 JP
9265319 Oct 1997 JP
9269807 Oct 1997 JP
9269810 Oct 1997 JP
02555263 Nov 1997 JP
9319431 Dec 1997 JP
9319432 Dec 1997 JP
9319434 Dec 1997 JP
9325812 Dec 1997 JP
10055215 Feb 1998 JP
10117973 May 1998 JP
10118963 May 1998 JP
10177414 Jun 1998 JP
10214114 Aug 1998 JP
10228316 Aug 1998 JP
10240342 Sep 1998 JP
10260727 Sep 1998 JP
10295595 Nov 1998 JP
11015941 Jan 1999 JP
11065655 Mar 1999 JP
11085269 Mar 1999 JP
11102219 Apr 1999 JP
11102220 Apr 1999 JP
11162454 Jun 1999 JP
11174145 Jul 1999 JP
11175149 Jul 1999 JP
11178764 Jul 1999 JP
11178765 Jul 1999 JP
11212642 Aug 1999 JP
11213157 Aug 1999 JP
11508810 Aug 1999 JP
11248806 Sep 1999 JP
11510935 Sep 1999 JP
11282532 Oct 1999 JP
11282533 Oct 1999 JP
11295412 Oct 1999 JP
11346964 Dec 1999 JP
2000047728 Feb 2000 JP
2000056006 Feb 2000 JP
2000056831 Feb 2000 JP
2000066722 Mar 2000 JP
2000075925 Mar 2000 JP
10240343 May 2000 JP
2000275321 Oct 2000 JP
2000353014 Dec 2000 JP
2001022443 Jan 2001 JP
2001067588 Mar 2001 JP
2001087182 Apr 2001 JP
2001121455 May 2001 JP
2001125641 May 2001 JP
2001216482 Aug 2001 JP
2001258807 Sep 2001 JP
2001265437 Sep 2001 JP
2001274908 Oct 2001 JP
2001289939 Oct 2001 JP
2001306170 Nov 2001 JP
2001320781 Nov 2001 JP
2001525567 Dec 2001 JP
2002204768 Jul 2002 JP
2002204769 Jul 2002 JP
2002247510 Aug 2002 JP
2002532178 Oct 2002 JP
2002323925 Nov 2002 JP
2002333920 Nov 2002 JP
03356170 Dec 2002 JP
2002355206 Dec 2002 JP
2002360471 Dec 2002 JP
2002360479 Dec 2002 JP
2002360482 Dec 2002 JP
2002366227 Dec 2002 JP
2002369778 Dec 2002 JP
2003005296 Jan 2003 JP
2003010076 Jan 2003 JP
2003010088 Jan 2003 JP
2003015740 Jan 2003 JP
2003028528 Jan 2003 JP
03375843 Feb 2003 JP
2003036116 Feb 2003 JP
2003047579 Feb 2003 JP
2003052596 Feb 2003 JP
2003505127 Feb 2003 JP
2003061882 Mar 2003 JP
2003084994 Mar 2003 JP
2003167628 Jun 2003 JP
2003180586 Jul 2003 JP
2003180587 Jul 2003 JP
2003186539 Jul 2003 JP
2003190064 Jul 2003 JP
2003241836 Aug 2003 JP
2003262520 Sep 2003 JP
2003285288 Oct 2003 JP
2003304992 Oct 2003 JP
200330543 Nov 2003 JP
2003310489 Nov 2003 JP
2003310509 Nov 2003 JP
2004016385 Jan 2004 JP
2004123040 Apr 2004 JP
2004148021 May 2004 JP
2004160102 Jun 2004 JP
2004166968 Jun 2004 JP
2004174228 Jun 2004 JP
2004198330 Jul 2004 JP
2004219185 Aug 2004 JP
2005118354 May 2005 JP
2005135400 May 2005 JP
2005211360 Aug 2005 JP
2005224265 Aug 2005 JP
2005230032 Sep 2005 JP
2005245916 Sep 2005 JP
2005296511 Oct 2005 JP
2005346700 Dec 2005 JP
2005352707 Dec 2005 JP
2006031503 Feb 2006 JP
2006043071 Feb 2006 JP
2006155274 Jun 2006 JP
2006164223 Jun 2006 JP
2006227673 Aug 2006 JP
2006247467 Sep 2006 JP
2006260161 Sep 2006 JP
2006293662 Oct 2006 JP
2006296697 Nov 2006 JP
2007034866 Feb 2007 JP
2007213180 Aug 2007 JP
04074285 Apr 2008 JP
2009015611 Jan 2009 JP
2010198552 Sep 2010 JP
WO9526512 Oct 1995 WO
WO9530887 Nov 1995 WO
WO9617258 Feb 1997 WO
WO9715224 May 1997 WO
WO9740734 Nov 1997 WO
WO9741451 Nov 1997 WO
WO9853456 Nov 1998 WO
WO9905580 Feb 1999 WO
WO9916078 Apr 1999 WO
WO9928800 Jun 1999 WO
WO9938056 Jul 1999 WO
WO9938237 Jul 1999 WO
WO9943250 Sep 1999 WO
WO9959042 Nov 1999 WO
WO0004430 Apr 2000 WO
WO0036962 Jun 2000 WO
WO0038026 Jun 2000 WO
WO0038028 Jun 2000 WO
WO0038029 Jun 2000 WO
WO0078410 Dec 2000 WO
WO0106904 Feb 2001 WO
WO0106905 Jun 2001 WO
WO0180703 Nov 2001 WO
WO0191623 Dec 2001 WO
WO0239864 May 2002 WO
WO0239868 May 2002 WO
WO02058527 Aug 2002 WO
WO02062194 Aug 2002 WO
WO02067744 Sep 2002 WO
WO02067745 Sep 2002 WO
WO02067752 Sep 2002 WO
WO02069774 Sep 2002 WO
WO02075350 Sep 2002 WO
WO02075356 Sep 2002 WO
WO02075469 Sep 2002 WO
WO02075470 Sep 2002 WO
WO02081074 Oct 2002 WO
WO03015220 Feb 2003 WO
WO03024292 Mar 2003 WO
WO02069775 May 2003 WO
WO03040546 May 2003 WO
WO03040845 May 2003 WO
WO03040846 May 2003 WO
WO03062850 Jul 2003 WO
WO03062852 Jul 2003 WO
WO02101477 Oct 2003 WO
WO03026474 Nov 2003 WO
WO2004004533 Jan 2004 WO
WO2004004534 Jan 2004 WO
WO2004005956 Jan 2004 WO
WO2004006034 Jan 2004 WO
WO2004025947 May 2004 WO
WO2004043215 May 2004 WO
WO2004058028 Jul 2004 WO
WO2005006935 Jan 2005 WO
WO2005036292 Apr 2005 WO
WO2005055796 Jun 2005 WO
WO2005076545 Aug 2005 WO
WO2005077243 Aug 2005 WO
WO2005077244 Aug 2005 WO
WO2005081074 Sep 2005 WO
WO2005082223 Sep 2005 WO
WO2005083541 Sep 2005 WO
WO2005098475 Oct 2005 WO
WO2005098476 Oct 2005 WO
WO2006046400 May 2006 WO
WO2006068403 Jun 2006 WO
WO2006073248 Jul 2006 WO
WO2007036490 May 2007 WO
WO2007065033 Jun 2007 WO
WO2007137234 Nov 2007 WO
Non-Patent Literature Citations (157)
Entry
Borges et al. “Optimal Mobile Robot Pose Estimation Using Geometrical Maps”, IEEE Transactions on Robotics and Automation, vol. 18, No. 1, pp. 87-94, Feb. 2002.
Braunstingl et al. “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995.
Bulusu, et al. “Self Configuring Localization systems: Design and Experimental Evaluation”, ACM Transactions on Embedded Computing Systems vol. 3 No. 1 pp. 24-60, 2003.
Caccia, et al. “Bottom-Following for Remotely Operated Vehicles”, 5th IFAC conference, Aalborg, Denmark, pp. 245-250, Aug. 1, 2000.
Chae, et al. “StarLITE: A new artificial landmark for the navigation of mobile robots”, http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005.
Chamberlin et al. “Team 1: Robot Locator Beacon System” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 17, 2006.
Champy “Physical management of IT assets in Data Centers using RFID technologies”, RFID 2005 University, Oct. 12-14, 2005.
Chiri “Joystick Control for Tiny OS Robot”, http://www.eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf. 12 pages, Aug. 8, 2002.
Christensen et al. “Theoretical Methods for Planning and Control in Mobile Robotics” 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 21-27, 1997.
Clerentin, et al. “A localization method based on two omnidirectional perception systems cooperation” Proc of IEEE International Conference on Robotics & Automation, San Francisco, CA vol. 2, pp. 1219-1224, Apr. 2000.
Corke “High Performance Visual Servoing for Robot End-Point Control,” SPIE vol. 2056 Intelligent Robots and Computer Vision, 1993.
Cozman et al. “Robot Localization using a Computer Vision Sextant”, IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995.
D'Orazio, et al. “Model based Vision System for mobile robot position estimation”, SPIE vol. 2058 Mobile Robots VIII, pp. 38-49, 1992.
De Bakker, et al. “Smart PSD-array for sheet of light range imaging”, Proc. of SPIE vol. 3965, pp. 1-12, May 15, 2000.
Desaulniers, et al. “An Efficient Algorithm to find a shortest path for a car-like Robot”, IEEE Transactions on robotics and Automation vol. 11 No. 6, pp. 819-828, Dec. 1995.
Dorfmüller-Ulhaas “Optical Tracking From User Motion to 3D Interaction”, http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002.
Dorsch, et al. “Laser Triangulation: Fundamental uncertainty in distance measurement”, Applied Optics, vol. 33 No. 7, pp. 1306-1314, Mar. 1, 1994.
Dudek, et al. “Localizing a Robot with Minimum Travel” Proceedings of the sixth annual ACM-SIAM symposium on Discrete algorithms, vol. 27 No. 2 pp. 583-604, Apr. 1998.
Dulimarta, et al. “Mobile Robot Localization in Indoor Environment”, Pattern Recognition, vol. 30, No. 1, pp. 99-111, 1997.
EBay “Roomba Timer → Timed Cleaning—Floorvac Robotic Vacuum”, Cgi.ebay.com/ws/eBay|SAP|.dll?viewitem&category=43526&item=4375198387&rd=1, 5 pages, Apr. 20, 2005.
Electrolux “Welcome to the Electrolux trilobite” www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F, 2 pages, Mar. 18, 2005.
Eren, et al. “Accuracy in position estimation of mobile robots based on coded infrared signal transmission”, Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995. IMTC/95. pp. 548-551, 1995.
Eren, et al. “Operation of Mobile Robots in a Structured Infrared Environment”, Proceedings. ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 19-21, 1997.
Becker, et al. “Reliable Navigation Using Landmarks” IEEE International Conference on Robotics and Automation, 0-7803-1965-6, pp. 401-406, 1995.
Benayad-Cherif, et al., “Mobile Robot Navigation Sensors” SPIE vol. 1831 Mobile Robots VII, pp. 378-387, 1992.
Facchinetti, Claudio et al. “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation”, ICARCV '94, vol. 3 pp. 1694-1698, 1994.
Betke and Gurvits, “Mobile robot localization using landmarks,” IEEE Transactions on Robotics and Automation, vol. 13, pp. 251-263, 1997.
Facchinetti, Claudio et al. “Self-Positioning Robot Navigation Using Ceiling Images Sequences”, ACCV '95, 5 pages, Dec. 5-8, 1995.
Fairfield, Nathaniel et al. “Mobile Robot Localization with Sparse Landmarks”, SPIE vol. 4573 pp. 148-155, 2002.
Favre-Bulle, Bernard “Efficient tracking of 3D—Robot Position by Dynamic Triangulation”, IEEE Instrumentation and Measurement Technology Conference IMTC 98 Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 18-21, 1998.
Fayman “Exploiting Process Integration and Composition in the context of Active Vision”, IEEE Transactions on Systems, Man, and Cybernetics—Part C: Application and reviews, vol. 29 No. 1, pp. 73-86, Feb. 1999.
Franz, et al. “Biomimetic robot navigation”, Robotics and Autonomous Systems vol. 30 pp. 133-153, 2000.
Friendly Robotics “Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner”, www.friendlyrobotics.com/vac.htm. 5 pages Apr. 20, 2005.
Fuentes, et al. “Mobile Robotics 1994”, University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 7, 1994.
Bison, P et al., “Using a structured beacon for cooperative position estimation” Robotics and Autonomous Systems vol. 29, No. 1, pp. 33-40, Oct. 1999.
Fukuda, et al. “Navigation System based on Ceiling Landmark Recognition for Autonomous mobile robot”, 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems 95. ‘Human Robot Interaction and Cooperative Robots’, Pittsburgh, PA, pp. 1466-1471, Aug. 5-9, 1995.
Gionis “A hand-held optical surface scanner for environmental Modeling and Virtual Reality”, Virtual Reality World, 16 pages 1996.
Goncalves et al. “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005.
Hamamatsu “SI PIN Diode S5980, S5981 S5870—Multi-element photodiodes for surface mounting”, Hamamatsu Photonics, 2 pages, Apr. 2004.
Hammacher Schlemmer “Electrolux Trilobite Robotic Vacuum” www.hammacher.com/publish/71579.asp?promo=xsells, 3 pages, Mar. 18, 2005.
Haralick et al. “Pose Estimation from Corresponding Point Data”, IEEE Transactions on systems, Man, and Cybernetics, vol. 19, No. 6, pp. 1426-1446, Nov. 1989.
Hausler “About the Scaling Behaviour of Optical Range Sensors”, Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 15-17, 1997.
Blaasvaer, et al. “AMOR—An Autonomous Mobile Robot Navigation System”, Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994.
Hoag, et al. “Navigation and Guidance in interstellar space”, ACTA Astronautica vol. 2, pp. 513-533, Feb. 14, 1975.
Huntsberger et al. “CAMPOUT: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration”, IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, vol. 33, No. 5, pp. 550-559, Sep. 2003.
Iirobotics.com “Samsung Unveils Its Multifunction Robot Vacuum”, www.iirobotics.com/webpages/hotstuff.php?ubre=111, 3 pages, Mar. 18, 2005.
Jarosiewicz et al., “Final Report—Lucid,” University of Florida, Department of Electrical and Computer Engineering, EEL 5666—Intelligent Machine Design Laboratory, 50 pages, Aug. 1999.
Jensfelt et al., “Active Global Localization for a mobile robot using multiple hypothesis tracking,” IEEE Transactions on Robotics and Automation, 17(5): 748-760, Oct. 2001.
Kahney, “Robot Vacs are in the House,” Retrieved from the Internet: URL<www.wired.com/news/technology/0,1282,59237,00.html>. 6 pages, Jun. 2003.
Karcher, “Product Manual Download Karch”, available at www.karcher.com, 16 pages, 2004.
Karcher “Karcher RoboCleaner RC 3000,” Retrieved from the Internet: URL<www.robocleaner.de/english/screen3.html>. 4 pages, Dec. 2003.
Karcher USA “RC 3000 Robotics cleaner,” Retrieved from the Internet: URL<www.karcher-usa.com>. 3 pages, Mar. 2005.
Karlsson et al, “Core Technologies for service Robotics,” IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 3, pp. 2979-2984, Sep. 2004.
Karlsson et al., The vSLAM Algorithm for Robust Localization and Mapping, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005.
Kleinberg, The Localization Problem for Mobile Robots, Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994.
Knights, et al., “Localization and Identification of Visual Landmarks,” Journal of Computing Sciences in Colleges, 16(4):312-313, May 2001.
Kolodko et al., “Experimental System for Real-Time Motion Estimation,” Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003.
Komoriya et al., “Planning of Landmark Measurement for the Navigation of a Mobile Robot,” Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC pp. 1476-1481, Jul. 1992.
Krotov et al., “Digital Sextant,” Downloaded from the internet at: http://www.cs.cmu.edu/~epk/, 1 page, 1995.
Krupa et al., “Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing,” IEEE Transactions on Robotics and Automation, 19(5):842-853, Oct. 2003.
Kuhl et al., “Self Localization in Environments using Visual Angles,” VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004.
Kurth, “Range-Only Robot Localization and Slam with Radio”, http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf. 60 pages, May 2004, accessed Jul. 27, 2012.
Lambrinos et al., “A mobile robot employing insect strategies for navigation,” Retrieved from the Internet: URL<http://www8.cs.umu.se/kurser/TDBD17/VT04/dl/Assignment%20Papers/lambrinos-RAS-2000.pdf>. 38 pages, Feb. 1999.
Lang et al., “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994.
Lapin, “Adaptive position estimation for an automated guided vehicle,” SPIE, vol. 1831 Mobile Robots VII, pp. 82-94, 1992.
LaValle et al., “Robot Motion Planning in a Changing, Partially Predictable Environment,” 1994 IEEE International Symposium on Intelligent Control, Columbus, OH, pp. 261-266, Aug. 1994.
Lee et al., “Development of Indoor Navigation system for Humanoid Robot Using Multi-sensors Integration”, ION NTM, San Diego, CA pp. 798-805, Jan. 2007.
Lee et al., “Localization of a Mobile Robot Using the Image of a Moving Object,” IEEE Transactions on Industrial Electronics, 50(3):612-619, Jun. 2003.
Leonard et al., “Mobile Robot Localization by tracking Geometric Beacons,” IEEE Transactions on Robotics and Automation, 7(3):376-382, Jun. 1991.
Li et al., “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar,” Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999.
Lin et al., “Mobile Robot Navigation Using Artificial Landmarks,” Journal of Robotic Systems, 14(2): 93-106, 1997.
Linde, Dissertation—“On Aspects of Indoor Localization,” Available at: https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 2006.
Lumelsky et al., “An Algorithm for Maze Searching with Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA vol. 1, pp. 111-116, 1994.
Luo et al., “Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots,” IEEE, pp. 2359-2364, 2002.
Ma, Thesis—“Documentation On Northstar,” California Institute of Technology, 14 pages, May 2006.
Madsen et al., “Optimal landmark selection for triangulation of robot position,” Robotics and Autonomous Systems, vol. 13 pp. 277-292, 1998.
Matsutek Enterprises Co. Ltd, “Automatic Rechargeable Vacuum Cleaner,” http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10 . . . , Apr. 2007, 3 pages.
McGillem et al., “Infra-red Location System for Navigation and Autonomous Vehicles,” 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 1988.
McGillem, et al. “A Beacon Navigation Method for Autonomous Vehicles,” IEEE Transactions on Vehicular Technology, 38(3):132-139, Aug. 1989.
Miro et al., “Towards Vision Based Navigation in Large Indoor Environments,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 2096-2102, Oct. 2006.
MobileMag, “Samsung Unveils High-tech Robot Vacuum Cleaner,” Retrieved from the Internet: URL<http://www.mobilemag.com/content/100/102/C2261/>. 4 pages, Mar. 2005.
Monteiro et al., “Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters,” Proceedings of the IECON '93., International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 1993.
Moore et al., “A simple Map-based Localization strategy using range measurements,” SPIE, vol. 5804 pp. 612-620, 2005.
Munich et al., “ERSP: A Software Platform and Architecture for the Service Robotics Industry,” Intelligent Robots and Systems, 2005. (IROS 2005), pp. 460-467, Aug. 2005.
Munich et al., “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006.
Nam et al., “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999.
Nitu et al., “Optomechatronic System for Position Detection of a Mobile Mini-Robot,” IEEE Transactions on Industrial Electronics, 52(4):969-973, Aug. 2005.
On Robo, “Robot Reviews Samsung Robot Vacuum (VC-RP30W),” Retrieved from the Internet: URL<www.onrobo.com/reviews/AT_Home/vacuum_cleaners/on00vcrb30rosam/index.htm>. 2 pages, 2005.
InMach “Intelligent Machines,” Retrieved from the Internet: URL<www.inmach.de/inside.html>. 1 page, Nov. 2008.
Innovation First, “2004 EDU Robot Controller Reference Guide,” Retrieved from the Internet: URL<http://www.ifirobotics.com>. 13 pages, Mar. 2004.
Pages et al., “A camera-projector system for robot positioning by visual servoing,” Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW06), 8 pages, Jun. 2006.
Pages et al., “Optimizing Plane-to-Plane Positioning Tasks by Image-Based Visual Servoing and Structured Light,” IEEE Transactions on Robotics, 22(5):1000-1010, Oct. 2006.
Pages et al., “Robust decoupled visual servoing based on structured light,” 2005 IEEE/RSJ, Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005.
Park et al., “A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors,” IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on Neural Networks, Orlando, Florida pp. 2754-2758, Jun./Jul. 1994.
Park et al., “Dynamic Visual Servo Control of Robot Manipulators using Neural Networks,” The Korean Institute of Telematics and Electronics, 29-B(10):771-779, Oct. 1992.
Paromtchik et al., “Optical Guidance System for Multiple mobile Robots,” Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940, May 2001.
Penna et al., “Models for Map Building and Navigation,” IEEE Transactions on Systems, Man, and Cybernetics, 23(5):1276-1301, Sep./Oct. 1993.
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 2001.
Pirjanian et al., “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106, Nov. 1999.
Pirjanian et al., “Distributed Control for a Modular, Reconfigurable Cliff Robot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002.
Pirjanian et al., “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes,” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430, Apr. 1997.
Pirjanian et al., “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination,” Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000.
Pirjanian, “Challenges for Standards for consumer Robotics,” IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 2005.
Pirjanian, “Reliable Reaction,” Proceedings of the 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 158-165, 1996.
Prassler et al., “A Short History of Cleaning Robots,” Autonomous Robots 9, 211-226, 2000, 16 pages.
Remazeilles et al., “Image based robot navigation in 3D environments,” Proc. of SPIE, vol. 6052, pp. 1-14, Dec. 2005.
Rives et al., “Visual servoing based on ellipse features,” SPIE, vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993.
Robotics World, “A Clean Sweep,” 5 pages, Jan. 2001.
Ronnback, “On Methods for Assistive Mobile Robots,” Retrieved from the Internet: URL<http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html>. 218 pages, Jan. 2006.
Roth-Tabak et al., "Environment Model for Mobile Robots Indoor Navigation," SPIE, vol. 1388 Mobile Robots, pp. 453-463, 1990.
Malik et al., “Virtual Prototyping for Conceptual Design of a Tracked Mobile Robot,” Electrical and Computer Engineering, Canadian Conference on, IEEE, PI. pp. 2349-2352, May 2006.
Sahin et al., “Development of a Visual Object Localization Module for Mobile Robots,” 1999 Third European Workshop on Advanced Mobile Robots, (Eurobot '99), pp. 65-72, 1999.
Salomon et al., “Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing,” IEEE Conference on Emerging Technologies and Factory Automation, 2006. (ETFA '06), pp. 629-632, Sep. 2006.
Sato, “Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter,” Proceedings International Conference on Image Processing, vol. 1., Lausanne, Switzerland, pp. 33-36, Sep. 1996.
Schenker et al., “Lightweight rovers for Mars science exploration and sample return,” Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997.
Shimoga et al., “Touch and Force Reflection for Telepresence Surgery,” Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994.
Sim et al., "Learning Visual Landmarks for Pose Estimation," IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 1999.
Sobh et al., “Case Studies in Web-Controlled Devices and Remote Manipulation,” Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 2002.
Special Reports, "Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone," 59(9): 3 pages, 2004, Retrieved from the Internet: URL<http://www.toshiba.co.jp/tech/review/2004/09/59_0>.
Stella et al., “Self-Location for Indoor Navigation of Autonomous Vehicles,” Part of the SPIE conference on Enhanced and Synthetic Vision SPIE vol. 3364, pp. 298-302, 1998.
Summet, “Tracking Locations of Moving Hand-held Displays Using Projected Light,” Pervasive 2005, LNCS 3468, pp. 37-46, 2005.
Svedman et al., “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping,” 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2993-2998, 2005.
Takio et al., “Real-Time Position and Pose Tracking Method of Moving Object Using Visual Servo System,” 47th IEEE International Symposium on Circuits and Systems, pp. 167-170, 2004.
Teller, “Pervasive pose awareness for people, Objects and Robots,” http://www.ai.mit.edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 6 pages, Apr. 2003.
Terada et al., “An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning,” 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 1998.
The Sharper Image, eVac Robotic Vacuum—Product Details, www.sharperiamge.com/us/en/templates/products/pipmoreworklprintable.jhtml, 1 page, Mar. 2005.
TheRobotStore.com, “Friendly Robotics Robotic Vacuum RV400—The Robot Store,” www.therobotstore.com/s.nl/sc.9/category.-109/it.A/id.43/.f, 1 page, Apr. 2005.
TotalVac.com, RC3000 RoboCleaner website, Mar. 2005, 3 pages.
Trebi-Ollennu et al., “Mars Rover Pair Cooperatively Transporting a Long Payload,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 3136-3141, May 2002.
Tribelhorn et al., “Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education,” IEEE, pp. 1393-1399, 2007.
Tse et al., “Design of a Navigation System for a Household Mobile Robot Using Neural Networks,” Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
UAMA (Asia) Industrial Co., Ltd., “RobotFamily,” 2005, 1 page.
UBOT, cleaning robot capable of wiping with a wet duster, Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=23031>. 4 pages, accessed Nov. 2011.
Watanabe et al., “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique,” 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 1990.
Watts, "Robot, boldly goes where no man can," The Times, p. 20, Jan. 1985.
Wijk et al., “Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking,” IEEE Transactions on Robotics and Automation, 16(6):740-752, Dec. 2000.
Andersen et al., “Landmark based navigation strategies”, SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, Jan. 8, 1999.
Ascii, Mar. 25, 2002, http://ascii.jp/elem/000/000/330/330024/ accessed Nov. 1, 2011.
U.S. Appl. No. 60/605,066 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. National Stage Entry Application No. 11/574,290, U.S. publication 2008/0184518, filed Aug. 27, 2004.
U.S. Appl. No. 60/605,181 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. National Stage Entry Application No. 11/574,290, U.S. publication 2008/0184518, filed Aug. 27, 2004.
Kurth, "Range-Only Robot Localization and SLAM with Radio," http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf, 60 pages, May 2004, accessed Jul. 27, 2012.
Electrolux Trilobite, Jan. 12, 2001, http://www.electrolux-ui.com:8080/2002%5C822%5C833102EN.pdf, accessed Jul. 2, 2012, 10 pages.
Florbot GE Plastics, 1989-1990, 2 pages, available at http://www.fuseid.com/, accessed Sep. 27, 2012.
Gregg et al., “Autonomous Lawn Care Applications,” 2006 Florida Conference on Recent Advances in Robotics, Miami, Florida, May 25-26, 2006, Florida International University, 5 pages.
Hitachi ‘Feature’, http://kadenfan.hitachi.co.jp/robot/feature/feature.html, 1 page, Nov. 19, 2008.
Hitachi, http://www.hitachi.co.jp/New/cnews/hi_030529_hi_030529.pdf, 8 pages, May 29, 2003.
Home Robot-UBOT; Microbotusa.com, retrieved from the WWW at www.microrobotusa.com, accessed Dec. 2, 2008.
King and Weiman, "Helpmate™ Autonomous Mobile Robots Navigation Systems," SPIE vol. 1388 Mobile Robots, pp. 190-198, 1990.
Li et al., "Robust Statistical Methods for Securing Wireless Localization in Sensor Networks," Fourth International Symposium on Information Processing in Sensor Networks (IPSN 2005), pp. 91-98, Apr. 15, 2005.
Martishevcky, "Accuracy of point light target coordinate determination by dissectoral tracking system," Proc. SPIE 2591, 25, Oct. 23, 1995.
Maschinemarkt Würzburg 105, Nr. 27, pp. 3, 30, Jul. 5, 1999.
Doi, "Using the symbiosis of human and robots from approaching Research and Development Center," Toshiba Corporation, 16 pages, available at http://warp.ndl.go.jp/info:ndljp/pid/258151/www.soumu.go.jp/joho_tsusin/policyreports/chousa/netrobot/pdf/030214_1_33_a.pdf, Feb. 26, 2003.
Paromtchik, "Toward Optical Guidance of Mobile Robots," Proceedings of the Fourth World Multiconference on Systemics, Cybernetics and Informatics, Orlando, FL, USA, Jul. 23, 2000, vol. IX, pp. 44-49, available at http://emotion.inrialpes.fr/~paromt/infos/papers/paromtchik:asama:sci:2000.ps.gz, accessed Jul. 3, 2012.
Roboking—not just a vacuum cleaner, a robot!, Jan. 21, 2004, infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, accessed Oct. 10, 2011, 7 pages.
Thrun, "Learning Occupancy Grid Maps with Forward Sensor Models," Autonomous Robots 15:111-127, Sep. 2003.
SVET Computers—New Technologies—Robot Vacuum Cleaner, Oct. 1999, available at http://www.sk.rs/1999/10/sknt01.html, accessed Nov. 1, 2011.
Written Opinion of the International Searching Authority, PCT/US2004/001504, Aug. 20, 2012, 9 pages.
Related Publications (1)
US 20130117952 A1, May 2013, US

Provisional Applications (2)
US 60/938,699, May 2007, US
US 60/917,065, May 2007, US

Continuations (2)
Parent: US 13/245,118, Sep. 2011; Child: US 13/719,784, US
Parent: US 12/118,117, May 2008; Child: US 13/245,118, US