Debris sensor for cleaning apparatus

Information

  • Patent Grant
  • Patent Number
    8,598,829
  • Date Filed
    Thursday, June 14, 2012
  • Date Issued
    Tuesday, December 3, 2013
Abstract
A piezoelectric debris sensor and associated signal processor responsive to debris strikes enable an autonomous or non-autonomous cleaning device to detect the presence of debris and in response, to select a behavioral mode, operational condition or pattern of movement, such as spot coverage or the like. Multiple sensor channels (e.g., left and right) can be used to enable the detection or generation of differential left/right debris signals and thereby enable an autonomous device to steer in the direction of debris.
Description
FIELD OF THE INVENTION

The present invention relates generally to cleaning apparatus, and, more particularly, to a debris sensor for sensing instantaneous strikes by debris in a cleaning path of a cleaning apparatus, and for enabling control of an operational mode of the cleaning apparatus. The term “debris” is used herein to collectively denote dirt, dust, and/or other particulates or objects that might be collected by a vacuum cleaner or other cleaning apparatus, whether autonomous or non-autonomous.


BACKGROUND OF THE INVENTION

Debris sensors, including some suitable for cleaning apparatus, are known in the art. Debris sensors can be useful in autonomous cleaning devices like those disclosed in the above-referenced patent applications, and can also be useful in non-autonomous cleaning devices, whether to indicate to the user that a particularly dirty area is being entered, to increase a power setting in response to detection of debris, or to modify some other operational setting.


Examples of debris sensors are disclosed in the following U.S. patents:

  • De Brey, 3,674,316
  • De Brey, 3,989,311
  • De Brey, 4,175,892
  • Kurz, 4,601,082
  • Westergren, 4,733,430
  • Martin, 4,733,431
  • Harkonen, 4,829,626
  • Takashima, 5,105,502
  • Takashima, 5,136,750
  • Kawakami, 5,163,202
  • Yang, 5,319,827
  • Kim, 5,440,216
  • Gordon, 5,608,944
  • Imamura, 5,815,884
  • Imamura, 6,023,814
  • Kasper, 6,446,302
  • Gordon, 6,571,422
Among the examples disclosed therein, many such debris sensors are optical in nature, using a light emitter and detector. In typical designs used in, e.g., a vacuum cleaner, the light transmitter and the light receiver of the optical sensor are positioned such that they are exposed into the suction passage or cleaning pathway through which dust flows. During usage of the vacuum cleaner, therefore, dust particles tend to adhere to the exposed surfaces of the light transmitter and the light receiver, through which light is emitted and detected, eventually degrading the performance of the optical sensor.


Accordingly, it would be desirable to provide a debris sensor that is not subject to degradation by accretion of debris.


In addition, debris sensors typical of the prior art are sensitive to a level of built-up debris in a reservoir or cleaning pathway, but not particularly sensitive to instantaneous debris strikes or encounters.


It would therefore be desirable to provide a debris sensor that is capable of instantaneously sensing and responding to debris strikes, and which is immediately responsive to debris on a floor or other surface to be cleaned, with reduced sensitivity to variations in airflow, instantaneous power, or other operational conditions of the cleaning device.


It would also be useful to provide an autonomous cleaning device having operational modes, patterns of movement or behaviors responsive to detected debris, for example, by steering the device toward “dirtier” areas based on signals generated by a debris sensor.


In addition, it would be desirable to provide a debris sensor that could be used to control, select or vary operational modes of either an autonomous or non-autonomous cleaning apparatus.


SUMMARY OF THE INVENTION

The present invention provides a debris sensor, and apparatus utilizing such a debris sensor, wherein the sensor is instantaneously responsive to debris strikes, and can be used to control, select or vary the operational mode of an autonomous or non-autonomous cleaning apparatus containing such a sensor.


One aspect of the invention is an autonomous cleaning apparatus including a drive system operable to enable movement of the cleaning apparatus; a controller in communication with the drive system, the controller including a processor operable to control the drive system to provide at least one pattern of movement of the cleaning apparatus; and a debris sensor for generating a debris signal indicating that the cleaning apparatus has encountered debris; wherein the processor is responsive to the debris signal to select an operative mode from among predetermined operative modes of the cleaning apparatus.


The selection of operative mode could include selecting a pattern of movement of the cleaning apparatus.


The pattern of movement can include spot coverage of an area containing debris, or steering the cleaning apparatus toward an area containing debris. The debris sensor could include spaced-apart first and second debris sensing elements respectively operable to generate first and second debris signals; and the processor can be responsive to the respective first and second debris signals to select a pattern of movement, such as steering toward a side (e.g., left or right side) with more debris.
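As a rough illustration of the differential left/right selection described above, the following sketch compares strike counts from the two sensor channels and picks a steering action. The function name, return values, and strike-count threshold are hypothetical, not taken from the disclosure:

```python
def select_steering(left_strikes: int, right_strikes: int,
                    threshold: int = 5) -> str:
    """Steer toward the side reporting more debris strikes.

    `threshold` (an illustrative value) suppresses turns when too little
    debris is seen on either channel to justify a course change.
    """
    if max(left_strikes, right_strikes) < threshold:
        return "straight"   # not enough debris to react to
    if left_strikes > right_strikes:
        return "turn_left"  # dirtier area is to the left
    if right_strikes > left_strikes:
        return "turn_right"
    return "straight"       # equal counts: hold heading
```

In practice the counts would be accumulated over a short sampling window before each comparison, so that single spurious strikes do not cause steering jitter.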


The debris sensor can include a piezoelectric sensor element located proximate to a cleaning pathway of the cleaning apparatus and responsive to a debris strike to generate a signal indicative of such strike.


The debris sensor of the invention can also be incorporated into a non-autonomous cleaning apparatus. This aspect of the invention can include a piezoelectric sensor located proximate to a cleaning pathway and responsive to a debris strike to generate a debris signal indicative of such strike; and a processor responsive to the debris signal to change an operative mode of the cleaning apparatus. The change in operative mode could include illuminating a user-perceptible indicator light, changing a power setting (e.g., higher power setting when more debris is encountered), or slowing or reducing a movement speed of the apparatus.
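The mode changes described for a non-autonomous apparatus might be sketched as a mapping from debris strike rate to operating settings. The strike-rate thresholds, dictionary keys, and level names below are invented for illustration only:

```python
def update_operative_mode(strikes_per_second: float) -> dict:
    """Hypothetical mode changes for a non-autonomous cleaner: light a
    user-perceptible indicator, raise the power setting, and slow the
    movement speed as the debris strike rate climbs."""
    if strikes_per_second >= 10.0:    # heavy debris encountered
        return {"indicator": True, "power": "high", "speed": "slow"}
    if strikes_per_second >= 2.0:     # moderate debris
        return {"indicator": True, "power": "medium", "speed": "normal"}
    return {"indicator": False, "power": "low", "speed": "normal"}
```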


A further aspect of the invention is a debris sensor, including a piezoelectric element located proximate to or within a cleaning pathway of the cleaning apparatus and responsive to a debris strike to generate a first signal indicative of such strike; and a processor operable to process the first signal to generate a second signal representative of a characteristic of debris being encountered by the cleaning apparatus. That characteristic could be, for example, a quantity or volumetric parameter of the debris, or a vector from a present location of the cleaning apparatus to an area containing debris.


Another aspect of the invention takes advantage of the motion of an autonomous cleaning device across a floor or other surface, processing the debris signal in conjunction with knowledge of the cleaning device's movement to calculate a debris gradient. The debris gradient is representative of changes in the debris strike count as the autonomous cleaning apparatus moves along a surface. By examining the sign of the gradient (positive or negative, associated with increasing or decreasing debris), an autonomous cleaning device controller can continuously adjust the path or pattern of movement of the device to clean a debris field most effectively.
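The gradient computation and sign test can be sketched as follows, assuming strike counts are tallied over successive fixed sampling intervals as the device moves (the function names and interval scheme are illustrative):

```python
def debris_gradient(strike_counts: list[int]) -> list[int]:
    """Change in strike count between successive sampling intervals
    as the device moves; positive values indicate increasing debris."""
    return [b - a for a, b in zip(strike_counts, strike_counts[1:])]

def path_adjustment(gradient: int) -> str:
    """Positive gradient: continue into the denser debris field.
    Negative gradient: the device is leaving it, so turn back."""
    return "continue" if gradient >= 0 else "turn_back"
```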


These and other aspects, features and advantages of the invention will become more apparent from the following description, in conjunction with the accompanying drawings, in which embodiments of the invention are shown and described by way of illustrative example.





BRIEF DESCRIPTION OF THE DRAWINGS

A more complete understanding of the present invention and the attendant features and advantages thereof may be had by reference to the following detailed description of the invention when considered in conjunction with the accompanying drawings wherein:



FIG. 1 is a top-view schematic of an exemplary autonomous cleaning device in which the debris sensor of the invention can be employed.



FIG. 2 is a block diagram of exemplary hardware elements of the robotic device of FIG. 1, including a debris sensor subsystem of the invention.



FIG. 3 is a side view of the robotic device of FIG. 1, showing a debris sensor according to the invention situated in a cleaning or vacuum pathway, where it will be struck by debris upswept by the main cleaning brush element.



FIG. 4 is an exploded diagram of a piezoelectric debris sensor in accordance with the invention.



FIG. 5 is a schematic diagram of a debris sensor signal processing architecture according to the present invention.



FIG. 6 is a schematic diagram of signal processing circuitry for the debris sensor architecture of FIG. 5.



FIG. 7 is a schematic diagram showing the debris sensor in a non-autonomous cleaning apparatus.



FIG. 8 is a flowchart of a method according to one practice of the invention.





DETAILED DESCRIPTION OF THE INVENTION

While the debris sensor of the present invention can be incorporated into a wide range of autonomous cleaning devices (and indeed, into non-autonomous cleaning devices as shown by way of example in FIG. 7), it will first be described in the context of an exemplary autonomous cleaning device shown in FIGS. 1-3. Further details of the structure, function and behavioral modes of such an autonomous cleaning device are set forth in the patent applications cited above in the Cross-Reference section, each of which is incorporated herein by reference. Accordingly, the following detailed description is organized into the following sections:

    • I. Exemplary Autonomous Cleaning Device
    • II. Behavioral Modes of an Autonomous Cleaning Device
    • III. Debris Sensor Structure
    • IV. Signal Processing
    • V. Conclusions


I. EXEMPLARY AUTONOMOUS CLEANING DEVICE

Referring now to the drawings wherein like reference numerals identify corresponding or similar elements throughout the several views, FIG. 1 is a top-view schematic of an exemplary autonomous cleaning device 100 in which a debris sensor according to the present invention may be incorporated. FIG. 2 is a block diagram of the hardware of the robot device 100 of FIG. 1.


Examples of hardware and behavioral modes (coverage behaviors or patterns of movement for cleaning operations; escape behaviors for transitory movement patterns; and safety behaviors for emergency conditions) of an autonomous cleaning device 100 marketed by the iRobot Corporation of Burlington, Mass. under the ROOMBA trademark, will next be described to provide a more complete understanding of how the debris sensing system of the present invention may be employed. However, the invention can also be employed in non-autonomous cleaning devices, and an example is described below in connection with FIG. 7.


In the following description, the terms “forward” and “fore” are used to refer to the primary direction of motion (forward) of the robotic device (see arrow identified by reference character “FM” in FIG. 1). The fore/aft axis FAx of the robotic device 100 coincides with the medial diameter of the robotic device 100 that divides the robotic device 100 into generally symmetrical right and left halves, which are defined as the dominant and non-dominant sides, respectively.


An example of such a robotic cleaning device 100 has a generally disk-like housing infrastructure that includes a chassis 102 and an outer shell 104 secured to the chassis 102 that define a structural envelope of minimal height (to facilitate movement under furniture). The hardware comprising the robotic device 100 can be generally categorized as the functional elements of a power system, a motive power system (also referred to herein as a “drive system”), a sensor system, a control module, a side brush assembly, and a self-adjusting cleaning head system, all of which are integrated in combination with the housing infrastructure. In addition to such categorized hardware, the robotic device 100 further includes a forward bumper 106 having a generally arcuate configuration and a nose-wheel assembly 108.


The forward bumper 106 (illustrated as a single component; alternatively, a two-segment component) is integrated in movable combination with the chassis 102 (by means of displaceable support member pairs) to extend outwardly therefrom. Whenever the robotic device 100 impacts an obstacle (e.g., wall, furniture) during movement thereof, the bumper 106 is displaced (compressed) towards the chassis 102 and returns to its extended (operating) position when contact with the obstacle is terminated.


The nose-wheel assembly 108 is mounted in biased combination with the chassis 102 so that the nose-wheel assembly 108 is in a retracted position (due to the weight of the robotic device 100) during cleaning operations, wherein it rotates freely over the surface being cleaned. When the nose-wheel assembly 108 encounters a drop-off during operation (e.g., descending stairs, split-level floors), the nose-wheel assembly 108 is biased to an extended position.


The hardware of the power system, which provides the energy to power the electrically-operated hardware of the robotic device 100, comprises a rechargeable battery pack 110 (and associated conduction lines, not shown) that is integrated in combination with the chassis 102.


As shown in FIG. 1, the motive power system provides the means that propels the robotic device 100 and operates the cleaning mechanisms, e.g., side brush assembly and the self-adjusting cleaning head system, during movement of the robotic device 100. The motive power system comprises left and right main drive wheel assemblies 112L, 112R, their associated independent electric motors 114L, 114R, and electric motors 116, 118 for operation of the side brush assembly and the self-adjusting cleaning head subsystem, respectively.


The electric motors 114L, 114R are mechanically coupled to the main drive wheel assemblies 112L, 112R, respectively, and independently operated by control signals generated by the control module as a response to the implementation of a behavioral mode, or, as discussed in greater detail below, in response to debris signals generated by left and right debris sensors 125L, 125R shown in FIG. 1.


Independent operation of the electric motors 114L, 114R allows the main wheel assemblies 112L, 112R to be: (1) rotated at the same speed in the same direction to propel the robotic device 100 in a straight line, forward or aft; (2) differentially rotated (including the condition wherein one wheel assembly is not rotated) to effect a variety of right and/or left turning patterns (over a spectrum of sharp to shallow turns) for the robotic device 100; and (3) rotated at the same speed in opposite directions to cause the robotic device 100 to turn in place, i.e., “spin on a dime”, to provide an extensive repertoire of movement capability for the robotic device 100.
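The three drive conditions can be illustrated with a small helper function. The mode names and the 0.5 turn ratio are illustrative assumptions; the 0.306 m/s default is the Straight Line velocity given elsewhere in this description:

```python
def wheel_speeds(mode: str, v: float = 0.306) -> tuple[float, float]:
    """(left, right) wheel speeds in m/s for the three drive conditions:
    equal speeds for a straight line, unequal speeds for a turn, and
    equal-but-opposite speeds to spin in place."""
    if mode == "straight":
        return (v, v)            # same speed, same direction
    if mode == "arc_right":
        return (v, 0.5 * v)      # slow the right wheel to curve right
    if mode == "spin_cw":
        return (v, -v)           # "spin on a dime", clockwise
    raise ValueError(f"unknown mode: {mode}")
```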


As shown in FIG. 1, the sensor system comprises a variety of different sensor units that are operative to generate signals that control the behavioral mode operations of the robotic device 100. The described robotic device 100 includes obstacle detection units 120, cliff detection units 122, wheel drop sensors 124, an obstacle-following unit 126, a virtual wall omnidirectional detector 128, stall-sensor units 130, main wheel encoder units 132, and, in accordance with the present invention, left and right debris sensors 125L and 125R described in greater detail below.


In the illustrated embodiment, the obstacle (“bump”) detection units 120 can be IR break beam sensors mounted in combination with the displaceable support member pairs of the forward bumper 106. These detection units 120 are operative to generate one or more signals indicating relative displacement between one or more support member pairs whenever the robotic device 100 impacts an obstacle such that the forward bumper 106 is compressed. These signals are processed by the control module to determine an approximate point of contact with the obstacle relative to the fore/aft axis FAx of the robotic device 100 (and the behavioral mode(s) to be implemented).


The cliff detection units 122 are mounted in combination with the forward bumper 106. Each cliff detection unit 122 comprises an IR emitter-detector pair configured and operative to establish a focal point such that radiation emitted downwardly by the emitter is reflected from the surface being traversed and detected by the detector. If reflected radiation is not detected by the detector, i.e., a drop-off is encountered, the cliff detection unit 122 transmits a signal to the control module (which causes one or more behavioral modes to be implemented).


A wheel drop sensor 124 such as a contact switch is integrated in combination with each of the main drive wheel assemblies 112L, 112R and the nose wheel assembly 108 and is operative to generate a signal whenever any of the wheel assemblies is in an extended position, i.e., not in contact with the surface being traversed, (which causes the control module to implement one or more behavioral modes).


The obstacle-following unit 126 for the described embodiment is an IR emitter-detector pair mounted on the ‘dominant’ side (right hand side of FIG. 1) of the robotic device 100. The emitter-detector pair is similar in configuration to the cliff detection units 122, but is positioned so that the emitter emits radiation laterally from the dominant side of the robotic device 100. The unit 126 is operative to transmit a signal to the control module whenever an obstacle is detected as a result of radiation reflected from the obstacle and detected by the detector. The control module, in response to this signal, causes one or more behavioral modes to be implemented.


A virtual wall detection system for use in conjunction with the described embodiment of the robotic device 100 comprises an omnidirectional detector 128 mounted atop the outer shell 104 and a stand-alone transmitting unit (not shown) that transmits an axially-directed confinement beam. The stand-alone transmitting unit is positioned so that the emitted confinement beam blocks an accessway to a defined working area, thereby restricting the robotic device 100 to operations within the defined working area (e.g., in a doorway to confine the robotic device 100 within a specific room to be cleaned). Upon detection of the confinement beam, the omnidirectional detector 128 transmits a signal to the control module (which causes one or more behavioral modes to be implemented to move the robotic device 100 away from the confinement beam generated by the stand-alone transmitting unit).


A stall sensor unit 130 is integrated in combination with each electric motor 114L, 114R, 116, 118 and operative to transmit a signal to the control module when a change in current is detected in the associated electric motor (which is indicative of a dysfunctional condition in the corresponding driven hardware). The control module is operative in response to such a signal to implement one or more behavioral modes.


An IR encoder unit 132 (see FIG. 2) is integrated in combination with each main wheel assembly 112L, 112R and operative to detect the rotation of the corresponding wheel and transmit signals corresponding thereto to the control module (wheel rotation can be used to provide an estimate of distance traveled for the robotic device 100).


Control Module:


Referring now to FIG. 2, the control module comprises the microprocessing unit 135 that includes I/O ports connected to the sensors and controllable hardware of the robotic device 100, a microcontroller (such as a Motorola MC9S12E128CPV 16-bit controller), and ROM and RAM memory. The I/O ports function as the interface between the microcontroller and the sensor units (including the left and right debris sensors 125L, 125R discussed in greater detail below) and controllable hardware, transferring signals generated by the sensor units to the microcontroller and transferring control (instruction) signals generated by the microcontroller to the controllable hardware to implement a specific behavioral mode.


The microcontroller is operative to execute instruction sets for processing sensor signals, implementing specific behavioral modes based upon such processed signals, and generating control (instruction) signals for the controllable hardware based upon implemented behavioral modes for the robotic device 100. The cleaning coverage and control programs for the robotic device 100 are stored in the ROM of the microprocessing unit 135, which includes the behavioral modes, sensor processing algorithms, control signal generation algorithms and a prioritization algorithm for determining which behavioral mode or modes are to be given control of the robotic device 100. The RAM of the microprocessing unit 135 is used to store the active state of the robotic device 100, including the ID of the behavioral mode(s) under which the robotic device 100 is currently being operated and the hardware commands associated therewith.


Referring again to FIG. 1, there is shown a side brush assembly 140, configured and operative to entrain particulates outside the periphery of the housing infrastructure and to direct such particulates towards the self-adjusting cleaning head system. The side brush assembly 140 provides the robotic device 100 with the capability of cleaning surfaces adjacent to base-boards when the robotic device is operated in an Obstacle-Following behavioral mode. As shown in FIG. 1, the side brush assembly 140 is preferably mounted in combination with the chassis 102 in the forward quadrant on the dominant side of the robotic device 100.


The self-adjusting cleaning head system 145 for the described robotic device 100 comprises a dual-stage brush assembly and a vacuum assembly, each of which is independently powered by an electric motor (reference numeral 118 in FIG. 1 actually identifies two independent electric motors—one for the brush assembly and one for the vacuum assembly). The cleaning capability of the robotic device 100 is commonly characterized in terms of the width of the cleaning head system 145 (see reference character W in FIG. 1).


Referring now to FIG. 3, in one embodiment of a robotic cleaning device, the cleaning brush assembly comprises asymmetric, counter-rotating flapper and main brush elements 92 and 94, respectively, that are positioned forward of the vacuum assembly inlet 84, and operative to direct particulate debris 127 into a removable dust cartridge 86. As shown in FIG. 3, the autonomous cleaning apparatus can also include left and right debris sensor elements 125PS, which can be piezoelectric sensor elements, as described in detail below. The piezoelectric debris sensor elements 125PS can be situated in a cleaning pathway of the cleaning device, mounted, for example, in the roof of the cleaning head, so that when struck by particles 127 swept up by the brush elements and/or pulled up by vacuum, the debris sensor elements 125PS generate electrical pulses representative of debris impacts and thus, of the presence of debris in an area in which the autonomous cleaning device is operating.


More particularly, in the arrangement shown in FIG. 3, the sensor elements 125PS are located substantially at an axis AX along which main and flapper brushes 94, 92 meet, so that particles strike the sensor elements 125PS with maximum force.


As shown in FIG. 1, and described in greater detail below, the robotic cleaning device can be fitted with left and right side piezoelectric debris sensors, to generate separate left and right side debris signals that can be processed to signal the robotic device to turn in the direction of a “dirty” area.


The operation of the piezoelectric debris sensors, as well as signal processing and selection of behavioral modes based on the debris signals they generate, will be discussed below following a brief discussion of general aspects of behavioral modes for the cleaning device.


II. BEHAVIORAL MODES

The robotic device 100 can employ a variety of behavioral modes to effectively clean a defined working area, where behavioral modes are layers of control systems that can be operated in parallel. The microprocessing unit 135 is operative to execute a prioritized arbitration scheme to identify and implement one or more dominant behavioral modes for any given scenario based upon inputs from the sensor system.
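The prioritized arbitration scheme can be sketched as follows. The category names and their ordering are illustrative assumptions drawn from the three-way characterization of behavioral modes in this description, not from any specific arbitration code:

```python
def arbitrate(active_behaviors: list[str]) -> str:
    """Prioritized arbitration sketch: safety behaviors preempt escape
    behaviors, which preempt coverage behaviors; the highest-priority
    active layer is given control of the device."""
    for category in ("safety", "escape", "coverage"):
        if category in active_behaviors:
            return category
    return "idle"   # no behavior requesting control
```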


The behavioral modes for the described robotic device 100 can be characterized as: (1) coverage behavioral modes; (2) escape behavioral modes; and (3) safety behavioral modes. Coverage behavioral modes are primarily designed to allow the robotic device 100 to perform its cleaning operations in an efficient and effective manner and the escape and safety behavioral modes are priority behavioral modes implemented when a signal from the sensor system indicates that normal operation of the robotic device 100 is impaired, e.g., obstacle encountered, or is likely to be impaired, e.g., drop-off detected.


Representative and illustrative coverage behavioral (cleaning) modes for the robotic device 100 include: (1) a Spot Coverage pattern; (2) an Obstacle-Following (or Edge-Cleaning) Coverage pattern, and (3) a Room Coverage pattern. The Spot Coverage pattern causes the robotic device 100 to clean a limited area within the defined working area, e.g., a high-traffic area. In a preferred embodiment the Spot Coverage pattern is implemented by means of a spiral algorithm (but other types of self-bounded area algorithms, e.g., polygonal, can be used). The spiral algorithm, which causes outward spiraling (preferred) or inward spiraling movement of the robotic device 100, is implemented by control signals from the microprocessing unit 135 to the main wheel assemblies 112L, 112R to change the turn radius/radii thereof as a function of time (thereby increasing/decreasing the spiral movement pattern of the robotic device 100).
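A rough sketch of the spiral algorithm's wheel-speed control follows, with the turn radius grown as a function of time to produce an outward spiral. The wheel separation, initial radius, and growth rate are assumed values, not taken from the description:

```python
def spiral_wheel_speeds(t: float, v: float = 0.306, track: float = 0.25,
                        r0: float = 0.05, growth: float = 0.02):
    """(left, right) wheel speeds in m/s for an outward spiral (left turn).

    The turn radius r grows linearly with time t, so the difference
    between the wheel speeds shrinks and the path spirals outward.
    `track` (wheel separation), `r0`, and `growth` are illustrative.
    """
    r = r0 + growth * t                 # turn radius grows over time
    omega = v / r                       # angular rate about the turn center
    left = omega * (r - track / 2.0)    # inner wheel of a left turn
    right = omega * (r + track / 2.0)   # outer wheel
    return left, right
```

Decreasing `growth` tightens the spiral; a negative `growth` clamped at some minimum radius would give the inward-spiraling variant mentioned above.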


The robotic device 100 is operated in the Spot Coverage pattern for a predetermined or random period of time, for a predetermined or random distance (e.g., a maximum spiral distance) and/or until the occurrence of a specified event, e.g., activation of one or more of the obstacle detection units 120 (collectively a transition condition). Once a transition condition occurs, the robotic device 100 can implement or transition to a different behavioral mode, e.g., a Straight Line behavioral mode (in a preferred embodiment of the robotic device 100, the Straight Line behavioral mode is a low priority, default behavior that propels the robot in an approximately straight line at a preset velocity of approximately 0.306 m/s) or a Bounce behavioral mode in combination with a Straight Line behavioral mode.


If the transition condition is the result of the robotic device 100 encountering an obstacle, the robotic device 100 can take other actions in lieu of transitioning to a different behavioral mode. The robotic device 100 can momentarily implement a behavioral mode to avoid or escape the obstacle and resume operation under control of the spiral algorithm (i.e., continue spiraling in the same direction). Alternatively, the robotic device 100 can momentarily implement a behavioral mode to avoid or escape the obstacle and resume operation under control of the spiral algorithm (but in the opposite direction—reflective spiraling).


The Obstacle-Following Coverage pattern causes the robotic device 100 to clean the perimeter of the defined working area, e.g., a room bounded by walls, and/or the perimeter of an obstacle (e.g., furniture) within the defined working area. Preferably the robotic device 100 of FIG. 1 utilizes obstacle-following unit 126 (see FIG. 1) to continuously maintain its position with respect to an obstacle, e.g., wall, furniture, so that the motion of the robotic device 100 causes it to travel adjacent to and concomitantly clean along the perimeter of the obstacle. Different embodiments of the obstacle-following unit 126 can be used to implement the Obstacle-Following behavioral pattern.


In a first embodiment, the obstacle-following unit 126 is operated to detect the presence or absence of the obstacle. In an alternative embodiment, the obstacle-following unit 126 is operated to detect an obstacle and then maintain a predetermined distance between the obstacle and the robotic device 100. In the first embodiment, the microprocessing unit 135 is operative, in response to signals from the obstacle-following unit, to implement small CW or CCW turns to maintain the robotic device's position with respect to the obstacle. The robotic device 100 implements a small CW turn when it transitions from obstacle detection to non-detection (reflection to non-reflection), or a small CCW turn when it transitions from non-detection to detection (non-reflection to reflection). Similar turning behaviors are implemented by the robotic device 100 to maintain the predetermined distance from the obstacle.
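The first embodiment's detection/non-detection turning logic amounts to a simple hysteresis on the lateral IR signal; a minimal sketch (function and return names are hypothetical):

```python
def follow_wall_turn(detected: bool, previously_detected: bool) -> str:
    """Edge-following sketch: a small CW turn when the obstacle
    reflection is lost, a small CCW turn when it reappears."""
    if previously_detected and not detected:
        return "small_cw"    # drifting away from the obstacle: turn back in
    if detected and not previously_detected:
        return "small_ccw"   # closing on the obstacle: turn away
    return "none"            # detection state unchanged: hold course
```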


The robotic device 100 is operated in the Obstacle-Following behavioral mode for a predetermined or random period of time, for a predetermined or random distance (e.g., a maximum or minimum distance) and/or until the occurrence of a specified event, e.g., activation of one or more of the obstacle detection units 120 a predetermined number of times (collectively a transition condition). In certain embodiments, the microprocessor 135 will cause the robotic device to implement an Align behavioral mode upon activation of the obstacle detection units 120 in the Obstacle-Following behavioral mode, wherein the robotic device 100 implements a minimum-angle CCW turn to align itself with the obstacle.


The Room Coverage pattern can be used by the robotic device 100 to clean any defined working area that is bounded by walls, stairs, obstacles or other barriers (e.g., a virtual wall unit). A preferred embodiment for the Room Coverage pattern comprises the Random-Bounce behavioral mode in combination with the Straight Line behavioral mode. Initially, the robotic device 100 travels under control of the Straight Line behavioral mode, i.e., the straight-line algorithm (main drive wheel assemblies 112L, 112R operating at the same rotational speed in the same direction), until an obstacle is encountered. Upon activation of one or more of the obstacle detection units 120, the microprocessing unit 135 is operative to compute an acceptable range of new directions based upon the obstacle detection unit(s) 120 activated. The microprocessing unit 135 selects a new heading from within the acceptable range and implements a CW or CCW turn to achieve the new heading with minimal movement. In some embodiments, the new turn heading may be followed by forward movement to increase the cleaning efficiency of the robotic device 100. The new heading may be randomly selected across the acceptable range of headings, or based upon some statistical selection scheme, e.g., a Gaussian distribution. In other embodiments of the Room Coverage behavioral mode, the microprocessing unit 135 can be programmed to change headings randomly or at predetermined times, without input from the sensor system.


The robotic device 100 is operated in the Room Coverage behavioral mode for a predetermined or random period of time, for a predetermined or random distance (e.g., a maximum or minimum distance) and/or until the occurrence of a specified event, e.g., activation of the obstacle-detection units 120 a predetermined number of times (collectively a transition condition).


By way of example, the robotic device 100 can include four escape behavioral modes: a Turn behavioral mode, an Edge behavioral mode, a Wheel Drop behavioral mode, and a Slow behavioral mode. One skilled in the art will appreciate that other behavioral modes can be utilized by the robotic device 100. One or more of these behavioral modes may be implemented, for example, in response to a current rise in one of the electric motors 116, 118 of the side brush assembly 140 or dual-stage brush assembly above a low or high stall threshold, the forward bumper 106 remaining in a compressed position for a determined time period, or detection of a wheel-drop event.


In the Turn behavioral mode, the robotic device 100 turns in place in a random direction, starting at a higher velocity (e.g., twice the normal turning velocity) and decreasing to a lower velocity (one-half the normal turning velocity), i.e., small panic turns and large panic turns, respectively. Small panic turns are preferably in the range of 45° to 90°; large panic turns are preferably in the range of 90° to 270°. The Turn behavioral mode prevents the robotic device 100 from becoming stuck on room impediments (e.g., a high spot in carpet or a ramped lamp base), from becoming stuck under room impediments (e.g., under a sofa), or from becoming trapped in a confined area.


In the Edge behavioral mode, the robotic device 100 follows the edge of an obstacle until it has turned through a predetermined number of degrees, e.g., 60°, without activation of any of the obstacle detection units 120, or until the robotic device has turned through a predetermined number of degrees, e.g., 170°, since initiation of the Edge behavioral mode. The Edge behavioral mode allows the robotic device 100 to move through the smallest possible openings to escape from confined areas.


In the Wheel Drop behavioral mode, the microprocessor 135 reverses the direction of the main wheel drive assemblies 112L, 112R momentarily, then stops them. If the activated wheel drop sensor 124 deactivates within a predetermined time, the microprocessor 135 then reimplements the behavioral mode that was being executed prior to the activation of the wheel drop sensor 124.


In response to certain events, e.g., activation of a wheel drop sensor 124 or a cliff detector 122, the Slow behavioral mode is implemented to slow the robotic device 100 down for a predetermined distance and then ramp its speed back up to the normal operating speed.


When a safety condition is detected by the sensor subsystem, e.g., a series of brush or wheel stalls that cause the corresponding electric motors to be temporarily cycled off, or a wheel drop sensor 124 or cliff detection sensor 122 activated for greater than a predetermined period of time, the robotic device 100 is generally cycled to an off state. In addition, an audible alarm may be generated.


The foregoing description of behavioral modes for the robotic device 100 is merely representative of the types of operating modes that can be implemented by the robotic device 100. One skilled in the art will appreciate that the behavioral modes described above can be implemented in other combinations and/or circumstances, and other behavioral modes and patterns of movement are also possible.


III. DEBRIS SENSOR STRUCTURE AND OPERATION

As shown in FIGS. 1-3, in accordance with the present invention, an autonomous cleaning device (and similarly, a non-autonomous cleaning device as shown by way of example in FIG. 7) can be improved by incorporation of a debris sensor. In the embodiment illustrated in FIGS. 1 and 3, the debris sensor subsystem comprises left and right piezoelectric sensing elements 125L, 125R situated proximate to or within a cleaning pathway of a cleaning device, and electronics for processing the debris signal from the sensor for forwarding to a microprocessor 135 or other controller.


When employed in an autonomous, robot cleaning device, the debris signal from the debris sensor can be used to select a behavioral mode (such as entering into a spot cleaning mode), change an operational condition (such as speed, power or other), steer in the direction of debris (particularly when spaced-apart left and right debris sensors are used to create a differential signal), or take other actions.


A debris sensor according to the present invention can also be incorporated into a non-autonomous cleaning device. When employed in a non-autonomous cleaning device such as, for example, an otherwise relatively conventional vacuum cleaner 700 like that shown in FIG. 7, the debris signal 706 generated by a piezoelectric debris sensor 704PS situated within a cleaning or vacuum pathway of the device can be employed by a controlling microprocessor 708 in the body of the vacuum cleaner 702 to generate a user-perceptible signal (such as by lighting a light 710), to increase power from the power system 703, or take some combination of actions (such as lighting a “high power” light and simultaneously increasing power).


The algorithmic aspects of the operation of the debris sensor subsystem are summarized in FIG. 8. As shown therein, a method according to the invention can include detecting left and right debris signals representative of debris strikes, and thus, of the presence, quantity or volume, and direction of debris (802); selecting an operational mode or pattern of movement (such as Spot Coverage) based on the debris signal values (804); selecting a direction of movement based on differential left/right debris signals (e.g., steering toward the side with more debris) (806); generating a user-perceptible signal representative of the presence of debris or other characteristic (e.g., by illuminating a user-perceptible LED) (808); or otherwise varying or controlling an operational condition, such as power (810).
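The mode-selection logic summarized above (steps 804-806 of FIG. 8) can be illustrated with the following Python sketch. This is not the patented implementation; the function name, threshold value, and mode labels are assumptions chosen for illustration only:

```python
# Illustrative sketch of debris-responsive mode selection (FIG. 8, steps 804-806).
# Threshold and mode names are hypothetical, not taken from the patent.

def select_behavior(left_strikes, right_strikes, spot_threshold=20):
    """Choose a behavior from left/right debris-strike counts per sampling interval."""
    total = left_strikes + right_strikes
    if total >= spot_threshold:
        return "SPOT_COVERAGE"   # heavy debris: switch to spot coverage (804)
    if left_strikes > right_strikes:
        return "STEER_LEFT"      # differential signal: more debris on the left (806)
    if right_strikes > left_strikes:
        return "STEER_RIGHT"     # differential signal: more debris on the right (806)
    return "CONTINUE"            # no differential: keep the current heading
```

For example, a left count of 5 against a right count of 2 would steer the device left, while a combined count at or above the spot threshold would trigger spot coverage regardless of the differential.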


A further practice of the invention takes advantage of the motion of an autonomous cleaning device across a floor or other surface, processing the debris signal in conjunction with knowledge of the cleaning device's movement to calculate a debris gradient (812 in FIG. 8). The debris gradient is representative of changes in the debris strike count as the autonomous cleaning apparatus moves along a surface. By examining the sign of the gradient (positive or negative, associated with increasing or decreasing debris), an autonomous cleaning device controller can continuously adjust the path or pattern of movement of the device to clean a debris field most effectively (812).
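The gradient test can be sketched as follows; the segment-based representation and function names are illustrative assumptions, since the patent does not specify how movement is discretized:

```python
# Illustrative sketch of the debris-gradient sign test (812 in FIG. 8).
# Representing travel as equal-distance segments is an assumption for illustration.

def gradient_sign(prev_count, curr_count):
    """Sign of the change in debris strikes between two successive travel segments."""
    if curr_count > prev_count:
        return +1    # positive gradient: entering a debris field
    if curr_count < prev_count:
        return -1    # negative gradient: leaving the field; adjust path toward it
    return 0         # flat: no course change indicated

def gradient_signs(counts_per_segment):
    """Gradient sign for each successive pair of travel segments."""
    return [gradient_sign(a, b)
            for a, b in zip(counts_per_segment, counts_per_segment[1:])]
```

A rising-then-falling count sequence thus yields a run of positive signs followed by negative ones, which a controller could use to turn back toward the densest part of the field.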


Piezoelectric Sensor:


As noted above, a piezoelectric transducer element can be used in the debris sensor subsystem of the invention. Piezoelectric sensors provide instantaneous response to debris strikes and are relatively immune to accretion that would degrade the performance of an optical debris sensor typical of the prior art.


An example of a piezoelectric transducer 125PS is shown in FIG. 4. Referring now to FIG. 4, the piezoelectric sensor element 125PS can include one or more 0.20 millimeter thick, 20 millimeter diameter brass disks 402 with the piezoelectric material and electrodes bonded to the topside (with a total thickness of 0.51 mm), mounted to an elastomer pad 404, a plastic dirt sensor cap 406, a debris sensor PC board with associated electronics 408, grounded metal shield 410, and retained by mounting screws (or bolts or the like) 412 and elastomer grommets 414. The elastomer grommets provide a degree of vibration dampening or isolation between the piezoelectric sensor element 125PS and the cleaning device.


In the example shown in FIG. 4, a rigid piezoelectric disk, of the type typically used as an inexpensive sounder, can be used. However, flexible piezoelectric film can also be advantageously employed. Since the film can be produced in arbitrary shapes, its use affords the possibility of sensitivity to debris across the entire cleaning width of the cleaning device, rather than sensitivity in selected areas where, for example, the disks may be located. However, film is at present substantially more expensive and is subject to degradation over time, whereas brass disks have proven to be extremely robust.


The exemplary mounting configuration shown in FIG. 4 is substantially optimized for use within a platform that is mechanically quite noisy, such as an autonomous vacuum cleaner like that shown in FIG. 3. In such a device, vibration dampening or isolation of the sensor is extremely useful. However, in an application involving a non-autonomous cleaning device such as a canister-type vacuum cleaner like that shown in FIG. 7, the dampening aspects of the mounting system of FIG. 4 may not be necessary. In a non-autonomous cleaning apparatus, an alternative mounting system may involve heat staking the piezoelectric element directly to its housing. In either case, a key consideration for achieving enhanced performance is the reduction of the surface area required to clamp, bolt, or otherwise maintain the piezoelectric element in place. The smaller the footprint of this clamped “dead zone”, the more sensitive the piezoelectric element will be.


In operation, debris thrown up by the cleaning brush assembly (e.g., brush 94 of FIG. 3), or otherwise flowing through a cleaning pathway within the cleaning device (e.g., vacuum compartment 104 of FIG. 3) can strike the bottom, all-brass side of the sensor 125PS (see FIG. 3). In an autonomous cleaning device, as shown in FIG. 3, the debris sensor 125PS can be located substantially at an axis AX along which main brush 94 and flapper brush 92 meet, so that the particles 127 are thrown up and strike the sensor 125PS with maximum force.


As is well known, a piezoelectric sensor converts mechanical energy (e.g., the kinetic energy of a debris strike and vibration of the brass disk) into electrical energy—in this case, generating an electrical pulse each time it is struck by debris—and it is this electrical pulse that can be processed and transmitted to a system controller (e.g., controller 135 of FIGS. 1 and 2 or controller 708 of FIG. 7) to control or cause a change in operational mode, in accordance with the invention. Piezoelectric elements are typically designed for use as audio transducers, for example, to generate beep tones. When an AC voltage is applied, they vibrate mechanically in step with the AC waveform and generate an audible output. Conversely, if they are mechanically vibrated, they produce an AC voltage output. This is the manner in which they are employed in the present invention. In particular, when an object first strikes the brass face of the sensor, it causes the disk to flex inward, which produces a voltage pulse.


Filtering:


However, since the sensor element 125PS is in direct or indirect contact with the cleaning device chassis or body through its mounting system (see FIGS. 3 and 4), it is subject to the mechanical vibrations normally produced by motors, brushes, fans and other moving parts when the cleaning device is functioning. This mechanical vibration can cause the sensor to output an undesirable noise signal that can be larger in amplitude than the signal created by small, low mass debris (such as crushed black pepper) striking the sensor. The end result is that the sensor would output a composite signal composed of lower frequency noise components (up to approximately 16 kHz) and higher frequency, possibly lower amplitude, debris-strike components (greater than 30 kHz, up to hundreds of kHz). Thus, it is useful to provide a way to filter out extraneous signals.


Accordingly, as described below, an electronic filter is used to greatly attenuate the lower frequency signal components to improve signal-to-noise performance. Examples of the architecture and circuitry of such filtering and signal processing elements will next be described in connection with FIGS. 5 and 6.


IV. SIGNAL PROCESSING


FIG. 5 is a schematic diagram of the signal processing elements of a debris sensor subsystem in one practice of the invention.


As noted above, one purpose of a debris sensor is to enable an autonomous cleaning apparatus to sense when it is picking up debris or otherwise encountering a debris field. This information can be used as an input to effect a change in the cleaning behavior or cause the apparatus to enter a selected operational or behavioral mode, such as, for example, the spot cleaning mode described above when debris is encountered. In a non-autonomous cleaning apparatus like that shown in FIG. 7, the debris signal 706 from the debris sensor 704PS can be used to cause a user-perceptible light 710 to be illuminated (e.g., to signal to the user that debris is being encountered), to raise power output from the power unit 703 to the cleaning systems, or to cause some other operational change or combination of changes (e.g., lighting a user-perceptible "high power" light and simultaneously raising power).


Moreover, as noted above, two debris sensor circuit modules (i.e., left and right channels like 125L and 125R of FIG. 1) can be used to enable an autonomous cleaning device to sense the difference between the amounts of debris picked up on the right and left sides of the cleaning head assembly. For example, if the robot encounters a field of dirt off to its left side, the left side debris sensor may indicate debris hits, while the right side sensor indicates no (or a low rate of) debris hits. This differential output could be used by the microprocessor controller of an autonomous cleaning device (such as controller 135 of FIGS. 1 and 2) to steer the device in the direction of the debris (e.g., to steer left if the left-side debris sensor is generating higher signal values than the right-side debris sensor); to otherwise choose a vector in the direction of the debris; or to otherwise select a pattern of movement or behavior pattern such as spot coverage or other.


Thus, FIG. 5 illustrates one channel (for example, the left-side channel) of a debris sensor subsystem that can contain both left and right side channels. The right side channel is substantially identical, and its structure and operation will therefore be understood from the following discussion.


As shown in FIG. 5, the left channel consists of a sensor element (piezoelectric disk) 402, an acoustic vibration filter/RFI filter module 502, a signal amplifier 504, a reference level generator 506, an attenuator 508, a comparator 510 for comparing the outputs of the attenuator and reference level generator, and a pulse stretcher 512. The output of the pulse stretcher is a logic level output signal to a system controller like the processor 135 shown in FIG. 2; i.e., a controller suitable for use in selecting an operational behavior.


The Acoustic Vibration Filter/RFI Filter block 502 can be designed to provide significant attenuation (in one embodiment, better than −45 dB Volts), and to block most of the lower frequency, slow rate of change mechanical vibration signals, while permitting higher frequency, fast rate of change debris-strike signals to pass. However, even though these higher frequency signals get through the filter, they are attenuated, and thus require amplification by the Signal Amplifier block 504.


In addition to the desired higher frequency debris-strike signals, the very small residual mechanical noise signals that do pass through the filter are also amplified, along with electrical noise generated by the amplifier itself and any radio frequency interference (RFI) components generated by the motors and radiated through the air, or picked up by the sensor and its conducting wires. The signal amplifier's high frequency response is designed to minimize the amplification of very high frequency RFI. This constant background noise signal, which has much lower frequency components than the desired debris strike signals, is fed into the Reference Level Generator block 506. The purpose of module 506 is to create a reference signal that follows the instantaneous peak value, or envelope, of the noise signal. It can be seen in FIG. 5 that the signal of interest, i.e., the signal that results when debris strikes the sensor, is also fed into this block. However, the Reference Level Generator block circuitry is designed so that it does not respond quickly enough to high frequency, fast rate of change debris-strike signals to be able to track the instantaneous peak value of those signals. The resulting reference signal is used to make a comparison as described below.
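A discrete-time analogue of the Reference Level Generator is a slow envelope follower. The sketch below is illustrative only (the attack/decay coefficients are assumptions, standing in for the R13/R12 time constants of the actual circuit); it shows how sustained noise sets the reference while a brief high-amplitude strike barely moves it:

```python
# Illustrative slow envelope follower, modeling the Reference Level Generator.
# Coefficients are hypothetical stand-ins for the analog attack/decay time constants.

def reference_level(samples, attack=0.01, decay=0.0005):
    """Track the envelope of sustained noise; brief spikes barely move the output."""
    env = 0.0
    out = []
    for x in samples:
        mag = abs(x)
        if mag > env:
            env += attack * (mag - env)  # slow attack: a one-sample spike moves env little
        else:
            env -= decay * env           # slow decay back toward quiet
        out.append(env)
    return out
```

With these coefficients, a reference built up over thousands of samples of unit-amplitude noise sits near 1.0, and a single sample at ten times that amplitude raises it by only a few percent—so the comparator stage can still detect the spike against the reference.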


Referring again to FIG. 5, it will be seen that the signal from amplifier 504 is also fed into the Attenuator block. This is the same signal that goes to the Reference Level Generator 506, so it is a composite signal containing both the high frequency signal of interest (i.e., when debris strikes the sensor) and the lower frequency noise. The Attenuator 508 reduces the amplitude of this signal so that it normally is below the amplitude of the signal from the Reference Level Generator 506 when no debris is striking the sensor element.


The Comparator 510 compares the instantaneous voltage amplitude value of the signal from the Attenuator 508 to the signal from the Reference Level Generator 506. Normally, when the cleaning device is running and debris is not striking the sensor element, the instantaneous voltage coming out of the Reference Level Generator 506 will be higher than the voltage coming out of the Attenuator block 508. This causes the Comparator block 510 to output a high logic level signal (logic one), which is then inverted by the Pulse Stretcher block 512 to create a low logic level (logic zero).


However, when debris strikes the sensor, the voltage from the Attenuator 508 exceeds the voltage from the Reference Level Generator 506, both because the reference circuit cannot track the high frequency, fast rate of change signal component from the Amplifier 504, and because the signal produced by a debris strike is higher in voltage amplitude than the constant background mechanical noise signal, which is more severely attenuated by the Acoustic Vibration Filter 502. This causes the comparator to momentarily change state to a logic level zero. The Pulse Stretcher block 512 extends this very brief (typically under 10-microsecond) event to a constant 1 millisecond (+0.3 mS, −0 mS) event, so as to provide the system controller (e.g., controller 135 of FIG. 2) sufficient time to sample the signal.
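The stretcher's behavior can be modeled digitally as a one-shot with lockout, sampled at a fixed rate. This is a simplified, sample-based sketch of the analog monostable (and it uses active-high logic for clarity, whereas the actual circuit's output pulse is a logic zero):

```python
# Illustrative one-shot pulse stretcher with lockout.
# Active-high for readability; the patented circuit outputs an inverted (logic-zero) pulse.

def stretch_pulses(events, stretch_len):
    """Stretch each single-sample trigger to stretch_len output samples.

    While a stretched pulse is in progress, new triggers are locked out,
    mirroring the monostable's re-trigger lockout until timeout."""
    out = []
    remaining = 0
    for e in events:
        if remaining > 0:
            out.append(1)        # pulse in progress; triggers ignored (lockout)
            remaining -= 1
        elif e:
            out.append(1)        # new trigger starts a fixed-width pulse
            remaining = stretch_len - 1
        else:
            out.append(0)        # idle
    return out
```

In the actual circuit the sub-10-microsecond comparator event becomes a 1 ms pulse, i.e., a stretch factor on the order of one hundred, so a controller polling every few hundred microseconds cannot miss a strike.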


When the system controller “sees” this 1-millisecond logic zero pulse, it interprets the event as a debris strike.


Referring now to the RFI Filter portion of the Acoustic Vibration Filter/RFI Filter block 502, this filter serves to attenuate the very high frequency radiated electrical noise (RFI), which is generated by the motors and motor driver circuits.


In summary, the illustrated circuitry connected to the sensor element uses both amplitude and frequency information to discriminate a debris strike (representative of the cleaning device picking up debris) from the normal background mechanical noise also picked up by the sensor element, and the radiated radio frequency electrical noise produced by the motors and motor driver circuits. The normal, though undesirable, constant background noise is used to establish a dynamic reference that prevents false debris-strike indications while maintaining a good signal-to-noise ratio.


In practice, the mechanical mounting system for the sensor element (see FIG. 4) is also designed to help minimize the mechanical acoustic noise vibration coupling that affects the sensor element.


Signal Processing Circuitry:



FIG. 6 is a detailed schematic diagram of exemplary debris sensor circuitry. Those skilled in the art will understand that in other embodiments, the signal processing can be partially or entirely contained and executed within the software of the microcontroller 135. With reference to FIG. 6, the illustrated example of suitable signal processing circuitry contains the following elements, operating in accordance with the following description:


The ground referenced, composite signal from the piezoelectric sensor disk (see piezoelectric disk 402 of FIG. 4) is fed into capacitor C1, which is the input to a 5-pole, high pass, passive R-C filter designed to attenuate the low frequency, acoustic mechanical vibrations conducted into the sensor through the mounting system. This filter has a 21.5 kHz, −3 dB corner frequency rolling off at −100 dB/Decade. The output of this filter is fed to a single pole, low pass, passive R-C filter designed to attenuate any very high frequency RFI. This filter has a 1.06 MHz, −3 dB corner frequency rolling off at −20 dB/Decade. The output of this filter is diode clamped by D1 and D2 in order to protect U1 from high voltage transients in the event the sensor element sustains a severe strike that generates a voltage pulse greater than the amplifier's supply rails. The DC biasing required for single-supply operation of the amplifier chain and subsequent comparator circuitry is created by R5 and R6. These two resistor values are selected such that their Thévenin impedance works with C5 to maintain the filter's fifth pole frequency response correctly.


U1A, U1B and their associated components form a two stage, ac-coupled, non-inverting amplifier with a theoretical AC gain of 441. C9 and C10 serve to minimize gain at low frequencies while C7 and C8 work to roll the gain off at RFI frequencies. The net theoretical frequency response from the filter input to the amplifier output is a single pole high pass response with −3 dB at 32.5 kHz, −100 dB/Decade, and a 2-pole low pass response with break frequencies at 100 kHz, −32 dB/Decade, and 5.4 MHz, −100 dB/Decade, together forming a band-pass filter.


The output from the amplifier is split, with one output going into R14, and the other to the non-inverting input of U1C. The signal going into R14 is attenuated by the R14-R15 voltage divider, and then fed into the inverting input of comparator U2A. The other signal branch from the output of U1B is fed into the non-inverting input of amplifier U1C. U1C along with U1D and the components therebetween (as shown in FIG. 6) form a half-wave, positive peak detector. The attack and decay times are set by R13 and R12, respectively. The output from this circuit is fed to the non-inverting input of U2A through R16. R16 along with R19 provide hysteresis to improve switching time and noise immunity. U2A functions to compare the instantaneous value of the output of the peak detector to the output of the R14-R15 attenuator.


Normally, when debris is not striking the sensor, the output of the peak detector will be greater in amplitude than the output of the attenuator network. When debris strikes the sensor, a high frequency pulse is created that has a higher amplitude coming out of the front-end high pass filter going into U1A than the lower frequency mechanical noise signal component. This signal will be larger in amplitude, even after coming out of the R14-R15 attenuator network, than the signal coming out of the peak detector, because the peak detector cannot track high-speed pulses due to the component values in the R13, C11, R12 network. The comparator then changes state from high to low for as long as the amplitude of the debris-strike pulse stays above the dynamic, noise generated, reference-level signal coming out of the peak detector. Since this comparator output pulse can be too short for the system controller to see, a pulse stretcher is used.


The pulse stretcher is a one-shot monostable design with a lockout mechanism to prevent re-triggering until the end of the timeout period. The output from U2A is fed into the junction of C13 and Q1. C13 couples the signal into the monostable formed by U2C and its associated components. Q1 functions as the lockout by holding the output of U2A low until the monostable times out. The timeout period is set by the time constant formed by R22, C12 and R18, and by the reference level set by the R20-R21 voltage divider. This time can be adjusted to 1 mS, −0.00 mS as dictated by the requirements of the software used by the controller/processor.


Power for the debris sensor circuit is provided by U3 and associated components. U3 is a low power linear regulator that provides a 5-volt output. The unregulated voltage from the robot's onboard battery provides the power input.


When required, circuit adjustments can be made via R14 and R12. These adjustments allow the circuit response to be tuned in a short period of time.


In a production device of this kind, it is expected that power into, and signal out of the debris sensor circuit printed circuit board (PCB) will be transferred to the main board via shielded cable. Alternatively, noise filters may be substituted for the use of shielded cable, reducing the cost of wiring. The cable shield drain wire can be grounded at the sensor circuit PCB side only. The shield is not to carry any ground current. A separate conductor inside the cable will carry power ground. To reduce noise, the production sensor PCB should have all components on the topside with solid ground plane on the bottom side. The sensor PCB should be housed in a mounting assembly that has a grounded metal shield that covers the topside of the board to shield the components from radiated noise pick up from the robot's motors. The piezoelectric sensor disk can be mounted under the sensor circuit PCB on a suitable mechanical mounting system, such as that shown in FIG. 4, in order to keep the connecting leads as short as possible for noise immunity.


V. CONCLUSIONS

The invention provides a debris sensor that is not subject to degradation by accretion of debris, but is capable of instantaneously sensing and responding to debris strikes, and thus immediately responsive to debris on a floor or other surface to be cleaned, with reduced sensitivity to variations in airflow, instantaneous power, or other operational conditions of the cleaning device.


When employed as described herein, the invention enables an autonomous cleaning device to control its operation or select from among operational modes, patterns of movement or behaviors responsive to detected debris, for example, by steering the device toward “dirtier” areas based on signals generated by the debris sensor.


The debris sensor can also be employed in non-autonomous cleaning devices to control, select or vary operational modes of either an autonomous or non-autonomous cleaning apparatus.


In addition, the disclosed signal processing architecture and circuitry is particularly useful in conjunction with a piezoelectric debris sensor to provide high signal to noise ratios.


Those skilled in the art will appreciate that a wide range of modifications and variations of the present invention are possible and within the scope of the invention. The debris sensor can also be employed for purposes, and in devices, other than those described herein. Accordingly, the foregoing is presented solely by way of example, and the scope of the invention is limited solely by the appended claims.

Claims
  • 1. An autonomous cleaning apparatus comprising: a side brush;a main brush;a debris bin;a chassis carrying the side brush, the main brush, and the debris bin, the side brush operative to direct particulates from a cleaning surface toward the main brush, and the main brush operative to direct the particulates toward the debris bin;a piezoelectric sensor disposed between the main brush and the debris bin, wherein impact of the particulates moving from the side brush to the main brush toward the debris bin is detectable by the piezoelectric sensor;a drive system carried by the chassis and operative to move the chassis; anda control module coupled to the piezoelectric sensor and configured to control the drive system based on the detected impact of the particulates moving from the side brush to the main brush toward the debris bin.
  • 2. The autonomous cleaning apparatus of claim 1, wherein the chassis defines a perimeter and the side brush is arranged to entrain particulates outside the perimeter of the chassis.
  • 3. The autonomous cleaning apparatus of claim 1, the drive system operative to move the chassis along a fore-aft axis dividing the chassis into substantially symmetrical right and left halves, wherein the side brush is disposed on the right half or the left half of the chassis, along a forward portion of the chassis.
  • 4. The autonomous cleaning apparatus of claim 3, wherein the side brush is disposed on a dominant side of the cleaning apparatus.
  • 5. The autonomous cleaning apparatus of claim 3, wherein the main brush is on the right half and the left half of the chassis and rearward of the side brush.
  • 6. The autonomous cleaning apparatus of claim 3, wherein the chassis defines a vacuum inlet, and the main brush is disposed forward of the vacuum inlet.
  • 7. The autonomous cleaning apparatus of claim 3, wherein the debris bin is disposed rearward of the main brush and the debris bin is removable from the chassis.
  • 8. The autonomous cleaning apparatus of claim 3, wherein the drive system is configured to adjust movement of the cleaning apparatus based at least in part on a signal representative of debris impact on the piezoelectric sensor.
  • 9. The autonomous cleaning apparatus of claim 1, further comprising a counter-rotating flapper carried by the chassis, the counter-rotating flapper meeting the main brush such that the counter-rotating flapper and the main brush direct the particulates toward the debris bin.
  • 10. The autonomous cleaning apparatus of claim 1, wherein the main brush and the piezoelectric sensor define an axis therebetween and the main brush is operable to move the particulates along the axis in a direction toward the piezoelectric sensor.
  • 11. The autonomous cleaning apparatus of claim 1, wherein the piezoelectric sensor is above the main brush as the cleaning apparatus moves along the cleaning surface.
  • 12. The autonomous cleaning apparatus of claim 1, further comprising a processor operable to process a signal from the piezoelectric sensor indicative of a particulate strike to generate a signal representative of a quantity or volumetric parameter of the particulates.
  • 13. An autonomous cleaning apparatus comprising: a drive system comprising a right wheel assembly and a left wheel assembly, each wheel assembly comprising a respective motor;a side brush;a main brush;a debris bin;a chassis carrying the drive system, the side brush, the main brush, and the debris bin, the side brush operative to direct particulates from a cleaning surface toward the main brush, and the main brush operative to direct the particulates toward the debris bin;a control module configured to control the drive system; anda debris sensor disposed between the main brush and the debris bin, wherein the particulates directed by the side brush to the main brush toward the debris bin are detectable by the debris sensor, and the control module is configured to adjust movement of the cleaning apparatus based at least in part on a signal representative of the particulates detected by the debris sensor.
  • 14. The autonomous cleaning apparatus of claim 13, wherein a speed of one or both of the right and left wheel assemblies is changeable based at least in part on a signal representative of the particulates detected by the debris sensor.
  • 15. The autonomous cleaning apparatus of claim 13, wherein the drive system is configured to steer the chassis based at least in part on a signal representative of the particulates detected by the debris sensor.
  • 16. The autonomous cleaning apparatus of claim 13, further comprising a processor operable to process a signal from the debris sensor to generate a signal representative of a quantity or volumetric parameter of the particulates.
CROSS-REFERENCE TO RELATED PATENT DOCUMENTS

This application for patent is a continuation of, and claims priority from U.S. patent application Ser. No. 12/255,393, filed on Oct. 21, 2008, which is a continuation of and claims priority from U.S. patent application Ser. No. 11/860,272, filed on Sep. 24, 2007 (now U.S. Pat. No. 7,459,871), which is a continuation of and claims priority from U.S. patent application Ser. No. 11/533,294, filed Sep. 19, 2006 (now U.S. Pat. No. 7,288,912), which is a continuation of and claims priority from U.S. patent application Ser. No. 11/109,832 filed Apr. 19, 2005 (now abandoned), which is a continuation of and claims priority from U.S. patent application Ser. No. 10/766,303, filed Jan. 28, 2004 (now U.S. Pat. No. 6,956,348). This application is related to the following commonly-owned U.S. patent applications or patents, incorporated by reference as if fully set forth herein: U.S. patent application Ser. No. 09/768,773 filed Jan. 24, 2001, now U.S. Pat. No. 6,594,844, entitled Robot Obstacle Detection System; U.S. Provisional Patent Application Ser. No. 60/345,764 filed Jan. 3, 2002, entitled Cleaning Mechanisms for Autonomous Robot; U.S. patent application Ser. No. 10/056,804, filed Jan. 24, 2002, entitled Method and System for Robot Localization and Confinement; U.S. patent application Ser. No. 10/167,851 filed Jun. 12, 2002, entitled Method and System for Multi-Mode Coverage for an Autonomous Robot; U.S. patent application Ser. No. 10/320,729 filed Dec. 16, 2002, entitled Autonomous Floor-Cleaning Robot; and U.S. patent application Ser. No. 10/661,835 filed Sep. 12, 2003, entitled Navigational Control System for Robotic Device.

US Referenced Citations (560)
Number Name Date Kind
1755054 Darst Apr 1930 A
1780221 Buchmann Nov 1930 A
1970302 Gerhardt Aug 1934 A
2136324 John Nov 1938 A
2302111 Dow et al. Nov 1942 A
2770825 Pullen Nov 1956 A
3166138 Dunn Jan 1965 A
3375375 Robert et al. Mar 1968 A
3569727 Aggarwal et al. Mar 1971 A
3744586 Leinauer Jul 1973 A
3756667 Bombardier et al. Sep 1973 A
3809004 Leonheart May 1974 A
3845831 James Nov 1974 A
3863285 Hukuba Feb 1975 A
3888181 Kups Jun 1975 A
3989931 Phillips Nov 1976 A
4004313 Capra Jan 1977 A
4012681 Finger et al. Mar 1977 A
4198727 Farmer Apr 1980 A
4209254 Reymond et al. Jun 1980 A
D258901 Keyworth Apr 1981 S
4309758 Halsall et al. Jan 1982 A
4328545 Halsall et al. May 1982 A
4367403 Miller Jan 1983 A
4445245 Lu May 1984 A
4465370 Yuasa et al. Aug 1984 A
4477998 You Oct 1984 A
4482960 Pryor Nov 1984 A
4492058 Goldfarb et al. Jan 1985 A
D278732 Ohkado May 1985 S
4518437 Sommer May 1985 A
4534637 Suzuki et al. Aug 1985 A
4556313 Miller et al. Dec 1985 A
4575211 Matsumura et al. Mar 1986 A
4618213 Chen Oct 1986 A
4620285 Perdue Oct 1986 A
4624026 Olson et al. Nov 1986 A
4628454 Ito Dec 1986 A
4638445 Mattaboni Jan 1987 A
4644156 Takahashi et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4652917 Miller Mar 1987 A
4654492 Koerner et al. Mar 1987 A
4660969 Sorimachi et al. Apr 1987 A
4662854 Fang May 1987 A
D292223 Trumbull Oct 1987 S
4700301 Dyke Oct 1987 A
4703820 Reinaud Nov 1987 A
4710020 Maddox et al. Dec 1987 A
4733343 Yoneda et al. Mar 1988 A
4735138 Gawler et al. Apr 1988 A
4748833 Nagasawa Jun 1988 A
4769700 Pryor Sep 1988 A
D298766 Tanno et al. Nov 1988 S
4796198 Boultinghouse et al. Jan 1989 A
4806751 Abe et al. Feb 1989 A
4811228 Hyyppa Mar 1989 A
4813906 Matsuyama et al. Mar 1989 A
4817000 Eberhardt Mar 1989 A
4829442 Kadonoff et al. May 1989 A
4832098 Palinkas et al. May 1989 A
4851661 Everett Jul 1989 A
4855915 Dallaire Aug 1989 A
4857912 Everett et al. Aug 1989 A
4858132 Holmquist Aug 1989 A
4867570 Sorimachi et al. Sep 1989 A
4891762 Chotiros Jan 1990 A
4893025 Lee Jan 1990 A
4905151 Weiman et al. Feb 1990 A
4912643 Beirne Mar 1990 A
4919489 Kopsco Apr 1990 A
4920060 Parrent et al. Apr 1990 A
4954962 Evans et al. Sep 1990 A
4955714 Stotler et al. Sep 1990 A
4971591 Raviv et al. Nov 1990 A
4977618 Allen Dec 1990 A
4977639 Takahashi et al. Dec 1990 A
4986663 Cecchi et al. Jan 1991 A
5001635 Yasutomi et al. Mar 1991 A
5012886 Jonas et al. May 1991 A
5018240 Holman May 1991 A
5020186 Lessig et al. Jun 1991 A
5022812 Coughlan et al. Jun 1991 A
5024529 Svetkoff et al. Jun 1991 A
D318500 Malewicki et al. Jul 1991 S
5033291 Podoloff et al. Jul 1991 A
5040116 Evans et al. Aug 1991 A
5045769 Everett Sep 1991 A
5049802 Mintus et al. Sep 1991 A
5051906 Evans et al. Sep 1991 A
5062819 Mallory Nov 1991 A
5070567 Holland Dec 1991 A
5084934 Lessig et al. Feb 1992 A
5090321 Abouav Feb 1992 A
5094311 Akeel Mar 1992 A
5105550 Shenoha Apr 1992 A
5109566 Kobayashi et al. May 1992 A
5127128 Lee Jul 1992 A
5136675 Hodson Aug 1992 A
5142985 Stearns et al. Sep 1992 A
5144471 Takanashi et al. Sep 1992 A
5152202 Strauss Oct 1992 A
5155684 Burke et al. Oct 1992 A
5163320 Goshima et al. Nov 1992 A
5164579 Pryor et al. Nov 1992 A
5165064 Mattaboni Nov 1992 A
5170352 McTamaney et al. Dec 1992 A
5173881 Sindle Dec 1992 A
5202742 Frank et al. Apr 1993 A
5206500 Decker et al. Apr 1993 A
5227985 DeMenthon Jul 1993 A
5276618 Everett Jan 1994 A
5277064 Knigga et al. Jan 1994 A
5284452 Corona Feb 1994 A
5310379 Hippely et al. May 1994 A
5315227 Pierson et al. May 1994 A
5341186 Kato Aug 1994 A
5341549 Wirtz et al. Aug 1994 A
5345649 Whitlow Sep 1994 A
5363305 Cox et al. Nov 1994 A
5363935 Schempf et al. Nov 1994 A
5369838 Wood et al. Dec 1994 A
5386862 Glover et al. Feb 1995 A
5399951 Lavallee et al. Mar 1995 A
5400244 Watanabe et al. Mar 1995 A
5435405 Schempf et al. Jul 1995 A
5442358 Keeler et al. Aug 1995 A
5446445 Bloomfield et al. Aug 1995 A
5451135 Schempf et al. Sep 1995 A
5471560 Allard et al. Nov 1995 A
5491670 Weber Feb 1996 A
5498948 Bruni et al. Mar 1996 A
5502638 Takenaka Mar 1996 A
5505072 Oreper Apr 1996 A
5510893 Suzuki Apr 1996 A
5511147 Abdel-Malek Apr 1996 A
5537711 Tseng Jul 1996 A
5542148 Young Aug 1996 A
5551525 Pack et al. Sep 1996 A
D375592 Ljunggren Nov 1996 S
5608306 Rybeck et al. Mar 1997 A
5608894 Kawakami et al. Mar 1997 A
5610488 Miyazawa Mar 1997 A
5613261 Kawakami et al. Mar 1997 A
5642299 Hardin et al. Jun 1997 A
5646494 Han Jul 1997 A
5647554 Ikegami et al. Jul 1997 A
5682839 Grimsley et al. Nov 1997 A
5698861 Oh Dec 1997 A
5710506 Broell et al. Jan 1998 A
5717169 Liang et al. Feb 1998 A
5722109 Delmas et al. Mar 1998 A
5732401 Conway Mar 1998 A
5745235 Vercammen et al. Apr 1998 A
5752871 Tsuzuki May 1998 A
5756904 Oreper et al. May 1998 A
5764888 Bolan et al. Jun 1998 A
5767437 Rogers Jun 1998 A
5767960 Orman Jun 1998 A
5777596 Herbert Jul 1998 A
5781697 Jeong Jul 1998 A
5786602 Pryor et al. Jul 1998 A
5793900 Nourbakhsh et al. Aug 1998 A
5812267 Everett et al. Sep 1998 A
5814808 Takada et al. Sep 1998 A
5819008 Asama et al. Oct 1998 A
5819360 Fujii Oct 1998 A
5819936 Saveliev et al. Oct 1998 A
5821730 Drapkin Oct 1998 A
5825981 Matsuda Oct 1998 A
5828770 Leis et al. Oct 1998 A
5831597 West et al. Nov 1998 A
5839532 Yoshiji et al. Nov 1998 A
5896611 Haaga Apr 1999 A
5905209 Oreper May 1999 A
5911260 Suzuki Jun 1999 A
5916008 Wong Jun 1999 A
5924167 Wright et al. Jul 1999 A
5933102 Miller et al. Aug 1999 A
5933913 Wright et al. Aug 1999 A
5940346 Sadowsky et al. Aug 1999 A
5968281 Wright et al. Oct 1999 A
5974348 Rocks Oct 1999 A
5974365 Mitchell Oct 1999 A
5983448 Wright et al. Nov 1999 A
5984880 Lander et al. Nov 1999 A
5987383 Keller et al. Nov 1999 A
5989700 Krivopal Nov 1999 A
5995883 Nishikado Nov 1999 A
5995884 Allen et al. Nov 1999 A
5996167 Close Dec 1999 A
5998971 Corbridge Dec 1999 A
6000088 Wright et al. Dec 1999 A
6009358 Angott et al. Dec 1999 A
6021545 Delgado et al. Feb 2000 A
6023813 Thatcher et al. Feb 2000 A
6030464 Azevedo Feb 2000 A
6030465 Marcussen et al. Feb 2000 A
6032542 Warnick et al. Mar 2000 A
6036572 Sze Mar 2000 A
6041472 Kasen et al. Mar 2000 A
6046800 Ohtomo et al. Apr 2000 A
6049620 Dickinson et al. Apr 2000 A
6052821 Chouly et al. Apr 2000 A
6055042 Sarangapani Apr 2000 A
6061868 Moritsch et al. May 2000 A
6065182 Wright et al. May 2000 A
6076026 Jambhekar et al. Jun 2000 A
6081257 Zeller Jun 2000 A
6088020 Mor Jul 2000 A
6094775 Behmer Aug 2000 A
6099091 Campbell Aug 2000 A
6101671 Wright et al. Aug 2000 A
6108031 King et al. Aug 2000 A
6108067 Okamoto Aug 2000 A
6108076 Hanseder Aug 2000 A
6108269 Kabel Aug 2000 A
6108597 Kirchner et al. Aug 2000 A
6112143 Allen et al. Aug 2000 A
6122798 Kobayashi et al. Sep 2000 A
6125498 Roberts et al. Oct 2000 A
6131237 Kasper et al. Oct 2000 A
6146278 Kobayashi Nov 2000 A
6154279 Thayer Nov 2000 A
6154694 Aoki et al. Nov 2000 A
6160479 Åhlen et al Dec 2000 A
6167587 Kasper et al. Jan 2001 B1
6192548 Huffman Feb 2001 B1
6216307 Kaleta et al. Apr 2001 B1
6220865 Macri et al. Apr 2001 B1
6226830 Hendriks et al. May 2001 B1
6230362 Kasper et al. May 2001 B1
6237741 Guidetti May 2001 B1
6240342 Fiegert et al. May 2001 B1
6243913 Frank et al. Jun 2001 B1
6255793 Peless et al. Jul 2001 B1
6259979 Holmquist Jul 2001 B1
6261379 Conrad et al. Jul 2001 B1
6263539 Baig Jul 2001 B1
6263989 Won Jul 2001 B1
6272936 Oreper et al. Aug 2001 B1
6276478 Hopkins et al. Aug 2001 B1
6282526 Ganesh Aug 2001 B1
6283034 Miles Sep 2001 B1
6285778 Nakajima et al. Sep 2001 B1
6300737 Bergvall et al. Oct 2001 B1
6321337 Reshef et al. Nov 2001 B1
6327741 Reed Dec 2001 B1
6332400 Meyer Dec 2001 B1
6362875 Burkley Mar 2002 B1
6370453 Sommer Apr 2002 B2
6374155 Wallach et al. Apr 2002 B1
6374157 Takamura Apr 2002 B1
6381802 Park May 2002 B2
6388013 Saraf et al. May 2002 B1
6389329 Colens May 2002 B1
6401294 Kasper Jun 2002 B2
6408226 Byrne et al. Jun 2002 B1
6412141 Kasper et al. Jul 2002 B2
6415203 Inoue et al. Jul 2002 B1
6421870 Basham et al. Jul 2002 B1
6427285 Legatt et al. Aug 2002 B1
6431296 Won Aug 2002 B1
6437227 Theimer Aug 2002 B1
6438456 Feddema et al. Aug 2002 B1
6438793 Miner et al. Aug 2002 B1
6442476 Poropat Aug 2002 B1
6454036 Airey et al. Sep 2002 B1
D464091 Christianson Oct 2002 S
6457206 Judson Oct 2002 B1
6463368 Feiten et al. Oct 2002 B1
6465982 Bergvall et al. Oct 2002 B1
6473167 Odell Oct 2002 B1
6480762 Uchikubo et al. Nov 2002 B1
6491127 Holmberg et al. Dec 2002 B1
6493613 Peless et al. Dec 2002 B2
6502657 Kerrebrock et al. Jan 2003 B2
6504610 Bauer et al. Jan 2003 B1
6507773 Parker et al. Jan 2003 B2
D471243 Cioffi et al. Mar 2003 S
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6540607 Mokris et al. Apr 2003 B2
6548982 Papanikolopoulos et al. Apr 2003 B1
6553612 Dyson et al. Apr 2003 B1
6556722 Russell et al. Apr 2003 B1
6556892 Kuroki et al. Apr 2003 B2
6557104 Vu et al. Apr 2003 B2
D474312 Stephens et al. May 2003 S
6563130 Dworkowski et al. May 2003 B2
6572711 Sclafani et al. Jun 2003 B2
6584376 Van Kommer Jun 2003 B1
6586908 Petersson et al. Jul 2003 B2
6587573 Stam et al. Jul 2003 B1
6594551 McKinney et al. Jul 2003 B2
6594844 Jones Jul 2003 B2
D478884 Slipy et al. Aug 2003 S
6604021 Imai et al. Aug 2003 B2
6611734 Parker et al. Aug 2003 B2
6615885 Ohm Sep 2003 B1
6624744 Wilson et al. Sep 2003 B1
6625843 Kim et al. Sep 2003 B2
6629028 Paromtchik et al. Sep 2003 B2
6639659 Granger Oct 2003 B2
6658325 Zweig Dec 2003 B2
6658354 Lin Dec 2003 B2
6658692 Lenkiewicz et al. Dec 2003 B2
6661239 Ozick Dec 2003 B1
6662889 De Fazio et al. Dec 2003 B2
6668951 Won Dec 2003 B2
6687571 Byrne et al. Feb 2004 B1
6690993 Foulke et al. Feb 2004 B2
6697147 Ko et al. Feb 2004 B2
6711280 Stafsudd et al. Mar 2004 B2
6732826 Song et al. May 2004 B2
6737591 Lapstun et al. May 2004 B1
6741364 Lange et al. May 2004 B2
6756703 Chang Jun 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6764373 Osawa et al. Jul 2004 B1
6769004 Barrett Jul 2004 B2
6801015 Bertram et al. Oct 2004 B2
6832407 Salem et al. Dec 2004 B2
6836701 McKee Dec 2004 B2
6845297 Allard Jan 2005 B2
6856811 Burdue et al. Feb 2005 B2
6859010 Jeon et al. Feb 2005 B2
6859682 Naka et al. Feb 2005 B2
6860206 Rudakevych et al. Mar 2005 B1
6870792 Chiappetta Mar 2005 B2
6871115 Huang et al. Mar 2005 B2
6886651 Slocum et al. May 2005 B1
6888333 Laby May 2005 B2
6906702 Tanaka et al. Jun 2005 B1
6914403 Tsurumi Jul 2005 B2
6917854 Bayer Jul 2005 B2
6925357 Wang et al. Aug 2005 B2
6929548 Wang Aug 2005 B2
6940291 Ozick Sep 2005 B1
6957712 Song et al. Oct 2005 B2
6960986 Asama et al. Nov 2005 B2
6965211 Tsurumi Nov 2005 B2
6975246 Trudeau Dec 2005 B1
6980229 Ebersole Dec 2005 B1
6985556 Shanmugavel et al. Jan 2006 B2
6993954 George et al. Feb 2006 B1
7013527 Thomas et al. Mar 2006 B2
7027893 Perry et al. Apr 2006 B2
7030768 Wanie Apr 2006 B2
7031805 Lee et al. Apr 2006 B2
7032469 Bailey Apr 2006 B2
7054716 McKee et al. May 2006 B2
7057120 Ma et al. Jun 2006 B2
7057643 Iida et al. Jun 2006 B2
7065430 Naka et al. Jun 2006 B2
7066291 Martins et al. Jun 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7085623 Siegers Aug 2006 B2
7113847 Chmura et al. Sep 2006 B2
7142198 Lee Nov 2006 B2
7148458 Schell et al. Dec 2006 B2
7166983 Jung Jan 2007 B2
7171285 Kim et al. Jan 2007 B2
7173391 Jones et al. Feb 2007 B2
7174238 Zweig Feb 2007 B1
7193384 Norman et al. Mar 2007 B1
7196487 Jones et al. Mar 2007 B2
7211980 Bruemmer et al. May 2007 B1
7233122 Kim et al. Jun 2007 B2
7251853 Park et al. Aug 2007 B2
7275280 Haegermarck et al. Oct 2007 B2
7283892 Boillot et al. Oct 2007 B1
7328196 Peters Feb 2008 B2
7332890 Cohen et al. Feb 2008 B2
7360277 Moshenrose et al. Apr 2008 B2
7363108 Noda et al. Apr 2008 B2
7388879 Sabe et al. Jun 2008 B2
7389166 Harwig et al. Jun 2008 B2
7430462 Chiu et al. Sep 2008 B2
7441298 Svendsen et al. Oct 2008 B2
7444206 Abramson et al. Oct 2008 B2
7448113 Jones et al. Nov 2008 B2
7467026 Sakagami et al. Dec 2008 B2
7474941 Kim et al. Jan 2009 B2
7503096 Lin Mar 2009 B2
7555363 Augenbraun et al. Jun 2009 B2
7557703 Yamada et al. Jul 2009 B2
7571511 Jones et al. Aug 2009 B2
7578020 Jaworski et al. Aug 2009 B2
7603744 Reindle Oct 2009 B2
7617557 Reindle Nov 2009 B2
7620476 Morse et al. Nov 2009 B2
7636982 Jones et al. Dec 2009 B2
7663333 Jones et al. Feb 2010 B2
7706917 Chiappetta et al. Apr 2010 B1
7765635 Park Aug 2010 B2
7809944 Kawamoto Oct 2010 B2
7849555 Hahm et al. Dec 2010 B2
7853645 Brown et al. Dec 2010 B2
20010013929 Torsten Aug 2001 A1
20010020200 Das et al. Sep 2001 A1
20010025183 Shahidi Sep 2001 A1
20010037163 Allard Nov 2001 A1
20010043509 Green et al. Nov 2001 A1
20010045883 Holdaway et al. Nov 2001 A1
20010047895 De Fazio et al. Dec 2001 A1
20020011367 Kolesnik Jan 2002 A1
20020021219 Edwards Feb 2002 A1
20020027652 Paromtchik et al. Mar 2002 A1
20020036779 Kiyoi et al. Mar 2002 A1
20020081937 Yamada et al. Jun 2002 A1
20020095239 Wallach et al. Jul 2002 A1
20020097400 Jung et al. Jul 2002 A1
20020104963 Mancevski Aug 2002 A1
20020108209 Peterson Aug 2002 A1
20020112742 Bredo et al. Aug 2002 A1
20020113973 Ge Aug 2002 A1
20020120364 Colens Aug 2002 A1
20020124343 Reed Sep 2002 A1
20020153185 Song et al. Oct 2002 A1
20020159051 Guo Oct 2002 A1
20020166193 Kasper Nov 2002 A1
20020169521 Goodman et al. Nov 2002 A1
20020189871 Won Dec 2002 A1
20030009259 Hattori et al. Jan 2003 A1
20030024986 Mazz et al. Feb 2003 A1
20030025472 Jones et al. Feb 2003 A1
20030026472 Abe Feb 2003 A1
20030028286 Glenn et al. Feb 2003 A1
20030030399 Jacobs Feb 2003 A1
20030058262 Sato et al. Mar 2003 A1
20030067451 Tagg et al. Apr 2003 A1
20030124312 Autumn Jul 2003 A1
20030126352 Barrett Jul 2003 A1
20030146384 Logsdon et al. Aug 2003 A1
20030146739 Bertram et al. Aug 2003 A1
20030193657 Uomori et al. Oct 2003 A1
20030221114 Hino et al. Nov 2003 A1
20030229421 Chmura et al. Dec 2003 A1
20030229474 Suzuki et al. Dec 2003 A1
20030233171 Heiligensetzer Dec 2003 A1
20030233870 Mancevski Dec 2003 A1
20030233930 Ozick Dec 2003 A1
20040016077 Song et al. Jan 2004 A1
20040030451 Solomon Feb 2004 A1
20040030570 Solomon Feb 2004 A1
20040045117 Alowonle et al. Mar 2004 A1
20040049877 Jones et al. Mar 2004 A1
20040055163 McCambridge et al. Mar 2004 A1
20040074038 Im et al. Apr 2004 A1
20040083570 Song et al. May 2004 A1
20040085037 Jones et al. May 2004 A1
20040093122 Galibraith May 2004 A1
20040098167 Yi et al. May 2004 A1
20040113777 Matsuhira et al. Jun 2004 A1
20040117064 McDonald Jun 2004 A1
20040117846 Karaoguz et al. Jun 2004 A1
20040118998 Wingett et al. Jun 2004 A1
20040128028 Miyamoto et al. Jul 2004 A1
20040133316 Dean Jul 2004 A1
20040148419 Chen et al. Jul 2004 A1
20040148731 Damman et al. Aug 2004 A1
20040153212 Profio et al. Aug 2004 A1
20040181706 Chen et al. Sep 2004 A1
20040187249 Jones et al. Sep 2004 A1
20040196451 Aoyama Oct 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040210345 Noda et al. Oct 2004 A1
20040210347 Sawada et al. Oct 2004 A1
20040221790 Sinclair et al. Nov 2004 A1
20040255425 Arai et al. Dec 2004 A1
20050010330 Abramson et al. Jan 2005 A1
20050021181 Kim et al. Jan 2005 A1
20050022330 Park et al. Feb 2005 A1
20050067994 Jones et al. Mar 2005 A1
20050137749 Jeon et al. Jun 2005 A1
20050138764 Grey Jun 2005 A1
20050144751 Kegg et al. Jul 2005 A1
20050154795 Kuz et al. Jul 2005 A1
20050165508 Kanda et al. Jul 2005 A1
20050166354 Uehigashi Aug 2005 A1
20050166355 Tani Aug 2005 A1
20050171644 Tani Aug 2005 A1
20050192707 Park et al. Sep 2005 A1
20050204505 Kashiwagi Sep 2005 A1
20050211880 Schell et al. Sep 2005 A1
20050212478 Takenaka Sep 2005 A1
20050212680 Uehigashi Sep 2005 A1
20050212929 Schell et al. Sep 2005 A1
20050213082 DiBernardo et al. Sep 2005 A1
20050213109 Schell et al. Sep 2005 A1
20050217042 Reindle Oct 2005 A1
20050222933 Wesby Oct 2005 A1
20050229340 Sawalski et al. Oct 2005 A1
20050234595 Tani Oct 2005 A1
20050251292 Casey et al. Nov 2005 A1
20050255425 Pierson Nov 2005 A1
20050258154 Blankenship et al. Nov 2005 A1
20050288819 de Guzman Dec 2005 A1
20060000050 Cipolla et al. Jan 2006 A1
20060010638 Shimizu et al. Jan 2006 A1
20060025134 Cho et al. Feb 2006 A1
20060037170 Shimizu Feb 2006 A1
20060042042 Mertes et al. Mar 2006 A1
20060044546 Lewin et al. Mar 2006 A1
20060056677 Tani Mar 2006 A1
20060060216 Woo Mar 2006 A1
20060064828 Stein et al. Mar 2006 A1
20060087273 Ko et al. Apr 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060119839 Bertin et al. Jun 2006 A1
20060143295 Costa-Requena et al. Jun 2006 A1
20060146776 Kim Jul 2006 A1
20060190133 Konandreas et al. Aug 2006 A1
20060196003 Song et al. Sep 2006 A1
20060220900 Ceskutti et al. Oct 2006 A1
20060238157 Kim et al. Oct 2006 A1
20060238159 Jung Oct 2006 A1
20060241814 Jung Oct 2006 A1
20060253224 Tani et al. Nov 2006 A1
20060259212 Jeon Nov 2006 A1
20060259494 Watson et al. Nov 2006 A1
20060293787 Kanda et al. Dec 2006 A1
20070006404 Cheng et al. Jan 2007 A1
20070032904 Kawagoe et al. Feb 2007 A1
20070042716 Goodall et al. Feb 2007 A1
20070043459 Abbott et al. Feb 2007 A1
20070061041 Zweig Mar 2007 A1
20070096676 Im et al. May 2007 A1
20070114975 Cohen et al. May 2007 A1
20070150096 Yeh et al. Jun 2007 A1
20070157415 Lee et al. Jul 2007 A1
20070157420 Lee et al. Jul 2007 A1
20070179670 Chiappetta et al. Aug 2007 A1
20070226949 Hahm et al. Oct 2007 A1
20070234492 Svendsen et al. Oct 2007 A1
20070244610 Ozick et al. Oct 2007 A1
20070250212 Halloran et al. Oct 2007 A1
20070266508 Jones et al. Nov 2007 A1
20080007203 Cohen et al. Jan 2008 A1
20080039974 Sandin et al. Feb 2008 A1
20080052846 Kapoor et al. Mar 2008 A1
20080091304 Ozick et al. Apr 2008 A1
20080184518 Taylor Aug 2008 A1
20080281470 Gilbert et al. Nov 2008 A1
20080282494 Won et al. Nov 2008 A1
20080294288 Yamauchi Nov 2008 A1
20080307590 Jones et al. Dec 2008 A1
20090007366 Svendsen et al. Jan 2009 A1
20090049640 Lee et al. Feb 2009 A1
20090055022 Casey et al. Feb 2009 A1
20090102296 Greene et al. Apr 2009 A1
20090292393 Casey et al. Nov 2009 A1
20100011529 Won et al. Jan 2010 A1
20100107355 Won et al. May 2010 A1
20100257690 Jones et al. Oct 2010 A1
20100257691 Jones et al. Oct 2010 A1
20100263158 Jones et al. Oct 2010 A1
20100268384 Jones et al. Oct 2010 A1
20100312429 Jones et al. Dec 2010 A1
Foreign Referenced Citations (321)
Number Date Country
2003275566 Jun 2004 AU
2128842 Dec 1980 DE
3317376 Nov 1984 DE
3536907 Feb 1989 DE
3404202 Dec 1992 DE
199311014 Oct 1993 DE
4414683 Oct 1995 DE
4338841 Aug 1999 DE
19849978 Feb 2001 DE
10357636 Jul 2005 DE
102004041021 Aug 2005 DE
102005046813 Apr 2007 DE
198803389 Dec 1988 DK
265542 May 1988 EP
281085 Sep 1988 EP
307381 Jul 1990 EP
358628 May 1991 EP
437024 Jul 1991 EP
433697 Dec 1992 EP
479273 May 1993 EP
294101 Dec 1993 EP
554978 Mar 1994 EP
615719 Sep 1994 EP
861629 Sep 1998 EP
792726 Jun 1999 EP
930040 Oct 1999 EP
845237 Apr 2000 EP
1018315 Jul 2000 EP
1172719 Jan 2002 EP
1228734 Jun 2003 EP
1380246 Mar 2005 EP
1553472 Jul 2005 EP
1642522 Nov 2007 EP
2238196 Nov 2006 ES
2601443 Nov 1991 FR
702426 Jan 1954 GB
2128842 Apr 1986 GB
2213047 Aug 1989 GB
2225221 May 1990 GB
2284957 Jun 1995 GB
2267360 Dec 1995 GB
2300082 Sep 1999 GB
2404330 Jul 2005 GB
2417354 Feb 2006 GB
5257533 Oct 1933 JP
53021869 Feb 1978 JP
53110257 Sep 1978 JP
943901 Mar 1979 JP
57014726 Jan 1982 JP
57064217 Apr 1982 JP
59005315 Feb 1984 JP
59033511 Mar 1984 JP
59094005 May 1984 JP
59099308 Jul 1984 JP
59112311 Jul 1984 JP
59033511 Aug 1984 JP
59120124 Aug 1984 JP
59131668 Sep 1984 JP
59164973 Sep 1984 JP
59124917 Oct 1984 JP
2283343 Nov 1984 JP
59212924 Dec 1984 JP
59226909 Dec 1984 JP
60089213 May 1985 JP
60089213 Jun 1985 JP
60211510 Oct 1985 JP
60259895 Dec 1985 JP
61023221 Jan 1986 JP
61079912 May 1986 JP
61023221 Jun 1986 JP
SHO 61-130147 Aug 1986 JP
62074018 Apr 1987 JP
62070709 May 1987 JP
62164431 Oct 1987 JP
62263507 Nov 1987 JP
62263508 Nov 1987 JP
62189057 Dec 1987 JP
63079623 Apr 1988 JP
63158032 Jul 1988 JP
1162454 Jun 1989 JP
2006312 Jan 1990 JP
2026312 Jun 1990 JP
2283343 Nov 1990 JP
HEI 3-39126 Feb 1991 JP
3051023 Mar 1991 JP
HEI 3-173519 Jul 1991 JP
3197758 Aug 1991 JP
3201903 Sep 1991 JP
HEI 3-242710 Oct 1991 JP
4019586 Mar 1992 JP
4084921 Mar 1992 JP
04300516 Oct 1992 JP
5023269 Apr 1993 JP
5091604 Apr 1993 JP
05095879 Apr 1993 JP
5042076 Jun 1993 JP
5046246 Jun 1993 JP
5150827 Jun 1993 JP
5150829 Jun 1993 JP
5046239 Jul 1993 JP
5054620 Jul 1993 JP
5054620 Sep 1993 JP
5040519 Oct 1993 JP
5257527 Oct 1993 JP
5285861 Nov 1993 JP
6003251 Jan 1994 JP
HEI 6-38912 Feb 1994 JP
6026312 Apr 1994 JP
6137828 May 1994 JP
6293095 Oct 1994 JP
6105781 Dec 1994 JP
7059702 Mar 1995 JP
HEI 7-59702 Mar 1995 JP
7129239 May 1995 JP
7059702 Jun 1995 JP
7222705 Aug 1995 JP
7270518 Oct 1995 JP
7281742 Oct 1995 JP
7281752 Oct 1995 JP
7311041 Nov 1995 JP
7313417 Dec 1995 JP
7319542 Dec 1995 JP
HEI 7-334242 Dec 1995 JP
8000393 Jan 1996 JP
8016241 Jan 1996 JP
8016776 Feb 1996 JP
8063229 Mar 1996 JP
8083125 Mar 1996 JP
8089449 Apr 1996 JP
2520732 May 1996 JP
8123548 May 1996 JP
8152916 Jun 1996 JP
8256960 Oct 1996 JP
8263137 Oct 1996 JP
8286741 Nov 1996 JP
8286744 Nov 1996 JP
8322774 Dec 1996 JP
8335112 Dec 1996 JP
9043901 Feb 1997 JP
9044240 Feb 1997 JP
9047413 Feb 1997 JP
9066855 Mar 1997 JP
9145309 Jun 1997 JP
9160644 Jun 1997 JP
9179625 Jul 1997 JP
9179685 Jul 1997 JP
9192069 Jul 1997 JP
9204223 Aug 1997 JP
9206258 Aug 1997 JP
9233712 Sep 1997 JP
09251318 Sep 1997 JP
9251318 Sep 1997 JP
9265319 Oct 1997 JP
9269807 Oct 1997 JP
9269810 Oct 1997 JP
02555263 Nov 1997 JP
9319431 Dec 1997 JP
9319432 Dec 1997 JP
9319434 Dec 1997 JP
9325812 Dec 1997 JP
10055215 Feb 1998 JP
10117973 May 1998 JP
10118963 May 1998 JP
10165738 Jun 1998 JP
10177414 Jun 1998 JP
10214114 Aug 1998 JP
10228316 Aug 1998 JP
10240342 Sep 1998 JP
10260727 Sep 1998 JP
10295595 Nov 1998 JP
11015941 Jan 1999 JP
11065655 Mar 1999 JP
11085269 Mar 1999 JP
11102219 Apr 1999 JP
11102220 Apr 1999 JP
11162454 Jun 1999 JP
11174145 Jul 1999 JP
11175149 Jul 1999 JP
11178764 Jul 1999 JP
11178765 Jul 1999 JP
11212642 Aug 1999 JP
11213154 Aug 1999 JP
11248806 Sep 1999 JP
11282532 Oct 1999 JP
11282533 Oct 1999 JP
11295412 Oct 1999 JP
11346964 Dec 1999 JP
2000047728 Feb 2000 JP
2000056006 Feb 2000 JP
2000056831 Feb 2000 JP
2000066722 Mar 2000 JP
2000075925 Mar 2000 JP
10240343 May 2000 JP
20000275321 Oct 2000 JP
2000353014 Dec 2000 JP
20000353014 Dec 2000 JP
2001022443 Jan 2001 JP
2001067588 Mar 2001 JP
2001-87182 Apr 2001 JP
2001087182 Apr 2001 JP
2001121455 May 2001 JP
2001125641 May 2001 JP
2001216482 Aug 2001 JP
2001265439 Sep 2001 JP
2001289939 Oct 2001 JP
2001306170 Nov 2001 JP
2001320781 Nov 2001 JP
2002-165731 Jun 2002 JP
2002204769 Jul 2002 JP
2002247510 Aug 2002 JP
2002333920 Nov 2002 JP
2002360479 Dec 2002 JP
2002366227 Dec 2002 JP
2002369778 Dec 2002 JP
2003010076 Jan 2003 JP
2003010088 Jan 2003 JP
2003015740 Jan 2003 JP
2003028528 Jan 2003 JP
2003-47579 Feb 2003 JP
2003-50632 Feb 2003 JP
03375843 Feb 2003 JP
2003047579 Feb 2003 JP
2003052596 Feb 2003 JP
2003084994 Mar 2003 JP
2003-114719 Apr 2003 JP
2003167628 Jun 2003 JP
2003180587 Jul 2003 JP
2003186539 Jul 2003 JP
2003190064 Jul 2003 JP
2003241836 Aug 2003 JP
2003262520 Sep 2003 JP
2003285288 Oct 2003 JP
2003304992 Oct 2003 JP
2003310509 Nov 2003 JP
2003330543 Nov 2003 JP
1020043088 Jan 2004 JP
2004123040 Apr 2004 JP
2004148021 May 2004 JP
2004160102 Jun 2004 JP
2004166968 Jun 2004 JP
2004174228 Jun 2004 JP
2004198330 Jul 2004 JP
2004219185 Aug 2004 JP
2005118354 May 2005 JP
2005135400 May 2005 JP
2005211360 Aug 2005 JP
2005224265 Aug 2005 JP
2005230032 Sep 2005 JP
2005245916 Sep 2005 JP
2005296511 Oct 2005 JP
2005346700 Dec 2005 JP
2005352707 Dec 2005 JP
2006043071 Feb 2006 JP
2006155274 Jun 2006 JP
2006164223 Jun 2006 JP
2006227673 Aug 2006 JP
2006247467 Sep 2006 JP
2006260161 Sep 2006 JP
2006293662 Oct 2006 JP
2006296697 Nov 2006 JP
2007034866 Feb 2007 JP
2007213180 Aug 2007 JP
04074285 Apr 2008 JP
2009015611 Jan 2009 JP
2010198552 Sep 2010 JP
1020010032583 Apr 2001 KR
WO9530887 Nov 1995 WO
WO9617258 Feb 1997 WO
W09853456 Nov 1998 WO
WO9905580 Feb 1999 WO
W09916078 Apr 1999 WO
W09959042 Nov 1999 WO
W00038029 Jun 2000 WO
WO0038028 Jun 2000 WO
WO0180703 Nov 2001 WO
WO0191623 Dec 2001 WO
WO02067752 Sep 2002 WO
WO02069774 Sep 2002 WO
WO02075350 Sep 2002 WO
WO02081074 Oct 2002 WO
WO03015220 Feb 2003 WO
WO03024292 Mar 2003 WO
W002069775 May 2003 WO
WO03040546 May 2003 WO
WO03062850 Jul 2003 WO
WO03062852 Jul 2003 WO
WO2004004534 Jan 2004 WO
WO2004005956 Jan 2004 WO
W02004025947 May 2004 WO
WO2004043215 May 2004 WO
WO2004058028 Jul 2004 WO
WO2005006935 Jan 2005 WO
WO2005036292 Apr 2005 WO
WO2005055796 Jun 2005 WO
WO2005076545 Aug 2005 WO
WO2005077243 Aug 2005 WO
W02005083541 Sep 2005 WO
WO2005081074 Sep 2005 WO
WO2005082223 Sep 2005 WO
WO2005098475 Oct 2005 WO
WO2005098476 Oct 2005 WO
WO2006046400 May 2006 WO
WO2006073248 Jul 2006 WO
WO2007036490 May 2007 WO
WO2007065033 Jun 2007 WO
WO2007137234 Nov 2007 WO
Non-Patent Literature Citations (218)
Entry
Borges et al. “Optimal Mobile Robot Pose Estimation Using Geometrical Maps”, IEEE Transactions on Robotics and Automation, vol. 18, No. 1, pp. 87-94, Feb. 2002.
Braunstingl et al. “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995.
Bulusu, et al. “Self Configuring Localization systems: Design and Experimental Evaluation”, ACM Transactions on Embedded Computing Systems vol. 3 No. 1 pp. 24-60, 2003.
Caccia, et al. “Bottom-Following for Remotely Operated Vehicles”, 5th IFAC conference, Alaborg, Denmark, pp. 245-250 Aug. 1, 2000.
Chae, et al. “StarLITE: A new artificial landmark for the navigation of mobile robots”, http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005.
Chamberlin et al. “Team 1: Robot Locator Beacon System” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 17, 2006.
Champy "Physical management of IT assets in Data Centers using RFID technologies", RFID 2005 University, Oct. 12-14, 2005.
Chiri "Joystick Control for Tiny OS Robot", http://www.eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf, 12 pages, Aug. 8, 2002.
Christensen et al. “Theoretical Methods for Planning and Control in Mobile Robotics” 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 21-27, 1997.
Andersen et al., “Landmark based navigation strategies”, SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp.
Clerentin, et al. “A localization method based on two omnidirectional perception systems cooperation” Proc of IEEE International Conference on Robotics & Automation, San Francisco, CA vol. 2, pp. 1219-1224, Apr. 2000.
Corke "High Performance Visual servoing for robot end-point control", SPIE vol. 2056 Intelligent Robots and Computer Vision, 1993.
Cozman et al. “Robot Localization using a Computer Vision Sextant”, IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995.
D'Orazio, et al. “Model based Vision System for mobile robot position estimation”, SPIE vol. 2058 Mobile Robots VIII, pp. 38-49, 1992.
De Bakker, et al. “Smart PSD—array for sheet of light range imaging”, Proc. of SPIE vol. 3965, pp. 1-12, May 15, 2000.
Desaulniers, et al. “An Efficient Algorithm to find a shortest path for a car-like Robot”, IEEE Transactions on robotics and Automation vol. 11 No. 6, pp. 819-828, Dec. 1995.
Dorfmüller-Ulhaas “Optical Tracking From User Motion to 3D Interaction”, http://www.cg.tuwien.ac.at/research/publications/2002/Dorimueller-Ulhaas-thesis, 182 pages, 2002.
Dorsch, et al. “Laser Triangulation: Fundamental uncertainty in distance measurement”, Applied Optics, vol. 33 No. 7, pp. 1306-1314, Mar. 1, 1994.
Dudek, et al. “Localizing a Robot with Minimum Travel” Proceedings of the sixth annual ACM-SIAM symposium on Discrete algorithms, vol. 27 No. 2 pp. 583-604, Apr. 1998.
Dulimarta, et al. “Mobile Robot Localization in Indoor Environment”, Pattern Recognition, vol. 30, No. 1, pp. 99-111, 1997.
EBay “Roomba Timer -> Timed Cleaning- Floorvac Robotic Vacuum”, Cgi.ebay.com/ws/eBay|SAP|.dll?viewitem&category=43526&item=4375198387&rd=1, 5 pages, Apr. 2005.
Electrolux “Welcome to the Electrolux trilobite” www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F, 2 pages, Mar. 18, 2005.
Eren, et al. “Accuracy in position estimation of mobile robots based on coded infrared signal transmission”, Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995. IMTC/95. pp. 548-551, 1995.
Eren, et al. “Operation of Mobile Robots in a Structured Infrared Environment”, Proceedings. ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 19-21, 1997.
Barker, "Navigation by the Stars- Ben Barker 4th Year Project", PowerPoint presentation, pp. 1-20.
Becker, et al. “Reliable Navigation Using Landmarks” IEEE International Conference on Robotics and Automation, 0-7803-1965-6, pp. 401-406, 1995.
Benayad-Cherif, et al., “Mobile Robot Navigation Sensors” SPIE vol. 1831 Mobile Robots, VII, pp. 378-387, 1992.
Facchinetti, Claudio et al. “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation”, ICARCV '94, vol. 3 pp. 1694-1698, 1994.
Betke, et al., “Mobile Robot localization using Landmarks” Proceedings of the IEEE/RSJ/GI International Conference on Intelligent Robots and Systems '94 “Advanced Robotic Systems and the Real World” (IROS '94), Vol.
Facchinetti, Claudio et al. “Self-Positioning Robot Navigation Using Ceiling Images Sequences”, ACCV '95, 5 pages, Dec. 5-8, 1995.
Fairfield, Nathaniel et al. “Mobile Robot Localization with Sparse Landmarks”, SPIE vol. 4573 pp. 148-155, 2002.
Favre-Bulle, Bernard “Efficient tracking of 3D—Robot Position by Dynamic Triangulation”, IEEE Instrumentation and Measurement Technology Conference IMTC 98 Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 18-21, 1998.
Fayman “Exploiting Process Integration and Composition in the context of Active Vision”, IEEE Transactions on Systems, Man, and Cybernetics- Part C: Application and reviews, vol. 29 No. 1, pp. 73-86, Feb. 1999.
Florbot GE Plastics Image (1989-1990).
Franz, et al. “Biomimetric robot navigation”, Robotics and Autonomous Systems vol. 30 pp. 133-153, 2000.
Friendly Robotics “Friendly Robotics- Friendly Vac, Robotic Vacuum Cleaner”, www.friendlyrobotics.com/vac.htm. 5 pages Apr. 20, 2005.
Fuentes, et al. “Mobile Robotics 1994”, University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 7, 1994.
Bison, P. et al., “Using a structured beacon for cooperative position estimation” Robotics and Autonomous Systems vol. 29, No. 1, pp. 33-40, Oct. 1999.
Fukuda, et al. "Navigation System based on Ceiling Landmark Recognition for Autonomous mobile robot", 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems 95. 'Human Robot Interaction and Cooperative Robots', Pittsburgh, PA, pp. 1466-1471, Aug. 5-9, 1995.
Gionis “A hand-held optical surface scanner for environmental Modeling and Virtual Reality”, Virtual Reality World, 16 pages 1996.
Goncalves et al. “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005.
Gregg et al. “Autonomous Lawn Care Applications”, 2006 Florida Conference on Recent Advances in Robotics, FCRAR 2006, pp. 1-5, May 25-26, 2006.
Hamamatsu "Si PIN Diode S5980, S5981, S5870- Multi-element photodiodes for surface mounting", Hamamatsu Photonics, 2 pages, Apr. 2004.
Hammacher Schlemmer “Electrolux Trilobite Robotic Vacuum” www.hammacher.com/publish/71579.asp?promo=xsells, 3 pages, Mar. 18, 2005.
Haralick et al. “Pose Estimation from Corresponding Point Data”, IEEE Transactions on Systems, Man, and Cybernetics, vol. 19, No. 6, pp. 1426-1446, Nov. 1989.
Hausler “About the Scaling Behaviour of Optical Range Sensors”, Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 15-17, 1997.
Blaasvaer, et al. “AMOR—An Autonomous Mobile Robot Navigation System”, Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994.
Hoag, et al. “Navigation and Guidance in Interstellar Space”, ACTA Astronautica, vol. 2, pp. 513-533, Feb. 14, 1975.
Huntsberger et al. “CAMPOUT: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration”, IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, vol. 33, No. 5, pp. 550-559, Sep. 2003.
Iirobotics.com “Samsung Unveils Its Multifunction Robot Vacuum”, www.iirobotics.com/webpages/hotstuff.php?ubre=111, 3 pages, Mar. 18, 2005.
OnRobo “Samsung Unveils Its Multifunction Robot Vacuum”, www.onrobo.com/enews/0210/samsung—vacuum.shtml, 3 pages, Mar. 18, 2005.
Pages et al. “Optimizing Plane-to-Plane Positioning Tasks by Image-Based Visual Servoing and Structured Light”, IEEE Transactions on Robotics, vol. 22, No. 5, pp. 1000-1010, Oct. 2006.
Pages et al. “A camera-projector system for robot positioning by visual servoing”, Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW06), 8 pages, Jun. 17-22, 2006.
Pages, et al. “Robust decoupled visual servoing based on structured light”, 2005 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005.
Park et al. “A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors,” IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on Neural Networks, Orlando, Florida, pp. 2754-2758, Jun. 27-Jul. 2, 1994.
Park, et al. “Dynamic Visual Servo Control of Robot Manipulators using Neural Networks”, The Korean Institute of Telematics and Electronics, vol. 29-B, No. 10, pp. 771-779, Oct. 1992.
Paromtchik “Toward Optical Guidance of Mobile Robots”.
Paromtchik, et al. “Optical Guidance System for Multiple mobile Robots”, Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940 (May 21-26, 2001).
Penna, et al. “Models for Map Building and Navigation”, IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, No. 5, pp. 1276-1301, Sep./Oct. 1993.
Pirjanian “Reliable Reaction”, Proceedings of the 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 158-165, 1996.
Pirjanian “Challenges for Standards for consumer Robotics”, IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 12-15, 2005.
Pirjanian et al. “Distributed Control for a Modular, Reconfigurable Cliff Robot”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002.
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems”, Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 29-Nov. 3, 2001.
Pirjanian et al. “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination”, Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000.
Pirjanian et al. “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106, Nov. 8-9, 1999.
Pirjanian et al. “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes”, Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430, Apr. 1997.
Prassler et al., “A Short History of Cleaning Robots”, Autonomous Robots 9, 211-226, 2000, 16 pages.
Radio Frequency Identification: Tracking ISS Consumables, Author Unknown, 41 pages (NPL0127).
Remazeilles, et al. “Image based robot navigation in 3D environments”, Proc. of SPIE, Vol. 6052, pp. 1-14, Dec. 6, 2005.
Rives, et al. “Visual servoing based on ellipse features”, SPIE vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993.
Robotics World “A Clean Sweep”, Jan. 2001.
Ronnback “On Methods for Assistive Mobile Robots”, http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html, 218 pages, Jan. 1, 2006.
Roth-Tabak, et al. “Environment Model for mobile Robots Indoor Navigation”, SPIE vol. 1388 Mobile Robots pp. 453-463, 1990.
Sadath M Malik et al. “Virtual Prototyping for Conceptual Design of a Tracked Mobile Robot”, Electrical and Computer Engineering, Canadian Conference on, IEEE, pp. 2349-2352, May 1, 2006.
Sahin, et al. “Development of a Visual Object Localization Module for Mobile Robots”, 1999 Third European Workshop on Advanced Mobile Robots (Eurobot '99), pp. 65-72, 1999.
Salomon, et al. “Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing”, IEEE Conference on Emerging Technologies and Factory Automation, 2006. (ETFA '06), pp. 629-632, Sep. 20-22, 2006.
Sato “Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter”, Proceedings International Conference on Image Processing, vol. 1., Lausanne, Switzerland, pp. 33-36, Sep. 16-19, 1996.
Schenker, et al. “Lightweight rovers for Mars science exploration and sample return”, Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997.
Sebastian Thrun, Learning Occupancy Grid Maps With Forward Sensor Models, School of Computer Science, Carnegie Mellon University, pp. 1-28.
Shimoga et al. “Touch and Force Reflection for Telepresence Surgery”, Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994.
Sim, et al “Learning Visual Landmarks for Pose Estimation”, IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 10-15, 1999.
Sobh et al. “Case Studies in Web-Controlled Devices and Remote Manipulation”, Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 10, 2001.
Stella, et al. “Self-Location for Indoor Navigation of Autonomous Vehicles”, Part of the SPIE conference on Enhanced and Synthetic Vision SPIE, vol. 3364 pp. 298-302, 1998.
Summet “Tracking Locations of Moving Hand-held Displays Using Projected Light”, Pervasive 2005, LNCS 3468 pp. 37-46 (2005).
Svedman et al. “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping”, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2993-2998, 2005.
Takio et al. “Real-Time Position and Pose Tracking Method of Moving Object Using Visual Servo System”, 47th IEEE International Symposium on Circuits and Systems, pp. 167-170, 2004.
Teller “Pervasive pose awareness for people, Objects and Robots”, http://www.ai.mit.edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 6 pages, Apr. 30, 2003.
Terada et al. “An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning”, 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 21-23, 1998.
The Sharper Image “e-Vac Robotic Vacuum, S1727 Instructions”, www.sharperimage.com, 18 pages.
The Sharper Image “Robotic Vacuum Cleaner—Blue” www.Sharperimage.com, 2 pages, Mar. 18, 2005.
The Sharper Image “E Vac Robotic Vacuum”, www.sharperimage.com/us/en/templates/products/pipmorework1printable.jhtml, 2 pages, Mar. 18, 2005.
TheRobotStore.com “Friendly Robotics Robotic Vacuum RV400-The Robot Store”, www.therobotstore.com/s.nl/sc.9/category.-109/it.A/id.43/.f, 1 page, Apr. 20, 2005.
TotalVac.com RC3000 RoboCleaner website Mar. 18, 2005.
Trebi-Ollennu et al. “Mars Rover Pair Cooperatively Transporting a Long Payload”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 3136-3141, May 2002.
Tribelhorn et al., “Evaluating the Roomba: a low-cost, ubiquitous platform for robotics research and education,” 2007, IEEE, pp. 1393-1399.
Tse et al. “Design of a Navigation System for a Household Mobile Robot Using Neural Networks”, Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
UAMA (Asia) Industrial Co., Ltd. “RobotFamily”, 2005.
Watanabe et al. “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique”, 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 13-18, 1990.
Watts “Robot, boldly goes where no man can”, The Times, p. 20, Jan. 1985.
Wijk et al. “Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking”, IEEE Transactions on Robotics and Automation, vol. 16, No. 6, pp. 740-752, Dec. 2000.
Wolf et al. “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 359-365, May 2002.
Wolf et al. “Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization”, IEEE Transactions on Robotics, vol. 21, No. 2, pp. 208-216, Apr. 2005.
Wong “ED Online >> Robot Business”, ED Online ID# 13114, 17 pages, Jul. 2006.
Yamamoto et al. “Optical Sensing for Robot Perception and Localization”, 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 14-17, 2005.
Yata et al. “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer”, Proceedings of the 1998 IEEE, International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998.
Yun, et al. “Image-Based Absolute Positioning System for Mobile Robot Navigation”, IAPR International Workshops SSPR, Hong Kong, pp. 261-269, Aug. 17-19, 2006.
Yun, et al. “Robust Positioning a Mobile Robot with Active Beacon Sensors”, Lecture Notes in Computer Science, 2006, vol. 4251, pp. 890-897, 2006.
Yuta, et al. “Implementation of an Active Optical Range Sensor Using Laser Slit for In-Door Intelligent Mobile Robot”, IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS 91), vol. 1, Osaka, Japan, pp. 415-420, Nov. 3-5, 1991.
Zha et al. “Mobile Robot Localization Using Incomplete Maps for Change Detection in a Dynamic Environment”, Advanced Intelligent Mechatronics '97. Final Program and Abstracts., IEEE/ASME International Conference, pp. 110, Jun. 16-20, 1997.
Zhang, et al. “A Novel Mobile Robot Localization Based on Vision”, SPIE vol. 6279, 6 pages, Jan. 29, 2007.
Euroflex Intelligente Monstre Manuale (English excerpt only).
Roboking—not just a vacuum cleaner, a robot! Jan. 21, 2004, 5 pages.
SVET Computers—New Technologies—Robot vacuum cleaner, 1 page.
Popco.net Make your digital life http://www.popco.net/zboard/view.php?id=tr—review&no=40 accessed Nov. 1, 2011.
Matsumura Camera Online Shop http://www.rakuten.co.jp/matsucame/587179/711512/ accessed Nov. 1, 2011.
Dyson's Robot Vacuum Cleaner—the DC06, May 2, 2004, http://www.gizmag.com/go/1282/ accessed Nov. 11, 2011.
Electrolux Trilobite, http://www.electrolux-ui.com:8080/2002%5C822%5C833102EN.pdf 10 pages.
Electrolux Trilobite, Time to enjoy life, 38 pages http://www.robocon.co.kr/trilobite/Presentation—Ttilobite—Kor—030104. ppt accessed Dec. 22, 2011.
Facts on the Trilobite http://www.frc.ri.cmu.edu/˜hpm/talks/Extras/trilobite.desc.html 2 pages accessed Nov. 1, 2011.
Euroflex Jan. 1, 2006 http://www.euroflex.tv/novita—dett.php?id=15 1 page accessed Nov. 1, 2011.
FloorBotics, VR-8 Floor Cleaning Robot, Product Description for Manufacturers, http://www.consensus.com.au/SoftwareAwards/CSAarchive/CSA2004/CSAart04/FloorBot/F, 2004.
Friendly Robotics, 18 pages http://www.robotsandrelax.com/PDFs/RV400Manual.pdf accessed Dec. 22, 2011.
It's eye, 2003 www.hitachi.co.jp/rd/pdf/topics/hitac2003—10.pdf 2 pages.
Hitachi, May 29, 2003 http://www.hitachi.co.jp/New/cnews/hl—030529—hl—030529.pdf 8 pages.
Robot Buying Guide, LG announces the first robotic vacuum cleaner for Korea, Apr. 21, 2003 http://robotbg.com/news/2003/04122/lg—announces—the—first—robotic—vacu.
CleanMate 365, Intelligent Automatic Vacuum Cleaner, Model No. QQ-1, User Manual www.metapo.com/support/user—manual.pdf 11 pages.
UBOT, cleaning robot capable of wiping with a wet duster, http://us.aving.net/news/view.php?articleId=23031, 4 pages accessed Nov. 1, 2011.
Taipei Times, Robotic vacuum by Matsushita about to undergo testing, Mar. 26, 2002, http://www.taipeitimes.com/News/worldbiz/archives/2002/03/2610000129338 accessed.
Tech-on! http://techon.nikkeibp.co.jp/members/01db/200203110065011, 4 pages, accessed Nov. 1, 2011.
http://ascii.jp/elem10001000133013300241.
IT media http://www.itmedia.co.jp/news/0111116/robofesta—m.html accessed Nov. 1, 2011.
Yujin Robotics, an intelligent cleaning robot ‘iclebo Q’ AVING USA http://us.aving.net/news/view.php?articleId=7257, 8 pages accessed Nov. 4, 2011.
Special Reports, Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone, vol. 59, No. 9 (2004), 3 pages, http://www.toshiba.co.jp/tech/review/2004/09/59—0, 2004.
Toshiba Corporation 2003, http://warp.ndl.go.jp/info:ndljp/pid/258151/www.soumu.go.jp/joho—tsusin/policyreports/chousa/netrobot/pdf/030214—1—33—a.pdf 16 pages, 2003.
http://www.karcher.de/versions/intg/assets/video/2—4—robo—en.swf. Accessed Sep. 25, 2009.
McLurkin “The Ants: A community of Microrobots”, Paper submitted for requirements of BSEE at MIT, May 12, 1995.
Grumet “Robots Clean House”, Popular Mechanics, Nov. 2003.
McLurkin Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots, Paper submitted for requirements of BSEE at MIT, May 2004.
Kurs et al, Wireless Power transfer via Strongly Coupled Magnetic Resonances, Downloaded from www.sciencemag.org , Aug. 17, 2007.
Karcher RC 3000 Cleaning Robot-user manual Manufacturer: Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred Karcher-Str 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002.
Karcher RoboCleaner RC 3000 Product Details, webpages: “http://www.robocleaner.de/english/screen3.html” through “. . . screen6.html”, Dec. 12, 2003, 4 pages.
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-usa.com/showproducts.php?op=viewprod&paraml=143&param2=&param3=, accessed Mar. 18, 2005, 6 pages.
Koolvac Robotic Vacuum Cleaner Owner's Manual, Koolatron, Undated, 26 pages.
Put Your Roomba . . . On “Automatic” webpages: “http://www.acomputeredge.com/roomba,” accessed Apr. 20, 2005, 5 pages.
eVac Robotic Vacuum S1727 Instruction Manual, Sharper Image Corp, Copyright 2004, 16 pages.
Friendly Robotics Robotic Vacuum RV400—The Robot Store website: www.therobotstore.com/s.nl/sc.9/category,-109/it.A/id.43/.f, accessed Apr. 20, 2005, 5 pages.
Hitachi “Feature”, http://kadenfan.hitachi.co.jp/robot/feature/feature.html , 1 page Nov. 19, 2008.
Jarosiewicz et al. “Final Report—Lucid”, University of Florida, Department of Electrical and Computer Engineering, EEL 5666—Intelligent Machine Design Laboratory, 50 pages, Aug. 4, 1999.
Jensfelt, et al. “Active Global Localization for a mobile robot using multiple hypothesis tracking”, IEEE Transactions on Robots and Automation vol. 17, No. 5, pp. 748-760, Oct. 2001.
Jeong, et al. “An intelligent map-building system for indoor mobile robot using low cost photo sensors”, SPIE vol. 6042 6 pages, 2005.
Kahney, “Robot Vacs are in the House,” www.wired.com/news/technology/o,1282,59237,00.html, 6 pages, Jun. 18, 2003.
Karcher “Product Manual Download Karch”, www.karcher.com, 17 pages, 2004.
Karcher “Karcher RoboCleaner RC 3000”, www.robocleaner.de/english/screen3.html, 4 pages, Dec. 12, 2003.
Karcher USA “RC 3000 Robotics cleaner”, www.karcher-usa.com, 3 pages, Mar. 18, 2005.
Karlsson et al., The vSLAM Algorithm for Robust Localization and Mapping, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005.
Karlsson, et al Core Technologies for service Robotics, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROSA 2004), vol. 3, pp. 2979-2984, Sep. 28-Oct. 2, 2004.
King “Helpmate™—Autonomous Mobile Robots Navigation Systems”, SPIE vol. 1388 Mobile Robots, pp. 190-198, 1990.
Kleinberg, The Localization Problem for Mobile Robots, Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994.
Knight, et al., “Localization and Identification of Visual Landmarks”, Journal of Computing Sciences in Colleges, vol. 16 Issue 4, 2001 pp. 312-313, May 2001.
Kolodko et al. “Experimental System for Real-Time Motion Estimation”, Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003.
Komoriya et al., Planning of Landmark Measurement for the Navigation of a Mobile Robot, Proceedings of the 1992 IEEE/RSJ International Cofnerence on Intelligent Robots and Systems, Raleigh, NC pp. 1476-1481, Jul. 7-10, 1992.
Koolatron “KOOLVAC—Owner's Manual”, 13 pages.
Krotov, et al. “Digital Sextant”, Downloaded from the internet at: http://www.cs.cmu.edu/˜epk/ , 1 page, 1995.
Krupa et al. “Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing”, IEEE Transactions on Robotics and Automation, vol. 19, No. 5, pp. 842-853, Oct. 5, 2003.
Kuhl, et al. “Self Localization in Environments using Visual Angles”, VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004.
Kurth, “Range-Only Robot Localization and SLAM with Radio”, http://www.ri.cmu.edu/pub—files/pub4/kurth—derek—2004—1/kurth—derek—2004—1.pdf. 60 pages, May 2004.
Lambrinos, et al. “A mobile robot employing insect strategies for navigation”, http://www8.cs.umu.se/kurser/TDBD17/VT04/dl/Assignment%20Papers/lambrinos-RAS-2000.pdf, 38 pages, Feb. 19, 1999.
Lang et al. “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994.
Lapin, “Adaptive position estimation for an automated guided vehicle”, SPIE vol. 1831 Mobile Robots VII, pp. 82-94, 1992.
LaValle et al. “Robot Motion Planning in a Changing, Partially Environment”, 1994 IEEE International Symposium on Intelligent Control, Columbus, OH, pp. 261-266, Aug. 16-18, 1994.
Lee, et al. “Localization Of a Mobile Robot Using the Image of a Moving Object”, IEEE Transaction on Industrial Electronics, vol. 50, No. 3 pp. 612-619, Jun. 2003.
Lee, et al. “Development of Indoor Navigation System for Humanoid Robot Using Multi-sensors Integration”, ION NTM, San Diego, CA, pp. 798-805, Jan. 22-24, 2007.
Leonard, et al. “Mobile Robot Localization by tracking Geometric Beacons”, IEEE Transaction on Robotics and Automation, vol. 7, No. 3 pp. 376-382, Jun. 1991.
Li et al. “Robust Statistical Methods for Securing Wireless Localization in Sensor Networks”, Wireless Information Network Laboratory, Rutgers University.
Li et al. “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar”, Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999.
Lin, et al. “Mobile Robot Navigation Using Artificial Landmarks”, Journal of Robotic Systems 14(2), pp. 93-106, 1997.
Linde, Dissertation “On Aspects of Indoor Localization”, https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 28, 2006.
Lumelsky, et al. “An Algorithm for Maze Searching with Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA vol. 1, pp. 111-116, 1994.
Luo et al., “Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots,” 2002, IEeE, p. 2359-2364.
Ma “Thesis: Documentation On Northstar”, California Institute of Technology, 14 pages, May 17, 2006.
Madsen, et al. “Optimal landmark selection for triangulation of robot position”, Journal of Robotics and Autonomous Systems vol. 13 pp. 277-292, 1998.
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591 pp. 25-30.
Matsutek Enterprises Co. Ltd “Automatic Rechargeable Vacuum Cleaner”, http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10..., Apr. 23, 2007.
McGillem, et al. “Infra-red Location System for Navigation and Autonomous Vehicles”, 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 24-29, 1988.
McGillem, et al. “A Beacon Navigation Method for Autonomous Vehicles”, IEEE Transactions on Vehicular Technology, vol. 38, No. 3, pp. 132-139, Aug. 1989.
Michelson “Autonomous Navigation”, 2000 Yearbook of Science & Technology, McGraw-Hill, New York, ISBN 0-07-052771-7, pp. 28-30, 1999.
Miro, et al. “Towards Vision Based Navigation in Large Indoor Environments”, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 2096-2102, Oct. 9-15, 2006.
MobileMag “Samsung Unveils High-tech Robot Vacuum Cleaner”, http://www.mobilemag.com/content/100/102/C2261/, 4 pages, Mar. 18, 2005.
Monteiro, et al. “Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters”, Proceedings of the IECON'93., International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 15-19, 1993.
Moore, et al. “A Simple Map-based Localization Strategy Using Range Measurements”, SPIE vol. 5804, pp. 612-620, 2005.
Munich et al. “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006.
Munich et al. “ERSP: A Software Platform and Architecture for the Service Robotics Industry”, Intelligent Robots and Systems, 2005. (IROS 2005), pp. 460-467, Aug. 2-6, 2005.
Nam, et al. “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999.
Nitu et al. “Optomechatronic System for Position Detection of a Mobile Mini-Robot”, IEEE Transactions on Industrial Electronics, vol. 52, No. 4, pp. 969-973, Aug. 2005.
On Robo “Robot Reviews Samsung Robot Vacuum (VC-RP3OW)”, www.onrobo.com/reviews/AT—Home/vacuum—cleaners/on0Ovcrb3Orosam/index.htm.. 2 pages, 2005.
InMach “Intelligent Machines”, www.inmach.de/inside.html, 1 page , Nov. 19, 2008.
Innovation First “2004 EDU Robot Controller Reference Guide”, http://www.ifirobotics.com, 13 pgs., Mar. 1, 2004.
Andersen et al., “Landmark based navigation strategies”, SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, Jan. 8, 1999.
Ascii, Mar. 25, 2002, http://ascii.jp/elem/000/000/330/330024/ accessed Nov. 1, 2011.
Certified U.S. Appl. No. 60/605,066 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. National Stage Entry U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filed Aug. 27, 2004.
Certified U.S. Appl. No. 60/605,181 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. National Stage Entry U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filed Aug. 27, 2004.
Derek Kurth, “Range-Only Robot Localization and SLAM with Radio”, http://www.ri.cmu.edu/pub—files/pub4/kurth—derek—2004—1/kurth—derek—2004—1.pdf. 60 pages, May 2004, accessed Jul. 27, 2012.
Electrolux Trilobite, Jan. 12, 2001, http://www.electrolux-ui.com:8080/2002%5C822%5C833102EN.pdf, accessed Jul. 2, 2012, 10 pages.
Florbot GE Plastics, 1989-1990, 2 pages, available at http://www.fuseid.com/, accessed Sep. 27, 2012.
Gregg et al., “Autonomous Lawn Care Applications,” 2006 Florida Conference on Recent Advances in Robotics, Miami, Florida, May 25-26, 2006, Florida International University, 5 pages.
Hitachi ‘Feature’, http://kadenfan.hitachi.co.jp/robot/feature/feature.html, 1 page, Nov. 19, 2008.
Hitachi, http://www.hitachi.co.jp/New/cnews/hi—030529—hi—030529.pdf , 8 pages, May 29, 2003.
Home Robot—UBOT; Microbotusa.com, retrieved from the WWW at www.microrobotusa.com, accessed Dec. 2, 2008.
King and Weiman, “Helpmate™ Autonomous Mobile Robots Navigation Systems,” SPIE vol. 1388 Mobile Robots, pp. 190-198 (1990).
Li et al. “Robust Statistical Methods for Securing Wireless Localization in Sensor Networks,” Information Processing in Sensor Networks, 2005, Fourth International Symposium on, pp. 91-98, Apr. 2005.
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591, pp. 25-30, Oct. 23, 2005.
Maschinenmarkt, Würzburg 105, Nr. 27, pp. 3, 30, Jul. 5, 1999.
Miwako Doi “Using the symbiosis of human and robots from approaching Research and Development Center,” Toshiba Corporation, 16 pages, available at http://warp.ndl.go.jp/info:ndljp/pid/258151/www.soumu.go.jp/joho—tsusin/policyreports/chousa/netrobot/pdf/030214—1—33—a.pdf, Feb. 26, 2003.
Paromtchik “Toward Optical Guidance of Mobile Robots,” Proceedings of the Fourth World Multiconference on Systemics, Cybernetics and Informatics, Orlando, FL, USA, Jul. 23, 2000, vol. IX, pp. 44-49, available at http://emotion.inrialpes.fr/˜paromt/infos/papers/paromtchik:asama:sci:2000.ps.gz, accessed Jul. 3, 2012.
Roboking—not just a vacuum cleaner, a robot!, Jan. 21, 2004, infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, accessed Oct. 10, 2011, 7 pages.
Sebastian Thrun, “Learning Occupancy Grid Maps With Forward Sensor Models,” Autonomous Robots 15, 111-127, Sep. 1, 2003.
SVET Computers—New Technologies—Robot Vacuum Cleaner, Oct. 1999, available at http://www.sk.rs/1999/10/sknt01.html, accessed Nov. 1, 2011.
Written Opinion of the International Searching Authority, PCT/US2004/001504, Aug. 20, 2012, 9 pages.
Related Publications (1)
Number Date Country
20120246862 A1 Oct 2012 US
Continuations (5)
Number Date Country
Parent 12255393 Oct 2008 US
Child 13523109 US
Parent 11860272 Sep 2007 US
Child 12255393 US
Parent 11533294 Sep 2006 US
Child 11860272 US
Parent 11109832 Apr 2005 US
Child 11533294 US
Parent 10766303 Jan 2004 US
Child 11109832 US