Obstacle following sensor scheme for a mobile robot

Abstract
A robot obstacle detection system including a robot housing which navigates with respect to a surface and a sensor subsystem aimed at the surface for detecting the surface. The sensor subsystem includes an emitter which emits a signal having a field of emission and a photon detector having a field of view which intersects the field of emission at a region. The subsystem detects the presence of an object proximate the mobile robot and determines a value of a signal corresponding to the object. It compares the value to a predetermined value, moves the mobile robot in response to the comparison, and updates the predetermined value upon the occurrence of an event.
Description
FIELD OF THE INVENTION

This invention relates to an obstacle detection system for an autonomous robot, such as an autonomous cleaning robot.


BACKGROUND OF THE INVENTION

There is a long felt need for autonomous robotic cleaning and processing devices for dusting, mopping, vacuuming, sweeping, lawn mowing, ice resurfacing, ice melting, and other operations. Although technology exists for complex robots which can, to some extent, “see” and “feel” their surroundings, the complexity, expense and power requirements associated with these types of robotic subsystems render them unsuitable for the consumer marketplace.


The assignee of the subject application has devised a less expensive, battery operated, autonomous cleaning robot which operates in various modes, including random bounce and wall-following modes. In the random bounce mode, the processing circuitry of the robot causes it to move in a straight line until the robot comes into contact with an obstacle; the robot then turns away from the obstacle and heads in a random direction. In the wall-following mode, the robot encounters a wall, follows it for a time, and then returns to the random mode. By using this combination of modes, the robot has been shown to cover the floor, including its edges, adequately and in an optimal time, resulting in power savings.


Unfortunately, however, presently available sensor subsystems, such as sonar sensors, for detecting obstacles on or in the floor or for detecting the wall in order to enter the wall-following mode (or to avoid bumping into the wall) are too complex, too expensive, or both. Tactile sensors, for their part, are too inefficient to ensure that walls or other obstacles can be effectively followed at a predetermined distance.


Wall-following modes for autonomous robots are disclosed in International Publication No. WO 02/101477 A2, U.S. patent application Ser. No. 10/453,202 and U.S. Pat. No. 6,809,490, the disclosures of which are herein incorporated by reference in their entireties. In an embodiment of the system disclosed in the U.S. patent and application (and available commercially from iRobot Corporation as the ROOMBA® Robotic Floorvac), analog electronics (i.e., a comparator) are used to determine whether a sensor has detected the wall or not. The system is designed to follow along a wall at a predetermined distance to allow a cleaning mechanism (e.g., a side brush) to clean against a wall. In the ROOMBA® Robotic Floorvac, a mechanical shutter proximate the sensor can be manually adjusted by the user in order to make the robot follow an appropriate distance from the wall. This shutter is needed because the sensor is sensitive to the albedo of the wall. The manually adjusted shutter, while effective, detracts from the autonomous nature of mobile robots; thus, a fully independent wall-following scheme for a mobile robot is needed.


SUMMARY OF THE INVENTION

Accordingly, the control system of the present invention utilizes, in one embodiment, a synchronous detection scheme whose output is fed directly into an A/D port on a microprocessor of the robot. This allows sensor values, and not merely the presence or absence of a wall, to be used to control the robot. The synchronous detection algorithm also allows readings to be taken both with and without the sensor emitter powered, which allows the system to take into account ambient light.
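A minimal sketch of this synchronous, ambient-rejecting read might look as follows. The function names `read_adc` and `set_emitter` are hypothetical hardware hooks, not identifiers from the disclosure:

```python
def synchronous_read(read_adc, set_emitter, samples=4):
    """Return the emitter-induced signal with ambient light removed.

    read_adc() returns a raw A/D count; set_emitter(on) powers the
    IR emitter. Both are assumed hardware-access stubs.
    """
    total = 0
    for _ in range(samples):
        set_emitter(True)
        lit = read_adc()       # floor reflection plus ambient light
        set_emitter(False)
        dark = read_adc()      # ambient light only
        total += lit - dark    # the ambient term cancels
    return total / samples
```

Because each lit reading is paired with an unlit one, a steady ambient component subtracts out, and the microprocessor sees only the emitter-induced portion of the signal.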


In one aspect, the invention relates to a robot obstacle detection system that is simple in design, low cost, accurate, easy to implement, and easy to calibrate.


In an embodiment of the above aspect, such a robot detection system prevents an autonomous cleaning robot from driving off a stair or over an obstacle that is too high.


In another aspect, the invention relates to a robotic wall detection system that is low cost, accurate, easy to implement, and easy to calibrate.


In an embodiment of the above aspect, such a robot wall detection system effects smoother robot operation in the wall-following mode.


In yet another aspect, the invention relates to a sensor subsystem for a robot that consumes a minimal amount of power.


In still another aspect, the invention relates to a sensor subsystem that is unaffected by surfaces of different reflectivity or albedo.


Another aspect of the invention results from the realization that a low cost, accurate, and easy-to-implement system for either preventing an autonomous robot from driving off a stair or over an obstacle which is too high or too low and/or for more smoothly causing the robot to follow a wall for more thorough cleaning can be effected by intersecting the field of view of a detector with the field of emission of a directed beam at a predetermined region and then detecting whether the floor or wall occupies that region. If the floor does not occupy the predefined region, a stair or some other obstacle is present and the robot is directed away accordingly. If a wall occupies the region, the robot is first turned away from the wall and then turned back towards the wall at decreasing radii of curvature until the wall once again occupies the region of intersection to effect smoother robot operation in the wall-following mode.


One embodiment of the invention features an autonomous robot having a housing that navigates in at least one direction on a surface. A first sensor subsystem is aimed at the surface for detecting obstacles on the surface. A second sensor subsystem is aimed at least proximate the direction of navigation for detecting walls. Each subsystem can include an optical emitter which emits a directed beam having a defined field of emission and a photon detector having a defined field of view which intersects the field of emission of the emitter at a finite, predetermined region.


Another embodiment of the robot obstacle detection system of this invention features a robot housing which navigates with respect to a surface and a sensor subsystem having a defined relationship with respect to the housing and aimed at the surface for detecting the surface. The sensor subsystem can include an optical emitter which emits a directed beam having a defined field of emission and a photon detector having a defined field of view which intersects the field of emission of the emitter at a region. A circuit in communication with the detector then redirects the robot when the surface does not occupy the region to avoid obstacles.


In certain embodiments, there are a plurality of sensor subsystems spaced from each other on the housing of the robot and the circuit includes logic for detecting whether any detector has failed to detect a beam from an emitter.


In one embodiment, the robot includes a surface cleaning brush. Other embodiments attach to the robot a buffing brush for floor polishing, a wire brush for stripping paint from a floor, a sandpaper drum for sanding a surface, a blade for mowing grass, etc. The emitter typically includes an infrared light source and, consequently, the detector includes an infrared photon detector. A modulator connected to the infrared light source modulates the directed infrared light source beam at a predetermined frequency, with the photon detector tuned to that frequency. The emitter usually includes an emitter collimator about the infrared light source for directing the beam and the detector then further includes a detector collimator about the infrared photon detector. The emitter collimator and the detector collimator may be angled with respect to the surface to define a finite region of intersection.


One embodiment of the robot wall detection system in accordance with the invention includes a robot housing which navigates with respect to a wall and a sensor subsystem having a defined relationship with respect to the housing and aimed at the wall for detecting the presence of the wall. The sensor subsystem includes an emitter which emits a directed beam having a defined field of emission and a detector having a defined field of view which intersects the field of emission of the emitter at a region. A circuit in communication with the detector redirects the robot when the wall occupies the region.


In another embodiment, there are a plurality of sensor subsystems spaced from each other on the housing of the robot and the circuit includes logic for detecting whether any detector has detected a beam from an emitter.


The circuit includes logic which redirects the robot away from the wall when the wall occupies the region and back towards the wall when the wall no longer occupies the region of intersection, typically at decreasing radii of curvature until the wall once again occupies the region of intersection to effect smooth operation of the robot in the wall-following mode.


The sensor subsystem for an autonomous robot which rides on a surface in accordance with this invention includes an optical emitter which emits a directed optical beam having a defined field of emission, a photon detector having a defined field of view which intersects the field of emission of the emitter at a region and a circuit in communication with a detector for providing an output when an object is not present in the region.


If the object is the surface, the output from the circuit causes the robot to be directed to avoid an obstacle. If, on the other hand, the object is a wall, the output from the circuit causes the robot to be directed back towards the wall.


If the object is diffuse, at least one of the detector and the emitter may be oriented normal to the object. Also, an optional lens for the emitter and a lens for the detector control the size and/or shape of the region. A control system may be included and configured to operate the robot in a plurality of modes including an obstacle following mode, whereby said robot travels adjacent to an obstacle. Typically, the obstacle following mode comprises alternating between decreasing the turning radius of the robot as a function of distance traveled, such that the robot turns toward said obstacle until the obstacle is detected, and increasing the turning radius of the robot as a function of distance traveled, such that the robot turns away from said obstacle until the obstacle is no longer detected. In one embodiment, the robot operates in obstacle following mode for a distance greater than twice the work width of the robot and less than approximately ten times the work width of the robot. In one example, the robot operates in obstacle following mode for a distance greater than twice the work width of the robot and less than five times the work width of the robot.
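The alternating turning-radius behavior could be sketched as a small decision function. The starting radius, minimum radius, and shrink factor below are illustrative assumptions, not values from the disclosure:

```python
def next_arc(obstacle_detected, radius, r_start=1.0, r_min=0.05, shrink=0.8):
    """Return (direction, radius) for the robot's next arc segment.

    direction +1 arcs toward the obstacle, -1 arcs away from it.
    """
    if obstacle_detected:
        # obstacle occupies the sensor region: arc away, reset the radius
        return -1, r_start
    # obstacle lost: arc back toward it at an ever-tighter radius,
    # clamped so the robot never pivots in place
    return +1, max(radius * shrink, r_min)
```

Tightening the radius each step the obstacle stays undetected is what produces the smooth, scalloped path along the obstacle rather than abrupt direction reversals.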


In another aspect, the invention relates to a method for operating a mobile robot, the method including the steps of detecting the presence of an object proximate the mobile robot, sensing a value of a signal corresponding to the object, comparing the value to a predetermined value, moving the mobile robot in response to the comparison, and updating the predetermined value upon the occurrence of an event. In another embodiment, the updated predetermined value is based at least in part on a product of the predetermined value and a constant. In certain embodiments, the event may include a physical contact between the mobile robot and the object or may include when a scaled value is less than the predetermined value. In one embodiment, the scaled value is based at least in part on a product of the value and a constant. The step of moving the mobile robot may include causing the robot to travel toward the object, when the value is less than the predetermined value, and/or causing the robot to travel away from the object, when the value is greater than the predetermined value.
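One hedged sketch of this detect-compare-move-update cycle follows. The disclosure states only that the updated threshold is based on a product with a constant; the 0.9 constant and the "toward"/"away" labels are assumptions of this sketch:

```python
def follow_step(value, threshold, scale=0.9, contact=False):
    """One compare-move-update cycle: returns (move, new_threshold)."""
    # compare the sensed value to the predetermined value
    move = "toward" if value < threshold else "away"
    # event: physical contact, or the scaled value dropping below
    # the current threshold -- either triggers a threshold update
    if contact or value * scale < threshold:
        threshold = threshold * scale
    return move, threshold
```

Updating the threshold on these events lets the robot adapt to surfaces of differing albedo instead of relying on a fixed, factory-set comparison value.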


In other embodiments, the method includes conditioning the value of the signal corresponding to the object. The detection step of the method may also include a first detection at a first distance to the object, and a second detection at a second distance to the object. The detection step may include emitting at least one signal and/or measuring at least one signal with at least one sensor. Embodiments of the above aspect may average a plurality of signals or filter one or more signals. In certain embodiments, a plurality of sensors are disposed on the mobile robot in a predetermined pattern that minimizes a variation in object reflectivity. Other embodiments vary the power of at least one emitted signal and/or vary the sensitivity of at least one sensor.
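As one plausible form of the signal averaging and filtering mentioned above, a first-order low-pass filter might be applied to successive readings. The filter form and coefficient are choices of this sketch, not taken from the disclosure:

```python
def low_pass(samples, alpha=0.25):
    """Smooth a sequence of sensor readings with a first-order filter."""
    out, y = [], float(samples[0])
    for x in samples:
        y += alpha * (x - y)   # move a fraction of the way toward x
        out.append(y)
    return out
```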


In various embodiments of the above aspect, at least one emitted signal or detected signal includes light having at least one of a visible wavelength and an infrared wavelength. In other embodiments of the above aspect, at least one emitted signal or detected signal includes an acoustic wave having at least one of an audible frequency and an ultrasonic frequency. Other embodiments of the above aspect include a mobile robot, the robot having at least one infrared emitter and at least one infrared detector, wherein the infrared emitter and the infrared detector are oriented substantially parallel to each other. In certain embodiments, the signal value corresponds to at least one of a distance to the object and an albedo of the object.


In another aspect, the invention relates to a method for operating a mobile robot, the method including the steps of detecting a presence of an object proximate the mobile robot, detecting an absence of the object, moving the robot a predetermined distance in a predetermined first direction, and rotating the robot in a predetermined second direction about a fixed point. In certain embodiments of the above aspect, the predetermined distance corresponds at least in part to a distance from a sensor located on the robot to a robot wheel axis. In one embodiment, the first direction is defined at least in part by a previous direction of motion of the robot prior to detecting the absence of the object.


In alternative embodiments, the fixed point is a point between a first wheel of the robot and the object. In some embodiments, the first wheel is proximate the object. In other embodiments, rotating the robot may cease on the occurrence of an event, the event including detecting a presence of an object, contacting an object, or rotating the robot beyond a predetermined angle. An additional step of moving in a third direction is included in other embodiments.
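The detect-absence, drive, and pivot sequence of this aspect could be sketched as below. `drive`, `rotate`, and `detect` are hypothetical motion and sensor hooks; the step size and angle limit are illustrative:

```python
def corner_turn(sensor_to_axle, drive, rotate, detect,
                step_deg=5.0, max_deg=180.0):
    """After the wall disappears, drive forward by the sensor-to-
    wheel-axis distance, then pivot toward the vanished wall until
    it is re-detected or a maximum angle is exceeded."""
    drive(sensor_to_axle)          # move the predetermined distance
    angle = 0.0
    while angle < max_deg:
        rotate(step_deg)           # rotate about the fixed point
        angle += step_deg
        if detect():               # wall re-acquired: stop turning
            return angle
    return angle                   # rotated past the predetermined angle
```

Driving forward by the sensor-to-axle distance first brings the pivot point even with the corner, which is why the predetermined distance corresponds to that offset.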





BRIEF DESCRIPTION OF THE DRAWINGS

Other objects, features and advantages will occur to those skilled in the art from the following description of some embodiments of the invention and the accompanying drawings, in which:



FIG. 1 is a schematic view of a robot in accordance with one embodiment of the invention approaching a downward stair;



FIG. 2 is a schematic view of the robot of FIG. 1 approaching an upward stair;



FIG. 3 is a schematic view of the robot of FIG. 1 approaching an obstacle on a floor;



FIG. 4 is a schematic view showing the difference between the wall-following and random modes of travel of a robot in accordance with one embodiment of the invention;



FIG. 5A is a schematic view of a sensor subsystem in accordance with one embodiment of the invention;



FIG. 5B is a graph of signal strength versus distance for the sensor-detector configuration depicted in FIG. 5A;



FIG. 6A is a schematic view of a sensor subsystem in accordance with another embodiment of the invention;



FIG. 6B is a graph of signal strength versus distance for the sensor-detector configuration depicted in FIG. 6A;



FIG. 7 is a schematic view showing the field of emission of the emitter and the field of view of the detector of the sensor subsystem in accordance with one embodiment of the invention;



FIG. 8 is a three-dimensional schematic view showing a full overlap of the field of emission of the emitter and the field of view of the detector in accordance with one embodiment of the invention;



FIG. 9 is a three-dimensional schematic view showing the situation which occurs when there is a minimal overlap between the field of emission and the field of view of one embodiment of the sensor subsystem of the invention;



FIG. 10 is a series of views showing, from top to bottom, no overlap between the field of emission and the field of view and then a full overlap of the field of view over the field of emission;



FIG. 11 is a set of figures corresponding to FIG. 10 depicting the area of overlap for each of these conditions shown in FIG. 10;



FIG. 12 is a more detailed schematic view of the sensor subsystem according to one embodiment of the invention;



FIG. 13 is a schematic view of the sensor subsystem of FIG. 12 in place on or in a robot in accordance with one embodiment of the invention;



FIG. 14 is a schematic top view of the wall detection system in accordance with one embodiment of the invention in place on the shell or housing of a robot;



FIG. 15 is a schematic three dimensional view of the sensor system in accordance with another embodiment of the invention;



FIG. 16 is a flow chart depicting the primary steps associated with a logic which detects whether a cliff is present in front of the robot in accordance with one embodiment of the invention;



FIG. 17 is a flow chart depicting the primary steps associated with the wall-detection logic in accordance with one embodiment of the invention;



FIG. 18 is a bottom view of a cleaning robot in accordance with one embodiment of the invention configured to turn about curvatures of decreasing radii;



FIG. 19 is a schematic top view showing the abrupt turns made by a robot in the wall-following mode when the wall-following algorithm of an embodiment of the invention is not employed;



FIG. 20A is a view similar to FIG. 19 except that now the wall-following algorithm of one embodiment of the invention is employed to smooth out the path of the robotic cleaning device in the wall-following mode;



FIGS. 20B-20G depict a sequence wherein a mobile robot operates a wall-following, corner-turning algorithm in accordance with one embodiment of the invention;



FIG. 21A is a flow-chart illustration of the obstacle-following algorithm of an embodiment of the invention;



FIG. 21B is a flow-chart illustration of the obstacle-following algorithm of another embodiment of the invention;



FIG. 21C is a flow-chart illustration of the threshold-adjustment subroutine of the algorithm depicted in FIG. 21B;



FIG. 22 is a flow-chart illustration of an algorithm for determining when to exit the obstacle following mode;



FIG. 23 is a block diagram showing the various components associated with a robotic cleaning device;



FIG. 24 is a schematic three-dimensional view of a robotic cleaning device employing a number of cliff sensors and wall sensors in accordance with one embodiment of the invention;



FIG. 25 is a bottom view of one particular robotic cleaning device and the cliff sensors incorporated therewith in accordance with one embodiment of the invention;



FIG. 26 is a side view of the robot of FIG. 25, further incorporating wall-following sensors in accordance with one embodiment of the invention;



FIG. 27A is a circuit diagram for the detector circuit of one embodiment of the invention;



FIG. 27B is a circuit diagram for the detector circuit of another embodiment of the invention;



FIG. 28 is a circuit diagram for the oscillator circuit of one embodiment of the invention;



FIG. 29 is a circuit diagram for the power connection circuit of one embodiment of the invention;



FIG. 30 is a circuit diagram of a decoupling circuit of one embodiment of the invention;



FIG. 31 is a diagram of a connector used in one embodiment of the invention;



FIG. 32 is a diagram of another connector used in one embodiment of the invention;



FIG. 33 is a diagram of still another connector used in one embodiment of the invention;



FIG. 34 is a circuit diagram of a jumper used in one embodiment of the invention; and



FIG. 35 is a circuit diagram for a constant current source used in one embodiment of the invention.





DETAILED DESCRIPTION

Robotic cleaning device 10, FIG. 1, can be configured to dust, mop, vacuum, and/or sweep a surface such as a floor. Typically, robot 10 operates in several modes: random coverage, spiral, and a wall-following mode, as discussed in U.S. Pat. No. 6,809,490 and in the Background section above. In any mode, robot 10 may encounter downward stair 12 or another similar “cliff,” upward stair 14, FIG. 2, or another similar rise, and/or obstacle 16, FIG. 3. According to one specification, the robot must be capable of traversing obstacles less than ⅝″ above or below floor level. Therefore, robot 10 must avoid stairs 12 and 14 but traverse obstacle 16, which may be an extension cord, the interface between a rug and hard flooring, or a threshold between rooms.


As delineated in the Background of the Invention, presently available obstacle sensor subsystems useful in connection with robot 10 are too complex, too expensive, or both. Moreover, robot 10, depicted in FIG. 4, is designed to be inexpensive and to operate on battery power while thoroughly cleaning room 20 in several modes: a spiral mode (not shown), a wall-following mode as shown at 22 and 24, and a random bounce mode as shown at 26. In the wall-following mode, the robot follows the wall for a time. In the random bounce mode, the robot travels in a straight line until it bumps into an object; it then turns away from the obstacle through a random angle and continues in a straight line until the next object is encountered.


Accordingly, any obstacle sensor subsystem must be inexpensive, simple in design, and reliable; it must not consume too much power, and it must avoid certain obstacles while properly recognizing and traversing obstacles that do not pose a threat to the operation of the robot.


Although the following disclosure relates to cleaning robots, the invention hereof is not limited to such devices and may be useful in other devices or systems wherein one or more of the design criteria listed above are important.


In one embodiment, depicted in FIG. 5A, sensor subsystem 50 includes optical emitter 52, which emits a directed beam 54 having a defined field of emission, explained in more detail below. Sensor subsystem 50 also includes photon detector 56 having a defined field of view which intersects the field of emission of emitter 52 at or for a given region. Surface 58 may be a floor or a wall depending on the arrangement of sensor subsystem 50 with respect to the housing of the robot.


In general, for obstacle avoidance, circuitry is added to the robot and connected to detector 56 to redirect the robot when surface 58 does not occupy the region defining the intersection of the field of emission of emitter 52 and the field of view of detector 56. For wall-following, the circuitry redirects the robot when the wall occupies the region defined by the intersection of the field of emission of emitter 52 and the field of view of detector 56. Emitter collimator tube 60 forms directed beam 54 with a predefined field of emission and detector collimator tube 62 defines the field of view of the detector 56. In alternative embodiments, collimator tubes 60, 62 are not used.



FIG. 5A depicts one embodiment of the invention in which the emitter 52 and detector 56 are parallel to each other and perpendicular to the surface. This orientation makes the signal strength at the detector more dependent on the distance to the obstacle. However, in this orientation, the difference between a white or highly reflective surface far from subsystem 50 and a black or non-reflective surface closer to subsystem 50 cannot be easily detected by the control circuitry. Moreover, the effects of specular scattering are not always adequately compensated for when the beam from emitter 52 is directed normal to the plane of surface 58. Notwithstanding the foregoing, this parallel configuration of emitter 52 and detector 56 can be utilized advantageously with the wall-following mode depicted in FIGS. 21A and 21B, described in more detail below.


In another embodiment, depicted in FIG. 6A, emitter collimator 60′ and detector collimator 62′ are both angled with respect to surface 58 and with respect to each other as shown, which is intended to reduce the dependence of the signal strength on the wall reflectivity. In this way, the region 70, FIG. 7, in which the field of emission of emitter 52′ as shown at 72 and the field of view of detector 56′ as shown at 74 intersect, is finite to more adequately address specular scattering and surfaces of different reflectivity. In this design, the emitter is typically an infrared emitter and the detector is typically an infrared radiation detector. The infrared energy directed at the floor decreases rapidly as the sensor-to-floor distance increases, while the infrared energy received by the detector changes linearly with surface reflectivity. Note, however, that an angled relationship between the emitter and detector is not required for diffuse surfaces. Optional lenses 61 and 63 may also be employed to better control the size and/or shape of region 70.



FIGS. 5B and 6B are graphs comparing signal strength to distance from an object for the emitter/detector configurations depicted in FIGS. 5A and 6A, respectively. In FIG. 5B, depicting the relationship for the parallel configuration, the signal strength is proportional to the distance, approaching a linear relationship for all four surfaces tested (white cardboard, brown cardboard, bare wood, and clear plastic). Accordingly, sensors arranged to detect walls or other essentially vertical obstacles are well suited to the parallel configuration, as it allows surfaces farther away to be effectively detected.



FIG. 6B, on the other hand, depicts the relationship for the angled configuration of the emitter/detector. In this orientation, the signal strength falls off rapidly at a closer distance for all three surfaces tested (brown cardboard, white cardboard, and clear plastic), regardless of surface type. This occurs when the surface being detected is no longer present in the intersecting region of the emitter signal and detector field of view. Accordingly, the angled orientation and resulting overlap region are desirable for cliff detection subsystems, where a difference in surface height must be detected clearly so the robot can redirect accordingly and avoid damage. Although the parallel and angled configurations are better suited to wall and cliff detection, respectively, the invention contemplates using either configuration for either application.


The sensor subsystem is calibrated such that when floor or surface 58′, FIG. 8, is at the “normal” or expected distance with respect to the robot, there is a full or nearly full overlap between the field of emission of the emitter and the field of view of the detector as shown. When the floor or surface is too far away for the robot to successfully traverse an obstacle, there is no overlap, or only a minimal overlap, between the field of emission of the emitter and the field of view of the detector, as shown in FIG. 9. The emitter beam and the detector field of view are collimated such that they fully overlap only in a small region near the expected position of the floor. The detector threshold is then set so that the darkest available floor material is detected when the beam and the field of view fully overlap. As the robot approaches a cliff, the overlap decreases until the reflected intensity is below the preset threshold, which triggers cliff avoidance behavior. Highly reflective floor material delays the onset of cliff detection only slightly. By arranging the emitter and detector at 45° with respect to the floor, the region of overlap as a function of height is minimized. Equal incidence and reflection angles ensure that the cliff detector functions regardless of whether the floor material is specular or diffuse. The size of the overlap region can be selected by choosing the degree of collimation and the nominal distance to the floor. In this way, the logic interface between the sensor subsystem and the control circuitry of the robot is greatly simplified.


By tuning the system to simply redirect the robot when there is no detectable overlap, i.e., when the detector fails to output a signal, the logic interface required between the sensor subsystem and the control electronics (e.g., a microprocessor) is simple to design and requires little or no signal conditioning. The emitted IR beam may be modulated and the return beam filtered with a matching filter in order to provide robust operation in the presence of spurious signals, such as sunlight, IR-based remote control units, fluorescent lights, and the like. Conversely, for the wall sensor embodiment, the system is tuned to redirect the robot when there is a detectable overlap.
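In software terms, the modulation and matched-filter idea can be illustrated by correlating detector samples against the emitter's on/off pattern. This is a simplified digital stand-in for the analog filtering the text describes, not the disclosed circuit:

```python
def demodulate(samples, carrier):
    """Correlate detector samples against the emitter on/off pattern.

    A steady interferer such as sunlight contributes equally to the
    'on' and 'off' phases and therefore cancels in the difference.
    """
    on = sum(s for s, c in zip(samples, carrier) if c)
    off = sum(s for s, c in zip(samples, carrier) if not c)
    return on - off
```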



FIGS. 10-11 provide in graphical form an example of the differences in the area of overlap depending on the height (d) of the sensor subsystem from a surface. The field of emission of the emitter and the field of view of the detector were set to be equal and non-overlapping at a distance (d) of 1.3 inches and each was an ellipse 0.940 inches along the major diameter and 0.650 inches along the minor diameter. A full overlap occurred at d=0.85 inches where the resulting overlapping ellipses converge into a single ellipse 0.426 inches along the minor diameter and 0.600 inches along the major diameter. Those skilled in the art will understand how to adjust the field of emission and the field of view and the intersection region between the two to meet the specific design criteria of any robotic device in question. Thus, FIGS. 10 and 11 provide illustrative examples only.
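To illustrate how sharply the overlap area collapses as the two footprints separate, the closed-form overlap of two equal circles can be computed. The circular shape is an assumption made for tractability; the footprints described above are ellipses:

```python
import math

def circle_overlap_area(r, d):
    """Overlap area of two equal circles of radius r, centers d apart."""
    if d >= 2 * r:
        return 0.0                 # footprints fully separated
    if d <= 0:
        return math.pi * r * r     # footprints coincident
    # standard lens-area formula for two equal circles
    return (2 * r * r * math.acos(d / (2 * r))
            - (d / 2) * math.sqrt(4 * r * r - d * d))
```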


In one embodiment, as shown in FIG. 12, a 3 mm diameter plastic emitter collimator tube 82 and a 3 mm diameter plastic detector collimator tube 84 were placed in rectangular, 22 mm by 53 mm housing 80 of the sensor subsystem, 13.772 mm from the bottom of housing 80, which was flush with the bottom of the shell of the robot. The collimators 82, 84 may be either separate components or integrally formed in the robot housing. This configuration defined field-of-view and field-of-emission cones of 20°; the respective collimator tubes were angled 60° from each other and spaced 31.24 mm apart. This configuration defined a region of intersection between the field of emission and the field of view 29.00 mm long, beginning at the bottom of the robot.


In the design shown in FIG. 13, the sensor subsystem is shown integrated with robot shell or housing 90 with a wheel (not shown) which supports the bottom 92 of shell 90 one-half inch above surface or floor 94. The region of overlap of the field of view and the field of emission was 0.688 inches long, beginning 0.393 inches above the surface. Thus, if stair 96 has a drop greater than 0.393 inches, no signal will be output by the detector and the robot will be redirected accordingly. In one embodiment, the emitter includes an infrared light source and the detector includes an infrared photon detector, each disposed in a round plastic angled collimator. The emitter, however, may also be a laser or any other source of light.


For wall detection, emitter 102 and detector 100 are arranged as shown in FIG. 14. The optical axes of the emitter and detector are parallel to the floor on which the robot travels. The field of emission of the emitter and the field of view of the detector are both 22 degree cones. A three millimeter diameter tube produces a cone of this specification when the active element is mounted 0.604 inches from the open end as shown. The optical axes of the emitter and detector intersect at an angle of 80 degrees. The volume of intersection 103 occurs at a point about 2.6 inches ahead of the point of tangency between the robot shell 106 and the wall 104 when the robot is traveling parallel to the wall. The line bisecting the intersection of the optical axes of the emitter and detector is perpendicular to the wall. This ensures that reflections from specular walls are directed from the emitter into the detector.
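The stated collimator dimensions and cone angle can be cross-checked with simple geometry. The sketch below assumes an illustrative model (not spelled out in the text): an active element of the same diameter D as the tube, recessed a depth L behind the open end, whose extreme rays cross from one tube edge to the opposite edge, giving a full cone angle of 2·atan(D/L). The function name is hypothetical.

```python
import math

def collimator_cone_angle_deg(diameter_in, depth_in):
    # Full divergence cone of a collimator tube under the assumed model:
    # extreme rays cross from one edge of the element to the opposite
    # edge of the aperture, so the half-angle is atan(D / L).
    return math.degrees(2 * math.atan(diameter_in / depth_in))

MM_PER_INCH = 25.4
# 3 mm diameter tube, active element 0.604 inches from the open end:
cone_deg = collimator_cone_angle_deg(3 / MM_PER_INCH, 0.604)
# cone_deg comes out to roughly 22 degrees, matching the 22 degree
# cones specified for the wall sensor in the text
```

This suggests the 0.604-inch recess depth quoted in the text was chosen precisely to produce the 22 degree cones.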


In another embodiment, depicted in FIG. 15, detector 116 is positioned above emitter 112. Lens 118, with two areas of different curvature 115 and 114, focuses light from emitter 112 onto the same spot as the field of view of detector 116 at only one height above surface 120, so that if the height changes there is no overlap, or at least not a complete overlap, between the field of view of detector 116 and the emission from emitter 112 as defined by curvature areas 115 and 114. In this arrangement, the rapid change of reflected intensity with height is provided by focusing two lenses on a single spot. When the floor is in the nominal position relative to the sensor subsystem, the emitter places all its energy on a small spot. The detector is focused on the same spot. As the floor falls away from the nominal position, light reflected into the detector (now doubly out of focus) decreases rapidly. By carefully selecting the lens-to-floor distance and the focal lengths of the two lenses, it is possible for the emitter and detector to be located at different points yet have a common focus on the floor. Lenses may also be used in connection with the embodiments of FIGS. 5A-7 to better control the shape and/or size of the region of intersection.


The logic of the circuitry associated with the cliff sensor embodiment modulates the emitter at a frequency of several kilohertz and detects any signal from the detector, which is tuned to that frequency, step 150, FIG. 16. When a signal is not output by the detector, step 152, the expected surface is not present and no overlap is detected. In response, an avoidance algorithm is initiated, step 154, to cause the robot to avoid the interfering obstacle. When a reflected signal is detected, processing returns to step 150.
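The cliff-detection loop of FIG. 16 can be sketched as follows. This is a minimal illustration; the function and callback names are assumptions, not identifiers from the patent.

```python
def cliff_sensor_cycle(detect_reflection, run_avoidance):
    """One pass of the FIG. 16 cliff-detection logic (illustrative names).

    detect_reflection: returns True when the detector, tuned to the
    emitter's modulation frequency, reports a signal (step 150).
    run_avoidance: invoked when no reflection is seen, i.e. the expected
    surface is absent (step 154).
    """
    if not detect_reflection():      # step 152: no signal -> no overlap
        run_avoidance()              # step 154: avoid the cliff or obstacle
        return "avoid"
    return "continue"                # reflection detected: return to step 150
```

In use, the robot's main loop would call this once per modulation cycle while driving forward.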


In the wall detection mode, the logic of the circuitry associated with the sensor subsystem modulates the emitter and detects signals from the detector as before, step 170, FIG. 17, until a reflection is detected, step 172. A wall is then next to the robot, and the controlling circuitry causes the robot to turn away from the wall, step 174, and then turn back, step 176, until a reflection (the wall) is again detected, step 178. By continuously decreasing the radius of curvature of the robot, step 180, the path of the robot along the wall in the wall-following mode is made smoother.


As shown in FIG. 18, robot housing 200 includes three wheels 202, 204, and 206 and is designed to only move forward in the direction shown by vector 208. When a wall is first detected (step 172, FIG. 17), the robot turns away from the wall in the direction of vector 210 and then turns back towards the wall rotating first about radius R1 and then about radius R2 and then about smoothly decreasing radius points (steps 178-180, FIG. 17) until the wall is again detected. This discussion assumes the detector is on the right of robot housing 200.


As shown in FIG. 19, if only one constant radius of curvature was chosen, the robot's travel path along the wall would be a series of abrupt motions. In contrast, by continuously reducing the radius of curvature as the robot moves forward back to the wall in accordance with the subject invention, the robot's travel path along the wall is relatively smooth as shown in FIG. 20A.
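The decreasing-radius reacquisition arc of FIGS. 18-20A can be sketched as a simple polling loop. The function name, the multiplicative decay, and the parameter values are assumptions for illustration; the patent only specifies that the radius decreases continuously.

```python
def reacquire_wall(wall_detected, r_initial=1.0, decay=0.9, r_min=0.05):
    """Arc back toward the wall with a continuously shrinking turn radius
    (steps 178-180 of FIG. 17, sketched; parameter values assumed).

    wall_detected() is polled after each short arc segment; the radius is
    reduced each step so the turn tightens smoothly instead of repeating
    one fixed radius (the abrupt path of FIG. 19). Returns the radius at
    which the wall was re-detected, or None if the radius bottoms out.
    """
    r = r_initial
    while r > r_min:
        if wall_detected():
            return r               # wall seen again: resume following
        r *= decay                 # continuously decrease the radius
    return None
```

A fixed radius would replay the same arc over and over, producing the series of abrupt motions shown in FIG. 19; shrinking it each poll yields the smooth path of FIG. 20A.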



FIGS. 20B-20G depict a sequence corresponding to a corner-turning behavior of an autonomous robot 600. The corner-turning behavior allows the robot 600 to turn smoothly about an outside corner 612 without colliding with the corner 612 or the wall 614. Avoiding unnecessary collisions helps to improve cleaning efficiency and enhances users' perception of the robot's effectiveness.


The robot 600 depicted in FIGS. 20B-20G includes at least one wall sensor 602 and two drive wheels, described with respect to the wall 614 as an outside wheel 604 and an inside wheel 606. The wheels are aligned on a common axis 610. The signal 608 may be either a reflected signal projected by a corresponding emitter (in the sensor 602) or the signal received as a result of ambient light, obstacle reflectivity, or other factors. In this embodiment, the sensor 602 is oriented approximately perpendicular to both the robot's general direction of motion MG and the wall 614. The sensor is located a distance X forward of the common axis 610 of the robot's drive wheels 604, 606. In this embodiment, X equals approximately three inches, but may be any distance based on the size of the robot 600, the robot's application, or other factors.


During the wall-following operation depicted in FIGS. 20A and 20B, the robot 600 servos on the analog signal from the sensor 602. That is, while moving generally forward along MG along the wall 614, the robot 600 turns slightly toward or away from the wall as the signal 608 decreases or increases, respectively. FIG. 20C depicts the condition when the robot 600 reaches an outside corner 612 and the signal 608 suddenly decreases to a low or zero value. When this occurs, the robot 600 triggers its corner-turning behavior. In an alternative embodiment, or in addition to the corner-turning behavior described below, the robot may turn immediately in an effort to bump the wall 614, allowing it to confirm the presence or absence of the wall 614. If the signal 608 remains low or at zero, but the bump sensor continues to activate, the robot would be able to self-diagnose a failed wall sensor 602.



FIG. 20D depicts the initial step of corner-turning behavior, when the robot 600 first ceases servoing on the wall (i.e., moving generally forward MG; the robot's last position while servoing is shown by dashed outline 600a) and moves straight ahead MS a distance equal to the distance X between the servo sensor 602 and the drive wheel axis 610 (in this embodiment, approximately three inches). The drive wheel axis 610 now approximately intersects the corner 612. At this stage the robot 600 begins to rotate R about a point P located to the outside of the inside wheel 606 near the corner 612, as depicted in FIG. 20E. The distance from the inside wheel 606 to the point P can be selected in various ways. In one embodiment, the point P is approximately one inch from the inside drive wheel 606. This distance allows the robot 600 to turn, without collision, about an outside corner 612 of any angle, or even through a standard-width door, in which case the robot 600 would make a 180-degree turn.


The robot 600 continues to rotate in a direction R about the rotation point P until one of three events occurs. FIG. 20F (showing, with a dashed outline 600b, the position of the robot at rotation initiation) depicts the first scenario, in which the signal 608 from the sensor 602 becomes high. Here, the robot 600 assumes it has found a wall 614. The robot 600 resumes the wall-following behavior, servoing on the wall 614 and moving generally forward MG, as depicted in FIG. 20G. In a second scenario, the robot's bump sensor activates while rotating, at which time the robot may realign itself to the wall and resume following or, depending on other behaviors, may abandon wall-following. The third scenario occurs if the robot turns nearly a complete circle, or some other predetermined angle, without encountering a wall; the robot then assumes it has “lost” the wall and can abandon wall-following mode.
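The three termination conditions above can be expressed as a small rotation loop. This is a sketch only; the function and event names, the step size, and the return labels are illustrative, not from the patent.

```python
def corner_turn(sense, max_rotation_deg=360, step_deg=5):
    """Termination logic for the corner-turning rotation about point P
    (FIGS. 20E-20F); names and parameters are illustrative.

    sense(angle) returns 'wall' when the wall signal 608 goes high,
    'bump' on a bumper activation, or None.
    """
    angle = 0
    while angle < max_rotation_deg:
        event = sense(angle)
        if event == "wall":
            return "resume-wall-following"   # scenario 1 (FIG. 20F)
        if event == "bump":
            return "realign-or-abandon"      # scenario 2: bumper fired
        angle += step_deg                    # keep rotating about P
    return "wall-lost"                       # scenario 3: full turn, no wall
```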


The method used in one embodiment for following the wall is explained with reference to FIG. 21A and provides a smooth wall-following operation even with a one-bit sensor. (Here the one-bit sensor detects only the presence or absence of the wall within a particular volume rather than the distance between wall and sensor.) Other methods of detecting a wall or object can be used, such as bump sensing or sonar sensors.


Once the wall-following operational mode, or wall-following behavior of one embodiment, is initiated (step 1301), the robot first sets its initial value for the steering at r0. The wall-following behavior then initiates the emit-detect routine in the wall-follower sensor (step 1310). The existence of a reflection from the IR transmitter portion of the sensor translates into the existence of an object within a predetermined distance from the sensor. The wall-following behavior then determines whether there has been a transition from a reflection (object within range) to a non-reflection (object outside of range) (step 1320). If there has been a transition (in other words, the wall is now out of range), the value of r is set to its most negative value and the robot will veer slightly to the right (step 1325). The robot then begins the emit-detect sequence again (step 1310). If there has not been a transition from a reflection to a non-reflection, the wall-following behavior then determines whether there has been a transition from non-reflection to reflection (step 1330). If there has been such a transition, the value of r is set to its most positive value and the robot will veer slightly left (step 1335). In one embodiment, veering or turning is accomplished by driving the wheel opposite the direction of turn at a greater rate than the other wheel (i.e., the left wheel when veering right, the right wheel when veering left). In an alternative embodiment, both wheels may drive at the same rate, and a rearward or forward caster may direct the turn.


In the absence of either type of transition event, the wall-following behavior reduces the absolute value of r (step 1340) and begins the emit-detect sequence (step 1310) anew. By decreasing the absolute value of r, the robot 10 begins to turn more sharply in whatever direction it is currently heading. In one embodiment, the absolute value of r is decreased at a constant rate dependent on the distance traveled.
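The per-cycle update of the one-bit follower of FIG. 21A can be condensed into a single function. This is a sketch; the sign convention (positive r veers left, toward the wall) and the `shrink` factor are assumptions not fixed by the text.

```python
def wall_follow_step(prev_reflect, reflect, r, r_max=1.0, shrink=0.9):
    """One emit-detect iteration of the one-bit wall follower (FIG. 21A),
    returning the updated steering value r. Assumed convention: positive
    r veers left toward the wall, negative r veers right away from it.
    """
    if prev_reflect and not reflect:
        return -r_max        # step 1325: wall lost -> most negative r
    if not prev_reflect and reflect:
        return r_max         # step 1335: wall reacquired -> most positive r
    return r * shrink        # step 1340: no transition -> reduce |r|
```

Called once per emit-detect cycle with the previous and current sensor bits, this reproduces the three branches of the flowchart.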



FIG. 21B depicts another embodiment of the obstacle-following algorithm 1500 of the invention. The microprocessor takes sensor readings (step 1505) and monitors the strength of the signal detected by the wall-following sensor (S) against constantly updated and adjusted threshold values (T) (step 1510). The threshold adjustment algorithm is depicted in FIG. 21C, described below. In general, in order to follow along an obstacle, the robot runs a behavior that turns away from the wall if the sensor reading is greater than the threshold value (step 1515), and turns toward the wall if the sensor reading is less than the threshold value (step 1520). In certain embodiments, the value of the difference between S and T can be used to set the radius of the robot's turn, thereby reducing oscillations in wall-follow mode.
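The turn decision of FIG. 21B can be sketched as below. The direction test follows steps 1515 and 1520 directly; the particular radius formula (radius shrinking as |S − T| grows) is only one plausible realization of the text's suggestion and the `gain` constant is an assumption.

```python
def follow_turn(S, T, gain=1.0):
    """Turn decision of the obstacle-following loop in FIG. 21B (sketch).

    Returns a direction and a turn radius; a larger |S - T| error gives
    a tighter (smaller-radius) corrective turn, damping oscillations.
    """
    if S > T:
        direction = "away"      # step 1515: signal too strong, turn away
    else:
        direction = "toward"    # step 1520: signal too weak, turn toward
    radius = gain / (abs(S - T) + 1e-6)   # assumed error-to-radius mapping
    return direction, radius
```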



FIG. 21C depicts an embodiment of the threshold-adjustment subroutine utilized with the obstacle-following algorithm depicted in FIG. 21B. In this embodiment, the synchronous detection scheme 1400 inputs directly into the A/D port on the microprocessor of the robot. This allows sensor values (not merely the presence or absence of a wall, as described in FIG. 21A) to be used. The synchronous detection allows readings to be taken with and without the emitter powered, which allows the system to take into account ambient light.



FIG. 21C depicts the steps of setting and adjusting the threshold value (T). The program 1400 may run while the robot is in wall-following (or obstacle-following) mode. In the depicted embodiment, the robot is moving forward (step 1405) in any operational mode. The term “forward” is used here to describe any operational mode or behavior that is not the wall- or obstacle-following mode described herein. Such modes or behaviors include spiral, straightline, and bounce (or random), as described in U.S. Pat. No. 6,809,490, even though that movement does not consist solely of movement in a single direction. Entering wall-following mode occurs after the robot has sensed an obstacle through its optical or tactile sensors (step 1410). It is therefore assumed, at the time program 1400 begins, that the robot is adjacent to a wall or obstacle. When the robot enters wall-following mode, it first sets the threshold value to a minimum level, Tmin (step 1415), aligns itself along the wall, and begins moving along the wall (step 1420). The system then takes sensor readings (step 1425). In one embodiment, the detection scheme (step 1425) involves taking four readings and averaging the results. Additional filtering of the sensor input can be used to remove localized changes in the wall surface (e.g., spots of dirt, patches of missing or altered paint, dents, etc.).


The system then looks for either of two conditions to reset the threshold (T): (i) a bump event (i.e., contact with the wall) (step 1430) or (ii) S times C1 exceeding T (step 1435), where in one embodiment C1 is 0.5. In general, C1 should be between 0 and 1, where a higher value causes the robot to follow closer to the wall. If T is to be reset, it is set to S times C1 (step 1440). If neither condition is met, the system continues to move along the wall (step 1420) and takes additional sensor readings (step 1425).
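The threshold reset rule just described (steps 1430-1440) is small enough to state directly in code. The function name is illustrative; the C1 value of 0.5 is taken from the embodiment in the text.

```python
def update_threshold(T, S, bumped, C1=0.5):
    """Threshold-adjustment rule of FIG. 21C, steps 1430-1440 (sketch).

    Reset T to S * C1 on a bump, or whenever S * C1 exceeds the current
    T (so the threshold only ratchets upward between bumps). Higher C1
    makes the robot follow closer to the wall; C1 should lie in (0, 1).
    """
    if bumped or S * C1 > T:     # step 1430 (bump) or step 1435
        return S * C1            # step 1440: reset the threshold
    return T                     # neither condition: keep the old threshold
```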


In the embodiment of the threshold-adjustment algorithm depicted in FIG. 21C, the process called “wall-follow-adjuster” 1450 is constantly updating the threshold (T) based on the current signal from the wall sensor (S). The behavior called “wall-follow-align” 1455 initializes the threshold on a bump or sensor detection of a wall or other obstacle (step 1410). Near the beginning of this algorithm, it sets the threshold (step 1415) based on the sensor signal without the check done in the “wall-follow-adjuster” process (i.e., step 1435) that ensures that the new threshold is higher (step 1440).


Other embodiments of the wall-following sensor and system include the ability to vary the power or sensitivity of the emitter or detector. A stronger emitted signal, for example, would allow the robot to effectively follow the contours of a wall or other obstacle at a greater distance. Such an embodiment would allow a robot to deliberately mop or vacuum, for example, an entire large room by following the contours of the wall from the outer wall to the innermost point. This would be an extremely efficient way to clean large rooms devoid of furniture or other obstructions, such as ballrooms, conference centers, etc.


The sensor system may also take readings at various distances from the wall (e.g., at the wall and after a small amount of movement) to set the threshold. Such an embodiment would be particularly useful to increase the likelihood that the robot never touches obstacles (such as installation art pieces in museums) or walls in architecturally sensitive buildings (such as restored mansions and the like). Other embodiments of the wall detection system use multiple receivers at different distances or angles so as to accommodate differences caused by various reflective surfaces or by single surfaces having different reflectivities due to surface coloration, cleanliness, etc. For example, some embodiments may have multiple detectors set at different depths and/or heights within the robot housing.


Other embodiments of the sensor subsystem may utilize an emitter to condition the value of the signal that corresponds to an object. For example, the detection sequence may include emitting a signal from an LED emitter and detecting the signal and corresponding value. The system may then detect a signal again, without emitting a corresponding signal. This would allow the robot to effectively minimize the effect of ambient light or walls of different reflectivities.


The wall-follower mode can be continued for a predetermined or random time, a predetermined or random distance, or until some additional criteria are met (e.g., bump sensor is activated, etc.). In one embodiment, the robot continues to follow the wall indefinitely. In another embodiment, minimum and maximum travel distances are determined, whereby the robot will remain in wall-following behavior until the robot has either traveled the maximum distance or traveled at least the minimum distance and encountered an obstacle. This implementation of wall-following behavior ensures the robot spends an appropriate amount of time in wall-following behavior as compared to its other operational modes, thereby decreasing systemic neglect and distributing coverage to all areas. By increasing wall-following, the robot is able to move in more spaces, but the robot is less efficient at cleaning any one space. In addition, by exiting the wall-following behavior after obstacle detection, the robot increases the users' perceived effectiveness.



FIG. 22 is a flow-chart illustration showing an embodiment of determining when to exit wall-following behavior. The robot first determines the minimum distance to follow the wall (dmin) and the maximum distance to follow the wall (dmax). While in wall (or obstacle) following mode, the control system tracks the distance the robot has traveled in that mode (dwf). If dwf is greater than dmax (step 1350), then the robot exits wall-following mode (step 1380). If, however, dwf is less than dmax (step 1350) and dwf is less than dmin (step 1360), the robot remains in wall-following mode (step 1385). If dwf is greater than dmin (step 1360) and an obstacle is encountered (step 1370), the robot exits wall-following mode (step 1380).
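The exit test of FIG. 22 reduces to two comparisons. The sketch below uses illustrative names; the step numbers in the comments refer to the flowchart.

```python
def should_exit_wall_following(d_wf, d_min, d_max, obstacle):
    """Exit test of FIG. 22: leave wall-following once the robot has
    traveled d_max in that mode, or has traveled at least d_min and
    then encounters an obstacle.
    """
    if d_wf > d_max:                  # step 1350 -> exit (step 1380)
        return True
    if d_wf > d_min and obstacle:     # steps 1360 and 1370 -> exit
        return True
    return False                      # step 1385: remain in the mode
```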


Theoretically, the optimal distance for the robot to travel in wall-following behavior is a function of room size and configuration and robot size. In a preferred embodiment, the minimum and maximum distances to remain in wall-following are set based upon the approximate room size, the robot's width, and a random component, whereby the average minimum travel distance is 2w/p, where w is the width of the work element of the robot and p is the probability that the robot will enter wall-following behavior in a given interaction with an obstacle. By way of example, in one embodiment, w is approximately between 15 cm and 25 cm, and p is 0.095 (where the robot encounters 6 to 15 obstacles, or an average of 10.5 obstacles, before entering an obstacle following mode). The minimum distance is then set randomly as a distance between approximately 115 cm and 350 cm; the maximum distance is set randomly as a distance between approximately 170 cm and 520 cm. In certain embodiments the ratio of the minimum distance to the maximum distance is 2:3. For the sake of perceived efficiency, the robot's initial operation in an obstacle-following mode can be set to be longer than its later operations in obstacle-following mode. In addition, users may place the robot along the longest wall when starting the robot, which improves actual as well as perceived coverage.
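A quick arithmetic check of the figures quoted above, with w taken at the mid-range of the 15-25 cm interval (the choice of mid-range is an assumption for illustration):

```python
w_cm = 20.0          # width of the robot's work element (mid-range of 15-25 cm)
p = 0.095            # probability of entering wall-following per obstacle
avg_min_travel_cm = 2 * w_cm / p     # the 2w/p average minimum distance

# The stated 2:3 ratio of minimum to maximum distance roughly matches
# the quoted random ranges (115-350 cm versus 170-520 cm):
ratio_low = 115 / 170    # about 0.68
ratio_high = 350 / 520   # about 0.67
```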


The distance that the robot travels in wall-following mode can also be set by the robot depending on the number and frequency of objects encountered (as determined by other sensors), which is a measure of room “clutter.” If more objects are encountered, the robot would wall follow for a greater distance in order to get into all the areas of the floor. Conversely, if few obstacles are encountered, the robot would wall follow less in order to not over-cover the edges of the space in favor of passes through the center of the space. An initial wall-following distance can also be included to allow the robot to follow the wall a longer or shorter distance during its initial period where the wall-following behavior has control.


In one embodiment, the robot may also leave wall-following mode if the robot turns more than, for example, 270 degrees and is unable to locate the wall (or object) or if the robot has turned a total of 360 degrees since entering the wall-following mode.


In certain embodiments, when the wall-following behavior is active and there is a bump, the align behavior becomes active. The align behavior turns the robot counter-clockwise to align the robot with the wall. The robot always turns a minimum angle. The robot monitors its wall sensor and if it detects a wall and then the wall detection goes away, the robot stops turning. This is because at the end of the wall follower range, the robot is well aligned to start wall-following. If the robot has not seen its wall detector go on and then off by the time it reaches its maximum angle, it stops anyway. This prevents the robot from turning around in circles when the wall is out of range of its wall sensor. When the most recent bump is within the side 60 degrees of the bumper on the dominant side, the minimum angle is set to 14 degrees and the maximum angle is 19 degrees. Otherwise, if the bump is within 30 degrees of the front of the bumper on the dominant side or on the non-dominant side, the minimum angle is 20 degrees and the maximum angle is 44 degrees. When the align behavior has completed turning, it cedes control to the wall-following behavior.
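The angle limits of the align behavior described above can be tabulated directly. The function and zone labels are illustrative; the angle values are those given in the text.

```python
def align_turn_angles(bump_zone):
    """Minimum and maximum align-behavior turn angles in degrees
    (illustrative labels). 'side-dominant' means the most recent bump
    fell within the side 60 degrees of the bumper on the dominant side;
    any other zone covers a bump within 30 degrees of the front of the
    bumper on the dominant side, or on the non-dominant side.
    """
    if bump_zone == "side-dominant":
        return (14, 19)     # minimum 14 degrees, maximum 19 degrees
    return (20, 44)         # minimum 20 degrees, maximum 44 degrees
```

The robot turns at least the minimum angle, stops early if the wall detector turns on and then off, and stops at the maximum angle regardless, preventing it from spinning in circles when the wall is out of sensor range.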


For reasons of cleaning thoroughness and navigation, the ability to follow walls is essential for cleaning robots. Dust and dirt tend to accumulate at room edges. The robot therefore follows walls that it encounters to ensure that this special area is well cleaned. Also, the ability to follow walls enables a navigation strategy that promotes full coverage. Using this strategy, the robot can avoid becoming trapped in small areas. Such entrapments could otherwise cause the robot to neglect other, possibly larger, areas.


It is important, however, that the detected distance of the robot from the wall not vary according to the reflectivity of the wall. Proper cleaning would not occur if the robot positioned itself very close to a dark-colored wall but several inches away from a light-colored wall. By using the dual collimation system of the subject invention, the fields of view of the infrared emitter and detector are restricted in such a way that there is a limited, selectable volume where the cones of visibility intersect. Geometrically, the sensor is arranged so that it can detect both diffuse and specular reflection. Additionally, a manual shutter may be utilized on or in the robot housing to further limit the intersection of the cones of visibility or to adjust the magnitude of the detected signal. This arrangement allows the designer to select with precision the distance at which the robot follows the wall, independent of the reflectivity of the wall.


One robot system 300, FIG. 23, in accordance with this invention includes a circuit embodied in microprocessor 302 which controls drive motion subsystem 304 of robot 300 in both the random movement and wall-following modes to drive and turn the robot accordingly. Sensor subsystem 308 represents the designs discussed above with respect to FIGS. 5A-15. The detectors of each such subsystem provide an output signal to microprocessor 302 as discussed supra which is programmed according to the logic discussed with reference to FIGS. 16-17 to provide the appropriate signals to drive subsystem 304. Modulator circuitry 310 drives the emitters of the sensor subsystem 308 under the control of processor 302 as discussed above.


There may be three or more cliff-detector subsystems, as shown in FIG. 24, at locations 316, 318, and 320 spaced about the forward bottom portion of the robot and aimed downward and only one or two or more wall detector subsystems at locations 322 and 324 spaced about the forward portion of the robot housing and aimed outwardly.


In one embodiment, depicted in FIG. 25, a 12-inch diameter, three-wheeled, differentially-steered robot 340 is a sweeper-type cleaning robot equipped with sweeping brush 342 and includes four cliff-detector subsystems 342, 344, 346, and 348 and one wall-detector subsystem 352, FIG. 26. The outputs of the detectors of each subsystem are typically connected together by “OR” logic circuitry so that a signal detected by any one detector is communicated to the processor.



FIG. 27A shows one embodiment of a detector circuit. R1 (384), CR1 (382), R2 (388), R3 (390), C1 (392), and U1:D (394) form a voltage reference used to prevent saturation of intermediate gain stages. In this embodiment, R1 (384) and CR1 (382) derive approximately 5.1 V from the input voltage (386), which is divided by voltage divider R2 (388), R3 (390) to create a voltage of approximately 1.8 V. This is buffered by U1:D (394), configured as a unity-gain follower. C1 (392) is provided to reduce noise. The photo-transistor (not shown) used in this embodiment requires a biasing current, provided from the above-described reference voltage through R7 (396). R10 (398), R13 (402), and U1:A (400) implement an amplifier with a gain of approximately −10. C4 (404) is provided for compensation and to reduce noise.


C2 (404) is used to block any DC component of the signal, while R8 (407), R12 (408), and U1:B (406) implement an amplifier with a gain of approximately −100. CR2 (410), R5 (414), and C3 (416) implement a peak detector/rectifier. R11 (412) provides a discharge path for C3 (416). The output of this peak detector is then compared to the above-mentioned reference voltage by U1:C (420). R4 (422) provides hysteresis. R9 (424) is a current-limiting resistor used so that the output of U1:C (420) may drive an indicator LED (not shown). Jumper JU1 (426) provides a convenient test point for debugging.


An oscillator circuit as shown in FIG. 28 is used to modulate the emitter IR LED at a frequency of several kHz. The exact frequency may be selected by adjusting R23 (468). Those skilled in the art will immediately deduce other ways of obtaining the same function. The simple filter/amplifier circuit of FIG. 27A is used to receive and amplify the output of a photo-transistor (not shown). A peak detector/integrator is used to convert the AC input to a threshold measurement. If sufficient energy in the selected bandwidth is received, the output signal present at (428) is driven to a logical high state. Those skilled in the art will immediately recognize other ways of achieving the same ends. Components R14 (440), R17 (446), and U2:B (448) create a buffered bias voltage equal to approximately one-half of the input voltage (442). U2:A (456), R19 (460), R23 (468), and C5 (470) create a simple oscillator of a form commonly used. R18 (458), Q1 (462), and R21 (466) convert the voltage-mode oscillations of the oscillator to current-mode oscillations, so that the current through the emitter LED (connected at 464) is relatively constant regardless of power supply voltage (442). The actual current impressed through the circuit may be altered to meet the requirements of the chosen LED by varying the value of R21 (466).



FIG. 27B depicts an embodiment of circuitry 700 that implements the wall-following behavior described in connection with FIG. 21B above. For this application, the four-stage circuit depicted in FIG. 27A can be replaced by a direct connection of a phototransistor light detector 702 to the analog input of the microcontroller 704. This significantly reduces the space required to implement the sensor system and reduces its cost, thus enabling more sensors to be used (for example, as additional proximity sensors around the circumference of a robot). In the depicted embodiment, an analog-to-digital conversion takes place within the microcontroller 704, and all signal processing is accomplished in the digital domain. This allows for maximum flexibility in the development of sensing algorithms.


This embodiment of the invention achieves a high response to the signal of interest, while minimizing the response to unwanted signals, by sampling the photodetector 702 at specific intervals synchronized with the modulated output of the infrared emitter 706. In this embodiment, moving-window averages of four IR-on and four IR-off samples are taken. In the figure, samples 1, 3, 5, and 7 are summed to produce an average IR-on value; samples 2, 4, 6, and 8 are summed to produce an average IR-off value. The difference between those averages represents the signal of interest. Because of the synchronous sampling, stray light, whether DC or modulated, has little effect on the measured signal.
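The moving-window scheme above is straightforward to express in code. The function name is illustrative; the sample ordering follows the figure as described in the text.

```python
def synchronous_signal(samples):
    """Recover the emitter's contribution from eight interleaved detector
    samples: odd-position samples (1, 3, 5, 7) are taken with the IR
    emitter on, even-position samples (2, 4, 6, 8) with it off.

    The IR-off average measures ambient light alone, so the difference of
    the two averages rejects stray light, whether DC or slowly modulated
    relative to the sampling.
    """
    ir_on = samples[0::2]      # samples 1, 3, 5, 7 (emitter powered)
    ir_off = samples[1::2]     # samples 2, 4, 6, 8 (emitter off)
    return sum(ir_on) / len(ir_on) - sum(ir_off) / len(ir_off)
```

For example, a steady ambient level of 3 with an emitter contribution of about 7 on the IR-on samples yields a recovered signal of 7 regardless of the ambient term.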


In FIG. 29, a connector J1 (500) is used to connect the system to a means of supplying power (e.g., a battery). Fuse F1 (501) is included to limit excessive current flow in the event of a short circuit or other defect. Capacitors C6 (506) and C7 (510), FIG. 30, are provided for decoupling of other electronics (U1 and U2). Connector J2 (514), FIG. 31, provides a means of attachment for the IR LED transmitter (not shown). Connector J3 (520), FIG. 32, provides a means of attachment for the IR photo-transistor (not shown). Connector J4 (530), FIG. 33, provides a means of attachment for an indicator LED (to indicate the presence or absence of an obstacle), a means of attachment for a battery (not shown), and a means of attachment for a recharging power supply (not shown). Jumper JU2, FIG. 34, provides a convenient GROUND point for test equipment, etc. U3 (536) and R22 (538), FIG. 35, implement a constant-current source used in recharging an attached NiCad battery. U3 maintains a constant 5 volts between pins 3 and 2; 5 volts divided by 22 ohms (R22) creates a current of approximately 230 mA.
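The charge-current figure follows directly from Ohm's law, as a quick check:

```python
# Constant-current charger of FIG. 35: U3 holds a fixed 5 V across R22
# (22 ohms), so the charge current is set by R22 alone, independent of
# the battery voltage.
charge_current_mA = 5.0 / 22.0 * 1000.0
# about 227 mA, i.e. the "approximately 230 mA" stated in the text
```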


In other embodiments, a fiber optic source and detector may be used which operate similarly to the sensor subsystems described above. The difference is that collimation is provided by the acceptance angle of two fiber optic cables. The fiber arrangement allows the emitter and detector to be located on a circuit board rather than mounted near the wheel of the robot. The cliff detector and wall detector can also be implemented using a laser as the source of the beam. The laser provides a very small spot size and may be useful in certain applications where overall expense is not a priority design consideration. Infrared systems are desirable when cost is a primary design constraint. Infrared sensors can be designed to work well with all floor types. They are inexpensive and can be fitted into constrained spaces. In alternative embodiments, audible or ultrasonic signals may be utilized for the emitter and/or detector.


Although specific features of the invention are shown in some drawings and not in others, this is for convenience only as each feature may be combined with any or all of the other features in accordance with the invention. The words “including,” “comprising,” “having,” and “with” as used herein are to be interpreted broadly and comprehensively and are not limited to any physical interconnection. Moreover, any embodiments disclosed in the subject application are not to be taken as the only possible embodiments.

Claims
  • 1. A mobile robot comprising: a drive mechanism that both drives the robot forward in a drive direction and turns the robot to change the drive direction; a sensor responsive to proximity of objects near a lateral side of the robot; and a drive controller that controls the drive mechanism to drive the robot to follow a sensed object proximate the lateral side of the robot, the drive controller configured to monitor the sensor to detect when a followed object is no longer proximate the lateral side of the robot, to initiate a turn of the robot toward the lateral side in accordance with a turning radius in response to detecting that the followed object is no longer proximate the lateral side, and then to progressively decrease the turning radius of the robot over time while driving the robot forward, thereby causing the robot to progressively turn more sharply until the robot encounters a change in sensory input, wherein the controller is configured to stop turning the robot if the robot reaches a predetermined maximum turn angle while monitoring the sensor, and to then move the robot forward along the drive direction.
  • 2. The mobile robot of claim 1, wherein the controller is configured to stop decreasing the turning radius if the robot bumps an object.
  • 3. The mobile robot of claim 1, wherein the sensor is a reflection detector, and wherein the controller detects that the followed object is no longer proximate the lateral side upon a transition from a reflection state to a non-reflection state of the sensor.
  • 4. The mobile robot of claim 1, wherein the controller is configured to stop decreasing the turning radius of the robot if the sensor detects an object on the lateral side.
  • 5. The mobile robot of claim 1, further comprising a floor cleaner disposed on the lateral side of the robot.
  • 6. The mobile robot of claim 5, wherein the floor cleaner includes a side brush extending beyond a lateral extent of a housing of the robot, the side brush driven to sweep debris from a floor surface beyond the lateral extent of the housing, for collection by the robot.
  • 7. The mobile robot of claim 1, wherein the sensor is responsive to proximity of a room wall; and the controller is configured to drive the robot to follow a proximate wall on the lateral side of the robot by changing a turning radius to maintain a continuous detection of the wall by the sensor, monitor the sensor to detect when the wall is no longer proximate the lateral side of the robot, and in response to detecting that the wall is no longer proximate the lateral side, turning the robot toward the lateral side while decreasing the turning radius of the robot.
  • 8. The mobile robot of claim 7, wherein the controller is configured to, in response to the robot bumping into an object while following a wall, turn the robot through a first angle, then monitor the sensor for wall detection while continuing to turn the robot, and then, in response to wall detection and a subsequent cessation of wall detection, stop turning the robot.
  • 9. The mobile robot of claim 8, further comprising a bump sensor, and wherein the first angle is determined according to an angle at which the bump sensor indicates a bump.
  • 10. A mobile robot, comprising: a drive mechanism that both drives the robot forward across a floor in a drive direction and turns the robot to change the drive direction; a sensor responsive to proximity of an object to be followed on a lateral side of the robot; a floor area cleaner disposed on the lateral side of the robot; and a drive controller that controls the drive mechanism to turn the robot to follow the object on the lateral side of the robot, by changing a robot turning radius to maintain continuous detection of the object by the sensor, including turning the robot toward the lateral side while progressively decreasing the robot turning radius in response to cessation of detection of the object proximate the lateral side.
  • 11. The mobile robot of claim 10, wherein the controller sets an initial turning radius and steadily decreases the turning radius until the robot bumps against an object.
  • 12. The mobile robot of claim 10, wherein the controller controls the drive mechanism to follow a path having a turning radius proportional to an angle of turn.
  • 13. The mobile robot of claim 10, wherein the sensor is a reflection detector, and wherein the controller determines cessation of detection of the object upon the sensor transitioning from a reflection state to a non-reflection state.
  • 14. The mobile robot of claim 10, wherein the controller decreases the robot turning radius until the sensor detects an object on the lateral side.
  • 15. The mobile robot of claim 10, wherein the floor area cleaner comprises a side brush extending beyond a lateral extent of a housing of the robot, the side brush driven to sweep debris from a floor surface beyond the lateral extent of the housing, for collection by the robot.
  • 16. The mobile robot of claim 10, wherein the sensor is responsive to proximity of a room wall, and wherein the controller drives the robot to follow a proximate wall on the lateral side of the robot by changing a turning radius to maintain continuous detection of the wall by the sensor, including turning the robot toward the lateral side while decreasing the turning radius of the robot in response to cessation of detection of the wall proximate the lateral side.
  • 17. A mobile robot comprising: a robot housing; a drive mechanism that both drives the robot housing forward in a drive direction and turns the robot housing to change the drive direction; a bump sensor responsive to the robot bumping into an object in the drive direction; a proximity sensor responsive to proximity of a wall to be followed on a lateral side of the robot housing; a side brush extending beyond a lateral extent of the housing, the side brush driven to sweep debris from a floor surface beyond the lateral extent of the housing, for collection by the robot; and a drive controller that controls the drive mechanism to follow the wall on the lateral side of the robot, the drive controller configured to monitor the sensor to detect when the wall is no longer proximate the lateral side of the robot, in response to detecting that the wall is no longer proximate the lateral side, initiate a turn of the robot toward the lateral side in accordance with a turning radius, and then progressively decrease the turning radius of the robot over time while driving the robot forward, thereby causing the robot to progressively turn more sharply until one of the sensors detects a wall on the lateral side, wherein the controller is configured to stop turning the robot if the robot reaches a predetermined maximum turn angle while monitoring the sensor, and to then move the robot housing forward along the drive direction.
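The turn behavior recited in the claims above can be summarized procedurally: when the followed wall is lost, the robot begins turning toward the lateral side and progressively decreases its turning radius (turning ever more sharply) until it either encounters a change in sensory input (wall re-detected or a bump) or reaches a predetermined maximum turn angle, after which it resumes driving straight. The sketch below is an illustrative reading of that behavior only; all constants, the geometric decay rule, and the sensor-polling callables are assumptions, not values from the patent.

```python
import math

# Hypothetical tuning constants for illustration.
INITIAL_RADIUS_M = 0.5            # starting turning radius
MIN_RADIUS_M = 0.05               # sharpest permitted turn
RADIUS_DECAY = 0.95               # per-step multiplicative radius decrease
MAX_TURN_ANGLE_RAD = math.radians(270)  # predetermined maximum turn angle

def reacquire_wall(wall_detected, bumped, step_arc_m=0.02):
    """Turn toward the lateral side with a progressively tightening radius.

    wall_detected, bumped: zero-argument callables polling the sensors.
    Returns the total turn angle (radians) executed before an exit
    condition fired.
    """
    radius = INITIAL_RADIUS_M
    turned = 0.0
    while turned < MAX_TURN_ANGLE_RAD:
        if wall_detected() or bumped():
            break  # change in sensory input: stop tightening the turn
        # Advance a small arc; angle increment = arc length / radius.
        turned += step_arc_m / radius
        # Progressively decrease the turning radius over time.
        radius = max(MIN_RADIUS_M, radius * RADIUS_DECAY)
    # If the maximum turn angle was reached with no detection, the robot
    # would then move forward along the drive direction (claims 1 and 17).
    return turned
```

The tightening spiral lets the robot sweep back toward a wall that ended (e.g. at an inside corner or doorway) without committing to a fixed-radius arc that might miss it.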
CROSS REFERENCE TO RELATED APPLICATIONS

This U.S. patent application is a continuation of, and claims priority under 35 U.S.C. §120 from U.S. patent application Ser. No. 11/166,986, filed on Jun. 24, 2005, which is a continuation-in-part of U.S. patent application Ser. No. 10/453,202, filed on Jun. 3, 2003 (now U.S. Pat. No. 7,155,308, issued Dec. 26, 2006), which is a continuation-in-part of U.S. patent application Ser. No. 09/768,773, filed on Jan. 24, 2001 (now U.S. Pat. No. 6,594,844, issued Jul. 22, 2003), which claims priority under 35 U.S.C. §119(e) to U.S. Provisional Application 60/177,703, filed on Jan. 24, 2000. U.S. patent application Ser. No. 11/166,986 also claims priority under 35 U.S.C. §119(e) to U.S. Provisional Application 60/582,992, filed on Jun. 25, 2004. The disclosures of the prior applications are considered part of, and are hereby incorporated by reference in, the disclosure of this application.

US Referenced Citations (899)
Number Name Date Kind
1755054 Darst Apr 1930 A
1780221 Buchmann Nov 1930 A
1970302 Gerhardt Aug 1934 A
2136324 John Sep 1938 A
2302111 Dow et al. Nov 1942 A
2353621 Sav et al. Jul 1944 A
2770825 Pullen Nov 1956 A
3119369 Harland et al. Jan 1964 A
3166138 Dunn Jan 1965 A
3333564 Waters Aug 1967 A
3375375 Robert et al. Mar 1968 A
3381652 Schaefer et al. May 1968 A
3457575 Bienek Jul 1969 A
3550714 Bellinger Dec 1970 A
3569727 Aggarwal et al. Mar 1971 A
3674316 De Brey Jul 1972 A
3678882 Kinsella Jul 1972 A
3744586 Leinauer Jul 1973 A
3756667 Bombardier et al. Sep 1973 A
3809004 Leonheart May 1974 A
3816004 Bignardi Jun 1974 A
3845831 James Nov 1974 A
RE28268 Autrand Dec 1974 E
3853086 Asplund Dec 1974 A
3863285 Hukuba Feb 1975 A
3888181 Kups Jun 1975 A
3937174 Haaga Feb 1976 A
3952361 Wilkins Apr 1976 A
3989311 Debrey Nov 1976 A
3989931 Phillips Nov 1976 A
4004313 Capra Jan 1977 A
4012681 Finger et al. Mar 1977 A
4070170 Leinfelt Jan 1978 A
4099284 Shinozaki et al. Jul 1978 A
4119900 Kremnitz Oct 1978 A
4175589 Nakamura et al. Nov 1979 A
4175892 De Brey Nov 1979 A
4196727 Verkaart et al. Apr 1980 A
4198727 Farmer Apr 1980 A
4199838 Simonsson Apr 1980 A
4209254 Reymond et al. Jun 1980 A
D258901 Keyworth Apr 1981 S
4297578 Carter Oct 1981 A
4306329 Yokoi Dec 1981 A
4309758 Halsall et al. Jan 1982 A
4328545 Halsall et al. May 1982 A
4367403 Miller Jan 1983 A
4369543 Chen et al. Jan 1983 A
4401909 Gorsek Aug 1983 A
4416033 Specht Nov 1983 A
4445245 Lu May 1984 A
4465370 Yuasa et al. Aug 1984 A
4477998 You Oct 1984 A
4481692 Kurz Nov 1984 A
4482960 Pryor Nov 1984 A
4492058 Goldfarb et al. Jan 1985 A
4513469 Godfrey et al. Apr 1985 A
D278732 Ohkado May 1985 S
4518437 Sommer May 1985 A
4534637 Suzuki et al. Aug 1985 A
4556313 Miller et al. Dec 1985 A
4575211 Matsumura et al. Mar 1986 A
4580311 Kurz Apr 1986 A
4596412 Everett et al. Jun 1986 A
4601082 Kurz Jul 1986 A
4618213 Chen Oct 1986 A
4620285 Perdue Oct 1986 A
4624026 Olson et al. Nov 1986 A
4626995 Lofgren et al. Dec 1986 A
4628454 Ito Dec 1986 A
4638445 Mattaboni Jan 1987 A
4644156 Takahashi et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4652917 Miller Mar 1987 A
4654492 Koerner et al. Mar 1987 A
4654924 Getz et al. Apr 1987 A
4660969 Sorimachi et al. Apr 1987 A
4662854 Fang May 1987 A
4674048 Okumura Jun 1987 A
4679152 Perdue Jul 1987 A
4680827 Hummel Jul 1987 A
4696074 Cavalli Sep 1987 A
D292223 Trumbull Oct 1987 S
4700301 Dyke Oct 1987 A
4700427 Knepper Oct 1987 A
4703820 Reinaud Nov 1987 A
4710020 Maddox et al. Dec 1987 A
4716621 Zoni Jan 1988 A
4728801 O'Connor Mar 1988 A
4733343 Yoneda et al. Mar 1988 A
4733430 Westergren Mar 1988 A
4733431 Martin Mar 1988 A
4735136 Lee et al. Apr 1988 A
4735138 Gawler et al. Apr 1988 A
4748336 Fujie et al. May 1988 A
4748833 Nagasawa Jun 1988 A
4756049 Uehara Jul 1988 A
4767213 Hummel Aug 1988 A
4769700 Pryor Sep 1988 A
4777416 George et al. Oct 1988 A
D298766 Tanno et al. Nov 1988 S
4782550 Jacobs Nov 1988 A
4796198 Boultinghouse et al. Jan 1989 A
4806751 Abe et al. Feb 1989 A
4811228 Hyyppa Mar 1989 A
4813906 Matsuyama et al. Mar 1989 A
4815157 Tsuchiya Mar 1989 A
4817000 Eberhardt Mar 1989 A
4818875 Weiner Apr 1989 A
4829442 Kadonoff et al. May 1989 A
4829626 Harkonen et al. May 1989 A
4832098 Palinkas et al. May 1989 A
4851661 Everett Jul 1989 A
4854000 Takimoto Aug 1989 A
4854006 Nishimura et al. Aug 1989 A
4855915 Dallaire Aug 1989 A
4857912 Everett et al. Aug 1989 A
4858132 Holmquist Aug 1989 A
4867570 Sorimachi et al. Sep 1989 A
4878003 Knepper Oct 1989 A
4880474 Koharagi et al. Nov 1989 A
4887415 Martin Dec 1989 A
4891762 Chotiros Jan 1990 A
4893025 Lee Jan 1990 A
4901394 Nakamura et al. Feb 1990 A
4905151 Weiman et al. Feb 1990 A
4912643 Beirne Mar 1990 A
4918441 Bohman Apr 1990 A
4919224 Shyu et al. Apr 1990 A
4919489 Kopsco Apr 1990 A
4920060 Parrent et al. Apr 1990 A
4920605 Takashima May 1990 A
4933864 Evans et al. Jun 1990 A
4937912 Kurz Jul 1990 A
4949277 Trovato et al. Aug 1990 A
4953253 Fukuda et al. Sep 1990 A
4954962 Evans et al. Sep 1990 A
4955714 Stotler et al. Sep 1990 A
4956891 Wulff Sep 1990 A
4961303 McCarty et al. Oct 1990 A
4961304 Ovsborn et al. Oct 1990 A
4962453 Pong et al. Oct 1990 A
4971591 Raviv et al. Nov 1990 A
4973912 Kaminski et al. Nov 1990 A
4974283 Holsten et al. Dec 1990 A
4977618 Allen Dec 1990 A
4977639 Takahashi et al. Dec 1990 A
4986663 Cecchi et al. Jan 1991 A
5001635 Yasutomi et al. Mar 1991 A
5002145 Wakaumi et al. Mar 1991 A
5012886 Jonas et al. May 1991 A
5018240 Holman May 1991 A
5020186 Lessig et al. Jun 1991 A
5022812 Coughlan et al. Jun 1991 A
5023788 Kitazume et al. Jun 1991 A
5024529 Svetkoff et al. Jun 1991 A
D318500 Malewicki et al. Jul 1991 S
5032775 Mizuno et al. Jul 1991 A
5033151 Kraft et al. Jul 1991 A
5033291 Podoloff et al. Jul 1991 A
5040116 Evans et al. Aug 1991 A
5045769 Everett Sep 1991 A
5049802 Mintus et al. Sep 1991 A
5051906 Evans et al. Sep 1991 A
5062819 Mallory Nov 1991 A
5070567 Holland Dec 1991 A
5084934 Lessig et al. Feb 1992 A
5086535 Grossmeyer et al. Feb 1992 A
5090321 Abouav Feb 1992 A
5093955 Blehert et al. Mar 1992 A
5094311 Akeel Mar 1992 A
5105502 Takashima Apr 1992 A
5105550 Shenoha Apr 1992 A
5109566 Kobayashi et al. May 1992 A
5115538 Cochran et al. May 1992 A
5127128 Lee Jul 1992 A
5136675 Hodson Aug 1992 A
5136750 Takashima et al. Aug 1992 A
5142985 Stearns et al. Sep 1992 A
5144471 Takanashi et al. Sep 1992 A
5144714 Mori et al. Sep 1992 A
5144715 Matsuyo et al. Sep 1992 A
5152028 Hirano Oct 1992 A
5152202 Strauss Oct 1992 A
5155684 Burke et al. Oct 1992 A
5163202 Kawakami et al. Nov 1992 A
5163320 Goshima et al. Nov 1992 A
5164579 Pryor et al. Nov 1992 A
5165064 Mattaboni Nov 1992 A
5170352 McTamaney et al. Dec 1992 A
5173881 Sindle Dec 1992 A
5182833 Yamaguchi et al. Feb 1993 A
5202742 Frank et al. Apr 1993 A
5204814 Noonan et al. Apr 1993 A
5206500 Decker et al. Apr 1993 A
5208521 Aoyama May 1993 A
5216777 Moro et al. Jun 1993 A
5227985 DeMenthon Jul 1993 A
5233682 Abe et al. Aug 1993 A
5239720 Wood et al. Aug 1993 A
5251358 Moro et al. Oct 1993 A
5261139 Lewis Nov 1993 A
5276618 Everett Jan 1994 A
5276939 Uenishi Jan 1994 A
5277064 Knigga et al. Jan 1994 A
5279672 Betker et al. Jan 1994 A
5284452 Corona Feb 1994 A
5284522 Kobayashi et al. Feb 1994 A
5293955 Lee Mar 1994 A
D345707 Alister Apr 1994 S
5303448 Hennessey et al. Apr 1994 A
5307273 Oh et al. Apr 1994 A
5309592 Hiratsuka May 1994 A
5310379 Hippely et al. May 1994 A
5315227 Pierson et al. May 1994 A
5319827 Yang Jun 1994 A
5319828 Waldhauser et al. Jun 1994 A
5321614 Ashworth Jun 1994 A
5323483 Baeg Jun 1994 A
5324948 Dudar et al. Jun 1994 A
5341186 Kato Aug 1994 A
5341540 Soupert et al. Aug 1994 A
5341549 Wirtz et al. Aug 1994 A
5345649 Whitlow Sep 1994 A
5353224 Lee et al. Oct 1994 A
5363305 Cox et al. Nov 1994 A
5363935 Schempf et al. Nov 1994 A
5369347 Yoo Nov 1994 A
5369838 Wood et al. Dec 1994 A
5386862 Glover et al. Feb 1995 A
5399951 Lavallee et al. Mar 1995 A
5400244 Watanabe et al. Mar 1995 A
5404612 Ishikawa Apr 1995 A
5410479 Coker Apr 1995 A
5435405 Schempf et al. Jul 1995 A
5440216 Kim Aug 1995 A
5442358 Keeler et al. Aug 1995 A
5444965 Colens Aug 1995 A
5446356 Kim Aug 1995 A
5446445 Bloomfield et al. Aug 1995 A
5451135 Schempf et al. Sep 1995 A
5454129 Kell Oct 1995 A
5455982 Armstrong et al. Oct 1995 A
5465525 Mifune et al. Nov 1995 A
5465619 Sotack et al. Nov 1995 A
5467273 Faibish et al. Nov 1995 A
5471560 Allard et al. Nov 1995 A
5491670 Weber Feb 1996 A
5497529 Boesi Mar 1996 A
5498948 Bruni et al. Mar 1996 A
5502638 Takenaka Mar 1996 A
5505072 Oreper Apr 1996 A
5507067 Hoekstra et al. Apr 1996 A
5510893 Suzuki Apr 1996 A
5511147 Abdel Apr 1996 A
5515572 Hoekstra et al. May 1996 A
5534762 Kim Jul 1996 A
5537017 Feiten et al. Jul 1996 A
5537711 Tseng Jul 1996 A
5539953 Kurz Jul 1996 A
5542146 Hoekstra et al. Aug 1996 A
5542148 Young Aug 1996 A
5546631 Chambon Aug 1996 A
5548511 Bancroft Aug 1996 A
5551525 Pack et al. Sep 1996 A
5553349 Kilstrom et al. Sep 1996 A
5555587 Guha Sep 1996 A
5560077 Crotchett Oct 1996 A
5568589 Hwang Oct 1996 A
D375592 Ljunggren Nov 1996 S
5608306 Rybeck et al. Mar 1997 A
5608894 Kawakami et al. Mar 1997 A
5608944 Gordon Mar 1997 A
5610488 Miyazawa Mar 1997 A
5611106 Wulff Mar 1997 A
5611108 Knowlton et al. Mar 1997 A
5613261 Kawakami et al. Mar 1997 A
5613269 Miwa Mar 1997 A
5621291 Lee Apr 1997 A
5622236 Azumi et al. Apr 1997 A
5634237 Paranjpe Jun 1997 A
5634239 Tuvin et al. Jun 1997 A
5636402 Kubo et al. Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5646494 Han Jul 1997 A
5647554 Ikegami et al. Jul 1997 A
5650702 Azumi Jul 1997 A
5652489 Kawakami Jul 1997 A
5682313 Edlund et al. Oct 1997 A
5682839 Grimsley et al. Nov 1997 A
5696675 Nakamura et al. Dec 1997 A
5698861 Oh Dec 1997 A
5709007 Chiang Jan 1998 A
5710506 Broell et al. Jan 1998 A
5714119 Kawagoe et al. Feb 1998 A
5717169 Liang et al. Feb 1998 A
5717484 Hamaguchi et al. Feb 1998 A
5720077 Nakamura et al. Feb 1998 A
5732401 Conway Mar 1998 A
5735959 Kubo et al. Apr 1998 A
5745235 Vercammen et al. Apr 1998 A
5752871 Tsuzuki May 1998 A
5756904 Oreper et al. May 1998 A
5761762 Kubo Jun 1998 A
5764888 Bolan et al. Jun 1998 A
5767437 Rogers Jun 1998 A
5767960 Orman Jun 1998 A
5777596 Herbert Jul 1998 A
5778486 Kim Jul 1998 A
5781697 Jeong Jul 1998 A
5781960 Kilstrom et al. Jul 1998 A
5786602 Pryor et al. Jul 1998 A
5787545 Colens Aug 1998 A
5793900 Nourbakhsh et al. Aug 1998 A
5794297 Muta Aug 1998 A
5812267 Everett et al. Sep 1998 A
5814808 Takada et al. Sep 1998 A
5815880 Nakanishi Oct 1998 A
5815884 Imamura et al. Oct 1998 A
5819008 Asama et al. Oct 1998 A
5819360 Fujii Oct 1998 A
5819936 Saveliev et al. Oct 1998 A
5820821 Kawagoe et al. Oct 1998 A
5821730 Drapkin Oct 1998 A
5825981 Matsuda Oct 1998 A
5828770 Leis et al. Oct 1998 A
5831597 West et al. Nov 1998 A
5839156 Park et al. Nov 1998 A
5839532 Yoshiji et al. Nov 1998 A
5841259 Kim et al. Nov 1998 A
5867800 Leif Feb 1999 A
5869910 Colens Feb 1999 A
5894621 Kubo Apr 1999 A
5896611 Haaga Apr 1999 A
5903124 Kawakami May 1999 A
5905209 Oreper May 1999 A
5907886 Buscher Jun 1999 A
5910700 Crotzer Jun 1999 A
5911260 Suzuki Jun 1999 A
5916008 Wong Jun 1999 A
5924167 Wright et al. Jul 1999 A
5926909 McGee Jul 1999 A
5933102 Miller et al. Aug 1999 A
5933913 Wright et al. Aug 1999 A
5935179 Kleiner et al. Aug 1999 A
5940346 Sadowsky et al. Aug 1999 A
5940927 Haegermarck et al. Aug 1999 A
5940928 Erko Aug 1999 A
5940930 Oh et al. Aug 1999 A
5942869 Katou et al. Aug 1999 A
5943730 Boomgaarden Aug 1999 A
5943733 Tagliaferri Aug 1999 A
5947225 Kawakami et al. Sep 1999 A
5950408 Schaedler Sep 1999 A
5959423 Nakanishi et al. Sep 1999 A
5968281 Wright et al. Oct 1999 A
5974348 Rocks Oct 1999 A
5974365 Mitchell Oct 1999 A
5983448 Wright et al. Nov 1999 A
5984880 Lander et al. Nov 1999 A
5987383 Keller et al. Nov 1999 A
5989700 Krivopal Nov 1999 A
5991951 Kubo et al. Nov 1999 A
5995883 Nishikado Nov 1999 A
5995884 Allen et al. Nov 1999 A
5996167 Close Dec 1999 A
5998953 Nakamura et al. Dec 1999 A
5998971 Corbridge Dec 1999 A
6000088 Wright et al. Dec 1999 A
6009358 Angott et al. Dec 1999 A
6021545 Delgado et al. Feb 2000 A
6023813 Thatcher et al. Feb 2000 A
6023814 Imamura Feb 2000 A
6025687 Himeda et al. Feb 2000 A
6026539 Mouw et al. Feb 2000 A
6030464 Azevedo Feb 2000 A
6030465 Marcussen et al. Feb 2000 A
6032542 Warnick et al. Mar 2000 A
6036572 Sze Mar 2000 A
6038501 Kawakami Mar 2000 A
6040669 Hog Mar 2000 A
6041471 Charky et al. Mar 2000 A
6041472 Kasen et al. Mar 2000 A
6046800 Ohtomo et al. Apr 2000 A
6049620 Dickinson et al. Apr 2000 A
6052821 Chouly et al. Apr 2000 A
6055042 Sarangapani Apr 2000 A
6055702 Imamura et al. May 2000 A
6061868 Moritsch et al. May 2000 A
6065182 Wright et al. May 2000 A
6073432 Schaedler Jun 2000 A
6076025 Ueno et al. Jun 2000 A
6076026 Jambhekar et al. Jun 2000 A
6076226 Reed Jun 2000 A
6076227 Schallig et al. Jun 2000 A
6081257 Zeller Jun 2000 A
6088020 Mor Jul 2000 A
6094775 Behmer Aug 2000 A
6099091 Campbell Aug 2000 A
6101670 Song Aug 2000 A
6101671 Wright et al. Aug 2000 A
6108031 King et al. Aug 2000 A
6108067 Okamoto Aug 2000 A
6108076 Hanseder Aug 2000 A
6108269 Kabel Aug 2000 A
6108597 Kirchner et al. Aug 2000 A
6112143 Allen et al. Aug 2000 A
6112996 Matsuo Sep 2000 A
6119057 Kawagoe Sep 2000 A
6122798 Kobayashi et al. Sep 2000 A
6124694 Bancroft et al. Sep 2000 A
6125498 Roberts et al. Oct 2000 A
6131237 Kasper et al. Oct 2000 A
6138063 Himeda Oct 2000 A
6142252 Kinto et al. Nov 2000 A
6146278 Kobayashi Nov 2000 A
6154279 Thayer Nov 2000 A
6154694 Aoki et al. Nov 2000 A
6160479 Åhlen et al. Dec 2000 A
6167332 Kurtzberg et al. Dec 2000 A
6167587 Kasper et al. Jan 2001 B1
6192548 Huffman Feb 2001 B1
6216307 Kaleta et al. Apr 2001 B1
6220865 Macri et al. Apr 2001 B1
6226830 Hendriks et al. May 2001 B1
6230362 Kasper et al. May 2001 B1
6237741 Guidetti May 2001 B1
6240342 Fiegert et al. May 2001 B1
6243913 Frank et al. Jun 2001 B1
6255793 Peless et al. Jul 2001 B1
6259979 Holmquist Jul 2001 B1
6261379 Conrad et al. Jul 2001 B1
6263539 Baig Jul 2001 B1
6263989 Won Jul 2001 B1
6272936 Oreper et al. Aug 2001 B1
6276478 Hopkins et al. Aug 2001 B1
6278918 Dickson et al. Aug 2001 B1
6282526 Ganesh Aug 2001 B1
6283034 Miles Sep 2001 B1
6285778 Nakajima et al. Sep 2001 B1
6285930 Dickson et al. Sep 2001 B1
6300737 Bergvall et al. Oct 2001 B1
6321337 Reshef et al. Nov 2001 B1
6321515 Colens Nov 2001 B1
6323570 Nishimura et al. Nov 2001 B1
6324714 Walz et al. Dec 2001 B1
6327741 Reed Dec 2001 B1
6332400 Meyer Dec 2001 B1
6339735 Peless et al. Jan 2002 B1
6362875 Burkley Mar 2002 B1
6370453 Sommer Apr 2002 B2
6374155 Wallach et al. Apr 2002 B1
6374157 Takamura Apr 2002 B1
6381802 Park May 2002 B2
6385515 Dickson et al. May 2002 B1
6388013 Saraf et al. May 2002 B1
6389329 Colens May 2002 B1
6400048 Nishimura et al. Jun 2002 B1
6401294 Kasper Jun 2002 B2
6408226 Byrne et al. Jun 2002 B1
6412141 Kasper et al. Jul 2002 B2
6415203 Inoue et al. Jul 2002 B1
6421870 Basham et al. Jul 2002 B1
6427285 Legatt et al. Aug 2002 B1
6430471 Kintou et al. Aug 2002 B1
6431296 Won Aug 2002 B1
6437227 Theimer Aug 2002 B1
6437465 Nishimura et al. Aug 2002 B1
6438456 Feddema et al. Aug 2002 B1
6438793 Miner et al. Aug 2002 B1
6442476 Poropat Aug 2002 B1
6443509 Levin et al. Sep 2002 B1
6444003 Sutcliffe Sep 2002 B1
6446302 Kasper et al. Sep 2002 B1
6454036 Airey et al. Sep 2002 B1
D464091 Christianson Oct 2002 S
6457206 Judson Oct 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6461281 Bouvier Oct 2002 B2
6463368 Feiten et al. Oct 2002 B1
6465982 Bergvall et al. Oct 2002 B1
6473167 Odell Oct 2002 B1
6480762 Uchikubo et al. Nov 2002 B1
6481515 Kirkpatrick et al. Nov 2002 B1
6490539 Dickson et al. Dec 2002 B1
6491127 Holmberg et al. Dec 2002 B1
6493612 Bisset et al. Dec 2002 B1
6493613 Peless et al. Dec 2002 B2
6496754 Song et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6502657 Kerrebrock et al. Jan 2003 B2
6504610 Bauer et al. Jan 2003 B1
6507773 Parker et al. Jan 2003 B2
6525509 Petersson et al. Feb 2003 B1
D471243 Cioffi et al. Mar 2003 S
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6540607 Mokris et al. Apr 2003 B2
6548982 Papanikolopoulos et al. Apr 2003 B1
6553612 Dyson et al. Apr 2003 B1
6556722 Russell et al. Apr 2003 B1
6556892 Kuroki et al. Apr 2003 B2
6557104 Vu et al. Apr 2003 B2
D474312 Stephens et al. May 2003 S
6563130 Dworkowski et al. May 2003 B2
6571415 Gerber et al. Jun 2003 B2
6571422 Gordon et al. Jun 2003 B1
6572711 Sclafani et al. Jun 2003 B2
6574536 Kawagoe et al. Jun 2003 B1
6580246 Jacobs Jun 2003 B2
6584376 Van Kommer Jun 2003 B1
6586908 Petersson et al. Jul 2003 B2
6587573 Stam et al. Jul 2003 B1
6590222 Bisset et al. Jul 2003 B1
6594551 McKinney et al. Jul 2003 B2
6594844 Jones Jul 2003 B2
D478884 Slipy et al. Aug 2003 S
6601265 Burlington Aug 2003 B1
6604021 Imai et al. Aug 2003 B2
6604022 Parker et al. Aug 2003 B2
6605156 Clark et al. Aug 2003 B1
6611120 Song et al. Aug 2003 B2
6611734 Parker et al. Aug 2003 B2
6611738 Ruffner Aug 2003 B2
6615108 Peless et al. Sep 2003 B1
6615885 Ohm Sep 2003 B1
6622465 Jerome et al. Sep 2003 B2
6624744 Wilson et al. Sep 2003 B1
6625843 Kim et al. Sep 2003 B2
6629028 Paromtchik et al. Sep 2003 B2
6639659 Granger Oct 2003 B2
6658325 Zweig Dec 2003 B2
6658354 Lin Dec 2003 B2
6658692 Lenkiewicz et al. Dec 2003 B2
6658693 Reed Dec 2003 B1
6661239 Ozick Dec 2003 B1
6662889 De Fazio et al. Dec 2003 B2
6668951 Won Dec 2003 B2
6670817 Fournier et al. Dec 2003 B2
6671592 Bisset et al. Dec 2003 B1
6687571 Byrne et al. Feb 2004 B1
6690134 Jones et al. Feb 2004 B1
6690993 Foulke et al. Feb 2004 B2
6697147 Ko et al. Feb 2004 B2
6711280 Stafsudd et al. Mar 2004 B2
6732826 Song et al. May 2004 B2
6737591 Lapstun et al. May 2004 B1
6741054 Koselka et al. May 2004 B2
6741364 Lange et al. May 2004 B2
6748297 Song et al. Jun 2004 B2
6756703 Chang Jun 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6764373 Osawa et al. Jul 2004 B1
6769004 Barrett Jul 2004 B2
6774596 Bisset Aug 2004 B1
6779380 Nieuwkamp Aug 2004 B1
6781338 Jones et al. Aug 2004 B2
6809490 Jones et al. Oct 2004 B2
6810305 Kirkpatrick Oct 2004 B2
6830120 Yashima et al. Dec 2004 B1
6832407 Salem et al. Dec 2004 B2
6836701 McKee Dec 2004 B2
6841963 Song et al. Jan 2005 B2
6845297 Allard Jan 2005 B2
6856811 Burdue et al. Feb 2005 B2
6859010 Jeon et al. Feb 2005 B2
6859682 Naka et al. Feb 2005 B2
6860206 Rudakevych et al. Mar 2005 B1
6865447 Lau et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6871115 Huang et al. Mar 2005 B2
6883201 Jones et al. Apr 2005 B2
6886651 Slocum et al. May 2005 B1
6888333 Laby May 2005 B2
6901624 Mori et al. Jun 2005 B2
6906702 Tanaka et al. Jun 2005 B1
6914403 Tsurumi Jul 2005 B2
6917854 Bayer Jul 2005 B2
6925357 Wang et al. Aug 2005 B2
6925679 Wallach et al. Aug 2005 B2
6929548 Wang Aug 2005 B2
D510066 Hickey et al. Sep 2005 S
6938298 Aasen Sep 2005 B2
6940291 Ozick Sep 2005 B1
6941199 Bottomley et al. Sep 2005 B1
6956348 Landry et al. Oct 2005 B2
6957712 Song et al. Oct 2005 B2
6960986 Asama et al. Nov 2005 B2
6965209 Jones et al. Nov 2005 B2
6965211 Tsurumi Nov 2005 B2
6968592 Takeuchi et al. Nov 2005 B2
6971140 Kim Dec 2005 B2
6975246 Trudeau Dec 2005 B1
6980229 Ebersole Dec 2005 B1
6985556 Shanmugavel et al. Jan 2006 B2
6993954 George et al. Feb 2006 B1
6999850 McDonald Feb 2006 B2
7013527 Thomas et al. Mar 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7024280 Parker et al. Apr 2006 B2
7027893 Perry et al. Apr 2006 B2
7030768 Wanie Apr 2006 B2
7031805 Lee et al. Apr 2006 B2
7032469 Bailey Apr 2006 B2
7053578 Diehl et al. May 2006 B2
7054716 McKee et al. May 2006 B2
7055210 Keppler et al. Jun 2006 B2
7057120 Ma et al. Jun 2006 B2
7057643 Iida et al. Jun 2006 B2
7065430 Naka et al. Jun 2006 B2
7066291 Martins et al. Jun 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7079923 Abramson et al. Jul 2006 B2
7085623 Siegers Aug 2006 B2
7085624 Aldred et al. Aug 2006 B2
7113847 Chmura et al. Sep 2006 B2
7133746 Abramson et al. Nov 2006 B2
7142198 Lee Nov 2006 B2
7148458 Schell et al. Dec 2006 B2
7155308 Jones Dec 2006 B2
7167775 Abramson et al. Jan 2007 B2
7171285 Kim et al. Jan 2007 B2
7173391 Jones et al. Feb 2007 B2
7174238 Zweig Feb 2007 B1
7188000 Chiappetta et al. Mar 2007 B2
7193384 Norman et al. Mar 2007 B1
7196487 Jones et al. Mar 2007 B2
7201786 Wegelin et al. Apr 2007 B2
7206677 Huldén Apr 2007 B2
7211980 Bruemmer et al. May 2007 B1
7225500 Diehl et al. Jun 2007 B2
7246405 Yan Jul 2007 B2
7248951 Huldén Jul 2007 B2
7275280 Haegermarck et al. Oct 2007 B2
7283892 Boillot et al. Oct 2007 B1
7288912 Landry et al. Oct 2007 B2
7318248 Yan Jan 2008 B1
7320149 Huffman et al. Jan 2008 B1
7324870 Lee Jan 2008 B2
7328196 Peters Feb 2008 B2
7332890 Cohen et al. Feb 2008 B2
7352153 Yan Apr 2008 B2
7359766 Jeon et al. Apr 2008 B2
7360277 Moshenrose et al. Apr 2008 B2
7363108 Noda et al. Apr 2008 B2
7388879 Sabe et al. Jun 2008 B2
7389166 Harwig et al. Jun 2008 B2
7408157 Yan Aug 2008 B2
7418762 Arai et al. Sep 2008 B2
7430455 Casey et al. Sep 2008 B2
7430462 Chiu et al. Sep 2008 B2
7441298 Svendsen et al. Oct 2008 B2
7444206 Abramson et al. Oct 2008 B2
7448113 Jones et al. Nov 2008 B2
7459871 Landry et al. Dec 2008 B2
7467026 Sakagami et al. Dec 2008 B2
7474941 Kim et al. Jan 2009 B2
7503096 Lin Mar 2009 B2
7515991 Egawa et al. Apr 2009 B2
7555363 Augenbraun et al. Jun 2009 B2
7557703 Yamada et al. Jul 2009 B2
7568259 Yan Aug 2009 B2
7571511 Jones et al. Aug 2009 B2
7578020 Jaworski et al. Aug 2009 B2
7600521 Woo Oct 2009 B2
7603744 Reindle Oct 2009 B2
7617557 Reindle Nov 2009 B2
7620476 Morse et al. Nov 2009 B2
7636982 Jones et al. Dec 2009 B2
7647144 Haegermarck Jan 2010 B2
7650666 Jang Jan 2010 B2
7660650 Kawagoe et al. Feb 2010 B2
7663333 Jones et al. Feb 2010 B2
7693605 Park Apr 2010 B2
7706917 Chiappetta et al. Apr 2010 B1
7765635 Park Aug 2010 B2
7801645 Taylor et al. Sep 2010 B2
7805220 Taylor et al. Sep 2010 B2
7809944 Kawamoto Oct 2010 B2
7849555 Hahm et al. Dec 2010 B2
7853645 Brown et al. Dec 2010 B2
7920941 Park et al. Apr 2011 B2
7937800 Yan May 2011 B2
7957836 Myeong et al. Jun 2011 B2
20010004719 Sommer Jun 2001 A1
20010013929 Torsten Aug 2001 A1
20010020200 Das et al. Sep 2001 A1
20010025183 Shahidi Sep 2001 A1
20010037163 Allard Nov 2001 A1
20010043509 Green et al. Nov 2001 A1
20010045883 Holdaway et al. Nov 2001 A1
20010047231 Peless et al. Nov 2001 A1
20010047895 De Fazio et al. Dec 2001 A1
20020011367 Kolesnik Jan 2002 A1
20020011813 Koselka et al. Jan 2002 A1
20020016649 Jones Feb 2002 A1
20020021219 Edwards Feb 2002 A1
20020027652 Paromtchik et al. Mar 2002 A1
20020036779 Kiyoi et al. Mar 2002 A1
20020081937 Yamada et al. Jun 2002 A1
20020095239 Wallach et al. Jul 2002 A1
20020097400 Jung et al. Jul 2002 A1
20020104963 Mancevski Aug 2002 A1
20020108209 Peterson Aug 2002 A1
20020112742 Bredo et al. Aug 2002 A1
20020113973 Ge Aug 2002 A1
20020116089 Kirkpatrick Aug 2002 A1
20020120364 Colens Aug 2002 A1
20020124343 Reed Sep 2002 A1
20020153185 Song et al. Oct 2002 A1
20020156556 Ruffner Oct 2002 A1
20020159051 Guo Oct 2002 A1
20020166193 Kasper Nov 2002 A1
20020169521 Goodman et al. Nov 2002 A1
20020173877 Zweig Nov 2002 A1
20020189871 Won Dec 2002 A1
20030009259 Hattori et al. Jan 2003 A1
20030019071 Field et al. Jan 2003 A1
20030023356 Keable Jan 2003 A1
20030024986 Mazz et al. Feb 2003 A1
20030025472 Jones et al. Feb 2003 A1
20030028286 Glenn et al. Feb 2003 A1
20030030399 Jacobs Feb 2003 A1
20030058262 Sato et al. Mar 2003 A1
20030060928 Abramson et al. Mar 2003 A1
20030067451 Tagg et al. Apr 2003 A1
20030097875 Lentz et al. May 2003 A1
20030120389 Abramson et al. Jun 2003 A1
20030124312 Autumn Jul 2003 A1
20030126352 Barrett Jul 2003 A1
20030137268 Papanikolopoulos et al. Jul 2003 A1
20030146384 Logsdon et al. Aug 2003 A1
20030192144 Song et al. Oct 2003 A1
20030193657 Uomori et al. Oct 2003 A1
20030216834 Allard Nov 2003 A1
20030221114 Hino et al. Nov 2003 A1
20030229421 Chmura et al. Dec 2003 A1
20030229474 Suzuki et al. Dec 2003 A1
20030233171 Heiligensetzer Dec 2003 A1
20030233177 Johnson et al. Dec 2003 A1
20030233870 Mancevski Dec 2003 A1
20030233930 Ozick Dec 2003 A1
20040016077 Song et al. Jan 2004 A1
20040020000 Jones Feb 2004 A1
20040030448 Solomon Feb 2004 A1
20040030449 Solomon Feb 2004 A1
20040030450 Solomon Feb 2004 A1
20040030451 Solomon Feb 2004 A1
20040030570 Solomon Feb 2004 A1
20040030571 Solomon Feb 2004 A1
20040031113 Wosewick et al. Feb 2004 A1
20040049877 Jones et al. Mar 2004 A1
20040055163 McCambridge et al. Mar 2004 A1
20040068351 Solomon Apr 2004 A1
20040068415 Solomon Apr 2004 A1
20040068416 Solomon Apr 2004 A1
20040074038 Im et al. Apr 2004 A1
20040074044 Diehl et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040083570 Song et al. May 2004 A1
20040085037 Jones et al. May 2004 A1
20040088079 Lavarec et al. May 2004 A1
20040093122 Galibraith May 2004 A1
20040098167 Yi et al. May 2004 A1
20040111184 Chiappetta et al. Jun 2004 A1
20040111821 Lenkiewicz et al. Jun 2004 A1
20040113777 Matsuhira et al. Jun 2004 A1
20040117064 McDonald Jun 2004 A1
20040117846 Karaoguz et al. Jun 2004 A1
20040118998 Wingett et al. Jun 2004 A1
20040128028 Miyamoto et al. Jul 2004 A1
20040133316 Dean Jul 2004 A1
20040134336 Solomon Jul 2004 A1
20040134337 Solomon Jul 2004 A1
20040143919 Wilder Jul 2004 A1
20040148419 Chen et al. Jul 2004 A1
20040148731 Damman et al. Aug 2004 A1
20040153212 Profio et al. Aug 2004 A1
20040156541 Jeon et al. Aug 2004 A1
20040158357 Lee et al. Aug 2004 A1
20040181706 Chen et al. Sep 2004 A1
20040187249 Jones et al. Sep 2004 A1
20040187457 Colens Sep 2004 A1
20040196451 Aoyama Oct 2004 A1
20040200505 Taylor et al. Oct 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040210345 Noda et al. Oct 2004 A1
20040210347 Sawada et al. Oct 2004 A1
20040211444 Taylor et al. Oct 2004 A1
20040221790 Sinclair et al. Nov 2004 A1
20040236468 Taylor et al. Nov 2004 A1
20040244138 Taylor et al. Dec 2004 A1
20040255425 Arai et al. Dec 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050010330 Abramson et al. Jan 2005 A1
20050010331 Taylor et al. Jan 2005 A1
20050021181 Kim et al. Jan 2005 A1
20050067994 Jones et al. Mar 2005 A1
20050085947 Aldred et al. Apr 2005 A1
20050137749 Jeon et al. Jun 2005 A1
20050144751 Kegg et al. Jul 2005 A1
20050150074 Diehl et al. Jul 2005 A1
20050150519 Keppler et al. Jul 2005 A1
20050154795 Kuz et al. Jul 2005 A1
20050156562 Cohen et al. Jul 2005 A1
20050165508 Kanda et al. Jul 2005 A1
20050166354 Uehigashi Aug 2005 A1
20050166355 Tani Aug 2005 A1
20050172445 Diehl et al. Aug 2005 A1
20050183229 Uehigashi Aug 2005 A1
20050183230 Uehigashi Aug 2005 A1
20050187678 Myeong et al. Aug 2005 A1
20050192707 Park et al. Sep 2005 A1
20050204717 Colens Sep 2005 A1
20050209736 Kawagoe Sep 2005 A1
20050211880 Schell et al. Sep 2005 A1
20050212929 Schell et al. Sep 2005 A1
20050213082 DiBernardo et al. Sep 2005 A1
20050213109 Schell et al. Sep 2005 A1
20050217042 Reindle Oct 2005 A1
20050218852 Landry et al. Oct 2005 A1
20050222933 Wesby Oct 2005 A1
20050229340 Sawalski et al. Oct 2005 A1
20050229355 Crouch et al. Oct 2005 A1
20050235451 Yan Oct 2005 A1
20050251292 Casey et al. Nov 2005 A1
20050255425 Pierson Nov 2005 A1
20050258154 Blankenship et al. Nov 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20050288819 de Guzman Dec 2005 A1
20060000050 Cipolla et al. Jan 2006 A1
20060010638 Shimizu et al. Jan 2006 A1
20060020369 Taylor et al. Jan 2006 A1
20060020370 Abramson Jan 2006 A1
20060021168 Nishikawa Feb 2006 A1
20060025134 Cho et al. Feb 2006 A1
20060037170 Shimizu Feb 2006 A1
20060042042 Mertes et al. Mar 2006 A1
20060044546 Lewin et al. Mar 2006 A1
20060060216 Woo Mar 2006 A1
20060061657 Rew et al. Mar 2006 A1
20060064828 Stein et al. Mar 2006 A1
20060087273 Ko et al. Apr 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060100741 Jung May 2006 A1
20060119839 Bertin et al. Jun 2006 A1
20060143295 Costa et al. Jun 2006 A1
20060146776 Kim Jul 2006 A1
20060190133 Konandreas et al. Aug 2006 A1
20060190146 Morse et al. Aug 2006 A1
20060196003 Song et al. Sep 2006 A1
20060220900 Ceskutti et al. Oct 2006 A1
20060259194 Chiu Nov 2006 A1
20060259494 Watson et al. Nov 2006 A1
20060288519 Jaworski et al. Dec 2006 A1
20060293787 Kanda et al. Dec 2006 A1
20070006404 Cheng et al. Jan 2007 A1
20070017061 Yan Jan 2007 A1
20070028574 Yan Feb 2007 A1
20070032904 Kawagoe et al. Feb 2007 A1
20070042716 Goodall et al. Feb 2007 A1
20070043459 Abbott et al. Feb 2007 A1
20070061041 Zweig Mar 2007 A1
20070114975 Cohen et al. May 2007 A1
20070150096 Yeh et al. Jun 2007 A1
20070157415 Lee et al. Jul 2007 A1
20070157420 Lee et al. Jul 2007 A1
20070179670 Chiappetta et al. Aug 2007 A1
20070226949 Hahm et al. Oct 2007 A1
20070234492 Svendsen et al. Oct 2007 A1
20070244610 Ozick et al. Oct 2007 A1
20070250212 Halloran et al. Oct 2007 A1
20070266508 Jones et al. Nov 2007 A1
20080007203 Cohen et al. Jan 2008 A1
20080039974 Sandin et al. Feb 2008 A1
20080052846 Kapoor et al. Mar 2008 A1
20080091304 Ozick et al. Apr 2008 A1
20080184518 Taylor Aug 2008 A1
20080276407 Schnittman et al. Nov 2008 A1
20080281470 Gilbert et al. Nov 2008 A1
20080282494 Won et al. Nov 2008 A1
20080294288 Yamauchi Nov 2008 A1
20080302586 Yan Dec 2008 A1
20080307590 Jones et al. Dec 2008 A1
20090007366 Svendsen et al. Jan 2009 A1
20090038089 Landry et al. Feb 2009 A1
20090049640 Lee et al. Feb 2009 A1
20090055022 Casey et al. Feb 2009 A1
20090102296 Greene et al. Apr 2009 A1
20090292393 Casey et al. Nov 2009 A1
20100011529 Won et al. Jan 2010 A1
20100049365 Jones et al. Feb 2010 A1
20100063628 Landry et al. Mar 2010 A1
20100107355 Won et al. May 2010 A1
20100257690 Jones et al. Oct 2010 A1
20100257691 Jones et al. Oct 2010 A1
20100263158 Jones et al. Oct 2010 A1
20100268384 Jones et al. Oct 2010 A1
20100312429 Jones et al. Dec 2010 A1
Foreign Referenced Citations (375)
Number Date Country
2003275566 Jun 2004 AU
2128842 Dec 1980 DE
3317376 Nov 1984 DE
3536907 Feb 1989 DE
3404202 Dec 1992 DE
199311014 Oct 1993 DE
4414683 Oct 1995 DE
4338841 Aug 1999 DE
19849978 Feb 2001 DE
10242257 Apr 2003 DE
102004038074.0 Jun 2005 DE
10357636 Jul 2005 DE
102004041021 Aug 2005 DE
102005046813 Apr 2007 DE
198803389 Dec 1988 DK
265542 May 1988 EP
281085 Sep 1988 EP
307381 Jul 1990 EP
358628 May 1991 EP
437024 Jul 1991 EP
433697 Dec 1992 EP
479273 May 1993 EP
294101 Dec 1993 EP
554978 Mar 1994 EP
615719 Sep 1994 EP
861629 Sep 1998 EP
792726 Jun 1999 EP
930040 Oct 1999 EP
845237 Apr 2000 EP
1018315 Jul 2000 EP
1172719 Jan 2002 EP
1228734 Jun 2003 EP
1331537 Jul 2003 EP
1380245 Jan 2004 EP
1380246 Mar 2005 EP
1553472 Jul 2005 EP
1557730 Jul 2005 EP
1642522 Nov 2007 EP
2238196 Nov 2006 ES
2601443 Nov 1991 FR
2828589 Dec 2003 FR
702426 Jan 1954 GB
2128842 Apr 1986 GB
2213047 Aug 1989 GB
2225221 May 1990 GB
2284957 Jun 1995 GB
2267360 Dec 1995 GB
2283838 Dec 1998 GB
2300082 Sep 1999 GB
2404330 Jul 2005 GB
2417354 Feb 2006 GB
53021869 Feb 1978 JP
53110257 Sep 1978 JP
943901 Mar 1979 JP
57014726 Jan 1982 JP
57064217 Apr 1982 JP
59005315 Feb 1984 JP
59033511 Mar 1984 JP
59094005 May 1984 JP
59099308 Jul 1984 JP
59112311 Jul 1984 JP
59033511 Aug 1984 JP
59120124 Aug 1984 JP
59131668 Sep 1984 JP
59164973 Sep 1984 JP
59184917 Oct 1984 JP
2283343 Nov 1984 JP
59212924 Dec 1984 JP
59226909 Dec 1984 JP
60089213 May 1985 JP
60089213 Jun 1985 JP
60211510 Oct 1985 JP
60259895 Dec 1985 JP
61023221 Jan 1986 JP
61097712 May 1986 JP
61023221 Jun 1986 JP
62074018 Apr 1987 JP
62070709 May 1987 JP
62120510 Jul 1987 JP
62154008 Sep 1987 JP
62164431 Oct 1987 JP
62263507 Nov 1987 JP
62263508 Nov 1987 JP
62189057 Dec 1987 JP
63079623 Apr 1988 JP
63158032 Jul 1988 JP
63183032 Jul 1988 JP
63241610 Oct 1988 JP
1162454 Jun 1989 JP
2006312 Jan 1990 JP
2026312 Jun 1990 JP
2283343 Nov 1990 JP
3051023 Mar 1991 JP
3197758 Aug 1991 JP
3201903 Sep 1991 JP
4019586 Mar 1992 JP
4084921 Mar 1992 JP
5023269 Apr 1993 JP
5091604 Apr 1993 JP
5042076 Jun 1993 JP
5046246 Jun 1993 JP
5150827 Jun 1993 JP
5150829 Jun 1993 JP
5046239 Jul 1993 JP
5054620 Jul 1993 JP
5040519 Oct 1993 JP
5257527 Oct 1993 JP
5257533 Oct 1993 JP
5285861 Nov 1993 JP
6003251 Jan 1994 JP
6026312 Apr 1994 JP
6137828 May 1994 JP
6293095 Oct 1994 JP
6327598 Nov 1994 JP
6105781 Dec 1994 JP
7059702 Mar 1995 JP
7129239 May 1995 JP
7059702 Jun 1995 JP
7222705 Aug 1995 JP
7270518 Oct 1995 JP
7281742 Oct 1995 JP
7281752 Oct 1995 JP
295636 Nov 1995 JP
7311041 Nov 1995 JP
7313417 Dec 1995 JP
7319542 Dec 1995 JP
8000393 Jan 1996 JP
8016241 Jan 1996 JP
8016776 Feb 1996 JP
8063229 Mar 1996 JP
8083125 Mar 1996 JP
8089449 Apr 1996 JP
8089451 Apr 1996 JP
2520732 May 1996 JP
8123548 May 1996 JP
8152916 Jun 1996 JP
8256960 Oct 1996 JP
8263137 Oct 1996 JP
8286741 Nov 1996 JP
8286744 Nov 1996 JP
8322774 Dec 1996 JP
8335112 Dec 1996 JP
9043901 Feb 1997 JP
9044240 Feb 1997 JP
9047413 Feb 1997 JP
9066855 Mar 1997 JP
9145309 Jun 1997 JP
9160644 Jun 1997 JP
9179625 Jul 1997 JP
9179685 Jul 1997 JP
9185410 Jul 1997 JP
9192069 Jul 1997 JP
9204223 Aug 1997 JP
9206258 Aug 1997 JP
9233712 Sep 1997 JP
09251318 Sep 1997 JP
9265319 Oct 1997 JP
9269807 Oct 1997 JP
9269810 Oct 1997 JP
02555263 Nov 1997 JP
9319431 Dec 1997 JP
9319432 Dec 1997 JP
9319434 Dec 1997 JP
9325812 Dec 1997 JP
10055215 Feb 1998 JP
10117973 May 1998 JP
10118963 May 1998 JP
10177414 Jun 1998 JP
10214114 Aug 1998 JP
10228316 Aug 1998 JP
10240342 Sep 1998 JP
10260727 Sep 1998 JP
10295595 Nov 1998 JP
11015941 Jan 1999 JP
11065655 Mar 1999 JP
11085269 Mar 1999 JP
11102219 Apr 1999 JP
11102220 Apr 1999 JP
11162454 Jun 1999 JP
11174145 Jul 1999 JP
11175149 Jul 1999 JP
11178764 Jul 1999 JP
11178765 Jul 1999 JP
11212642 Aug 1999 JP
11213157 Aug 1999 JP
11508810 Aug 1999 JP
11248806 Sep 1999 JP
11510935 Sep 1999 JP
11282532 Oct 1999 JP
11282533 Oct 1999 JP
11295412 Oct 1999 JP
11346964 Dec 1999 JP
2000047728 Feb 2000 JP
2000056006 Feb 2000 JP
2000056831 Feb 2000 JP
2000066722 Mar 2000 JP
2000075925 Mar 2000 JP
10240343 May 2000 JP
2000275321 Oct 2000 JP
2000353014 Dec 2000 JP
2001022443 Jan 2001 JP
2001067588 Mar 2001 JP
2001087182 Apr 2001 JP
2001121455 May 2001 JP
2001125641 May 2001 JP
2001216482 Aug 2001 JP
2001258807 Sep 2001 JP
2001265437 Sep 2001 JP
2001275908 Oct 2001 JP
2001289939 Oct 2001 JP
2001306170 Nov 2001 JP
2001320781 Nov 2001 JP
2001525567 Dec 2001 JP
2002204768 Jul 2002 JP
2002204769 Jul 2002 JP
2002247510 Aug 2002 JP
2002532178 Oct 2002 JP
2002323925 Nov 2002 JP
2002333920 Nov 2002 JP
03356170 Dec 2002 JP
2002355206 Dec 2002 JP
2002360471 Dec 2002 JP
2002360479 Dec 2002 JP
2002360482 Dec 2002 JP
2002366227 Dec 2002 JP
2002369778 Dec 2002 JP
2003005296 Jan 2003 JP
2003010076 Jan 2003 JP
2003010088 Jan 2003 JP
2003015740 Jan 2003 JP
2003028528 Jan 2003 JP
03375843 Feb 2003 JP
2003036116 Feb 2003 JP
2003047579 Feb 2003 JP
2003052596 Feb 2003 JP
2003505127 Feb 2003 JP
2003061882 Mar 2003 JP
2003084994 Mar 2003 JP
2003167628 Jun 2003 JP
2003180586 Jul 2003 JP
2003180587 Jul 2003 JP
2003186539 Jul 2003 JP
2003190064 Jul 2003 JP
2003241836 Aug 2003 JP
2003262520 Sep 2003 JP
2003285288 Oct 2003 JP
2003304992 Oct 2003 JP
2003310489 Nov 2003 JP
2003310509 Nov 2003 JP
2003330543 Nov 2003 JP
2004123040 Apr 2004 JP
2004148021 May 2004 JP
2004160102 Jun 2004 JP
2004166968 Jun 2004 JP
2004174228 Jun 2004 JP
2004198330 Jul 2004 JP
2004219185 Aug 2004 JP
2005118354 May 2005 JP
2005135400 May 2005 JP
2005211360 Aug 2005 JP
2005224265 Aug 2005 JP
2005230032 Sep 2005 JP
2005245916 Sep 2005 JP
2005296511 Oct 2005 JP
2005346700 Dec 2005 JP
2005352707 Dec 2005 JP
2006043071 Feb 2006 JP
2006155274 Jun 2006 JP
2006164223 Jun 2006 JP
2006227673 Aug 2006 JP
2006247467 Sep 2006 JP
2006260161 Sep 2006 JP
2006293662 Oct 2006 JP
2006296697 Nov 2006 JP
2007034866 Feb 2007 JP
2007213180 Aug 2007 JP
04074285 Apr 2008 JP
2009015611 Jan 2009 JP
2010198552 Sep 2010 JP
WO9526512 Oct 1995 WO
WO9530887 Nov 1995 WO
WO9617258 Feb 1997 WO
WO9715224 May 1997 WO
WO9740734 Nov 1997 WO
WO9741451 Nov 1997 WO
WO9853456 Nov 1998 WO
WO9905580 Feb 1999 WO
WO9916078 Apr 1999 WO
WO9928800 Jun 1999 WO
WO9938056 Jul 1999 WO
WO9938237 Jul 1999 WO
WO9943250 Sep 1999 WO
WO9959042 Nov 1999 WO
WO0004430 Apr 2000 WO
WO0036962 Jun 2000 WO
WO0038026 Jun 2000 WO
WO0038028 Jun 2000 WO
WO0038029 Jun 2000 WO
WO0078410 Dec 2000 WO
WO0106904 Feb 2001 WO
WO0106905 Jun 2001 WO
WO0180703 Nov 2001 WO
WO0191623 Dec 2001 WO
WO0239864 May 2002 WO
WO0239868 May 2002 WO
WO02006219 Aug 2002 WO
WO02058527 Aug 2002 WO
WO02071175 Sep 2002 WO
WO02067744 Sep 2002 WO
WO02067745 Sep 2002 WO
WO02067752 Sep 2002 WO
WO02069774 Sep 2002 WO
WO02075350 Sep 2002 WO
WO02075356 Sep 2002 WO
WO02075469 Sep 2002 WO
WO02075470 Sep 2002 WO
WO02081074 Oct 2002 WO
WO03015220 Feb 2003 WO
WO03024292 Mar 2003 WO
WO0269775 May 2003 WO
WO03040546 May 2003 WO
WO03040845 May 2003 WO
WO03040846 May 2003 WO
WO03062850 Jul 2003 WO
WO03062852 Jul 2003 WO
WO02101477 Oct 2003 WO
WO03026474 Nov 2003 WO
WO2004004533 Jan 2004 WO
WO2004004534 Jan 2004 WO
WO2004005956 Jan 2004 WO
WO2004006034 Jan 2004 WO
WO2004025947 May 2004 WO
WO2004043215 May 2004 WO
WO2004058028 Jul 2004 WO
WO2004059409 Jul 2004 WO
WO2005006935 Jan 2005 WO
WO2005036292 Apr 2005 WO
WO2005055795 Jun 2005 WO
WO2005055796 Jun 2005 WO
WO2005076545 Aug 2005 WO
WO2005077243 Aug 2005 WO
WO2005077244 Aug 2005 WO
WO2005081074 Sep 2005 WO
WO2005082223 Sep 2005 WO
WO2005083541 Sep 2005 WO
WO2005098475 Oct 2005 WO
WO2005098476 Oct 2005 WO
WO2006046400 May 2006 WO
WO2006061133 Jun 2006 WO
WO2006068403 Jun 2006 WO
WO2006073248 Jul 2006 WO
WO2007036490 May 2007 WO
WO2007065033 Jun 2007 WO
WO2007137234 Nov 2007 WO
Non-Patent Literature Citations (227)
Entry
Bulusu, et al. “Self-Configuring Localization Systems: Design and Experimental Evaluation”, ACM Transactions on Embedded Computing Systems, vol. 3, No. 1, pp. 24-60, 2003.
Chin “Joystick Control for Tiny OS Robot”, http://www.eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf. 12 pages, Aug. 8, 2002.
Clerentin et al. “A localization method based on two omnidirectional perception systems cooperation” Proc. of IEEE International Conference on Robotics & Automation, San Francisco, CA vol. 2, pp. 1219-1224, Apr. 2000.
Desaulniers et al. “An Efficient Algorithm to find a shortest path for a car-like Robot”, IEEE Transactions on Robotics and Automation, vol. 11, No. 6, pp. 819-828, Dec. 1995.
Dorfmüller-Ulhaas “Optical Tracking From User Motion to 3D Interaction”, http://www.cg.tuwien.ac.at/research/publications/2002/Dorfrnueller-Ulhaas-thesis, 182 pages, 2002.
EBay “Roomba Timer -> Timed Cleaning—Floorvac Robotic Vacuum”, cgi.ebay.com/ws/eBayISAPI.dll?viewitem&category=43526&item=4375198387&rd=1, 5 pages, Apr. 20, 2005.
Electrolux “Welcome to the Electrolux trilobite” www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F, 2 pages, Mar. 18, 2005.
Eren, et al. “Accuracy in position estimation of mobile robots based on coded infrared signal transmission”, Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995. IMTC/95. pp. 548-551, 1995.
Barker, “Navigation by the Stars—Ben Barker 4th Year Project”, PowerPoint presentation, pp. 1-20.
Florbot GE Plastics Image (1989-1990).
Bison, P et al., “Using a structured beacon for cooperative position estimation” Robotics and Autonomous Systems vol. 29, No. 1, pp. 33-40, Oct. 1999.
Jeong, et al. “An intelligent map-building system for indoor mobile robot using low cost photo sensors”, SPIE vol. 6042, 6 pages, 2005.
King “HelpMate-TM—Autonomous mobile Robots Navigation Systems”, SPIE vol. 1388 Mobile Robots pp. 190-198, 1990.
Lambrinos, et al. “A mobile robot employing insect strategies for navigation”, http://www8.cs.umu.se/kurser/TDBD17/VT04/dl/Assignment%20Papers/lambrinos-RAS-2000.pdf, 38 pages, Feb. 19, 1999.
Leonard et al. “Mobile Robot Localization by tracking Geometric Beacons”, IEEE Transactions on Robotics and Automation, vol. 7, No. 3, pp. 376-382, Jun. 1991.
Li et al. “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar”, Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999.
Madsen, et al. “Optimal landmark selection for triangulation of robot position”, Journal of Robotics and Autonomous Systems, vol. 13, pp. 277-292, 1998.
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591 pp. 25-30.
Miro, et al. “Towards Vision Based Navigation in Large Indoor Environments”, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 2096-2102, Oct. 9-15, 2006.
Munich et al. “ERSP: A Software Platform and Architecture for the Service Robotics Industry”, Intelligent Robots and Systems, 2005 (IROS 2005), pp. 460-467, Aug. 2-6, 2005.
Nitu et al. “Optomechatronic System for Position Detection of a Mobile Mini-Robot”, IEEE Transactions on Industrial Electronics, vol. 52, No. 4, pp. 969-973, Aug. 2005.
On Robo “Robot Reviews Samsung Robot Vacuum (VC-RP30W)”, www.onrobo.com/reviews/AT—Home/vacuum—cleaners/on00vcrb30rosam/index.htm, 2 pages, 2005.
InMach “Intelligent Machines”, www.inmach.de/inside.html, 1 page, Nov. 19, 2008.
Park, et al. “Dynamic Visual Servo Control of Robot Manipulators using Neural Networks”, The Korean Institute Telematics and Electronics, vol. 29-B, No. 10, pp. 771-779, Oct. 1992.
Pirjanian et al. “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination”, Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000.
Schenker, et al. “Lightweight rovers for Mars science exploration and sample return”, Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997.
Shimoga et al. “Touch and Force Reflection for Telepresence Surgery”, Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994.
Tribelhorn et al., “Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education,” 2007, IEEE, pp. 1393-1399.
Tse et al. “Design of a Navigation System for a Household Mobile Robot Using Neural Networks”, Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
Yun, et al. “Robust Positioning a Mobile Robot with Active Beacon Sensors”, Lecture Notes in Computer Science, 2006, vol. 4251, pp. 890-897, 2006.
Taipei Times, Robotic vacuum by Matsushita about to undergo testing, Mar. 26, 2002, http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338 accessed.
Andersen et al., “Landmark based navigation strategies”, SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, Jan. 8, 1999.
Ascii, Mar. 25, 2002, http://ascii.jp/elem/000/000/330/330024/ accessed Nov. 1, 2011.
Li et al. “Robust Statistical Methods for Securing Wireless Localization in Sensor Networks,” Information Processing in Sensor Networks, 2005, Fourth International Symposium on, pp. 91-98, Apr. 2005.
Maschinenmarkt, Würzburg 105, Nr. 27, pp. 3, 30, Jul. 5, 1999.
Paromtchik “Toward Optical Guidance of Mobile Robots,” Proceedings of the Fourth World Multiconference on Systemics, Cybernetics and Informatics, Orlando, FL, USA, Jul. 23, 2000, vol. IX, pp. 44-49, available at http://emotion.inrialpes.fr/˜paromt/infos/papers/paromtchik:asama:sci:2000.ps.gz, accessed Jul. 3, 2012.
Roboking—not just a vacuum cleaner, a robot!, Jan. 21, 2004, infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, accessed Oct. 10, 2011, 7 pages.
Sebastian Thrun, “Learning Occupancy Grid Maps With Forward Sensor Models,” Autonomous Robots 15, 111-127, Sep. 1, 2003.
SVET Computers—New Technologies—Robot Vacuum Cleaner, Oct. 1999, available at http://www.sk.rs/1999/10/sknt01.html, accessed Nov. 1, 2011.
Jarosiewicz et al. “Final Report—Lucid”, University of Florida, Department of Electrical and Computer Engineering, EEL 5666—Intelligent Machine Design Laboratory, 50 pages, Aug. 4, 1999.
Jensfelt, et al. “Active Global Localization for a mobile robot using multiple hypothesis tracking”, IEEE Transactions on Robotics and Automation, vol. 17, No. 5, pp. 748-760, Oct. 2001.
Kahney, “Robot Vacs are in the House,” www.wired.com/news/technology/0,1282,59237,00.html, 6 pages, Jun. 18, 2003.
Karcher “Product Manual Download Karch”, www.karcher.com. 17 pages, 2004.
Karcher “Karcher RoboCleaner RC 3000”, www.robocleaner.de/english/screen3.html, 4 pages, Dec. 12, 2003.
Karcher USA “RC 3000 Robotics cleaner”, www.karcher-usa.com, 3 pages, Mar. 18, 2005.
Karlsson et al., The vSLAM Algorithm for Robust Localization and Mapping, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005.
Karlsson, et al Core Technologies for service Robotics, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 3, pp. 2979-2984, Sep. 28-Oct. 2, 2004.
Kleinberg, The Localization Problem for Mobile Robots, Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994.
Knight, et al., “Localization and Identification of Visual Landmarks”, Journal of Computing Sciences in Colleges, vol. 16 Issue 4, 2001 pp. 312-313, May 2001.
Kolodko et al. “Experimental System for Real-Time Motion Estimation”, Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003.
Komoriya et al., Planning of Landmark Measurement for the Navigation of a Mobile Robot, Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC, pp. 1476-1481, Jul. 7-10, 1992.
Krotov, et al. “Digital Sextant”, Downloaded from the internet at: http://www.cs.cmu.edu/˜epk/ , 1 page, 1995.
Krupa et al. “Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing”, IEEE Transactions on Robotics and Automation, vol. 19, No. 5, pp. 842-853, Oct. 5, 2003.
Kuhl, et al. “Self Localization in Environments using Visual Angles”, VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004.
Kurth, “Range-Only Robot Localization and SLAM with Radio”, http://www.ri.cmu.edu/pub—files/pub4/kurth—derek—2004—1/kurth—derek—2004—1.pdf. 60 pages, May 2004.
Lang et al. “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994.
Lapin, “Adaptive position estimation for an automated guided vehicle”, SPIE vol. 1831 Mobile Robots VII, pp. 82-94, 1992.
LaValle et al. “Robot Motion Planning in a Changing, Partially Predictable Environment”, 1994 IEEE International Symposium on Intelligent Control, Columbus, OH, pp. 261-266, Aug. 16-18, 1994.
Lee, et al. “Localization of a Mobile Robot Using the Image of a Moving Object”, IEEE Transactions on Industrial Electronics, vol. 50, No. 3, pp. 612-619, Jun. 2003.
Lee, et al. “Development of Indoor Navigation system for Humanoid Robot Using Multi-sensors Integration”, ION NTM, San Diego, CA pp. 798-805, Jan. 22-24, 2007.
Lin, et al. “Mobile Robot Navigation Using Artificial Landmarks”, Journal of Robotic Systems, 14(2), pp. 93-106, 1997.
Linde, Dissertation “On Aspects of Indoor Localization”, https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 28, 2006.
Lumelsky, et al. “An Algorithm for Maze Searching with Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA, vol. 1, pp. 111-116, 1994.
Luo et al., “Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots,” 2002, IEEE, pp. 2359-2364.
Ma “Thesis: Documentation on Northstar”, California Institute of Technology, 14 pages, May 17, 2006.
Matsutek Enterprises Co. Ltd “Automatic Rechargeable Vacuum Cleaner”, http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10 . . . , Apr. 23, 2007.
McGillem, et al. “Infra-red Location System for Navigation and Autonomous Vehicles”, 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 24-29, 1988.
McGillem, et al. “A Beacon Navigation Method for Autonomous Vehicles”, IEEE Transactions on Vehicular Technology, vol. 38, No. 3, pp. 132-139, Aug. 1989.
Michelson “Autonomous Navigation”, 2000 Yearbook of Science & Technology, McGraw-Hill, New York, ISBN 0-07-052771-7, pp. 28-30, 1999.
MobileMag “Samsung Unveils High-tech Robot Vacuum Cleaner”, http://www.mobilemag.com/content/100/102/C2261/, 4 pages, Mar. 18, 2005.
Monteiro, et al. “Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters”, Proceedings of the IECON '93., International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 15-19, 1993.
Moore, et al. A simple Map-based Localization strategy using range measurements, SPIE vol. 5804, pp. 612-620, 2005.
Munich et al. “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006.
Nam, et al. “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999.
Innovation First “2004 EDU Robot Controller Reference Guide”, http://www.ifirobotics.com, 13 pgs., Mar. 1, 2004.
OnRobo “Samsung Unveils Its Multifunction Robot Vacuum”, www.onrobo.com/enews/0210/samsung—vacuum.shtml, 3 pages, Mar. 18, 2005.
Pages et al. “Optimizing Plane-to-Plane Positioning Tasks by Image-Based Visual Servoing and Structured Light”, IEEE Transactions on Robotics, vol. 22, No. 5, pp. 1000-1010, Oct. 2006.
Pages et al. “A camera-projector system for robot positioning by visual servoing”, Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW06), 8 pages, Jun. 17-22, 2006.
Pages, et al. “Robust decoupled visual servoing based on structured light”, 2005 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005.
Park et al. “A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors,” IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on Neural Networks, Orlando, Florida, pp. 2754-2758, Jun. 27-Jul. 2, 1994.
Paromtchik, et al. “Optical Guidance System for Multiple mobile Robots”, Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940 (May 21-26, 2001).
Penna, et al. “Models for Map Building and Navigation”, IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, No. 5, pp. 1276-1301, Sep./Oct. 1993.
Pirjanian “Reliable Reaction”, Proceedings of the 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 158-165, 1996.
Pirjanian “Challenges for Standards for consumer Robotics”, IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 12-15, 2005.
Pirjanian et al. “Distributed Control for a Modular, Reconfigurable Cliff Robot”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002.
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems”, Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 29-Nov. 3, 2001.
Pirjanian et al. “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106, Nov. 8-9, 1999.
Pirjanian et al. “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes”, Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430, Apr. 1997.
Prassler et al., “A Short History of Cleaning Robots”, Autonomous Robots 9, 211-226, 2000, 16 pages.
Remazeilles, et al. “Image based robot navigation in 3D environments”, Proc. of SPIE, vol. 6052, pp. 1-14, Dec. 6, 2005.
Rives, et al. “Visual servoing based on ellipse features”, SPIE vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993.
Robotics World Jan. 2001: “A Clean Sweep” (Jan. 2001).
Ronnback “On Methods for Assistive Mobile Robots”, http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html, 218 pages, Jan. 1, 2006.
Roth-Tabak, et al. “Environment Model for mobile Robots Indoor Navigation”, SPIE vol. 1388 Mobile Robots pp. 453-463, 1990.
Sadath M Malik et al. “Virtual Prototyping for Conceptual Design of a Tracked Mobile Robot”, Electrical and Computer Engineering, Canadian Conference on, IEEE, PI, May 1, 2006, pp. 2349-2352.
Sahin, et al. “Development of a Visual Object Localization Module for Mobile Robots”, 1999 Third European Workshop on Advanced Mobile Robots, (Eurobot '99), pp. 65-72, 1999.
Salomon, et al. “Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing”, IEEE Conference on Emerging Technologies and Factory Automation, 2006. (ETFA '06), pp. 629-632, Sep. 20-22, 2006.
Sato “Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter”, Proceedings International Conference on Image Processing, vol. 1., Lausanne, Switzerland, pp. 33-36, Sep. 16-19, 1996.
Sim, et al “Learning Visual Landmarks for Pose Estimation”, IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 10-15, 1999.
Sobh et al. “Case Studies in Web-Controlled Devices and Remote Manipulation”, Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 10, 2002.
Stella, et al. “Self-Location for Indoor Navigation of Autonomous Vehicles”, Part of the SPIE conference on Enhanced and Synthetic Vision SPIE vol. 3364 pp. 298-302, 1998.
Summet “Tracking Locations of Moving Hand-held Displays Using Projected Light”, Pervasive 2005, LNCS 3468 pp. 37-46 (2005).
Svedman et al. “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping”, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2993-2998, 2005.
Takio et al. “Real-Time Position and Pose Tracking Method of Moving Object Using Visual Servo System”, 47th IEEE International Symposium on Circuits and Systems, pp. 167-170, 2004.
Teller “Pervasive pose awareness for people, Objects and Robots”, http://www.ai.mit.edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 6 pages, Apr. 30, 2003.
Terada et al. “An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning”, 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 21-23, 1998.
The Sharper Image “Robotic Vacuum Cleaner—Blue” www.Sharperimage.com, 2 pages, Mar. 18, 2005.
The Sharper Image “E Vac Robotic Vacuum”, www.sharperimage.com/us/en/templates/products/pipmorework1printable.jhtml, 2 pages, Mar. 18, 2005.
TheRobotStore.com “Friendly Robotics Robotic Vacuum RV400—The Robot Store”, www.therobotstore.com/s.nl/sc.9/category.-109/it.A/id.43/.f, 1 page, Apr. 20, 2005.
TotalVac.com RC3000 RoboCleaner website Mar. 18, 2005.
Trebi-Ollennu et al. “Mars Rover Pair Cooperatively Transporting a Long Payload”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 3136-3141, May 2002.
Tribelhorn et al., “Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education,” IEEE, pp. 1393-1399, 2007.
Tse et al. “Design of a Navigation System for a Household Mobile Robot Using Neural Networks”, Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
UAMA (Asia) Industrial Co., Ltd. “RobotFamily”, 2005.
Watanabe et al. “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique”, 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 13-18, 1990.
Watts “Robot, boldly goes where no man can”, The Times, p. 20, Jan. 1985.
Wijk et al. “Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking”, IEEE Transactions on Robotics and Automation, vol. 16, No. 6, pp. 740-752, Dec. 2000.
Wolf et al. “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 359-365, May 2002.
Wolf et al. “Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization”, IEEE Transactions on Robotics, vol. 21, No. 2, pp. 208-216, Apr. 2005.
Wong “EIED Online>> Robot Business”, ED Online ID# 13114, 17 pages, Jul. 2006.
Yamamoto et al. “Optical Sensing for Robot Perception and Localization”, 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 14-17, 2005.
Yata et al. “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer”, Proceedings of the 1998 IEEE International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998.
Yun, et al. “Image-Based Absolute Positioning System for Mobile Robot Navigation”, IAPR International Workshops SSPR, Hong Kong, pp. 261-269, Aug. 17-19, 2006.
Yun, et al. “Robust Positioning a Mobile Robot with Active Beacon Sensors”, Lecture Notes in Computer Science, 2006, vol. 4251, pp. 690-897, 2006.
Yuta, et al. “Implementation of an Active Optical Range Sensor Using Laser Slit for In-Door Intelligent Mobile Robot”, IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '91), vol. 1, Osaka, Japan, pp. 415-420, Nov. 3-5, 1991.
Zha et al. “Mobile Robot Localization Using Incomplete Maps for Change Detection in a Dynamic Environment”, Advanced Intelligent Mechatronics '97, Final Program and Abstracts, IEEE/ASME International Conference, p. 110, Jun. 16-20, 1997.
Zhang, et al. “A Novel Mobile Robot Localization Based on Vision”, SPIE vol. 6279, 6 pages, Jan. 29, 2007.
Euroflex Intelligente Monster manual (English-only excerpt).
Roboking—not just a vacuum cleaner, a robot! Jan. 21, 2004, 5 pages.
Popco.net “Make your digital life”, http://www.popco.net/zboard/view.php?id=tr_review&no=40, accessed Nov. 1, 2011.
Matsumura Camera Online Shop, http://www.rakuten.co.jp/matsucame/587179/711512/, accessed Nov. 1, 2011.
Dyson's Robot Vacuum Cleaner—the DC06, May 2, 2004, http://www.gizmag.com/go/1282/, accessed Nov. 11, 2011.
Electrolux Trilobite, “Time to enjoy life”, 38 pages, http://www.robocon.co.kr/trilobite/Presentation_Trilobite_Kor_030104.ppt, accessed Dec. 22, 2011.
Facts on the Trilobite, http://www.frc.ri.cmu.edu/˜hpm/talks/Extras/trilobite.desc.html, 2 pages, accessed Nov. 1, 2011.
Euroflex, Jan. 1, 2006, http://www.euroflex.tv/novita_dett.php?id=15, 1 page, accessed Nov. 1, 2011.
Friendly Robotics, 18 pages, http://www.robotsandrelax.com/PDFs/RV400Manual.pdf, accessed Dec. 22, 2011.
Its eye, www.hitachi.co.jp/rd/pdf/topics/hitac2003_10.pdf, 2 pages, 2003.
Hitachi, http://www.hitachi.co.jp/New/cnews/hl_030529_hl_030529.pdf, 8 pages, May 29, 2003.
Robot Buying Guide, “LG announces the first robotic vacuum cleaner for Korea”, Apr. 21, 2003, http://robotbg.com/news/2003/04/22/lg_announces_the_first_robotic_vacu.
UBOT, cleaning robot capable of wiping with a wet duster, http://us.aving.net/news/view.php?articleId=23031, 4 pages accessed Nov. 1, 2011.
Taipei Times, “Robotic vacuum by Matsushita about to undergo testing”, Mar. 26, 2002, http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338, accessed.
Tech-on! http://techon.nikkeibp.co.jp/members/01db/200203/1006501/, 4 pages, accessed Nov. 1, 2011.
IT media, http://www.itmedia.co.jp/news/0111/16/robofesta_m.html, accessed Nov. 1, 2011.
Yujin Robotics, an intelligent cleaning robot ‘iclebo Q’ AVING USA http://us.aving.net/news/view.php?articleId=7257, 8 pages accessed Nov. 4, 2011.
Special Reports, “Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone”, vol. 59, No. 9 (2004), 3 pages, http://www.toshiba.co.jp/tech/review/2004/09/59_0.
Toshiba Corporation 2003, http://warp.ndl.go.jp/info:ndljp/pid/25815/www.soumu.go.jp/joho_tsusin/policyreports/chousa/netrobot/pdf/030214_1_33_a.pdf, 16 pages.
http://www.karcher.de/versions/intg/assets/video/2—4—robo—en.swf. Accessed Sep. 25, 2009.
McLurkin “The Ants: A community of Microrobots”, Paper submitted for requirements of BSEE at MIT, May 12, 1995.
Grumet “Robots Clean House”, Popular Mechanics, Nov. 2003.
McLurkin “Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots”, Paper submitted for requirements of BSEE at MIT, May 2004.
Kurs et al. “Wireless Power Transfer via Strongly Coupled Magnetic Resonances”, downloaded from www.sciencemag.org, Aug. 17, 2007.
U.S. Appl. No. 60/605,066 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filing date Aug. 27, 2004.
U.S. Appl. No. 60/605,181 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filing date Aug. 27, 2004.
Derek Kurth, “Range-Only Robot Localization and SLAM with Radio”, http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf, 60 pages, May 2004, accessed Jul. 27, 2012.
Electrolux Trilobite, Jan. 12, 2001, http://www.electrolux-ui.com:8080/2002%5C822%5C833102EN.pdf, accessed Jul. 2, 2012, 10 pages.
Florbot GE Plastics, 1989-1990, 2 pages, available at http://www.fuseid.com/, accessed Sep. 27, 2012.
Gregg et al., “Autonomous Lawn Care Applications,” 2006 Florida Conference on Recent Advances in Robotics, Miami, Florida, May 25-26, 2006, Florida International University, 5 pages.
Hitachi ‘Feature’, http://kadenfan.hitachi.co.jp/robot/feature/feature.html, 1 page, Nov. 19, 2008.
Hitachi, http://www.hitachi.co.jp/New/cnews/hi_030529_hi_030529.pdf, 8 pages, May 29, 2003.
Home Robot—UBOT; Microrobotusa.com, retrieved from the WWW at www.microrobotusa.com, accessed Dec. 2, 2008.
King and Weiman, “Helpmate™ Autonomous Mobile Robots Navigation Systems,” SPIE vol. 1388 Mobile Robots, pp. 190-198 (1990).
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591, pp. 25-30, Oct. 23, 2005.
Miwako Doi “Using the symbiosis of human and robots from approaching Research and Development Center”, Toshiba Corporation, 16 pages, available at http://warp.ndl.go.jp/info:ndljp/pid/258151/www.soumu.go.jp/joho_tsusin/policyreports/chousa/netrobot/pdf/030214_1_33_a.pdf, Feb. 26, 2003.
Written Opinion of the International Searching Authority, PCT/US2004/001504, Aug. 20, 2012, 9 pages.
Borges et al. “Optimal Mobile Robot Pose Estimation Using Geometrical Maps”, IEEE Transactions on Robotics and Automation, vol. 18, No. 1, pp. 87-94, Feb. 2002.
Braunstingl et al. “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995.
Bulusu, et al. “Self-Configuring Localization Systems: Design and Experimental Evaluation”, ACM Transactions on Embedded Computing Systems, vol. 3, No. 1, pp. 24-60, 2003.
Caccia, et al. “Bottom-Following for Remotely Operated Vehicles”, 5th IFAC Conference, Aalborg, Denmark, pp. 245-250, Aug. 1, 2000.
Chae, et al. “StarLITE: A new artificial landmark for the navigation of mobile robots”, http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005.
Chamberlin et al. “Team 1: Robot Locator Beacon System” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 17, 2006.
Champy “Physical management of IT assets in Data Centers using RFID technologies”, RFID 2005 University, Oct. 12-14, 2005 (NPL0126).
Chiri “Joystick Control for Tiny OS Robot”, http://eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf. 12 pages, Aug. 8, 2002.
Christensen et al. “Theoretical Methods for Planning and Control in Mobile Robotics” 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 21-27, 1997.
Clerentin, et al. “A localization method based on two omnidirectional perception systems cooperation”, Proc. of IEEE International Conference on Robotics & Automation, San Francisco, CA, vol. 2, pp. 1219-1224, Apr. 2000.
Corke “High-Performance Visual Servoing for Robot End-Point Control”, SPIE vol. 2056 Intelligent Robots and Computer Vision, 1993.
Cozman et al. “Robot Localization using a Computer Vision Sextant”, IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995.
D'Orazio, et al. “Model based Vision System for mobile robot position estimation”, SPIE vol. 2058 Mobile Robots VIII, pp. 38-49, 1992.
De Bakker, et al. “Smart PSD-array for sheet-of-light range imaging”, Proc. of SPIE vol. 3965, pp. 1-12, May 15, 2000.
Desaulniers, et al. “An Efficient Algorithm to find a shortest path for a car-like Robot”, IEEE Transactions on robotics and Automation vol. 11 No. 6, pp. 819-828, Dec. 1995.
Dorfmüller-Ulhaas “Optical Tracking From User Motion to 3D Interaction”, http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002.
Dorsch, et al. “Laser Triangulation: Fundamental uncertainty in distance measurement”, Applied Optics, vol. 33 No. 7, pp. 1306-1314, Mar. 1, 1994.
Dudek, et al. “Localizing a Robot with Minimum Travel” Proceedings of the sixth annual ACM-SIAM symposium on Discrete algorithms, vol. 27 No. 2 pp. 583-604, Apr. 1998.
Dulimarta, et al. “Mobile Robot Localization in Indoor Environment”, Pattern Recognition, vol. 30, No. 1, pp. 99-111, 1997.
EBay “Roomba Timer—Timed Cleaning—Floorvac Robotic Vacuum”, cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4375198387&rd=1, 5 pages, Apr. 20, 2005.
Electrolux “Welcome to the Electrolux trilobite” www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F, 2 pages, Mar. 18, 2005.
Eren, et al. “Accuracy in position estimation of mobile robots based on coded infrared signal transmission”, Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995. IMTC/95, pp. 548-551, 1995.
Eren, et al. “Operation of Mobile Robots in a Structured Infrared Environment”, Proceedings. ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 19-21, 1997.
Becker, et al. “Reliable Navigation Using Landmarks” IEEE International Conference on Robotics and Automation, 0-7803-1965-6, pp. 401-406, 1995.
Benayad-Cherif, et al., “Mobile Robot Navigation Sensors” SPIE vol. 1831 Mobile Robots, VII, pp. 378-387, 1992.
Facchinetti, Claudio et al. “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation”, ICARCV '94, vol. 3 pp. 1694-1698, 1994.
Betke, et al., “Mobile Robot localization using Landmarks” Proceedings of the IEEE/RSJ/GI International Conference on Intelligent Robots and Systems '94 “Advanced Robotic Systems and the Real World” (IROS '94), vol.
Facchinetti, Claudio et al. “Self-Positioning Robot Navigation Using Ceiling Images Sequences”, ACCV '95, 5 pages, Dec. 5-8, 1995.
Fairfield, Nathaniel et al. “Mobile Robot Localization with Sparse Landmarks”, SPIE vol. 4573 pp. 148-155, 2002.
Favre-Bulle, Bernard “Efficient tracking of 3D—Robot Position by Dynamic Triangulation”, IEEE Instrumentation and Measurement Technology Conference IMTC 98 Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 18-21, 1998.
Fayman “Exploiting Process Integration and Composition in the context of Active Vision”, IEEE Transactions on Systems, Man, and Cybernetics—Part C: Application and reviews, vol. 29 No. 1, pp. 73-86, Feb. 1999.
Franz, et al. “Biomimetic robot navigation”, Robotics and Autonomous Systems, vol. 30, pp. 133-153, 2000.
Friendly Robotics “Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner”, www.friendlyrobotics.com/vac.htm, 5 pages, Apr. 20, 2005.
Fuentes, et al. “Mobile Robotics 1994”, University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 7, 1994.
Bison, P et al., “Using a structured beacon for cooperative position estimation” Robotics and Autonomous Systems vol. 29, No. 1. pp. 33-40, Oct. 1999.
Fukuda, et al. “Navigation System based on Ceiling Landmark Recognition for Autonomous Mobile Robot”, 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems 95, ‘Human Robot Interaction and Cooperative Robots’, Pittsburgh, PA, pp. 1466-1471, Aug. 5-9, 1995.
Gionis “A hand-held optical surface scanner for environmental Modeling and Virtual Reality”, Virtual Reality World, 16 pages 1996.
Goncalves et al. “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005.
Gregg et al. “Autonomous Lawn Care Applications”, 2006 Florida Conference on Recent Advances in Robotics, FCRAR 2006, pp. 1-5, May 25-26, 2006.
Hamamatsu “Si PIN photodiode S5980, S5981, S5870—Multi-element photodiodes for surface mounting”, Hamamatsu Photonics, 2 pages, Apr. 2004.
Hammacher Schlemmer “Electrolux Trilobite Robotic Vacuum” www.hammacher.com/publish/71579.asp?promo=xsells, 3 pages, Mar. 18, 2005.
Haralick et al. “Pose Estimation from Corresponding Point Data”, IEEE Transactions on systems, Man, and Cybernetics, vol. 19, No. 6, pp. 1426-1446, Nov. 1989.
Hausler “About the Scaling Behaviour of Optical Range Sensors”, Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 15-17, 1997.
Blaasvaer, et al. “AMOR—An Autonomous Mobile Robot Navigation System”, Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994.
Hoag, et al. “Navigation and Guidance in Interstellar Space”, ACTA Astronautica, vol. 2, pp. 513-533, Feb. 14, 1975.
Huntsberger et al. “CAMPOUT: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration”, IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, vol. 33, No. 5, pp. 550-559, Sep. 2003.
Iirobotics.com “Samsung Unveils Its Multifunction Robot Vacuum”, www.iirobotics.com/webpages/hotstuff.php?ubre=111, 3 pages, Mar. 18, 2005.
Related Publications (1)
Number Date Country
20090292393 A1 Nov 2009 US
Provisional Applications (2)
Number Date Country
60177703 Jan 2000 US
60582992 Jun 2004 US
Continuations (1)
Number Date Country
Parent 11166986 Jun 2005 US
Child 12486820 US
Continuation in Parts (2)
Number Date Country
Parent 10453202 Jun 2003 US
Child 11166986 US
Parent 09768773 Jan 2001 US
Child 10453202 US