Coverage robot mobility

Information

  • Patent Grant
  • Patent Number
    8,600,553
  • Date Filed
    Tuesday, June 5, 2007
  • Date Issued
    Tuesday, December 3, 2013
Abstract
An autonomous coverage robot includes a drive system, a bump sensor, and a proximity sensor. The drive system is configured to maneuver the robot according to a heading (turn) setting and a speed setting. The bump sensor is responsive to a collision of the robot with an obstacle in a forward direction. A method of navigating an autonomous coverage robot with respect to an object on a floor includes the robot autonomously traversing the floor in a cleaning mode at a full cleaning speed. Upon sensing a proximity of the object forward of the robot, the robot reduces the cleaning speed to a reduced cleaning speed while continuing towards the object until the robot detects a contact with the object. Upon sensing contact with the object, the robot turns with respect to the object and cleans next to the object, optionally substantially at the reduced cleaning speed.
Description
TECHNICAL FIELD

This disclosure relates to autonomous coverage robots.


BACKGROUND

Autonomous robots are robots which can perform desired tasks in unstructured environments without continuous human guidance. Many kinds of robots are autonomous to some degree. Different robots can be autonomous in different ways. An autonomous coverage robot traverses a work surface without continuous human guidance to perform one or more tasks. In the field of home, office and/or consumer-oriented robotics, mobile robots that perform household functions such as vacuum cleaning, floor washing, patrolling, lawn cutting and other such tasks have been widely adopted.


SUMMARY

An autonomous coverage robot will encounter many obstacles while operating. In order to continue operating, the robot must continually avoid obstacles and, in cases where it becomes trapped by fabric, string, or other entangling soft media, free itself.


In one aspect, an autonomous coverage robot includes a chassis, a drive system mounted on the chassis and configured to maneuver the robot, an edge cleaning head carried by the chassis, and a controller carried by the chassis. The edge cleaning head is driven by an edge cleaning head motor and may rotate about a non-horizontal axis. The edge cleaning head extends beyond a lateral extent of the chassis to engage a floor surface while the robot is maneuvered across the floor, and may be disposed on or near a peripheral edge of the robot. A brush control process, independent of drive processes, on a controller that controls robot operation is configured to monitor motor current associated with the edge cleaning head. After detecting a spike (e.g., a transient or rapid increase in motor current) or, in general, an elevated motor current, the brush control process on the controller is also configured to reverse bias the edge cleaning head motor in a direction opposite to the previous cleaning direction (to substantially neutrally rotate and/or be driven to rotate at the same speed as an unwinding cord, string, or other tangled medium), while continuing to maneuver the robot across the floor performing uninterrupted coverage or cleaning of the floor or other motion behaviors. In one implementation, the brush control process on the controller, following an elevated edge cleaning head motor current, reverse biases the edge cleaning head motor (to substantially neutrally rotate and/or be driven to rotate at the same speed as an unwinding cord, string, or other tangled medium) and subsequently or concurrently passes a signal to a drive motor control process, directly or indirectly via a supervising process, so that the unwinding may occur at the same time that the robot drives substantially backwards, alters a drive direction, and moves forward.
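The monitor-and-reverse logic described above can be sketched as a small control loop. This is an illustrative sketch, not the patented implementation; the threshold values, the low-pass filter constant, and the `signal_drive` return convention are assumptions introduced here.

```python
# Hypothetical sketch of the brush control process: it monitors edge-brush
# motor current independently of the drive processes, and on a spike or
# sustained elevated current it reverse-biases the brush motor and signals
# the drive process while the robot keeps moving. All constants are assumed.

SPIKE_AMPS = 1.5      # transient rise indicating entanglement (assumed value)
ELEVATED_AMPS = 0.9   # sustained elevated current (assumed value)

class BrushControl:
    def __init__(self):
        self.direction = +1          # +1 = cleaning direction, -1 = reverse bias
        self.filtered_current = 0.0  # low-pass filtered motor current

    def update(self, raw_current):
        """Called each control tick with the brush motor current (amps)."""
        # Low-pass filter so a brief noise blip is not read as a spike.
        self.filtered_current = 0.8 * self.filtered_current + 0.2 * raw_current
        spike = raw_current - self.filtered_current > SPIKE_AMPS
        elevated = self.filtered_current > ELEVATED_AMPS
        if (spike or elevated) and self.direction == +1:
            self.direction = -1      # reverse-bias the brush so it can unwind
            return "signal_drive"    # tell the drive process to back up / turn
        return None
```

The key property, matching the text, is that this process only changes brush bias and emits a signal; maneuvering continues uninterrupted in the separate drive process.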


In one implementation, the edge cleaning head includes a brush with bristles that extend beyond a peripheral edge of the chassis. In one example, the edge cleaning head includes at least one brush element having first and second ends, the brush element defining an axis of rotation about the first end normal to the work surface. The edge cleaning head may rotate about a substantially vertical axis. In one instance, the edge cleaning head includes three brush elements, where each brush element forms an angle with an adjacent brush element of about 120 degrees. In another instance, the edge cleaning head comprises six brush elements, where each brush element forms an angle with an adjacent brush element of about 60 degrees.


In another implementation, the edge cleaning head comprises a rotatable squeegee that extends beyond a peripheral edge of the chassis. The rotatable squeegee may be used for wet cleaning, surface treatments, etc.


In yet another implementation, the edge cleaning head includes a plurality of absorbent fibers that extend beyond a peripheral edge of the chassis upon rotation of the cleaning head. The plurality of absorbent fibers may be used like a mop to clean up spills, clean floors, apply surface treatments, etc.


The robot may include multiple cleaning heads (e.g., two or three) carried by the chassis. In one example, the robot further includes a main cleaning head carried by the chassis that extends across a swath covered by the robot, forming the main work width of the robot, and that may be driven to rotate about a horizontal axis to engage a floor surface while the robot is maneuvered across the floor. The main cleaning head may include a cylindrical body defining a longitudinal axis of rotation parallel to the work surface, bristles disposed on the cylindrical body, and flexible flaps disposed longitudinally along the cylindrical body. The brush control process on the controller is configured to reverse bias the rotation of the main cleaning head (to substantially neutrally rotate and/or be driven to rotate at the same speed as an unwinding cord, string, or other tangled medium), in response to an elevated main cleaning head motor current, while a motion control process independently continues to maneuver the robot across the floor. In another example, the robot includes two main cleaning brushes carried by the chassis and driven to rotate about a horizontal axis to engage a floor surface while the robot is maneuvered across the floor. The two main cleaning brushes may be driven to rotate in the same or opposite directions.


In another aspect, a method of disentangling an autonomous coverage robot includes placing the robot on a floor surface, the robot autonomously traversing across the floor surface in a forward direction of the robot while rotating, about a non-horizontal axis, an edge cleaning head carried by the chassis and driven by an edge cleaning head motor. The edge cleaning head extends beyond a lateral extent of the chassis while engaging the floor surface. The robot independently provides a reverse bias for the edge cleaning head motor (to substantially neutrally rotate and/or be driven to rotate at the same speed as an unwinding cord, string, or other tangled medium) in response to an elevated edge cleaning head motor current, while continuing to maneuver across the floor surface.


In one implementation, the brush control process on the controller of the robot determines movement of the robot in the forward direction before (independently of robot motion control) reversing the rotation of the edge cleaning head in response to an elevated cleaning head motor current. The brush control process of the robot may (independently of robot motion control) reverse the rotation of the edge cleaning head in response to an elevated edge cleaning head motor current for a period of time. In one example, after the brush control process reverses the rotation of the edge cleaning head, the brush control process may, directly or through a supervising process, pass a signal to the motion control process of the robot to move in a reverse direction, alter a drive direction, and move in the drive direction.


In another implementation, the robot also includes a main cleaning brush carried by the chassis, which may be driven to rotate about a horizontal axis to engage the floor surface while the robot is maneuvered across the floor. The robot independently reverses the rotation of the main cleaning brush in response to an elevated main cleaning head motor current while continuing to maneuver across the floor surface. The brush control process of the robot may also determine movement of the robot in the forward direction before independently reversing the rotation of the main cleaning brush in response to an elevated main cleaning brush motor current. Furthermore, the brush control process of the robot may also reverse the rotation of the main cleaning brush for a certain period of time or in intervals.


In another aspect, an autonomous coverage robot includes a drive system, a bump sensor, and a proximity sensor. The drive system is configured to maneuver the robot according to a heading (turn) setting and a speed setting. The bump sensor is responsive to a collision of the robot with an obstacle in a forward direction. The proximity sensor is responsive to an obstacle forward of the robot at a proximate distance but not contacting the robot, e.g., 1-10 inches, preferably 1-4 inches. The motion control processes of the drive system may also be configured to reduce the speed setting in response to a signal from the proximity sensor indicating detection of a potential obstacle, while continuing a cleaning or coverage process, including advancing the robot according to the heading setting. Furthermore, the motion control processes of the drive system may also be configured to alter the heading (turn) setting in response to a signal received from the bump sensor indicating contact with an obstacle.


In some instances, the motion control processes of the drive system may be configured to alter the heading setting in response to the signals received from the bump sensor and one or more side proximity sensors to follow a perimeter of the obstacle. In other instances, the drive system may be configured to alter the heading (turn) setting in response to the signals received from the bump sensor and the proximity sensor to direct the robot away from the obstacle. In one example, the drive system is configured to maneuver the robot at a torque (e.g., motor current or motor resistance) setting and the drive system is configured to alter the motor current or motor resistance setting in response to a signal received from the bump sensor indicating contact with an obstacle. The drive system may increase the motor current or motor resistance setting in response to a signal received from the bump sensor indicating contact with an obstacle.
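The drive responses above, slowing on a proximity signal and then raising the torque (motor current) setting and altering the heading on contact, can be sketched as a single control-tick update. This is an illustrative sketch only: the 300/100 mm/sec speeds match the gentle-touch figures given later in this document, but the torque multiplier and turn angle are assumptions.

```python
# Hypothetical one-tick update of the drive system's (speed, heading, torque)
# settings in response to proximity and bump signals. Constants other than
# the two speeds are illustrative assumptions, not from the patent.

FULL_SPEED_MM_S = 300
REDUCED_SPEED_MM_S = 100

def drive_step(speed, heading_deg, torque_limit, proximity, bumped):
    """Return updated (speed, heading_deg, torque_limit) for one tick."""
    if proximity and not bumped:
        # Potential obstacle ahead: keep the heading, approach gently.
        speed = min(speed, REDUCED_SPEED_MM_S)
    if bumped:
        # Contact: raise the torque (current) setting and alter the heading.
        torque_limit *= 1.5
        heading_deg = (heading_deg + 60) % 360
    return speed, heading_deg, torque_limit
```

Note that the proximity signal only clamps the speed setting; the heading setting changes solely in response to the bump sensor, as the text describes.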


The proximity sensor may include a plurality of sets of at least one infrared emitter and receiver pair, directed toward one another to converge at a fixed distance from one another, substantially as disclosed in “Robot obstacle detection system”, U.S. Pat. No. 6,594,844, herein incorporated by reference in its entirety. Alternatively, the proximity sensor may include a sonar device. The bump sensor may include a switch, a capacitive sensor, or other contact-sensitive device.


In yet another aspect, a method of navigating an autonomous coverage robot with respect to an object on a floor includes placing the robot on the floor and the robot autonomously traversing the floor in a cleaning mode at a full cleaning speed. Upon sensing a proximity of the object forward of the robot, the robot reduces the cleaning speed to a reduced cleaning speed while continuing towards the object until the robot detects a contact with the object. Upon sensing contact with the object, the robot turns with respect to the object and cleans next to the object, optionally substantially at the reduced cleaning speed. The robot may follow a perimeter of the object while cleaning next to the object. Upon leaving the perimeter of the object, the robot may increase speed to a full cleaning speed. The robot may maintain a substantially constant following distance from the object, may maintain a following distance smaller than the extent of extension of an edge cleaning head or brush beyond a following side of the robot body, or may substantially contact the object while cleaning next to the object in response to the initial, reduced-cleaning-speed contact with the object. In one example, the following distance from the object is substantially the distance between the robot and the object substantially immediately after the contact with the object. In another example, the following distance from the object is between about 0 and 2 inches.
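The navigation method above amounts to a small state machine: cruise at full speed, slow when the proximity sensor fires, turn and follow on contact, and resume full speed after leaving the perimeter. The following sketch is illustrative; the state and event names paraphrase the text and are not the patent's terminology.

```python
# Hypothetical state machine for the navigate-with-respect-to-object method.
# CRUISE   -> full cleaning speed
# APPROACH -> reduced cleaning speed, still heading toward the object
# FOLLOW   -> cleaning next to the object at the reduced speed

def next_state(state, event):
    transitions = {
        ("CRUISE", "proximity"): "APPROACH",     # slow to reduced speed
        ("APPROACH", "bump"): "FOLLOW",          # turn, clean next to object
        ("FOLLOW", "left_perimeter"): "CRUISE",  # speed back up to full
    }
    # Unrecognized (state, event) pairs leave the state unchanged.
    return transitions.get((state, event), state)
```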


In one instance, the robot performs a maneuver to move around the object in response to the contact with the object. The maneuver may include the robot moving in a substantially semi-circular path, or a succession of alternating partial spirals (e.g., arcs with progressively decreasing radius) around the object. Alternatively, the maneuver may include the robot moving away from the object and then moving in a direction substantially tangential to the object.


Upon sensing a proximity of the object forward of the robot, the robot may decrease the full cleaning speed to a reduced cleaning speed at a constant rate, an exponential rate, a non-linear rate, or some other rate. In addition, upon sensing contact with the object, the robot may increase a torque (e.g., motor current) setting of the drive, main brush, or side brush motors.
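The constant-rate and exponential-rate deceleration profiles mentioned above might be sketched as follows. The full and reduced speeds come from the gentle-touch description elsewhere in this document; the ramp distance and decay constant are assumed values for illustration.

```python
# Hypothetical deceleration profiles: commanded speed as a function of the
# remaining distance to the sensed object. RAMP_MM is an assumed parameter.

import math

FULL, REDUCED = 300.0, 100.0   # mm/s, per the gentle-touch description
RAMP_MM = 100.0                # deceleration distance (assumed)

def linear_ramp(dist_mm):
    """Constant-rate decrease from FULL to REDUCED over RAMP_MM."""
    frac = max(0.0, min(1.0, dist_mm / RAMP_MM))
    return REDUCED + (FULL - REDUCED) * frac

def exponential_ramp(dist_mm):
    """Speed decays exponentially toward REDUCED as the robot closes in."""
    frac = 1.0 - math.exp(-max(0.0, dist_mm) / (RAMP_MM / 3.0))
    return REDUCED + (FULL - REDUCED) * frac
```

Either profile (or any other non-linear rate) satisfies the text's requirement that the robot reach the reduced cleaning speed before contact.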


In yet another aspect, an autonomous robot includes a chassis, a drive system mounted on the chassis and configured to maneuver the robot, and a floor proximity sensor carried by the chassis and configured to detect a floor surface below the robot. The floor proximity sensor includes a beam emitter configured to direct a beam toward the floor surface and a beam receiver responsive to a reflection of the directed beam from the floor surface and mounted in a downwardly-directed receptacle of the chassis. The floor proximity sensor may be a substantially sealed unit (e.g., in the downward direction) and may also include a beam-transparent cover having a forward and rearward edge disposed across a lower end of the receptacle to prohibit accumulation of sediment, “carpet fuzz”, hair, or household dust within the receptacle. The cover may include a lens made of an anti-static material. The forward edge of the cover, i.e., the edge of the cover in the direction of robot motion, at the leading edge of the robot, is elevated above the rearward edge. The lower surface of the receptacle may be wedge shaped. In one example, the floor proximity sensor includes at least one infrared emitter and receiver pair, substantially as disclosed in “Robot obstacle detection system”, U.S. Pat. No. 6,594,844.


In one implementation, the drive system of the robot includes at least one driven wheel suspended from the chassis and at least one wheel-floor proximity sensor carried by the chassis and housed adjacent one of the wheels, the wheel-floor proximity sensor configured to detect the floor surface adjacent the wheel. The drive system may also include a controller configured to maneuver the robot away from a perceived cliff in response to a signal received from the floor proximity sensor. In some instances, the drive system includes a wheel drop sensor housed near one of the wheels and responsive to substantial downward displacement of the wheel with respect to the chassis. The drive system may include a validation system that validates the operability of the floor proximity sensors when all wheels drop. The validation is based on the inference that all wheels dropping at once is likely the result of the robot being lifted off the floor by a person, and checks that no floor proximity sensor registers a floor surface. Any sensor that registers a floor surface or a too-strong reflection (e.g., indicating a blocked sensor) is considered blocked. In response to this detection, the robot may initiate a maintenance reporting session in which indicia or lights indicate that the floor proximity sensors are to be cleaned. The robot will also prohibit forward motion until a validation procedure determines that all floor proximity sensors are clear and functional. Each wheel-floor proximity sensor and wheel drop sensor may include at least one infrared emitter and receiver pair.
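The all-wheels-dropped validation described above can be sketched as a simple check. This is an illustrative sketch; the normalized reflection readings and the threshold are assumptions, not values from the patent.

```python
# Hypothetical validation step: when every wheel-drop sensor fires at once,
# the robot was probably lifted, so each floor proximity sensor should stop
# seeing a floor. Any sensor still reporting a significant reflection is
# flagged as suspect (e.g., blocked by debris). Threshold is assumed.

def validate_cliff_sensors(wheel_drops, floor_readings, floor_min=0.2):
    """Return indices of suspect floor sensors, or None if no validation ran.

    wheel_drops    -- list of booleans, one per wheel
    floor_readings -- list of normalized reflection strengths (0..1)
    """
    if not all(wheel_drops):
        return None  # robot not lifted; nothing can be inferred
    # A lifted robot should see no floor: any significant reflection
    # suggests a blocked or malfunctioning sensor window.
    return [i for i, r in enumerate(floor_readings) if r >= floor_min]
```

A non-empty result would trigger the maintenance reporting session and the prohibition on forward motion described in the text.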





DESCRIPTION OF DRAWINGS


FIG. 1 shows an above-perspective view of an example autonomous coverage robot.



FIG. 2 shows a below-perspective view of an example autonomous coverage robot.



FIG. 3 shows an exploded view of an example autonomous coverage robot.



FIG. 4 shows a front-perspective view of an example main cleaning head which may be incorporated in an autonomous coverage robot.



FIG. 5 shows an exploded view of an example main cleaning head which may be used with an autonomous coverage robot.



FIG. 6A shows an above-perspective view of an example edge cleaning head which uses a rotatable brush.



FIG. 6B shows an exploded view of an example edge cleaning head.



FIG. 6C shows schematic views of a tilt of an example edge cleaning head.



FIG. 7 shows an example of an edge cleaning head with a rotatable squeegee.



FIG. 8A shows a bumper 130 which may be used with an autonomous coverage robot.



FIG. 8B shows kinetic bump sensors and proximity sensors.



FIG. 9A shows a block diagram of an exemplary robot; FIGS. 9B and 9C show flow charts describing motion control and brush operation.



FIG. 10 shows floor proximity sensors and an attachment brace which may be used for detecting an adjacent floor.



FIGS. 11 and 12 show side and exploded views of a floor proximity sensor.



FIG. 13 shows an exploded view of a cover used with the floor proximity sensor shown in FIGS. 11 and 12.



FIG. 14 is an exploded view showing an example of a caster wheel assembly.



FIG. 15 is an exploded view showing an example of a wheel-drop sensor.



FIG. 16 is a cross-sectional view showing an example of a caster wheel assembly.



FIGS. 17A-H illustrate examples of methods for disentangling coverage robots with various configurations of cleaning heads.



FIG. 17A illustrates a method of disentangling which may be used with a coverage robot having an agitating roller.



FIG. 17B illustrates a method of disentangling which may be used with a coverage robot having an agitating roller and a brush roller.



FIG. 17C has a side view and a bottom view that illustrates a method for disentangling a coverage robot with dual agitating rollers.



FIG. 17D illustrates an alternate method of disentangling with the robot shown in FIG. 17C.



FIG. 17E illustrates a method of disentangling a coverage robot with two agitation rollers and a brush roller.



FIG. 17F illustrates another method of disentangling the coverage robot.



FIG. 17G has a side view and a bottom view illustrating a disentanglement method with a coverage robot 300 with two agitation rollers and two air ducts.



FIG. 17H has a side view and a bottom view illustrating a disentanglement method with a coverage robot 300 with two agitation rollers, a brush roller and two air ducts.





Like reference symbols in the various drawings indicate like elements.


DETAILED DESCRIPTION


FIGS. 1-3 show above-perspective, below-perspective, and exploded views of an example autonomous coverage robot 100. Robot 100 has a chassis 102, a drive system 104, an edge cleaning head 106a, and a controller 108. Drive system 104 is mounted on the chassis 102, and is a differential drive (left and right wheels near to or on the center diameter of the robot and independently speed controllable) configured to maneuver robot 100. Edge cleaning head 106a is mounted to extend past the side edge of chassis 102 for removing dirt and debris below and immediately adjacent to robot 100, and more particularly to sweep dirt and debris into the cleaning path of the main cleaning head 106b as the robot cleans in a forward direction. In some implementations, the main or edge cleaning heads 106b, 106a may also be used to apply surface treatments. A controller 108 (also depicted in FIG. 9A) is carried by chassis 102 and is controlled by behavior based robotics to provide commands to the components of robot 100 based on sensor readings or directives, as described below, to clean or treat floors in an autonomous fashion. A battery 109 may provide a source of power for robot 100 and its subsystems. A bottom cover 110 may protect internal portions of robot 100 and keep out dust and debris.


Drive system 104 includes a left drive wheel assembly 112, a right drive wheel assembly 114, and a caster wheel assembly 116. Drive wheel assemblies 112, 114 and caster wheel assembly 116 are connected to chassis 102 and provide support to robot 100. Controller 108 may provide commands to the drive system to drive wheels 112 and 114 forward or backwards to maneuver robot 100. For instance, a command may be issued by controller 108 to engage both wheel assemblies in a forward direction, resulting in forward motion of robot 100. In another instance, a command may be issued for a turn that causes left wheel assembly 112 to be engaged in the forward direction while right wheel assembly 114 is driven in the rear direction, resulting in robot 100 making a clockwise turn when viewed from above.
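Differential-drive maneuvering of this kind can be sketched as a mapping from high-level commands to left/right wheel speeds. The command names and the default cruise speed below are illustrative assumptions, not the actual interface of controller 108.

```python
# Hypothetical command-to-wheel-speed mapping for a differential drive:
# equal speeds drive straight; opposite speeds turn the robot in place.

def wheel_speeds(command, cruise=300):
    """Map a high-level drive command to (left, right) wheel speeds in mm/s."""
    if command == "forward":
        return (cruise, cruise)
    if command == "reverse":
        return (-cruise, -cruise)
    if command == "turn_right":       # clockwise when viewed from above:
        return (cruise, -cruise)      # left wheel forward, right wheel back
    if command == "turn_left":        # counter-clockwise from above
        return (-cruise, cruise)
    return (0, 0)                     # unknown command: stop
```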



FIGS. 4 and 5 show front-perspective and exploded views of a main cleaning brush 111 which may be incorporated in the main cleaning head 106b of the robot 100 via attachment to chassis 102. The general structure of a robot and cleaning heads as disclosed herein is similar to that disclosed in U.S. Pat. No. 6,883,201, herein incorporated by reference in its entirety, except where so noted. In general, when a robot brush becomes entangled with cords, strings, hair, fringes or tassels, the brush motor may encounter overcurrent or temperature rise, which may cause increased energy consumption, poor cleaning, or slowing or jamming of the brush. If the robot is so controlled, or the entangling item is heavy or secured, the robot may be held in place and, if sensors are available to detect stasis, may stop moving and thereby fail to clean. A robot that gets stuck during its working routine must be “rescued” and cleaned in order to continue autonomous function. Theoretically, there may be additional expenditure of energy to combat static or dynamic friction in the drive wheels, caster, bin squeegee, and cleaning head drive train (reverse-drive). The fringes/tassels/cords may wind tightly around the smallest wind diameter of the cleaning brush (e.g., usually the core of a brush 111, if the brush 111 includes only bristles). If the smallest diameter of the cleaning brush 111 is solid (no elasticity), additional energy may be required to overcome static or dynamic friction in a gear train of the cleaning head and the brushes in contact with the floor, e.g., when the brush is rotated in the opposite direction within the cleaning head in order to unwind the fringes/tassels/cords. If the tassel or string is permitted to continue winding about the brush, it may be necessary to remove the brush 111 from the cleaning head 106b in order to remove the entanglement. Main cleaning brush 111 has baffles or soft flaps 113 and bristles 115 arranged along a cleaning head body 117. 
Soft flaps 113 disposed along the length of cleaning head body 117 may minimize static friction. Cleaning head body 117 may be rotated about its horizontal axis so that it engages the floor surface while robot 100 is moving across a floor, causing flaps 113 and bristles 115 to agitate dirt and debris which may be on the floor's surface. Controller 108 may be configured to reverse bias the rotation of main cleaning brush 111 (i.e., provide sufficient reverse current to permit the cleaning brush to freely rotate as the robot draws out and unwinds an entanglement while moving away in a forward direction) following a sharp rise in, or an elevated, main cleaning head motor current, while continuing to conduct a cleaning cycle or other cycle as the controller 108 executes individual motion control behaviors to move the robot 100 across the floor. A rim 116 of soft flaps 113 in this case can become the smallest diameter of cleaning brush 111. Rim 116 is flexible (pliable, soft), so that little energy is required to deform it and little energy is diverted from that required to initiate robot 100 movement. A momentary delay before the brush gear train encounters static friction provides an opportunity for robot 100 to resume movement, thereby enabling easier disentanglement of brushes. Similarly, a cord or tassel may become less entangled about the larger diameter of the rim 116 (in comparison to a core such as core 117 or an even smaller core) simply because the brush 111 does not complete as many turns per unit length of entangled cord or tassel. Furthermore, the length-wise scooped (curved) shape of the flaps 113 acts as a spring, forcing the tassels/fringes to unravel/open during the momentary lag between the robot being set in motion and the reverse bias back-driving the entangled cleaning brush 111. Bristles 115 may be used primarily to clean, while flaps 113 may be used primarily for disentanglement purposes. 
This allows robot 100 to continue to clean (agitate the carpet) if an entangled string snaps off and is retained by flaps 113 of brush 111. Other robot details and features combinable with those described herein may be found in U.S. Provisional Patent Application No. 60/747,791, the entire contents of which are hereby incorporated by reference.



FIGS. 6A and 6B show above-perspective and exploded views of edge cleaning head 106a. Edge cleaning head 106a is carried by chassis 102 and driven by an edge cleaning head motor 118 and drive transmission 119 to rotate a brush 120 about a non-horizontal axis. Brush 120 has brush elements 122A-F that extend beyond a peripheral edge of chassis 102. Each brush element 122A-F forms an angle of about 60 degrees with adjacent brush elements and is tipped with bristles extending along the axis of the elements. Brush 120 may be rotated about a vertical axis, such that the ends of brush elements 122A-F move normal to the work surface. Edge cleaning head 106a may be located near the edge of robot 100 so that brush 120 is capable of sweeping dirt and debris beyond the edge of chassis 102. In some implementations, the edge cleaning head 106a operates about an axis offset (tilted) from a vertical axis of the robot. As shown in schematic form in FIG. 6C, the brush 120 may be tilted, in both forward and side-to-side directions (i.e., tilted downward with respect to the plane of wheel contact about a line about 45 degrees from the direction of travel within that plane), in order to collect debris from outside the robot's periphery toward the main work width, but not disturb such collected debris once it is there or otherwise eject debris from the work width of the robot. The axis offset is optionally adjustable to customize the tilt of the cleaning head 106a to suit various carpet types, such as shag.


Other configurations of edge cleaning heads may also be used with robot 100. For example, an edge cleaning head may have three evenly-spaced brush elements separated by 120 degrees. FIG. 7 shows another example of an edge cleaning head 124 in which a rotatable squeegee 126 is used in place of a brush. In other configurations, an edge cleaning head may have one or more absorbent fibers that extend beyond a peripheral edge of chassis 102.



FIG. 8A shows a bumper 130 which may be used with the autonomous coverage robot 100. FIG. 8B shows proximity sensors 134 which may be housed within bumper 130. Drive system 104 may be configured to maneuver robot 100 according to a heading setting and a speed setting. Proximity sensors 134 may sense a potential obstacle in front of the robot.



FIG. 9A shows a schematic view of electronics of the robot 100. The robot 100 includes a controller 103 which communicates with a bumper micro-controller 107A; together they control an omni-directional receiver, a directional receiver, the wall proximity sensors 134, and the bumper switches 132. The controller 103 monitors all other sensor inputs, including the cliff sensors 140 and motor current sensors for each of the motors.


Control of the direction and speed of the robot 100 may be handled by motion control behaviors selected by an arbiter according to the principles of behavior based robotics for coverage and confinement, generally disclosed in U.S. Pat. Nos. 6,809,490 and 6,781,338, herein incorporated by reference in their entireties (and executed by controller 108), to reduce the speed magnitude of robot 100 when proximity sensor 134 detects a potential obstacle. The motion behaviors executed by the controller 108 may also alter the velocity of robot 100 when kinetic bump sensors 132 detect a collision of robot 100 with an obstacle. Accordingly, referring to FIG. 9A, robot 100 traverses a floor surface by executing a cruising or STRAIGHT behavior 900. When robot 100 detects a proximate, but not yet contacting, obstacle via proximity sensors 134, robot 100 executes a gentle touch routine 902 (which may be a behavior, a part of a behavior, or formed by more than one behavior), in which robot 100 does not proceed at full cleaning speed into the obstacle; instead, it reduces its approach speed from a full cleaning speed of about 300 mm/sec to a reduced cleaning speed of about 100 mm/sec via controller 108, such that when a collision does occur, it is quieter and less likely to mar surfaces, reducing potential damage to both the robot 100 and the object. When robot 100 detects contact with the object via kinetic bump sensors 132, robot 100 executes one of the following routines: bounce 910, follow obstacle perimeter 912, alter drive direction and move away from object 914, or alter drive direction to curve to approach the object and follow along it (e.g., a wall). Bounce 910 entails robot 100 moving so as to bounce along the object. 
Follow obstacle perimeter 912 entails robot 100 using proximity sensors 134 to follow along a perimeter of the object at a predefined distance to, for example, clean up close to the object and/or clean to the very edge of a wall. The robot 100 continuously cleans the room, and when it detects a proximate object (which may be a wall, table, chair, sofa, or other obstacle) in the forward direction, it continues cleaning in the same direction without interruption, albeit at a reduced speed. In predetermined and/or random instances, the robot 100 will bump the object, turn in place so that the edge of the main cleaning head 106b is as close to the wall as possible, and closely follow the object on the side of the robot, essentially at the reduced cleaning speed, such that the side/edge brush 106a collects debris or dirt from the corner between the floor and the wall or obstacle. Once the robot 100 leaves the wall, after a predetermined and/or randomized distance within predetermined limits, the robot 100 increases its speed up to full cleaning speed. On other occasions, it will bump the object, turn in place until facing away from the object or wall, and immediately proceed away from the object or wall at full cleaning speed.


The robot 100 employs a behavioral software architecture within the controller 103. While embodiments of the robot 100 discussed herein may use behavior based control only in part or not at all, behavior based control is effective at controlling the robot to be robust (i.e., not getting stuck or failing) as well as safe. The robot 100 employs a control and software architecture that has a number of behaviors that are executed by an arbiter in controller 103. A behavior is entered into the arbiter in response to a sensor event. In one embodiment, all behaviors have a fixed relative priority with respect to one another. The arbiter (in this case) recognizes which behaviors have a full set of fulfilled enabling conditions and selects the behavior having the highest priority among them. In order of decreasing priority, the behaviors are generally categorized as escape and/or avoidance behaviors (such as avoiding a cliff or escaping a corner) and working behaviors (e.g., wall following, bouncing, or driving in a straight line). The behaviors may include: various escape behaviors (including escaping corners, anti-canyoning, stuck situations, and “ballistic” temporary fire-and-forget movement that suppresses some avoidance behaviors, e.g., as disclosed in U.S. Pat. No. 6,809,490), cliff avoiding, virtual wall avoiding (a virtual wall may be a beacon with a gateway beam), spot coverage (covering in a confined pattern such as a spiral or boustrophedon patch), align (turning in place, using side proximity sensors to align with a forward obstacle encountered while obstacle following, e.g., an inside corner), following (representing either or both of substantially parallel or bump following along an obstacle using a side proximity sensor or bumper that extends to the side of the robot), responding to a bump in order to “bounce” (a behavior that occurs after the robot bumps an object), and drive (cruising). 
Movement of the robot, if any, occurs while a behavior is arbitrated. If more than one behavior is in the arbiter, the behavior with a higher priority is executed, as long as any corresponding required conditions are met. For example, the cliff avoiding behavior will not be executed unless a cliff has been detected by a cliff detection sensor, but execution of the cliff avoiding behavior always takes precedence over the execution of other behaviors that also have satisfied enabling conditions.
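The fixed-priority arbitration described above can be illustrated with a minimal sketch. The behavior names, priority numbers, and the sensor-dictionary interface are assumptions for illustration only; the source specifies just the priority ordering and the enabling-condition rule.

```python
class Behavior:
    """A behavior with a fixed priority and an enabling-condition test."""
    def __init__(self, name, priority, enabled):
        self.name = name
        self.priority = priority   # higher number = higher priority
        self.enabled = enabled     # callable: sensor dict -> bool

def arbitrate(behaviors, sensors):
    """Pick the highest-priority behavior whose enabling conditions hold."""
    candidates = [b for b in behaviors if b.enabled(sensors)]
    return max(candidates, key=lambda b: b.priority) if candidates else None

# Illustrative priority ordering: escape/avoidance above working behaviors.
behaviors = [
    Behavior("cliff_avoid", 90, lambda s: s.get("cliff", False)),
    Behavior("bounce",      50, lambda s: s.get("bump", False)),
    Behavior("drive",       10, lambda s: True),  # cruising always enabled
]
```

With this arrangement, cliff avoiding always wins whenever a cliff is detected, exactly as the text requires, while drive runs only when nothing higher-priority is enabled.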


The reactive behaviors have, as their enabling conditions or triggers, various sensors and detections of phenomena. These include sensors for obstacle avoidance and detection, such as forward proximity detection (multiple), forward bump detection (multiple), cliff sensors (multiple), and detection of a virtual wall signal (which may instead be considered a coverage trigger). Sensors of these types are monitored and conditioned by filters, conditioning, and their drivers, which can generate the enabling conditions as well as record data that helps the behavior act predictably and on all available information (e.g., conversion to one-bit "true/false" signals, recording of the likely angle of impact or incidence based on strength or time differences from a group of sensors, or historical, averaging, frequency, or variance information).


Actual physical sensors may be represented in the architecture by "virtual" sensors synthesized from the conditioning and drivers. Additional "virtual" sensors may be synthesized from detectable or interpreted physical properties, proprioceptive or interpreted upon the robot 100, such as over-current of a motor, a stasis or stuck condition of the robot 100 (detected by monitoring a lack of odometry readings from a wheel encoder or counter), and battery charge state via coulometry.
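The stasis virtual sensor mentioned above can be sketched as follows. The class name, the tick threshold, and the encoder interface are illustrative assumptions; the source states only that stasis is inferred from a lack of odometry readings while motion is commanded.

```python
class StasisSensor:
    """Virtual sensor: flags a stuck condition when commanded motion
    produces no wheel-encoder odometry for several consecutive ticks."""
    def __init__(self, ticks_threshold=5):
        self.ticks_threshold = ticks_threshold
        self.stalled_ticks = 0

    def update(self, commanded_speed, encoder_delta):
        """Feed one control tick; return True when stasis is inferred."""
        if commanded_speed > 0 and encoder_delta == 0:
            self.stalled_ticks += 1      # commanded to move, wheels silent
        else:
            self.stalled_ticks = 0       # any odometry clears the count
        return self.stalled_ticks >= self.ticks_threshold
```

The consecutive-tick count acts as a simple debounce, so a single dropped encoder reading does not falsely trigger an escape behavior.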


In addition, reactive behaviors can act according to enabling conditions that represent detected phenomena to be sought or followed. A beam or wireless (RF, acoustic) signal can be detected without direction or, in some cases, with direction. A remote beam or marker (bar code, retro-reflective, distinctive, fiducial, or natural landmark recognized by vision) giving a direction can permit homing or relative movement; without direction, the robot 100 can nonetheless move to servo on the presence, absence, and/or relative strength of a detected signal. The reflection of a beam from the robot 100, an edge, or a line can be similarly detected, and following behaviors (such as obstacle following by the robot 100) conducted by servoing on such a signal. A debris or artifact signal can be collected by monitoring debris or objects collected by or traversed by the robot, and that signal can be an enabling condition for a reactive behavior controlling a spot coverage pattern.
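Servoing on the relative strength of a directionless signal, as described above, can be reduced to a simple trend rule. This is a hypothetical sketch: the function name, turn rate, and strength units are assumptions, not from the source.

```python
def servo_heading(prev_strength, curr_strength, turn_rate=0.2):
    """Return a heading adjustment (radians) from the signal-strength
    trend: hold course while strength rises, turn when it falls."""
    if curr_strength >= prev_strength:
        return 0.0        # signal strengthening: current heading is good
    return turn_rate      # signal weakening: turn to reacquire it
```

Repeating this each control tick makes the robot hill-climb on signal strength without ever knowing the signal's bearing.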


The robot 100 maintains concurrent, "parallel" processes that are not generally considered reactive behaviors. A scheduler may be necessary to allocate processor time to most other processes, e.g., including the arbiter and behaviors, in a co-operative or other multitasking manner. If more threading is available, fewer processes may need to be managed by the scheduler. As noted, filters, conditioning, and drivers can interpret and translate raw signals. These processes are not considered reactive behaviors and exercise no direct control over the motor drives or other actuators. In addition, in the present embodiment, brush motor controller(s) control the main and side brushes, although these may alternatively be controlled by dedicated brush behaviors and a brush control arbiter.


In accordance with another example, the gentle touch routine 902 may employ an infrared proximity detector 134 that triggers (i.e., when the receiver detects a reflection originating in the overlapping space of an emitter and receiver angled toward one another) at about 1 to 10 inches (preferably 1 to 4 inches). This distance is selected to be within the effective range of the IR proximity or cross-beam sensor 134, yet with sufficient time to slow the mobile robot 100 before a collision with a detected obstacle. Conventional proximity sensors return a signal strength that depends on obstacle albedo; cross-beam sensors 134 can be thresholded for various albedos intruding at the specific distance from the sensor where the receiver's field and the emitter's beam cross. Additionally, slowing down based on a proximately detected wall may be suppressed or turned off by the user, independently of the bump sensor 132. Controller 108 may slow the robot substantially in a steady reduction and then cruise slowly, or may execute an S-curve deceleration over about 3 inches, i.e., slow down steadily but at an accelerating or decelerating rate over about 3 inches. During escape behaviors (for example, panic, stasis, stuck, or anti-canyoning), the robot may essentially turn off the proximity sensors 134, usually by not using the proximity sensors 134 as an enabling condition for any escape behavior or some avoidance behaviors.
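One way to realize an S-curve slow-down over roughly 3 inches of travel is a cosine blend between the two speeds. The cosine profile itself is an assumption (the source says only "S-curve"); the 300 mm/s and 100 mm/s endpoints come from the description, and 76 mm approximates 3 inches.

```python
import math

FULL_SPEED = 300.0     # mm/s, full cleaning speed
REDUCED_SPEED = 100.0  # mm/s, reduced cleaning speed
RAMP_MM = 76.0         # ~3 inches of travel for the slow-down

def s_curve_speed(dist_into_ramp_mm):
    """Speed command as the robot advances through the slow-down ramp:
    gentle at both ends, steepest in the middle (an S-curve)."""
    x = min(max(dist_into_ramp_mm / RAMP_MM, 0.0), 1.0)
    blend = (1 - math.cos(math.pi * x)) / 2   # smooth 0 -> 1
    return FULL_SPEED + (REDUCED_SPEED - FULL_SPEED) * blend
```

Because the blend's derivative is zero at both ends, the deceleration ramps up and back down rather than jerking the robot, matching the "accelerating or decelerating rate" phrasing above.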


Drive system 104 may be configured to reduce the speed setting in response to a signal from proximity sensor 134 indicating detection of a forward obstacle, while continuing to advance the robot 100 and work the floor or surface according to the existing heading setting. Drive system 104 may be configured to alter the heading setting in response to a signal received from bump sensor 132 that indicates contact with an obstacle. For example, drive system 104 may be configured to alter the heading setting in response to the signals received from the bump sensor 132 and the proximity sensor 134 such that robot 100 follows a perimeter of the obstacle. In another example, drive system 104 may be configured to change heading to direct robot 100 away from the obstacle.


Proximity sensors 134 may include one or more pairs of infrared emitters and receivers. For instance, a modulated emitter and a standard receiver may be used. A light pipe (not shown), collimating or diffusing optics, Fresnel or diffractive optics, may be used in some implementations to eliminate blind spots by providing a more uniform light pattern or a light pattern more concentrated or more likely to be detected in high probability/high impact areas, such as the immediate forward direction. Alternatively, some implementations may make use of sonar or other types of proximity sensors.


In some implementations, kinetic bump sensor 132 may include a mechanical switch 130. In some implementations, bump sensor 132 may include a capacitive sensor. Other types of contact sensors may be used as well.


Drive system 104 may be configured to maneuver robot 100 at a torque (or motor current) setting in response to a signal received from bump sensor 132 which indicates contact with an obstacle. For instance, drive system 104 may increase the torque (or motor current) setting in response to a signal received from the bump sensor indicating contact with an obstacle.


In another example method of navigating an autonomous coverage robot with respect to an object on a floor, robot 100 may be initially placed on the floor (or may already be on the floor, e.g., if the robot starts itself from a charging dock) with robot 100 autonomously traversing the floor in a cleaning mode at a full cleaning speed. If robot 100 senses a nearby object in front of robot 100, it reduces the cleaning speed (e.g., to a reduced cleaning speed) and continues moving toward the object and working/cleaning the floor until detecting impact, which is likely to be with the object but may be with another object. Upon sensing impact with an object, robot 100 turns with respect to the object that it bumped and cleans next to, i.e., along, the object. Robot 100 may, for instance, follow the object's perimeter while cleaning along or next to the object. In another instance, robot 100 may maintain a somewhat constant following distance from the object while cleaning next to the object in response to the contact with the object. The following distance from the object may be a distance between robot 100 and the object immediately after the contact with the object, for instance, 0 to 2 inches. The distance is optionally less than the distance that the side or edge brush unit 106a extends beyond the side of the robot.


Robot 100 may, in some instances, perform a maneuver to move around the object in response to the contact with the object. For example, robot 100 may move in a somewhat semi-circular path around the object, or a succession of alternating partial spirals (e.g., arcs with progressively decreasing radius). In another instance, robot 100 may move away from the object and then move in a direction that is somewhat tangential to the object.


Robot 100 may decrease the cleaning speed to the reduced speed at a constant rate or, for instance, at a non-linear or exponential rate. The full cleaning speed of robot 100 may be about 300 mm/s and the reduced cleaning speed of robot 100 may be about 100 mm/s.



FIG. 10 shows kinetic bump sensors 132, floor proximity sensors 140 and an attachment brace 142 which may be used with robot 100 for detecting an adjacent floor. Kinetic bump sensors 132 may sense collisions between robot 100 and objects in the robot's forward path. Floor proximity sensors may be carried by chassis 102 and be used to sense when robot 100 is near a “cliff”, such as a set of stairs. Floor proximity sensors 140 may send signals to controller 108 indicating whether or not a cliff is detected. Based on signals from the floor proximity sensors 140, controller 108 may direct drive system 104 to change speed or velocity to avoid the cliff.



FIGS. 11 and 12 show side and exploded views of a floor proximity sensor 140. Floor proximity sensor 140 has a body with a forward section 144, a rear section 146, an emitter 148, a receiver 150, and a cover 152. Emitter 148 and receiver 150 may be capable of emitting and receiving infrared light. Emitter 148 and receiver 150 are arranged within the forward and rear body sections 144, 146 at an angle so that their axes intersect at a point beneath robot 100 at the approximate floor distance.



FIG. 13 shows an exploded view of cover 152. Cover 152 consists of a lens 154 and a cover body 156. Lens 154 may be transparent to infrared light and cover body 156 may be opaque to facilitate focusing emissions sent from emitter 148. The forward edge 158 of cover 152 is elevated above its rearward edge 159 to aid in reducing dust build-up and to ensure that light is received by receiver 150 primarily when sensor 140 is positioned correctly over a floor, with a reduced amount received when sensor 140 is over a "cliff". In some implementations, cover 152 is constructed using a material with anti-static (dissipative or conductive) properties, such as an anti-static polycarbonate, copper oxide doped or coated polycarbonate, anti-static Lexan "LNP" available from General Electric, Inc., anti-static polyethylene, anti-static ABS/polycarbonate alloy, or other like material. One example includes ABS 747 and PC 114R or 1250Y mixed with antistatic powder. Preferably, the robot shell, chassis, and other parts are also anti-static (e.g., antistatic ABS), dissipative, and/or conductive, at least in part in order to ground the anti-static cover 152. The cover 152 may also be grounded by any conductive path to ground. When the coverage robot 100 traverses a floor, a cover 152 without anti-static properties can become electrostatically charged (e.g., via friction), thereby having a propensity to accumulate oppositely charged debris, such as fuzz, which may obstruct the sensing view of the emitter 148 and receiver 150.


In cases where the floor proximity sensor 140 is properly placed on a floor, light emitted from emitter 148 reflects off the floor and back to receiver 150, resulting in a signal that is readable by controller 108. In the event that the floor proximity sensor 140 is not over a floor, the amount of light received by receiver 150 is reduced, resulting in a signal that may be interpreted by controller 108 as a cliff.
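The cliff decision described above amounts to thresholding the receiver's reading. A minimal sketch follows; the threshold values and the "blocked" upper bound are illustrative assumptions (the too-strong case is drawn from the validation discussion later in this section).

```python
CLIFF_THRESHOLD = 50     # below this, too little reflection: likely a cliff
BLOCKED_THRESHOLD = 900  # above this, implausibly strong: likely a blocked lens

def classify_floor_signal(reading):
    """Map a raw receiver reading to 'floor', 'cliff', or 'blocked'."""
    if reading < CLIFF_THRESHOLD:
        return "cliff"        # not enough reflected IR reaches the receiver
    if reading > BLOCKED_THRESHOLD:
        return "blocked"      # reflection too strong to be a real floor
    return "floor"
```

A controller polling this classification can direct the drive system to change speed or velocity whenever "cliff" is returned, as the description indicates.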



FIG. 14 is an exploded view showing an example of the caster wheel assembly 116. Caster wheel assembly 116 is separately and independently removable from the chassis 102 and the coverage robot 100. The caster wheel assembly 116 includes a caster wheel housing 162, a caster wheel 164, a wheel-drop sensor 166, and a wheel-floor proximity sensor 168.


The caster wheel housing 162 carries the caster wheel 164, the wheel drop sensor 166, and the wheel-floor proximity sensor 168. The caster wheel 164 turns about a vertical axis and rolls about a horizontal axis in the caster wheel housing 162.


The wheel drop sensor 166 detects downward displacement of the caster wheel 164 with respect to the chassis 102. The wheel drop sensor 166 determines if the caster wheel 164 is in contact with the work surface.


The wheel-floor proximity sensor 168 is housed adjacent to the caster wheel 164. The wheel-floor proximity sensor 168 detects the proximity of the floor relative to the chassis 102. The wheel-floor proximity sensor 168 includes an infrared (IR) emitter and an IR receiver. The IR emitter produces an IR signal. The IR signal reflects off of the work surface. The IR receiver detects the reflected IR signal and determines the proximity of the work surface. Alternatively, the wheel-floor proximity sensor 168 may use another type of sensor, such as a visible light sensor. The wheel-floor proximity sensor 168 prevents the coverage robot 100 from moving down a cliff in the work surface, such as a stair step or a ledge. In certain implementations, the drive wheel assemblies 114, 116 each include a wheel-floor proximity sensor.



FIG. 15 is an exploded view showing an example of the wheel-drop sensor 166. The wheel drop sensor 166 includes an IR emitter 170 and an IR receiver 172 in a housing 173. The IR emitter 170 produces an IR signal. The IR signal reflects from the caster wheel 164. The IR receiver 172 detects the reflected IR signal and determines the vertical position of the caster wheel 164.



FIG. 16 is a cross-sectional view showing an example of the caster wheel assembly 116. The view shows a top surface 174 of the caster wheel 164 from which the IR signal reflects. The IR receiver 172 uses the reflected IR signal to determine the vertical position of the caster wheel 164.


In some instances, drive system 104 may further include a validation system that validates the operability of the floor proximity sensors when all wheels drop. The validation is based on the inference that all wheels dropping is likely the result of the robot being lifted off the floor by a person, and checks that all floor proximity sensors no longer register a floor surface (either no reflection measured, or a reflection that is too strong). Any sensor that registers a floor surface or a too-strong reflection (e.g., indicating a blocked sensor) is considered blocked. In response to this detection, the robot may initiate a maintenance reporting session in which indicia or lights indicate that the floor proximity sensors are to be cleaned, and may prohibit forward motion until a validation procedure determines that all floor proximity sensors are clear and functional. For example, a mechanical switch sensor may be positioned above caster wheel 164 at a location 176 that causes it to close when the caster is depressed (e.g., it is pushed upwards by the floor), thus providing an alternate signal to controller 108 that caster wheel 164 is on the floor.
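The lifted-robot validation check above can be sketched in a few lines. The function name, the sensor-dictionary interface, and the numeric threshold are assumptions for illustration; the rule itself (while airborne, any sensor still registering a surface is blocked) follows the text.

```python
def validate_cliff_sensors(all_wheels_dropped, sensor_readings, no_floor_max=50):
    """Return the names of floor proximity sensors that still 'see' a
    surface (floor-like or too-strong reflection) while the robot is
    lifted, i.e., sensors presumed blocked and in need of cleaning."""
    if not all_wheels_dropped:
        return []     # validation only runs when the robot appears lifted
    return [name for name, reading in sensor_readings.items()
            if reading >= no_floor_max]
```

A non-empty result would trigger the maintenance indication and the prohibition on forward motion described above.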


Occasionally, an autonomous coverage robot may find itself entangled with an external object, such as frills on the end of a rug or shoe laces dangling from an untied shoe. A method of disentangling an autonomous coverage robot (such as robot 100) may initially include placing robot 100 on a floor surface, which should be considered to include instances when the robot starts itself from a dock (e.g., after a significant delay, but nonetheless having been placed on the floor). Robot 100 autonomously moves forward across the floor surface while operating the cleaning heads 106a, 106b. Robot 100 may reverse bias the edge cleaning head motor 118 in response to a measured increase (e.g., a spike, an increase above a threshold, or a rapid increase of a predetermined slope) in motor current while continuing to maneuver across the floor surface in an unchanged direction, working and/or cleaning the floor without interruption.


In some instances, robot 100 may move forward before (independently of forward motion control by the motion behaviors) reverse biasing the rotation of edge cleaning head 106a in response to an elevated cleaning head motor current. Robot 100 may independently reverse the rotation of edge cleaning head 106a in response to an increased edge cleaning head 106a motor current for a period of time. The time period of increased current may be specified, for instance, in seconds. After reverse biasing the rotation of edge cleaning head 106a, robot 100 may move in a reverse direction, alter its direction of travel, and move in the new direction.


In a particular combination, the robot includes a main cleaning head 106b extending across the middle of the robot, e.g., in a direction transverse to the robot's working path or substantially parallel to the main drive wheels, as well as an edge cleaning head 106a arranged at the lateral side of the robot, in a position that extends the edge cleaning head beyond the perimeter of the robot in the side direction so as to clean beside the robot (as opposed to solely underneath the body of the robot). The main cleaning head 106b includes at least one rotationally driven brush 111, and the edge cleaning head 106a includes at least one rotationally driven brush 120.


As shown in FIG. 9C, the main cleaning head 106b is controlled by, e.g., a brush motor control process 930. The brush motor control process monitors a current sensor of the main cleaning head motor, and when a rapid current rise occurs (e.g., spike or rise above threshold, integrated or otherwise determined slope of a predetermined amount), optionally checks if the robot is moving forward (e.g., by monitoring a process, a flag indicating forward motion, or the main drive motors directly). If the robot 100 is moving forward, without interrupting such motion (optionally isolated from the capability to do so as the robot motion is controlled by independent behaviorally controlled drive), the brush motor control process 930 applies a reverse bias to the brush motor.


The reverse bias does not rapidly rotate the motor in the reverse direction so as to avoid winding the same entangled cord, string, or tassel about the brush in the opposite direction. Instead, the brush motor control process 930 applies a slight bias, sufficient to keep the rotation of the brush near neutral. When the robot 100 moves forward, the cord, string, or tassel pulling on the brush to unwind the entanglement will only transmit an attenuated torque in the reverse direction to the motor (e.g., because of a reduction gearbox between the motor and brush permitting back-driving the gearbox at a reversed mechanical advantage), but, combined with the reverse bias, the attenuated torque results in assisted but slow unwinding of the entangled brush, of increasing speed as more tension is applied by the cord or string, e.g., as the robot moves further away from the site where the cord or string or tassel is fixed.


The reverse bias continues until a time out or until no pulling or jamming load (e.g., no entanglement) is detected on the motor, whereupon the process ends and the cleaning head resumes normal rotation in a direction to clean the surface.
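The brush motor control process described in the preceding paragraphs can be sketched as follows. The current thresholds, PWM values, and tick counts are illustrative assumptions; what is faithful to the text is the logic: a rapid current rise while driving forward triggers a slight reverse bias (not a fast reversal), held until the load clears or a timeout expires.

```python
SPIKE_THRESHOLD = 2.0   # amps: current above this suggests entanglement
CLEAR_THRESHOLD = 1.0   # amps: current below this suggests the load cleared
BIAS_PWM = -0.1         # slight reverse bias, keeping rotation near neutral
NORMAL_PWM = 0.8        # normal forward cleaning rotation
TIMEOUT_TICKS = 100     # give up unwinding after this many control ticks

class BrushMotorControl:
    """Toy version of the reverse-bias unwind process for a brush motor."""
    def __init__(self):
        self.unwinding = False
        self.ticks = 0

    def step(self, motor_current, moving_forward):
        """Return the PWM command for this control tick."""
        if not self.unwinding:
            if motor_current > SPIKE_THRESHOLD and moving_forward:
                self.unwinding = True       # entanglement suspected
                self.ticks = 0
                return BIAS_PWM
            return NORMAL_PWM
        self.ticks += 1
        # End the bias when the pulling/jamming load clears or on timeout.
        if motor_current < CLEAR_THRESHOLD or self.ticks >= TIMEOUT_TICKS:
            self.unwinding = False
            return NORMAL_PWM               # resume normal cleaning rotation
        return BIAS_PWM
```

Because the bias is only slightly negative, forward robot motion supplies most of the unwinding tension, as the text explains, and the brush never spins fast enough in reverse to re-entangle the cord in the opposite direction.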


The edge brush 120 of the edge cleaning head 106a is subject to substantially the same control in an edge brush motor control process 960, in which the edge brush 120 rotation is reverse biased 962 in a similar fashion (also shown in FIG. 9B).


Accordingly, both the main 106b and edge 106a brushes are controlled independently of one another and of robot motion, and each may disentangle itself without monitoring or disturbing the other. In some instances, both will become entangled simultaneously, and independent but simultaneous control permits them to be unwound or self-clearing at the same time. In addition, by having the brush motor under reactive control (not awaiting a drive motor state or other overall robot state) and with only a slight reverse bias, the brush will be available to unwind as soon as any rapid current rise is detected, catching an entanglement earlier, but will not move in reverse by any amount sufficient to cause a similar entangling problem in the opposite direction.


In some instances, because the motion control is independent of and does not monitor the brush state, the robot 100 continues to move forward and the brush motor control begins to reverse bias the rotation of main brush 111 after the robot 100 has proceeded some amount forward. In some instances, robot 100 may reverse the rotation of main brush 111 in response to an elevated cleaning head motor current for a period of time. After reversing the rotation of main brush 111, robot 100 may move in a reverse direction, alter its drive direction, and move in the new drive direction.



FIGS. 17 A-H illustrate examples of methods for disentangling coverage robots with various configurations of cleaning heads. In general, the cleaning heads have rollers which may be driven by electric motors. Dirt and debris may be picked up by the cleaning heads and deposited in a container for later manual or automatic disposal. Electronic control devices may be provided for the control of drive motors for changing the coverage robot's direction, and also for the control of agitating brush rollers. Such methods may allow coverage robots to resume cleaning unattended after encountering an entanglement situation.



FIG. 17A shows a side view of a cleaning head 201 of a coverage robot 200 with an agitating roller 202 in tangential contact with the work surface. Roller 202 brushes up dirt 203 towards a suction duct 204 which is integrated within a brush chamber 206. By using an air suction stream, the collected debris 210 may be conveyed to a container 212.


If the movement of roller 202 is blocked or obstructed to a predetermined or a settable extent, the cleaning head 201 may be stopped, allowing robot 200 to reverse direction with roller 202 minimally powered in the reverse direction, sufficiently to release the obstruction. For example, if a cord has become wound about roller 202, the roller 202 may be disengaged and allowed to turn so that the cord unwinds as robot 200 retreats. Robot 200 may then resume operation of roller 202 in the original direction of rotation and resume robot motion in the original direction.



FIG. 17B shows another example of disentanglement using robot 200 with the addition of a brush roller 214. Brush roller 214 may be driven by the same or a different motor and rotate normal to the working surface. Brush roller 214 sends dirt 216 from the edges of robot 200 to a pickup area 218 of roller 202.


In this example, if the movement of either roller 202 or 214 is blocked or obstructed to a predetermined or a settable extent, cleaning head 201 may be stopped, allowing robot 200 to reverse direction with rollers 202, 214 minimally powered in the reverse direction, sufficiently to release the obstruction. For example, if a cord becomes wound about either roller 202 or 214, the roller 202 or 214, or both, may be disengaged and allowed to turn so that the cord unwinds as robot 200 retreats. Robot 200 may then resume operation of rollers 202, 214 in the original direction of rotation and resume robot motion in the original direction.



FIG. 17C shows a bottom view of a coverage robot 240 and a side view of a cleaning head 242 within it. A first brush roller 244 and a second brush roller 246 are in tangential contact with the work surface. Rollers 244 and 246 may be rotated by a single or multiple motors for the purpose of agitating the work surface and dynamically lifting debris 248 trapped between them towards a suction duct 250 which is integrated within brush chamber 252. By means of an air suction stream 254, the collected debris 256 may be conveyed to a container 258.


If the movement of rollers 244, 246 is blocked or obstructed to a predetermined or a settable extent, rollers 244, 246 may be stopped, allowing robot 240 to advance forward, as shown by arrow 260, with the rollers 244, 246 minimally powered in the reverse direction, sufficiently to release the obstruction, before resuming operation of the roller motor in the original direction of rotation.



FIG. 17D shows robot 240 performing an alternate example method for disentanglement. If the movement of the agitating rollers 244, 246 is blocked or obstructed to a predetermined or a settable extent, the rollers 244, 246 may be disengaged (i.e., not actively driven). Robot 240 may then reverse direction, as shown by arrow 262, with rollers 244, 246 minimally powered in the reverse direction, sufficiently to release the obstruction, upon which rollers 244, 246 may be reengaged in their original direction of rotation and robot 240 resumes driving in its original direction (shown by arrow 264).



FIG. 17E shows a side view of a coverage robot 270 with three rollers. Robot 270 has a cleaning head 272 and a side brush 274. Cleaning head 272 has a normal agitating roller 276 and a counter-rotating agitating roller 278. Agitating rollers 276 and 278 may be rotationally driven parallel to each other and to the work surface, and brush roller 274 may be driven normal to the work surface by electric motor(s) (not shown). Brush roller 274 may pre-sweep the work surface, pushing dirt and debris towards the agitating rollers 276, 278, as shown by arrow 279. Agitating rollers 276, 278 may push dirt 280 towards a suction duct 282 which is integrated within a brush chamber 284. By using an air suction stream, the collected debris 288 may be conveyed to a container 290.


If the movement of agitating rollers 276, 278 is blocked or obstructed to a predetermined or a settable extent, the roller motor(s) may be stopped or temporarily activated in the opposite direction in an attempt to remove the blockage or obstruction. The roller motor(s) may then resume operation in the original direction of rotation.



FIG. 17F illustrates another example of a method for disentangling coverage robot 270. If the movement of agitating rollers 276, 278 is blocked or obstructed to a predetermined or a settable extent, the roller motor(s) may be stopped or temporarily activated in the opposite direction. The roller motor(s) may then resume driving rollers 276, 278 in the original direction of rotation while simultaneously reversing the direction of travel of robot 270 or imparting a twisting motion about its axis. Robot 270 may then resume motion in the original direction.



FIG. 17G shows a side view and a bottom view of a coverage robot 300 with two rollers and two air ducts. Robot 300 has a cleaning head 302 with a normal agitating roller 304 and a counter-rotating agitating roller 306. Agitating rollers 304 and 306 may be rotationally driven parallel to each other and to the work surface by electric motor(s) (not shown).


Rollers 304, 306 may dynamically lift and push dirt and debris 307 towards a primary air duct 308 which is integrated within a brush chamber 312. Dirt and debris that are passed over by rollers 304, 306 may encounter a secondary air duct 310 located behind the rollers. A suction stream generated by an air suction motor (not shown) may convey the collected dirt and debris via the ducts 308, 310 to a container 314. Associated electronic control devices provide control to drive motors for turning and changing direction of robot 300, and also for directional control of the agitating rollers 304, 306.


If the movement of the agitating rollers 304, 306 is blocked or obstructed, the control device may do one or more of stopping or minimally powering the roller motor(s) in the reverse direction, then resume operating the roller motor(s) in the original direction of rotation. Simultaneously, robot 300 may at least momentarily reverse its direction or impart a twisting motion about its axis and then resume motion in its original direction.



FIG. 17H shows another example of a disentangling method, involving robot 300 with the addition of a brush roller 316. Brush roller 316 has an axis of rotation normal to the work surface and may be driven by an existing or a dedicated electric motor. Brush roller 316 may pre-sweep the work surface and push dirt and debris towards the agitating rollers 304, 306 (as shown by arrow 318). Dirt and debris may then be removed as described above.


If the movement of the agitating rollers 304, 306 is blocked or obstructed, the control device may stop or minimally power the roller motor(s) in the reverse direction, then resume operating the roller motor(s) in the original direction of rotation. Simultaneously, robot 300 may at least momentarily reverse its direction or impart a twisting motion about its axis and then resume motion in its original direction.


Other robot details and features combinable with those described herein may be found in the following U.S. patent applications entitled “AUTONOMOUS COVERAGE ROBOT NAVIGATION SYSTEM” having assigned Ser. No. 11/633,869; “MODULAR ROBOT” having assigned Ser. No. 11/633,886; and “ROBOT SYSTEM” having assigned Ser. No. 11/633,883, the entire contents of the aforementioned applications are hereby incorporated by reference.


A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the following claims. Accordingly, other implementations are within the scope of the following claims.

Claims
  • 1. An autonomous coverage robot comprising: a drive system configured to maneuver the robot according to a heading setting and a speed setting;a bump sensor responsive to a collision of the robot with an obstacle in a forward direction; anda proximity sensor responsive to a potential obstacle forward of the robot;wherein the drive system is configured to reduce the speed setting in response to a signal from the proximity sensor indicating detection of a potential obstacle, while continuing to advance the robot according to the heading setting;wherein the drive system is configured to increase the speed setting if the drive system does not receive a subsequent signal indicating the presence of an obstacle while continuing to advance according to the heading setting and the reduced speed setting; andwherein the drive system is configured to alter the heading setting in response to a signal received from the bump sensor indicating contact with an obstacle.
  • 2. The robot of claim 1 wherein the drive system is configured to alter the heading setting in response to the signals received from the bump sensor and the proximity sensor to follow a perimeter of the obstacle.
  • 3. The robot of claim 1 wherein the drive system is configured to alter the heading setting in response to the signals received from the bump sensor and the proximity sensor to direct the robot away from the obstacle.
  • 4. The robot of claim 1 wherein the proximity sensor comprises at least one infrared emitter and receiver pair.
  • 5. The robot of claim 1 wherein the proximity sensor comprises a sonar device.
  • 6. The robot of claim 1 wherein the bump sensor comprises a switch.
  • 7. The robot of claim 1 wherein the bump sensor comprises a capacitive sensor.
  • 8. The robot of claim 1 wherein the drive system is configured to maneuver the robot at a torque setting, wherein the drive system is configured to alter the torque setting in response to a signal received from the bump sensor indicating contact with an obstacle.
  • 9. The robot of claim 8 wherein the drive system increases the torque setting in response to a signal received from the bump sensor indicating contact with an obstacle.
  • 10. The robot of claim 1 wherein the drive system is configured to increase the speed setting if the drive system does not receive the subsequent signal from the bump sensor indicating the presence of an obstacle within an elapsed time after the speed setting is reduced.
  • 11. A method of navigating an autonomous coverage robot with respect to an object on a floor, the method comprising the robot: autonomously traversing the floor in a cleaning mode at a cleaning speed; upon sensing a proximity of the object forward of the robot, reducing the cleaning speed to a reduced speed while continuing towards the object; in response to not sensing the presence of the object while advancing at the reduced speed, increasing the speed setting; and in response to sensing contact with the object, turning with respect to the object and cleaning next to the object.
  • 12. The method of claim 11 wherein the robot follows a perimeter of the object while cleaning next to the object.
  • 13. The method of claim 11 wherein the robot maintains a substantially constant following distance from the object while cleaning next to the object in response to the contact with the object.
  • 14. The method of claim 13 wherein the following distance from the object is substantially a distance between the robot and the object substantially immediately after the contact with the object.
  • 15. The method of claim 13 wherein the following distance from the object is between about 0 and 2 inches.
  • 16. The method of claim 13 wherein the robot performs a maneuver to move around the object in response to the contact with the object.
  • 17. The method of claim 16 wherein the maneuver comprises the robot moving in a substantially semi-circular path around the object.
  • 18. The method of claim 16 wherein the maneuver comprises the robot moving away from the object and then moving in a direction substantially tangential to the object.
  • 19. The method of claim 13 wherein the robot decreases the cleaning speed to a reduced speed at a constant rate.
  • 20. The method of claim 13 wherein the robot decreases the cleaning speed to a reduced speed at an exponential rate.
  • 21. The method of claim 13 wherein the robot decreases the cleaning speed to a reduced speed at a non-linear rate.
  • 22. The method of claim 13 wherein the cleaning speed of the robot is about 300 mm/sec.
  • 23. The method of claim 13 wherein the reduced speed of the robot is about 100 mm/sec.
  • 24. The method of claim 13 wherein the robot autonomously traverses the floor in the cleaning mode having a torque setting, wherein upon sensing contact with the object, the robot increases the torque setting.
  • 25. An autonomous coverage robot comprising: a drive system configured to maneuver the robot according to a heading setting and a speed setting; a bump sensor responsive to a collision of the robot with an obstacle in a forward direction; and a proximity sensor responsive to a potential obstacle forward of the robot; wherein the drive system is configured to reduce the speed setting in response to a signal from the proximity sensor indicating detection of a potential obstacle, while continuing to advance the robot according to the heading setting; wherein the drive system is configured to increase the speed setting if the drive system does not receive a signal from the bump sensor within an elapsed time after the speed setting is reduced; and wherein the drive system is configured to alter the heading setting in response to a signal received from the bump sensor indicating contact with an obstacle.
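The speed-control behavior recited in claims 1, 10, and 25 can be sketched in pseudocode form: cruise at full speed, slow on a proximity signal while holding the heading, restore full speed if no bump arrives within an elapsed time, and alter the heading on contact. This is a minimal illustrative sketch, not the patented implementation; the function names, the 90-degree turn, and the tick-based timeout are assumptions, while the speed values follow claims 22 and 23.

```python
# Illustrative sketch only: names, the turn angle, and TIMEOUT are assumed,
# not taken from the patent text.
FULL_SPEED = 300     # mm/sec cleaning speed, per claim 22
REDUCED_SPEED = 100  # mm/sec reduced speed, per claim 23
TIMEOUT = 5          # control ticks without a bump before speed is restored (assumed)

def control_step(state, proximity, bump):
    """Advance one control tick; state is (speed, heading_deg, ticks_since_slow)."""
    speed, heading, ticks = state
    if bump:
        # On contact, alter the heading setting (claim 1) and stay slow
        # to clean next to the object (claim 11).
        heading = (heading + 90) % 360
        speed = REDUCED_SPEED
        ticks = 0
    elif proximity:
        # Reduce speed but keep advancing on the same heading.
        speed = REDUCED_SPEED
        ticks = 0
    elif speed == REDUCED_SPEED:
        # Restore full speed after an elapsed time with no bump (claims 10, 25).
        ticks += 1
        if ticks >= TIMEOUT:
            speed = FULL_SPEED
            ticks = 0
    return (speed, heading, ticks)
```

For example, a proximity signal drops the state to the reduced speed; five quiet ticks later the sketch restores full speed, whereas a bump during that window turns the heading instead.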
CROSS-REFERENCE TO RELATED APPLICATIONS

This U.S. continuation patent application claims priority under 35 U.S.C. §120 to U.S. patent application Ser. No. 11/633,885, filed on Dec. 4, 2006, which claims priority under 35 U.S.C. §119(e) to U.S. provisional patent application 60/741,442, filed on Dec. 2, 2005. The entire contents of the aforementioned applications are hereby incorporated by reference.

US Referenced Citations (897)
Number Name Date Kind
1755054 Darst Apr 1930 A
1780221 Buchmann Nov 1930 A
1900885 Smellie Mar 1933 A
1970302 Gerhardt Aug 1934 A
2136324 John Nov 1938 A
2302111 Dow et al. Nov 1942 A
2353621 Sav et al. Jul 1944 A
2770825 Pullen Nov 1956 A
3119369 Harland et al. Jan 1964 A
3166138 Dunn Jan 1965 A
3333564 Waters Aug 1967 A
3375375 Robert et al. Mar 1968 A
3381652 Schaefer et al. May 1968 A
3457575 Bienek Jul 1969 A
3550714 Bellinger Dec 1970 A
3569727 Aggarwal et al. Mar 1971 A
3674316 De Brey Jul 1972 A
3678882 Kinsella Jul 1972 A
3696727 Yokozato Oct 1972 A
3744586 Leinauer Jul 1973 A
3756667 Bombardier et al. Sep 1973 A
3809004 Leonheart May 1974 A
3816004 Bignardi Jun 1974 A
3821028 Ziener et al. Jun 1974 A
3845831 James Nov 1974 A
3853086 Asplund Dec 1974 A
3863285 Hukuba Feb 1975 A
3888181 Kups Jun 1975 A
3937174 Haaga Feb 1976 A
3952361 Wilkins Apr 1976 A
3989311 Debrey Nov 1976 A
3989931 Phillips Nov 1976 A
4004313 Capra Jan 1977 A
4012681 Finger et al. Mar 1977 A
4070170 Leinfelt Jan 1978 A
4099284 Shinozaki et al. Jul 1978 A
4119900 Kremnitz Oct 1978 A
4175589 Nakamura et al. Nov 1979 A
4175892 De Brey Nov 1979 A
4196727 Verkaart et al. Apr 1980 A
4198727 Farmer Apr 1980 A
4199838 Simonsson Apr 1980 A
4209254 Reymond et al. Jun 1980 A
D258901 Keyworth Apr 1981 S
4297578 Carter Oct 1981 A
4306329 Yokoi Dec 1981 A
4309758 Halsall et al. Jan 1982 A
4328545 Halsall et al. May 1982 A
4367403 Miller Jan 1983 A
4369543 Chen et al. Jan 1983 A
4401909 Gorsek Aug 1983 A
4416033 Specht Nov 1983 A
4445245 Lu May 1984 A
4465370 Yuasa et al. Aug 1984 A
4477998 You Oct 1984 A
4481692 Kurz Nov 1984 A
4482960 Pryor Nov 1984 A
4492058 Goldfarb et al. Jan 1985 A
4513469 Godfrey et al. Apr 1985 A
D278732 Ohkado May 1985 S
4518437 Sommer May 1985 A
4534637 Suzuki et al. Aug 1985 A
4556313 Miller et al. Dec 1985 A
4575211 Matsumura et al. Mar 1986 A
4580311 Kurz Apr 1986 A
4601082 Kurz Jul 1986 A
4618213 Chen Oct 1986 A
4620285 Perdue Oct 1986 A
4624026 Olson et al. Nov 1986 A
4626995 Lofgren et al. Dec 1986 A
4628454 Ito Dec 1986 A
4638445 Mattaboni Jan 1987 A
4644156 Takahashi et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4652917 Miller Mar 1987 A
4654492 Koerner et al. Mar 1987 A
4654924 Getz et al. Apr 1987 A
4660969 Sorimachi et al. Apr 1987 A
4662854 Fang May 1987 A
4674047 Tyler Jun 1987 A
4674048 Okumura Jun 1987 A
4679152 Perdue Jul 1987 A
4680827 Hummel Jul 1987 A
4696074 Cavalli Sep 1987 A
D292223 Trumbull Oct 1987 S
4700301 Dyke Oct 1987 A
4700427 Knepper Oct 1987 A
4703820 Reinaud Nov 1987 A
4710020 Maddox et al. Dec 1987 A
4716621 Zoni Jan 1988 A
4728801 O'Connor Mar 1988 A
4733343 Yoneda et al. Mar 1988 A
4733430 Westergren Mar 1988 A
4733431 Martin Mar 1988 A
4735136 Lee et al. Apr 1988 A
4735138 Gawler et al. Apr 1988 A
4748336 Fujie et al. May 1988 A
4748833 Nagasawa Jun 1988 A
4756049 Uehara Jul 1988 A
4767213 Hummel Aug 1988 A
4769700 Pryor Sep 1988 A
4777416 George et al. Oct 1988 A
D298766 Tanno et al. Nov 1988 S
4782550 Jacobs Nov 1988 A
4796198 Boultinghouse et al. Jan 1989 A
4806751 Abe et al. Feb 1989 A
4811228 Hyyppa Mar 1989 A
4813906 Matsuyama et al. Mar 1989 A
4815157 Tsuchiya Mar 1989 A
4817000 Eberhardt Mar 1989 A
4818875 Weiner Apr 1989 A
4829442 Kadonoff et al. May 1989 A
4829626 Harkonen et al. May 1989 A
4832098 Palinkas et al. May 1989 A
4854000 Takimoto Aug 1989 A
4854006 Nishimura et al. Aug 1989 A
4855915 Dallaire Aug 1989 A
4857912 Everett et al. Aug 1989 A
4858132 Holmquist Aug 1989 A
4867570 Sorimachi et al. Sep 1989 A
4880474 Koharagi et al. Nov 1989 A
4884506 Guerreri Dec 1989 A
4887415 Martin Dec 1989 A
4891762 Chotiros Jan 1990 A
4893025 Lee Jan 1990 A
4901394 Nakamura et al. Feb 1990 A
4905151 Weiman et al. Feb 1990 A
4912643 Beirne Mar 1990 A
4918441 Bohman Apr 1990 A
4919224 Shyu et al. Apr 1990 A
4919489 Kopsco Apr 1990 A
4920060 Parrent et al. Apr 1990 A
4920605 Takashima May 1990 A
4933864 Evans et al. Jun 1990 A
4937912 Kurz Jul 1990 A
4953253 Fukuda et al. Sep 1990 A
4954962 Evans et al. Sep 1990 A
4955714 Stotler et al. Sep 1990 A
4956891 Wulff Sep 1990 A
4961303 McCarty et al. Oct 1990 A
4961304 Ovsborn et al. Oct 1990 A
4962453 Pong et al. Oct 1990 A
4971591 Raviv et al. Nov 1990 A
4973912 Kaminski et al. Nov 1990 A
4974283 Holsten et al. Dec 1990 A
4977618 Allen Dec 1990 A
4977639 Takahashi et al. Dec 1990 A
4986663 Cecchi et al. Jan 1991 A
5001635 Yasutomi et al. Mar 1991 A
5002145 Wakaumi et al. Mar 1991 A
5012886 Jonas et al. May 1991 A
5018240 Holman May 1991 A
5020186 Lessig et al. Jun 1991 A
5022812 Coughlan et al. Jun 1991 A
5023788 Kitazume et al. Jun 1991 A
5024529 Svetkoff et al. Jun 1991 A
D318500 Malewicki et al. Jul 1991 S
5032775 Mizuno et al. Jul 1991 A
5033151 Kraft et al. Jul 1991 A
5033291 Podoloff et al. Jul 1991 A
5040116 Evans et al. Aug 1991 A
5045769 Everett Sep 1991 A
5049802 Mintus et al. Sep 1991 A
5051906 Evans et al. Sep 1991 A
5062819 Mallory Nov 1991 A
5070567 Holland Dec 1991 A
5084934 Lessig et al. Feb 1992 A
5086535 Grossmeyer et al. Feb 1992 A
5090321 Abouav Feb 1992 A
5093955 Blehert et al. Mar 1992 A
5094311 Akeel Mar 1992 A
5105502 Takashima Apr 1992 A
5105550 Shenoha Apr 1992 A
5109566 Kobayashi et al. May 1992 A
5115538 Cochran et al. May 1992 A
5127128 Lee Jul 1992 A
5136675 Hodson Aug 1992 A
5136750 Takashima et al. Aug 1992 A
5142985 Stearns et al. Sep 1992 A
5144471 Takanashi et al. Sep 1992 A
5144714 Mori et al. Sep 1992 A
5144715 Matsuyo et al. Sep 1992 A
5152028 Hirano Oct 1992 A
5152202 Strauss Oct 1992 A
5155684 Burke et al. Oct 1992 A
5163202 Kawakami et al. Nov 1992 A
5163320 Goshima et al. Nov 1992 A
5164579 Pryor et al. Nov 1992 A
5165064 Mattaboni Nov 1992 A
5173881 Sindle Dec 1992 A
5182833 Yamaguchi et al. Feb 1993 A
5202742 Frank et al. Apr 1993 A
5204814 Noonan et al. Apr 1993 A
5208521 Aoyama May 1993 A
5216777 Moro et al. Jun 1993 A
5227985 DeMenthon Jul 1993 A
5233682 Abe et al. Aug 1993 A
5239720 Wood et al. Aug 1993 A
5251358 Moro et al. Oct 1993 A
5261139 Lewis Nov 1993 A
5276618 Everett Jan 1994 A
5276939 Uenishi Jan 1994 A
5277064 Knigga et al. Jan 1994 A
5279672 Betker et al. Jan 1994 A
5284452 Corona Feb 1994 A
5284522 Kobayashi et al. Feb 1994 A
5293955 Lee Mar 1994 A
D345707 Alister Apr 1994 S
5303448 Hennessey et al. Apr 1994 A
5307273 Oh et al. Apr 1994 A
5309592 Hiratsuka May 1994 A
5310379 Hippely et al. May 1994 A
5315227 Pierson et al. May 1994 A
5319827 Yang Jun 1994 A
5319828 Waldhauser et al. Jun 1994 A
5321614 Ashworth Jun 1994 A
5323483 Baeg Jun 1994 A
5324948 Dudar et al. Jun 1994 A
5341188 Kato Aug 1994 A
5341540 Soupert et al. Aug 1994 A
5341549 Wirtz et al. Aug 1994 A
5345649 Whitlow Sep 1994 A
5353224 Lee et al. Oct 1994 A
5363305 Cox et al. Nov 1994 A
5363935 Schempf et al. Nov 1994 A
5369347 Yoo Nov 1994 A
5369838 Wood et al. Dec 1994 A
5386862 Glover et al. Feb 1995 A
5399951 Lavallee et al. Mar 1995 A
5404612 Ishikawa Apr 1995 A
5410479 Coker Apr 1995 A
5435405 Schempf et al. Jul 1995 A
5440216 Kim Aug 1995 A
5442358 Keeler et al. Aug 1995 A
5444965 Colens Aug 1995 A
5446356 Kim Aug 1995 A
5446445 Bloomfield et al. Aug 1995 A
5451135 Schempf et al. Sep 1995 A
5454129 Kell Oct 1995 A
5455982 Armstrong et al. Oct 1995 A
5465525 Mifune et al. Nov 1995 A
5465619 Sotack et al. Nov 1995 A
5467273 Faibish et al. Nov 1995 A
5471560 Allard et al. Nov 1995 A
5491670 Weber Feb 1996 A
5497529 Boesi Mar 1996 A
5498948 Bruni et al. Mar 1996 A
5502638 Takenaka Mar 1996 A
5505072 Oreper Apr 1996 A
5507067 Hoekstra et al. Apr 1996 A
5510893 Suzuki Apr 1996 A
5511147 Abdel Apr 1996 A
5515572 Hoekstra et al. May 1996 A
5534762 Kim Jul 1996 A
5537017 Feiten et al. Jul 1996 A
5537711 Tseng Jul 1996 A
5539953 Kurz Jul 1996 A
5542146 Hoekstra et al. Aug 1996 A
5542148 Young Aug 1996 A
5546631 Chambon Aug 1996 A
5548511 Bancroft Aug 1996 A
5551525 Pack et al. Sep 1996 A
5553349 Kilstrom et al. Sep 1996 A
5555587 Guha Sep 1996 A
5560077 Crotchett Oct 1996 A
5563366 La Mura Oct 1996 A
5568589 Hwang Oct 1996 A
D375592 Ljunggren Nov 1996 S
5608306 Rybeck et al. Mar 1997 A
5608894 Kawakami et al. Mar 1997 A
5608944 Gordon Mar 1997 A
5610488 Miyazawa Mar 1997 A
5611106 Wulff Mar 1997 A
5611108 Knowlton et al. Mar 1997 A
5613261 Kawakami et al. Mar 1997 A
5613269 Miwa Mar 1997 A
5621291 Lee Apr 1997 A
5622236 Azumi et al. Apr 1997 A
5634237 Paranjpe Jun 1997 A
5634239 Tuvin et al. Jun 1997 A
5636402 Kubo et al. Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5646494 Han Jul 1997 A
5647554 Ikegami et al. Jul 1997 A
5650702 Azumi Jul 1997 A
5652489 Kawakami Jul 1997 A
5682313 Edlund et al. Oct 1997 A
5682839 Grimsley et al. Nov 1997 A
5696675 Nakamura et al. Dec 1997 A
5698861 Oh Dec 1997 A
5709007 Chiang Jan 1998 A
5710506 Broell et al. Jan 1998 A
5714119 Kawagoe et al. Feb 1998 A
5717169 Liang et al. Feb 1998 A
5717484 Hamaguchi et al. Feb 1998 A
5720077 Nakamura et al. Feb 1998 A
5732401 Conway Mar 1998 A
5735959 Kubo et al. Apr 1998 A
5745235 Vercammen et al. Apr 1998 A
5752871 Tsuzuki May 1998 A
5756904 Oreper et al. May 1998 A
5761762 Kubo Jun 1998 A
5764888 Bolan et al. Jun 1998 A
5767437 Rogers Jun 1998 A
5767960 Orman Jun 1998 A
5776486 Kim Jul 1998 A
5777596 Herbert Jul 1998 A
5781960 Kilstrom et al. Jul 1998 A
5786602 Pryor et al. Jul 1998 A
5787545 Colens Aug 1998 A
5793900 Nourbakhsh et al. Aug 1998 A
5794297 Muta Aug 1998 A
5812267 Everett et al. Sep 1998 A
5814808 Takada et al. Sep 1998 A
5815880 Nakanishi Oct 1998 A
5815884 Imamura et al. Oct 1998 A
5819008 Asama et al. Oct 1998 A
5819360 Fujii Oct 1998 A
5819938 Saveliev et al. Oct 1998 A
5820821 Kawagoe et al. Oct 1998 A
5821730 Drapkin Oct 1998 A
5825981 Matsuda Oct 1998 A
5828770 Leis et al. Oct 1998 A
5831597 West et al. Nov 1998 A
5839156 Park et al. Nov 1998 A
5839532 Yoshiji et al. Nov 1998 A
5841259 Kim et al. Nov 1998 A
5867800 Leif Feb 1999 A
5869910 Colens Feb 1999 A
5896611 Haaga Apr 1999 A
5903124 Kawakami May 1999 A
5905209 Oreper May 1999 A
5907886 Buscher Jun 1999 A
5910700 Crotzer Jun 1999 A
5911260 Suzuki Jun 1999 A
5916008 Wong Jun 1999 A
5924167 Wright et al. Jul 1999 A
5926909 McGee Jul 1999 A
5933102 Miller et al. Aug 1999 A
5933913 Wright et al. Aug 1999 A
5935179 Kleiner et al. Aug 1999 A
5940346 Sadowsky et al. Aug 1999 A
5940927 Haegermarck et al. Aug 1999 A
5940930 Oh et al. Aug 1999 A
5942869 Katou et al. Aug 1999 A
5943730 Boomgaarden Aug 1999 A
5943733 Tagliaferri Aug 1999 A
5947225 Kawakami et al. Sep 1999 A
5950408 Schaedler Sep 1999 A
5959423 Nakanishi et al. Sep 1999 A
5968281 Wright et al. Oct 1999 A
5974348 Rocks Oct 1999 A
5974365 Mitchell Oct 1999 A
5983448 Wright et al. Nov 1999 A
5984880 Lander et al. Nov 1999 A
5989700 Krivopal Nov 1999 A
5991951 Kubo et al. Nov 1999 A
5995883 Nishikado Nov 1999 A
5995884 Allen et al. Nov 1999 A
5996167 Close Dec 1999 A
5998953 Nakamura et al. Dec 1999 A
5998971 Corbridge Dec 1999 A
6000088 Wright et al. Dec 1999 A
6009358 Angott et al. Dec 1999 A
6021545 Delgado et al. Feb 2000 A
6023813 Thatcher et al. Feb 2000 A
6023814 Imamura Feb 2000 A
6025687 Himeda et al. Feb 2000 A
6026539 Mouw et al. Feb 2000 A
6030464 Azevedo Feb 2000 A
6030465 Marcussen et al. Feb 2000 A
6032542 Warnick et al. Mar 2000 A
6036572 Sze Mar 2000 A
6038501 Kawakami Mar 2000 A
6038572 Sze Mar 2000 A
6040669 Hog Mar 2000 A
6041471 Charky et al. Mar 2000 A
6041472 Kasen et al. Mar 2000 A
6046800 Ohtomo et al. Apr 2000 A
6049620 Dickinson et al. Apr 2000 A
6052821 Chouly et al. Apr 2000 A
6055702 Imamura et al. May 2000 A
6061868 Moritsch et al. May 2000 A
6065182 Wright et al. May 2000 A
6073432 Schaedler Jun 2000 A
6076025 Ueno et al. Jun 2000 A
6076026 Jambhekar et al. Jun 2000 A
6076226 Reed Jun 2000 A
6076227 Schallig et al. Jun 2000 A
6081257 Zeller Jun 2000 A
6088020 Mor Jul 2000 A
6094775 Behmer Aug 2000 A
6099091 Campbell Aug 2000 A
6099661 Conrad Aug 2000 A
6101671 Wright et al. Aug 2000 A
6108031 King et al. Aug 2000 A
6108067 Okamoto Aug 2000 A
6108076 Hanseder Aug 2000 A
6108269 Kabel Aug 2000 A
6108597 Kirchner et al. Aug 2000 A
6112143 Allen et al. Aug 2000 A
6112996 Matsuo Sep 2000 A
6119057 Kawagoe Sep 2000 A
6122798 Kobayashi et al. Sep 2000 A
6124694 Bancroft et al. Sep 2000 A
6125498 Roberts et al. Oct 2000 A
6131237 Kasper et al. Oct 2000 A
6138063 Himeda Oct 2000 A
6142252 Kinto et al. Nov 2000 A
6146278 Kobayashi Nov 2000 A
6154279 Thayer Nov 2000 A
6154694 Aoki et al. Nov 2000 A
6160479 Åhlén et al. Dec 2000 A
6167332 Kurtzberg et al. Dec 2000 A
6167587 Kasper et al. Jan 2001 B1
6173651 Pathe et al. Jan 2001 B1
6192548 Huffman Feb 2001 B1
6216307 Kaleta et al. Apr 2001 B1
6220865 Macri et al. Apr 2001 B1
6226830 Hendriks et al. May 2001 B1
6230362 Kasper et al. May 2001 B1
6237741 Guidetti May 2001 B1
6240342 Fiegert et al. May 2001 B1
6243913 Frank et al. Jun 2001 B1
6255793 Peless et al. Jul 2001 B1
6259979 Holmquist Jul 2001 B1
6261379 Conrad et al. Jul 2001 B1
6263539 Baig Jul 2001 B1
6263989 Won Jul 2001 B1
6272936 Oreper et al. Aug 2001 B1
6276478 Hopkins et al. Aug 2001 B1
6278918 Dickson et al. Aug 2001 B1
6282526 Ganesh Aug 2001 B1
6283034 Miles Sep 2001 B1
6285778 Nakajima et al. Sep 2001 B1
6285930 Dickson et al. Sep 2001 B1
6300737 Bergvall et al. Oct 2001 B1
6321337 Reshef et al. Nov 2001 B1
6321515 Colens Nov 2001 B1
6323570 Nishimura et al. Nov 2001 B1
6324714 Walz et al. Dec 2001 B1
6327741 Reed Dec 2001 B1
6332400 Meyer Dec 2001 B1
6339735 Peless et al. Jan 2002 B1
6362875 Burkley Mar 2002 B1
6370453 Sommer Apr 2002 B2
6374155 Wallach et al. Apr 2002 B1
6374157 Takamura Apr 2002 B1
6381802 Park May 2002 B2
6385515 Dickson et al. May 2002 B1
6388013 Saraf et al. May 2002 B1
6389329 Colens May 2002 B1
6400048 Nishimura et al. Jun 2002 B1
6401294 Kasper Jun 2002 B2
6408226 Byrne et al. Jun 2002 B1
6412141 Kasper et al. Jul 2002 B2
6415203 Inoue et al. Jul 2002 B1
6421870 Basham et al. Jul 2002 B1
6427285 Legatt et al. Aug 2002 B1
6430471 Kintou et al. Aug 2002 B1
6431296 Won Aug 2002 B1
6437227 Theimer Aug 2002 B1
6437465 Nishimura et al. Aug 2002 B1
6438456 Feddema et al. Aug 2002 B1
6438793 Miner et al. Aug 2002 B1
6442476 Poropat Aug 2002 B1
6443509 Levin et al. Sep 2002 B1
6444003 Sutcliffe Sep 2002 B1
6446302 Kasper et al. Sep 2002 B1
6454036 Airey et al. Sep 2002 B1
D464091 Christianson Oct 2002 S
6457206 Judson Oct 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6463368 Feiten et al. Oct 2002 B1
6465982 Bergvall et al. Oct 2002 B1
6473167 Odell Oct 2002 B1
6480762 Uchikubo et al. Nov 2002 B1
6481515 Kirkpatrick et al. Nov 2002 B1
6490539 Dickson et al. Dec 2002 B1
6490977 Bossarte et al. Dec 2002 B1
6491127 Holmberg et al. Dec 2002 B1
6493612 Bisset et al. Dec 2002 B1
6493613 Peless et al. Dec 2002 B2
6496754 Song et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6502657 Kerrebrock et al. Jan 2003 B2
6504610 Bauer et al. Jan 2003 B1
6507773 Parker et al. Jan 2003 B2
6525509 Petersson et al. Feb 2003 B1
D471243 Cioffi et al. Mar 2003 S
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6540607 Mokris et al. Apr 2003 B2
6548982 Papanikolopoulos et al. Apr 2003 B1
6553612 Dyson et al. Apr 2003 B1
6556722 Russell et al. Apr 2003 B1
6556892 Kuroki et al. Apr 2003 B2
6557104 Vu et al. Apr 2003 B2
D474312 Stephens et al. May 2003 S
6563130 Dworkowski et al. May 2003 B2
6571415 Gerber et al. Jun 2003 B2
6571422 Gordon et al. Jun 2003 B1
6572711 Sclafani et al. Jun 2003 B2
6574536 Kawagoe et al. Jun 2003 B1
6580246 Jacobs Jun 2003 B2
6584376 Van Kommer Jun 2003 B1
6586908 Petersson et al. Jul 2003 B2
6587573 Stam et al. Jul 2003 B1
6590222 Bisset et al. Jul 2003 B1
6594551 McKinney et al. Jul 2003 B2
6594844 Jones Jul 2003 B2
D478884 Slipy et al. Aug 2003 S
6601265 Burlington Aug 2003 B1
6604021 Imai et al. Aug 2003 B2
6604022 Parker et al. Aug 2003 B2
6605156 Clark et al. Aug 2003 B1
6611120 Song et al. Aug 2003 B2
6611734 Parker et al. Aug 2003 B2
6611738 Ruffner Aug 2003 B2
6615108 Peless et al. Sep 2003 B1
6615885 Ohm Sep 2003 B1
6622465 Jerome et al. Sep 2003 B2
6624744 Wilson et al. Sep 2003 B1
6625843 Kim et al. Sep 2003 B2
6629028 Paromtchik et al. Sep 2003 B2
6639659 Granger Oct 2003 B2
6658325 Zweig Dec 2003 B2
6658354 Lin Dec 2003 B2
6658692 Lenkiewicz et al. Dec 2003 B2
6658693 Reed Dec 2003 B1
6661239 Ozick Dec 2003 B1
6662889 De Fazio et al. Dec 2003 B2
6668951 Won Dec 2003 B2
6670817 Fournier et al. Dec 2003 B2
6671592 Bisset et al. Dec 2003 B1
6687571 Byrne et al. Feb 2004 B1
6690134 Jones et al. Feb 2004 B1
6690993 Foulke et al. Feb 2004 B2
6697147 Ko et al. Feb 2004 B2
6711280 Stafsudd et al. Mar 2004 B2
6732826 Song et al. May 2004 B2
6737591 Lapstun et al. May 2004 B1
6741054 Koselka et al. May 2004 B2
6741364 Lange et al. May 2004 B2
6748297 Song et al. Jun 2004 B2
6756703 Chang Jun 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6764373 Osawa et al. Jul 2004 B1
6769004 Barrett Jul 2004 B2
6774596 Bisset Aug 2004 B1
6779380 Nieuwkamp Aug 2004 B1
6781338 Jones et al. Aug 2004 B2
6809490 Jones et al. Oct 2004 B2
6810305 Kirkpatrick Oct 2004 B2
6830120 Yashima et al. Dec 2004 B1
6832407 Salem et al. Dec 2004 B2
6836701 McKee Dec 2004 B2
6841963 Song et al. Jan 2005 B2
6845297 Allard Jan 2005 B2
6856811 Burdue et al. Feb 2005 B2
6859010 Jeon et al. Feb 2005 B2
6859682 Naka et al. Feb 2005 B2
6860206 Rudakevych et al. Mar 2005 B1
6865447 Lau et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6871115 Huang et al. Mar 2005 B2
6883201 Jones et al. Apr 2005 B2
6886651 Slocum et al. May 2005 B1
6888333 Laby May 2005 B2
6901624 Mori et al. Jun 2005 B2
6906702 Tanaka et al. Jun 2005 B1
6101670 Song Jul 2005 C1
6914403 Tsurumi Jul 2005 B2
6917854 Bayer Jul 2005 B2
6925679 Wallach et al. Aug 2005 B2
6929548 Wang Aug 2005 B2
D510066 Hickey et al. Sep 2005 S
6938298 Aasen Sep 2005 B2
6940291 Ozick Sep 2005 B1
6941199 Bottomley et al. Sep 2005 B1
6956348 Landry et al. Oct 2005 B2
6957712 Song et al. Oct 2005 B2
6960986 Asama et al. Nov 2005 B2
6965209 Jones et al. Nov 2005 B2
6965211 Tsurumi Nov 2005 B2
6968592 Takeuchi et al. Nov 2005 B2
6971140 Kim Dec 2005 B2
6975246 Trudeau Dec 2005 B1
6980229 Ebersole Dec 2005 B1
6985556 Shanmugavel et al. Jan 2006 B2
6993954 George et al. Feb 2006 B1
6999850 McDonald Feb 2006 B2
7013527 Thomas et al. Mar 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7024280 Parker et al. Apr 2006 B2
7027893 Perry et al. Apr 2006 B2
7030768 Wanie Apr 2006 B2
7031805 Lee et al. Apr 2006 B2
7032469 Bailey Apr 2006 B2
7053578 Diehl et al. May 2006 B2
7054716 McKee et al. May 2006 B2
5987383 Keller et al. Jun 2006 C1
7055210 Keppler et al. Jun 2006 B2
7057120 Ma et al. Jun 2006 B2
7057643 Iida et al. Jun 2006 B2
7065430 Naka et al. Jun 2006 B2
7066291 Martins et al. Jun 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7079923 Abramson et al. Jul 2006 B2
7085623 Siegers Aug 2006 B2
7085624 Aldred et al. Aug 2006 B2
7113847 Chmura et al. Sep 2006 B2
7133746 Abramson et al. Nov 2006 B2
7142198 Lee Nov 2006 B2
7148458 Schell et al. Dec 2006 B2
7155308 Jones Dec 2006 B2
7167775 Abramson et al. Jan 2007 B2
7171285 Kim et al. Jan 2007 B2
7174238 Zweig Feb 2007 B1
7188000 Chiappetta et al. Mar 2007 B2
7193384 Norman et al. Mar 2007 B1
7196487 Jones et al. Mar 2007 B2
7201786 Wegelin et al. Apr 2007 B2
7206677 Huldén Apr 2007 B2
7211980 Bruemmer et al. May 2007 B1
7225500 Diehl et al. Jun 2007 B2
7246405 Yan Jul 2007 B2
7248951 Huldén Jul 2007 B2
7275280 Haegermarck et al. Oct 2007 B2
7283892 Boillot et al. Oct 2007 B1
7288912 Landry et al. Oct 2007 B2
7318248 Yan et al. Jan 2008 B1
7320149 Huffman et al. Jan 2008 B1
7324870 Lee Jan 2008 B2
7328196 Peters Feb 2008 B2
7332890 Cohen et al. Feb 2008 B2
7352153 Yan Apr 2008 B2
7359766 Jeon et al. Apr 2008 B2
7360277 Moshenrose et al. Apr 2008 B2
7363108 Noda et al. Apr 2008 B2
7388879 Sabe et al. Jun 2008 B2
7389166 Harwig et al. Jun 2008 B2
7408157 Yan Aug 2008 B2
7418762 Arai et al. Sep 2008 B2
7430455 Casey et al. Sep 2008 B2
7430462 Chiu et al. Sep 2008 B2
7441298 Svendsen et al. Oct 2008 B2
7444206 Abramson et al. Oct 2008 B2
7448113 Jones et al. Nov 2008 B2
7459871 Landry et al. Dec 2008 B2
7467026 Sakagami et al. Dec 2008 B2
7474941 Kim et al. Jan 2009 B2
7503096 Lin Mar 2009 B2
7515991 Egawa et al. Apr 2009 B2
7555363 Augenbraun et al. Jun 2009 B2
7557703 Yamada et al. Jul 2009 B2
7568259 Yan Aug 2009 B2
7571511 Jones et al. Aug 2009 B2
7578020 Jaworski et al. Aug 2009 B2
7600521 Woo Oct 2009 B2
7603744 Reindle Oct 2009 B2
7617557 Reindle Nov 2009 B2
7620476 Morse et al. Nov 2009 B2
7636982 Jones et al. Dec 2009 B2
7647144 Haegermarck Jan 2010 B2
7650666 Jang Jan 2010 B2
7660650 Kawagoe et al. Feb 2010 B2
7663333 Jones et al. Feb 2010 B2
7693605 Park Apr 2010 B2
7706917 Chiappetta et al. Apr 2010 B1
6925357 Wang et al. May 2010 B2
7765635 Park Aug 2010 B2
7801645 Taylor et al. Sep 2010 B2
7805220 Taylor et al. Sep 2010 B2
7809944 Kawamoto Oct 2010 B2
7849555 Hahm et al. Dec 2010 B2
7853645 Brown et al. Dec 2010 B2
7920941 Park et al. Apr 2011 B2
7937800 Yan May 2011 B2
7957836 Myeong et al. Jun 2011 B2
20010004719 Sommer Jun 2001 A1
20010013929 Torsten Aug 2001 A1
20010020200 Das et al. Sep 2001 A1
20010025183 Shahidi Sep 2001 A1
20010037163 Allard Nov 2001 A1
20010043509 Green et al. Nov 2001 A1
20010045883 Holdaway et al. Nov 2001 A1
20010047231 Peless et al. Nov 2001 A1
20010047895 De Fazio et al. Dec 2001 A1
20020011367 Kolesnik Jan 2002 A1
20020011813 Koselka et al. Jan 2002 A1
20020016649 Jones Feb 2002 A1
20020021219 Edwards Feb 2002 A1
20020027652 Paromtchik et al. Mar 2002 A1
20020036779 Kiyoi et al. Mar 2002 A1
20020081937 Yamada et al. Jun 2002 A1
20020095239 Wallach et al. Jul 2002 A1
20020097400 Jung et al. Jul 2002 A1
20020104963 Mancevski Aug 2002 A1
20020108209 Peterson Aug 2002 A1
20020112742 Bredo et al. Aug 2002 A1
20020113973 Ge Aug 2002 A1
20020116089 Kirkpatrick Aug 2002 A1
20020120364 Colens Aug 2002 A1
20020124343 Reed Sep 2002 A1
20020153185 Song et al. Oct 2002 A1
20020156556 Ruffner Oct 2002 A1
20020159051 Guo Oct 2002 A1
20020166193 Kasper Nov 2002 A1
20020169521 Goodman et al. Nov 2002 A1
20020173877 Zweig Nov 2002 A1
20020189871 Won Dec 2002 A1
20030009259 Hattori et al. Jan 2003 A1
20030019071 Field et al. Jan 2003 A1
20030023356 Keable Jan 2003 A1
20030024986 Mazz et al. Feb 2003 A1
20030025472 Jones et al. Feb 2003 A1
20030028286 Glenn et al. Feb 2003 A1
20030030399 Jacobs Feb 2003 A1
20030058262 Sato et al. Mar 2003 A1
20030060928 Abramson et al. Mar 2003 A1
20030067451 Tagg et al. Apr 2003 A1
20030097875 Lentz et al. May 2003 A1
20030120389 Abramson et al. Jun 2003 A1
20030124312 Autumn Jul 2003 A1
20030126352 Barrett Jul 2003 A1
20030137268 Papanikolopoulos et al. Jul 2003 A1
20030146384 Logsdon et al. Aug 2003 A1
20030192144 Song et al. Oct 2003 A1
20030193657 Uomori et al. Oct 2003 A1
20030216834 Allard Nov 2003 A1
20030221114 Hino et al. Nov 2003 A1
20030229421 Chmura et al. Dec 2003 A1
20030229474 Suzuki et al. Dec 2003 A1
20030233171 Heiligensetzer Dec 2003 A1
20030233177 Johnson et al. Dec 2003 A1
20030233870 Mancevski Dec 2003 A1
20030233930 Ozick Dec 2003 A1
20040016077 Song et al. Jan 2004 A1
20040020000 Jones Feb 2004 A1
20040030448 Solomon Feb 2004 A1
20040030449 Solomon Feb 2004 A1
20040030450 Solomon Feb 2004 A1
20040030451 Solomon Feb 2004 A1
20040030570 Solomon Feb 2004 A1
20040030571 Solomon Feb 2004 A1
20040031113 Wosewick et al. Feb 2004 A1
20040049877 Jones et al. Mar 2004 A1
20040055163 McCambridge et al. Mar 2004 A1
20040068351 Solomon Apr 2004 A1
20040068415 Solomon Apr 2004 A1
20040068416 Solomon Apr 2004 A1
20040074038 Im et al. Apr 2004 A1
20040074044 Diehl et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040083570 Song et al. May 2004 A1
20040085037 Jones et al. May 2004 A1
20040088079 Lavarec et al. May 2004 A1
20040093122 Galibraith May 2004 A1
20040098167 Yi et al. May 2004 A1
20040111184 Chiappetta et al. Jun 2004 A1
20040111821 Lenkiewicz et al. Jun 2004 A1
20040113777 Matsuhira et al. Jun 2004 A1
20040117064 McDonald Jun 2004 A1
20040117846 Karaoguz et al. Jun 2004 A1
20040118998 Wingett et al. Jun 2004 A1
20040128028 Miyamoto et al. Jul 2004 A1
20040133316 Dean Jul 2004 A1
20040134336 Solomon Jul 2004 A1
20040134337 Solomon Jul 2004 A1
20040143919 Wilder Jul 2004 A1
20040148419 Chen et al. Jul 2004 A1
20040148731 Damman et al. Aug 2004 A1
20040153212 Profio et al. Aug 2004 A1
20040156541 Jeon et al. Aug 2004 A1
20040158357 Lee et al. Aug 2004 A1
20040181706 Chen et al. Sep 2004 A1
20040187249 Jones et al. Sep 2004 A1
20040187457 Colens Sep 2004 A1
20040196451 Aoyama Oct 2004 A1
20040200505 Taylor et al. Oct 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040210345 Noda et al. Oct 2004 A1
20040210347 Sawada et al. Oct 2004 A1
20040211444 Taylor et al. Oct 2004 A1
20040221790 Sinclair et al. Nov 2004 A1
20040236468 Taylor et al. Nov 2004 A1
20040244138 Taylor et al. Dec 2004 A1
20040255425 Arai et al. Dec 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050010330 Abramson et al. Jan 2005 A1
20050010331 Taylor et al. Jan 2005 A1
20050021181 Kim et al. Jan 2005 A1
20050067994 Jones et al. Mar 2005 A1
20050085947 Aldred et al. Apr 2005 A1
20050137749 Jeon et al. Jun 2005 A1
20050144751 Kegg et al. Jul 2005 A1
20050150074 Diehl et al. Jul 2005 A1
20050150519 Keppler et al. Jul 2005 A1
20050154795 Kuz et al. Jul 2005 A1
20050156562 Cohen et al. Jul 2005 A1
20050165508 Kanda et al. Jul 2005 A1
20050166354 Uehigashi Aug 2005 A1
20050166355 Tani Aug 2005 A1
20050172445 Diehl et al. Aug 2005 A1
20050183229 Uehigashi Aug 2005 A1
20050183230 Uehigashi Aug 2005 A1
20050187678 Myeong et al. Aug 2005 A1
20050192270 Landau Sep 2005 A1
20050192707 Park et al. Sep 2005 A1
20050204717 Colens Sep 2005 A1
20050209736 Kawagoe Sep 2005 A1
20050211880 Schell et al. Sep 2005 A1
20050212929 Schell et al. Sep 2005 A1
20050213082 DiBernardo et al. Sep 2005 A1
20050213109 Schell et al. Sep 2005 A1
20050217042 Reindle Oct 2005 A1
20050218852 Landry et al. Oct 2005 A1
20050222933 Wesby Oct 2005 A1
20050229340 Sawalski et al. Oct 2005 A1
20050229355 Crouch et al. Oct 2005 A1
20050235451 Yan Oct 2005 A1
20050251292 Casey et al. Nov 2005 A1
20050255425 Pierson Nov 2005 A1
20050258154 Blankenship et al. Nov 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20050288819 de Guzman Dec 2005 A1
20060000050 Cipolla et al. Jan 2006 A1
20060010638 Shimizu et al. Jan 2006 A1
20060020369 Taylor et al. Jan 2006 A1
20060020370 Abramson Jan 2006 A1
20060021168 Nishikawa Feb 2006 A1
20060025134 Cho et al. Feb 2006 A1
20060037170 Shimizu Feb 2006 A1
20060042042 Mertes et al. Mar 2006 A1
20060044546 Lewin et al. Mar 2006 A1
20060060216 Woo Mar 2006 A1
20060061657 Rew et al. Mar 2006 A1
20060064828 Stein et al. Mar 2006 A1
20060087273 Ko et al. Apr 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060100741 Jung May 2006 A1
20060119839 Bertin et al. Jun 2006 A1
20060143295 Costa-Requena et al. Jun 2006 A1
20060146776 Kim Jul 2006 A1
20060190133 Konandreas et al. Aug 2006 A1
20060190146 Morse et al. Aug 2006 A1
20060196003 Song et al. Sep 2006 A1
20060220900 Ceskutti et al. Oct 2006 A1
20060259194 Chiu Nov 2006 A1
20060259494 Watson et al. Nov 2006 A1
20060288519 Jaworski et al. Dec 2006 A1
20060293787 Kanda et al. Dec 2006 A1
20070006404 Cheng et al. Jan 2007 A1
20070017061 Yan Jan 2007 A1
20070028574 Yan Feb 2007 A1
20070032904 Kawagoe et al. Feb 2007 A1
20070042716 Goodall et al. Feb 2007 A1
20070043459 Abbott et al. Feb 2007 A1
20070061041 Zweig Mar 2007 A1
20070114975 Cohen et al. May 2007 A1
20070150096 Yeh et al. Jun 2007 A1
20070157415 Lee et al. Jul 2007 A1
20070157420 Lee et al. Jul 2007 A1
20070179670 Chiappetta et al. Aug 2007 A1
20070226949 Hahm et al. Oct 2007 A1
20070234492 Svendsen et al. Oct 2007 A1
20070244610 Ozick et al. Oct 2007 A1
20070250212 Halloran et al. Oct 2007 A1
20070266508 Jones et al. Nov 2007 A1
20080007203 Cohen et al. Jan 2008 A1
20080039974 Sandin et al. Feb 2008 A1
20080052846 Kapoor et al. Mar 2008 A1
20080091304 Ozick et al. Apr 2008 A1
20080184518 Taylor et al. Aug 2008 A1
20080276407 Schnittman et al. Nov 2008 A1
20080281470 Gilbert et al. Nov 2008 A1
20080282494 Won et al. Nov 2008 A1
20080294288 Yamauchi Nov 2008 A1
20080302586 Yan Dec 2008 A1
20080307590 Jones et al. Dec 2008 A1
20090007366 Svendsen et al. Jan 2009 A1
20090038089 Landry et al. Feb 2009 A1
20090049640 Lee et al. Feb 2009 A1
20090055022 Casey et al. Feb 2009 A1
20090102296 Greene et al. Apr 2009 A1
20090292393 Casey et al. Nov 2009 A1
20100011529 Won et al. Jan 2010 A1
20100049365 Jones et al. Feb 2010 A1
20100063628 Landry et al. Mar 2010 A1
20100107355 Won et al. May 2010 A1
20100257690 Jones et al. Oct 2010 A1
20100257691 Jones et al. Oct 2010 A1
20100263158 Jones et al. Oct 2010 A1
20100268384 Jones et al. Oct 2010 A1
20100312429 Jones et al. Dec 2010 A1
RE28268 Dec 1974 E1
Foreign Referenced Citations (454)
Number Date Country
2003275566 Jun 2004 AU
21 28 842 Dec 1980 DE
2128842 Dec 1980 DE
3317376 Nov 1984 DE
34 04 202 May 1987 DE
33 17 376 Dec 1987 DE
3536907 Feb 1989 DE
3404202 Dec 1992 DE
199311014 Oct 1993 DE
4414683 Oct 1995 DE
4338841 Aug 1999 DE
19849978 Feb 2001 DE
10242257 Apr 2003 DE
102004038074 Jun 2005 DE
10357636 Jul 2005 DE
102004041021 Aug 2005 DE
102005046813 Apr 2007 DE
102005046913 Apr 2007 DE
338988 Dec 1988 DK
265542 May 1988 EP
281085 Sep 1988 EP
358628 May 1991 EP
0 433 697 Jun 1991 EP
0 437 024 Jul 1991 EP
437024 Jul 1991 EP
433697 Dec 1992 EP
479273 May 1993 EP
294101 Dec 1993 EP
554978 Mar 1994 EP
615719 Sep 1994 EP
0 792 726 Sep 1997 EP
0 861 629 Sep 1998 EP
861629 Sep 1998 EP
0 930 040 Jul 1999 EP
307381 Jul 1999 EP
930040 Oct 1999 EP
845237 Apr 2000 EP
1018315 Jul 2000 EP
1172719 Jan 2002 EP
1228734 Jun 2003 EP
1 331 537 Jul 2003 EP
1331537 Jul 2003 EP
1380245 Jan 2004 EP
1380246 Mar 2005 EP
1 553 472 Jul 2005 EP
1553472 Jul 2005 EP
1557730 Jul 2005 EP
1642522 Nov 2007 EP
2238196 Nov 2006 ES
2601443 Nov 1991 FR
2 828 589 Aug 2001 FR
702426 Jan 1954 GB
2128842 Apr 1986 GB
2213047 Aug 1989 GB
2225221 May 1990 GB
2 283 838 May 1995 GB
2284957 Jun 1995 GB
2234360 Dec 1995 GB
2300082 Sep 1999 GB
2 404 330 Feb 2005 GB
2404330 Jul 2005 GB
2 417 354 Feb 2006 GB
2417354 Feb 2006 GB
53021869 Feb 1978 JP
53110257 Sep 1978 JP
943901 Mar 1979 JP
57014726 Jan 1982 JP
57064217 Apr 1982 JP
59-5315 Jan 1984 JP
59005315 Feb 1984 JP
59033511 Mar 1984 JP
59094005 May 1984 JP
59099308 Jul 1984 JP
59112311 Jul 1984 JP
59033511 Aug 1984 JP
59120124 Aug 1984 JP
59131668 Sep 1984 JP
59164973 Sep 1984 JP
59184917 Oct 1984 JP
2283343 Nov 1984 JP
59212924 Dec 1984 JP
59226909 Dec 1984 JP
60089213 May 1985 JP
60089213 Jun 1985 JP
60211510 Oct 1985 JP
60259895 Dec 1985 JP
61023221 Jan 1986 JP
61097712 May 1986 JP
61023221 Jun 1986 JP
62074018 Apr 1987 JP
62070709 May 1987 JP
62-120510 Jun 1987 JP
62120510 Jun 1987 JP
62-154008 Jul 1987 JP
62154008 Jul 1987 JP
62164431 Oct 1987 JP
62263507 Nov 1987 JP
62263508 Nov 1987 JP
62189057 Dec 1987 JP
63079623 Apr 1988 JP
63-183032 Jul 1988 JP
63158032 Jul 1988 JP
63-241610 Oct 1988 JP
1162454 Jun 1989 JP
2-6312 Jan 1990 JP
2006312 Jan 1990 JP
2026312 Jan 1990 JP
2026312 Jun 1990 JP
2283343 Nov 1990 JP
03-051023 Mar 1991 JP
3051023 Mar 1991 JP
3197758 Aug 1991 JP
3201903 Sep 1991 JP
4019586 Mar 1992 JP
4084921 Mar 1992 JP
5-023269 Feb 1993 JP
HEI 05-046246 Feb 1993 JP
5023269 Apr 1993 JP
5091604 Apr 1993 JP
5042076 Jun 1993 JP
5046246 Jun 1993 JP
5150827 Jun 1993 JP
5150829 Jun 1993 JP
5046239 Jul 1993 JP
5054620 Jul 1993 JP
5-257533 Oct 1993 JP
5040519 Oct 1993 JP
5257527 Oct 1993 JP
5257533 Oct 1993 JP
5285861 Nov 1993 JP
HEI 05-324068 Dec 1993 JP
6003251 Jan 1994 JP
6026312 Apr 1994 JP
6137828 May 1994 JP
6293095 Oct 1994 JP
06-327598 Nov 1994 JP
6105781 Dec 1994 JP
7059702 Mar 1995 JP
7-129239 May 1995 JP
7059702 Jun 1995 JP
7222705 Aug 1995 JP
7270518 Oct 1995 JP
7281742 Oct 1995 JP
7281752 Oct 1995 JP
7-295636 Nov 1995 JP
7295636 Nov 1995 JP
7311041 Nov 1995 JP
7313417 Dec 1995 JP
7319542 Dec 1995 JP
8-16776 Jan 1996 JP
8000393 Jan 1996 JP
8016241 Jan 1996 JP
8016776 Feb 1996 JP
8063229 Mar 1996 JP
8083125 Mar 1996 JP
08-089451 Apr 1996 JP
8089449 Apr 1996 JP
2520732 May 1996 JP
8123548 May 1996 JP
08-152916 Jun 1996 JP
8152916 Jun 1996 JP
8256960 Oct 1996 JP
8263137 Oct 1996 JP
8286741 Nov 1996 JP
8286744 Nov 1996 JP
8322774 Dec 1996 JP
8335112 Dec 1996 JP
9043901 Feb 1997 JP
9044240 Feb 1997 JP
9047413 Feb 1997 JP
9066855 Mar 1997 JP
9145309 Jun 1997 JP
9160644 Jun 1997 JP
9-179625 Jul 1997 JP
9179625 Jul 1997 JP
9179685 Jul 1997 JP
9185410 Jul 1997 JP
9192069 Jul 1997 JP
9204223 Aug 1997 JP
9206258 Aug 1997 JP
9233712 Sep 1997 JP
09251318 Sep 1997 JP
9251318 Sep 1997 JP
9265319 Oct 1997 JP
9269807 Oct 1997 JP
9269810 Oct 1997 JP
02555263 Nov 1997 JP
9319431 Dec 1997 JP
9319432 Dec 1997 JP
9319434 Dec 1997 JP
9325812 Dec 1997 JP
10055215 Feb 1998 JP
10117973 May 1998 JP
10118963 May 1998 JP
10177414 Jun 1998 JP
10214114 Aug 1998 JP
10228316 Aug 1998 JP
10240342 Sep 1998 JP
10260727 Sep 1998 JP
10295595 Nov 1998 JP
11015941 Jan 1999 JP
11065655 Mar 1999 JP
11085269 Mar 1999 JP
11102219 Apr 1999 JP
11102220 Apr 1999 JP
11162454 Jun 1999 JP
11174145 Jul 1999 JP
11175149 Jul 1999 JP
11178764 Jul 1999 JP
11178765 Jul 1999 JP
11-508810 Aug 1999 JP
11212642 Aug 1999 JP
11213157 Aug 1999 JP
11508810 Aug 1999 JP
11-510935 Sep 1999 JP
11248806 Sep 1999 JP
11510935 Sep 1999 JP
11282532 Oct 1999 JP
11282533 Oct 1999 JP
11295412 Oct 1999 JP
HEI 11-267994 Oct 1999 JP
11346964 Dec 1999 JP
2000047728 Feb 2000 JP
2000056006 Feb 2000 JP
2000056831 Feb 2000 JP
2000066722 Mar 2000 JP
2000075925 Mar 2000 JP
10240343 May 2000 JP
2000275321 Oct 2000 JP
2000353014 Dec 2000 JP
2001022443 Jan 2001 JP
2001067588 Mar 2001 JP
2001087182 Apr 2001 JP
2001121455 May 2001 JP
2001125641 May 2001 JP
2001216482 Aug 2001 JP
2001-258807 Sep 2001 JP
2001265437 Sep 2001 JP
2001-275908 Oct 2001 JP
2001289939 Oct 2001 JP
2001306170 Nov 2001 JP
2001320781 Nov 2001 JP
2001-525567 Dec 2001 JP
2002-78650 Mar 2002 JP
2002-086377 Mar 2002 JP
2002-204768 Jul 2002 JP
22204768 Jul 2002 JP
2002204769 Jul 2002 JP
2002247510 Aug 2002 JP
2002-532178 Oct 2002 JP
3356170 Oct 2002 JP
2002-323925 Nov 2002 JP
3375843 Nov 2002 JP
22323925 Nov 2002 JP
2002333920 Nov 2002 JP
2002-355206 Dec 2002 JP
2002-360471 Dec 2002 JP
2002-360482 Dec 2002 JP
2002-366227 Dec 2002 JP
22360479 Dec 2002 JP
2002360479 Dec 2002 JP
2002366227 Dec 2002 JP
2002369778 Dec 2002 JP
2003-10076 Jan 2003 JP
2003-28528 Jan 2003 JP
2003010076 Jan 2003 JP
2003010088 Jan 2003 JP
2003015740 Jan 2003 JP
2003028528 Jan 2003 JP
2003-5296 Feb 2003 JP
2003-036116 Feb 2003 JP
2003-38401 Feb 2003 JP
2003-38402 Feb 2003 JP
2003-047579 Feb 2003 JP
2003-505127 Feb 2003 JP
23052596 Feb 2003 JP
2003036116 Feb 2003 JP
2003047579 Feb 2003 JP
2003052596 Feb 2003 JP
2003-061882 Mar 2003 JP
2003084994 Mar 2003 JP
2003167628 Jun 2003 JP
2003-180586 Jul 2003 JP
2003180587 Jul 2003 JP
2003186539 Jul 2003 JP
2003190064 Jul 2003 JP
2003241836 Aug 2003 JP
2003262520 Sep 2003 JP
2003285288 Oct 2003 JP
2003304992 Oct 2003 JP
2003-310489 Nov 2003 JP
2003310509 Nov 2003 JP
2003330543 Nov 2003 JP
2004123040 Apr 2004 JP
2004148021 May 2004 JP
2004160102 Jun 2004 JP
2004166968 Jun 2004 JP
2004174228 Jun 2004 JP
2004-522231 Jul 2004 JP
2004198330 Jul 2004 JP
2004219185 Aug 2004 JP
2004-303134 Oct 2004 JP
2005118354 May 2005 JP
2005135400 May 2005 JP
2005-211360 Aug 2005 JP
2005211360 Aug 2005 JP
2005224265 Aug 2005 JP
2005-230032 Sep 2005 JP
2005230032 Sep 2005 JP
2005245916 Sep 2005 JP
2005-296511 Oct 2005 JP
2005296511 Oct 2005 JP
2005346700 Dec 2005 JP
2005352707 Dec 2005 JP
2006043071 Feb 2006 JP
2006155274 Jun 2006 JP
2006164223 Jun 2006 JP
2006227673 Aug 2006 JP
2006247467 Sep 2006 JP
2006260161 Sep 2006 JP
2006293662 Oct 2006 JP
2006296697 Nov 2006 JP
2007034866 Feb 2007 JP
2007213180 Aug 2007 JP
2007-313099 Dec 2007 JP
04074285 Apr 2008 JP
2009015611 Jan 2009 JP
2010198552 Sep 2010 JP
WO 9526512 Oct 1995 WO
WO9526512 Oct 1995 WO
WO9530887 Nov 1995 WO
WO9617258 Feb 1997 WO
WO 9715224 May 1997 WO
WO 9740734 Nov 1997 WO
WO 9741451 Nov 1997 WO
WO9740734 Nov 1997 WO
WO9741451 Nov 1997 WO
WO9853456 Nov 1998 WO
WO9905580 Feb 1999 WO
WO9916078 Apr 1999 WO
WO 9928800 Jun 1999 WO
WO 9938056 Jul 1999 WO
WO 9938237 Jul 1999 WO
WO 9943250 Sep 1999 WO
WO9959042 Nov 1999 WO
WO 0004430 Jan 2000 WO
WO0004430 Jan 2000 WO
WO0004430 Apr 2000 WO
WO 0036962 Jun 2000 WO
WO 0038026 Jun 2000 WO
WO0038026 Jun 2000 WO
WO0038028 Jun 2000 WO
WO0038029 Jun 2000 WO
WO 0078410 Dec 2000 WO
WO0106904 Feb 2001 WO
WO 0106904 Feb 2001 WO
WO 0106905 Feb 2001 WO
WO0180703 Nov 2001 WO
WO 0191623 Dec 2001 WO
WO0191623 Dec 2001 WO
WO 0239864 May 2002 WO
WO 0239868 May 2002 WO
WO0239868 May 2002 WO
WO 02058527 Aug 2002 WO
WO 02062194 Aug 2002 WO
WO02075350 Aug 2002 WO
WO 02067744 Sep 2002 WO
WO 02067745 Sep 2002 WO
WO 02071175 Sep 2002 WO
WO 02074150 Sep 2002 WO
WO 02075356 Sep 2002 WO
WO 02075469 Sep 2002 WO
WO 02075470 Sep 2002 WO
WO02067744 Sep 2002 WO
WO02067752 Sep 2002 WO
WO02069774 Sep 2002 WO
WO02075356 Sep 2002 WO
WO02075469 Sep 2002 WO
WO02075470 Sep 2002 WO
WO02081074 Oct 2002 WO
WO 02101477 Dec 2002 WO
WO02101477 Dec 2002 WO
WO03015220 Feb 2003 WO
WO03015220 Mar 2003 WO
WO03024292 Mar 2003 WO
WO 03026474 Apr 2003 WO
WO 03040845 May 2003 WO
WO 03040846 May 2003 WO
WO02069775 May 2003 WO
WO03040546 May 2003 WO
WO03062852 Jul 2003 WO
WO03062850 Jul 2003 WO
WO 2004006034 Jan 2004 WO
WO2004006034 Jan 2004 WO
WO 2004004533 Jan 2004 WO
WO2004004534 Jan 2004 WO
WO2004005956 Jan 2004 WO
WO 2004058028 Jan 2004 WO
WO 2005077244 Jan 2004 WO
WO 2006068403 Jan 2004 WO
WO 2004025947 Mar 2004 WO
WO2004043215 May 2004 WO
WO2004058028 Jul 2004 WO
WO2004059409 Jul 2004 WO
WO2005006935 Jan 2005 WO
WO2005036292 Apr 2005 WO
WO 2005055795 Jun 2005 WO
WO2005055796 Jun 2005 WO
WO2005076545 Aug 2005 WO
WO 2005077244 Aug 2005 WO
WO2005077243 Aug 2005 WO
WO 2005081074 Sep 2005 WO
WO 2005083541 Sep 2005 WO
WO2005081074 Sep 2005 WO
WO2005082223 Sep 2005 WO
WO2005098475 Oct 2005 WO
WO2005098476 Oct 2005 WO
WO2006046400 May 2006 WO
WO2006061133 Jun 2006 WO
WO2006073248 Jul 2006 WO
WO2007036490 May 2007 WO
WO2007065033 Jun 2007 WO
WO2007137234 Nov 2007 WO
Non-Patent Literature Citations (262)
Entry
PCT International Search Report, May 6, 2007.
Chamberlin, et al., Robot Locator Beacon System, NASA Goddard SFC, Design Proposal, Feb. 17, 2006.
European Search Report dated Apr. 3, 2009 in connection with EP Application No. 09154458.5-2206, 6 pages.
European Search Report dated Oct. 6, 2009 in connection with EP Application No. 09168571.9-2206, 149 pages.
HITACHI: News release: The home cleaning robot of the autonomous movement type (experimental machine) is developed, website: http://www.i4u.com/japanreleases/hitachirobot.htm, accessed Mar. 18, 2005.
International Preliminary Report for counterpart application, PCT/US2006/046395 dated Feb. 28, 2008.
International Search Report for PCT/US2006/046398, dated Oct. 31, 2007.
NPL0120 Range-Only Robot Localization and SLAM with Radio, Robotics Institute Carnegie Mellon—Author: Derek Kurth.
NPL0126 Physical Management of IT Assets in Data Centers Using RFID Technologies, Eric Champy.
NPL0127 Radio Frequency Identification: Tracking ISS Consumables, Author Unknown.
NPL0128 Robust Statistical Methods for Securing Wireless Localization in Sensor Networks, Zang Li, Wade.
Robot Review Samsung Robot Vacuum (VC-RP30W), website: http://www.onrobo.com/reviews/At_Home/Vacuum_Cleaners/on00verp30rosam/index.htm, accessed Mar. 18, 2005.
Robotic Vacuum Cleaner-Blue, website: http://www.sharperimage.com/us/en/catalog/productview.jthml?sku=S1727BLU, accessed Mar. 18, 2005.
Cameron Morland, Autonomous Lawn Mower Control, Jul. 24, 2002.
Doty, Keith L et al, “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent” AAAI 1993 Fall Symposium Series Instantiating Real-World Agents Research Triangle Park, Raleigh, NC, Oct. 22-24, 1993, pp. 1-6.
Electrolux designed for the well-lived home, website: http://www.electroluxusa.com/node57.as[?currentURL=node142.asp%3F, accessed Mar. 18, 2005.
eVac Robotic Vacuum S1727 Instruction Manual, Sharper Image Corp, Copyright 2004, 16 pgs.
Everyday Robots, website: http://www.everydayrobots.com/index.php?option=content&task=view&id=9, accessed Apr. 20, 2005.
Facts on the Trilobite webpage: “http://trilobiteelectroluxse/presskit—en/nodel1335asp?print=yes&pressID=” accessed Dec. 12, 2003.
Friendly Robotics Robotic Vacuum RV400—The Robot Store website: http://www.therobotstore.com/s.nl/sc.9/category,-109/it.A/id.43/.f, accessed Apr. 20, 2005, 5 pgs.
Gat, Erann, Robust Low-computation Sensor-driven Control for Task-Directed Navigation, Proceedings of the 1991 IEEE, International Conference on Robotics and Automation, Sacramento, California, Apr. 1991, pp. 2484-2489.
Kärcher Product Manual Download webpage: “http://wwwkarchercom/bta/downloadenshtml?ACTION=SELECTTEILENR&ID=rc3000&submitButtonName=Select+Product+Manual” and associated pdf file “5959-915enpdf (47 MB) English/English” accessed Jan. 21, 2004.
Karcher RC 3000 Cleaning Robot—user manual Manufacturer: Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred Karcher-Str 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002.
Kärcher RoboCleaner RC 3000 Product Details webpages: “http://wwwrobocleanerde/english/screen3html” through “...screenohtml” accessed Dec. 12, 2003.
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher.usa.com/showproducts.php?op=view—prod&param1=143&param2=&param3=, accessed Mar. 18, 2005.
Koolvac Robotic Vacuum Cleaner Owner's Manual, Koolatron, Undated.
NorthStar Low-Cost, Indoor Localization, Evolution robotics, Powering Intelligent Products.
Put Your Roomba . . . On “Automatic” Roomba Timer> Timed Cleaning-Floorvac Robotic Vacuum webpages: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43575198387&rd=1, accessed Apr. 20, 2005.
Put Your Roomba . . . On “Automatic” webpages: “http://www.acomputeredge.com/roomba,” accessed Apr. 20, 2005.
RoboMaid Sweeps Your Floors So You Won't Have To, the Official Site, website: http://www.thereobomaid.com/, accessed Mar. 18, 2005.
Schofield, Monica, “Neither Master nor Slave” A Practical Study in the Development and Employment of Cleaning Robots, Emerging Technologies and Factory Automation, 1999 Proceedings EFA'99 1999 7th IEEE International Conference on Barcelona, Spain Oct. 18-21, 1999, pp. 1427-1434.
Wired News: Robot Vacs Are in the House, website: http://www.wired.com/news/print/0,1294,59237,00.html, accessed Mar. 18, 2005.
Zoombot Remote Controlled Vacuum-RV-500 New Roomba 2, website: http://egi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4373497618&rd=1, accessed Apr. 20, 2005.
Shimoga et al., Touch and force reflection for telepresence surgery, 1994, IEEE, pp. 1049-1050.
Autonomous Lawn Care Applications, Conference on Robotics, Authors: Michael Gregg, Dr. Eric M. Schwartz, Dr. Antonio A. Arroyo—future work includes a mechanism for adjusting cutter height, additional motor and blade for edge cutting, May 25-26, 2006.
Sweep Strategies for a Sensory Driven Behavior Based Vacuum Cleaning Agent, Keith L. Doty and Reid R. Harrison, Oct. 22-24, 1993.
Jarosiewicz et al. "Final Report—Lucid", University of Florida Department of Electrical and Computer Engineering EEL 5666—Intelligent Machine Design Laboratory, 50 pages, Aug. 4, 1999.
Jensfelt, et al. "Active Global Localization for a mobile robot using multiple hypothesis tracking", IEEE Transactions on Robotics and Automation, vol. 17, No. 5, pp. 748-760, Oct. 2001.
Jeong, et al. "An Intelligent map-building system for indoor mobile robot using low cost photo sensors", SPIE vol. 6042, 6 pages, 2005.
Kahney, “Robot Vacs are in the House,” www.wired.com/news/technology/o,1282,59237,00.html, 6 pages, Jun. 18, 2003.
Karcher “Product Manual Download Karch”, www.karcher.com, 17 pages, 2004.
Karcher “Karcher RoboCleaner RC 3000”, www.robocleaner.de/english/screen3.html, 4 pages, Dec. 12, 2003.
Karcher USA “RC 3000 Robotics cleaner”, www.karcher-usa.com, 3 pages, Mar. 18, 2005.
Karlsson et al., The vSLAM Algorithm for Robust Localization and Mapping, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005.
Karlsson, et al Core Technologies for service Robotics, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 3, pp. 2979-2984, Sep. 28-Oct. 2, 2004.
King "Helpmate-TM-Autonomous mobile Robots Navigation Systems", SPIE vol. 1388 Mobile Robots pp. 190-198, 1990.
Kleinberg, The Localization Problem for Mobile Robots, Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994.
Knight, et al., “Localization and Identification of Visual Landmarks”, Journal of Computing Sciences in Colleges, vol. 16 Issue 4, 2001 pp. 312-313, May 2001.
Kolodko et al. “Experimental System for Real-Time Motion Estimation”, Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003.
Komoriya et al., Planning of Landmark Measurement for the Navigation of a Mobile Robot, Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC, pp. 1476-1481, Jul. 7-10, 1992.
Koolatron “KOOLVAC—Owner's Manual”, 13 pages.
Krotov, et al. “Digital Sextant”, Downloaded from the internet at: http://www.cs.cmu.edu/˜epk/, 1 page, 1995.
Krupa et al. "Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing", IEEE Transactions on Robotics and Automation, vol. 19, No. 5, pp. 842-853, Oct. 2003.
Kuhl et al. "Self Localization in Environments using Visual Angles", VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004.
Kurth, "Range-Only Robot Localization and SLAM with Radio", http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf, 60 pages, May 2004.
Lambrinos, et al. “A mobile robot employing insect strategies for navigation”, http://www8.cs.umu.se/kurser/TDBD17/VT04/dl/Assignment%20Papers/lambrinos-RAS-2000.pdf, 38 pages, Feb. 19, 1999.
Lang et al. “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994.
Lapin, “Adaptive position estimation for an automated guided vehicle”, SPIE vol. 1831 Mobile Robots VII, pp. 82-94, 1992.
LaValle et al. “Robot Motion Planning in a Changing, Partially Predictable Environment”, 1994 IEEE International Symposium on Intelligent Control. Columbus, OH, pp. 261-266, Aug. 16-18, 1994.
Lee, et al. "Localization Of a Mobile Robot Using the Image of a Moving Object", IEEE Transactions on Industrial Electronics, vol. 50, No. 3, pp. 612-619, Jun. 2003.
Lee, et al. “Development of Indoor Navigation system for Humanoid Robot Using Multi-sensors Integration”, ION NTM, San Diego, CA pp. 798-805, Jan. 22-24, 2007.
Leonard, et al. "Mobile Robot Localization by tracking Geometric Beacons", IEEE Transactions on Robotics and Automation, vol. 7, No. 3, pp. 376-382, Jun. 1991.
Li et al. "Robust Statistical Methods for Securing Wireless Localization in Sensor Networks", Wireless Information Network Laboratory, Rutgers University.
Li et al “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar”, Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999.
Lin, et al. "Mobile Robot Navigation Using Artificial Landmarks", Journal of Robotic Systems 14(2), pp. 93-106, 1997.
Linde “Dissertation, “On Aspects of Indoor Localization”” https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 28, 2006.
Lumelsky, et al. “An Algorithm for Maze Searching Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA vol. 1, pp. 111-116, 1994.
Luo et al., "Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots," 2002, IEEE, pp. 2359-2364.
Ma “Thesis: Documentation On Northstar”, California Institute of Technology, 14 pages, May 17, 2006.
Madsen, et al. “Optimal landmark selection for triangulation of robot position”, Journal of Robotics and Autonomous Systems vol. 13, pp. 277-292, 1998.
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591 pp. 25-30.
Matsutek Enterprises Co. Ltd “Automatic Rechargeable Vacuum Cleaner”, http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10 . . . , Apr. 23, 2007.
McGillem, et al. "Infra-red Location System for Navigation and Autonomous Vehicles", 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 24-29, 1988.
McGillem, et al. “A Beacon Navigation Method for Autonomous Vehicles”, IEEE Transactions on Vehicular Technology, vol. 38, No. 3, pp. 132-139, Aug. 1989.
Michelson “Autonomous Navigation”, 2000 Yearbook of Science & Technology, McGraw-Hill. New York, ISBN 0-07-052771-7, pp. 28-30, 1999.
Miro, et al. “Towards Vision Based Navigation in Large Indoor Environments”, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 2096-2102, Oct. 9-15, 2006.
MobileMag “Samsung Unveils High-tech Robot Vacuum Cleaner”, http://www.mobilemag.com/content/100/102/C2261/, 4 pages, Mar. 18, 2005.
Monteiro, et al. “Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters”, Proceedings of the IECON '93., International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 15-19, 1993.
Moore, et al. A simple Map-based Localization strategy using range measurements, SPIE vol. 5804, pp. 612-620, 2005.
Munich et al. “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006.
Munich et al. “ERSP: A Software Platform and Architecture for the Service Robotics Industry”, Intelligent Robots and Systems, 2005. (IROS 2005), pp. 460-467, Aug. 2-6, 2005.
Nam, et al. “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999.
Nitu et al. "Optomechatronic System for Position Detection of a Mobile Mini-Robot", IEEE Transactions on Industrial Electronics, vol. 52, No. 4, pp. 969-973, Aug. 2005.
On Robo "Robot Reviews Samsung Robot Vacuum (VC-RP30W)", www.onrobo.com/reviews/AT_Home/vacuum_cleaners/on00vcrb30rosam/index.htm, 2 pages, 2005.
InMach "Intelligent Machines", www.inmach.de/inside.html, 1 page, Nov. 19, 2008.
Innovation First “2004 EDU Robot Controller Reference Guide”, http://www.ifirobotics.com, 13 pgs., Mar. 1, 2004.
OnRobo "Samsung Unveils Its Multifunction Robot Vacuum", www.onrobo.com/enews/0210/samsung_vacuum.shtml, 3 pages, Mar. 18, 2005.
Pages et al. "Optimizing Plane-to-Plane Positioning Tasks by Image-Based Visual Servoing and Structured Light", IEEE Transactions on Robotics, vol. 22, No. 5, pp. 1000-1010, Oct. 2006.
Pages et al. “A camera-projector system for robot positioning by visual servoing”, Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW06), 8 pages, Jun. 17-22, 2006.
Pages, et al. “Robust decoupled visual servoing based on structured light”, 2005 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005.
Park et al. "A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors," IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on Neural Networks, Orlando, Florida, pp. 2754-2758, Jun. 27-Jul. 2, 1994.
Park, et al. "Dynamic Visual Servo Control of Robot Manipulators using Neural Networks", The Korean Institute of Telematics and Electronics, vol. 29-B, No. 10, pp. 771-779, Oct. 1992.
Paromtchik “Toward Optical Guidance of Mobile Robots”.
Paromtchik, et al. “Optical Guidance System for Multiple mobile Robots”, Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940 (May 21-26, 2001).
Penna, et al. "Models for Map Building and Navigation", IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, No. 5, pp. 1276-1301, Sep./Oct. 1993.
Pirjanian "Reliable Reaction", Proceedings of the 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 158-165, 1996.
Pirjanian “Challenges for Standards for consumer Robotics”, IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 12-15, 2005.
Pirjanian et al. “Distributed Control for a Modular, Reconfigurable Cliff Robot”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002.
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems”, Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 29-Nov. 3, 2001.
Pirjanian et al. “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination”. Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000.
Pirjanian et al. “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106. Nov. 8-9, 1999.
Pirjanian et al. “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes”, Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430. Apr. 1997.
Prassler et al., “A Short History of Cleaning Robots”, Autonomous Robots 9, 211-226, 2000, 16 pages.
Radio Frequency Identification: Tracking ISS Consumables, Author Unknown, 41 pages (NPL0127).
Remazeilles, et al. “Image based robot navigation in 3D environments”, Proc. of SPIE, vol. 6052, pp. 1-14, Dec. 6, 2005.
Rives, et al. “Visual servoing based on ellipse features”, SPIE vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993.
Robotics World Jan. 2001: “A Clean Sweep” (Jan. 2001).
Ronnback “On Methods for Assistive Mobile Robots”, http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html, 218 pages, Jan. 1, 2006.
Roth-Tabak, et al. “Environment Model for mobile Robots Indoor Navigation”, SPIE vol. 1388 Mobile Robots pp. 453-463, 1990.
Sadath M. Malik et al. "Virtual Prototyping for Conceptual Design of a Tracked Mobile Robot", Electrical and Computer Engineering, Canadian Conference on, IEEE, May 1, 2006, pp. 2349-2352.
Sahin, et al. “Development of a Visual Object Localization Module for Mobile Robots”, 1999 Third European Workshop on Advanced Mobile Robots, (Eurobot '99), pp. 65-72, 1999.
Salomon, et al. "Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing", IEEE Conference on Emerging Technologies and Factory Automation, 2006 (ETFA '06), pp. 629-632, Sep. 20-22, 2006.
Sato "Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter", Proceedings International Conference on Image Processing, vol. 1, Lausanne, Switzerland, pp. 33-36, Sep. 16-19, 1996.
Schenker, et al, “Lightweight rovers for Mars science exploration and sample return”, Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997.
Sebastian Thrun, Learning Occupancy Grid Maps With Forward Sensor Models, School of Computer Science, Carnegie Mellon University, pp. 1-28.
Shimoga et al. “Touch and Force Reflection for Telepresence Surgery”, Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994.
Sim, et al “Learning Visual Landmarks for Pose Estimation”, IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 10-15, 1999.
Sobh et al. “Case Studies in Web-Controlled Devices and Remote Manipulation”, Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 10, 2002.
Stella, et al. “Self-Location for Indoor Navigation of Autonomous Vehicles”, Part of the SPIE conference on Enhanced and Synthetic Vision SPIE vol. 3364 pp. 298-302, 1998.
Summet “Tracking Locations of Moving Hand-held Displays Using Projected Light”, Pervasive 2005, LNCS 3468 pp. 37-46 (2005).
Svedman et al. “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping”. 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems. pp. 2993-2998, 2005.
Takio et al. “Real-Time Position and Pose Tracking Method of Moving Object Using Visual Servo System”, 47th IEEE International Symposium on Circuits and Systems, pp. 167-170, 2004.
Teller “Pervasive pose awareness for people, Objects and Robots”, http://www.ai.mit/edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 6 pages, Apr. 30, 2003.
Terada et al. "An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning", 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 21-23, 1998.
The Sharper Image "e-Vac Robotic Vacuum, S1727 Instructions", www.sharperimage.com, 18 pages.
The Sharper Image “Robotic Vacuum Cleaner—Blue” www.Sharperimage.com, 2 pages, Mar. 18, 2005.
The Sharper Image "E Vac Robotic Vacuum", www.sharperiamge.com/us/en/templates/products/pipmorework1printable.jhtml, 2 pages, Mar. 18, 2005.
TheRobotStore.com “Friendly Robotics Robotic Vacuum RV400—The Robot Store”, www.therobotstore.com/s.nl/sc/9/category.-109/it.A/id.43/.f, 1 page, Apr. 20, 2005.
TotalVac.com RC3000 RoboCleaner website Mar. 18, 2005.
Trebi-Ollennu et al. “Mars Rover Pair Cooperatively Transporting a Long Payload”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 3136-3141, May 2002.
Tribelhorn et al., “Evaluating the Roomba: A low-cost ubiquitous platform for robotics research and education,” 2007, IEEE, p. 1393-1399.
Tse et al “Design of a Navigation System for a Household Mobile Robot Using Neural Networks”, Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
UAMA (Asia) Industrial Co., Ltd. “RobotFamily”, 2005.
Watanabe et al. “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique”, 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 13-18, 1990.
Watts "Robot, boldly goes where no man can", The Times, p. 20, Jan. 1985.
Wijk et al. "Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking", IEEE Transactions on Robotics and Automation, vol. 16, No. 6, pp. 740-752, Dec. 2000.
Examination report dated Apr. 18, 2011 from corresponding U.S. Appl. No. 11/633,869.
Examination report dated Apr. 5, 2011 from corresponding U.S. Appl. No. 12/959,879.
Examination report dated Dec. 22, 2010, for corresponding application EP 10174129.6.
Examination report dated Feb. 22, 2010 from corresponding U.S. Appl. No. 11/633,883.
Examination report dated Feb. 8, 2011, for corresponding application EP 10174129.6.
Examination report dated May 2, 2011 for corresponding application KR 10-2008-7016058.
Examination report dated Jul. 15, 2011 from corresponding U.S. Appl. No. 12/211,938.
Examination report dated Jul. 20, 2011 from corresponding JP application 2008-543548.
Examination report dated Jun. 21, 2011 from corresponding JP application 2011-088402.
Examination report dated Mar. 10, 2011 from corresponding JP application 2010-282185.
Examination report dated May 2, 2011 from corresponding U.S. Appl. No. 11/773,845.
Examination report dated Oct. 29, 2010 from corresponding U.S. Appl. No. 11/633,886.
Examination report dated Sep. 16, 2010 from corresponding U.S. Appl. No. 11/633,869.
Prassler et al., “A Short History of Cleaning Robots”, Autonomous Robots 9, 211-226, 2000.
Wolf et al. “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C., pp. 359-365, May 2002.
Wolf et al. “Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization”, IEEE Transactions on Robotics, vol. 21, No. 2, pp. 208-216, Apr. 2005.
Wong “EIED Online>> Robot Business”, ED Online ID# 13114, 17 pages, Jul. 2006.
Yamamoto et al. “Optical Sensing for Robot Perception and Localization”, 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 14-17, 2005.
Yata et al. “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer”, Proceedings of the 1998 IEEE International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998.
Yun, et al. “Image-Based Absolute Positioning System for Mobile Robot Navigation”, IAPR International Workshops SSPR, Hong Kong, pp. 261-269, Aug. 17-19, 2006.
Yun, et al. “Robust Positioning a Mobile Robot with Active Beacon Sensors”, Lecture Notes in Computer Science, 2006, vol. 4251, pp. 690-897, 2006.
Yuta, et al. “Implementation of an Active Optical Range Sensor Using Laser Slit for In-Door Intelligent Mobile Robot”, IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS 91), vol. 1, Osaka, Japan, pp. 415-420, Nov. 3-5, 1991.
Zha et al. “Mobile Robot Localization Using Incomplete Maps for Change Detection in a Dynamic Environment”, Advanced Intelligent Mechatronics '97, Final Program and Abstracts, IEEE/ASME International Conference, pp. 110, Jun. 16-20, 1997.
Zhang, et al. “A Novel Mobile Robot Localization Based on Vision”, SPIE vol. 6279, 6 pages, Jan. 29, 2007.
Euroflex Intelligente Monstre Mauele (English only excerpt).
Roboking—not just a vacuum cleaner, a robot! Jan. 21, 2004, 5 pages.
SVET Computers—New Technologies—Robot vacuum cleaner, 1 page.
Popco.net Make your digital life http://www.popco.net/zboard/view.php?id=tr_review&no=40 accessed Nov. 1, 2011.
Matsumura Camera Online Shop http://www.rakuten.co.jp/matsucame/587179/711512/ accessed Nov. 1, 2011.
Dyson's Robot Vacuum Cleaner—the DC06, May 2, 2004 http://www.gizmag.com/go/1282/ accessed Nov. 11, 2011.
Electrolux Trilobite, http://www.electrolux-ui.com:8080/2002%5C822%5C833102EN.pdf 10 pages.
Electrolux Trilobite, Time to enjoy life, 38 pages http://www.robocon.co.kr/trilobite/Presentation—Trilobite—Kor—030104.ppt accessed Dec. 22, 2011.
Facts on the Trilobite http://www.frc.ri.cmu.edu/˜hpm/talks/Extras/trilobite.desc.html 2 pages accessed Nov. 1, 2011.
Euroflex Jan. 1, 2006 http://www.euroflex.tv/novita_dett.php?id=15 1 page accessed Nov. 1, 2011.
FloorBotics, VR-8 Floor Cleaning Robot, Product Description for Manufacturers, http://www.consensus.com.au/SoftwareAwards/CSAarchive/CSA2004/CSAart04/FloorBot/F.
Friendly Robotics, 18 pages http://www.robotsandrelax.com/PDFs/RV400Manual.pdf accessed Dec. 22, 2011.
It's eye, 2003, www.hitachi.co.jp/rd/pdf/topics/hitac2003_10.pdf, 2 pages.
Hitachi, May 29, 2003, http://www.hitachi.co.jp/New/cnews/hl_030529_hl_030529.pdf, 8 pages.
Robot Buying Guide, LG announces the first robotic vacuum cleaner for Korea, Apr. 21, 2003 http://robotbg.com/news/2003/04122/lg_announces_the_first_robotic_vacu.
CleanMate 365, Intelligent Automatic Vacuum Cleaner, Model No. QQ-1, User Manual www.metapo.com/support/user_manual.pdf 11 pages.
UBOT, cleaning robot capable of wiping with a wet duster, http://us.aving.net/news/view.php?articleId=23031, 4 pages accessed Nov. 1, 2011.
Taipei Times, Robotic vacuum by Matsushita about to undergo testing, Mar. 26, 2002 http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338 accessed.
Tech-on! http://techon.nikkeibp.co.jp/members/01db/200203/1006501/, 4 pages, accessed Nov. 1, 2011.
IT media http://www.itmedia.co.jp/news/0111/16/robofesta_m.html accessed Nov. 1, 2011.
Yujin Robotics, an intelligent cleaning robot ‘iclebo Q’ AVING USA http://us.aving.net/news/view.php?articleId=7257, 8 pages accessed Nov. 4, 2011.
Special Reports, Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone, vol. 59, No. 9 (2004), 3 pages http://www.toshiba.co.jp/tech/review/2004/09/59_0.
Toshiba Corporation 2003, http://warp.ndl.go.jp/info:ndljp/pid/258151/www.soumu.go.jp/joho_tsusin/policyreports/chousa/netrobot/pdf/030214_1_33_a.pdf 16 pages.
http://www.karcher.de/versions/intg/assets/video/2_4_robo_en.swf, accessed Sep. 25, 2009.
McLurkin “The Ants: A community of Microrobots”, Paper submitted for requirements of BSEE at MIT, May 12, 1995.
Grumet “Robots Clean House”, Popular Mechanics, Nov. 2003.
McLurkin “Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots”, Paper submitted for requirements of BSEE at MIT, May 2004.
Kurs et al. “Wireless Power Transfer via Strongly Coupled Magnetic Resonances”, downloaded from www.sciencemag.org, Aug. 17, 2007.
Hitachi “Feature”, http://kadenfan.hitachi.co.jp/robot/feature/feature.html, 1 page, Nov. 19, 2008.
Home Robot—UBOT: Microbotusa.com, retrieved from the WWW at www.microrobotusa.com, accessed Dec. 2, 2008.
Andersen et al., “Landmark based navigation strategies”, SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, Jan. 8, 1999.
Ascii, Mar. 25, 2002, http://ascii.jp/elem/000/000/330/330024/ accessed Nov. 1, 2011.
Certified U.S. Appl. No. 60/605,066 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filing date Aug. 27, 2004.
Certified U.S. Appl. No. 60/605,181 as provided to WIPO in PCT/US2005/030422, corresponding to U.S. Appl. No. 11/574,290, U.S. publication 2008/0184518, filed Aug. 27, 2004.
Derek Kurth, “Range-Only Robot Localization and SLAM with Radio”, http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf, 60 pages, May 2004, accessed Jul. 27, 2012.
Electrolux Trilobite, Jan. 12, 2001, http://www.electrolux-ui.com:8080/2002%5C822%5C833102EN.pdf, accessed Jul. 2, 2012, 10 pages.
Florbot GE Plastics, 1989-1990, 2 pages, available at http://www.fuseid.com/, accessed Sep. 27, 2012.
Gregg et al., “Autonomous Lawn Care Applications,” 2006 Florida Conference on Recent Advances in Robotics, Miami, Florida, May 25-26, 2006, Florida International University, 5 pages.
King and Weiman, “Helpmate™ Autonomous Mobile Robots Navigation Systems,” SPIE vol. 1388 Mobile Robots, pp. 190-198 (1990).
Li et al. “Robust Statistical Methods for Securing Wireless Localization in Sensor Networks,” Information Processing in Sensor Networks, 2005, Fourth International Symposium on, pp. 91-98, Apr. 2005.
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591, pp. 25-30, Oct. 23, 2005.
Maschinemarkt Würzburg 105, Nr. 27, pp. 3, 30, Jul. 5, 1999.
Miwako Doi “Using the symbiosis of human and robots from approaching Research and Development Center,” Toshiba Corporation, 16 pages, available at http://warp.ndl.go.jp/info:ndljp/pid/258151/www.soumu.go.jp/joho_tsusin/policyreports/chousa/netrobot/pdf/030214_1_33_a.pdf, Feb. 26, 2003.
Paromtchik “Toward Optical Guidance of Mobile Robots,” Proceedings of the Fourth World Multiconference on Systemics, Cybernetics and Informatics, Orlando, FL, USA, Jul. 23, 2000, vol. IX, pp. 44-49, available at http://emotion.inrialpes.fr/˜paromt/infos/papers/paromtchik:asama:sci:2000.ps.gz, accessed Jul. 3, 2012.
Roboking—not just a vacuum cleaner, a robot!, Jan. 21, 2004, infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, accessed Oct. 10, 2011, 7 pages.
Sebastian Thrun, “Learning Occupancy Grid Maps With Forward Sensor Models,” Autonomous Robots 15, 111-127, Sep. 1, 2003.
SVET Computers—New Technologies—Robot Vacuum Cleaner, Oct. 1999, available at http://www.sk.rs/1999/10/sknt01.html, accessed Nov. 1, 2011.
Written Opinion of the International Searching Authority, PCT/US2004/001504, Aug. 20, 2012, 9 pages.
Borges et al. “Optimal Mobile Robot Pose Estimation Using Geometrical Maps”, IEEE Transactions on Robotics and Automation, vol. 18, No. 1, pp. 87-94, Feb. 2002.
Braunstingl et al. “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995.
Bulusu, et al. “Self Configuring Localization Systems: Design and Experimental Evaluation”, ACM Transactions on Embedded Computing Systems, vol. 3, No. 1, pp. 24-60, 2003.
Caccia, et al. “Bottom-Following for Remotely Operated Vehicles”, 5th IFAC Conference, Aalborg, Denmark, pp. 245-250, Aug. 1, 2000.
Chae, et al. “StarLITE: A new artificial landmark for the navigation of mobile robots”, http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005.
Chamberlin et al. “Team 1: Robot Locator Beacon System” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 17, 2006.
Champy “Physical management of IT assets in Data Centers using RFID technologies”, RFID 2005 University, Oct. 12-14, 2005.
Chiri “Joystick Control for TinyOS Robot”, http://eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf, 12 pages, Aug. 8, 2002.
Christensen et al. “Theoretical Methods for Planning and Control in Mobile Robotics” 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 21-27, 1997.
Clerentin, et al. “A localization method based on two omnidirectional perception systems cooperation”, Proc. of IEEE International Conference on Robotics & Automation, San Francisco, CA, vol. 2, pp. 1219-1224, Apr. 2000.
Corke “High Performance Visual Servoing for robots end-point control”, SPIE vol. 2056, Intelligent Robots and Computer Vision, 1993.
Cozman et al. “Robot Localization using a Computer Vision Sextant”, IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995.
D'Orazio, et al. “Model based Vision System for mobile robot position estimation”, SPIE vol. 2058 Mobile Robots VIII, pp. 38-49, 1992.
De Bakker, et al. “Smart PSD—array for sheet of light range imaging”, Proc. of SPIE vol. 3965, pp. 1-12, May 15, 2000.
Desaulniers, et al. “An Efficient Algorithm to Find a Shortest Path for a Car-Like Robot”, IEEE Transactions on Robotics and Automation, vol. 11, No. 6, pp. 819-828, Dec. 1995.
Dorfmüller-Ulhaas “Optical Tracking From User Motion to 3D Interaction”, http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002.
Dorsch, et al. “Laser Triangulation: Fundamental uncertainty in distance measurement”, Applied Optics, vol. 33, No. 7, pp. 1306-1314, Mar. 1, 1994.
Dudek, et al. “Localizing A Robot with Minimum Travel” Proceedings of the sixth annual ACM-SIAM symposium on Discrete algorithms, vol. 27 No. 2 pp. 583-604, Apr. 1998.
Dulimarta, et al. “Mobile Robot Localization in Indoor Environment”, Pattern Recognition, vol. 30, No. 1, pp. 99-111, 1997.
EBay “Roomba Timer -> Timed Cleaning—Floorvac Robotic Vacuum”, cgi.ebay.com/ws/eBayISAPI.dll?viewitem&category=43526&item=4375198387&rd=1, 5 pages, Apr. 20, 2005.
Electrolux “Welcome to the Electrolux trilobite” www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F, 2 pages, Mar. 18, 2005.
Eren, et al. “Accuracy in position estimation of mobile robots based on coded infrared signal transmission”, Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995. IMTC/95, pp. 548-551, 1995.
Eren, et al. “Operation of Mobile Robots in a Structured Infrared Environment”, Proceedings. ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada, vol. 1, pp. 20-25, May 19-21, 1997.
Barker, “Navigation by the Stars—Ben Barker 4th Year Project”, PowerPoint, pp. 1-20.
Becker, et al. “Reliable Navigation Using Landmarks” IEEE International Conference on Robotics and Automation, 0-7803-1965-6, pp. 401-406, 1995.
Benayad-Cherif, et al., “Mobile Robot Navigation Sensors” SPIE vol. 1831 Mobile Robots, VII, pp. 378-387, 1992.
Facchinetti, Claudio et al. “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation”, ICARCV '94, vol. 3, pp. 1694-1698, 1994.
Betke, et al., “Mobile Robot localization using Landmarks” Proceedings of the IEEE/RSJ/GI International Conference on Intelligent Robots and Systems '94 “Advanced Robotic Systems and the Real World” (IROS '94), vol.
Facchinetti, Claudio et al. “Self-Positioning Robot Navigation Using Ceiling Images Sequences”, ACV '95, 5 pages, Dec. 5-8, 1995.
Fairfield, Nathaniel et al. “Mobile Robot Localization with Sparse Landmarks”, SPIE vol. 4573 pp. 148-155, 2002.
Favre-Bulle, Bernard “Efficient tracking of 3D—Robot Position by Dynamic Triangulation”, IEEE Instrumentation and Measurement Technology Conference IMTC 98 Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 18-21, 1998.
Fayman “Exploiting Process Integration and Composition in the context of Active Vision”, IEEE Transactions on Systems, Man, and Cybernetics—Part C: Applications and Reviews, vol. 29, No. 1, pp. 73-86, Feb. 1999.
Florbot GE Plastics Image (1989-1990).
Franz, et al. “Biomimetic robot navigation”, Robotics and Autonomous Systems, vol. 30, pp. 133-153, 2000.
Friendly Robotics “Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner”, www.friendlyrobotics.com/vac.htm, 5 pages, Apr. 20, 2005.
Fuentes, et al. “Mobile Robotics 1994”, University of Rochester, Computer Science Department, TR 588, 44 pages, Dec. 7, 1994.
Bison, P et al., “Using a structured beacon for cooperative position estimation” Robotics and Autonomous Systems vol. 29, No. 1, pp. 33-40, Oct. 1999.
Fukuda, et al. “Navigation System based on Ceiling Landmark Recognition for Autonomous mobile robot”, 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems 95, ‘Human Robot Interaction and Cooperative Robots’, Pittsburgh, PA, pp. 1466-1471, Aug. 5-9, 1995.
Gionis “A hand-held optical surface scanner for environmental Modeling and Virtual Reality”, Virtual Reality World, 16 pages, 1996.
Goncalves et al. “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005.
Gregg et al. “Autonomous Lawn Care Applications”, 2006 Florida Conference on Recent Advances in Robotics, FCRAR 2006. pp. 1-5, May 25-26, 2006.
Hamamatsu “Si PIN Diode S5980, S5981, S5870—Multi-element photodiodes for surface mounting”, Hamamatsu Photonics, 2 pages, Apr. 2004.
Hammacher Schlemmer “Electrolux Trilobite Robotic Vacuum” www.hammacher.com/publish/71579.asp?promo=xsells, 3 pages, Mar. 18, 2005.
Haralick et al. “Pose Estimation from Corresponding Point Data”, IEEE Transactions on Systems, Man, and Cybernetics, vol. 19, No. 6, pp. 1426-1446, Nov. 1989.
Hausler “About the Scaling Behaviour of Optical Range Sensors”, Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 15-17, 1997.
Blaasvaer, et al. “AMOR—An Autonomous Mobile Robot Navigation System”, Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994.
Hoag, et al. “Navigation and Guidance in interstellar space”, ACTA Astronautica vol. 2, pp. 513-533, Feb. 14, 1975.
Huntsberger et al. “Campout: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration”, IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, vol. 33, No. 5, pp. 550-559, Sep. 2003.
Iirobotics.com “Samsung Unveils Its Multifunction Robot Vacuum”, www.iirobotics.com/webpages/hotstuff.php?ubre=111, 3 pages, Mar. 18, 2005.
Related Publications (1)
Number Date Country
20080091305 A1 Apr 2008 US
Provisional Applications (1)
Number Date Country
60741442 Dec 2005 US
Continuations (1)
Number Date Country
Parent 11633885 Dec 2006 US
Child 11758289 US