Celestial navigation system for an autonomous vehicle

Information

  • Patent Grant
  • Patent Number
    8,972,052
  • Date Filed
    Tuesday, November 3, 2009
  • Date Issued
    Tuesday, March 3, 2015
Abstract
A navigation control system for an autonomous vehicle comprises a transmitter and an autonomous vehicle. The transmitter comprises an emitter for emitting at least one signal, a power source for powering the emitter, a device for capturing wireless energy to charge the power source, and a printed circuit board for converting the captured wireless energy to a form for charging the power source. The autonomous vehicle operates within a working area and comprises a receiver for detecting the at least one signal emitted by the emitter, and a processor for determining a relative location of the autonomous vehicle within the working area based on the signal emitted by the emitter.
Description
BACKGROUND

Autonomous vehicles including robotic devices are becoming more prevalent today and are used to perform tasks traditionally considered mundane, time-consuming, or dangerous. As programming technology increases, so does the demand for robotic devices that can navigate around a complex environment or working space with little or no assistance from a human operator.


Autonomous vehicles and associated controls, navigation systems, and other related systems are being developed. For example, U.S. Pat. No. 6,594,844 discloses a Robot Obstacle Detection System, the disclosure of which is hereby incorporated by reference in its entirety. Additional robot control and navigation systems, and other related systems, are disclosed in PCT Published Patent Application No. WO 2004/025947, and in U.S. Pat. Nos. 6,809,490, 6,690,134, 6,781,338, 7,024,278, 6,883,201, and 7,332,890, the disclosures of which are hereby incorporated by reference in their entireties.


Many autonomous vehicles navigate a working space by moving randomly until an obstacle is encountered. Generally, these types of vehicles have on-board obstacle detectors, such as bump sensors or similar devices, which register contact with an obstacle. Once contact is made, command routines can direct the autonomous vehicle to move in a direction away from the obstacle. These types of systems, which are useful for obstacle avoidance, are limited in their ability to allow an autonomous vehicle to track its location within a room or other working environment. Other systems, often used in conjunction with bump sensors as described above, use an infrared or other detector to sense the presence of nearby walls, obstacles, or other objects, and either follow the obstacle or direct the vehicle away from it. These systems, however, are also limited in their ability to allow an autonomous vehicle to navigate effectively in a complex environment, as they only allow the vehicle to recognize when objects are in its immediate vicinity.


In more advanced navigation systems, an autonomous vehicle comprises an infrared or other type of transmitter, which directs a series of infrared patterns in horizontal directions around the autonomous vehicle. These patterns can be detected by a stationary receiver placed at or near a boundary of the working space, for example on a wall. A microprocessor can use the information from signals generated by the receiver to calculate where in the working space the autonomous vehicle is located at all times. Using such systems, the vehicle can navigate around an entire area. These systems, however, are best employed in working spaces where few objects are present that may interfere with the dispersed patterns of infrared signals.


Limitations of the above types of navigation systems are, at present, a hurdle to creating a highly independent autonomous vehicle that can navigate in a complex environment.


SUMMARY

The present teachings provide a navigation control system for an autonomous vehicle. The system comprises a transmitter and an autonomous vehicle. The transmitter comprises an emitter for emitting at least one signal, a power source for powering the emitter, a device for capturing wireless energy to charge the power source, and a printed circuit board for converting the captured wireless energy to a form for charging the power source. The autonomous vehicle operates within a working area and comprises a receiver for detecting the at least one signal emitted by the emitter, and a processor for determining a relative location of the autonomous vehicle within the working area based on the signal emitted by the emitter.


The present teachings also provide a transmitter for use in a navigation control system for an autonomous vehicle. The transmitter comprises an emitter for emitting at least one signal, a power source for powering the emitter, a device for capturing wireless energy to charge the power source, and a printed circuit board for converting the captured wireless energy to a form for charging the power source.


The present teachings further provide a method for controlling navigation of an autonomous vehicle within one or more work areas. The method comprises emitting one or more signals from a transmitter, receiving the one or more signals on the autonomous vehicle, powering the transmitter with a power source, charging the power source wirelessly, localizing the autonomous vehicle with respect to the transmitter, and navigating the autonomous vehicle within the one or more work areas.


Additional objects and advantages of the present teachings will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the teachings. The objects and advantages of the present teachings will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims.


It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the present teachings, as claimed.


The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate several embodiments of the present teachings and together with the description, serve to explain the principles of the present teachings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic view of a navigation system for an autonomous vehicle in accordance with an exemplary embodiment of the present teachings.



FIG. 2 is a schematic view of a navigation system for an autonomous vehicle in accordance with another exemplary embodiment of the present teachings.



FIG. 3A is a side view of a stationary emitter in accordance with an exemplary embodiment of the present teachings.



FIG. 3B is a side view of a stationary emitter in accordance with another exemplary embodiment of the present teachings.



FIG. 4A is a side view of an infrared receiver for an autonomous vehicle in accordance with an exemplary embodiment of the present teachings.



FIG. 4B is a top view of the infrared receiver of FIG. 4A.



FIG. 4C is a side view of an infrared receiver for an autonomous vehicle in accordance with another exemplary embodiment of the present teachings.



FIG. 5A illustrates a control system for an infrared receiver for an autonomous vehicle in accordance with an exemplary embodiment of the present teachings.



FIG. 5B is a flowchart of a signal detection and localization program in accordance with an exemplary embodiment of the present teachings.



FIG. 6 is a top view of a navigation system for an autonomous vehicle in accordance with another exemplary embodiment of the present teachings.



FIGS. 7-14 are schematic circuit diagrams of infrared receivers and transmitters for a navigation system in accordance with an exemplary embodiment of the present teachings.



FIGS. 15A-15C illustrate side, bottom, and end views, respectively, of an exemplary embodiment of a transmitter in accordance with the present teachings.



FIGS. 16A-16C illustrate side, bottom, and end views, respectively, of another exemplary embodiment of a transmitter in accordance with the present teachings.



FIG. 17 illustrates the transmitter of FIGS. 15A-15C used in a doorway.



FIG. 18 also illustrates the transmitter of FIGS. 15A-15C used in a doorway.



FIGS. 19A-19C illustrate exemplary embodiments of setup screens on an exemplary remote control in accordance with the present teachings.



FIGS. 20A-20C illustrate exemplary embodiments of schedule screens on an exemplary remote control in accordance with the present teachings.



FIGS. 21A-21C illustrate exemplary embodiments of mode screens on an exemplary remote control in accordance with the present teachings.



FIG. 22 illustrates an exemplary embodiment of a status screen on an exemplary remote control in accordance with the present teachings.



FIG. 23 schematically illustrates an embodiment of a system in accordance with the present teachings.





DESCRIPTION OF THE PRESENT TEACHINGS

Reference will now be made in detail to embodiments of the present teachings, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.


In accordance with an exemplary implementation of the present teachings, FIG. 1 is a schematic view of a navigation system 10 for an autonomous vehicle such as a robotic cleaning device 12. The components of the system 10 include, in this embodiment, a transmitter 20, a charging or base station 22, and an autonomous vehicle 12 that operates in a room or other similar working area 14. The working area 14 can be a floor of a room, bounded at least in part by walls 16. Borders of a ceiling 18 intersect the walls 16 and are remote from the working area 14. The depicted transmitter 20 includes two emitters 24A, 24B. In this exemplary embodiment, the base station 22 includes an emitter 26 as well. In various embodiments, any combination or quantity of emitters may be used on the base station 22, or transmitter 20, or both. The autonomous vehicle 12 can include an on-board microprocessor, power and drive components, task-specific components (dirt sensors, vacuums, brushes, etc.), and at least one receiver, such as an infrared receiver 28. The vehicle 12 may also include certain buttons, switches, etc. for programming the robot, or such instructions may be directed by a remote control (see FIG. 18) or a personal computer (not shown). Depending on the application, certain components may be removed from the disclosed system 10, or other components may be added.


For simplicity, this disclosure will describe vacuuming as a demonstrative task of the depicted robotic cleaning device 12. It will be apparent, though, that the navigation system disclosed herein has wide applications across a variety of autonomous systems. For example, an autonomous vehicle may be used for floor waxing and polishing, floor scrubbing, ice resurfacing, sweeping and vacuuming, unfinished floor sanding, stain/paint application, ice melting and snow removal, grass cutting, etc. Any number of task-specific components may be required for such duties, and may each be incorporated into the autonomous vehicle, as necessary.


The transmitter 20 directs at least two infrared signals 22a, 24a from emitters 24A and 24B to a surface remote from the working area 14 upon which the autonomous vehicle 12 operates. The depicted embodiment directs the infrared signals 22a, 24a to the ceiling 18, but it may also direct the signals 22a, 24a to a portion of a wall 16 or to both the walls 16 and ceiling 18. The signals 22a, 24a can be directed to a variety of points on the remote surface, but directing the signals as high as possible above the working area 14 can allow the signals 22a, 24a to be more easily detected by the autonomous vehicle 12, because the field of view of the autonomous vehicle's receiver 28 is less likely to be blocked by an obstacle (such as, for example, a high-backed chair or tall plant). In this disclosure, the regions of contact 22b, 24b of the signals 22a, 24a on the remote surface will be referred to as “points,” regardless of the size of the intersection. For example, by using a collimator in conjunction with the emitters (described below), the points of intersection 22b, 24b of the signals 22a, 24a can be a finite area with the signal strongest at approximately central points.


In certain embodiments of the transmitter 20, the signals 22a, 24a are directed toward a ceiling 18, at two points 22c, 24c, forming a line proximate and parallel to the wall 16 upon which the transmitter 20 is located. Alternatively, and as depicted in FIG. 1, the signals 22a, 24a can be directed away from the wall 16, at an angle of approximately 5° or more, to avoid interference with objects such as pictures secured to or hung from the wall 16. The signals 22a, 24a can be transmitted at a known angle θ therebetween. In an exemplary embodiment, angle θ can equal approximately 30°, but other angles are contemplated by the present teachings. In accordance with certain embodiments, angle θ can be set at the time of manufacture or user-defined based on particular applications or other requirements. By setting the angle θ to a known value, the distance S between the signals 22a, 24a at the point of contact 22c, 24c with ceiling 18 may be determined, provided the heights of the ceiling h1, h2 at the points of contact 22c, 24c are known. When used on a flat ceiling 18, as depicted, h1 equals h2. In the embodiment depicted in FIG. 1, base station 22 emits a signal 26a that can serve as an additional or optional signal for utilization by the autonomous vehicle 12. Signal 26a is directed toward a wall 16, so that the point of contact 26b is high enough to avoid objects that may obstruct the autonomous vehicle's field of view. A central point 26c (or laser point) of the point of contact 26b contacts the wall 16 at height h3.
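The spacing relationship described above can be sketched as follows (an illustrative Python sketch, not part of the patent; it assumes a flat ceiling, so h1 equals h2, and that the two beams are symmetric about vertical):

```python
import math

def ceiling_point_spacing(ceiling_height, transmitter_height, theta_deg):
    """Distance S between the two projected points on a flat ceiling
    (h1 == h2), assuming the beams are symmetric about vertical and
    separated by the known angle theta."""
    rise = ceiling_height - transmitter_height
    return 2.0 * rise * math.tan(math.radians(theta_deg) / 2.0)
```

For example, with a 2.5 m ceiling, a transmitter mounted 0.3 m above the floor, and θ = 30°, the two points land approximately 1.18 m apart.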


As the autonomous vehicle 12 moves within a working area 14, it detects the signals 22a, 24a emitted by the transmitter 20 as energy bouncing or reflecting off of the diffuse ceiling surface 18. In an alternative embodiment, visible points can be used in place of infrared points. A camera onboard the autonomous vehicle can replace the infrared receiver in detecting either infrared or visible points. The autonomous vehicle's microprocessor can convert the signals 22a, 24a sensed by the receiver 28 into bearings from the robot 12 to the signals 22a, 24a. The microprocessor can then calculate representative elevation angles ε1, ε2 and azimuths α1, α2 of the signals to determine the location of the autonomous vehicle 12 within the working area 14. In this embodiment, the azimuths α1, α2 are measured using a “forward” direction of movement M of the autonomous vehicle 12 as a datum, but any suitable datum can be used. By calculating the elevation angle and azimuth from the autonomous vehicle 12 to the two signals 22a, 24a, the autonomous vehicle 12 can locate itself within a working area with improved accuracy.
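The conversion from a detected signal to a bearing can be sketched as a unit vector in the robot frame (a hypothetical sketch; the axis convention used here, x along the direction of movement M, y to the robot's left, and z up, is an assumption, as the patent defines azimuth only relative to M):

```python
import math

def bearing_vector(azimuth_deg, elevation_deg):
    """Unit vector from the robot toward a detected point, in an
    assumed robot frame: x = forward (direction M), y = left, z = up."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (math.cos(el) * math.cos(az),   # component along M
            math.cos(el) * math.sin(az),   # component to the left of M
            math.sin(el))                  # vertical component
```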



FIG. 2 depicts another exemplary embodiment of a navigation system 110 for an autonomous vehicle 112. In the illustrated exemplary embodiment, an autonomous vehicle 112 moves in a working area having a floor 114. A transmitter 120 can be mounted at a top frame of a doorway 132 between two rooms 136, 138. Similar to the embodiment depicted in FIG. 1, the transmitter 120 is installed at a known distance h4 above the floor 114. In alternative embodiments, the transmitter 120 can be installed at the height of the ceiling 118. The transmitter 120 can be recessed within the door frame 132 or ceiling 118 to reduce its profile and limit its impact on architectural aesthetics of a room. Additionally, the transmitter 120 can be disguised to resemble a cover plate for a sprinkler head, speaker, or other device.


The transmitter 120 emits two signals 122a, 124a (depicted graphically by a plurality of arrows) into the two rooms 136, 138, respectively. The signals 122a, 124a can be configured to not overlap each other, thus providing a distinct signal on each side of the door centerline 130. In other embodiments, an overlap of the signals 122a, 124a can be desirable. The autonomous vehicle 112 includes a receiver 128 having a field of vision 134. The emitted signals 122a, 124a can be detected by the receiver 128 when the autonomous vehicle's field of vision 134 intersects the signals 122a, 124a. Similar to the embodiment of FIG. 1, the autonomous vehicle can calculate the azimuth and elevation to the transmitter 120 to determine its relative location. Similar to the embodiment described above, by detecting only one signal, the autonomous vehicle 112 can calculate a bearing to the transmitter 120. Accordingly, the transmitter 120 functions as a beacon for the autonomous vehicle 112 to follow and, if the signal is coded, the autonomous vehicle 112 can determine which room of a number of rooms it is located in, based on the coded signal. The autonomous vehicle 112 is thus able to determine its relative location, on a room-by-room basis, as opposed to determining its location within a room. Exemplary embodiments of a doorway-based transmitter are described in more detail with reference to FIGS. 15-18.



FIG. 3A shows a transmitter 20 in accordance with certain embodiments of the present teachings. The depicted transmitter 20 receives power from a wall outlet 40 for convenience and unobtrusiveness, but one skilled in the art will appreciate that transmitters can be powered by means other than a wall outlet. For example, the transmitter can be placed anywhere in a room, provided it has an available power source. For example, battery-powered transmitters are particularly versatile, because they can be located remote from a wall outlet. Such battery-operated transmitters can be unobtrusively located above window or door frames, or on top of tall furniture such as dressers or bookshelves.


In accordance with various embodiments of the present teachings, the transmitter can include a visible signal option (not shown), aligned with the emitted signals, allowing a user to direct the signals to particular locations. In accordance with the present teachings, more than one transmitter may be used. Such a system could include communication capability between the various transmitters, for example to ensure that only one signal or a subset of signals is emitted at any given time.


A battery-powered transmitter located above a window or door frame can not only permit the autonomous vehicle to localize within a map, coordinate system, or cell grid relative to the transmitter, but can also localize the transmitter within the same map, coordinate system, or cell grid, thereby localizing the window or door frame. Localization of an autonomous vehicle within a working environment is described in detail in U.S. Patent Publication No. 2008/0294288, filed Nov. 27, 2008, the entire disclosure of which is incorporated herein by reference. In the case of a door frame, the door is ordinarily the passage by which the autonomous vehicle navigates from room to room. The transmitter illustrated in FIG. 3A, which can project points upward onto a wall or ceiling, can be battery operated. A transmitter as illustrated in FIGS. 3B-3D can be placed above or at the top of a door (e.g., more than six feet high, where household power may be unavailable) and can also benefit from battery operation (see below).


The exemplary embodiment of a transmitter 20 illustrated in FIG. 3A includes a housing 42 constructed of, for example, a plastic or like material. In this figure, the transmitter 20 is shown cut-away above the line L so that the emitters can be seen. The transmitter 20 can include a power receptacle 44, allowing the outlet used by the transmitter 20 to remain available for other uses. The transmitter 20 includes two emitters 24A, 24B, set within the housing 42. Alternatively, the emitters 24A, 24B can be flush with or extend beyond the housing 42. Setting the emitters 24A, 24B within the housing 42 allows the signals 22a, 24a to be directed by utilizing collimators 22e, 24e. The collimators 22e, 24e can be formed within the housing 42 or can be discrete components within the housing 42. Alternatively, the collimators 22e, 24e can be secured to the outside of the housing 42. In alternative embodiments, lenses 22d, 24d can be included, with or without collimators 22e, 24e, to focus and direct the signals 22a, 24a. These basic manufacturing considerations can also be adapted for emitters located on charging or base stations. One or more emitters on a base station can serve as an additional point of navigation for the autonomous vehicle within the room, or may simply aid the autonomous vehicle in locating the base station.



FIG. 3B depicts an embodiment of a transmitter 120 for use, for example, with the navigation system 110 depicted in FIG. 2. The transmitter 120 is secured to the underside of an upper cross member of the door frame 132, but can also be recessed therein or secured to or recessed in a ceiling 118. The transmitter 120 includes two emitters 122, 124. Other embodiments of the transmitter 120 can include more than two emitters or a single emitter. By utilizing two emitters, the transmitter 120 can direct signals into two different rooms, on either side of the centerline 130 of the door frame 132. This can allow an autonomous vehicle to distinguish which room it is located in.


In accordance with various embodiments of the present teachings, more than two emitters can be utilized with collimators 22e, 24e, 122e, 124e, to distinguish different areas within a room. Such a configuration allows the autonomous vehicle to sense its relative location within a room and adjust its cleaning behavior accordingly. For example, a signal could mark an area of the room that an autonomous vehicle would likely get stuck in. The signal could allow an autonomous vehicle to recognize the area and accordingly not enter it, even though it might otherwise be able to do so unimpeded. Alternatively, or in addition, different signals could mark areas that require different cleaning behaviors (e.g., due to carpeting or wood floors, high traffic areas, etc.).


Turning back to FIG. 3B, the emitters 122, 124 can be installed flush with or extend beyond the housing 142. Setting the emitters 122, 124 within the housing 142 allows the signals 122a, 124a to be directed by utilizing collimators 122e, 124e. The collimators 122e, 124e allow the signals 122a, 124a to be directed to two sides of a centerline 130 of a doorframe 132, without any signal overlap, if so desired. The collimators 122e, 124e can be formed within the housing 142 or can be discrete components within the housing 142. Alternatively, the collimators 122e, 124e can be secured to the outside of the housing 142. In alternative embodiments, lenses 122d, 124d may be included, with or without collimators 122e, 124e, to focus and direct the signals 122a, 124a.


In various embodiments of the present teachings, each signal (regardless of the emitter's location or the number of signals) can be modulated at 10 kHz and coded with an 8-bit code serving as a unique signal identifier, preventing the autonomous vehicle from confusing one signal or point with another. Accordingly, more than two uniquely encoded signals can be employed to increase the accuracy of the autonomous vehicle's calculations regarding its location within a working area. As noted above, using only one emitter allows an autonomous vehicle to take a heading based on that signal. Using two or more signals can allow the autonomous vehicle to continue navigating if fewer than all of the signals are detected (either due to a failure of a signal transmission or the autonomous vehicle moving to a location where fewer than all of the signals are visible).


In certain embodiments, the transmitter can pulse the coded signals as follows. After an initial synchronization pulse, a first signal at 10 kHz is emitted for 100 ms. This can provide a sufficient time for the autonomous vehicle's receiver and processor to calculate azimuth and elevation angles, as discussed in detail below. So that the autonomous vehicle can determine which signal is being received, the transmitter can pulse a series of five bits, each for 10 ms. The five bits include two start bits, for example a zero and a one, followed by a unique three bit identifier to identify that particular signal or point. After a 100 ms delay, the transmitter repeats the sequence for the second signal or point. By changing the modulation frequency and/or the identifier, the second signal or point can be uniquely distinguished from the first. Any number of unique signals can be transmitted and identified in this manner. After the series of signals are transmitted, the transmitter can wait a substantially longer period of time, for example on the order of one to two seconds, before repeating the transmitting sequence, starting again with the first signal. The length of time for each transmission is merely exemplary, and may be varied based on a particular application, device, etc. As stated above, the signals can be modulated at the same or different frequencies.
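The pulse sequence described above can be modeled as a simple schedule generator (a hypothetical Python sketch, not from the patent; the duration of the initial synchronization pulse is unspecified in the text and is omitted here, and durations are in milliseconds):

```python
def pulse_schedule(point_ids, carrier_hz=10_000):
    """Build an approximate transmit schedule: for each point, a 100 ms
    carrier burst, then two start bits (0, 1) followed by a 3-bit point
    identifier at 10 ms per bit, then a 100 ms pause before the next
    point's sequence."""
    schedule = []
    for pid in point_ids:
        schedule.append(("carrier", carrier_hz, 100))
        # Two start bits, then the identifier from MSB to LSB.
        bits = [0, 1] + [(pid >> i) & 1 for i in (2, 1, 0)]
        for b in bits:
            schedule.append(("bit", b, 10))
        schedule.append(("pause", None, 100))
    return schedule
```

After the final entry, the transmitter would idle for the one- to two-second gap described above before repeating the schedule.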



FIG. 4A depicts a side view of an exemplary receiver 228 that is surface mounted on a housing 212 of an autonomous vehicle. FIG. 4B is a top view of the same receiver 228. The receiver 228 can include an outer shell or housing 244 comprising a generally translucent or transparent, high-impact plastic or like material. Four photodiodes 246a, 246b, 246c, and 246d can be installed in an orientation in the housing 244 generally corresponding to four adjacent sides of a cube. Accordingly, each photodiode can be generally perpendicular to the photodiodes on either side of it and parallel to the photodiode opposite it. In certain embodiments, a fifth photodiode 246e can be located generally above the plane of orientation of photodiodes 246a-246d. At least one photodiode, in this case photodiode 246a, is oriented toward a direction of forward movement M of the robot. The photodiodes can be connected via control wiring and other components to the autonomous vehicle's microprocessor and related systems. Installing a receiver 228 on top of the housing 212 can provide the autonomous vehicle with a wide field of view. As depicted, the field of view δ1 for a horizontally-oriented photodiode 246e is extremely wide. Depending on the sensitivity of the photodiode 246e, the thickness or translucence of the plastic, and other factors, the field of view δ1 may approach or exceed 180°. Similarly, due to the orientation of photodiodes 246a-246d, their field of view δ2 approaches near vertical in an upward direction from the autonomous vehicle's housing 212 and is limited below only by the autonomous vehicle's housing 212. There can be an overlap between the fields of view δ1 and δ2 in the longitudinal plane, as depicted in FIG. 4B.


As illustrated in FIG. 4A, there can be overlap between the fields of view δ1 and δ2, allowing the autonomous vehicle to detect signals in its operating area. The overlap creates a total field of view for the receiver that approaches the entire volume of the room above the robot housing. Accordingly, this embodiment of the receiver 228 is well-suited to the exemplary embodiment of the navigation system depicted and described in FIG. 2, wherein a signal is projected into an entire room. Of course, this receiver 228 could also be used with the system depicted in FIG. 1. Although installing the receiver closer to or above a top surface of the autonomous vehicle can provide for a wider range of view, this configuration increases a height of the autonomous vehicle slightly and can limit autonomous vehicle travel beneath certain obstacles such as couches, low tables, or the like.



FIG. 4C depicts an exemplary embodiment of the receiver 328 installed below a surface of the autonomous vehicle housing 312. The photodiodes 346a-346e (as a group referred to as 346) can be installed in a void 350 or other cavity below the surface of the autonomous vehicle housing 312. A translucent or transparent plastic cover 312a can be fitted over the photodiodes 346. The cover 312a can be secured to the housing 312, for example, with screws, press-fit connections, or other connectors. Alternatively, the cover 312a can be set in place without connectors, allowing easier access to the photodiodes 346 for service or replacement. This lower profile version reduces or eliminates the risk associated with a surface mounted receiver getting stuck below obstacles (as described above).


The construction of the receiver 328 can be similar to that of FIG. 4A. Four of the photodiodes 346a-346d can be installed orthogonal to each other, facing a predefined direction on the autonomous vehicle (e.g., front, back, right, and left). The fifth photodiode 346e can be installed orthogonal to the other four photodiodes, facing directly up from a top of the autonomous vehicle. Because the photodiodes 346 are set within the housing 312, the receiver's overall field of view δ3 can be limited to a certain degree. In this embodiment, δ3 equals approximately 120°. The field of view δ3 can be wider or narrower depending on the depth of installation below the surface of the autonomous vehicle housing 312. Alternatively, the field of view δ3 can be modified by utilizing a cover 312a having particular effects on signal transmission, such as a fish-eye lens or the like.
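The dependence of the recessed receiver's field of view on installation depth can be approximated with a thin-aperture model (a hypothetical sketch; the patent gives only the ~120° figure for its particular geometry, and real photodiode response and cover optics would modify this):

```python
import math

def recessed_fov_deg(aperture_width, recess_depth):
    """Rough field of view for a photodiode set a given depth below an
    aperture of a given width: the half-angle subtended by the aperture
    edge at the photodiode, doubled."""
    return 2.0 * math.degrees(math.atan2(aperture_width / 2.0, recess_depth))
```

Deeper installation narrows the field of view; a wider aperture or a shallower recess widens it, consistent with the trade-off described above.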



FIG. 5A illustrates an exemplary embodiment of a control schematic 560 for a receiver 528. The receiver 528 can include a number of independent photodiodes 546a-546e (as a group referred to as 546), pre-amplified and multiplexed into a single microprocessor 562. As described above, four of the photodiodes 546a-546d can be installed orthogonal to each other, facing a predefined direction on the autonomous vehicle (e.g., front, back, right, and left). A fifth photodiode 546e can be installed orthogonal to the other four, facing directly up from the top of the robot. Once a reflected signal is received by a photodiode 546, the receiver 528 determines the frequency of modulation of the signal, the identity sequence, if any, and the envelope of received energy (i.e., the demodulation of energy). These values can be sent to the microprocessor 562, which can calculate the location of the autonomous vehicle relative to the signals and the identities of the signals. Additionally, if only a single point is detected by the receiver 528 (if, for example, the robot's view of the second signal is obstructed), the autonomous vehicle can use this point as a heading. By following this heading, the autonomous vehicle can move within the work area until a second point is detected.


In operation, a receiver (e.g., an infrared receiver) can first measure the “noise floor” of the autonomous vehicle's environment, comprising the amount of energy (e.g., infrared energy) present in the autonomous vehicle's environment, which it sets as the threshold value. This value can represent an average of the values for each photodiode. Any subsequent measurement above this threshold value can trigger an event (e.g., a calculation of point azimuth and elevation). The receiver can then measure the modulation frequency again, searching for an expected increase at 10 kHz (i.e., the frequency of the initial synchronization signal transmitted by the transmitter). If a 10 kHz frequency increase is detected, the autonomous vehicle recognizes the increase as an emitted navigation signal. The autonomous vehicle can then measure the amplitude of the reflected point on all five photodiodes to determine an average value. This value can then be compared to a list of signal frequencies to determine which of the signals has been detected. Alternatively, any detected identity sequence associated with the signal can be compared to a list of transmitter codes or signal IDs stored in a lookup table in the autonomous vehicle's processor memory.
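The noise-floor and event-trigger steps above can be sketched as follows (an illustrative Python sketch; the averaging and comparison are as described in the text, but function names and the reading representation are hypothetical):

```python
def noise_floor(ambient_readings):
    """Threshold value: the average of the ambient energy measured on
    each photodiode before any navigation signal is expected."""
    return sum(ambient_readings) / len(ambient_readings)

def is_event(photodiode_readings, threshold):
    """Any subsequent reading above the ambient threshold triggers the
    follow-on processing (modulation-frequency check, then azimuth and
    elevation calculation)."""
    return any(r > threshold for r in photodiode_readings)
```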


The on-board microprocessor can use the amplitude value to determine the azimuth and elevation of the received signals, which it can then use to determine its location within a working area. To determine the azimuth, the microprocessor enters the values of the two strongest readings from the four side photodiodes into an algorithm. The algorithm takes the ratio of these two readings to determine the azimuth angle. For example, if the two strongest readings from two photodiodes are equal, the algorithm recognizes that the point is located at an azimuth angle that is directly between the two photodiodes (i.e., at 45°). In a similar algorithm, the amplitude value measured from the strongest side photodiode and the amplitude value measured from the top-facing photodiode are used to determine the elevation of the signal. These values can be stored in the autonomous vehicle's memory for future reference.


After the receiver has detected at least two points and determined the azimuth and elevation of each point, it determines its location within the working space. A triangulation algorithm based on the known ceiling height and the azimuth and elevation of the two detected points allows the processor to determine where in the working space the autonomous vehicle is located. Over time, the values of elevation and azimuth between each coded point and specific locations of the autonomous vehicle within the workspace can be stored in the autonomous vehicle's memory, creating a map of the environment in which the autonomous vehicle operates.
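A minimal sketch of such a triangulation follows, assuming the ceiling positions of the two coded points are known from the stored map and that azimuths are expressed in the room frame (function names are illustrative, not from the specification):

```python
import math

def point_offset(azimuth_deg, elevation_deg, ceiling_height):
    """Horizontal offset from the robot to the floor projection of a
    detected ceiling point, given its azimuth and elevation angles."""
    d = ceiling_height / math.tan(math.radians(elevation_deg))
    az = math.radians(azimuth_deg)
    return d * math.cos(az), d * math.sin(az)

def locate(p1_xy, meas1, p2_xy, meas2, ceiling_height):
    """Estimate the robot's (x, y) from two coded points whose ceiling
    positions p1_xy/p2_xy are known; each meas is (azimuth, elevation)
    in degrees. The two independent estimates are averaged."""
    estimates = []
    for (px, py), (az, el) in ((p1_xy, meas1), (p2_xy, meas2)):
        dx, dy = point_offset(az, el, ceiling_height)
        estimates.append((px - dx, py - dy))
    return (sum(e[0] for e in estimates) / 2,
            sum(e[1] for e in estimates) / 2)
```

With a 2 m ceiling, a point seen at 45° elevation lies 2 m away horizontally; combining two such bearings fixes the robot's position in the plane.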


In various embodiments, a navigation system 200 as depicted in FIG. 5B uses an angle-based localization system. Values corresponding to elevation and azimuth are determined by synchronously comparing average amplitudes from the number of detectors arranged on the robot. Of the five detectors, four are arranged in a plane and are angularly spaced by 90° increments. The fifth detector is in the center of the aforementioned four-detector array and is aimed so that it is orthogonal to the plane in which the other detectors lie, directed vertically from the autonomous vehicle. Together, the five-element array can have a full or near-full hemispherical field of view.


In the embodiment depicted in FIG. 5B, all five detectors monitor for amplitude (Step 705) until an amplitude that crosses a preset threshold is detected (Step 710). After the amplitude on any detector crosses the preset detection threshold, the frequency of the signal on the strongest detector is measured and compared against known transmit frequencies (Step 715). If the measured frequency is one of the known transmit frequencies (Step 720), the next step in the detection process can be executed. If the signal is not a known transmit frequency, the detection process can be aborted (Step 725) and the signal detected can be declared to be “out of band.” Once an “in band” frequency is detected, a time-averaged amplitude for each photo detector can be measured, converted to a binary number, and stored for later processing in a microprocessor memory (Step 730). Upon storing the five numerical values (one for each photodiode), the azimuth angle can be determined.
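The detection sequence of Steps 705-730 can be sketched as follows (a non-limiting Python illustration; the frequency set, `measure_frequency`, and `averager` callables are hypothetical stand-ins for the hardware):

```python
KNOWN_FREQUENCIES_HZ = (10_000, 12_500, 15_000)  # example transmit set

def detect_point(channels, measure_frequency, threshold, averager):
    """Steps 705-730: wait for a channel to cross the threshold,
    validate the strongest channel's frequency as "in band," then
    return one time-averaged amplitude per photodiode."""
    if max(channels) <= threshold:          # Step 710: no event yet
        return None
    strongest = channels.index(max(channels))
    freq = measure_frequency(strongest)     # Step 715
    if freq not in KNOWN_FREQUENCIES_HZ:    # Steps 720/725: out of band
        return None
    # Step 730: store one time-averaged value per photodiode
    return [averager(ch) for ch in range(len(channels))]
```

A `None` result corresponds to the aborted, “out of band” branch; a list of five values feeds the azimuth calculation that follows.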


Of the four detectors that reside in a single plane, the values of the two strongest signals detected are used to form a ratio to determine the azimuth angle (Step 735). The ratio of second-strongest signal over the strongest signal is either compared to a look-up table or inserted into a mathematical equation to determine an azimuth angle output. Both the look-up table and the equation represent the overlap of the received sensitivity patterns of two orthogonal detectors with known sensor responses. In this embodiment, the photo detector output is modeled as a fourth-order Gaussian response to angle off of “boresight,” a term that generally refers to a vector that is orthogonal to the semiconductor die in the detector package.
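The ratio-to-azimuth conversion (Step 735) can be illustrated with the fourth-order Gaussian model described above. The beam-width parameter below is an assumed value for illustration only; an actual table would be derived from the measured sensor response:

```python
import math

SIGMA = 60.0  # assumed beam-width parameter of the detector model

def response(theta_deg):
    """Fourth-order Gaussian sensitivity versus angle off boresight."""
    return math.exp(-((theta_deg / SIGMA) ** 4))

# Look-up table: ratio of the second-strongest to the strongest
# detector for azimuths 0-45 deg between detectors spaced 90 deg apart.
TABLE = [(response(90 - a) / response(a), a) for a in range(0, 46)]

def azimuth_from_ratio(ratio):
    """Return the tabulated azimuth whose ratio is closest."""
    return min(TABLE, key=lambda entry: abs(entry[0] - ratio))[1]
```

Equal readings give a ratio of 1 and hence an azimuth of 45°, matching the example given earlier; smaller ratios place the point closer to the boresight of the strongest detector.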


To calculate elevation, the strongest signal from the azimuth calculation (i.e., the denominator of the ratio) must first be normalized, as if it were on boresight of the respective detector (Step 740). For example, if the azimuth has been determined to be 10° off of boresight from a given detector, that 10° angle is entered into a look-up table or equation that describes the sensor response of any single photo detector. At zero degrees, the output of this look-up table/equation would be 1.00000. As the angle deviates from zero degrees, the output drops to some fraction of 1.00000 (the normalized value at boresight). For example, if a value of 10° is entered into the equation, the output of this operation can be, for example, 0.99000. The denominator of the azimuth ratio can then be divided by this fractional value in order to scale up, or “normalize,” that value to what it would be if the azimuth were actually zero degrees. This normalized value can then be stored in memory and elevation can be determined therefrom.


To calculate elevation, the normalized output from the previous step is used to produce a new ratio with the output from the upward-looking (fifth) detector, so that the numerator is the second-strongest of the two values and the denominator is the strongest of the two values (Step 745). This ratio is then entered into the same look-up table or equation from the step above (used to calculate azimuth), thus outputting an elevation angle.
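Steps 740-745 can be sketched as follows, reusing the fourth-order Gaussian model described above (the beam-width parameter is again an assumed illustration, and the numerical inversion stands in for the look-up table):

```python
import math

SIGMA = 60.0  # assumed beam-width parameter (same model as the azimuth step)

def response(theta_deg):
    """Fourth-order Gaussian sensitivity versus angle off boresight."""
    return math.exp(-((theta_deg / SIGMA) ** 4))

def normalize_to_boresight(strongest_amplitude, azimuth_deg):
    """Step 740: scale the strongest side reading up to the value it
    would have had on boresight (azimuth of zero degrees)."""
    return strongest_amplitude / response(azimuth_deg)

def elevation_from_amplitudes(side_amp, top_amp, azimuth_deg):
    """Step 745: ratio the normalized side value against the upward
    detector, then invert the same two-detector overlap model."""
    side = normalize_to_boresight(side_amp, azimuth_deg)
    weaker, stronger = sorted((side, top_amp))
    ratio = weaker / stronger
    return min(range(0, 46),
               key=lambda a: abs(response(90 - a) / response(a) - ratio))
```

As with azimuth, equal side and top values yield a 45° elevation; the normalization ensures the side reading is not understated merely because the point sits off the side detector's boresight.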


The benefits of this type of navigation system can be numerous. As the autonomous vehicle moves about a working area, measuring the azimuth and elevation of the various points detected, it can create a map of the area, thus determining its location within a given space. With this information, it can fuse data from all of its on-board sensors and improve cleaning or other task efficiency. One way to do this is to create a map indicating where the high-traffic areas in a house or other building are located (as indicated by readings from the dirt sensor, for example). The autonomous vehicle would then clean the areas it identified as high traffic (and therefore, often dirty) each time it passes over that area, whether directed to or not. The autonomous vehicle may also improve its cleaning function by merging the output from the wheel drop, stasis, bumper, and wall-following sensors to roughly mark areas of entrapment, or where large obstacles exist, so that those areas can potentially be avoided in future runs.


In accordance with various embodiments of the present teachings, another method of improving cleaning efficiency involves selectively programming the autonomous vehicle to clean particular areas, as detailed below. For example, a personal computer or remote control may be used to control the autonomous vehicle. Although the autonomous vehicle can operate without operator intervention, an operator can initially set up the autonomous vehicle, or can direct the autonomous vehicle to operate in particular areas or at particular times. For example, by using more than one transmitter in various rooms on one floor of a house, an operator may be able to direct the autonomous vehicle to clean specific rooms in a particular order and/or at a specific time. The operator could select, in a control program field of a computer program for example, the living room, family room, bathroom, and kitchen areas for cleaning. A remote control for use in accordance with the present teachings is described in more detail with respect to FIGS. 19-22.


Once commanded (either immediately or on a predefined schedule), the autonomous vehicle can be signaled to begin its cleaning cycle. The autonomous vehicle undocks from its base/charging station and begins cleaning the closest or first room on the programmed list. It can recognize this room and can differentiate it by the coded group of infrared points (e.g., on a ceiling of the room) or the coded signal emitted in the room. After the first room is cleaned, the autonomous vehicle can, for example, check its level of power and return to its charger for additional charging if needed. In accordance with certain embodiments, in order to return to the charger, the autonomous vehicle can follow the point or points on the ceiling back to the base station. Alternatively, the autonomous vehicle can use a known docking behavior. After charging is complete, the autonomous vehicle can traverse roughly back to the place it left off and resume cleaning. This sequence of events continues until all of the programmed rooms have been cleaned. Alternatively, the selection of particular areas to clean could be, for example, made by remote control or by pressing buttons on a control panel located on the base station. By using a personal computer, however, multiple transmitters could communicate with each other and with the base station via power lines using a known communication technology.
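The commanded cleaning cycle above can be sketched at a high level as follows (the `robot` interface and its method names are hypothetical; resuming mid-room after a recharge is omitted for brevity):

```python
from collections import deque

LOW_BATTERY = 0.2  # assumed recharge threshold (fraction of capacity)

def run_cleaning_cycle(rooms, robot):
    """Clean each programmed room in order, docking to recharge between
    rooms when the battery runs low. `robot` is a hypothetical interface
    with clean_room/battery_level/dock_and_charge methods."""
    queue = deque(rooms)
    while queue:
        room = queue.popleft()
        robot.clean_room(room)          # room identified by its coded signal
        if queue and robot.battery_level() < LOW_BATTERY:
            robot.dock_and_charge()     # follow the ceiling points home
```

Each room in the programmed list is recognized by its coded group of points or coded signal; the power check after each room corresponds to the return-to-charger behavior described above.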


An alternative embodiment of the present teachings is depicted in FIG. 6, wherein an autonomous vehicle uses a number of signals for headings to move from room to room. The autonomous vehicle 612 is moving in a direction M within room A when its power level drops below a predetermined level, requiring its return to a base charging station 622. Upon crossing the predetermined power level, the autonomous vehicle's receiver 628 searches for a signal from a nearby emitter. As the vehicle is located in room A, it detects the signal 622a emitted from transmitter 620a and, using the signal 622a as a heading, moves directly toward that signal 622a.


Alternatively, the autonomous vehicle 612 can emit its own coded pulse, to determine if any transmitters are in the area. This coded pulse could “awaken” sleeping or otherwise dormant transmitters, which would then begin their own emission cycle. Alternatively, the pulse can be an audible or visual signal such as a distinct beep, buzz, or visual strobe. The pulse need not be within the field of view of the transmitter.


The robot 612 will continue to move toward signal 622a until one of several events happens at or near doorway 632a. In a first event, the autonomous vehicle may determine, based on readings from its photodiodes, that it is directly under the transmitter 620a. In a second event, the autonomous vehicle 612 may sense a second signal 624a, which may overlap the first detected signal 622a. This could occur if the configuration of the emitters, collimators, etc., as described in more detail above, provides overlapping signal patterns between signals 622a and 624a. In a third event, autonomous vehicle 612 can sense a signal from an entirely different transmitter, in this case signal 622b from transmitter 620b. Other events are also contemplated, as suitable for a particular application. The occurrence of an event presents the autonomous vehicle 612 with any number of behavioral, functional, or other options. For example, each coded signal may serve as a unique marker for a different working space. Upon detecting the unique marker associated with a particular working space, the autonomous vehicle may alter its cleaning function. Thus, if room A is carpeted but room B is uncarpeted, the autonomous vehicle can adjust its cleaning as it moves from room A to room B. Upon detecting a second signal (in this case, signal 622b) the autonomous vehicle can, in certain embodiments, completely disregard the first signal 622a received when its return to the base station 622 began. Using new signal 622b as a heading, it begins moving toward that signal 622b. The autonomous vehicle 612 can, in certain embodiments, check its battery level at each event, storing that value in its microprocessor. Over time, the autonomous vehicle can thereby create a table of battery levels at each event (and battery level change from event to event), and be able to accurately determine precise battery power remaining at each transmitter location.
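The battery-level bookkeeping described above can be sketched as a simple event log; the per-leg consumption between consecutive transmitter events can then be tabulated (names are illustrative only):

```python
def record_event(event_log, transmitter_id, battery_level):
    """Append the battery reading taken at a signal event; over time the
    deltas between consecutive events approximate the cost of each leg."""
    event_log.append((transmitter_id, battery_level))

def leg_costs(event_log):
    # Battery consumed between successive transmitter events.
    return [(a[0], b[0], a[1] - b[1])
            for a, b in zip(event_log, event_log[1:])]
```

From such a table the vehicle can estimate the battery power remaining at each transmitter location along a route.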


Once the autonomous vehicle is traversing room B (shown in phantom as 612′), it will eventually determine, based on battery level, time, or other factors, to follow the heading provided by signal 622b, and continue its return to its base station 622. The autonomous vehicle 612 can follow the heading until an event occurs at or near doorway 632b. Again, the event can be detecting a strength of signal 622b, indicating that the autonomous vehicle is directly below the transmitter, detecting an overlap signal from 624b, or detecting a new signal 622c. The autonomous vehicle 612 can again perform any of the behaviors described above: check and store its battery level; change cleaning characteristics; etc.


Once in room C, the autonomous vehicle can begin following the heading provided by signal 622c. At or near the doorway 632c to room D, an event may direct the autonomous vehicle to perform any number of behaviors. Alternatively, the autonomous vehicle can move directly to charging station 622, guided by emitted signal 626 or another signal or program.


During its return to the base station, as the autonomous vehicle 612 moves from room A to room B to room C and so on, it detects and stores information about each coded signal that it detects along its route. By storing this information, the autonomous vehicle can create a map, using the coded signals as guideposts, allowing it to return to its starting location in the future. After charging, the autonomous vehicle can return to the room it was working in prior to returning to its base by comparing the detected signals and their strengths to the stored information.
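A minimal sketch of comparing detected signals against the stored map follows; the signature format (room name mapped to signal strengths) is an assumption for illustration:

```python
def best_matching_room(stored_map, observed):
    """Return the room whose stored signal signature is closest to the
    currently observed signals. `stored_map` maps room name ->
    {signal_id: strength}; `observed` has the same shape."""
    def distance(signature):
        keys = set(signature) | set(observed)
        return sum(abs(signature.get(k, 0.0) - observed.get(k, 0.0))
                   for k in keys)
    return min(stored_map, key=lambda room: distance(stored_map[room]))
```

After charging, the vehicle can use such a comparison to identify the room it was working in and navigate back to it using the coded signals as guideposts.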



FIGS. 7-9 depict schematic circuit representations for exemplary embodiments of various components of an infrared signal transmitter, namely an AC-DC converter, a microcontroller and support circuitry, and LED drivers. More specifically, FIG. 7 illustrates an electronic circuit that takes 120 VAC/60 Hz line voltage and converts it to a regulated +5VDC supply. This supply can be used to power the microcontroller and associated circuitry of the transmitter depicted in FIG. 8. In addition to power conversion, this circuit can also provide an isolated digital logic signal to the microcontroller whenever a “zero-crossing” in the AC line input is detected.



FIG. 8 illustrates a transmitter microcontroller and support circuitry (i.e., a clock oscillator and an in-circuit serial programming port). In addition, there is a circuit that allows a user-initiated button press to project visible light from a pair of LEDs, co-located with a pair of IR LEDs, onto a remote surface for the purpose of assisting the user in aiming the infrared signal points.



FIG. 9 illustrates two channels of an IR LED driver. Each driver can control a preset constant current into a single IR LED, which can then emit near-infrared light that can be collimated by an external lens and projected onto the remote surface. Each IR LED can be modulated and pulse-coded independently of the other. This allows the microcontroller in the autonomous vehicle to discern between the different transmitter signals, to determine which detected signal is which.



FIGS. 10-14 depict schematic circuit representations in accordance with certain embodiments of various components of a vehicle-mounted infrared receiver, namely DC-DC linear power converters, a five-channel preamplifier, a multiplexer and programmable tuned amplifier, detectors, and a microcontroller and associated peripherals. More specifically, FIG. 10 depicts two independent linear voltage regulators. One of the regulation circuits can be switched ON-OFF via a microcontroller to conserve battery power during a sleep mode.



FIG. 11 depicts five independent preamplifiers that can convert respective photodiode output currents into voltages of much larger magnitudes. Each preamplifier is built using an operational amplifier in a transimpedance topology. This allows the preamplifiers to be configured with low noise. Also, there is an active feedback circuit that is used to null large photodiode current offsets caused by exposure of the circuit to sunlight and other strong low-frequency light sources.



FIG. 12 illustrates an exemplary multiplexer and programmable tuned amplifier for the receiver. This circuitry can be segregated into three functional blocks. The first block is a multiplexer that receives signals from the five photodiode preamplifiers and outputs one of the signals to a programmable attenuator, as commanded by the receiver's microcontroller. The second block is a programmable attenuator that can be used to reduce the overall receiver gain, to deal with the large dynamic range of received signals. As depicted herein, there are two digital inputs from the microcontroller, which permit four discrete gain levels to be selected. The third block is a tuned, band-pass amplifier that can provide the bulk of the voltage amplification to signals that fall within the circuit's pass band.



FIG. 13 depicts an exemplary embodiment of two detectors that can be used in the receiver. The first detector is a rectifying, envelope detector with integral voltage gain, and can be used to strip modulation frequency and provide a signal envelope to the microcontroller's analog-to-digital converter. The signal envelope can be used by the microcontroller to determine the magnitude of the received signal. The second detector is a voltage comparator, which can be used to “square up” received pulses and convert them to a CMOS logic level, thereby allowing the microcontroller to extract digital data from the received signals.


Lastly, FIG. 14 illustrates the microcontroller and its peripherals. The peripherals can include a clock oscillator, ICSP port, voltage supervisor/reset generator, and RS-232 level serial port for interfacing with a host personal computer or a main robot processor.


Accordingly, the navigation system can be operationally robust and adapted to compensate for variances in infrared energy. For example, if the autonomous vehicle is operating in an environment with high base infrared readings (e.g., a space with a large number of fluorescent lighting fixtures or windows that allow entry of sunlight), the autonomous vehicle can distinguish the infrared signals generated by the transmitter from the infrared noise present in the space. Similarly, the receiver can distinguish the transmitter's signals from other off-band sources, such as infrared remote controls. In such cases, establishing an initial threshold value of infrared energy and emitting a predefined, known, modulated infrared signal from the transmitter overcomes these environmental disturbances. Additionally, the transmitter can be tuned to emit a sufficiently strong infrared signal to accommodate surfaces with varied reflectivity.



FIGS. 15A-15C illustrate side, bottom, and end views, respectively, of an exemplary embodiment of a transmitter 200 having a thin rectangular housing and configured for placement in a variety of locations including a top surface of a doorway as illustrated in FIGS. 2, 6, 17, and 18. In the illustrated embodiment, an emitter 222, 224 is located adjacent each edge EL, ER of the transmitter 200. In accordance with certain embodiments of the present teachings, each emitter can comprise a lens 222d, 224d as described above to focus and direct the emitted signal. The present teachings also contemplate the transmitter 200 comprising a third emitter 226 with a lens 226d to focus and direct the emitted signal. The illustrated transmitter 200 also comprises a battery 230 and a printed circuit board 240. As discussed above, the battery 230 can provide power to the transmitter 200 while allowing the transmitter 200 to be located without regard to proximity of power supplies such as wall outlets. Other portable power sources such as capacitors can also be used instead of, or in addition to, the battery. The printed circuit board 240 can be employed to modulate and code the emitted signals, and to provide power conversion for wirelessly charging the battery 230 or other power source. An antenna 250 can be utilized to intercept fields for conversion to current for wirelessly charging the battery, as described in more detail below.


Wireless charging in accordance with the present teachings can comprise, for example, RF scavenging or magnetoresonance. Wireless charging via RF scavenging can be accomplished as disclosed in U.S. Patent Publication No. 2009/0102296, the entire disclosure of which is incorporated herein by reference. The antenna 250 (e.g., an RF wireless communication antenna) can facilitate both energy harvesting and wireless communication for the transmitter 200 and, to facilitate energy harvesting, can harvest RF energy from a variety of sources including, for instance, medium frequency AM radio broadcast, very high frequency (VHF) FM radio broadcast, cellular base stations, wireless data access points, etc. The energy can be harvested from energy naturally available in the environment (work area) or can be broadcast by a source such as an RF signal emitter on the autonomous vehicle or on another device such as a base station or a dedicated emitter. FIG. 23 schematically illustrates an embodiment of the present teachings wherein an autonomous vehicle 12 includes an RF signal emitter 360 that directs an RF signal toward the transmitter 200 for harvesting to ensure adequate RF energy for recharging the battery 230 or other power source. The printed circuit board 240 can serve to convert the harvested RF energy into a usable form, for example AC voltage or DC voltage. The printed circuit board 240 can also regulate the converted power.


Certain embodiments of the present teachings contemplate wireless charging via strongly coupled magnetic resonances, or magnetoresonance. Such wireless charging is described in detail in Kurs et al., Wireless Power Transfer via Strongly Coupled Magnetic Resonances, Science, Vol. 317, pp. 83-86 (Jul. 6, 2007), the entire disclosure of which is incorporated herein by reference. For wireless charging via magnetoresonance, the antenna 250 can comprise, for example, a capture coil that can capture and convert magnetic energy to AC voltage or DC voltage. The magnetic energy captured by the capture coil can be supplied by a power source such as a highly resonant magnetic source. The power source can be located, for example, on the autonomous vehicle (in a scenario such as that illustrated in FIG. 23), on a dedicated device, or on a base station for the autonomous vehicle.


One skilled in the art will appreciate that the transmitter 200 can derive its power from a source other than a battery, for example from a wall plug or by direct connection to a building's power supply. Also, the emitters can have differing locations on the transmitter, and need not be combined with a lens as illustrated. The size of the transmitter can vary in accordance with functional considerations (e.g., being large enough to house its components) as well as aesthetic considerations (e.g., minimizing size to be less obtrusive).



FIGS. 16A-16C illustrate side, bottom, and end views, respectively, of another exemplary embodiment of a transmitter 300 having a thin rectangular housing and configured for placement in a variety of locations including a top surface of a doorway as illustrated in FIGS. 2, 6, 17, and 18. In the illustrated embodiment, an emitter 322, 324 is located adjacent each edge of the transmitter 300. In accordance with certain embodiments of the present teachings, each emitter can comprise a collimator 322e, 324e and a lens 324d (see FIG. 16C) as described above to focus and direct the emitted signal. Although a third emitter is not illustrated in this embodiment, the transmitter can comprise at least one additional emitter and can employ a lens and/or collimator thereon to focus and direct the emitted signal. The illustrated exemplary transmitter 300 also comprises a battery 330 and a printed circuit board 340. As discussed above, the battery 330 can provide power to the transmitter 300 while allowing the transmitter 300 to be located without regard to proximity of power supplies such as wall outlets. The printed circuit board 340 can be employed to modulate and code the emitted signals, and to provide power conversion for wirelessly charging the battery 330 or other power source. An antenna 350 can be utilized to intercept magnetic or RF fields for conversion to current for wirelessly charging the battery 330, as described above with respect to FIG. 15.


One skilled in the art will appreciate that the transmitter 300 can derive its power from a source other than a battery, for example from a wall plug or by direct connection to a building's power supply. Also, the emitters can have differing locations on the transmitter, and need not be combined with a collimator and/or a lens as illustrated. The size of the transmitter can vary in accordance with functional considerations (e.g., being large enough to house its components) as well as aesthetic considerations (e.g., minimizing size to be less obtrusive).



FIG. 17 illustrates a transmitter 200 mounted on a top surface T of a doorway DW or other passage between two areas. In the illustrated embodiment, because the transmitter 200 is placed at a high position within the room or work area, the emitted signals should not be directed upward toward the ceiling and instead should be directed toward the portion of the room through which the autonomous vehicle 12 travels. In accordance with various embodiments, the emitted signals can be coded and modulated as discussed above, so that the autonomous vehicle 12 can recognize the transmitter for localization and/or navigation purposes. In addition, in accordance with certain embodiments, the emitted signals can include information for the autonomous vehicle 12, such as information instructing the autonomous vehicle to adjust its cleaning behavior.


In embodiments of the present teachings employing more than two emitters, the signals can be utilized, e.g., with collimators or lenses, to distinguish different areas within a room. Such a configuration allows the autonomous vehicle 12 to sense its relative location within a room and adjust its cleaning behavior accordingly. For example, a signal could mark an area of the room that an autonomous vehicle would likely get stuck in. The signal could allow an autonomous vehicle to recognize the area and accordingly not enter it, even though it might otherwise be able to do so unimpeded. Alternatively or additionally, different signals could mark areas that require different cleaning behaviors (e.g., due to carpeting or wood floors, high traffic areas, etc.).


The transmitters 200, 300 as illustrated in FIGS. 15A-15C and FIGS. 16A-16C, respectively, can function in a manner similar to transmitter 120 in FIG. 2, as described above, with the additional emitter(s) allowing more functionality, as described above, such as indicating areas requiring different cleaning behaviors. The transmitters 200, 300 can also function in a manner similar to the transmitters illustrated in FIG. 6, and particularly those located within the doorway/room transitions in FIG. 6.



FIG. 18 illustrates the autonomous vehicle of FIG. 17 passing through a doorway DW, and additionally illustrates an exemplary embodiment of the present teachings utilizing a remote control 370 to communicate with the autonomous vehicle 12 and/or the transmitter 200. An exemplary embodiment of a remote control 370 is disclosed in more detail in FIGS. 19A-22.


As illustrated in FIGS. 19A-19C, the remote control 370 can include one or more power buttons 340 for powering ON/OFF the remote control 370, the transmitter 200, 300, and/or the autonomous vehicle 12. In addition, the remote control 370 can include a display 310 (e.g., a liquid crystal display) and one or more input devices 320, 330 such as buttons and/or a toggle pad. FIGS. 19A-19C show the remote control 370 being used to set up an autonomous vehicle for cleaning. In FIG. 19A, the display 310 displays a variety of room types to be cleaned by the autonomous vehicle. In the illustrated embodiment, the user can locate himself and the remote control 370 in a work area to be cleaned and select from a number of room type choices, such as bedroom, office, kitchen, utility room, living room, dining room, bathroom, and hallway. The system can identify this room via an encoded and/or modulated emitted signal from a nearby transmitter. The user selects one of the room types by pressing an adjacent button 320. Thereafter, the display 310 can acknowledge the user's selection and automatically connect to a controller (see FIG. 19B), such as a personal computer, to allow the user to provide a specific name for the room. In other embodiments, the remote control can correlate the coded emitted signal with the chosen/assigned name and allow a user to choose whether to engage in specific room naming (e.g., via input 320) or just assign a predetermined name to the room such as bedroom 1, office 1, kitchen 1, etc. Once a room has been assigned an appropriate name, the remote control can allow the user to enter additional names or continue other aspects of setup. In FIG. 19C, the remote control 370 displays the rooms that have been registered and allows the user to select which rooms are to be cleaned. In the illustrated exemplary embodiment, the user can select one or more of the registered rooms by pressing an adjacent button 320. 
The system can then determine the order of the rooms to be cleaned and the start time (e.g., immediately), or can allow the user to determine the order of the rooms to be cleaned and/or the start time. In certain embodiments, the system can allow the user to select a start time for each selected room.


Another input device 330, shown in the illustrated embodiment as a toggle pad or toggle button, can allow the user to direct the autonomous vehicle to perform a number of functions. For example, the user can press a center “CLEAN” portion of the toggle button to direct the autonomous vehicle to begin cleaning immediately, or can select the right “DOCK NOW” button to direct the autonomous vehicle to begin a homing behavior and dock. A top “SCHEDULE” button can be pressed to allow the user to select a schedule of rooms and/or times for cleaning, an exemplary process for which is illustrated in FIGS. 20A-20C. The user can also select the left “MODES” button to select among a variety of available cleaning modes such as spot clean, deep clean, area rug, drive now, etc. as illustrated in FIG. 21A. The modes displayed in FIG. 21A can be selected by pressing a button 320 adjacent a desired mode. In certain embodiments, after a mode has been selected, the remote control 370 can provide further instructions to the user. For example, if an “area rug” mode has been selected, the remote control 370 can display instructions confirming that the autonomous vehicle is in “area rug” mode and instructing the user to place the autonomous vehicle on the area rug and then press the central “CLEAN” button. In the illustrated embodiment of FIG. 21B, the remote control 370 confirms that the “ROBOT WILL CLEAN THE RUG ONLY.” In another example, if a “DRIVE NOW” mode is selected, the remote control 370 can allow the user to drive the vehicle. In accordance with FIG. 21C, the remote control 370 can inform the user that the autonomous vehicle is in a “DRIVE NOW MODE” and instruct the user to press certain buttons to drive the robot. 
For example, the top “SCHEDULE” button can be pressed to move the autonomous vehicle forward, the left “MODES” button can be used to turn the vehicle to the left, the right “DOCK NOW” button can be used to turn the vehicle to the right, and the bottom “SETUP” button can be used to move the vehicle backward. One skilled in the art will appreciate that other buttons can be used to drive the vehicle, such as a dedicated drive toggle or input buttons 320. In certain embodiments, the remote control 370 can also inform the user how to exit the “DRIVE NOW” mode, such as by pressing a portion of the toggle button 330.



FIGS. 20A-20C illustrate an exemplary embodiment of cleaning schedule displays that can be utilized when the user has pressed the top “SCHEDULE” portion of toggle button 330. In the illustrated exemplary embodiment, cleaning frequency choices are first displayed for user selection. For example, twice daily, daily, three times per week, weekly, bi-weekly, or monthly can be selected. In certain embodiments, a “CUSTOM” selection can also be made. The user can select a frequency by pressing the button 320 adjacent the preferred frequency. In accordance with certain embodiments, once a frequency has been selected, or if “CUSTOM” is selected, the remote control can display the days of the week for cleaning (see FIG. 20B). The user can select the desired days by pressing the button 320 adjacent those days. In addition, in accordance with certain embodiments, the user can select a single cleaning time for all selected days or a separate cleaning time for each selected day. Thereafter, as illustrated in FIG. 20C, the user can be prompted by the display 310 to select one or more rooms for cleaning at the desired date and time. In accordance with various embodiments of the present teachings, a user could select “CUSTOM” and set a date and time for each room registered in accordance with FIGS. 19A-19C, or could select a predefined schedule as illustrated in FIG. 20A and personalize that selection by choosing days and times if desired.
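The schedule assembled through the displays of FIGS. 20A-20C can be thought of as a simple record of frequency, days, time, and rooms. The following Python sketch is illustrative only; the `CleaningSchedule` type and its field names are assumptions, not structures disclosed in the patent.

```python
# Illustrative record of a cleaning schedule like the one assembled in
# FIGS. 20A-20C; all field names are assumed for illustration.

from dataclasses import dataclass, field
from typing import List

@dataclass
class CleaningSchedule:
    frequency: str                                   # e.g. "DAILY", "WEEKLY", or "CUSTOM"
    days: List[str] = field(default_factory=list)    # days of the week selected in FIG. 20B
    time: str = "09:00"                              # cleaning start time for the selected days
    rooms: List[str] = field(default_factory=list)   # rooms selected for cleaning in FIG. 20C

# A user who picked "CUSTOM", two days, a time, and two registered rooms:
schedule = CleaningSchedule(
    frequency="CUSTOM",
    days=["MON", "THU"],
    time="10:30",
    rooms=["KITCHEN", "LIVING ROOM"],
)
print(schedule.frequency, schedule.days)
```

A per-room custom schedule could be represented as one such record per registered room.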


In accordance with certain embodiments of the present teachings, the remote control 370 can also display a status screen such as that illustrated in FIG. 22. The status screen can have a variety of formats for informing the user how much of a scheduled cleaning has been completed. The status screen can be accessed in a variety of ways via manipulation of the remote control 370, or may appear in the manner of a screen saver when the remote control 370 is not being used for controlling an autonomous vehicle or inputting data. One skilled in the art will understand that the selections facilitated by the remote control 370 in FIGS. 19A-22 can also be accomplished via other devices, such as a handheld PDA, a cellular phone, a laptop, or other similar computing devices.
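A status screen reporting how much of a scheduled cleaning has been completed could compute its message as in the following Python sketch. The `cleaning_progress` helper and its message strings are assumptions for illustration; the patent does not specify a particular format.

```python
# Minimal sketch of a status-screen message like that of FIG. 22;
# the function name and message wording are assumed, not from the patent.

def cleaning_progress(completed_rooms: int, total_rooms: int) -> str:
    """Format a completion percentage for display on the remote control."""
    if total_rooms <= 0:
        return "NO ROOMS SCHEDULED"
    percent = round(100 * completed_rooms / total_rooms)
    return f"{percent}% OF SCHEDULED CLEANING COMPLETE"

print(cleaning_progress(2, 4))  # 50% OF SCHEDULED CLEANING COMPLETE
```

The same message could be refreshed periodically while the screen acts as a screen saver.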


Other embodiments of the present teachings will be apparent to those skilled in the art from consideration of the specification and practice of the teachings disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the present teachings being indicated by the following claims.

Claims
  • 1. A navigation control system for an autonomous vehicle, the system comprising: a transmitter comprising: a first emitter for emitting a first signal; a second emitter for emitting a second signal; a power source for powering both the first emitter and the second emitter; an antenna for capturing wireless energy; and a printed circuit board for converting the captured wireless energy to a form for charging the power source powering both the first emitter and the second emitter; and an autonomous vehicle operating within a working area, the autonomous vehicle having a receiving system comprising: a receiver for detecting the first and second signals; and a processor for determining a relative location of the autonomous vehicle within the working area based on the first and second signals, wherein a source of the wireless energy is located on the autonomous vehicle.
  • 2. The system of claim 1, wherein the wireless energy is RF energy.
  • 3. The system of claim 1, wherein the wireless energy is magnetoresonance energy.
  • 4. The system of claim 2, wherein the antenna is an RF antenna.
  • 5. The system of claim 3, wherein the antenna is a capture coil.
  • 6. The system of claim 1, wherein a source of the wireless energy is located on a base station for the autonomous vehicle.
  • 7. The system of claim 1, wherein the wireless energy is naturally available in an environment of the transmitter.
  • 8. The system of claim 7, wherein the first and second signals are coded.
  • 9. The system of claim 8, wherein the printed circuit board codes the first and second signals.
  • 10. A method performed by an autonomous vehicle for controlling navigation within one or more work areas, the method comprising: receiving first and second signals emitted by first and second emitters of a transmitter, each of the first and second emitters being powered by a power source of the transmitter; localizing the autonomous vehicle with respect to the transmitter using the first and second signals, and, after localizing the autonomous vehicle, charging the power source of the transmitter using wireless energy; and navigating the autonomous vehicle within the one or more work areas using the first and second signals.
  • 11. The method of claim 10, wherein the wireless energy is RF energy.
  • 12. The method of claim 10, wherein the wireless energy is magnetoresonance energy.
  • 13. The method of claim 11, wherein the transmitter comprises an RF antenna configured to capture the wireless energy.
  • 14. The method of claim 12, wherein the transmitter comprises a capture coil configured to capture the wireless energy.
  • 15. The method of claim 10, wherein the first and second signals are coded.
  • 16. The method of claim 15, wherein the transmitter comprises a printed circuit board configured to code the first and second signals.
INTRODUCTION

This application is a continuation-in-part of U.S. patent application Ser. No. 12/415,554, filed Mar. 3, 2009, now U.S. Pat. No. 8,594,840, and U.S. patent application Ser. No. 12/415,512, filed Mar. 3, 2009. These two applications are continuations of U.S. patent application Ser. No. 11/176,048, filed Jul. 7, 2005, now U.S. Pat. No. 7,706,917, which claims priority to and incorporates by reference U.S. Provisional Patent Application No. 60/586,046, entitled “Celestial Navigation System for an Autonomous vehicle,” filed on Jul. 7, 2004. The present teachings relate to robotic systems and, more specifically, to navigation systems for autonomous vehicles.

US Referenced Citations (1026)
Number Name Date Kind
1755054 Darst Apr 1930 A
1780221 Buchmann Nov 1930 A
1970302 Gerhardt Aug 1934 A
2136324 John Nov 1938 A
2302111 Dow et al. Nov 1942 A
2353621 Sav et al. Jul 1944 A
2770825 Pullen Nov 1956 A
2930055 Fallen et al. Mar 1960 A
3119369 Harland et al. Jan 1964 A
3166138 Dunn Jan 1965 A
3333564 Waters Aug 1967 A
3375375 Robert et al. Mar 1968 A
3381652 Schaefer et al. May 1968 A
3457575 Bienek Jul 1969 A
3550714 Bellinger Dec 1970 A
3569727 Aggarwal et al. Mar 1971 A
3649981 Woodworth Mar 1972 A
3674316 De Brey Jul 1972 A
3678882 Kinsella Jul 1972 A
3690559 Rudloff Sep 1972 A
3744586 Leinauer Jul 1973 A
3756667 Bombardier et al. Sep 1973 A
3809004 Leonheart May 1974 A
3816004 Bignardi Jun 1974 A
3845831 James Nov 1974 A
RE28268 Autrand Dec 1974 E
3851349 Lowder Dec 1974 A
3853086 Asplund Dec 1974 A
3863285 Hukuba Feb 1975 A
3888181 Kups Jun 1975 A
3937174 Haaga Feb 1976 A
3952361 Wilkins Apr 1976 A
3989311 Debrey Nov 1976 A
3989931 Phillips Nov 1976 A
4004313 Capra Jan 1977 A
4012681 Finger et al. Mar 1977 A
4070170 Leinfelt Jan 1978 A
4099284 Shinozaki et al. Jul 1978 A
4119900 Kremnitz Oct 1978 A
4175589 Nakamura et al. Nov 1979 A
4175892 De Brey Nov 1979 A
4196727 Verkaart et al. Apr 1980 A
4198727 Farmer Apr 1980 A
4199838 Simonsson Apr 1980 A
4209254 Reymond et al. Jun 1980 A
D258901 Keyworth Apr 1981 S
4297578 Carter Oct 1981 A
4305234 Pichelman Dec 1981 A
4306329 Yokoi Dec 1981 A
4309758 Halsall et al. Jan 1982 A
4328545 Halsall et al. May 1982 A
4367403 Miller Jan 1983 A
4369543 Chen et al. Jan 1983 A
4401909 Gorsek Aug 1983 A
4416033 Specht Nov 1983 A
4445245 Lu May 1984 A
4465370 Yuasa et al. Aug 1984 A
4477998 You Oct 1984 A
4481692 Kurz Nov 1984 A
4482960 Pryor Nov 1984 A
4492058 Goldfarb et al. Jan 1985 A
4513469 Godfrey et al. Apr 1985 A
D278732 Ohkado May 1985 S
4518437 Sommer May 1985 A
4534637 Suzuki et al. Aug 1985 A
4556313 Miller et al. Dec 1985 A
4575211 Matsumura et al. Mar 1986 A
4580311 Kurz Apr 1986 A
4601082 Kurz Jul 1986 A
4618213 Chen Oct 1986 A
4620285 Perdue Oct 1986 A
4624026 Olson et al. Nov 1986 A
4626995 Lofgren et al. Dec 1986 A
4628454 Ito Dec 1986 A
4638445 Mattaboni Jan 1987 A
4644156 Takahashi et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4652917 Miller Mar 1987 A
4654492 Koerner et al. Mar 1987 A
4654924 Getz et al. Apr 1987 A
4660969 Sorimachi et al. Apr 1987 A
4662854 Fang May 1987 A
4674048 Okumura Jun 1987 A
4679152 Perdue Jul 1987 A
4680827 Hummel Jul 1987 A
4696074 Cavalli Sep 1987 A
D292223 Trumbull Oct 1987 S
4700301 Dyke Oct 1987 A
4700427 Knepper Oct 1987 A
4703820 Reinaud Nov 1987 A
4709773 Clement et al. Dec 1987 A
4710020 Maddox et al. Dec 1987 A
4712740 Duncan et al. Dec 1987 A
4716621 Zoni Jan 1988 A
4728801 O'Connor Mar 1988 A
4733343 Yoneda et al. Mar 1988 A
4733430 Westergren Mar 1988 A
4733431 Martin Mar 1988 A
4735136 Lee et al. Apr 1988 A
4735138 Gawler et al. Apr 1988 A
4748336 Fujie et al. May 1988 A
4748833 Nagasawa Jun 1988 A
4756049 Uehara Jul 1988 A
4767213 Hummel Aug 1988 A
4769700 Pryor Sep 1988 A
4777416 George et al. Oct 1988 A
D298766 Tanno et al. Nov 1988 S
4782550 Jacobs Nov 1988 A
4796198 Boultinghouse et al. Jan 1989 A
4806751 Abe et al. Feb 1989 A
4811228 Hyyppa Mar 1989 A
4813906 Matsuyama et al. Mar 1989 A
4815157 Tsuchiya Mar 1989 A
4817000 Eberhardt Mar 1989 A
4818875 Weiner Apr 1989 A
4829442 Kadonoff et al. May 1989 A
4829626 Harkonen et al. May 1989 A
4832098 Palinkas et al. May 1989 A
4851661 Everett Jul 1989 A
4854000 Takimoto Aug 1989 A
4854006 Nishimura et al. Aug 1989 A
4855915 Dallaire Aug 1989 A
4857912 Everett et al. Aug 1989 A
4858132 Holmquist Aug 1989 A
4867570 Sorimachi et al. Sep 1989 A
4880474 Koharagi et al. Nov 1989 A
4887415 Martin Dec 1989 A
4891762 Chotiros Jan 1990 A
4893025 Lee Jan 1990 A
4901394 Nakamura et al. Feb 1990 A
4905151 Weiman et al. Feb 1990 A
4909972 Britz Mar 1990 A
4912643 Beirne Mar 1990 A
4918441 Bohman Apr 1990 A
4919224 Shyu et al. Apr 1990 A
4919489 Kopsco Apr 1990 A
4920060 Parrent et al. Apr 1990 A
4920605 Takashima May 1990 A
4933864 Evans et al. Jun 1990 A
4937912 Kurz Jul 1990 A
4953253 Fukuda et al. Sep 1990 A
4954962 Evans et al. Sep 1990 A
4955714 Stotler et al. Sep 1990 A
4956891 Wulff Sep 1990 A
4961303 McCarty et al. Oct 1990 A
4961304 Ovsborn et al. Oct 1990 A
4962453 Pong et al. Oct 1990 A
4967862 Pong et al. Nov 1990 A
4971591 Raviv et al. Nov 1990 A
4973912 Kaminski et al. Nov 1990 A
4974283 Holsten et al. Dec 1990 A
4977618 Allen Dec 1990 A
4977639 Takahashi et al. Dec 1990 A
4986663 Cecchi et al. Jan 1991 A
5001635 Yasutomi et al. Mar 1991 A
5002145 Wakaumi et al. Mar 1991 A
5012886 Jonas et al. May 1991 A
5018240 Holman May 1991 A
5020186 Lessig et al. Jun 1991 A
5022812 Coughlan et al. Jun 1991 A
5023788 Kitazume et al. Jun 1991 A
5024529 Svetkoff et al. Jun 1991 A
D318500 Malewicki et al. Jul 1991 S
5032775 Mizuno et al. Jul 1991 A
5033151 Kraft et al. Jul 1991 A
5033291 Podoloff et al. Jul 1991 A
5040116 Evans et al. Aug 1991 A
5045769 Everett Sep 1991 A
5049802 Mintus et al. Sep 1991 A
5051906 Evans et al. Sep 1991 A
5062819 Mallory Nov 1991 A
5070567 Holland Dec 1991 A
5084934 Lessig et al. Feb 1992 A
5086535 Grossmeyer et al. Feb 1992 A
5090321 Abouav Feb 1992 A
5093955 Blehert et al. Mar 1992 A
5094311 Akeel Mar 1992 A
5098262 Wecker et al. Mar 1992 A
5105502 Takashima Apr 1992 A
5105550 Shenoha Apr 1992 A
5109566 Kobayashi et al. May 1992 A
5111401 Everett, Jr. et al. May 1992 A
5115538 Cochran et al. May 1992 A
5127128 Lee Jul 1992 A
5136675 Hodson Aug 1992 A
5136750 Takashima et al. Aug 1992 A
5142985 Stearns et al. Sep 1992 A
5144471 Takanashi et al. Sep 1992 A
5144714 Mori et al. Sep 1992 A
5144715 Matsuyo et al. Sep 1992 A
5152028 Hirano Oct 1992 A
5152202 Strauss Oct 1992 A
5155684 Burke et al. Oct 1992 A
5163202 Kawakami et al. Nov 1992 A
5163320 Goshima et al. Nov 1992 A
5164579 Pryor et al. Nov 1992 A
5165064 Mattaboni Nov 1992 A
5170352 McTamaney et al. Dec 1992 A
5173881 Sindle Dec 1992 A
5182833 Yamaguchi et al. Feb 1993 A
5187662 Kamimura et al. Feb 1993 A
5202742 Frank et al. Apr 1993 A
5204814 Noonan et al. Apr 1993 A
5206500 Decker et al. Apr 1993 A
5208521 Aoyama May 1993 A
5216777 Moro et al. Jun 1993 A
5222786 Sovis et al. Jun 1993 A
5227985 DeMenthon Jul 1993 A
5233682 Abe et al. Aug 1993 A
5239720 Wood et al. Aug 1993 A
5251358 Moro et al. Oct 1993 A
5261139 Lewis Nov 1993 A
5276618 Everett Jan 1994 A
5276939 Uenishi Jan 1994 A
5277064 Knigga et al. Jan 1994 A
5279672 Betker et al. Jan 1994 A
5284452 Corona Feb 1994 A
5284522 Kobayashi et al. Feb 1994 A
5293955 Lee Mar 1994 A
D345707 Alister Apr 1994 S
5303448 Hennessey et al. Apr 1994 A
5307273 Oh et al. Apr 1994 A
5309592 Hiratsuka May 1994 A
5310379 Hippely et al. May 1994 A
5315227 Pierson et al. May 1994 A
5319827 Yang Jun 1994 A
5319828 Waldhauser et al. Jun 1994 A
5321614 Ashworth Jun 1994 A
5323483 Baeg Jun 1994 A
5324948 Dudar et al. Jun 1994 A
5331713 Tipton Jul 1994 A
5341186 Kato Aug 1994 A
5341540 Soupert et al. Aug 1994 A
5341549 Wirtz et al. Aug 1994 A
5345649 Whitlow Sep 1994 A
5352901 Poorman Oct 1994 A
5353224 Lee et al. Oct 1994 A
5363305 Cox et al. Nov 1994 A
5363935 Schempf et al. Nov 1994 A
5369347 Yoo Nov 1994 A
5369838 Wood et al. Dec 1994 A
5386862 Glover et al. Feb 1995 A
5399951 Lavallee et al. Mar 1995 A
5400244 Watanabe et al. Mar 1995 A
5404612 Ishikawa Apr 1995 A
5410479 Coker Apr 1995 A
5435405 Schempf et al. Jul 1995 A
5440216 Kim Aug 1995 A
5442358 Keeler et al. Aug 1995 A
5444965 Colens Aug 1995 A
5446356 Kim Aug 1995 A
5446445 Bloomfield et al. Aug 1995 A
5451135 Schempf et al. Sep 1995 A
5454129 Kell Oct 1995 A
5455982 Armstrong et al. Oct 1995 A
5465525 Mifune et al. Nov 1995 A
5465619 Sotack et al. Nov 1995 A
5467273 Faibish et al. Nov 1995 A
5471560 Allard et al. Nov 1995 A
5491670 Weber Feb 1996 A
5497529 Boesi Mar 1996 A
5498948 Bruni et al. Mar 1996 A
5502638 Takenaka Mar 1996 A
5505072 Oreper Apr 1996 A
5507067 Hoekstra et al. Apr 1996 A
5510893 Suzuki Apr 1996 A
5511147 Abdel Apr 1996 A
5515572 Hoekstra et al. May 1996 A
5534762 Kim Jul 1996 A
5535476 Kresse et al. Jul 1996 A
5537017 Feiten et al. Jul 1996 A
5537711 Tseng Jul 1996 A
5539953 Kurz Jul 1996 A
5542146 Hoekstra et al. Aug 1996 A
5542148 Young Aug 1996 A
5546631 Chambon Aug 1996 A
5548511 Bancroft Aug 1996 A
5551119 Worwag Sep 1996 A
5551525 Pack et al. Sep 1996 A
5553349 Kilstrom et al. Sep 1996 A
5555587 Guha Sep 1996 A
5560077 Crotchett Oct 1996 A
5568589 Hwang Oct 1996 A
D375592 Ljunggren Nov 1996 S
5608306 Rybeck et al. Mar 1997 A
5608894 Kawakami et al. Mar 1997 A
5608944 Gordon Mar 1997 A
5610488 Miyazawa Mar 1997 A
5611106 Wulff Mar 1997 A
5611108 Knowlton et al. Mar 1997 A
5613261 Kawakami et al. Mar 1997 A
5613269 Miwa Mar 1997 A
5621291 Lee Apr 1997 A
5622236 Azumi et al. Apr 1997 A
5634237 Paranjpe Jun 1997 A
5634239 Tuvin et al. Jun 1997 A
5636402 Kubo et al. Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5646494 Han Jul 1997 A
5647554 Ikegami et al. Jul 1997 A
5650702 Azumi Jul 1997 A
5652489 Kawakami Jul 1997 A
5682313 Edlund et al. Oct 1997 A
5682839 Grimsley et al. Nov 1997 A
5696675 Nakamura et al. Dec 1997 A
5698861 Oh Dec 1997 A
5709007 Chiang Jan 1998 A
5710506 Broell et al. Jan 1998 A
5714119 Kawagoe et al. Feb 1998 A
5717169 Liang et al. Feb 1998 A
5717484 Hamaguchi et al. Feb 1998 A
5720077 Nakamura et al. Feb 1998 A
5732401 Conway Mar 1998 A
5735017 Barnes et al. Apr 1998 A
5735959 Kubo et al. Apr 1998 A
5742975 Knowlton et al. Apr 1998 A
5745235 Vercammen et al. Apr 1998 A
5752871 Tsuzuki May 1998 A
5756904 Oreper et al. May 1998 A
5761762 Kubo Jun 1998 A
5764888 Bolan et al. Jun 1998 A
5767437 Rogers Jun 1998 A
5767960 Orman Jun 1998 A
5770936 Hirai et al. Jun 1998 A
5777596 Herbert Jul 1998 A
5778486 Kim Jul 1998 A
5781697 Jeong Jul 1998 A
5781960 Kilstrom et al. Jul 1998 A
5784755 Karr et al. Jul 1998 A
5786602 Pryor et al. Jul 1998 A
5787545 Colens Aug 1998 A
5793900 Nourbakhsh et al. Aug 1998 A
5794297 Muta Aug 1998 A
5802665 Knowlton et al. Sep 1998 A
5812267 Everett et al. Sep 1998 A
5814808 Takada et al. Sep 1998 A
5815880 Nakanishi Oct 1998 A
5815884 Imamura et al. Oct 1998 A
5819008 Asama et al. Oct 1998 A
5819360 Fujii Oct 1998 A
5819936 Saveliev et al. Oct 1998 A
5820821 Kawagoe et al. Oct 1998 A
5821730 Drapkin Oct 1998 A
5825981 Matsuda Oct 1998 A
5828770 Leis et al. Oct 1998 A
5831597 West et al. Nov 1998 A
5836045 Anthony et al. Nov 1998 A
5839156 Park et al. Nov 1998 A
5839532 Yoshiji et al. Nov 1998 A
5841259 Kim et al. Nov 1998 A
5867800 Leif Feb 1999 A
5867861 Kasen et al. Feb 1999 A
5869910 Colens Feb 1999 A
5894621 Kubo Apr 1999 A
5896611 Haaga Apr 1999 A
5903124 Kawakami May 1999 A
5905209 Oreper May 1999 A
5907886 Buscher Jun 1999 A
5910700 Crotzer Jun 1999 A
5911260 Suzuki Jun 1999 A
5916008 Wong Jun 1999 A
5924167 Wright et al. Jul 1999 A
5926909 McGee Jul 1999 A
5933102 Miller et al. Aug 1999 A
5933913 Wright et al. Aug 1999 A
5935179 Kleiner et al. Aug 1999 A
5935333 Davis Aug 1999 A
5940346 Sadowsky et al. Aug 1999 A
5940927 Haegermarck et al. Aug 1999 A
5940930 Oh et al. Aug 1999 A
5942869 Katou et al. Aug 1999 A
5943730 Boomgaarden Aug 1999 A
5943733 Tagliaferri Aug 1999 A
5943933 Evans et al. Aug 1999 A
5947225 Kawakami et al. Sep 1999 A
5950408 Schaedler Sep 1999 A
5959423 Nakanishi et al. Sep 1999 A
5968281 Wright et al. Oct 1999 A
5974348 Rocks Oct 1999 A
5974365 Mitchell Oct 1999 A
5983448 Wright et al. Nov 1999 A
5984880 Lander et al. Nov 1999 A
5987383 Keller et al. Nov 1999 A
5989700 Krivopal Nov 1999 A
5991951 Kubo et al. Nov 1999 A
5995883 Nishikado Nov 1999 A
5995884 Allen et al. Nov 1999 A
5996167 Close Dec 1999 A
5998953 Nakamura et al. Dec 1999 A
5998971 Corbridge Dec 1999 A
6000088 Wright et al. Dec 1999 A
6009358 Angott et al. Dec 1999 A
6012618 Matsuo Jan 2000 A
6021545 Delgado et al. Feb 2000 A
6023813 Thatcher et al. Feb 2000 A
6023814 Imamura Feb 2000 A
6025687 Himeda et al. Feb 2000 A
6026539 Mouw et al. Feb 2000 A
6030464 Azevedo Feb 2000 A
6030465 Marcussen et al. Feb 2000 A
6032327 Oka et al. Mar 2000 A
6032542 Warnick et al. Mar 2000 A
6036572 Sze Mar 2000 A
6038501 Kawakami Mar 2000 A
6040669 Hog Mar 2000 A
6041471 Charky et al. Mar 2000 A
6041472 Kasen et al. Mar 2000 A
6046800 Ohtomo et al. Apr 2000 A
6049620 Dickinson et al. Apr 2000 A
6050648 Keleny Apr 2000 A
6052821 Chouly et al. Apr 2000 A
6055042 Sarangapani Apr 2000 A
6055702 Imamura et al. May 2000 A
6061868 Moritsch et al. May 2000 A
6065182 Wright et al. May 2000 A
6070290 Schwarze et al. Jun 2000 A
6073432 Schaedler Jun 2000 A
6076025 Ueno et al. Jun 2000 A
6076026 Jambhekar et al. Jun 2000 A
6076226 Reed Jun 2000 A
6076227 Schallig et al. Jun 2000 A
6081257 Zeller Jun 2000 A
6088020 Mor Jul 2000 A
6094775 Behmer Aug 2000 A
6099091 Campbell Aug 2000 A
6101670 Song Aug 2000 A
6101671 Wright et al. Aug 2000 A
6108031 King et al. Aug 2000 A
6108067 Okamoto Aug 2000 A
6108076 Hanseder Aug 2000 A
6108269 Kabel Aug 2000 A
6108597 Kirchner et al. Aug 2000 A
6108859 Burgoon Aug 2000 A
6112143 Allen et al. Aug 2000 A
6112996 Matsuo Sep 2000 A
6119057 Kawagoe Sep 2000 A
6122798 Kobayashi et al. Sep 2000 A
6124694 Bancroft et al. Sep 2000 A
6125498 Roberts et al. Oct 2000 A
6131237 Kasper et al. Oct 2000 A
6138063 Himeda Oct 2000 A
6142252 Kinto et al. Nov 2000 A
6146041 Chen et al. Nov 2000 A
6146278 Kobayashi Nov 2000 A
6154279 Thayer Nov 2000 A
6154694 Aoki et al. Nov 2000 A
6160479 Ahlen et al. Dec 2000 A
6167332 Kurtzberg et al. Dec 2000 A
6167587 Kasper et al. Jan 2001 B1
6192548 Huffman Feb 2001 B1
6192549 Kasen et al. Feb 2001 B1
6202243 Beaufoy et al. Mar 2001 B1
6216307 Kaleta et al. Apr 2001 B1
6220865 Macri et al. Apr 2001 B1
6226830 Hendriks et al. May 2001 B1
6230362 Kasper et al. May 2001 B1
6237741 Guidetti May 2001 B1
6240342 Fiegert et al. May 2001 B1
6243913 Frank et al. Jun 2001 B1
6255793 Peless et al. Jul 2001 B1
6259979 Holmquist Jul 2001 B1
6261379 Conrad et al. Jul 2001 B1
6263539 Baig Jul 2001 B1
6263989 Won Jul 2001 B1
6272936 Oreper et al. Aug 2001 B1
6276478 Hopkins et al. Aug 2001 B1
6278918 Dickson et al. Aug 2001 B1
6279196 Kasen et al. Aug 2001 B2
6282526 Ganesh Aug 2001 B1
6283034 Miles Sep 2001 B1
6285778 Nakajima et al. Sep 2001 B1
6285930 Dickson et al. Sep 2001 B1
6286181 Kasper et al. Sep 2001 B1
6300737 Bergvall et al. Oct 2001 B1
6321337 Reshef et al. Nov 2001 B1
6321515 Colens Nov 2001 B1
6323570 Nishimura et al. Nov 2001 B1
6324714 Walz et al. Dec 2001 B1
6327741 Reed Dec 2001 B1
6332400 Meyer Dec 2001 B1
6339735 Peless et al. Jan 2002 B1
6362875 Burkley Mar 2002 B1
6370453 Sommer Apr 2002 B2
6374155 Wallach et al. Apr 2002 B1
6374157 Takamura Apr 2002 B1
6381802 Park May 2002 B2
6385515 Dickson et al. May 2002 B1
6388013 Saraf et al. May 2002 B1
6389329 Colens May 2002 B1
6397429 Legatt et al. Jun 2002 B1
6400048 Nishimura et al. Jun 2002 B1
6401294 Kasper Jun 2002 B2
6408226 Byrne et al. Jun 2002 B1
6412141 Kasper et al. Jul 2002 B2
6415203 Inoue et al. Jul 2002 B1
6418586 Fulghum Jul 2002 B2
6421870 Basham et al. Jul 2002 B1
6427285 Legatt et al. Aug 2002 B1
6430471 Kintou et al. Aug 2002 B1
6431296 Won Aug 2002 B1
6437227 Theimer Aug 2002 B1
6437465 Nishimura et al. Aug 2002 B1
6438456 Feddema et al. Aug 2002 B1
6438793 Miner et al. Aug 2002 B1
6442476 Poropat Aug 2002 B1
6442789 Legatt et al. Sep 2002 B1
6443509 Levin et al. Sep 2002 B1
6444003 Sutcliffe Sep 2002 B1
6446302 Kasper et al. Sep 2002 B1
6454036 Airey et al. Sep 2002 B1
D464091 Christianson Oct 2002 S
6457206 Judson Oct 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6463368 Feiten et al. Oct 2002 B1
6465982 Bergvall et al. Oct 2002 B1
6473167 Odell Oct 2002 B1
6480762 Uchikubo et al. Nov 2002 B1
6481515 Kirkpatrick et al. Nov 2002 B1
6482252 Conrad et al. Nov 2002 B1
6490539 Dickson et al. Dec 2002 B1
6491127 Holmberg et al. Dec 2002 B1
6493612 Bisset et al. Dec 2002 B1
6493613 Peless et al. Dec 2002 B2
6496754 Song et al. Dec 2002 B2
6496755 Wallach et al. Dec 2002 B2
6502657 Kerrebrock et al. Jan 2003 B2
6504610 Bauer et al. Jan 2003 B1
6507773 Parker et al. Jan 2003 B2
6519808 Legatt et al. Feb 2003 B2
6525509 Petersson et al. Feb 2003 B1
D471243 Cioffi et al. Mar 2003 S
6530102 Pierce et al. Mar 2003 B1
6530117 Peterson Mar 2003 B2
6532404 Colens Mar 2003 B2
6535793 Allard Mar 2003 B2
6540424 Hall et al. Apr 2003 B1
6540607 Mokris et al. Apr 2003 B2
6543210 Rostoucher et al. Apr 2003 B2
6548982 Papanikolopoulos et al. Apr 2003 B1
6553612 Dyson et al. Apr 2003 B1
6556722 Russell et al. Apr 2003 B1
6556892 Kuroki et al. Apr 2003 B2
6557104 Vu et al. Apr 2003 B2
D474312 Stephens et al. May 2003 S
6563130 Dworkowski et al. May 2003 B2
6571415 Gerber et al. Jun 2003 B2
6571422 Gordon et al. Jun 2003 B1
6572711 Sclafani et al. Jun 2003 B2
6574536 Kawagoe et al. Jun 2003 B1
6580246 Jacobs Jun 2003 B2
6581239 Dyson et al. Jun 2003 B1
6584376 Van Kommer Jun 2003 B1
6586908 Petersson et al. Jul 2003 B2
6587573 Stam et al. Jul 2003 B1
6590222 Bisset et al. Jul 2003 B1
6594551 McKinney et al. Jul 2003 B2
6594844 Jones Jul 2003 B2
6597076 Scheible et al. Jul 2003 B2
D478884 Slipy et al. Aug 2003 S
6601265 Burlington Aug 2003 B1
6604021 Imai et al. Aug 2003 B2
6604022 Parker et al. Aug 2003 B2
6605156 Clark et al. Aug 2003 B1
6609269 Kasper Aug 2003 B2
6611120 Song et al. Aug 2003 B2
6611734 Parker et al. Aug 2003 B2
6611738 Ruffner Aug 2003 B2
6615108 Peless et al. Sep 2003 B1
6615434 Davis et al. Sep 2003 B1
6615885 Ohm Sep 2003 B1
6622465 Jerome et al. Sep 2003 B2
6624744 Wilson et al. Sep 2003 B1
6625843 Kim et al. Sep 2003 B2
6629028 Paromtchik et al. Sep 2003 B2
6633150 Wallach et al. Oct 2003 B1
6637546 Wang Oct 2003 B1
6639659 Granger Oct 2003 B2
6658325 Zweig Dec 2003 B2
6658354 Lin Dec 2003 B2
6658692 Lenkiewicz et al. Dec 2003 B2
6658693 Reed Dec 2003 B1
6661239 Ozick Dec 2003 B1
6662889 De Fazio et al. Dec 2003 B2
6668951 Won Dec 2003 B2
6670817 Fournier et al. Dec 2003 B2
6671592 Bisset et al. Dec 2003 B1
6671925 Field et al. Jan 2004 B2
6677938 Maynard Jan 2004 B1
6687571 Byrne et al. Feb 2004 B1
6690134 Jones et al. Feb 2004 B1
6690993 Foulke et al. Feb 2004 B2
6697147 Ko et al. Feb 2004 B2
6705332 Field et al. Mar 2004 B2
6711280 Stafsudd et al. Mar 2004 B2
6732826 Song et al. May 2004 B2
6735811 Field et al. May 2004 B2
6735812 Hekman et al. May 2004 B2
6737591 Lapstun et al. May 2004 B1
6741054 Koselka et al. May 2004 B2
6741364 Lange et al. May 2004 B2
6748297 Song et al. Jun 2004 B2
6756703 Chang Jun 2004 B2
6760647 Nourbakhsh et al. Jul 2004 B2
6764373 Osawa et al. Jul 2004 B1
6769004 Barrett Jul 2004 B2
6774596 Bisset Aug 2004 B1
6779380 Nieuwkamp Aug 2004 B1
6781338 Jones et al. Aug 2004 B2
6809490 Jones et al. Oct 2004 B2
6810305 Kirkpatrick Oct 2004 B2
6810350 Blakley Oct 2004 B2
6830120 Yashima et al. Dec 2004 B1
6832407 Salem et al. Dec 2004 B2
6836701 McKee Dec 2004 B2
6841963 Song et al. Jan 2005 B2
6845297 Allard Jan 2005 B2
6848146 Wright et al. Feb 2005 B2
6854148 Rief et al. Feb 2005 B1
6856811 Burdue et al. Feb 2005 B2
6859010 Jeon et al. Feb 2005 B2
6859682 Naka et al. Feb 2005 B2
6860206 Rudakevych et al. Mar 2005 B1
6865447 Lau et al. Mar 2005 B2
6870792 Chiappetta Mar 2005 B2
6871115 Huang et al. Mar 2005 B2
6883201 Jones et al. Apr 2005 B2
6886651 Slocum et al. May 2005 B1
6888333 Laby May 2005 B2
6901624 Mori et al. Jun 2005 B2
6906702 Tanaka et al. Jun 2005 B1
6914403 Tsurumi Jul 2005 B2
6917854 Bayer Jul 2005 B2
6925357 Wang et al. Aug 2005 B2
6925679 Wallach et al. Aug 2005 B2
6929548 Wang Aug 2005 B2
D510066 Hickey et al. Sep 2005 S
6938298 Aasen Sep 2005 B2
6940291 Ozick Sep 2005 B1
6941199 Bottomley et al. Sep 2005 B1
6956348 Landry et al. Oct 2005 B2
6957712 Song et al. Oct 2005 B2
6960986 Asama et al. Nov 2005 B2
6965209 Jones et al. Nov 2005 B2
6965211 Tsurumi Nov 2005 B2
6968592 Takeuchi et al. Nov 2005 B2
6971140 Kim Dec 2005 B2
6975246 Trudeau Dec 2005 B1
6980229 Ebersole Dec 2005 B1
6985556 Shanmugavel et al. Jan 2006 B2
6993954 George et al. Feb 2006 B1
6999850 McDonald Feb 2006 B2
7013527 Thomas et al. Mar 2006 B2
7024278 Chiappetta et al. Apr 2006 B2
7024280 Parker et al. Apr 2006 B2
7027893 Perry et al. Apr 2006 B2
7030768 Wanie Apr 2006 B2
7031805 Lee et al. Apr 2006 B2
7032469 Bailey Apr 2006 B2
7040869 Beenker May 2006 B2
7051399 Field et al. May 2006 B2
7053578 Diehl et al. May 2006 B2
7054716 McKee et al. May 2006 B2
7055210 Keppler et al. Jun 2006 B2
7057120 Ma et al. Jun 2006 B2
7057643 Iida et al. Jun 2006 B2
7059012 Song et al. Jun 2006 B2
7065430 Naka et al. Jun 2006 B2
7066291 Martins et al. Jun 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7079923 Abramson et al. Jul 2006 B2
7085623 Siegers Aug 2006 B2
7085624 Aldred et al. Aug 2006 B2
7113847 Chmura et al. Sep 2006 B2
7133746 Abramson et al. Nov 2006 B2
7142198 Lee Nov 2006 B2
7148458 Schell et al. Dec 2006 B2
7155308 Jones Dec 2006 B2
7167775 Abramson et al. Jan 2007 B2
7171285 Kim et al. Jan 2007 B2
7173391 Jones et al. Feb 2007 B2
7174238 Zweig Feb 2007 B1
7188000 Chiappetta et al. Mar 2007 B2
7193384 Norman et al. Mar 2007 B1
7196487 Jones et al. Mar 2007 B2
7201786 Wegelin et al. Apr 2007 B2
7206677 Huldén Apr 2007 B2
7211980 Bruemmer et al. May 2007 B1
7225500 Diehl et al. Jun 2007 B2
7246405 Yan Jul 2007 B2
7248951 Huldén Jul 2007 B2
7275280 Haegermarck et al. Oct 2007 B2
7283892 Boillot et al. Oct 2007 B1
7288912 Landry et al. Oct 2007 B2
7318248 Yan Jan 2008 B1
7320149 Huffman et al. Jan 2008 B1
7321807 Laski Jan 2008 B2
7324870 Lee Jan 2008 B2
7328196 Peters Feb 2008 B2
7332890 Cohen et al. Feb 2008 B2
7346428 Huffman et al. Mar 2008 B1
7352153 Yan Apr 2008 B2
7359766 Jeon et al. Apr 2008 B2
7360277 Moshenrose et al. Apr 2008 B2
7363108 Noda et al. Apr 2008 B2
7388879 Sabe et al. Jun 2008 B2
7389156 Ziegler et al. Jun 2008 B2
7389166 Harwig et al. Jun 2008 B2
7408157 Yan Aug 2008 B2
7418762 Arai et al. Sep 2008 B2
7430455 Casey et al. Sep 2008 B2
7430462 Chiu et al. Sep 2008 B2
7441298 Svendsen et al. Oct 2008 B2
7444206 Abramson et al. Oct 2008 B2
7448113 Jones et al. Nov 2008 B2
7459871 Landry et al. Dec 2008 B2
7467026 Sakagami et al. Dec 2008 B2
7474941 Kim et al. Jan 2009 B2
7503096 Lin Mar 2009 B2
7515991 Egawa et al. Apr 2009 B2
7539557 Yamauchi May 2009 B2
7555363 Augenbraun et al. Jun 2009 B2
7557703 Yamada et al. Jul 2009 B2
7568259 Yan Aug 2009 B2
7571511 Jones et al. Aug 2009 B2
7578020 Jaworski et al. Aug 2009 B2
7600521 Woo Oct 2009 B2
7603744 Reindle Oct 2009 B2
7611583 Buckley et al. Nov 2009 B2
7617557 Reindle Nov 2009 B2
7620476 Morse et al. Nov 2009 B2
7636928 Uno Dec 2009 B2
7636982 Jones et al. Dec 2009 B2
7647144 Haegermarck Jan 2010 B2
7650666 Jang Jan 2010 B2
7660650 Kawagoe et al. Feb 2010 B2
7663333 Jones et al. Feb 2010 B2
7693605 Park Apr 2010 B2
7706917 Chiappetta et al. Apr 2010 B1
7761954 Ziegler et al. Jul 2010 B2
7765635 Park Aug 2010 B2
7784147 Burkholder et al. Aug 2010 B2
7801645 Taylor et al. Sep 2010 B2
7805220 Taylor et al. Sep 2010 B2
7809944 Kawamoto Oct 2010 B2
7832048 Harwig et al. Nov 2010 B2
7849555 Hahm et al. Dec 2010 B2
7853645 Brown et al. Dec 2010 B2
7860680 Arms et al. Dec 2010 B2
7920941 Park et al. Apr 2011 B2
7937800 Yan May 2011 B2
7957836 Myeong et al. Jun 2011 B2
8035255 Kurs et al. Oct 2011 B2
8087117 Kapoor et al. Jan 2012 B2
8106539 Schatz et al. Jan 2012 B2
8304935 Karalis et al. Nov 2012 B2
8324759 Karalis et al. Dec 2012 B2
8400017 Kurs et al. Mar 2013 B2
8410636 Kurs et al. Apr 2013 B2
8441154 Karalis et al. May 2013 B2
8461719 Kesler et al. Jun 2013 B2
8461720 Kurs et al. Jun 2013 B2
8461721 Karalis et al. Jun 2013 B2
8461722 Kurs et al. Jun 2013 B2
8466583 Karalis et al. Jun 2013 B2
8471410 Karalis et al. Jun 2013 B2
8476788 Karalis et al. Jul 2013 B2
8482158 Kurs et al. Jul 2013 B2
8487480 Kesler et al. Jul 2013 B1
8497601 Hall et al. Jul 2013 B2
8552592 Schatz et al. Oct 2013 B2
8569914 Karalis et al. Oct 2013 B2
8587153 Schatz et al. Nov 2013 B2
8587155 Giler et al. Nov 2013 B2
8598743 Hall et al. Dec 2013 B2
8618696 Kurs et al. Dec 2013 B2
20010004719 Sommer Jun 2001 A1
20010013929 Torsten Aug 2001 A1
20010020200 Das et al. Sep 2001 A1
20010025183 Shahidi Sep 2001 A1
20010037163 Allard Nov 2001 A1
20010043509 Green et al. Nov 2001 A1
20010045883 Holdaway et al. Nov 2001 A1
20010047231 Peless et al. Nov 2001 A1
20010047895 De Fazio et al. Dec 2001 A1
20020011367 Kolesnik Jan 2002 A1
20020011813 Koselka et al. Jan 2002 A1
20020016649 Jones Feb 2002 A1
20020021219 Edwards Feb 2002 A1
20020027652 Paromtchik et al. Mar 2002 A1
20020036779 Kiyoi et al. Mar 2002 A1
20020081937 Yamada et al. Jun 2002 A1
20020095239 Wallach et al. Jul 2002 A1
20020097400 Jung et al. Jul 2002 A1
20020104963 Mancevski Aug 2002 A1
20020108209 Peterson Aug 2002 A1
20020112742 Bredo et al. Aug 2002 A1
20020113973 Ge Aug 2002 A1
20020116089 Kirkpatrick Aug 2002 A1
20020120364 Colens Aug 2002 A1
20020124343 Reed Sep 2002 A1
20020153185 Song et al. Oct 2002 A1
20020156556 Ruffner Oct 2002 A1
20020159051 Guo Oct 2002 A1
20020166193 Kasper Nov 2002 A1
20020169521 Goodman et al. Nov 2002 A1
20020173877 Zweig Nov 2002 A1
20020189871 Won Dec 2002 A1
20030009259 Hattori et al. Jan 2003 A1
20030019071 Field et al. Jan 2003 A1
20030023356 Keable Jan 2003 A1
20030024986 Mazz et al. Feb 2003 A1
20030025472 Jones et al. Feb 2003 A1
20030028286 Glenn et al. Feb 2003 A1
20030030399 Jacobs Feb 2003 A1
20030058262 Sato et al. Mar 2003 A1
20030060928 Abramson et al. Mar 2003 A1
20030067451 Tagg et al. Apr 2003 A1
20030097875 Lentz et al. May 2003 A1
20030120389 Abramson et al. Jun 2003 A1
20030124312 Autumn Jul 2003 A1
20030126352 Barrett Jul 2003 A1
20030137268 Papanikolopoulos et al. Jul 2003 A1
20030146384 Logsdon et al. Aug 2003 A1
20030159232 Hekman et al. Aug 2003 A1
20030168081 Lee et al. Sep 2003 A1
20030175138 Beenker Sep 2003 A1
20030192144 Song et al. Oct 2003 A1
20030193657 Uomori et al. Oct 2003 A1
20030216834 Allard Nov 2003 A1
20030221114 Hino et al. Nov 2003 A1
20030229421 Chmura et al. Dec 2003 A1
20030229474 Suzuki et al. Dec 2003 A1
20030233171 Heiligensetzer Dec 2003 A1
20030233177 Johnson et al. Dec 2003 A1
20030233870 Mancevski Dec 2003 A1
20030233930 Ozick Dec 2003 A1
20040016077 Song et al. Jan 2004 A1
20040020000 Jones Feb 2004 A1
20040030448 Solomon Feb 2004 A1
20040030449 Solomon Feb 2004 A1
20040030450 Solomon Feb 2004 A1
20040030451 Solomon Feb 2004 A1
20040030570 Solomon Feb 2004 A1
20040030571 Solomon Feb 2004 A1
20040031113 Wosewick et al. Feb 2004 A1
20040049877 Jones et al. Mar 2004 A1
20040055163 McCambridge et al. Mar 2004 A1
20040068351 Solomon Apr 2004 A1
20040068415 Solomon Apr 2004 A1
20040068416 Solomon Apr 2004 A1
20040074038 Im et al. Apr 2004 A1
20040074044 Diehl et al. Apr 2004 A1
20040076324 Burl et al. Apr 2004 A1
20040083570 Song et al. May 2004 A1
20040085037 Jones et al. May 2004 A1
20040088079 Lavarec et al. May 2004 A1
20040093122 Galibraith May 2004 A1
20040098167 Yi et al. May 2004 A1
20040111184 Chiappetta et al. Jun 2004 A1
20040111821 Lenkiewicz et al. Jun 2004 A1
20040113777 Matsuhira et al. Jun 2004 A1
20040117064 McDonald Jun 2004 A1
20040117846 Karaoguz et al. Jun 2004 A1
20040118998 Wingett et al. Jun 2004 A1
20040128028 Miyamoto et al. Jul 2004 A1
20040133316 Dean Jul 2004 A1
20040134336 Solomon Jul 2004 A1
20040134337 Solomon Jul 2004 A1
20040143919 Wilder Jul 2004 A1
20040148419 Chen et al. Jul 2004 A1
20040148731 Damman et al. Aug 2004 A1
20040153212 Profio et al. Aug 2004 A1
20040156541 Jeon et al. Aug 2004 A1
20040158357 Lee et al. Aug 2004 A1
20040181706 Chen et al. Sep 2004 A1
20040187249 Jones et al. Sep 2004 A1
20040187457 Colens Sep 2004 A1
20040196451 Aoyama Oct 2004 A1
20040200505 Taylor et al. Oct 2004 A1
20040201361 Koh et al. Oct 2004 A1
20040204792 Taylor et al. Oct 2004 A1
20040204804 Lee et al. Oct 2004 A1
20040210345 Noda et al. Oct 2004 A1
20040210347 Sawada et al. Oct 2004 A1
20040211444 Taylor et al. Oct 2004 A1
20040221790 Sinclair et al. Nov 2004 A1
20040236468 Taylor et al. Nov 2004 A1
20040244138 Taylor et al. Dec 2004 A1
20040255425 Arai et al. Dec 2004 A1
20050000543 Taylor et al. Jan 2005 A1
20050010330 Abramson et al. Jan 2005 A1
20050010331 Taylor et al. Jan 2005 A1
20050015913 Kim et al. Jan 2005 A1
20050021181 Kim et al. Jan 2005 A1
20050028316 Thomas et al. Feb 2005 A1
20050053912 Roth et al. Mar 2005 A1
20050055796 Wright et al. Mar 2005 A1
20050067994 Jones et al. Mar 2005 A1
20050085947 Aldred et al. Apr 2005 A1
20050091782 Gordon et al. May 2005 A1
20050091786 Wright et al. May 2005 A1
20050137749 Jeon et al. Jun 2005 A1
20050144751 Kegg et al. Jul 2005 A1
20050150074 Diehl et al. Jul 2005 A1
20050150519 Keppler et al. Jul 2005 A1
20050154795 Kuz et al. Jul 2005 A1
20050156562 Cohen et al. Jul 2005 A1
20050163119 Ito et al. Jul 2005 A1
20050165508 Kanda et al. Jul 2005 A1
20050166354 Uehigashi Aug 2005 A1
20050166355 Tani Aug 2005 A1
20050172445 Diehl et al. Aug 2005 A1
20050183229 Uehigashi Aug 2005 A1
20050183230 Uehigashi Aug 2005 A1
20050187678 Myeong et al. Aug 2005 A1
20050192707 Park et al. Sep 2005 A1
20050204717 Colens Sep 2005 A1
20050209736 Kawagoe Sep 2005 A1
20050211880 Schell et al. Sep 2005 A1
20050212929 Schell et al. Sep 2005 A1
20050213082 DiBernardo et al. Sep 2005 A1
20050213109 Schell et al. Sep 2005 A1
20050217042 Reindle Oct 2005 A1
20050218852 Landry et al. Oct 2005 A1
20050222933 Wesby Oct 2005 A1
20050229340 Sawalski et al. Oct 2005 A1
20050229355 Crouch et al. Oct 2005 A1
20050235451 Yan Oct 2005 A1
20050251292 Casey et al. Nov 2005 A1
20050255425 Pierson Nov 2005 A1
20050258154 Blankenship et al. Nov 2005 A1
20050273967 Taylor et al. Dec 2005 A1
20050288819 de Guzman Dec 2005 A1
20060000050 Cipolla et al. Jan 2006 A1
20060009879 Lynch et al. Jan 2006 A1
20060010638 Shimizu et al. Jan 2006 A1
20060020369 Taylor et al. Jan 2006 A1
20060020370 Abramson Jan 2006 A1
20060021168 Nishikawa Feb 2006 A1
20060025134 Cho et al. Feb 2006 A1
20060037170 Shimizu Feb 2006 A1
20060042042 Mertes et al. Mar 2006 A1
20060044546 Lewin et al. Mar 2006 A1
20060060216 Woo Mar 2006 A1
20060061657 Rew et al. Mar 2006 A1
20060064828 Stein et al. Mar 2006 A1
20060087273 Ko et al. Apr 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060100741 Jung May 2006 A1
20060107894 Buckley et al. May 2006 A1
20060119839 Bertin et al. Jun 2006 A1
20060143295 Costa et al. Jun 2006 A1
20060146776 Kim Jul 2006 A1
20060150361 Aldred et al. Jul 2006 A1
20060184293 Konandreas et al. Aug 2006 A1
20060185690 Song et al. Aug 2006 A1
20060190133 Konandreas et al. Aug 2006 A1
20060190134 Ziegler et al. Aug 2006 A1
20060190146 Morse et al. Aug 2006 A1
20060196003 Song et al. Sep 2006 A1
20060200281 Ziegler et al. Sep 2006 A1
20060220900 Ceskutti et al. Oct 2006 A1
20060229774 Park et al. Oct 2006 A1
20060259194 Chiu Nov 2006 A1
20060259494 Watson et al. Nov 2006 A1
20060278161 Burkholder et al. Dec 2006 A1
20060288519 Jaworski et al. Dec 2006 A1
20060293787 Kanda et al. Dec 2006 A1
20060293808 Qian Dec 2006 A1
20070006404 Cheng et al. Jan 2007 A1
20070016328 Ziegler et al. Jan 2007 A1
20070017061 Yan Jan 2007 A1
20070028574 Yan Feb 2007 A1
20070032904 Kawagoe et al. Feb 2007 A1
20070042716 Goodall et al. Feb 2007 A1
20070043459 Abbott et al. Feb 2007 A1
20070061041 Zweig Mar 2007 A1
20070061043 Ermakov et al. Mar 2007 A1
20070114975 Cohen et al. May 2007 A1
20070142964 Abramson Jun 2007 A1
20070150096 Yeh et al. Jun 2007 A1
20070156286 Yamauchi Jul 2007 A1
20070157415 Lee et al. Jul 2007 A1
20070157420 Lee et al. Jul 2007 A1
20070179670 Chiappetta et al. Aug 2007 A1
20070226949 Hahm et al. Oct 2007 A1
20070234492 Svendsen et al. Oct 2007 A1
20070244610 Ozick et al. Oct 2007 A1
20070245511 Hahm et al. Oct 2007 A1
20070250212 Halloran et al. Oct 2007 A1
20070261193 Gordon et al. Nov 2007 A1
20070266508 Jones et al. Nov 2007 A1
20080007203 Cohen et al. Jan 2008 A1
20080039974 Sandin et al. Feb 2008 A1
20080052846 Kapoor et al. Mar 2008 A1
20080091304 Ozick et al. Apr 2008 A1
20080109126 Sandin et al. May 2008 A1
20080134458 Ziegler et al. Jun 2008 A1
20080140255 Ziegler et al. Jun 2008 A1
20080155768 Ziegler et al. Jul 2008 A1
20080184518 Taylor et al. Aug 2008 A1
20080266748 Lee Oct 2008 A1
20080276407 Schnittman et al. Nov 2008 A1
20080281470 Gilbert et al. Nov 2008 A1
20080282494 Won et al. Nov 2008 A1
20080294288 Yamauchi Nov 2008 A1
20080302586 Yan Dec 2008 A1
20080307590 Jones et al. Dec 2008 A1
20090007366 Svendsen et al. Jan 2009 A1
20090038089 Landry et al. Feb 2009 A1
20090048727 Hong et al. Feb 2009 A1
20090049640 Lee et al. Feb 2009 A1
20090055022 Casey et al. Feb 2009 A1
20090102296 Greene et al. Apr 2009 A1
20090292393 Casey et al. Nov 2009 A1
20100006028 Buckley et al. Jan 2010 A1
20100011529 Won et al. Jan 2010 A1
20100049365 Jones et al. Feb 2010 A1
20100063628 Landry et al. Mar 2010 A1
20100082193 Chiappetta Apr 2010 A1
20100257690 Jones et al. Oct 2010 A1
20100257691 Jones et al. Oct 2010 A1
20100263158 Jones et al. Oct 2010 A1
20100268384 Jones et al. Oct 2010 A1
20100293742 Chung et al. Nov 2010 A1
20100312429 Jones et al. Dec 2010 A1
Foreign Referenced Citations (330)
Number Date Country
2128842 Dec 1980 DE
3317376 Dec 1987 DE
3536907 Feb 1989 DE
3404202 Dec 1992 DE
199311014 Oct 1993 DE
4338841 May 1995 DE
4414683 Oct 1995 DE
19849978 Feb 2001 DE
102004038074 Jun 2005 DE
10357636 Jul 2005 DE
102004041021 Aug 2005 DE
102005046813 Apr 2007 DE
338988 Dec 1988 DK
0114926 Aug 1984 EP
265542 May 1988 EP
281085 Sep 1988 EP
286328 Oct 1988 EP
294101 Dec 1988 EP
352045 Jan 1990 EP
433697 Jun 1991 EP
437024 Jul 1991 EP
554978 Aug 1993 EP
0615719 Sep 1994 EP
792726 Sep 1997 EP
861629 Sep 1998 EP
930040 Jul 1999 EP
845237 Apr 2000 EP
1139847 Oct 2001 EP
1228734 Aug 2002 EP
1331537 Jul 2003 EP
1380245 Jan 2004 EP
1380246 Jan 2004 EP
1018315 Nov 2004 EP
1557730 Jul 2005 EP
1642522 Apr 2006 EP
1836941 Sep 2007 EP
2238196 Aug 2005 ES
722755 Mar 1932 FR
2601443 Jan 1988 FR
2828589 Feb 2003 FR
702426 Jan 1954 GB
2128842 May 1984 GB
2225221 May 1990 GB
2267360 Dec 1993 GB
2283838 May 1995 GB
2284957 Jun 1995 GB
2300082 Oct 1996 GB
2404330 Feb 2005 GB
2417354 Feb 2006 GB
53021869 Feb 1978 JP
53110257 Sep 1978 JP
57064217 Apr 1982 JP
59005315 Jan 1984 JP
59033511 Feb 1984 JP
59094005 May 1984 JP
59112311 Jun 1984 JP
59120124 Jul 1984 JP
59131668 Sep 1984 JP
59164973 Sep 1984 JP
59184917 Oct 1984 JP
59212924 Dec 1984 JP
59226909 Dec 1984 JP
60089213 May 1985 JP
68889213 May 1985 JP
60211510 Oct 1985 JP
60259895 Dec 1985 JP
61023221 Jan 1986 JP
61097712 May 1986 JP
61160366 Jul 1986 JP
62074018 Apr 1987 JP
62263507 Nov 1987 JP
62263508 Nov 1987 JP
63079623 Apr 1988 JP
63158032 Jul 1988 JP
63183032 Jul 1988 JP
63203483 Aug 1988 JP
63241610 Oct 1988 JP
10295595 Nov 1988 JP
1118752 May 1989 JP
2283343 Nov 1990 JP
3051023 Mar 1991 JP
4019586 Jan 1992 JP
4074285 Mar 1992 JP
4084921 Mar 1992 JP
04300516 Oct 1992 JP
5023269 Feb 1993 JP
5042076 Feb 1993 JP
5046246 Feb 1993 JP
5060049 Mar 1993 JP
5091604 Apr 1993 JP
5095879 Apr 1993 JP
5150827 Jun 1993 JP
5150829 Jun 1993 JP
5054620 Jul 1993 JP
5257527 Oct 1993 JP
5257533 Oct 1993 JP
5285861 Nov 1993 JP
5302836 Nov 1993 JP
5312514 Nov 1993 JP
5046239 Dec 1993 JP
5341904 Dec 1993 JP
6003251 Jan 1994 JP
6038912 Feb 1994 JP
6105781 Apr 1994 JP
6154143 Jun 1994 JP
6293095 Oct 1994 JP
6327598 Nov 1994 JP
7047046 Feb 1995 JP
7129239 May 1995 JP
7059702 Jun 1995 JP
7222705 Aug 1995 JP
7270518 Oct 1995 JP
7281752 Oct 1995 JP
7311041 Nov 1995 JP
7313417 Dec 1995 JP
7319542 Dec 1995 JP
8000393 Jan 1996 JP
8016241 Jan 1996 JP
8063229 Mar 1996 JP
8084696 Apr 1996 JP
8089449 Apr 1996 JP
80894151 Apr 1996 JP
8123548 May 1996 JP
8152916 Jun 1996 JP
8256960 Oct 1996 JP
8263137 Oct 1996 JP
8286741 Nov 1996 JP
8286744 Nov 1996 JP
8286745 Nov 1996 JP
8286747 Nov 1996 JP
8322774 Dec 1996 JP
8335112 Dec 1996 JP
8339297 Dec 1996 JP
9044240 Feb 1997 JP
9047413 Feb 1997 JP
10240343 Feb 1997 JP
9066855 Mar 1997 JP
9145309 Jun 1997 JP
9160644 Jun 1997 JP
9179625 Jul 1997 JP
9185410 Jul 1997 JP
9192069 Jul 1997 JP
9204223 Aug 1997 JP
9204224 Aug 1997 JP
9233712 Sep 1997 JP
9251318 Sep 1997 JP
9265319 Oct 1997 JP
9269807 Oct 1997 JP
9269810 Oct 1997 JP
9269824 Oct 1997 JP
9319431 Dec 1997 JP
9319432 Dec 1997 JP
9319434 Dec 1997 JP
9325812 Dec 1997 JP
10055215 Feb 1998 JP
10117973 May 1998 JP
10118963 May 1998 JP
10165738 Jun 1998 JP
10177414 Jun 1998 JP
10214114 Aug 1998 JP
10240342 Sep 1998 JP
10260727 Sep 1998 JP
10314088 Dec 1998 JP
11015941 Jan 1999 JP
11065655 Mar 1999 JP
11102219 Apr 1999 JP
11102220 Apr 1999 JP
11162454 Jun 1999 JP
11174145 Jul 1999 JP
11175149 Jul 1999 JP
11178765 Jul 1999 JP
11212642 Aug 1999 JP
11213157 Aug 1999 JP
11282532 Oct 1999 JP
11282533 Oct 1999 JP
11295412 Oct 1999 JP
11346964 Dec 1999 JP
2000047728 Feb 2000 JP
2000056006 Feb 2000 JP
2000056831 Feb 2000 JP
2000060782 Feb 2000 JP
2000075925 Mar 2000 JP
2000102499 Apr 2000 JP
2000275321 Oct 2000 JP
2000279353 Oct 2000 JP
2000342497 Dec 2000 JP
2000353014 Dec 2000 JP
2001022443 Jan 2001 JP
2001067588 Mar 2001 JP
2001087182 Apr 2001 JP
2001121455 May 2001 JP
2001197008 Jul 2001 JP
3197758 Aug 2001 JP
3201903 Aug 2001 JP
2001216482 Aug 2001 JP
2001258807 Sep 2001 JP
2001265437 Sep 2001 JP
2001275908 Oct 2001 JP
2001289939 Oct 2001 JP
2001306170 Nov 2001 JP
2002078650 Mar 2002 JP
2002204768 Jul 2002 JP
2002204769 Jul 2002 JP
2002247510 Aug 2002 JP
2002323925 Nov 2002 JP
2002333920 Nov 2002 JP
3356170 Dec 2002 JP
2002355206 Dec 2002 JP
2002360471 Dec 2002 JP
2002360479 Dec 2002 JP
2002360482 Dec 2002 JP
2002366227 Dec 2002 JP
2002369778 Dec 2002 JP
2003010076 Jan 2003 JP
2003010088 Jan 2003 JP
2003028528 Jan 2003 JP
2003036116 Feb 2003 JP
2003038401 Feb 2003 JP
2003038402 Feb 2003 JP
2003047579 Feb 2003 JP
2000066722 Mar 2003 JP
2003061882 Mar 2003 JP
2003084994 Mar 2003 JP
2003167628 Jun 2003 JP
2003180586 Jul 2003 JP
2003180587 Jul 2003 JP
2003186539 Jul 2003 JP
2003190064 Jul 2003 JP
2003241836 Aug 2003 JP
2003262520 Sep 2003 JP
2003304992 Oct 2003 JP
2003310509 Nov 2003 JP
2003330543 Nov 2003 JP
2004123040 Apr 2004 JP
2004148021 May 2004 JP
2004160102 Jun 2004 JP
2004166968 Jun 2004 JP
2004198330 Jul 2004 JP
2004219185 Aug 2004 JP
2004351234 Dec 2004 JP
2005118354 May 2005 JP
2005211360 Aug 2005 JP
2005224265 Aug 2005 JP
2005230032 Sep 2005 JP
2005245916 Sep 2005 JP
2005346700 Dec 2005 JP
2005352707 Dec 2005 JP
2006043071 Feb 2006 JP
2006155274 Jun 2006 JP
2006164223 Jun 2006 JP
2006227673 Aug 2006 JP
2006247467 Sep 2006 JP
2006260161 Sep 2006 JP
2006293662 Oct 2006 JP
2007034866 Feb 2007 JP
2007213180 Aug 2007 JP
2009015611 Jan 2009 JP
2010198552 Sep 2010 JP
WO9526512 Oct 1995 WO
WO9530887 Nov 1995 WO
WO9617258 Jun 1996 WO
WO9715224 May 1997 WO
WO9740734 Nov 1997 WO
WO9741451 Nov 1997 WO
WO9853456 Nov 1998 WO
WO9905580 Feb 1999 WO
WO9916078 Apr 1999 WO
WO9928800 Jun 1999 WO
WO9938056 Jul 1999 WO
WO9938237 Jul 1999 WO
WO9943250 Sep 1999 WO
WO9959042 Nov 1999 WO
WO0004430 Jan 2000 WO
WO0038026 Jun 2000 WO
WO0038028 Jun 2000 WO
WO0038029 Jun 2000 WO
WO0106904 Feb 2001 WO
WO0106905 Feb 2001 WO
WO0180703 Nov 2001 WO
WO0191623 Dec 2001 WO
WO02039864 May 2002 WO
WO02039868 May 2002 WO
WO02058527 Aug 2002 WO
WO02062194 Aug 2002 WO
WO02067744 Sep 2002 WO
WO02067745 Sep 2002 WO
WO02067752 Sep 2002 WO
WO02069775 Sep 2002 WO
WO02071175 Sep 2002 WO
WO02075350 Sep 2002 WO
WO02075356 Sep 2002 WO
WO02075469 Sep 2002 WO
WO02075470 Sep 2002 WO
WO02081074 Oct 2002 WO
WO02101477 Dec 2002 WO
WO03015220 Feb 2003 WO
WO03024292 Mar 2003 WO
WO03040546 May 2003 WO
WO03040845 May 2003 WO
WO03040846 May 2003 WO
WO03062850 Jul 2003 WO
WO03062852 Jul 2003 WO
WO2004025947 Mar 2004 WO
WO2004058028 Jul 2004 WO
WO2004059409 Jul 2004 WO
WO2005006935 Jan 2005 WO
WO2005037496 Apr 2005 WO
WO2005055795 Jun 2005 WO
WO2005055796 Jun 2005 WO
WO2005076545 Aug 2005 WO
WO2005077243 Aug 2005 WO
WO2005077244 Aug 2005 WO
WO2005081074 Sep 2005 WO
WO2005082223 Sep 2005 WO
WO2005083541 Sep 2005 WO
WO2005098475 Oct 2005 WO
WO2005098476 Oct 2005 WO
WO2006046400 May 2006 WO
WO2006061133 Jun 2006 WO
WO2006068403 Jun 2006 WO
WO2006073248 Jul 2006 WO
WO2006089307 Aug 2006 WO
WO2007028049 Mar 2007 WO
WO2007036490 Apr 2007 WO
WO2007065033 Jun 2007 WO
WO2007137234 Nov 2007 WO
Non-Patent Literature Citations (167)
Entry
Kurs, A., et al., “Wireless Power Transfer via Strongly Coupled Magnetic Resonances,” Science, vol. 317, pp. 83-86, Jul. 6, 2007.
Prassler et al., “A Short History of Cleaning Robots,” Autonomous Robots, vol. 9, pp. 211-226 (2000).
Jarosiewicz et al. “Final Report—Lucid”, University of Florida, Department of Electrical and Computer Engineering, EEL 5666—Intelligent Machine Design Laboratory, 50 pages, Aug. 4, 1999.
Jensfelt, et al. “Active Global Localization for a mobile robot using multiple hypothesis tracking”, IEEE Transactions on Robotics and Automation vol. 17, No. 5, pp. 748-760, Oct. 2001.
Jeong, et al. “An intelligent map-building system for indoor mobile robot using low cost photo sensors”, SPIE vol. 6042 6 pages, 2005.
Kahney, “Robot Vacs are in the House,” www.wired.com/news/technology/o,1282,59237,00.html, 6 pages, Jun. 18, 2003.
Karcher “Karcher RoboCleaner RC 3000”, www.robocleaner.de/english/screen3.html, 4 pages, Dec. 12, 2003.
Karcher USA “RC 3000 Robotics cleaner”, www.karcher-usa.com, 3 pages, Mar. 18, 2005.
Karlsson et al., The vSLAM Algorithm for Robust Localization and Mapping, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005.
Karlsson, et al Core Technologies for service Robotics, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 3, pp. 2979-2984, Sep. 28-Oct. 2, 2004.
Kleinberg, The Localization Problem for Mobile Robots, Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994.
Knight, et al., “Localization and Identification of Visual Landmarks”, Journal of Computing Sciences in Colleges, vol. 16 Issue 4, 2001 pp. 312-313, May 2001.
Kolodko et al. “Experimental System for Real-Time Motion Estimation”, Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003.
Komoriya et al., Planning of Landmark Measurement for the Navigation of a Mobile Robot, Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC pp. 1476-1481, Jul. 7-10, 1992.
Krotov, et al. “Digital Sextant”, Downloaded from the internet at: http://www.cs.cmu.edu/˜epk/ , 1 page, 1995.
Krupa et al. “Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing”, IEEE Transactions on Robotics and Automation, vol. 19, No. 5, pp. 842-853, Oct. 5, 2003.
Kuhl, et al. “Self Localization in Environments using Visual Angles”, VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004.
Friendly Robotics, 18 pages http://www.robotsandrelax.com/PDFs/RV400Manual.pdf accessed Dec. 22, 2011.
It's eye, 2003 www.hitachi.co.jp/rd/pdf/topics/hitac2003_10.pdf 2 pages.
Grumet “Robots Clean House”, Popular Mechanics, Nov. 2003, 3 pages.
Kurs et al, Wireless Power transfer via Strongly Coupled Magnetic Resonances, Downloaded from www.sciencemag.org , Aug. 17, 2007, 5 pages.
Franz, et al. “Biomimetic robot navigation”, Robotics and Autonomous Systems vol. 30 pp. 133-153, 2000.
Friendly Robotics “Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner”, www.friendlyrobotics.com/vac.htm. 5 pages Apr. 20, 2005.
Fuentes, et al. “Mobile Robotics 1994”, University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 7, 1994.
Fukuda, et al. “Navigation System based on Ceiling Landmark Recognition for Autonomous mobile robot”, 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems 95. ‘Human Robot Interaction and Cooperative Robots’, Pittsburgh, PA, pp. 1466-1471, Aug. 5-9, 1995.
Gat, Erann “Robust Low-Computation Sensor-driven Control for Task-Directed Navigation”, Proc of IEEE International Conference on robotics and Automation , Sacramento, CA pp. 2484-2489, Apr. 1991.
Gionis “A hand-held optical surface scanner for environmental Modeling and Virtual Reality”, Virtual Reality World, 16 pages 1996.
Goncalves et al. “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005.
Gregg et al. “Autonomous Lawn Care Applications”, 2006 Florida Conference on Recent Advances in Robotics, FCRAR 2006, pp. 1-5, May 25-26, 2006.
Hamamatsu “Si PIN Diode S5980, S5981 S5870—Multi-element photodiodes for surface mounting”, Hamamatsu Photonics, 2 pages Apr. 2004.
Hammacher Schlemmer “Electrolux Trilobite Robotic Vacuum” www.hammacher.com/publish/71579.asp? promo=xsells, 3 pages, Mar. 18, 2005.
Haralick et al. “Pose Estimation from Corresponding Point Data”, IEEE Transactions on systems, Man, and Cybernetics, vol. 19, No. 6, pp. 1426-1446, Nov. 1989.
Hausler “About the Scaling Behaviour of Optical Range Sensors”, Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 15-17, 1997.
Hitachi: News release: “The home cleaning robot of the autonomous movement type (experimental machine)”, www.i4u.com./japanreleases/hitachirobot.htm, 5 pages, Mar. 18, 2005.
Hoag, et al. “Navigation and Guidance in interstellar space”, ACTA Astronautica vol. 2, pp. 513-533 , Feb. 14, 1975.
Huntsberger et al. “CAMPOUT: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration”, IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, vol. 33, No. 5, pp. 550-559, Sep. 2003.
Iirobotics.com “Samsung Unveils Its Multifunction Robot Vacuum”, www.iirobotics.com/webpages/hotstuff.php? ubre=111, 3 pages, Mar. 18, 2005.
InMach “Intelligent Machines”, www.inmach.de/inside.html, 1 page, Nov. 19, 2008.
Borges et al. “Optimal Mobile Robot Pose Estimation Using Geometrical Maps”, IEEE Transactions on Robotics and Automation, vol. 18, No. 1, pp. 87-94, Feb. 2002.
Braunstingl et al. “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995.
Bulusu, et al. “Self Configuring Localization Systems: Design and Experimental Evaluation” ACM Transactions on Embedded Computing Systems vol. 3 No. 1 pp. 24-60, 2003.
Caccia, et al. “Bottom-Following for Remotely Operated Vehicles”, 5th IFAC conference, Aalborg, Denmark, pp. 245-250 Aug. 1, 2000.
Chae, et al. “StarLITE: A new artificial landmark for the navigation of mobile robots”, http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005.
Chamberlin et al. “Team 1: Robot Locator Beacon System” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 17, 2006.
Champy “Physical management of IT assets in Data Centers using RFID technologies”, RFID 2005 University, Oct. 12-14, 2005.
Chiri “Joystick Control for Tiny OS Robot”, http://eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf. 12 pages, Aug. 8, 2002.
Christensen et al. “Theoretical Methods for Planning and Control in Mobile Robotics” 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 21-27, 1997.
Andersen et al., “Landmark based navigation strategies”, SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, 1998.
Clerentin, et al. “A localization method based on two omnidirectional perception systems cooperation” Proc of IEEE International Conference on Robotics & Automation, San Francisco, CA vol. 2, pp. 1219-1224, Apr. 2000.
Corke “High Performance Visual Servoing for robots end-point control”, SPIE vol. 2056 Intelligent robots and computer vision 1993, 378-387.
Cozman et al. “Robot Localization using a Computer Vision Sextant”, IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995.
D'Orazio, et al. “Model based Vision System for mobile robot position estimation”, SPIE vol. 2058 Mobile Robots VIII, pp. 38-49, 1992.
De Bakker, et al. “Smart PSD—array for sheet of light range imaging”, Proc. of SPIE vol. 3965, pp. 1-12, May 15, 2000.
Desaulniers, et al. “An Efficient Algorithm to find a shortest path for a car-like Robot”, IEEE Transactions on robotics and Automation vol. 11 No. 6, pp. 819-828, Dec. 1995.
Dorfmüller-Ulhaas “Optical Tracking From User Motion to 3D Interaction”, http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002.
Dorsch, et al. “Laser Triangulation: Fundamental uncertainty in distance measurement”, Applied Optics, vol. 33 No. 7, pp. 1306-1314, Mar. 1, 1994.
Yuta, et al. “Implementation of an Active Optical Range sensor Using Laser Slit for In-Door Intelligent Mobile Robot”, IEEE/RSJ International workshop on Intelligent Robots and systems (IROS 91) vol. 1, Osaka, Japan, pp. 415-420, Nov. 3-5, 1991.
Doty et al. “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent”, AAAI 1993 Fall Symposium Series, Instantiating Real-World Agents, pp. 1-6. Oct. 22-24, 1993.
Electrolux Trilobite, Time to enjoy life, 38 pages http://www.robocon.co.kr/trilobite/Presentation—Trilobite—Kor—030104.ppt accessed Dec. 22, 2011.
Dudek, et al. “Localizing a Robot with Minimum Travel” Proceedings of the sixth annual ACM-SIAM symposium on Discrete algorithms, vol. 27 No. 2 pp. 583-604, Apr. 1998.
Dulimarta, et al. “Mobile Robot Localization in Indoor Environment”, Pattern Recognition, vol. 30, No. 1, pp. 99-111, 1997.
EBay “Roomba Timer -> Timed Cleaning—Floorvac Robotic Vacuum”, Cgi.ebay.com/ws/eBay|SAP|.dll?viewitem&category=43526&item=4375198387&rd=1, 5 pages, Apr. 20, 2005.
Electrolux “Welcome to the Electrolux trilobite” www.electroluxusa.com/node57.asp?currentURL=node142.asp%3F, 2 pages, Mar. 18, 2005.
Eren, et al. “Accuracy in position estimation of mobile robots based on coded infrared signal transmission”, Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995. IMTC/95. pp. 548-551, 1995.
Eren, et al. “Operation of Mobile Robots in a Structured Infrared Environment”, Proceedings. ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 19-21, 1997.
Everyday Robots “Everyday Robots: Reviews, Discussion and News for Consumers”, www.everydayrobots.com/index.php?option=content&task=view&id=9, Apr. 20, 2005, 7 pages.
Evolution Robotics “NorthStar—Low-cost Indoor Localization—How it Works”, E Evolution robotics, 2 pages, 2005.
Becker, et al. “Reliable Navigation Using Landmarks” IEEE International Conference on Robotics and Automation, 0-7803-1965-6, pp. 401-406, 1995.
Benayad-Cherif, et al., “Mobile Robot Navigation Sensors” SPIE vol. 1831 Mobile Robots, VII, pp. 378-387, 1992.
Facchinetti, Claudio et al. “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation”, ICARCV '94, vol. 3 pp. 1694-1698, 1994.
Betke, et al., “Mobile Robot localization using Landmarks” Proceedings of the IEEE/RSJ/GI International Conference on Intelligent Robots and Systems '94 “Advanced Robotic Systems and the Real World” (IROS '94), 8 pages.
Facchinetti, Claudio et al. “Self-Positioning Robot Navigation Using Ceiling Images Sequences”, ACCV '95, 5 pages, Dec. 5-8, 1995.
Fairfield, Nathaniel et al. “Mobile Robot Localization with Sparse Landmarks”, SPIE vol. 4573 pp. 148-155, 2002.
Favre-Bulle, Bernard “Efficient tracking of 3D—Robot Position by Dynamic Triangulation”, IEEE Instrumentation and Measurement Technology Conference IMTC 98 Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 18-21, 1998.
Fayman “Exploiting Process Integration and Composition in the context of Active Vision”, IEEE Transactions on Systems, Man, and Cybernetics—Part C: Application and reviews, vol. 29 No. 1, pp. 73-86, Feb. 1999.
Bison, P et al., “Using a structured beacon for cooperative position estimation” Robotics and Autonomous Systems vol. 29, No. 1, pp. 33-40, Oct. 1999.
Blaasvaer, et al. “AMOR—An Autonomous Mobile Robot Navigation System”, Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994.
Wolf et al. “Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization”, IEEE Transactions on Robotics, vol. 21, No. 2, pp. 208-216, Apr. 2005.
Wong “EIED Online>>Robot Business”, ED Online ID# 13114, 17 pages, Jul. 2006.
Yamamoto et al. “Optical Sensing for Robot Perception and Localization”, 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 14-17, 2005.
Yata et al. “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer”, Proceedings of the 1998 IEEE, International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998.
Yun, et al. “Image-Based Absolute Positioning System for Mobile Robot Navigation”, IAPR International Workshops SSPR, Hong Kong, pp. 261-269, Aug. 17-19, 2006.
Yun, et al. “Robust Positioning a Mobile Robot with Active Beacon Sensors”, Lecture Notes in Computer Science, 2006, vol. 4251, pp. 890-897, 2006.
Zha et al. “Mobile Robot Localization Using Incomplete Maps for Change Detection in a Dynamic Environment”, Advanced Intelligent Mechatronics '97. Final Program and Abstracts., IEEE/ASME International Conference, pp. 110, Jun. 16-20, 1997.
Zhang, et al. “A Novel Mobile Robot Localization Based on Vision”, SPIE vol. 6279, 6 pages, Jan. 29, 2007.
Remazeilles, et al. “Image based robot navigation in 3D environments”, Proc. of SPIE, vol. 6052, pp. 1-14, Dec. 6, 2005.
Rives, et al. “Visual servoing based on ellipse features”, SPIE vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993.
Robotics World Jan. 2001: “A Clean Sweep” (Jan. 2001), 5 pages.
Ronnback “On Methods for Assistive Mobile Robots”, http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html, 218 pages, Jan. 1, 2006.
Roth-Tabak, et al. “Environment Model for mobile Robots Indoor Navigation”, SPIE vol. 1388 Mobile Robots pp. 453-463, 1990.
Sadath M Malik et al. “Virtual Prototyping for Conceptual Design of a Tracked Mobile Robot”. Electrical and Computer Engineering, Canadian Conference on, IEEE, PI. May 1, 2006, pp. 2349-2352.
Sahin, et al. “Development of a Visual Object Localization Module for Mobile Robots”, 1999 Third European Workshop on Advanced Mobile Robots, (Eurobot '99), pp. 65-72, 1999.
Salomon, et al. “Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing”, IEEE Conference on Emerging Technologies and Factory Automation, 2006. (ETFA '06), pp. 629-632, Sep. 20-22, 2006.
Sato “Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter”, Proceedings International Conference on Image Processing, vol. 1., Lausanne, Switzerland, pp. 33-36, Sep. 16-19, 1996.
Schenker, et al. “Lightweight rovers for Mars science exploration and sample return”, Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997.
Schofield Monica “Neither Master nor slave” A Practical Study in the Development and Employment of Cleaning Robots, Emerging Technologies and Factory Automation, 1999 Proceedings ETFA '99 1999 7th IEEE International Conference on Barcelona, Spain, Oct. 18-21, 1999, pp. 1427-1434.
Shimoga et al. “Touch and Force Reflection for Telepresence Surgery”, Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994.
Sim, et al “Learning Visual Landmarks for Pose Estimation”, IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 10-15, 1999.
Sobh et al. “Case Studies in Web-Controlled Devices and Remote Manipulation”, Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 10, 2002.
Stella, et al. “Self-Location for Indoor Navigation of Autonomous Vehicles”, Part of the SPIE conference on Enhanced and Synthetic Vision SPIE vol. 3364 pp. 298-302, 1998.
Summet “Tracking Locations of Moving Hand-held Displays Using Projected Light”, Pervasive 2005, LNCS 3468 pp. 37-46 (2005).
Svedman et al. “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping”, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2993-2998, 2005.
Takio et al. “Real-Time Position and Pose Tracking Method of Moving Object Using Visual Servo System”, 47th IEEE International Symposium on Circuits and Systems, pp. 167-170, 2004.
Teller “Pervasive pose awareness for people, Objects and Robots”, http://www.ai.mit.edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 6 pages, Apr. 30, 2003.
Terada et al. “An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning”, 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 21-23, 1998.
The Sharper Image “E Vac Robotic Vacuum”, www.sharperiamge.com/us/en/templates/products/pipmorework1printable.jhtml, 2 pages, Mar. 18, 2005.
TheRobotStore.com “Friendly Robotics Robotic Vacuum RV400—The Robot Store”, www.therobotstore.com/s.nl/sc.9/category.-109/it.A/id.43/.f, 1 page, Apr. 20, 2005.
TotalVac.com RC3000 RoboCleaner website Mar. 18, 2005, 3 pages.
Trebi-Ollennu et al. “Mars Rover Pair Cooperatively Transporting a Long Payload”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 3136-3141, May 2002.
Tribelhorn et al., “Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education,” 2007, IEEE, p. 1393-1399.
Tse et al. “Design of a Navigation System for a Household Mobile Robot Using Neural Networks”, Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998.
UAMA (Asia) Industrial Co., Ltd. “RobotFamily”, 2005, 1 page.
Watanabe et al. “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique”, 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 13-18, 1990.
Watts "Robot, boldly goes where no man can", The Times, p. 20, Jan. 1985.
Wijk et al. “Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking”, IEEE Transactions on Robotics and Automation, vol. 16, No. 6, pp. 740-752, Dec. 2000.
Wolf et al. “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 359-365, May 2002.
Robot Buying Guide, LG announces the first robotic vacuum cleaner for Korea, Apr. 21, 2003, http://robotbg.com/news/2003/04/22/lg_announces_the_first_robotic_vacu, 1 page.
Taipei Times, Robotic vacuum by Matsushita about to undergo testing, Mar. 26, 2002, http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338, accessed, 2 pages.
Special Reports, Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone, vol. 59, No. 9 (2004), 3 pages, http://www.toshiba.co.jp/tech/review/2004/09/59_0.
Lambrinos, et al. “A mobile robot employing insect strategies for navigation”, http://www8.cs.umu.se/kurser/TDBD17/VT04/dl/Assignment%20Papers/lambrinos-RAS-2000.pdf, 38 pages, Feb. 19, 1999.
Lang et al. “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994.
Lapin, “Adaptive position estimation for an automated guided vehicle”, SPIE vol. 1831 Mobile Robots VII, pp. 82-94, 1992.
Morland, "Autonomous Lawnmower Control", Downloaded from the internet at: http://cns.bu.edu/~cjmorlan/robotics/lawnmower/report.pdf, 10 pages, Jul. 24, 2002.
LaValle et al. “Robot Motion Planning in a Changing, Partially Predictable Environment”, 1994 IEEE International Symposium on Intelligent Control, Columbus, OH, pp. 261-266, Aug. 16-18, 1994.
Lee, et al. “Localization of a Mobile Robot Using the Image of a Moving Object”, IEEE Transaction on Industrial Electronics, vol. 50, No. 3 pp. 612-619, Jun. 2003.
Lee, et al. “Development of Indoor Navigation system for Humanoid Robot Using Multi-sensors Integration”, ION NTM, San Diego, CA pp. 798-805, Jan. 22-24, 2007.
Leonard, et al. “Mobile Robot Localization by tracking Geometric Beacons”, IEEE Transaction on Robotics and Automation, vol. 7, No. 3 pp. 376-382, Jun. 1991.
Li et al. “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar”, Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999.
Lin, et al. "Mobile Robot Navigation Using Artificial Landmarks", Journal of Robotic Systems 14(2), pp. 93-106, 1997.
Linde "Dissertation: On Aspects of Indoor Localization", https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 28, 2006.
Lumelsky, et al. “An Algorithm for Maze Searching with Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA vol. 1, pp. 111-116, 1994.
Luo et al., "Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots," 2002, IEEE, pp. 2359-2364.
Ma “Thesis: Documentation on Northstar”, California Institute of Technology, 14 pages, May 17, 2006.
Madsen, et al. “Optimal landmark selection for triangulation of robot position”, Journal of Robotics and Autonomous Systems vol. 13 pp. 277-292, 1998.
Matsutek Enterprises Co. Ltd “Automatic Rechargeable Vacuum Cleaner”, http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10 . . . , Apr. 23, 2007, 3 pages.
McGillem, et al. "Infra-red Location System for Navigation and Autonomous Vehicles", 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 24-29, 1988.
McGillem, et al. "A Beacon Navigation Method for Autonomous Vehicles", IEEE Transactions on Vehicular Technology, vol. 38, No. 3, pp. 132-139, Aug. 1989.
Michelson “Autonomous Navigation”, 2000 Yearbook of Science & Technology, McGraw-Hill, New York, ISBN 0-07-052771-7, pp. 28-30, 1999.
Miro, et al. “Towards Vision Based Navigation in Large Indoor Environments”, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 2096-2102, Oct. 9-15, 2006.
MobileMag “Samsung Unveils High-tech Robot Vacuum Cleaner”, http://www.mobilemag.com/content/100/102/C2261/, 4 pages, Mar. 18, 2005.
Monteiro, et al. "Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters", Proceedings of the IECON '93 International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 15-19, 1993.
Moore, et al. "A Simple Map-based Localization Strategy Using Range Measurements", SPIE vol. 5804, pp. 612-620, 2005.
Munich et al. “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006.
Munich et al. “ERSP: A Software Platform and Architecture for the Service Robotics Industry”, Intelligent Robots and Systems, 2005. (IROS 2005), pp. 460-467, Aug. 2-6, 2005.
Nam, et al. “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999.
Nitu et al. "Optomechatronic System for Position Detection of a Mobile Mini-Robot", IEEE Transactions on Industrial Electronics, vol. 52, No. 4, pp. 969-973, Aug. 2005.
On Robo "Robot Reviews Samsung Robot Vacuum (VC-RP30W)", www.onrobo.com/reviews/AT_Home/vacuum_cleaners/on00vcrb30rosam/index.htm . . . 2 pages, 2005.
OnRobo "Samsung Unveils Its Multifunction Robot Vacuum", www.onrobo.com/enews/02101samsung_vacuum.shtml, 3 pages, Mar. 18, 2005.
Pages et al. “Optimizing Plane-to-Plane Positioning Tasks by Image-Based Visual Servoing and Structured Light”, IEEE Transactions on Robotics, vol. 22, No. 5, pp. 1000-1010, Oct. 2006.
Pages et al. “A camera-projector system for robot positioning by visual servoing”, Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW06), 8 pages, Jun. 17-22, 2006.
Pages, et al. “Robust decoupled visual servoing based on structured light”, 2005 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005.
Park et al. "A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors," IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on Neural Networks, Orlando, Florida, pp. 2754-2758, Jun. 27-Jul. 2, 1994.
Park, et al. "Dynamic Visual Servo Control of Robot Manipulators using Neural Networks", The Korean Institute of Telematics and Electronics, vol. 29-B, No. 10, pp. 771-779, Oct. 1992.
Paromtchik, et al. “Optical Guidance System for Multiple mobile Robots”, Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940 (May 21-26, 2001).
Penna, et al. "Models for Map Building and Navigation", IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, No. 5, pp. 1276-1301, Sep./Oct. 1993.
Pirjanian “Reliable Reaction”, Proceedings of the 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 158-165, 1996.
Pirjanian “Challenges for Standards for consumer Robotics”, IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 12-15, 2005.
Pirjanian et al. “Distributed Control for a Modular, Reconfigurable Cliff Robot”, Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002.
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems”, Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 29-Nov. 3, 2001.
Pirjanian et al. “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination”, Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000.
Pirjanian et al. “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106, Nov. 8-9, 1999.
Pirjanian et al. “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes”, Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430, Apr. 1997.
McLurkin “The Ants: A community of Microrobots”, Paper submitted for requirements of BSEE at MIT, May 21, 1995, 60 pages.
McLurkin Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots, Paper submitted for requirements of BSEE at MIT, May 2004, 127 pages.
Kurth, "Range-Only Robot Localization and SLAM with Radio", http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf, 60 pages, May 2004, accessed Jul. 27, 2012.
Florbot GE Plastics, 1989-1990, 2 pages, available at http://www.fuseid.com/, accessed Sep. 27, 2012.
Related Publications (1)
Number Date Country
20100082193 A1 Apr 2010 US
Provisional Applications (1)
Number Date Country
60586046 Jul 2004 US
Continuations (2)
Number Date Country
Parent 11176048 Jul 2005 US
Child 12415554 US
Parent 11176048 US
Child 12415512 US
Continuation in Parts (2)
Number Date Country
Parent 12415554 Mar 2009 US
Child 12611814 US
Parent 12415512 Mar 2009 US
Child 12415554 US