This application claims the benefit under 35 U.S.C. §119(e) of U.S. Provisional Application No. 60/557,252, filed Mar. 29, 2004, the entirety of which is hereby incorporated by reference.
Appendix A, which forms a part of this disclosure, is a list of commonly owned co-pending U.S. patent applications. Each one of the co-pending applications listed in Appendix A is hereby incorporated herein in its entirety by reference thereto.
1. Field of the Invention
This invention generally relates to the estimation of position and orientation of an object with respect to a local or a global coordinate system. In particular, the invention relates to a method and apparatus that provide estimation and tracking of the position and orientation. The method and apparatus can be used in vehicles, such as in mobile robots.
2. Description of the Related Art
Position estimation is a topic of interest for a wide variety of application areas, including autonomous mobile robots, ubiquitous computing, mobile devices, tracking of assets, tracking of people, position tracking of customers in a store, tracking of pets, position of nodes in ad hoc wireless networks, position tracking of vehicles, and position tracking of mobile devices such as cell phones, personal digital assistants, and the like.
Robots are becoming more and more commonplace in society. It will be understood that these robots can be embodied in a variety of forms, such as in automated floor care products such as vacuum cleaners. A variety of applications can be found for mobile robots, such as, but not limited to, entertainment applications such as toy robots, healthcare applications such as elderly care robots, patrolling and security applications, telepresence robots, cleaning applications such as floor cleaning robots, utility applications in environments that are unfriendly to humans such as space, deep water, cold temperature, radiation, chemical exposure, biohazards, etc., dangerous tasks such as defusing of potential explosives, operation in confined spaces such as collapsed buildings, and the performance of menial tasks such as cleaning. Mobile robots, robots that can move from one location to another, often use knowledge of their position relative to their environment.
Localization techniques refer to processes by which a robot determines its position and orientation relative to a reference coordinate system. The reference coordinate system can be either local (for example, relative to an object of interest) or global. Position estimation can include estimation of any quantity that is related to at least some of an object's six degrees of freedom in three dimensions (3-D). These six degrees of freedom can be described as the object's (x, y, z) position and its angles of rotation around each axis of a 3-D coordinate system, which angles are denoted α, β, and θ and respectively termed "pitch," "roll," and "yaw." Such position estimation can be useful for various tasks and applications. For example, the bearing of a robot relative to a charging station can be useful for allowing the robot to servo to the charging station and recharge its batteries autonomously. The estimation of the distance of a pet from the front door can be used to alert the owner about a possible problem. For indoor environments, it is typically desired to track the (x, y) position of an object in a two-dimensional (2-D) floor plane and its orientation, θ, relative to an axis normal to the floor plane. That is, it can be convenient to assume that the z coordinate of the robot, as well as the robot's roll and pitch angles, are constant. The (x, y) position and the θ orientation of an object are referred to together as the pose of the object.
Numerous devices, processes, sensors, equipment, and mechanisms have been proposed for position estimation. These methods can be divided into two main categories. One category uses beacons in the environment to enable position estimation, and the second category uses natural landmarks in the environment. Because the method and apparatus described herein fall into the first category of beacon-based position estimation or localization, this section will focus on beacon-based localization methods.
Beacons are artificial devices in the environment that can be detected by an appropriate sensing apparatus. Beacons can be passive or active. Examples of passive beacons include retroreflective materials. By projecting a light source onto a retroreflective material, one can create a signature or signal that can be detected readily using one or more appropriate optical sensors. Using the signature or signal, the one or more sensors can determine their positions relative to the beacons and/or relative to the environment.
Active optical beacons emit light that can be detected by an optical sensor. The optical sensor can measure various characteristics of the emitted light, such as the distance to the emitter (using time-of-flight), the bearing to the emitter, the signal strength, and the like. Using such characteristics, one can estimate the position of the sensor using an appropriate technique, such as triangulation or trilateration. These approaches, which use active optical beacons paired with optical sensors, are disadvantageously constrained by line-of-sight between the emitters and the sensors. Without line-of-sight, a sensor will not be able to detect the emitter.
Embodiments described herein are related to methods and apparatus for the estimation of the position and orientation of a device, such as a robot, relative to a global or local reference frame. The apparatus described herein comprises an optical sensor, an optical emitter, and associated signal processing circuitry. The poses of the sensors are typically unknown, and the apparatus and methods described herein can be used to measure or estimate the pose of at least one sensor and the position of an emitter projection on a surface.
A typical application of the method and apparatus includes estimation and tracking of the position of a mobile autonomous robot. Other applications include estimation and tracking of an object for position-aware, ubiquitous devices. Additional applications include tracking of the positions of people or pets in an indoor environment. The methods and apparatus comprise one or more optical emitters, one or more optical sensors, signal processing circuitry, and signal processing methods to determine the position and orientation of at least one of the optical sensors based at least in part on the detection of the signal of one or more emitted light sources reflected from a surface.
These and other features of the invention will now be described with reference to the drawings summarized below. These drawings (not to scale) and the associated descriptions are provided to illustrate preferred embodiments of the invention and are not intended to limit the scope of the invention.
Pose: A pose is a position and orientation in space. In three dimensions, pose can refer to a position (x, y, z) and an orientation (α, β, θ) with respect to the axes of the three-dimensional space. In two dimensions, pose can refer to a position (x, y) in a plane and an orientation θ relative to the normal to the plane.
Optical sensor: An optical sensor is a sensor that uses light to detect a condition and describe the condition quantitatively. In general, an optical sensor refers to a sensor that can measure one or more physical characteristics of a light source. Such physical characteristics can include the number of photons, the position of the light on the sensor, the color of the light, and the like.
Position-sensitive detector: A position-sensitive detector, also known as a position sensing detector or a PSD, is an optical sensor that can measure the centroid of an incident light source, typically in one or two dimensions. For example, a PSD can convert an incident light spot into relatively continuous position data.
Imager: An imager refers to an optical sensor that can measure light on an active area of the sensor and can measure optical signals along at least one axis or dimension. For example, a photo array can be defined as a one-dimensional imager, and a duo-lateral PSD can be defined as a two-dimensional imager.
Camera: A camera typically refers to a device including one or more imagers, one or more lenses, and associated support circuitry. Optionally, a camera can also include one or more optical filters and a housing or casing.
PSD camera: A PSD camera is a camera that uses a PSD.
Projector: A projector refers to an apparatus that projects light. A projector includes an emitter, a power source, and associated support circuitry. A projector can project one or more light spots on a surface.
Spot: A spot refers to a projection of light on a surface. A spot can correspond to an entire projection, or can correspond to only part of an entire projection.
Optical position sensor: An optical position sensor is a device that includes one or more cameras, a signal processing unit, a power supply, and support circuitry and can estimate its position, distance, angle, or pose relative to one or more spots.
Although these methods and apparatus will be described in terms of certain preferred embodiments, other embodiments that are apparent to those of ordinary skill in the art, including embodiments that do not provide all of the benefits and features set forth herein, are also within the scope of the invention.
Embodiments advantageously use active optical beacons in position estimation. Advantageously, disclosed techniques minimize or reduce the line-of-sight limitation of conventional active optical beacon-based localization by projecting the light sources onto a surface that is observable from a relatively large portion of the environment. It will be understood that the light sources can include sources of light that are not visible to the naked eye, such as, for example, infrared (IR) sources. For example, in an indoor environment, it can be advantageous to project the emitted light from the beacon onto the ceiling. In many indoor environments, the ceiling of a room is observable from most locations within the room.
As an illustration, one can consider an application of the method and apparatus for an autonomous mobile robot such as a robotic vacuum cleaner. A common approach to self-docking and self-charging is to place active infrared (IR) beacons on the charging station, which the robot can sense with photo detectors, and use the associated sensory information to find the docking station. This approach suffers from line-of-sight limitations. If the robot and the docking station do not have line-of-sight separation, the robot cannot find its position relative to the docking station.
In one embodiment, the IR emitter can advantageously be placed in such a way that it projects onto the ceiling above the docking station, and a robot can have a photo detector that generally faces the ceiling or is capable of observing the ceiling. The robot can advantageously observe the IR projection on the ceiling even in the absence of line-of-sight separation between the robot and the docking station. In relatively many situations, the robot has a line-of-sight view of the ceiling, which enables the robot to detect the IR projection and move to the docking station for self-charging.
The method and apparatus described herein include numerous variations that differ in the type and number of active beacons used, differ in the type and number of optical sensors used for detection of reflected light, and differ in the type of signal processing used to determine the pose of an object. Embodiments of the method and apparatus include systems for estimation of the distance of an object relative to another object, estimation of the bearing of an object relative to another object, estimation of the (x, y) position of an object in a two-dimensional plane, estimation of the (x, y, z) position of an object in three-dimensional space, estimation of the position and orientation of an object in two dimensions or in three dimensions, estimation of the linear or angular velocity of an object, and estimation of the linear or angular acceleration of an object.
Embodiments of the method and apparatus are related to estimation of the position and orientation of a device, such as a mobile robot, relative to a global or a local coordinate system. The apparatus includes one or more optical sensors, one or more optical emitters, and signal processing circuitry. The initial position and orientations of the sensors can be unknown, and the apparatus and methods can be used to measure or estimate the position and orientation of one or more of the sensors and the position of the emitter projections on a surface.
In one embodiment, an optical sensor measures the optical signals generated by the optical emitters that are within the sensor's field of view by measuring the light that is projected onto a surface. By contrast, in a conventional system, such optical devices for distance or position measurement disadvantageously require line-of-sight between the emitter and the sensor. Advantageously, embodiments described herein can detect optical signals projected onto a surface, such as a ceiling of an indoor environment. The optical emitters can be configured to project one or more spots of light onto a surface that is observable by a sensor from a relatively large portion of the environment. The sensor detects the spot and estimates the sensor's position relative to the spot. The sensor can measure quantities such as the position of the spot in the sensor's reference frame and the intensity of the signal generated by the spot, and can associate a unique identifier with each spot. Each such measurement or set of measurements defines a relationship between the position of the sensor and the position of the spot. Using multiple such relationships defined between one or more sensors and one or more spots, signal processing circuitry can estimate the pose of at least one of the sensors, and, optionally, the position of one or more spots.
Embodiments of the method and apparatus described herein can vary in the number and type of optical sensors used, can vary in the number and type of optical emitters used, can vary in the projection of the light onto the sensor via, optionally, one or more spots, and can vary in the methods used for estimation of the distance, heading, position, orientation, velocity, angular velocity, acceleration, and angular acceleration of the sensor or sensors. For example, a light spot can be generated by an IR emitter that projects IR light onto a surface, and a photo detector can be used to detect the light reflected from the surface. With one emitter and one sensor, the distance and relative heading to the projected light can be measured. With two emitters and one two-dimensional sensor, the position of the sensor in a plane and the rotation of the sensor around an axis normal to that plane can be measured.
Embodiments of the method and apparatus described herein can use a wide variety of optical sensors. Some embodiments use digital or analog imaging or video cameras, such as CMOS imagers, CCD imagers, and the like. Other embodiments use PSDs, such as one-dimensional PSDs, angular one-dimensional PSDs, two-dimensional PSDs, quad PSDs, duo-lateral PSDs, tetra-lateral PSDs, and the like. Other embodiments use photo detectors.
In one embodiment, the optical sensor is combined with a lens and one or more optical filters to form a camera. For example, a PSD sensor can be enclosed in a casing with an open side that fits the lens and optical filters to filter incoming light and reduce effects of ambient light.
Embodiments of the method and apparatus described herein can also use a wide variety of optical emitters, including visible light devices, invisible light devices, laser light devices, infrared light devices, polarized light devices, light-emitting diodes (LEDs), laser diodes, light bulbs, halogen lights, projectors, and the like.
One embodiment of the method and apparatus described herein uses one two-dimensional PSD camera and a plurality of infrared (IR) emitters. Each IR emitter projects a spot onto the ceiling in a room. Each emitter is modulated with a unique pattern or frequency. The PSD camera is mounted on a robot, for example, and faces the ceiling in such a way that its field of view intersects at least a portion of the plane that defines the ceiling onto which the spots are projected. The PSD camera provides an indication of the projected position of each observable spot in the camera sensor coordinates. In the illustrated embodiment, the position of each observed spot is defined as its centroid.
A camera position of each observed spot can correspond to the projection of a spot's position onto the image plane of the camera as defined by a corresponding perspective transformation. The PSD camera can measure the camera position of each spot. Using the measured camera positions of the spot and information related to the distance between the spots, the position (x, y) of the PSD camera in one plane and the rotation (θ) of the PSD camera around an axis normal to that plane can be determined. The position and orientation of the camera defined by (x, y, θ) is known as the pose of the camera.
For example, the PSD camera can be coupled to a mobile device such as a robot, and the device's pose can advantageously be relatively accurately determined within a room with two or more spots. Pose estimation, also known as localization, is an important component in many applications, including automated vacuuming, automated floor cleaning, telepresence, security, and entertainment. Without accurate position estimates, it is relatively difficult or impossible for a conventional robot to execute a path or trajectory because the conventional robot's internal position estimate tends to drift, and the conventional robot is generally unable to measure or account for the drift. For systematic floor coverage in a robotic vacuum cleaner, for example, a conventional robot without the ability to localize generally cannot maintain knowledge of the areas it has cleaned and the areas it has not cleaned, and the robot is therefore relatively likely to clean the same areas repeatedly and inefficiently and is relatively unlikely to clean other areas with sufficient frequency. Accordingly, many conventional robotic vacuum cleaners execute a random trajectory. By contrast, a robotic vacuum cleaner according to an embodiment with the ability to localize in a relatively accurate manner can follow a relatively efficient planned path. A robotic vacuum cleaner according to an embodiment can clean a room in a relatively efficient manner because it can track its path and can execute a planned, traversable path. Similarly, a mobile robot with the ability to localize can navigate to a desirable location and maintain a history of paths that it has taken.
Another embodiment of the method and apparatus described herein uses one two-dimensional PSD camera and one IR emitter. The IR emitter projects a spot on the ceiling, and the PSD camera faces the ceiling such that its field of view intersects at least a portion of the plane that defines the ceiling onto which the spot is projected. The PSD camera can provide measurements of the distance from the camera to the spot and of the heading from the camera to the spot relative to the tangent of the circle with radius defined by the distance measurement. The distance measurement defines a circle, centered at the projection of the spot onto the plane of the camera, on which the camera lies. In one example, the illustrated embodiment can be used for an application in which it is desired to position a device relative to the spot. Advantageously, when the camera is directly underneath the spot on the ceiling, the projection of the spot falls at the center of the PSD sensor. For example, if the spot is projected over a charging station, a mobile device can approach the charging station and recharge autonomously. In a related embodiment that further comprises wheel encoders, a robotic vacuum cleaner can move along concentric circles or move along a spiral to implement a floor coverage strategy that is relatively efficient, compared to a random coverage strategy.
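By way of a rough sketch only, the following Python fragment illustrates how such a concentric-circle sweep could use the measured distance to the spot as feedback. The sensor and drive interfaces (measure_distance_to_spot, set_wheel_speeds) and all timing and gain values are hypothetical placeholders, not part of this disclosure.

```python
import time

def sweep_concentric_circles(measure_distance_to_spot, set_wheel_speeds,
                             r_start=0.25, r_step=0.25, r_max=3.0,
                             base_speed=0.2, gain=0.5, lap_time=20.0, dt=0.05):
    """Cover the floor in circles of increasing radius around the spot.

    measure_distance_to_spot() -> current horizontal distance ||S|| in meters
    set_wheel_speeds(v_left, v_right) -> differential drive command in m/s
    """
    r_target = r_start
    while r_target <= r_max:
        t_end = time.monotonic() + lap_time  # drive roughly one lap
        while time.monotonic() < t_end:
            error = measure_distance_to_spot() - r_target
            # Steer inward when too far from the spot, outward when too close.
            set_wheel_speeds(base_speed + gain * error,
                             base_speed - gain * error)
            time.sleep(dt)
        r_target += r_step  # step outward to the next circle
```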
While various embodiments have been and will be further described in the context of autonomous mobile robots, it will be understood by those of ordinary skill in the art that the principles and advantages set forth herein are applicable to other applications that benefit from position estimation, which are also within the scope of the invention.
Examples of embodiments will now be described.
The projector 111 includes a light source 102. By way of example, the light source 102 can correspond to a device, such as a laser device, an infrared device, and the like, that can be modulated by a modulator 101. Optionally, the light from the light source 102 can pass through one or more lenses 103 to project the light onto the surface 116.
The optical position sensor 112 includes a camera 117 and a processing unit 118. The camera 117 can detect and measure the intensity and position of the light 114 reflected from the surface 116 and can generate corresponding signals that are processed by the signal processing unit 118 to estimate the position of the optical position sensor 112 relative to the projected light pattern 119. It will be understood that the optical position sensor 112 can include multiple cameras 117 and/or multiple processing units 118.
The camera 117 includes an imager 104. The imager 104 can, for example, correspond to a CMOS imager, a CCD imager, an infrared imager, and the like. The camera can optionally include an optical filter 105 and can optionally include a lens 106. The lens 106 can correspond to a normal lens or can correspond to a special lens, such as a wide-angle lens, a fish-eye lens, an omni-directional lens, and the like. Further, the lens 106 can include reflective surfaces, such as planar, parabolic, or conical mirrors, which can be used to provide a relatively large field of view or multiple viewpoints. The lens 106 collects the reflected light 114 and projects it onto the imager 104. The optical filter 105 can constrain the wavelengths of light that pass from the lens 106 to the imager 104, which can advantageously be used to reduce the effect of ambient light, to narrow the range of light to match the wavelength of the light coming from the projector 111, and/or to limit the amount of light projected onto the imager 104, which can limit the effects of over-exposure or saturation. The filter 105 can be placed in front of the lens 106 or behind the lens 106. It will be understood that the camera 117 can include multiple imagers 104, multiple optical filters 105, and/or multiple lenses 106.
The signal processing unit 118 can include analog components and can include digital components for processing the signals generated by the camera 117. The major components of the signal processing unit 118 preferably include an amplifier 107, a filter 108, an analog-to-digital converter 109, and a microprocessor 110, such as a peripheral interface controller, also known as a PIC. It will be understood that the signal processing unit 118 can include multiple filters 108 and/or multiple microprocessors 110.
Embodiments of the apparatus are not constrained to the specific implementations of the projector 111 or the optical position sensor 112 described herein. Other implementations, embodiments, and modifications of the apparatus that do not depart from the true spirit and scope of the apparatus will be readily apparent to one of ordinary skill in the art.
In one embodiment, it is convenient to define the Y axis such that the Y axis is parallel to the vector originating at the point w1 301 and passing through the point w2 302. Additionally, it is convenient to define the X axis such that the X axis is perpendicular to the Y axis and lies in the plane defined by the floor. Further, it is convenient to define the Z axis such that the positive Z direction is specified by the cross product of the unit vector in the X direction with the unit vector in the Y direction; in standard vector calculus notation, this relationship is expressed as Ẑ = X̂ × Ŷ. Thus, in the illustrated coordinate system, the Z axis is normal to the floor plane and is directed from the floor to the ceiling. In the global coordinate system, an origin O is defined as the point having coordinates (0, 0, 0). Also in the global coordinate system, the point w1 301 is defined as having coordinates (x1, y1, h), and the point w2 302 is defined as having coordinates (x2, y2, h). Further, it is convenient to assume that the origin O is located directly below the point w1 301, so that x1=0 and y1=0. Additionally, the definition of the X axis implies that x2=0 as well. Thus, the point w1 301 has the coordinates (0, 0, h), and the point w2 302 has the coordinates (0, y2, h). It will be understood that the aforementioned definitions can be made with no loss of generality.
A coordinate system relative to an imager is defined with a u axis, a v axis, and a z axis and can be referred to as the camera coordinate system or the camera reference frame. In the illustrated embodiment, the imager corresponds to a two-dimensional PSD sensor. In one embodiment, the height of the PSD sensor off the floor plane is relatively small compared to the ceiling height h, so the PSD sensor and the origin of the camera coordinate system use the coordinates (x, y, 0) and the orientation θ in the global coordinate system. The displacement from the origin of the global coordinate system to the origin of the camera coordinate system is denoted S; thus, ∥S∥ = √(x² + y²), where ∥S∥ denotes the norm, or magnitude, of the vector S. The point c1 311 represents the projection of the point w1 301 onto the imager, and the point c2 312 represents the projection of the point w2 302 onto the imager. The point c1 311 has the coordinates (u1, v1, 0) in the camera reference frame, and the point c2 312 has the coordinates (u2, v2, 0) in the camera reference frame. It will be understood that the aforementioned definitions can be made with no loss of generality.
In one embodiment, the spots 204, 205 can be identified using unique signals or unique signatures. For example, the emitters that produce the spots 204, 205 can be on-off modulated with different frequencies. The emitter that produces the first spot 204 can be modulated with a first frequency f1, and the emitter that produces the second spot 205 can be modulated with a second frequency f2, wherein the first frequency and the second frequency are different; that is, f1≠f2.
At this point, it should be noted that the ceiling height h and the separation y2 between the point w1 301 and the point w2 302 can be determined in a variety of ways. For example, if the mobile robot 201 using the optical position sensor is capable of producing wheel odometry estimates, then the robot 201 can estimate h and y2 using measurements or observations of the points w1 301 and w2 302 from multiple positions. Other appropriate techniques will be readily determined by one of ordinary skill in the art.
Exemplary Position Estimation Using the Method and Apparatus
With reference to the coordinate systems, distances, angles, and points described earlier, position estimation can proceed as follows.
In one embodiment, the PSD measures the coordinates of the centroid of the light projected onto the PSD by generating electrical current proportional to the position and intensity of the light centroid. The associated processing can be accomplished in a wide variety of ways, including analog circuitry, digital circuits, hardware, software, firmware, and combinations thereof. For example, a microcontroller, a microprocessor, a CPU, a general-purpose digital signal processor, dedicated hardware, and the like can be used.
To measure the centroids of multiple spots, a number of conditions are preferable. First, the sensor preferably does not become saturated with light, ambient or otherwise. In one embodiment, this is accomplished by using optical filters to reduce or minimize unwanted light sources that project onto the active area of the PSD sensor and by biasing the PSD to increase the light level at which it becomes saturated. Second, to measure the position of a particular light source reliably, it is preferable to isolate the light source from other light sources by reducing or eliminating the effect of other light sources, which can include ambient light and light generated by other spots. One approach to isolating a light source is to modulate the light source with a unique pattern such that it is distinguished from other light sources. If the i-th emitter on-off modulates the projected light with a frequency fi, the signal generated by the i-th emitter can be extracted by filtering the PSD output using a band-pass filter with lower and upper cutoff frequencies of fi−w and fi+w, respectively, where 2w corresponds to the width of the corresponding band-pass filter. The signal processing unit of the PSD can use the filter to suppress signals with frequencies outside the frequency range defined by the band-pass filter. The filtering of the PSD signal can occur either before or after the PSD currents are converted into associated centroid positions. In one embodiment, where the first emitter is modulated at a frequency f1 and the second emitter is modulated at a frequency f2, and wherein f1≠f2, the signal processing unit filters the signal specified by f1 to measure c1, the centroid of the first spot, and filters the signal specified by f2 to measure c2, the centroid of the second spot.
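As a minimal sketch of this demodulation step, assuming the raw PSD currents have already been converted to a sampled signal, the following Python fragment band-pass filters the signal around one emitter's modulation frequency. The sampling rate, filter order, filter width, and signal names are illustrative assumptions, not specified by this disclosure.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def extract_spot_signal(signal, f_i, w, fs):
    """Isolate the component of `signal` modulated at frequency f_i.

    signal: sampled PSD output (e.g., one centroid coordinate vs. time)
    f_i:    on-off modulation frequency of the i-th emitter (Hz)
    w:      half-width of the band-pass filter (Hz); the band is [f_i-w, f_i+w]
    fs:     sampling rate (Hz)
    """
    # 4th-order Butterworth band-pass around the emitter's frequency.
    b, a = butter(4, [f_i - w, f_i + w], btype="bandpass", fs=fs)
    # Zero-phase filtering avoids adding delay to the centroid estimate.
    return filtfilt(b, a, signal)

# Illustrative use: separate two emitters modulated at f1 = 1 kHz and f2 = 1.5 kHz.
fs = 10_000.0
t = np.arange(0, 0.1, 1 / fs)
raw = np.sign(np.sin(2 * np.pi * 1000 * t)) + np.sign(np.sin(2 * np.pi * 1500 * t))
spot1 = extract_spot_signal(raw, f_i=1000.0, w=100.0, fs=fs)
spot2 = extract_spot_signal(raw, f_i=1500.0, w=100.0, fs=fs)
```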
Exemplary Method for Pose Estimation
In one embodiment, the apparatus includes N emitters, which project N light spots, and M cameras. The position of the i-th camera in the global reference frame is denoted herein by Si=(xi, yi, zi), and the rotational orientation of the i-th camera in the global reference frame is denoted herein by Ri(αi, βi, θi). The position of the j-th light spot is denoted herein by wj, and the position of the projection of the j-th spot onto the i-th camera is denoted herein by ci,j. Then, the following relationship relates Si, wj, and ci,j.
ci,j=PiRi(wj−Si) Equation 1
In Equation 1, Ri represents the three-degree-of-freedom rotation transformation, which, in one embodiment, results from the composition of three mutually orthogonal one-degree-of-freedom rotation transformations, such as Ri=RαRβRθ. Also in Equation 1, Pi represents the perspective transformation associated with the i-th camera. Thus, Equation 1 defines three equations for six unknowns, in which the unknowns are xi, yi, zi, αi, βi, and θi. In a system with N spots and M cameras, N×M such matrix equations can be formulated, but not all such equations are necessarily unique, independent, and non-degenerate. When the camera height and the roll and pitch angles are fixed and known, each observed spot contributes two independent equations in the three remaining unknowns; thus, with two spots and one camera, values for x, y, and θ can be determined. To determine all six degrees of freedom, it is preferable to have a configuration of spots and cameras that generates at least six independent, non-degenerate equations analogous to Equation 1.
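To make the notation concrete, the following Python sketch evaluates Equation 1 for a pinhole camera, using the scalar perspective factor λ/(λ−Z) adopted later in this disclosure. The rotation-axis conventions, function names, and numeric values are illustrative assumptions.

```python
import numpy as np

def rotation(alpha, beta, theta):
    """R = R_alpha R_beta R_theta: rotations about the x, y, and z axes."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    ct, st = np.cos(theta), np.sin(theta)
    r_alpha = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    r_beta = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    r_theta = np.array([[ct, st, 0], [-st, ct, 0], [0, 0, 1]])
    return r_alpha @ r_beta @ r_theta

def project(w_j, s_i, r_i, lam):
    """Equation 1: c_ij = P_i R_i (w_j - S_i), with a scalar pinhole factor."""
    rel = r_i @ (np.asarray(w_j, float) - np.asarray(s_i, float))
    p = lam / (lam - rel[2])  # perspective factor lambda / (lambda - Z)
    return p * rel[:2]        # (u, v) coordinates on the imager

# Illustrative: camera at the origin, yawed 30 degrees, spot on a 2.5 m ceiling.
c = project(w_j=(0.0, 1.0, 2.5), s_i=(0.0, 0.0, 0.0),
            r_i=rotation(0.0, 0.0, np.radians(30)), lam=0.01)
```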
Exemplary System with Two Emitters and One Camera
In one embodiment, the system includes two spots projected onto the ceiling and one optical position sensor with one PSD camera. The relationship between a spot wj and its projection cj in the PSD camera reference frame is given by the following equation.
cj=PRθ(wj−S) Equation 2
In Equation 2, S represents the position of the PSD camera in the global reference frame, and P represents the transformation from a point (X, Y, Z) in the global coordinate system to a point (u, v, z) in the PSD camera reference frame. Also, for the purposes of this example, the z axis of the camera coordinate system is aligned with the Z axis of the global coordinate system in the vertical direction. This implies that Rα and Rβ correspond to identity matrices; accordingly, Rα and Rβ have been omitted from Equation 2. In the case of a pinhole camera model, P corresponds to the scalar value λ/(λ−Z), where λ represents the focal length of the camera. It will be understood that multiplication by a scalar value can also be achieved by multiplication by the corresponding multiple of the appropriately-dimensioned identity matrix. Also in Equation 2, Rθ can be represented by the following unitary matrix.

Rθ = [cos θ  sin θ; −sin θ  cos θ] Equation 3
Equation 2 can be re-written as follows.
wj−S=Rθ−1P−1cj Equation 4
In Equation 4, P−1 represents the inverse perspective transformation, and Rθ−1 represents the inverse rotation transformation. When the position of the j-th spot is associated with appropriate camera parameters, such as the camera focal length in a pinhole camera model, then Equation 4 defines two non-degenerate equations in three unknowns x, y, and θ for each measurement cj. Thus, the three variables, x, y, and θ, together determine the pose of the PSD camera.
Because two equations in three unknowns do not define a unique solution for the pose of the PSD camera, it is preferable to use more independent equations than unknowns. With two spots and one PSD camera, it is possible to generate four equations in three unknowns as follows.
w1−S=Rθ−1P−1c1 Equation 5
w2−S=Rθ−1P−1c2 Equation 6
Equation 5 relates the spot w1 with its associated PSD camera position c1, and Equation 6 relates the spot w2 with its associated PSD camera position c2. Subtracting Equation 5 from Equation 6 generates the following matrix equation expressed in Equation 7.
w2−w1=Rθ−1P−1(c2−c1) Equation 7
Equation 7 can be expanded as follows.

[0; d] = P−1 [cos θ  −sin θ; sin θ  cos θ] [Δu; Δv] Equation 8

The matrix equation given in Equation 8 expresses two non-degenerate linear equations. In Equation 8, Δy=d=y2−y1, Δu=u2−u1, and Δv=v2−v1. As discussed earlier, the X components of w1 and w2 are both zero, so the first row of Equation 8 yields the following equation.
0=P−1(Δu cos θ−Δv sin θ) Equation 9
Solving for θ in Equation 9 gives θ=tan−1(Δu/Δv), and substituting this result into Equation 5 provides the following solution for S, the position of the PSD camera in the global reference frame.
S=w1−Rθ−1P−1c1|θ=tan−1(Δu/Δv) Equation 10
Accordingly, the pose (x, y, θ) of the PSD camera as a function of the measurements c1 and c2 can be determined using Equation 9 and Equation 10.
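A minimal numeric sketch of this two-spot pose recovery, assuming the inverse perspective transformation reduces to a known scalar k = P−1 and that the centroids have already been demodulated, might look as follows in Python. The function and parameter names are illustrative, not part of this disclosure.

```python
import numpy as np

def pose_from_two_spots(c1, c2, k):
    """Recover (x, y, theta) from two spot centroids (Equations 9 and 10).

    c1, c2: measured centroids (u, v) of spots w1 and w2 in camera coordinates
    k:      scalar inverse perspective factor, k = P^-1
    Assumes w1 lies directly above the global origin, per the coordinate setup.
    """
    du, dv = np.subtract(c2, c1)
    # Equation 9: 0 = k*(du*cos(theta) - dv*sin(theta)) => theta = atan(du/dv).
    theta = np.arctan2(du, dv)
    # Equation 10, planar part: S = w1 - R_theta^-1 P^-1 c1, with w1 = (0, 0, h).
    ct, st = np.cos(theta), np.sin(theta)
    r_inv = np.array([[ct, -st], [st, ct]])
    x, y = -k * (r_inv @ np.asarray(c1, float))
    return x, y, theta

# Illustrative use with assumed centroid readings and k = 1.0.
x, y, theta = pose_from_two_spots(c1=(0.12, -0.05), c2=(0.02, 0.30), k=1.0)
```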
An Example of Using One Emitter and One Camera
In one embodiment, the system includes one spot projected onto the ceiling and one optical position sensor with one PSD camera. Similar to the relationship discussed earlier in connection with Equation 2, the relationship between the spot w and its projection c in the PSD camera reference frame is given by the following equation.
c=PRθ(w−S) Equation 11
Because the origin of the global coordinate system can be chosen, without loss of generality, such that the spot w is located directly above the origin, rearranging Equation 11 provides the following relationship.
w−S=Rθ−1P−1c Equation 12
Equation 12 can be rewritten in coordinate notation to give the following relationship.

−x = P−1(u cos θ − v sin θ)
−y = P−1(u sin θ + v cos θ) Equation 13
Thus, Equation 13 specifies two non-degenerate linear equations. In the case that P−1 corresponds to a scalar or to a scalar multiple of an identity matrix, squaring and summing the two non-degenerate linear equations and simplifying the result yields the following.
x² + y² = (P−1)²[(u cos θ − v sin θ)² + (u sin θ + v cos θ)²] = (P−1)²(u² + v²) Equation 14
Because x² + y² is equal to ∥S∥² and u² + v² is equal to ∥c∥², Equation 14 can be simplified to ∥S∥² = (P−1)²∥c∥², or ∥S∥ = P−1∥c∥.
Thus, the distance measurement ∥c∥, and the corresponding distance measurement ∥S∥, can define a circle in an x-y plane centered at the origin (0, 0) with radius ∥S∥. A tangent to the circle at the position of the sensor (that is, at S) is orthogonal to the vector s=(x, y)T, where the superscripted "T" denotes the vector or matrix transposition operation. The tangent ŝ can therefore be expressed as ŝ=(y, −x)T. The rotational orientation, φ, of the robot with respect to ŝ can then be estimated using a measurement of c as given in the following relationship.
φ=tan−1(u/v) Equation 15
Thus, in this embodiment, ∥S∥ and φ can be determined, which can advantageously support applications for robotics, person tracking, object tracking, and the like. In one example, the spot is projected onto the ceiling directly above a docking station, and the optical position sensor with one PSD camera is attached to a robot. Using the estimation of ∥S∥ and φ, the robot can guide itself to turn toward the spot and approach the spot. In this manner, the robot can approach the docking station and recharge itself. In an alternative example, the projector can correspond to a handheld projector and can be used to point above a user-selected object or location of interest to guide the robot to the object or location. This alternative example provides a powerful interface for robot interaction.
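A corresponding single-spot sketch, under the same assumption of a scalar inverse perspective factor k = P−1 and with illustrative names, computes the distance ∥S∥ and the orientation φ from one measured centroid.

```python
import numpy as np

def distance_and_heading(c, k):
    """Single-spot measurement (Equations 14 and 15).

    c: measured centroid (u, v) of the spot in camera coordinates
    k: scalar inverse perspective factor, k = P^-1
    Returns (||S||, phi): horizontal distance to the point under the spot,
    and the orientation relative to the circle tangent s-hat.
    """
    u, v = c
    dist = k * np.hypot(u, v)  # ||S|| = P^-1 ||c||
    phi = np.arctan2(u, v)     # Equation 15: phi = tan^-1(u / v)
    return dist, phi

# Illustrative docking use: turn until the centroid indicates the spot lies
# ahead, then drive until dist approaches zero (camera directly under spot).
dist, phi = distance_and_heading(c=(0.08, 0.15), k=1.0)
```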
One embodiment of the method and apparatus includes a camera, such as a CCD camera, a CMOS camera, and the like, and a projector that generates a pattern on a projection surface, such as a ceiling. It will be understood that this embodiment can include multiple cameras and/or multiple projectors. By way of example, the projector can correspond to a slide projector, and the pattern can be encoded in a slide. In one embodiment, at least one pattern has the shape of a circle, and in another embodiment, at least one pattern has the shape of a square. Each camera generates grayscale or color images. A signal processing unit processes the camera images, extracts the unique patterns, and estimates a position of the pattern in camera sensor coordinates. The position of the pattern can be defined as the centroid of the pattern. The position of the j-th pattern in the global reference frame can be denoted herein by wj, and the position of the j-th pattern in the reference frame of the i-th camera can be denoted herein by ci,j. Then, the relationship between the j-th pattern and its projection onto the i-th camera is defined by Equation 1. The signal processing unit captures the camera images and processes the images using one or more image analysis techniques to detect and extract the position of known patterns. The image analysis techniques can include, by way of example, line and corner detection (to detect a square pattern, for example), Hough transform (to detect a circle, for example), and the like. After the positions of the patterns in the camera reference frames are determined, the signal processing unit can estimate the positions of the cameras with respect to the global reference frame using the methods described previously. Optionally, one or more of the projectors can modulate on-off to reduce the effects of ambient light. The modulation frequencies can advantageously be used to associate a unique identifier with each pattern. In one embodiment, the identifier of a pattern is advantageously encoded within the pattern itself. As an example, the shape of the pattern can define a unique identifier, if distinct shapes are used for each pattern. For example, the system can distinguish between a square pattern and a circular pattern and associate different identifiers with each pattern.
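As an illustrative sketch of the pattern-extraction step, the following Python fragment estimates the centroid of a bright projected pattern in a grayscale image by thresholding. The threshold value is an assumption; a real implementation might instead use the line, corner, or Hough-transform detectors mentioned above.

```python
import numpy as np

def pattern_centroid(gray_image, threshold=200):
    """Estimate the centroid (u, v) of a bright projected pattern.

    gray_image: 2-D numpy array of grayscale pixel intensities
    threshold:  intensity above which a pixel is considered part of the pattern
    Returns (u, v) in pixel coordinates, or None if no pattern is visible.
    """
    rows, cols = np.nonzero(gray_image > threshold)
    if rows.size == 0:
        return None
    # Intensity-weighted centroid of the thresholded pixels.
    weights = gray_image[rows, cols].astype(float)
    u = np.average(cols, weights=weights)
    v = np.average(rows, weights=weights)
    return u, v
```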
In one embodiment, the modulation frequency of the projector can encode information, such as bit patterns to transmit a message that can be detected and extracted by the camera and the signal processing unit. The bit patterns can be modulated in the signal using any of a variety of common modulation techniques, such as pulse width modulation, space width modulation, and phase modulation.
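As a simple sketch of encoding a bit pattern with pulse width modulation of the emitter's on-off signal (the timings, duty cycles, and names here are illustrative assumptions), one modulation period per bit might be generated as follows.

```python
def pwm_bit_waveform(bits, samples_per_bit=100, duty_one=0.75, duty_zero=0.25):
    """Encode bits as on-off pulse widths: a long pulse for 1, short for 0."""
    waveform = []
    for bit in bits:
        duty = duty_one if bit else duty_zero
        on = int(samples_per_bit * duty)
        waveform.extend([1] * on + [0] * (samples_per_bit - on))
    return waveform

# Example: a four-bit message emitted as four PWM periods.
signal = pwm_bit_waveform([1, 0, 1, 1])
```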
In another embodiment, the bit patterns are modulated on top of the original “carrier” frequency of the spot. The projectors and optical position sensors can advantageously be used for optical wireless communication. In this embodiment, the projector projects the light pattern on a reflecting surface, and the optical sensor detects the signal by viewing the reflecting surface, which eliminates the need for line-of-sight between the emitter and the sensor. The signal modulated in the projected light can carry commands, such as commands for a robot, similar to the way in which light modulated by a remote control unit can carry commands to an appliance. In one example, the projection of the spot on the ceiling directly above a docking station enables the robot to find the docking station and perform self-charging. In addition, an interface with the docking station, such as a button on the docking station, can generate a command to the robot to return to the charging station.
Yet another embodiment of the method and apparatus includes a projector for one or more distinct regions of an environment, such as a projector for each distinct region. Advantageously, this embodiment expands the coverage of localization throughout relatively large areas or throughout multiple relatively confined areas, such as multiple rooms. The covered area associated with one projector can be constrained by the field of view of the camera, the distance from the projector to the reflection surface, and the presence of objects and walls that obstruct the camera's view of the spot. Increasing the number of light patterns can increase the coverage area. In one embodiment, for coverage across multiple rooms, one or more projectors are provided for each room in which coverage is desired, so that, for example, each room can have a dedicated projector. For example, each projector can project one or more spots that have an identifier that is unique within the room. It will be understood that the identifier associated with a spot can be based on the spot's modulation frequency, the spot's shape, the spot's color, or another appropriate characteristic that can be detected by the camera sensor.
In one implementation, the combination of the individual spot identifiers with a room can define a unique identifier for the room. By way of example, a first room can have two spots having associated unique identifiers “A” and “B,” and a second room can have two spots having associated unique identifiers “A” and “C.” The unique identifiers for each room can advantageously be used by a system, such as by a robot, to build a topological map of the rooms and the connectivity of the rooms. Without a unique identifier for each room or region, the system can disadvantageously generate ambiguous position information. As an illustration, without a unique identifier for each room, the position associated with an (x, y) coordinate of a first room can generally not be distinguished from the position associated with the (x, y) coordinate of a second room.
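The following short Python sketch illustrates how such spot-identifier sets could serve as room keys in a topological map; the identifiers and graph structure are illustrative assumptions, not part of this disclosure.

```python
def room_key(spot_ids):
    """A room is identified by the set of spot identifiers visible in it."""
    return frozenset(spot_ids)

# Topological map: room key -> set of adjacent room keys.
topological_map = {}

def record_transition(prev_spots, curr_spots):
    """Record that the robot moved directly from one room to another."""
    a, b = room_key(prev_spots), room_key(curr_spots)
    if a != b:
        topological_map.setdefault(a, set()).add(b)
        topological_map.setdefault(b, set()).add(a)

# Example: a first room with spots "A" and "B" adjoins a second room
# with spots "A" and "C".
record_transition({"A", "B"}, {"A", "C"})
```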
Although this invention has been described with reference to these specific embodiments, the descriptions are intended to be illustrative of the invention and are not intended to be limiting. Various modifications and applications may occur to those skilled in the art without departing from the true spirit and scope of the invention.
6604022 | Parker et al. | Aug 2003 | B2 |
6605156 | Clark et al. | Aug 2003 | B1 |
6611120 | Song et al. | Aug 2003 | B2 |
6611734 | Parker et al. | Aug 2003 | B2 |
6611738 | Ruffner | Aug 2003 | B2 |
6615108 | Peless et al. | Sep 2003 | B1 |
6615885 | Ohm | Sep 2003 | B1 |
6622465 | Jerome et al. | Sep 2003 | B2 |
6624744 | Wilson et al. | Sep 2003 | B1 |
6625843 | Kim et al. | Sep 2003 | B2 |
6629028 | Paromtchik et al. | Sep 2003 | B2 |
6654482 | Parent et al. | Nov 2003 | B1 |
6658325 | Zweig | Dec 2003 | B2 |
6658354 | Lin | Dec 2003 | B2 |
6658692 | Lenkiewicz et al. | Dec 2003 | B2 |
6658693 | Reed | Dec 2003 | B1 |
6661239 | Ozick | Dec 2003 | B1 |
6662889 | De Fazio et al. | Dec 2003 | B2 |
6668951 | Won | Dec 2003 | B2 |
6670817 | Fournier et al. | Dec 2003 | B2 |
6671592 | Bisset et al. | Dec 2003 | B1 |
6687571 | Byrne et al. | Feb 2004 | B1 |
6690993 | Foulke et al. | Feb 2004 | B2 |
6711280 | Stafsudd et al. | Mar 2004 | B2 |
6732826 | Song et al. | May 2004 | B2 |
6741054 | Koselka et al. | May 2004 | B2 |
6748297 | Song et al. | Jun 2004 | B2 |
6756703 | Chang | Jun 2004 | B2 |
6760647 | Nourbakhsh et al. | Jul 2004 | B2 |
6764373 | Osawa et al. | Jul 2004 | B1 |
6769004 | Barrett | Jul 2004 | B2 |
6774596 | Bisset | Aug 2004 | B1 |
6779380 | Nieuwkamp | Aug 2004 | B1 |
6810305 | Kirkpatrick | Oct 2004 | B2 |
6830120 | Yashima et al. | Dec 2004 | B1 |
6832407 | Salem et al. | Dec 2004 | B2 |
6836701 | McKee | Dec 2004 | B2 |
6841963 | Song et al. | Jan 2005 | B2 |
6845297 | Allard | Jan 2005 | B2 |
6856811 | Burdue et al. | Feb 2005 | B2 |
6859010 | Jeon et al. | Feb 2005 | B2 |
6859682 | Naka et al. | Feb 2005 | B2 |
6860206 | Rudakevych et al. | Mar 2005 | B1 |
6865447 | Lau et al. | Mar 2005 | B2 |
6870792 | Chiappetta | Mar 2005 | B2 |
6871115 | Huang et al. | Mar 2005 | B2 |
6886651 | Slocum et al. | May 2005 | B1 |
6888333 | Laby | May 2005 | B2 |
6901624 | Mori et al. | Jun 2005 | B2 |
6906702 | Tanaka et al. | Jun 2005 | B1 |
6925357 | Wang et al. | Aug 2005 | B2 |
6925679 | Wallach et al. | Aug 2005 | B2 |
D510066 | Hickey et al. | Sep 2005 | S |
6938298 | Aasen | Sep 2005 | B2 |
6940291 | Ozick | Sep 2005 | B1 |
6941199 | Bottomley et al. | Sep 2005 | B1 |
6956348 | Landry et al. | Oct 2005 | B2 |
6957712 | Song et al. | Oct 2005 | B2 |
6960986 | Asama et al. | Nov 2005 | B2 |
6965211 | Tsurumi | Nov 2005 | B2 |
6968592 | Takeuchi et al. | Nov 2005 | B2 |
6971140 | Kim | Dec 2005 | B2 |
6975246 | Trudeau | Dec 2005 | B1 |
6985556 | Shanmugavel et al. | Jan 2006 | B2 |
6993954 | George et al. | Feb 2006 | B1 |
6999850 | McDonald | Feb 2006 | B2 |
7013527 | Thomas et al. | Mar 2006 | B2 |
7024280 | Parker et al. | Apr 2006 | B2 |
7027893 | Perry et al. | Apr 2006 | B2 |
7030768 | Wanie | Apr 2006 | B2 |
7031805 | Lee et al. | Apr 2006 | B2 |
7054716 | McKee et al. | May 2006 | B2 |
7057120 | Ma et al. | Jun 2006 | B2 |
7057643 | Iida et al. | Jun 2006 | B2 |
7065430 | Naka et al. | Jun 2006 | B2 |
7066291 | Martins et al. | Jun 2006 | B2 |
7069124 | Whittaker et al. | Jun 2006 | B1 |
7075661 | Petty et al. | Jul 2006 | B2 |
7085624 | Aldred et al. | Aug 2006 | B2 |
7113847 | Chmura et al. | Sep 2006 | B2 |
7155308 | Jones | Dec 2006 | B2 |
7171285 | Kim et al. | Jan 2007 | B2 |
7173391 | Jones et al. | Feb 2007 | B2 |
7188000 | Chiappetta et al. | Mar 2007 | B2 |
7193384 | Norman et al. | Mar 2007 | B1 |
7196487 | Jones et al. | Mar 2007 | B2 |
7201786 | Wegelin et al. | Apr 2007 | B2 |
7206677 | Hulden | Apr 2007 | B2 |
7211980 | Bruemmer et al. | May 2007 | B1 |
7246405 | Yan | Jul 2007 | B2 |
7248951 | Hulden | Jul 2007 | B2 |
7275280 | Haegermarck et al. | Oct 2007 | B2 |
7283892 | Boillot et al. | Oct 2007 | B1 |
7288912 | Landry et al. | Oct 2007 | B2 |
7318248 | Yan et al. | Jan 2008 | B1 |
7320149 | Huffman et al. | Jan 2008 | B1 |
7324870 | Lee | Jan 2008 | B2 |
7328196 | Peters | Feb 2008 | B2 |
7352153 | Yan | Apr 2008 | B2 |
7359766 | Jeon et al. | Apr 2008 | B2 |
7360277 | Moshenrose et al. | Apr 2008 | B2 |
7363108 | Noda et al. | Apr 2008 | B2 |
7388879 | Sabe et al. | Jun 2008 | B2 |
7389166 | Harwig et al. | Jun 2008 | B2 |
7408157 | Yan | Aug 2008 | B2 |
7418762 | Arai et al. | Sep 2008 | B2 |
7430455 | Casey et al. | Sep 2008 | B2 |
7430462 | Chiu et al. | Sep 2008 | B2 |
7441298 | Svendsen et al. | Oct 2008 | B2 |
7444206 | Abramson et al. | Oct 2008 | B2 |
7448113 | Jones et al. | Nov 2008 | B2 |
7459871 | Landry et al. | Dec 2008 | B2 |
7467026 | Sakagami et al. | Dec 2008 | B2 |
7474941 | Kim et al. | Jan 2009 | B2 |
7503096 | Lin | Mar 2009 | B2 |
7515991 | Egawa et al. | Apr 2009 | B2 |
7555363 | Augenbraun et al. | Jun 2009 | B2 |
7557703 | Yamada et al. | Jul 2009 | B2 |
7568259 | Yan | Aug 2009 | B2 |
7571511 | Jones et al. | Aug 2009 | B2 |
7578020 | Jaworski et al. | Aug 2009 | B2 |
7600521 | Woo | Oct 2009 | B2 |
7603744 | Reindle | Oct 2009 | B2 |
7617557 | Reindle | Nov 2009 | B2 |
7620476 | Morse et al. | Nov 2009 | B2 |
7636982 | Jones et al. | Dec 2009 | B2 |
7647144 | Haegermarck | Jan 2010 | B2 |
7650666 | Jang | Jan 2010 | B2 |
7660650 | Kawagoe et al. | Feb 2010 | B2 |
7693605 | Park | Apr 2010 | B2 |
7720554 | DiBernardo et al. | May 2010 | B2 |
7801645 | Taylor et al. | Sep 2010 | B2 |
7805220 | Taylor et al. | Sep 2010 | B2 |
7809944 | Kawamoto | Oct 2010 | B2 |
7849555 | Hahm et al. | Dec 2010 | B2 |
7853645 | Brown et al. | Dec 2010 | B2 |
7920941 | Park et al. | Apr 2011 | B2 |
7937800 | Yan | May 2011 | B2 |
7957836 | Myeong et al. | Jun 2011 | B2 |
7996097 | DiBernardo et al. | Aug 2011 | B2 |
8295955 | DiBernardo et al. | Oct 2012 | B2 |
20010004719 | Sommer | Jun 2001 | A1 |
20010013929 | Torsten | Aug 2001 | A1 |
20010020200 | Das et al. | Sep 2001 | A1 |
20010025183 | Shahidi | Sep 2001 | A1 |
20010037163 | Allard | Nov 2001 | A1 |
20010043509 | Green et al. | Nov 2001 | A1 |
20010045883 | Holdaway et al. | Nov 2001 | A1 |
20010047231 | Peless et al. | Nov 2001 | A1 |
20010047895 | De Fazio et al. | Dec 2001 | A1 |
20020011813 | Koselka et al. | Jan 2002 | A1 |
20020016649 | Jones | Feb 2002 | A1 |
20020021219 | Edwards | Feb 2002 | A1 |
20020095239 | Wallach et al. | Jul 2002 | A1 |
20020097400 | Jung et al. | Jul 2002 | A1 |
20020104963 | Mancevski | Aug 2002 | A1 |
20020108209 | Peterson | Aug 2002 | A1 |
20020112742 | Bredo et al. | Aug 2002 | A1 |
20020116089 | Kirkpatrick | Aug 2002 | A1 |
20020120364 | Colens | Aug 2002 | A1 |
20020124343 | Reed | Sep 2002 | A1 |
20020153185 | Song et al. | Oct 2002 | A1 |
20020156556 | Ruffner | Oct 2002 | A1 |
20020159051 | Guo | Oct 2002 | A1 |
20020166193 | Kasper | Nov 2002 | A1 |
20020169521 | Goodman et al. | Nov 2002 | A1 |
20020173877 | Zweig | Nov 2002 | A1 |
20020189871 | Won | Dec 2002 | A1 |
20030009259 | Hattori et al. | Jan 2003 | A1 |
20030019071 | Field et al. | Jan 2003 | A1 |
20030023356 | Keable | Jan 2003 | A1 |
20030025472 | Jones et al. | Feb 2003 | A1 |
20030028286 | Glenn et al. | Feb 2003 | A1 |
20030030399 | Jacobs | Feb 2003 | A1 |
20030058262 | Sato et al. | Mar 2003 | A1 |
20030060928 | Abramson et al. | Mar 2003 | A1 |
20030067451 | Tagg et al. | Apr 2003 | A1 |
20030097875 | Lentz et al. | May 2003 | A1 |
20030120389 | Abramson et al. | Jun 2003 | A1 |
20030124312 | Autumn | Jul 2003 | A1 |
20030126352 | Barrett | Jul 2003 | A1 |
20030137268 | Papanikolopoulos et al. | Jul 2003 | A1 |
20030192144 | Song et al. | Oct 2003 | A1 |
20030193657 | Uomori et al. | Oct 2003 | A1 |
20030208304 | Peless et al. | Nov 2003 | A1 |
20030216834 | Allard | Nov 2003 | A1 |
20030221114 | Hino et al. | Nov 2003 | A1 |
20030229421 | Chmura et al. | Dec 2003 | A1 |
20030229474 | Suzuki et al. | Dec 2003 | A1 |
20030233171 | Heiligensetzer | Dec 2003 | A1 |
20030233177 | Johnson et al. | Dec 2003 | A1 |
20030233870 | Mancevski | Dec 2003 | A1 |
20030233930 | Ozick | Dec 2003 | A1 |
20040016077 | Song et al. | Jan 2004 | A1 |
20040020000 | Jones | Feb 2004 | A1 |
20040030448 | Solomon | Feb 2004 | A1 |
20040030449 | Solomon | Feb 2004 | A1 |
20040030450 | Solomon | Feb 2004 | A1 |
20040030451 | Solomon | Feb 2004 | A1 |
20040030570 | Solomon | Feb 2004 | A1 |
20040030571 | Solomon | Feb 2004 | A1 |
20040031113 | Wosewick et al. | Feb 2004 | A1 |
20040049877 | Jones et al. | Mar 2004 | A1 |
20040055163 | McCambridge et al. | Mar 2004 | A1 |
20040068351 | Solomon | Apr 2004 | A1 |
20040068415 | Solomon | Apr 2004 | A1 |
20040068416 | Solomon | Apr 2004 | A1 |
20040074038 | Im et al. | Apr 2004 | A1 |
20040076324 | Burl et al. | Apr 2004 | A1 |
20040083570 | Song et al. | May 2004 | A1 |
20040088079 | Lavarec et al. | May 2004 | A1 |
20040093122 | Galibraith | May 2004 | A1 |
20040098167 | Yi et al. | May 2004 | A1 |
20040111184 | Chiappetta et al. | Jun 2004 | A1 |
20040111821 | Lenkiewicz et al. | Jun 2004 | A1 |
20040113777 | Matsuhira et al. | Jun 2004 | A1 |
20040117064 | McDonald | Jun 2004 | A1 |
20040117846 | Karaoguz et al. | Jun 2004 | A1 |
20040118998 | Wingett et al. | Jun 2004 | A1 |
20040125461 | Kawamura | Jul 2004 | A1 |
20040128028 | Miyamoto et al. | Jul 2004 | A1 |
20040133316 | Dean | Jul 2004 | A1 |
20040134336 | Solomon | Jul 2004 | A1 |
20040134337 | Solomon | Jul 2004 | A1 |
20040143919 | Wilder | Jul 2004 | A1 |
20040148419 | Chen et al. | Jul 2004 | A1 |
20040148731 | Damman et al. | Aug 2004 | A1 |
20040153212 | Profio et al. | Aug 2004 | A1 |
20040156541 | Jeon et al. | Aug 2004 | A1 |
20040158357 | Lee et al. | Aug 2004 | A1 |
20040181706 | Chen et al. | Sep 2004 | A1 |
20040187249 | Jones et al. | Sep 2004 | A1 |
20040187457 | Colens | Sep 2004 | A1 |
20040200505 | Taylor et al. | Oct 2004 | A1 |
20040204792 | Taylor et al. | Oct 2004 | A1 |
20040210345 | Noda et al. | Oct 2004 | A1 |
20040210347 | Sawada et al. | Oct 2004 | A1 |
20040211444 | Taylor et al. | Oct 2004 | A1 |
20040221790 | Sinclair et al. | Nov 2004 | A1 |
20040236468 | Taylor et al. | Nov 2004 | A1 |
20040244138 | Taylor et al. | Dec 2004 | A1 |
20040255425 | Arai et al. | Dec 2004 | A1 |
20050000543 | Taylor et al. | Jan 2005 | A1 |
20050010330 | Abramson et al. | Jan 2005 | A1 |
20050010331 | Taylor et al. | Jan 2005 | A1 |
20050021181 | Kim et al. | Jan 2005 | A1 |
20050033124 | Kelly et al. | Feb 2005 | A1 |
20050067994 | Jones et al. | Mar 2005 | A1 |
20050085947 | Aldred et al. | Apr 2005 | A1 |
20050144751 | Kegg et al. | Jul 2005 | A1 |
20050150074 | Diehl et al. | Jul 2005 | A1 |
20050154795 | Kuz et al. | Jul 2005 | A1 |
20050156562 | Cohen et al. | Jul 2005 | A1 |
20050165508 | Kanda et al. | Jul 2005 | A1 |
20050166354 | Uehigashi | Aug 2005 | A1 |
20050166355 | Tani | Aug 2005 | A1 |
20050172445 | Diehl et al. | Aug 2005 | A1 |
20050183229 | Uehigashi | Aug 2005 | A1 |
20050183230 | Uehigashi | Aug 2005 | A1 |
20050187678 | Myeong et al. | Aug 2005 | A1 |
20050192707 | Park et al. | Sep 2005 | A1 |
20050204717 | Colens | Sep 2005 | A1 |
20050209736 | Kawagoe | Sep 2005 | A1 |
20050213109 | Schell et al. | Sep 2005 | A1 |
20050217042 | Reindle | Oct 2005 | A1 |
20050218852 | Landry et al. | Oct 2005 | A1 |
20050222933 | Wesby | Oct 2005 | A1 |
20050229340 | Sawalski et al. | Oct 2005 | A1 |
20050229355 | Crouch et al. | Oct 2005 | A1 |
20050235451 | Yan | Oct 2005 | A1 |
20050251292 | Casey et al. | Nov 2005 | A1 |
20050255425 | Pierson | Nov 2005 | A1 |
20050258154 | Blankenship et al. | Nov 2005 | A1 |
20050273967 | Taylor et al. | Dec 2005 | A1 |
20050288819 | De Guzman | Dec 2005 | A1 |
20060000050 | Cipolla et al. | Jan 2006 | A1 |
20060010638 | Shimizu et al. | Jan 2006 | A1 |
20060020369 | Taylor et al. | Jan 2006 | A1 |
20060020370 | Abramson | Jan 2006 | A1 |
20060021168 | Nishikawa | Feb 2006 | A1 |
20060025134 | Cho et al. | Feb 2006 | A1 |
20060037170 | Shimizu | Feb 2006 | A1 |
20060060216 | Woo | Mar 2006 | A1 |
20060061657 | Rew et al. | Mar 2006 | A1 |
20060064828 | Stein et al. | Mar 2006 | A1 |
20060087273 | Ko et al. | Apr 2006 | A1 |
20060089765 | Pack et al. | Apr 2006 | A1 |
20060100741 | Jung | May 2006 | A1 |
20060143295 | Costa-Requena et al. | Jun 2006 | A1 |
20060146776 | Kim | Jul 2006 | A1 |
20060184293 | Konandreas et al. | Aug 2006 | A1 |
20060190146 | Morse et al. | Aug 2006 | A1 |
20060196003 | Song et al. | Sep 2006 | A1 |
20060220900 | Ceskutti et al. | Oct 2006 | A1 |
20060259494 | Watson et al. | Nov 2006 | A1 |
20060288519 | Jaworski et al. | Dec 2006 | A1 |
20060293787 | Kanda et al. | Dec 2006 | A1 |
20070006404 | Cheng et al. | Jan 2007 | A1 |
20070017061 | Yan | Jan 2007 | A1 |
20070028574 | Yan | Feb 2007 | A1 |
20070032904 | Kawagoe et al. | Feb 2007 | A1 |
20070043459 | Abbott et al. | Feb 2007 | A1 |
20070045018 | Carter et al. | Mar 2007 | A1 |
20070150096 | Yeh et al. | Jun 2007 | A1 |
20070157415 | Lee et al. | Jul 2007 | A1 |
20070157420 | Lee et al. | Jul 2007 | A1 |
20070179670 | Chiappetta et al. | Aug 2007 | A1 |
20070226949 | Hahm et al. | Oct 2007 | A1 |
20070234492 | Svendsen et al. | Oct 2007 | A1 |
20070244610 | Ozick et al. | Oct 2007 | A1 |
20070250212 | Halloran et al. | Oct 2007 | A1 |
20070266508 | Jones et al. | Nov 2007 | A1 |
20080007203 | Cohen et al. | Jan 2008 | A1 |
20080039974 | Sandin et al. | Feb 2008 | A1 |
20080052846 | Kapoor et al. | Mar 2008 | A1 |
20080184518 | Taylor et al. | Aug 2008 | A1 |
20080276407 | Schnittman et al. | Nov 2008 | A1 |
20080282494 | Won et al. | Nov 2008 | A1 |
20080307590 | Jones et al. | Dec 2008 | A1 |
20090007366 | Svendsen et al. | Jan 2009 | A1 |
20090038089 | Landry et al. | Feb 2009 | A1 |
20090049640 | Lee et al. | Feb 2009 | A1 |
20090055022 | Casey et al. | Feb 2009 | A1 |
20090292393 | Casey et al. | Nov 2009 | A1 |
20100011529 | Won et al. | Jan 2010 | A1 |
20100049365 | Jones et al. | Feb 2010 | A1 |
20100063628 | Landry et al. | Mar 2010 | A1 |
20100107355 | Won et al. | May 2010 | A1 |
20100257690 | Jones et al. | Oct 2010 | A1 |
20100257691 | Jones et al. | Oct 2010 | A1 |
20100263158 | Jones et al. | Oct 2010 | A1 |
20110125323 | Gutmann et al. | May 2011 | A1 |
Number | Date | Country |
---|---|---|
2128842 | Dec 1980 | DE |
3317376 | Dec 1987 | DE |
3536907 | Feb 1989 | DE |
199311014 | Oct 1993 | DE |
4338841 | May 1995 | DE |
4414683 | Oct 1995 | DE |
19849978 | Feb 2001 | DE |
10242257 | Apr 2003 | DE |
10357636 | Jul 2005 | DE |
102004041021 | Aug 2005 | DE |
102005046813 | Apr 2007 | DE |
338988 | Dec 1988 | DK |
0265542 | May 1988 | EP |
0281085 | Sep 1988 | EP |
0294101 | Dec 1988 | EP |
0433697 | Jun 1991 | EP |
0437024 | Jul 1991 | EP |
0479273 | Apr 1992 | EP |
0554978 | Aug 1993 | EP |
0615719 | Sep 1994 | EP |
0792726 | Sep 1997 | EP |
0798567 | Oct 1997 | EP |
0294101 | Dec 1998 | EP |
0845237 | Apr 2000 | EP |
0861629 | Sep 2001 | EP |
1228734 | Aug 2002 | EP |
1331537 | Jul 2003 | EP |
1380246 | Jan 2004 | EP |
1018315 | Nov 2004 | EP |
1553472 | Jul 2005 | EP |
1642522 | Apr 2006 | EP |
2238196 | Aug 2005 | ES |
2601443 | Jan 1988 | FR |
2828589 | Feb 2003 | FR |
702426 | Jan 1954 | GB |
2128842 | May 1984 | GB |
2225221 | May 1990 | GB |
2267360 | Dec 1993 | GB |
2283838 | May 1995 | GB |
2284957 | Jun 1995 | GB |
2404330 | Feb 2005 | GB |
2417354 | Feb 2006 | GB |
59033511 | Mar 1984 | JP |
59-112311 | Jun 1984 | JP |
59099308 | Jun 1984 | JP |
59120124 | Jul 1984 | JP |
59131668 | Sep 1984 | JP |
59164973 | Sep 1984 | JP |
59184917 | Oct 1984 | JP |
2283343 | Nov 1984 | JP |
59212924 | Dec 1984 | JP |
59226909 | Dec 1984 | JP |
60089213 | May 1985 | JP |
60211510 | Oct 1985 | JP |
60259895 | Dec 1985 | JP |
61023221 | Jan 1986 | JP |
61097712 | May 1986 | JP |
62070709 | Apr 1987 | JP |
62074018 | Apr 1987 | JP |
62164431 | Jul 1987 | JP |
62263507 | Nov 1987 | JP |
62263508 | Nov 1987 | JP |
62189057 | Dec 1987 | JP |
63079623 | Apr 1988 | JP |
63158032 | Jul 1988 | JP |
4019586 | Jan 1992 | JP |
4074285 | Mar 1992 | JP |
4084921 | Mar 1992 | JP |
5023269 | Feb 1993 | JP |
5042076 | Feb 1993 | JP |
5046246 | Feb 1993 | JP |
5150827 | Jun 1993 | JP |
5150829 | Jun 1993 | JP |
5-257527 | Oct 1993 | JP |
5040519 | Oct 1993 | JP |
05257527 | Oct 1993 | JP |
5257533 | Oct 1993 | JP |
05285861 | Nov 1993 | JP |
6003251 | Jan 1994 | JP |
6137828 | May 1994 | JP |
6293095 | Oct 1994 | JP |
06327598 | Nov 1994 | JP |
6105781 | Dec 1994 | JP |
07129239 | May 1995 | JP |
7059702 | Jun 1995 | JP |
7270518 | Oct 1995 | JP |
7295636 | Nov 1995 | JP |
7313417 | Dec 1995 | JP |
8016776 | Jan 1996 | JP |
8089449 | Apr 1996 | JP |
08089451 | Apr 1996 | JP |
8123548 | May 1996 | JP |
8152916 | Jun 1996 | JP |
8263137 | Oct 1996 | JP |
8335112 | Dec 1996 | JP |
9044240 | Feb 1997 | JP |
9066855 | Mar 1997 | JP |
9145309 | Jun 1997 | JP |
09179625 | Jul 1997 | JP |
09185410 | Jul 1997 | JP |
2555263 | Aug 1997 | JP |
09206258 | Aug 1997 | JP |
9265319 | Oct 1997 | JP |
9269807 | Oct 1997 | JP |
9269810 | Oct 1997 | JP |
9319432 | Dec 1997 | JP |
9319434 | Dec 1997 | JP |
9325812 | Dec 1997 | JP |
10-27018 | Jan 1998 | JP |
10055215 | Feb 1998 | JP |
10117973 | May 1998 | JP |
10118963 | May 1998 | JP |
10177414 | Jun 1998 | JP |
10214114 | Aug 1998 | JP |
10295595 | Nov 1998 | JP |
11015941 | Jan 1999 | JP |
11102220 | Apr 1999 | JP |
11162454 | Jun 1999 | JP |
11174145 | Jul 1999 | JP |
11175149 | Jul 1999 | JP |
11213157 | Aug 1999 | JP |
11508810 | Aug 1999 | JP |
11510935 | Sep 1999 | JP |
11295412 | Oct 1999 | JP |
2000047728 | Feb 2000 | JP |
2000056006 | Feb 2000 | JP |
2000056831 | Feb 2000 | JP |
2000066722 | Mar 2000 | JP |
2000075925 | Mar 2000 | JP |
2000275321 | Oct 2000 | JP |
2000353014 | Dec 2000 | JP |
2001022443 | Jan 2001 | JP |
2001067588 | Mar 2001 | JP |
2001087182 | Apr 2001 | JP |
2001508572 | Jun 2001 | JP |
3197758 | Aug 2001 | JP |
3201903 | Aug 2001 | JP |
2001216482 | Aug 2001 | JP |
2001258807 | Sep 2001 | JP |
2001265437 | Sep 2001 | JP |
2001-522079 | Nov 2001 | JP |
2002-82720 | Mar 2002 | JP |
2002204769 | Jul 2002 | JP |
2002333920 | Nov 2002 | JP |
2002360479 | Dec 2002 | JP |
2002366227 | Dec 2002 | JP |
2002369778 | Dec 2002 | JP |
2003010076 | Jan 2003 | JP |
2003010088 | Jan 2003 | JP |
2003015740 | Jan 2003 | JP |
2003084994 | Mar 2003 | JP |
2003-515210 | Apr 2003 | JP |
2003167628 | Jun 2003 | JP |
2003180587 | Jul 2003 | JP |
2003190064 | Jul 2003 | JP |
2003262520 | Sep 2003 | JP |
2003304992 | Oct 2003 | JP |
2003310509 | Nov 2003 | JP |
2004123040 | Apr 2004 | JP |
2004148021 | May 2004 | JP |
2004160102 | Jun 2004 | JP |
2004166968 | Jun 2004 | JP |
2004219185 | Aug 2004 | JP |
2005118354 | May 2005 | JP |
2005224265 | Aug 2005 | JP |
2005230032 | Sep 2005 | JP |
2005245916 | Sep 2005 | JP |
2005352707 | Dec 2005 | JP |
2006043071 | Feb 2006 | JP |
2006155274 | Jun 2006 | JP |
2006164223 | Jun 2006 | JP |
2006247467 | Sep 2006 | JP |
2006260161 | Sep 2006 | JP |
2006293662 | Oct 2006 | JP |
2006296697 | Nov 2006 | JP |
2007034866 | Feb 2007 | JP |
2007213180 | Aug 2007 | JP |
2009015611 | Jan 2009 | JP |
2010198552 | Sep 2010 | JP |
9526512 | Oct 1995 | WO |
9530887 | Nov 1995 | WO |
9617258 | Jun 1996 | WO |
9715224 | Nov 1997 | WO |
9740734 | Nov 1997 | WO |
9741451 | Nov 1997 | WO |
9853456 | Nov 1998 | WO |
9905580 | Feb 1999 | WO |
9916078 | Apr 1999 | WO |
9923543 | May 1999 | WO |
9928800 | Jun 1999 | WO |
9938056 | Jul 1999 | WO |
9938237 | Jul 1999 | WO |
9943250 | Sep 1999 | WO |
99059042 | Nov 1999 | WO |
0038026 | Jun 2000 | WO |
0038028 | Jun 2000 | WO |
0038029 | Jun 2000 | WO |
0004430 | Oct 2000 | WO |
0078410 | Dec 2000 | WO |
0106904 | Feb 2001 | WO |
0106905 | Feb 2001 | WO |
0137060 | May 2001 | WO |
0239864 | May 2002 | WO |
0239868 | May 2002 | WO |
02075350 | Sep 2002 | WO |
02067744 | Sep 2002 | WO |
02067745 | Sep 2002 | WO |
02067752 | Sep 2002 | WO |
02069774 | Sep 2002 | WO |
02069775 | Sep 2002 | WO |
02081074 | Oct 2002 | WO |
03015220 | Feb 2003 | WO |
03024292 | Mar 2003 | WO |
03026474 | May 2003 | WO |
03040546 | May 2003 | WO |
03040845 | May 2003 | WO |
03040846 | May 2003 | WO |
03062850 | Jul 2003 | WO |
03062852 | Jul 2003 | WO |
2004004533 | Jan 2004 | WO |
2004004534 | Jan 2004 | WO |
2004006034 | Jan 2004 | WO |
2005006935 | Jan 2005 | WO |
2005055795 | Jun 2005 | WO |
2005055796 | Jun 2005 | WO |
2005076545 | Aug 2005 | WO |
2005077243 | Aug 2005 | WO |
2005077244 | Aug 2005 | WO |
2005081074 | Sep 2005 | WO |
2005082223 | Sep 2005 | WO |
2005083541 | Sep 2005 | WO |
2005098475 | Oct 2005 | WO |
2005098476 | Oct 2005 | WO |
2006046400 | May 2006 | WO |
2006068403 | Jun 2006 | WO |
2006073248 | Jul 2006 | WO |
2007137234 | Nov 2007 | WO |
Entry |
---|
Becker, C.; Salas, J.; Tokusei, K.; Latombe, J.-C., "Reliable Navigation Using Landmarks," Proceedings of the 1995 IEEE International Conference on Robotics and Automation, New York, IEEE, vol. 1, pp. 401-406, May 21-27, 1995. |
International Search Report for PCT/US05/010200, dated Aug. 2, 2005. |
International Search Report for PCT/US05/010244, dated Aug. 2, 2005. |
Japanese Office Action, JP Patent Application No. 2007-506413, dated May 26, 2010, English Translation and Japanese Office Action. |
Roboking—not just a vacuum cleaner, a robot!, Jan. 21, 2004, infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, accessed Oct. 10, 2011, 7 pages. |
SVET Computers—New Technologies—Robot Vacuum Cleaner, Oct. 1999, available at http://www.sk.rs/1999/10/sknt01.html, 1 page, accessed Nov. 1, 2011. |
Matsumura Camera Online Shop: Retrieved from the Internet: URL<http://www.rakuten.co.jp/matsucame/587179/711512/>. Accessed Nov. 2011, 7 pages. |
Dyson's Robot Vacuum Cleaner—the DC06, May 2004, Retrieved from the Internet: URL<http://www.gizmag.com/go/1282/>. Accessed Nov. 2011, 3 pages. |
Electrolux Trilobite, "Time to enjoy life," Retrieved from the Internet: URL<http://www.robocon.co.kr/trilobite/Presentation_Trilobite_Kor_030104.ppt>. 26 pages, accessed Dec. 2011. |
Electrolux Trilobite, Jan. 12, 2001, http://www.electroluxui.com:8080/2002%5C822%5C833102EN.pdf, accessed Jul. 2, 2012, 10 pages. |
Euroflex, Jan. 2006, Retrieved from the Internet: URL<http://www.euroflex.tv/novita_dett.php?id=15>. 1 page, accessed Nov. 2011. |
Facts on the Trilobite, Retrieved from the Internet: URL<http://www.frc.ri.cmu.edu/~hpm/talks/Extras/trilobite.desc.html>. 2 pages, accessed Nov. 2011. |
Friendly Robotics, Retrieved from the Internet: URL<http://www.robotsandrelax.com/PDFs/RV400Manual.pdf>. 18 pages, accessed Dec. 2011. |
Robot Buying Guide, "LG announces the first robotic vacuum cleaner for Korea," Retrieved from the Internet: URL<http://robotbg.com/news/2003/04/22/lg_announces_the_first_robotic_vacu>. 1 page, Apr. 2003. |
UBOT, cleaning robot capable of wiping with a wet duster, Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=23031>. 4 pages, accessed Nov. 2011. |
Taipei Times, "Robotic vacuum by Matsushita about to undergo testing," Retrieved from the Internet: URL<http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338>. accessed Mar. 2002, 2 pages. |
Ascii, Mar. 25, 2002, http://ascii.jp/elem/000/000/330/330024/ accessed Nov. 1, 2011. 7 pages. |
Yujin Robotics, "An intelligent cleaning robot," Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=7257>. 8 pages, accessed Nov. 2011. |
Everyday Robots, “Everyday Robots: Reviews, Discussion and News for Consumers,” Retrieved from the Internet: URL<www.everydayrobots.com/index.php?option=content&task=view&id=9>. 7 pages, Apr. 2005. |
Gat, "Robust Low-Computation Sensor-driven Control for Task-Directed Navigation," Proc. of IEEE International Conference on Robotics and Automation, Sacramento, CA, pp. 2484-2489, Apr. 1991. |
Hitachi: News release: "The home cleaning robot of the autonomous movement type (experimental machine)," Retrieved from the Internet: URL<www.i4u.com/japanreleases/hitachirobot.htm>. 5 pages, Mar. 2005. |
Jeong et al., "An intelligent map-building system for indoor mobile robot using low cost photo sensors," SPIE, vol. 6042, 6 pages, 2005. |
Kahney, "Robot Vacs are in the House," Retrieved from the Internet: URL<www.wired.com/news/technology/0,1282,59237,00.html>. 6 pages, Jun. 2003. |
Karcher, “Product Manual Download Karch”, available at www.karcher.com, 17 pages, 2004. |
Karcher “Karcher RoboCleaner RC 3000,” Retrieved from the Internet: URL<www.robocleaner.de/english/screen3.html>. 4 pages, Dec. 2003. |
Karcher USA, "RC 3000 Robotics cleaner," Retrieved from the Internet: URL<www.karcher-usa.com>. 3 pages, Mar. 2005. |
Leonard et al., "Mobile Robot Localization by tracking Geometric Beacons," IEEE Transactions on Robotics and Automation, 7(3):376-382, Jun. 1991. |
Linde, Dissertation, "On Aspects of Indoor Localization," Available at: https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 2006. |
Morland, "Autonomous Lawnmower Control," Downloaded from the internet at: http://cns.bu.edu/~cjmorlan/robotics/lawnmower/report.pdf, 10 pages, Jul. 2002. |
Nam et al., “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999. |
Kurth, "Range-Only Robot Localization and SLAM with Radio," http://www.ri.cmu.edu/pub_files/pub4/kurth_derek_2004_1/kurth_derek_2004_1.pdf. 60 pages, May 2004, accessed Jul. 27, 2012. |
Li et al. “Robust Statistical Methods for Securing Wireless Localization in Sensor Networks,” Information Processing in Sensor Networks, 2005, Fourth International Symposium on, pp. 91-98, Apr. 2005. |
Champy, "Physical management of IT assets in Data Centers using RFID technologies," RFID 2005 University, Oct. 12-14, 2005, 19 pages. |
On Robo, "Robot Reviews Samsung Robot Vacuum (VC-RP30W)," Retrieved from the Internet: URL<www.onrobo.com/reviews/AT_Home/vacuum_cleaners/on00vcrb30rosam/index.htm>. 2 pages, 2005. |
OnRobo, "Samsung Unveils Its Multifunction Robot Vacuum," Retrieved from the Internet: URL<www.onrobo.com/enews/0210/samsung_vacuum.shtml>. 3 pages, Mar. 2005. |
Pages et al., "A camera-projector system for robot positioning by visual servoing," Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW06), 8 pages, Jun. 2006. |
Pages et al., "Robust decoupled visual servoing based on structured light," 2005 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005. |
Park et al., "A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors," IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on Neural Networks, Orlando, Florida, pp. 2754-2758, Jun./Jul. 1994. |
Pirjanian, “Challenges for Standards for consumer Robotics,” IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 2005. |
Shimoga et al., “Touch and Force Reflection for Telepresence Surgery,” Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994. |
Cozman et al., “Robot Localization using a Computer Vision Sextant,” IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995. |
Dorfmüller-Ulhaas, “Optical Tracking From User Motion to 3D Interaction,” http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002. |
Gregg et al., “Autonomous Lawn Care Applications,” 2006 Florida Conference on Recent Advances in Robotics, Miami, Florida, May 25-26, 2006, Florida International University, 5 pages. |
Eren et al., “Accuracy in position estimation of mobile robots based on coded infrared signal transmission,” Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995, IMTC/95. pp. 548-551, 1995. |
Friendly Robotics, “Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner,” Retrieved from the Internet: URL<www.friendlyrobotics.com/vac.htm> 5 pages, Apr. 2005. |
Fukuda et al., “Navigation System based on Ceiling Landmark Recognition for Autonomous mobile robot,” 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems 95. ‘Human Robot Interaction and Cooperative Robots’, Pittsburgh, PA, pp. 1466-1471, Aug. 1995. |
Hoag et al., "Navigation and Guidance in interstellar space," ACTA Astronautica, vol. 2, pp. 513-533, Feb. 1975. |
Huntsberger et al., “CAMPOUT: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, 33(5):550-559, Sep. 2003. |
Iirobotics.com, "Samsung Unveils Its Multifunction Robot Vacuum," Retrieved from the Internet: URL<www.iirobotics.com/webpages/hotstuff.php?ubre=111>. 3 pages, Mar. 2005. |
Borges et al., “Optimal Mobile Robot Pose Estimation Using Geometrical Maps,” IEEE Transactions on Robotics and Automation, 18(1): 87-94, Feb. 2002. |
Braunstingl et al., “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception,” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995. |
Bulusu et al., "Self Configuring Localization systems: Design and Experimental Evaluation," ACM Transactions on Embedded Computing Systems, 3(1):24-60, 2003. |
Caccia et al., "Bottom-Following for Remotely Operated Vehicles," 5th IFAC Conference, Aalborg, Denmark, pp. 245-250, Aug. 2000. |
Chae et al., “StarLITE: A new artificial landmark for the navigation of mobile robots,” http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005. |
Chiri, “Joystick Control for Tiny OS Robot,” http://www.eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf. 12 pages, Aug. 2002. |
Christensen et al. “Theoretical Methods for Planning and Control in Mobile Robotics,” 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 1997. |
Clerentin et al., “A localization method based on two omnidirectional perception systems cooperation,” Proc of IEEE International Conference on Robotics & Automation, San Francisco, CA vol. 2, pp. 1219-1224, Apr. 2000. |
Corke, "High Performance Visual servoing for robot end-point control," SPIE vol. 2056, Intelligent Robots and Computer Vision, 1993, 10 pages. |
Andersen et al., “Landmark based navigation strategies,” SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, Jan. 8, 1999. |
D'Orazio et al., “Model based Vision System for mobile robot position estimation”, SPIE, vol. 2058 Mobile Robots VIII, pp. 38-49, 1992. |
De Bakker et al., "Smart PSD-array for sheet of light range imaging," Proc. of SPIE, vol. 3965, pp. 1-12, May 2000. |
Desaulniers et al., “An Efficient Algorithm to find a shortest path for a car-like Robot,” IEEE Transactions on robotics and Automation, 11(6):819-828, Dec. 1995. |
Dorsch et al., “Laser Triangulation: Fundamental uncertainty in distance measurement,” Applied Optics, 33(7):1306-1314, Mar. 1994. |
Dulimarta et al., “Mobile Robot Localization in Indoor Environment”, Pattern Recognition, 30(1):99-111, 1997. |
Eren et al., “Operation of Mobile Robots in a Structured Infrared Environment,” Proceedings ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 1997. |
Benayad-Cherif et al., “Mobile Robot Navigation Sensors,” SPIE vol. 1831 Mobile Robots, VII, pp. 378-387, 1992. |
Facchinetti Claudio et al., “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation,” ICARCV '94, vol. 3, pp. 1694-1698, 1994. |
Facchinetti Claudio et al., “Self-Positioning Robot Navigation Using Ceiling Images Sequences,” ACCV '95, 5 pages, Dec. 1995. |
Fairfield et al., “Mobile Robot Localization with Sparse Landmarks,” SPIE vol. 4573, pp. 148-155, 2002. |
Favre-Bulle, “Efficient tracking of 3D—Robot Position by Dynamic Triangulation,” IEEE Instrumentation and Measurement Technology Conference IMTC 98 Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 1998. |
Fayman, “Exploiting Process Integration and Composition in the context of Active Vision,” IEEE Transactions on Systems, Man, and Cybernetics—Part C: Application and reviews, vol. 29, No. 1, pp. 73-86, Feb. 1999. |
Florbot GE Plastics, 1989-1990, 2 pages, available at http://www.fuseid.com/, accessed Sep. 27, 2012. |
Franz et al., "Biomimetic robot navigation," Robotics and Autonomous Systems, vol. 30, pp. 133-153, 2000. |
Fuentes et al., “Mobile Robotics 1994,” University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 1994. |
Gionis, “A hand-held optical surface scanner for environmental Modeling and Virtual Reality,” Virtual Reality World, 16 pages, 1996. |
Goncalves et al., “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005. |
Hamamatsu, "Si PIN Diode S5980, S5981, S5870—Multi-element photodiodes for surface mounting," Hamamatsu Photonics, 2 pages, Apr. 2004. |
Hammacher Schlemmer, "Electrolux Trilobite Robotic Vacuum," Retrieved from the Internet: URL<www.hammacher.com/publish/71579.asp?promo=xsells>. 3 pages, Mar. 2005. |
Haralick et al. “Pose Estimation from Corresponding Point Data”, IEEE Transactions on Systems, Man, and Cybernetics, 19(6):1426-1446, Nov. 1989. |
Hausler, “About the Scaling Behaviour of Optical Range Sensors,” Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 1997. |
Blaasvaer et al., “AMOR—An Autonomous Mobile Robot Navigation System,” Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994. |
Jensfelt et al., “Active Global Localization for a mobile robot using multiple hypothesis tracking,” IEEE Transactions on Robots and Automation, 17(5): 748-760, Oct. 2001. |
Karlsson et al., "Core Technologies for service Robotics," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 3, pp. 2979-2984, Sep. 2004. |
Karlsson et al., "The vSLAM Algorithm for Robust Localization and Mapping," Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005. |
King and Weiman, “HelpmateTM Autonomous Mobile Robots Navigation Systems,” SPIE vol. 1388 Mobile Robots, pp. 190-198, 1990. |
Kleinberg, The Localization Problem for Mobile Robots, Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994. |
Knights, et al., “Localization and Identification of Visual Landmarks,” Journal of Computing Sciences in Colleges, 16(4):312-313, May 2001. |
Kolodko et al., “Experimental System for Real-Time Motion Estimation,” Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003. |
Komoriya et al., "Planning of Landmark Measurement for the Navigation of a Mobile Robot," Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC, pp. 1476-1481, Jul. 1992. |
Krotov et al., "Digital Sextant," Downloaded from the internet at: http://www.cs.cmu.edu/~epk/, 1 page, 1995. |
Krupa et al., "Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing," IEEE Transactions on Robotics and Automation, 19(5):842-853, Oct. 2003. |
Kuhl et al., “Self Localization in Environments using Visual Angles,” VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004. |
Lambrinos et al., "A mobile robot employing insect strategies for navigation," Retrieved from the Internet: URL<http://www8.cs.umu.se/kurser/TDBD17/VT04/d1/Assignment%20Papers/lambrinos-RAS-2000.pdf>. 38 pages, Feb. 1999. |
Lang et al., “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994. |
Lapin, “Adaptive position estimation for an automated guided vehicle,” SPIE, vol. 1831 Mobile Robots VII, pp. 82-94, 1992. |
LaValle et al., “Robot Motion Planning in a Changing, Partially Predictable Environment,” 1994 IEEE International Symposium on Intelligent Control, Columbus, OH, pp. 261-266, Aug. 1994. |
Lee et al., “Development of Indoor Navigation system for Humanoid Robot Using Multi-sensors Integration”, ION NTM, San Diego, CA pp. 798-805, Jan. 2007. |
Lee et al., “Localization of a Mobile Robot Using the Image of a Moving Object,” IEEE Transaction on Industrial Electronics, 50(3):612-619, Jun. 2003. |
Li et al., “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar,” Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999. |
Lin et al., “Mobile Robot Navigation Using Artificial Landmarks,” Journal of robotics System, 14(2): 93-106, 1997. |
Lumelsky et al., “An Algorithm for Maze Searching with Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA vol. 1, pp. 111-116, 1994. |
Luo et al., “Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots,” IEEE, pp. 2359-2364, 2002. |
Madsen et al., “Optimal landmark selection for triangulation of robot position,” Journal of Robotics and Autonomous Systems, vol. 13 pp. 277-292, 1998. |
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591, pp. 25-30, Oct. 23, 2005. |
Matsutek Enterprises Co. Ltd, “Automatic Rechargeable Vacuum Cleaner,” http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10 . . . , Apr. 2007, 3 pages. |
McGillem et al., "Infra-red Location System for Navigation and Autonomous Vehicles," 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 1988. |
McGillem et al., "A Beacon Navigation Method for Autonomous Vehicles," IEEE Transactions on Vehicular Technology, 38(3):132-139, Aug. 1989. |
Miro et al., "Towards Vision Based Navigation in Large Indoor Environments," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems. |
MobileMag, “Samsung Unveils High-tech Robot Vacuum Cleaner,” Retrieved from the Internet: URL<http://www.mobilemag.com/content/100/102/C2261/>. 4 pages, Mar. 2005. |
Monteiro et al., “Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters,” Proceedings of the IECON '93., International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 1993. |
Moore et al., "A simple Map-based Localization strategy using range measurements," SPIE, vol. 5804, pp. 612-620, 2005. |
Munich et al., “ERSP: A Software Platform and Architecture for the Service Robotics Industry,” Intelligent Robots and Systems, 2005. (IROS 2005), pp. 460-467, Aug. 2005. |
Munich et al., “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006. |
Nitu et al., "Optomechatronic System for Position Detection of a Mobile Mini-Robot," IEEE Transactions on Industrial Electronics, 52(4):969-973, Aug. 2005. |
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 2001. |
Pirjanian et al., “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106, Nov. 1999. |
Pirjanian et al., “Distributed Control for a Modular, Reconfigurable Cliff Robot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002. |
Pirjanian et al., “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes,” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430, Apr. 1997. |
Pirjanian et al., “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination,” Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000. |
Prassler et al., “A Short History of Cleaning Robots,” Autonomous Robots 9, 211-226, 2000, 16 pages. |
Remazeilles et al., “Image based robot navigation in 3D environments,” Proc. of SPIE, vol. 6052, pp. 1-14, Dec. 2005. |
Rives et al., “Visual servoing based on ellipse features,” SPIE, vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993. |
Ronnback, “On Methods for Assistive Mobile Robots,” Retrieved from the Internet: URL<http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html>. 218 pages, Jan. 2006. |
Roth-Tabak et al., “Environment Model for mobile Robots Indoor Navigation,” SPIE, vol. 1388 Mobile Robots, pp. 453-463, 1990. |
Sahin et al., “Development of a Visual Object Localization Module for Mobile Robots,” 1999 Third European Workshop on Advanced Mobile Robots, (Eurobot '99), pp. 65-72, 1999. |
Salomon et al., “Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing,” IEEE Conference on Emerging Technologies and Factory Automation, 2006. (ETFA '06), pp. 629-632, Sep. 2006. |
Sato, “Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter,” Proceedings International Conference on Image Processing, vol. 1., Lausanne, Switzerland, pp. 33-36, Sep. 1996. |
Schenker et al., “Lightweight rovers for Mars science exploration and sample return,” Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997. |
Sim et al, “Learning Visual Landmarks for Pose Estimation,” IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 1999. |
Sobh et al., “Case Studies in Web-Controlled Devices and Remote Manipulation,” Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 2002. |
Stella et al., “Self-Location for Indoor Navigation of Autonomous Vehicles,” Part of the SPIE conference on Enhanced and Synthetic Vision SPIE vol. 3364, pp. 298-302, 1998. |
Summet, “Tracking Locations of Moving Hand-held Displays Using Projected Light,” Pervasive 2005, LNCS 3468, pp. 37-46, 2005. |
Svedman et al., “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping,” 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2993-2998, 2005. |
Teller, “Pervasive pose awareness for people, Objects and Robots,” http://www.ai.mit.edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 6 pages, Apr. 2003. |
Terada et al., “An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning,” 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 1998. |
The Sharper Image, eVac Robotic Vacuum—Product Details, www.sharperiamge.com/us/en/templates/products/pipmoreworklprintable.jhtml, 1 page, Mar. 2005. |
TheRobotStore.com, “Friendly Robotics Robotic Vacuum RV400—The Robot Store,” www.therobotstore.com/s.nl/sc.9/category.-109/it.A/id.43/.f, 1 page, Apr. 2005. |
Thrun, Sebastian, “Learning Occupancy Grid Maps With Forward Sensor Models,” Autonomous Robots 15, 28 pages, Sep. 1, 2003. |
Trebi-Ollennu et al., “Mars Rover Pair Cooperatively Transporting a Long Payload,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 3136-3141, May 2002. |
Tribelhorn et al., “Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education,” IEEE, pp. 1393-1399, 2007. |
Tse et al., “Design of a Navigation System for a Household Mobile Robot Using Neural Networks,” Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998. |
Watanabe et al., “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique,” 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 1990. |
Watts, "Robot, boldly goes where no man can," The Times, p. 20, Jan. 1985. |
Wijk et al., “Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking,” IEEE Transactions on Robotics and Automation, 16(6):740-752, Dec. 2000. |
Wolf et al., "Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carlo Localization," IEEE Transactions on Robotics, 21(2):208-216, Apr. 2005. |
Wolf et al., “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C., pp. 359-365, May 2002. |
Wong, “EIED Online>> Robot Business”, ED Online ID# 13114, 17 pages, Jul. 2006. |
Yamamoto et al., “Optical Sensing for Robot Perception and Localization,” 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 14-17, 2005. |
Yata et al., “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer,” Proceedings of the 1998 IEEE, International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998. |
Yun et al., “Image-Based Absolute Positioning System for Mobile Robot Navigation,” IAPR International Workshops SSPR, Hong Kong, pp. 261-269, Aug. 2006. |
Yun et al., “Robust Positioning a Mobile Robot with Active Beacon Sensors,” Lecture Notes in Computer Science, 2006, vol. 4251, pp. 890-897, 2006. |
Yuta et al., "Implementation of an Active Optical Range sensor Using Laser Slit for In-Door Intelligent Mobile Robot," IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS 91), vol. 1, Osaka, Japan, pp. 415-420, Nov. 3-5, 1991. |
Zha et al., “Mobile Robot Localization Using Incomplete Maps for Change Detection in a Dynamic Environment,” Advanced Intelligent Mechatronics '97. Final Program and Abstracts., IEEE/ASME International Conference, pp. 110, Jun. 1997. |
Zhang et al., “A Novel Mobile Robot Localization Based on Vision,” SPIE vol. 6279, 6 pages, Jan. 2007. |
McLurkin, “The Ants: A community of Microrobots,” Paper submitted for requirements of BSEE at MIT, May 1995, 60 pages. |
McLurkin “Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots,” Paper submitted for requirements of BSEE at MIT, May 2004, 127 pages. |
Grumet, “Robots Clean House,” Popular Mechanics, Nov. 2003, 3 pages. |
Kurs et al., "Wireless Power Transfer via Strongly Coupled Magnetic Resonances," downloaded from www.sciencemag.org, Aug. 2007, 5 pages. |
Jarosiewicz et al., "Final Report—Lucid," University of Florida, Department of Electrical and Computer Engineering, EEL 5666—Intelligent Machine Design Laboratory, 50 pages, Aug. 1999. |
Karcher Product Manual Download webpage: Retrieved from the Internet: URL<http://www.karcher.com/bta/download.en.shtml?ACTION=SELECTTEILENR&Id=rc3000&submitButtonName=Select+Product+Manual> and associated .pdf file "5959-915en.pdf (4.7 MB) English/English," 16 pages, accessed Jan. 2004. |
Karcher RC 3000 Cleaning Robot user manual, Manufacturer: Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred-Karcher-Str. 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002. |
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-usa.com/showproducts.php?op=view_prod&param1=143&param2=&param3=, 6 pages, accessed Mar. 2005. |
Chamberlin et al., “Team 1: Robot Locator Beacon System,” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 2006. |
EBay, "Roomba Timer -> Timed Cleaning- Floorvac Robotic Vacuum," Retrieved from the Internet: URL<cgi.ebay.com/ws/eBayISAPI.dll?viewitem&category=43526&item=4375198387&rd=1>. 5 pages, Apr. 2005. |
Put Your Roomba . . . On Automatic webpages: http://www.acomputeredge.com/roomba, 5 pages, accessed Apr. 2005. |
RoboMaid Sweeps Your Floors So You Won't Have to, the Official Site, website: Retrieved from the Internet: URL<http://therobomaid.com/>. 2 pages, accessed Mar. 2005. |
Schofield, "Neither Master nor Slave: A Practical Study in the Development and Employment of Cleaning Robots," Emerging Technologies and Factory Automation, 1999 Proceedings ETFA '99, 7th IEEE International Conference on, Barcelona, Spain, pp. 1427-1434, Oct. 1999. |
Doty et al., “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent,” AAAI 1993 Fall Symposium Series, Instantiating Real-World Agents, pp. 1-6, Oct. 22-24, 1993. |
Number | Date | Country | |
---|---|---|---|
20130245937 A1 | Sep 2013 | US |
Number | Date | Country | |
---|---|---|---|
60557252 | Mar 2004 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13204075 | Aug 2011 | US |
Child | 13651080 | US | |
Parent | 12780746 | May 2010 | US |
Child | 13204075 | US | |
Parent | 11090621 | Mar 2005 | US |
Child | 12780746 | US |