1. Field of Endeavor
What is disclosed herein relates to determining the position and orientation of an object.
2. Description of the Related Art
It is often necessary or useful to be able to determine the pose (location and orientation) of an object, person, pet, asset or device. Certain systems for determining the pose of an object equipped with a signal sensor are known. However, many mobile devices lack the resources necessary to run known systems themselves or are unable to implement them while performing additional tasks, and many such systems fail to effectively take into account practical issues such as manufacturing and production variances, variability of the surfaces of the area in which a mobile device operates (such as uneven terrain or floors), and complications resulting from signal interaction with environmental features such as walls or trees.
Certain embodiments discussed in this application may be utilized in conjunction with systems and methods disclosed in U.S. Pat. No. 7,720,554, filed on Mar. 25, 2005, the content of which is hereby incorporated herein in its entirety by reference.
A method for accurately estimating the pose of a mobile object in an environment, without an a priori map of that environment, is disclosed. The method generates the estimates in near real-time, compensates for the rotational variability of a signal sensor, compensates for signal multipath effects, and is statistically more accurate than relying on dead reckoning or signal detection alone. The method comprises decomposing an environment to be navigated by the mobile object into two or more cells, each of which is defined by three or more nodes. An expected measure of a background signal is determined for each of the nodes, and an expected measure of the signal at positions interior to the cell is estimated based on the expected measure at each of two or more of the cell's nodes. The actual or expected measures at the nodes need not be known a priori, because as the mobile object navigates the environment, the mobile object maps the signal measure at substantially the same time as it localizes by using, for example, an appropriate implementation of an appropriate SLAM algorithm. During an initialization process, initial values for some or all of the calibration parameters including but not limited to the rotational variability, sensor error, and the like, are optionally determined. Also obtained is a scale parameter that correlates a position or location to an expected signal measure. The initialization process makes use of data from the signal sensor as well as a motion sensor and allows for initial determination of an expected signal measure at each of the nodes of a cell. During the SLAM phase, the pose of the mobile object is estimated based on some or all of the following: data from the motion sensor, data from the signal sensor, a map of expected signal measures, the calibration parameters, and previous values for these items. If the mobile object leaves a cell defined by initialized nodes, then the initialization process may be rerun to initialize any previously uninitialized nodes of the cell the mobile object enters. Optionally, some or all of the uninitialized nodes of the cell the mobile object enters are initialized by extrapolating from nodes of cells neighboring the entered cell.
Also disclosed is a method for accurately estimating the pose of a mobile object in an environment, without an a priori map of that environment, which estimates the pose in near real-time, compensates for signal multipath effects, and is statistically more accurate than relying on dead reckoning or signal detection alone. The method comprises decomposing an environment to be navigated by the mobile object into two or more cells, each of which is defined by three or more nodes. An expected measure of a background signal is determined for each of the nodes, and the expected measure of the signal at positions proximate to those nodes is estimated based on the expected measure at each of two or more of the cell's nodes. The actual or expected measures at the nodes need not be known a priori, because as the mobile object navigates the environment, the mobile object maps the signal measure at substantially the same time as it localizes by using, for example, an appropriate implementation of an appropriate SLAM algorithm.
During an initialization process, initial values for some or all of the calibration parameters, including but not limited to rotational variability, sensor error, and the like, are optionally determined. Also obtained is a scale parameter that correlates a position to an expected signal measure. The scale parameter may become less accurate for positions (locations) not proximate to the cell or the nodes of the cell. The initialization process makes use of data from the signal sensor as well as a motion sensor and allows for initial determination of the expected signal measure at each of the nodes of a cell.
During the SLAM phase, the pose of the mobile object is estimated based on some or all of the following: data from the motion sensor, data from the signal sensor, the map of expected signal measures, the calibration parameters, and previous values for these items. If the mobile object moves or is moved so that it is not proximate to the nodes or to an earlier position, then the initialization process may be rerun to initialize any previously uninitialized nodes of the cell the mobile object enters. Optionally, some or all of the uninitialized nodes proximate to the mobile object's new position are initialized by extrapolating from previously estimated values associated with positions proximate to the uninitialized nodes.
“Proximate” is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art (i.e., it is not to be limited to a special or customized meaning) and includes, without limitation, being less than 0.25 meters, 0.5 meters, 1 meter, 5 mobile device lengths, or less than 10 mobile device lengths apart. In some embodiments, proximate may be defined relative to the size of an environment if a measure of that size is obtained (e.g., 10% or 5% of environment width). In some embodiments, proximate may be defined relative to the mobile device (e.g., the distance traveled by the mobile device in 0.5 seconds or 1 second). Poses may be proximate if their locations are proximate. Orientations may also be proximate. For example, two poses may be proximate if they differ by less than a particular amount, such as but not limited to 1, 2, 5, or 10 degrees. In some embodiments, two poses are proximate if both their locations and orientations are proximate. In other embodiments, only the locations of the poses are considered. A pose may be proximate to a location or position.
The disclosed aspects will hereinafter be described in conjunction with the appended drawings, which are provided to illustrate and not to limit the disclosed aspects. Like designations denote like elements.
Described herein are methods and systems for the localization of an object, such as a mobile object (e.g., a robotic floor cleaner). Certain embodiments may utilize such mobile object localization to navigate the mobile object. By way of illustration and not limitation, the mobile object may optionally be an autonomous, semiautonomous, or remotely directed floor cleaner (e.g., a sweeper, a vacuum, and/or a mopper), delivery vehicle (e.g., that delivers mail in a building, food in a hospital or dormitory, etc.), or monitoring vehicle (e.g., pollution or contaminant detector, security monitor), equipped with one or more drive motors which drive one or more wheels, tracks, or other such device, where the drive motors may be under control of a computing device executing a program stored in non-transitory memory (e.g., it persists when the object is powered down or when some other data is overwritten or erased).
Example embodiments will now be described with reference to certain figures. Throughout the description herein, "localization" may include determining both the position of an object in an environment and the orientation of that object. The combination of position and orientation is referred to as the "pose". Either or both of the position (or location) and orientation may be absolute (in terms of a logical reference angle and origin) or relative (to another object).
Many objects, including mobile objects, are not functionally or physically symmetrical. Knowing the orientation of such objects may be useful in determining how to navigate such objects in an environment. For example, some mobile objects can only move forward and some mobile objects may have functional components, such as vacuum ports or sweepers, at specific locations on their surface. Also, the current orientation of a mobile object may affect its future position as much as its current position does if it moves in the direction of its orientation. Thus, determining the pose of a mobile object may be of great assistance in determining how to navigate the mobile object to perform a task, such as a floor cleaning task, in an efficient manner.
For convenience, much of this disclosure is expressed in terms of localizing a "mobile device". However, the disclosed aspects may generally be used to localize other types of objects, and one of skill in the art will understand how the disclosure can be applied to objects that are not independently mobile (such as those that are transported or carried by something else) and to objects that are not devices (e.g., pets equipped with collars or humans carrying appropriately configured tags or computing devices).
Typically, when performing tasks such as vacuum cleaning, lawn mowing, delivery, elderly care, etc., an autonomous or mobile device needs to know its pose with respect to its environment in order to reach its goal or accomplish its task in an effective way. For example, toys and other devices might be intended and configured to behave in a particular manner when they are in a particular location. Even if the device itself has no additional task or goal that benefits from localization, if its pose can be determined then the location of a person or other entity carrying or otherwise attached to the device can be determined. If the relative orientations of the carrier and the device are known, then the pose of the carrier can be determined.
The methods and systems disclosed herein advance the state of the art in how the pose of an autonomous device is computed from a combination of observations of a vector field that varies over space and measurements from motion sensors such as odometers, gyroscopes, accelerometers, inertial measurement units (IMUs), or other dead-reckoning devices (generically referred to as "dead-reckoning sensors" and the output of which is generically referred to as "odometry" or "motion measurements"). Measurements (e.g., measurements of change in position or orientation) from a motion sensor may be relative to another position or may be absolute. Such measurements may include measures of location or distance (e.g., distance or direction of travel) as well as measures of object orientation (e.g., amount of rotation from a previous orientation or amount of rotation from an absolute reference). Wave or other signals emitted into an environment by an external source can create an appropriate vector field. Example methods and systems disclosed herein use a localization and mapping technique, such as a simultaneous (which may be substantially simultaneous) localization and mapping (SLAM) framework, for estimating object pose, parameters modeling rotational variability, and parameters describing the signal distribution or vector field in the environment.
Example embodiments incorporating certain disclosed aspects can localize and track a mobile device with higher accuracy than conventional methods that ignore complications such as rotational variability or multi-path effects. Some embodiments do so in a way that requires no a priori map of the environment or of the signal strength in that environment. Some disclosed embodiments can optionally do so while using relatively inexpensive amounts of computational resources such as processing power, storage, and time, such that the functionality disclosed herein can be made available in a relatively compact mobile device and/or it can be distributed in affordable mass market consumer goods, including products which perform additional functionality beyond localizing, mapping, or navigating. Pose estimates can be obtained in near real time in some such embodiments and some embodiments run in constant or substantially constant time, with storage requirements linear or near linear based on the size of the environment for a given node size (i.e., for a given node size, it is linear in the number of nodes).
U.S. Pat. No. 7,720,554 discloses, among other things, a low-cost optical sensing system for indoor localization. A beacon 160 projects a pair of unique infrared patterns or spots 180 on the ceiling 140. The beacon 160 can be placed relatively freely in the environment 110 and adjusted such that it points towards the ceiling 140. An optical signal sensor 170 measures the direction to both spots 180 on the ceiling 140. The signal sensor 170 then reports the coordinates of both direction vectors projected onto the sensor plane. These beacon spots 180 are the signal sources in an example embodiment that is used throughout this disclosure. Other embodiments may use more or fewer spots 180. Other wave signals such as those used in Wi-Fi, GPS, cellular networks, magnetic fields, sound waves, radio-frequency identification (RFID), or light can also be used. Corresponding sources include wireless routers, satellites, cell towers, coils, speakers, RFID transmitters, and projectors. For example, appropriately configured ceiling lights or speakers may be used in certain embodiments. Although the illustrated embodiment uses a dedicated projector 160 to generate the signal sources 180, in other embodiments pre-existing or off-the-shelf generators can be used. For example, in an apartment complex or a yard, a detector 170 may be configured to take advantage of the distinct Wi-Fi signals available from the various Wi-Fi routers that may be within range. Similarly, existing lights, including fixed ceiling lights, may be used with photo-sensitive sensors. Other signal sources may generate soundwaves (audible, subsonic, or ultrasonic) and the detector 170 may be configured to detect the generated waves. Thus, no or minimal modification to the environment is necessary for such embodiments to be effective. Digital signals, including those transmitted by radio and/or as used in wireless communications may also be used.
Because an indoor embodiment is used to illustrate many of the disclosed aspects, those aspects are disclosed in the context of an indoor environment. However, the disclosed aspects are not limited in this way and can operate outdoors as well as indoors.
A system that tracks the pose of a mobile device 100 equipped with a signal sensor 170 by relying, even in part, on the values reported by that sensor 170 faces a number of challenges. Typically, the signals sensed by the sensor 170 will have a different strength or value at different locations in the environment. In the illustrated scenario, the mobile device 100 moves along the ground 150 (although one of skill could readily apply what is disclosed to a mobile device that travels along a wall or ceiling, or that moves (and rotates) in three dimensions). One challenge is relating a change in the detected (sensed) signal to a change in ground position. The relationship between sensed signal and ground position is the “scale” parameter.
Another challenge stems from the construction, manufacture, or assembly of the sensor 170, performance properties of the sensor 170, and/or its association with or coupling to the mobile device 100. In some embodiments the orientation of the sensor 170 is fixed relative to the environment 110 and is independent of the rotation of the mobile device 100. For example, a gyroscopic or inertial system may be used to rotatably attach the sensor 170 to the mobile device 100 such that when the mobile device turns or rotates, the sensor rotates in a counter direction. In other embodiments the sensor 170 is rigidly affixed to or integrated with the mobile device 100 such that its orientation is substantially fixed relative to the orientation of the mobile device 100. Indeed, in this disclosure the position and orientation of the sensor 170 are presumed to be identical to that of the mobile device 100 so that, for example, "sensor 170" is used interchangeably with "device 100" when discussing pose or motion. As discussed below, this assumption simplifies the disclosure. One of reasonable skill can readily account for any fixed or calculable offset between the orientation of the sensor 170 and the device 100.
Ideally, rotation of the sensor 170 relative to the environment 110 should not affect the detected signal or should affect it in a way that depends only on the degree of rotation. For example, the direction to signal sources 180 changes when rotating the sensor 170, but the magnitude of the signal at that position is not changed. However, some sensors have directional sensitivities. For example, a Wi-Fi receiver can show changes in signal strength when the antenna rotates as a result of rotation of the device (e.g., the mobile device) on which it is mounted. Even in such a situation, the variation might be predictable and calculable. However, errors in manufacturing, misalignments in attaching the sensor on the object, uneven flooring, and the like may introduce an additional, difficult to predict, variation in the orientation of the signal sensor 170 relative to the orientation of the device 100. This may lead to seemingly unpredictable variation in the signal strength detected by the sensor 170. Thus, for example, a sensor 170 measuring bearing and elevation relative to sources 180 can show variations due to calibration errors of the sensor's vertical axis. This parameter is referred to herein as "rotational variability".
A third challenge in determining the pose of a mobile device arises from the multiple paths from the signal sources 180 to the sensor 170. In general, a sensor 170 may receive a wave signal not only directly from a source 180 but also through reflections on walls 120, 130, 135 and other stationary and non-stationary objects in the environment (e.g., furniture, trees, and humans). The direct path as well as each reflection may contribute to the signal measured on the sensor 170. This can create non-linear and seemingly arbitrary distributions of the signal throughout the environment 110. This effect is referred to herein as "multi-path".
Some embodiments of the methods and systems disclosed are configured to operate when some or all of the following conditions are met:
First, a given signal can be uniquely identified relative to other signals so that when a signal is detected at different times in an environment 110 with multiple signals, a correspondence between the signals can be maintained. For example, signals in Wi-Fi, GPS and other networks contain a unique ID as part of their data packet protocol. Active beacons, such as those disclosed in U.S. Pat. No. 7,720,554, may encode a signature (e.g., by modulating the signal, such as by modulating a light that forms light spots on a ceiling).
Second, signals are substantially continuous and change over space but optionally not in time. It should be understood that continuity does not mean that there is necessarily a one-to-one correspondence of vector of signal values to ground positions. The same measurement vector might be observed at several different locations in the environment 110 because, for example, of multi-path. Some embodiments may operate with signals that change in time, where the change over time is known or can be predicted.
Third, a dependency on orientation can be described by signal sensor orientation and rotational variability. In other words, knowing the signal values at one pose (position and orientation) enables expected signal values for other orientations at the same position to be calculated if the change in sensor orientation and any rotational variability are known.
The dead reckoning (motion) sensor 190 may include multiple instances of multiple types of dead reckoning sensors such as those mentioned above. A signal sensor 170 provides measurement vectors of the signals in the environment. The signal sensor 170 may include multiple instances of one or more types of sensing components. In some embodiments the signal sensor 170 may include one or more sensors which detect more than one type of signal (e.g., the signal sensor 170 may include both Wi-Fi sensors and light sensors). Some such embodiments may use only one signal type at a time; some such embodiments may normalize the output of the signal sensor and proceed as if there were only one type of (composite) signal being sensed; and some embodiments may extend what is disclosed below in obvious ways by using the availability of more signal sensor data to improve the filtering results.
The outputs of sensors 170, 190 are provided to a Vector Field SLAM module 220. The illustrated SLAM module 220 reads and stores information 230 about a grid of nodes. The SLAM module 220 also provides pose estimates of the mobile device 100 and map information about the signal distribution in the environment 110. These may be provided to other components for use and/or display. For example, pose estimates may be provided to a navigational component 240, which directs the mobile device 100 to move to a new location based at least in part on its current pose. They may also be provided to an alerting or action system 250 which uses the current pose as at least a partial basis for subsequent action such as cleaning. The map may be stored for future use and/or displayed for diagnostic purposes, for example.
Even though many appropriate signal sources may be present or could be made available, and although appropriate signal sensors may be configured on an embodiment, some embodiments will optionally not use GPS, not use Wi-Fi, not use direct light signals (e.g., non-reflected light from lamps or infrared sources), and/or not make use of ceiling lighting fixtures for some or all aspects of the localization process.
The processor 310 may be operatively connected to various output mechanisms such as screens or displays, light and sound systems, and data output devices (e.g., busses, ports, and wireless or wired network connections). The processor may be configured to perform navigational routines which take into account the results of the SLAM process. Executing a navigational process may result in signals being sent to various controllers such as motors (including drive motors or servomotors), brakes, actuators, etc., which may cause the mobile device 100 to move to a new pose (or to perform another activity, such as a cleaning function). The move to this new pose may, in turn, trigger additional output from the sensors to the processor, causing the cycle to continue. An example embodiment is configured with an ARM7 processor, 256K of flash ROM for software, and 64K of RAM for data. These are not minimum requirements—some or all of what is disclosed herein can be accomplished with less processing and storage capacity. Other embodiments may use different processors and different memory configurations, with larger or smaller amounts of memory.
Turning back to
The geometry of the illustrated localization system results in a linear model for position estimation in an ideal environment without multi-path signals. That is, if the sensor 170 moves one meter in one direction, the sensor coordinates change by a certain amount (depending on the scale parameter, which is proportional to the height of the ceiling 140). If the sensor 170 then moves another meter into the same direction, the sensed signals change by the same amount.
In situations such as that shown in
$\nu_{init} = (s_1, s_2, m_0)$  (1)
From these parameters, an expected signal value $h = (h_{x1}, h_{y1}, h_{x2}, h_{y2})^T$ at a sensor position $(x\ y)^T$ can be calculated as:
It is straightforward to extend this model for an arbitrary number of spots 180.
For general wave signals, a similar linear model can be chosen. In general, the following model in Equation (3) applies, where $h$ is the vector of estimated signal values for position $(x\ y)^T$, $h_0$ is the absolute offset in the sensor space, and $A_0$ is a general scale matrix.
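By way of a concrete sketch, the linear model of Equation (3) can be evaluated as follows. The numeric values of $h_0$ and $A_0$ below are invented for illustration only; in practice they would come from the initialization process described below.

```python
import numpy as np

# Illustrative sketch of Equation (3): h = h0 + A0 @ (x y)^T.
# h0 and A0 are made-up example values, not calibrated ones.
h0 = np.array([0.1, -0.2, 0.3, 0.0])   # absolute offset in sensor space
A0 = np.array([[0.5, 0.0],             # general scale matrix mapping a
               [0.0, 0.5],             # ground position to the four signal
               [0.5, 0.0],             # components (two spots 180, with x
               [0.0, 0.5]])            # and y sensor coordinates each)

def expected_signal(pos_xy):
    """Expected signal vector h = (hx1, hy1, hx2, hy2)^T at ground
    position (x y)^T under the linear (multi-path-free) model."""
    return h0 + A0 @ pos_xy

print(expected_signal(np.array([1.0, 2.0])))   # h one meter right, two up
```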
A flow chart for computing the parameters of this linear model (either Equation 2 or Equation 3) is shown in
RANSAC (Random Sample Consensus) is an iterative method to estimate the parameters of a mathematical function from sensor data that include outliers (see, e.g., M. A. Fischler and R. C. Bolles, "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography," Comm. of the ACM, Vol. 24, pp. 381-395, 1981). The RANSAC algorithm runs several iterations. In a given iteration a number of measurements are chosen at random (the term "random" as used herein encompasses pseudo random). In an embodiment using two spots 180, two signal sensor 170 readings, each containing measurements to both spots 180, are sufficient. In an example implementation, it was determined that additional sample readings per iteration did not produce a significant improvement in the results and increased the resources consumed by the RANSAC process. From the chosen measurements the parameter values are determined by solving the set of equations arising from placing the chosen measurements into the mathematical model, Equation (2). More generally, Equation (3) may be used. The computed parameters are then evaluated using some or all available sensor data, optionally including dead reckoning data. This usually computes a score such as the number of inliers or the overall residual error. After completing the desired number of iterations, the parameter values with a score meeting certain criteria (e.g., the best score) are chosen as the final parameters.
Embodiments may use variations of RANSAC or alternatives to it.
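A minimal sketch of such a RANSAC loop over the general linear model of Equation (3) follows. It is illustrative only: the iteration count and inlier tolerance are invented, and it fits the unstructured model (which needs three readings per minimal sample, whereas the structured two-spot model of Equation (2) needs only two, as noted above).

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_linear_model(P, H):
    """Least-squares fit of h = h0 + A0 @ p from ground positions P (k x 2)
    and signal readings H (k x 4); returns (h0, A0)."""
    X = np.hstack([np.ones((len(P), 1)), P])       # design matrix [1, x, y]
    coef, *_ = np.linalg.lstsq(X, H, rcond=None)   # (3 x 4) coefficients
    return coef[0], coef[1:].T                     # h0 (4,), A0 (4 x 2)

def ransac(P, H, iters=50, tol=0.05):
    """Keep the parameters with the most inliers over random minimal fits."""
    best, best_inliers = None, -1
    for _ in range(iters):
        idx = rng.choice(len(P), size=3, replace=False)  # minimal sample
        h0, A0 = fit_linear_model(P[idx], H[idx])
        resid = np.linalg.norm(H - (h0 + P @ A0.T), axis=1)
        inliers = int(np.sum(resid < tol))               # score = inlier count
        if inliers > best_inliers:
            best, best_inliers = (h0, A0), inliers
    return best

# Synthetic demo: 40 readings, the first 5 corrupted as outliers.
P = rng.uniform(0.0, 4.0, size=(40, 2))
true_h0 = np.array([0.1, -0.2, 0.3, 0.0])
true_A0 = np.array([[0.5, 0.0], [0.0, 0.5], [0.5, 0.1], [0.0, 0.5]])
H = true_h0 + P @ true_A0.T + 0.01 * rng.standard_normal((40, 4))
H[:5] += 2.0
h0_est, A0_est = ransac(P, H)
```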
Illustrative examples of the parameters used during initialization are presented below, in the discussion of GraphSLAM.
Once the initialization process is complete or the parameters for the relevant equation are otherwise determined, one or more algorithms for accounting for noisy sensors and dead-reckoning drift can be used to implement a system to effectively track the pose of the mobile device 100 with more accuracy, in less time, and/or with lower computational resource requirements than many conventional methods. Examples of such algorithms include the Kalman Filter, the Extended Kalman Filter (EKF), the Invariant Extended Kalman Filter (IEKF), and the Unscented Kalman Filter (UKF). However, the ability of these filters to effectively track pose after the initialization process of
As discussed above, multi-path occurs when the wave signal reaches the signal sensor 170 not only directly but also in other ways, such as by reflecting from nearby objects or walls (e.g., the right wall 130 in
However, if the expected signal strength at a particular location is known, then signal strength measurements can still be used for localization in a multi-path environment via, for example, a Bayesian localization framework such as an EKF. In an example embodiment, by way of illustration, a piece-wise linear approximation (pieces are illustrated in
The second challenge mentioned was rotational variability. When turning a sensor 170 in place, the measurements of the observed vector signal can change. This is the rotational variability of the sensor 170. For example, a sensor 170 in an embodiment using spots 180 outputs (x y) coordinates of the center of a spot 180 on the sensor plane. The (x y) coordinates essentially are a vector representing bearing and elevation to the spot 180. Ideally, as the sensor 170 rotates in place, only the bearing should change—the elevation should stay constant. In practice, however, elevation changes (usually, but not always, by a relatively small amount) due to variations in manufacturing, calibration errors, or misalignments in mounting the sensor 170 on the mobile device 100.
For example,
The coordinate $h_{x1}$ of spot 181 is equal to the tangent of $\beta$ and is measured by:
The rotational variability is modeled by an offset in $\beta$ that changes with the orientation of the sensor 170 such that Equation (5) holds, where $\beta'$ is the angle to the ideal axis of rotation perpendicular to the ground plane and $\beta_\epsilon$ is the angular error that changes with rotation.
$\beta = \beta' + \beta_\epsilon$  (5)
Inserting (5) in (4) and applying the rule of the tangent of the sum of angles yields:
Since $\beta_\epsilon$ is small, $\tan \beta_\epsilon$ is approximated by:
Substituting (7) into (6) yields:
For elevation angles $\beta'$ that are much less than 90°, $1 - \beta_\epsilon \tan \beta'$ is approximated as 1, yielding Equation (9), where $c_x$ is the rotational variance on the x axis depending on the orientation of the signal sensor 170.
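Equations (4) and (6)-(9) are not reproduced in the text above. A hedged reconstruction from the stated derivation steps (tangent of a sum, the small-angle approximation for $\beta_\epsilon$, and approximating the denominator by 1), identifying $c_x$ with the small angular error, is:

```latex
% Reconstruction of the omitted steps; hedged, since the original
% equations are not reproduced in the text.
h_{x1} = \tan\beta                                              \quad (4)
h_{x1} = \tan(\beta' + \beta_\epsilon)
       = \frac{\tan\beta' + \tan\beta_\epsilon}
              {1 - \tan\beta' \tan\beta_\epsilon}               \quad (6)
\tan\beta_\epsilon \approx \beta_\epsilon                       \quad (7)
h_{x1} \approx \frac{\tan\beta' + \beta_\epsilon}
                    {1 - \beta_\epsilon \tan\beta'}             \quad (8)
h_{x1} \approx \tan\beta' + \beta_\epsilon = \tan\beta' + c_x   \quad (9)
```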
For the y axis of the sensor 170 another bias term $c_y$ is derived in an analogous way. Together both parameters form the vector $c$ of rotational variability.
Since the direction β to the spots 180 can be arbitrary, the parameters for rotational variability are substantially independent of where the spots 180 are located. All spots 180 may therefore share substantially the same parameters.
Similar and analogous results can be obtained for other signal sources and sensor types. Rotational variability is not limited to the illustrated embodiment. Other sensors 170 that measure bearing to signal sources 180 can show similar effects when the vertical axis of the sensor 170 is slightly misaligned or the sensor 170 otherwise rotates around an axis different from the ideal one. For example, antennas for radio or other wireless communication can show slight changes in the received signal when they rotate. Thus, an optional useful model of the way the vector of signal values changes on rotation of the sensor 170 is a function that only depends on the orientation of the signal sensor 170 and parameters describing the rotational variability of the signal sensor 170.
The measurements were then rotated back and drawn in a common reference frame.
Changes in the pitch or angle of the mobile device relative to the surface it is traversing can also cause or contribute to rotational variability. For example, uneven floors or ground such as might result from rolling terrain, general bumpiness, twigs or branches, brickwork, and the like can cause the pitch of the mobile device to change. In some embodiments, rotational variability due to change in pitch is monotonic, although it complements rotational variability due to manufacturing and other sources. At least some rotational variability due to changes in pitch may be accounted for using the methods described herein. For example, changes in pitch of less than 3, 5, or 7 degrees (or other pitches) may be accommodated by some embodiments without modification to what is disclosed herein.
Although there is significant signal distortion, it has been determined that the error is systematic and continuous. This allows modeling the nature of the signal using non-linear systems. An example embodiment approximates the non-linearity caused by multi-path through the use of piece-wise linear functions. This example technique is described below in greater detail. Other approximations, e.g., using Splines (piecewise polynomial (parametric) curves which may be used to approximate complex shapes using curve fitting) or Nurbs (non-uniform rational basis splines, which are mathematical models which may be used to generate and represent surfaces and curves) may also be used and may provide more accurate representations of the non-linear signal distortion. However, experimentation with certain embodiments has indicated that the use of bi-linear interpolation results in faster processes and produces sufficiently good results in embodiments that have limited computational resources. Embodiments with more computational resources or those with relaxed time constraints may beneficially use other representations, including Splines or Nurbs.
In some embodiments, localization of a mobile device 100 equipped with a signal sensor 170 is performed by learning the signal distribution in the environment 110 while at the same time (or at substantially the same time) localizing the mobile device 100. This is known as simultaneous localization and mapping (SLAM). As discussed above, in the following it is assumed that the pose of the mobile device 100 and the signal sensor 170 are substantially identical. In some embodiments they are not, and it is straightforward to add, for example, a fixed coordinate transformation between the two poses. However, assuming pose identity facilitates understanding of the various disclosed aspects.
In SLAM, a device moves through a time series of poses $x_0 \ldots x_T$, $x_t = (x, y, \theta) \in SE(2)$, in an environment (e.g., room 110) containing $N$ map features $m_1 \ldots m_N$, $m_i \in \mathcal{M}$. Here $SE(2)$ is the space of poses in the 2-dimensional plane and $\mathcal{M}$ the space of the map features. Without loss of generality, $x_0 = (0, 0, 0)^T$. At each time step $t = 1 \ldots T$ the system receives a motion input $u_t$ (e.g., odometry from dead reckoning sensors 190) with covariance $R_t$ and a measurement $z_t$ (e.g., of signal strength from signal sensors 170) with covariance $Q_t$.
The motion input $u_t$ is measured, for example, by motion sensors 190 on the mobile device 100 and describes the change in pose of the sensor 170 from time step $t-1$ to $t$. As mentioned above, in certain embodiments the motion input may be provided by external sensors or a combination of internal and external sensors. The input vector $u_t$ is associated with a covariance $R_t$ that models the accuracy of the pose change. Typical motion sensors 190 include wheel encoders, gyroscopes, accelerometers, IMUs and other dead-reckoning systems. A motion model defined by a function $g$ describes the motion of the device 100 since the previous time step, where $e_u$ is a zero mean error with covariance $R_t$:
$x_t = g(x_{t-1}, u_t) + e_u$  (11)
An example of input $u_t$ is a forward translation $d$ followed by a rotation $\alpha$: $u_t = (d\ \alpha)^T$. Equation (11) then resolves into the following form:
For those skilled in the art it is straightforward to substitute different motion models $g$ and input vectors $u_t$ depending on the geometry of the mobile device 100 and available motion sensors 190. The systems and methods disclosed herein apply regardless of the motion model.
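Since Equation (12) is not reproduced above, the following is a hedged sketch of its common unicycle resolution for this input: drive forward by $d$ along the current heading, then rotate in place by $\alpha$.

```python
import numpy as np

def g(x_prev, u):
    """Motion model of Eq. (11) for input u = (d, alpha)^T.
    A standard reconstruction of Eq. (12), which is not reproduced in
    the text: translate by d along the heading, then rotate by alpha."""
    x, y, theta = x_prev
    d, alpha = u
    return np.array([x + d * np.cos(theta),
                     y + d * np.sin(theta),
                     theta + alpha])

# Chaining g from x0 = (0, 0, 0)^T yields an initial pose sequence, as
# used later for GraphSLAM's initial estimate (state 1210).
x = np.zeros(3)
for u in [np.array([1.0, 0.0]), np.array([1.0, np.pi / 2])]:
    x = g(x, u)
print(x)   # pose after two motion inputs
```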
When the signal sensor 170 on the mobile device 100 obtains a new reading $z_t$ of the wave signals, the SLAM system uses a sensor model to predict the observation. As in the case of motion, the sensor reading $z_t$ is associated with a covariance $Q_t$ modeling the accuracy of the measurement. The sensor model is defined by a function $h$ that predicts an observation given the sensor 170 pose at time step $t$ and map features as in Equation (13), where $e_z$ is a zero mean error with covariance $Q_t$. The sensor model $h$ depends on the map features and the available signal sensor 170 in the mobile device 100. In early SLAM applications such as those described in Thrun et al. [2005, Chapter 10], map features are landmarks and the sensor model $h$ computes bearing and distance to them. The systems and methods disclosed herein optionally use a very different approach: some or all of the features are signal values at predetermined or fixed locations, and few or none of the features are landmarks in the environment. The expected values of wave signals at a given device 100 pose are computed by $h$ as follows.
$z_t = h(x_t, m_1 \ldots m_N) + e_z$  (13)
In SLAM it is possible to include in the sensor model calibration parameters like those describing rotational variability of the sensor 170. The SLAM algorithm then not only estimates device pose and map features, but also estimates the calibration parameters. All calibration parameters are summarized in a vector $c$. The size of this vector depends on the sensor 170. For example, in an embodiment using the reflection from spots of modulated light created by a projector 160 as the signal sources 180, the calibration parameters include the two bias constants $(c_x, c_y)$ in Equation (10). The observation model in Equation (13) then includes this parameter:
$z_t = h(x_t, c, m_1 \ldots m_N) + e_z$  (14)
Embodiments also learn the vector field generated by M signals over the environment. This vector field can mathematically be described as a function that maps a ground pose to a vector of M signal values.
$VF : SE(2) \to \mathcal{M}$  (15)
Since signals are independent of sensor 170 orientation (per the preferences set forth above), the space of poses $SE(2)$ can be decomposed into position and orientation. The vector field over position is then modeled as a piece-wise linear function by laying a regular grid of node positions $b_i = (b_{i,x}, b_{i,y})^T$, $i = 1 \ldots N$ onto the ground 150 (or onto whatever surface the mobile device 100 is traversing). This creates rectangular cells with one node at each of the cell's four corners. Each node $i$ holds a vector $m_i \in \mathcal{M}$ describing the expected signal values when placing the sensor at $b_i$ and pointing at a pre-defined direction $\theta_0$. Returning to the running example of signal sources 180 being spots of modulated light, the vector $m_i$ holds four values, the coordinates of both spots 180: $m_i = (m_{i,x1}, m_{i,y1}, m_{i,x2}, m_{i,y2})^T$.
The spacing of cells in the regular grid defines the granularity and precision with which the wave-signal distribution in the environment 110 is modeled. A finer spacing leads to more cells, yielding better precision but requiring more memory. A coarser spacing results in fewer cells, requiring less memory but at the possible cost of precision. The exact parameter for the cell size depends on the environment, mobile device, and the application. For the purpose of covering an environment 110 with reasonable precision (e.g., for systematic floor cleaning), the cell size could be 0.5 m to 2 meters for a system using spots of frequency modulated light as signal sources 180 in an environment with a ceiling height of 2.5 to 5 meters.
For an arbitrary sensor position with orientation $\theta_0$, the expected signal values are computed by bilinear interpolation from the nodes of a cell (e.g., the four nodes) containing the sensor position. Such a cell is illustrated in
The expected signal values at $(x, y)$ with orientation $\theta_0$ are then computed as Equation (16), where $m_{i_0}$, $m_{i_1}$, $m_{i_2}$ and $m_{i_3}$ are the signal values at the four cell nodes and $w_0$, $w_1$, $w_2$ and $w_3$ are the weights of the bilinear interpolation computed as Equation (17).
The final expected signal values are computed by taking into account sensor orientation $\theta$ and the parameters $c$ describing the rotational variability of the sensor 170:
$h(x_t, c, m_1 \ldots m_N) = h_R(h_0(x, y, m_1 \ldots m_N), \theta, c)$.  (18)
Here $h_R$ is a continuous function that transforms the interpolated signal values obtained through Eq. (16) by the sensor orientation and rotational variability. This is usually a rotation by orientation $\theta$ followed by a correction with the rotational variability $c$. In the running example, turning the sensor 170 in place causes the spot 181 coordinates to change according to the rotation angle $\theta$ but in the opposite direction. The rotational component $h_R$ therefore becomes Equation (19), where $(h_{x1}, h_{y1}, h_{x2}, h_{y2})$ is the output vector of Equation (16). It is also possible to formulate the equations for a variable number of spots 180 since the components in Equations (16)-(19) are not correlated between spots 180. Similar equations can be readily obtained for other signal sources.
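The following sketch combines Equations (16)-(19) for the two-spot example. It is an assumption-laden illustration: the node ordering in the weights of Equation (17) and the exact form of the rotational correction (adding $c$ to each rotated spot coordinate pair) are inferred from the surrounding text rather than reproduced from it.

```python
import numpy as np

def bilinear_weights(x, y, cell):
    """Weights w0..w3 in the spirit of Eq. (17) for position (x, y) in a
    rectangular cell (x0, y0, x1, y1); node order assumed to be
    (x0,y0), (x1,y0), (x0,y1), (x1,y1)."""
    x0, y0, x1, y1 = cell
    tx, ty = (x - x0) / (x1 - x0), (y - y0) / (y1 - y0)
    return np.array([(1-tx)*(1-ty), tx*(1-ty), (1-tx)*ty, tx*ty])

def h0(x, y, cell, m):
    """Eq. (16): interpolate the four node vectors m (4 x 4, one row per
    node, each row (hx1, hy1, hx2, hy2) at reference orientation theta0)."""
    return bilinear_weights(x, y, cell) @ m

def h_R(h, theta, c):
    """Eq. (19) sketch: rotate each spot coordinate pair opposite to the
    sensor orientation theta, then correct with rotational variability c."""
    ct, st = np.cos(-theta), np.sin(-theta)
    R = np.array([[ct, -st], [st, ct]])
    return np.concatenate([R @ h[0:2] + c, R @ h[2:4] + c])

cell = (0.0, 0.0, 1.0, 1.0)                    # a 1 m x 1 m cell
m = np.tile([0.1, -0.2, 0.3, 0.0], (4, 1))     # toy node values
print(h_R(h0(0.3, 0.7, cell, m), np.pi / 4, np.array([0.01, -0.02])))
```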
It is possible to apply more complex schemes for predicting the sensor signal that use more than only the four nodes of the current cell. A cell with fewer nodes could also be used. In another embodiment, the function in Equation (16) is evaluated for the current and several neighboring cells and then a weighted mean of them is computed as the final result. The weights are taken as the mass of probability of the current position estimate that falls into each cell. The weight of a given cell is a function of the probability that the sensor or mobile device is within this cell. This probability can be derived from the current mobile device pose and associated uncertainty as it is computed by the localization filter.
The above understandings and equations enable the application of a SLAM algorithm for estimating device path, rotational variability, and/or the signal values at the node positions. Optionally, full SLAM and/or on-line SLAM may be used.
In full SLAM, the complete trajectory of the device 100, rotational variability of the sensor 170, and/or some or all map features are computed. For example, the state that is estimated is:
One algorithm that computes an estimate of Y is GraphSLAM, which is used in some embodiments and is described in more detail below.
In contrast, on-line SLAM estimates the current pose and some or all map features at each time step $t = 1 \ldots T$. The state estimated at each time step $t$ is:
There are several algorithms that estimate $y_t$ over time. Examples using EKF-SLAM, EIF-SLAM and ESEIF-SLAM are described below. Embodiments may use any of the described full SLAM or on-line SLAM algorithms, as well as other algorithms. Some embodiments can be configured to use a particular SLAM algorithm depending on, for example, a user's preference, the computational resources available, and other operational constraints.
GraphSLAM is a non-linear optimization method for estimating the state vector in Equation 20 by finding the values in Y that best explain the sensor and motion data from sensors 170 and 190. GraphSLAM estimates Y as the solution to a non-linear least squares problem in finding the minimum of the following objective function where the quantities are defined as described before:
An example implementation of GraphSLAM is illustrated in
The linear equation system may optionally be solved during optimization state 1230 using Conjugate Gradient, since the system is usually sparse and positive definite.
For providing an initial estimate of the state vector in state 1210, the following method can be used. First, the initial device poses $x_1 \ldots x_T$ are computed from $x_0 = (0, 0, 0)^T$ by iteratively applying the motion model in (11) for each $t = 1 \ldots T$. Second, the initial rotational variability is $c = \hat{c}$, where $\hat{c}$ is a rough guess about the values of rotational variability that depend on the sensor 170. In the running example, some embodiments use $\hat{c} = (0, 0)^T$ because the rotational variability is usually small. The initial node values $m_i$ are computed from Equations (1) and (2). For example, the parameters in Equation (1) are computed by applying RANSAC over a short initial sequence, as discussed above. The node values $m_i$ are then obtained from the node position $b_i$ through Equation (2).
The short initial sequence typically contains a minimum or relatively low number of sensor samples (e.g., 2 to 50) while the mobile device 100 moves a certain distance. This distance is usually proportional to the chosen cell size such that enough samples are available that cover a reasonable fraction of the cell. For example, for a cell size of 1 meter, the distance threshold may be selected within the range of 0.5 m to 1 meter. More generally, some embodiments may be configured to travel a distance of ⅓ to ⅔ of the cell size. This distance may also depend on the size of the mobile device 100: typically, larger mobile devices should travel further during the initialization phase. Optionally, a given sample is spaced a minimum distance from an adjacent sample. This distance may be determined based on a dynamically configured initialization travel distance and sample count, for example. It may also be fixed a priori so that samples are taken after every half second of travel or after every 10 centimeters of travel, for example, although other time periods and distances may be used.
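An illustrative gate for collecting that short initial sequence is sketched below. The thresholds, stream format, and sensor interface are assumptions for illustration, not taken from the disclosure.

```python
def collect_initial_samples(odometry_stream, signal_sensor,
                            cell_size=1.0, min_spacing=0.10):
    """Collect (pose, signal) samples spaced at least min_spacing apart
    until roughly 2/3 of the cell size has been traversed. The stream is
    assumed to yield (pose, distance_moved_since_last) tuples and the
    sensor to expose a read() method; both are hypothetical interfaces."""
    samples, traveled, since_last = [], 0.0, 0.0
    for pose, step_dist in odometry_stream:
        traveled += step_dist
        since_last += step_dist
        if since_last >= min_spacing:            # keep samples spaced apart
            samples.append((pose, signal_sensor.read()))
            since_last = 0.0
        if traveled >= (2.0 / 3.0) * cell_size:  # within the 1/3-2/3 range
            break
    return samples
```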
GraphSLAM may be implemented as a batch method since the motion and sensor data need to be available when computing the non-linear optimization. Furthermore, the amount of computation is significant. These constraints may make it difficult to use GraphSLAM in certain embedded systems with limited computational resources, such as if the mobile device 100 is a conventional vacuum cleaner or other consumer product. GraphSLAM is nevertheless useful as a baseline algorithm for computing the best possible result given the sensor data and a chosen model. For example, it can be used during the development of products or selectively run when computational resources are available to check the performance of other methods. Further, there are certain embodiments of product mobile devices where there are sufficient computational and memory resources to utilize GraphSLAM.
One such method for state estimation used by some embodiments is an Extended Kalman Filter (EKF). The EKF is a non-linear variant of the Kalman Filter (KF). EKF-SLAM is an on-line SLAM method. The state vector contains the current pose of the device 100 but not older or future poses (or estimates thereof). Furthermore, the size of the state grows as the mobile device 100 moves in the environment 110. Initially the state contains only device pose, rotational variability and the node estimates of the 4 nodes of the initial cell.
As the mobile device 100 moves around and visits further cells, the system grows by augmenting the state vector with further nodes. After t time steps and visiting cells with a total of n nodes the state becomes:
The EKF computes an estimate of this state by maintaining mean and covariance modeling a Gaussian distribution over the state.
$y \sim N(\mu, \Sigma)$  (25)
The initial mean is set to Equation (26), where $\hat{c}$ is a rough guess/estimate of the rotational variability of the sensor 170 and $\hat{m}_1 \ldots \hat{m}_4$ are initial values of the four nodes obtained from sensor data of a short initial sequence as described before using Equations (1) and (2). Again, in a sample embodiment using spots 180, the initial rotational variability can be set to $\hat{c} = (0, 0)^T$.
The initial covariance is a diagonal matrix where the vehicle uncertainty is set to 0 and the uncertainties of rotational variability and the four initial nodes are infinite. For implementation on a computer, ∞ can be replaced by a large number.
On object motion $u_t$ with covariance $R_t$, EKF-SLAM updates the state as Equations (28) and (29), where $f$ extends the motion model $g$ over all state variables and $F_y$ is its Jacobian with respect to the state per Equations (30)-(31).
When a new sensor observation $z_t$ with covariance $Q_t$ is taken, the system determines the current cell, i.e., the cell in which the mean estimate of the current device pose $\hat{x}_t$ falls, and then updates the mean and covariance of the state.
In general the current cell at time t can be:
In the first case no changes are required to the state vector and the system can continue updating mean and covariance as described further below.
In the second and third cases, nodes not yet present in the state vector need to be added by augmenting the state with the new nodes. In general, adding a node to the state vector containing $n$ nodes is achieved by Equations (32) and (33), where $\hat{m}_{n+1}$ and $M_{n+1}$ are the mean and covariance of the new node. This mean and covariance can be computed from nodes already contained in the state vector by linear extrapolation per Equations (34) and (35), where $A_i$, $i = 1 \ldots n$ are matrices weighting the contribution of each node in the extrapolation, $M$ is the covariance over all nodes, and $S$ is additional noise for inflating the new covariance to allow the new node to vary in order to accommodate the non-linear structure of the wave signal. In some embodiments and in certain scenarios, the vector field changes slowly over space (i.e., the signal is relatively constant). Thus, in such embodiments, change between adjacent nodes is limited and extrapolation might degenerate into a linear model. Some embodiments introduce a smaller $S$ in such circumstances, and some embodiments introduce a larger $S$ if the vector field is known or predicted to change more rapidly over space.
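A hedged sketch of this augmentation in moment (mean/covariance) form follows. The exact block structure of the weighting matrices in Equations (34)-(35) is not reproduced above, so here a single matrix A maps the full state to the new node's values and is assumed to be zero outside the contributing nodes.

```python
import numpy as np

def augment_with_node(mu, Sigma, A, S):
    """Sketch of Eqs. (32)-(35): append a new node whose mean is a linear
    combination of existing estimates, and whose covariance is the
    corresponding propagation plus the inflation term S.
    A has shape (node_dim, state_dim); S has shape (node_dim, node_dim)."""
    m_new = A @ mu                      # extrapolated mean of the new node
    cross = A @ Sigma                   # cross-covariance with existing state
    M_new = cross @ A.T + S             # inflated covariance of the new node
    mu_aug = np.concatenate([mu, m_new])
    Sigma_aug = np.block([[Sigma, cross.T],
                          [cross, M_new]])
    return mu_aug, Sigma_aug
```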
The initialization of a new node is graphically illustrated in
The extrapolation is such that the midpoint between the spots 180 is used for extrapolation. The orientation of the line between the two new spot estimates is adopted from the closer one. This has the effect that changes in orientation are not propagated when initializing new nodes.
Some embodiments optionally only consider cases where a new node can be initialized from a pair of the 8 directions. In case there are several possible candidates, an embodiment may choose the one with the smallest resulting covariance $M_n$. For comparing covariances, the matrix determinant, the trace of the matrix, its Frobenius norm, or other norms can be used.
If there are no neighbors for initialization, some embodiments discard the sensor observation. Such a situation may occur, for example, when the mobile device 100 travels over a full cell without any sensor 170 observations and then arrives in a cell where none of the four nodes are yet part of the state vector (scenario 3, above). In this scenario, the utility of the new observation for localization may be minimal. Nonetheless, some embodiments may still initialize a new node by linear combinations of other nodes in the state vector using Equations (34) and (35). Some embodiments may optionally only use the motion updates (e.g., the odometry from the dead reckoning sensors 190) of the mobile device 100 and wait until the device 100 returns to an existing cell or to a cell that can be initialized. Another approach is to start over and re-initialize the system from the current pose.
Once the state vector contains elements for all nodes of the current cell, the mean and covariance are updated with the measurement $z_t$ and its covariance $Q_t$ by application of the EKF equations per Equations (37)-(40), where $h(y_t)$ is the sensor model defined in Eq. (18), $H_y$ the Jacobian of the sensor model, and $K$ the Kalman gain.
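These are the standard EKF predict and update steps. A generic sketch, with f, F_y, h, and H_y passed in as callables matching Equations (28)-(31) and (37)-(40):

```python
import numpy as np

def ekf_predict(mu, Sigma, u, R_t, f, F_y):
    """Motion update in the spirit of Eqs. (28)-(29): propagate the mean
    through f and the covariance through the Jacobian F_y of f."""
    F = F_y(mu, u)
    return f(mu, u), F @ Sigma @ F.T + R_t

def ekf_update(mu, Sigma, z, Q_t, h, H_y):
    """Measurement update in the spirit of Eqs. (37)-(40): Kalman gain K,
    then correction of mean and covariance with the innovation z - h(mu)."""
    H = H_y(mu)
    K = Sigma @ H.T @ np.linalg.inv(H @ Sigma @ H.T + Q_t)
    return mu + K @ (z - h(mu)), (np.eye(len(mu)) - K @ H) @ Sigma
```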
A flow chart of the EKF-SLAM method for object localization is shown in
In general, EKF-SLAM has the advantage that it is an on-line method, integrating motion/odometry and signal sensor measurements as they appear. The most computationally expensive operation is the update of the covariance matrix on sensor update in Eq. (38), state 1550. This involves the update of large numbers (e.g., all) of the matrix elements, an operation that takes time quadratic in the number of nodes in the state.
In general, the covariance Σt is fully correlated. That is, there are few, if any, elements that are zero. This typically requires holding the full matrix in a data memory, which may limit the applicability of the method for embedded systems or other environments if there are overly limited memory resources.
An additional step in the EKF as well as in other filters is outlier rejection. In the case where measurements are received that seem implausible, the filter rejects these measurements. This may be accomplished by not updating the filter on such measurements, which may be the result of hardware errors, signal interference, or irregular timing problems, for example.
There are several options for detecting such outliers. For example, the sensor measurement itself can be examined for valid data. By way of illustration, a threshold may be applied to the absolute magnitude of the signal strength reported by a sensor if the range of allowable magnitudes for the signal being detected is known. If the measurement falls below or above this threshold, it is rejected.
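A trivial sketch of such a magnitude gate; the limits below are invented placeholders and would depend on the sensor.

```python
import numpy as np

SIGNAL_MIN, SIGNAL_MAX = 0.01, 10.0   # example limits; sensor-dependent

def plausible(z):
    """Accept a measurement only if its magnitude lies in the range of
    values the sensor can legitimately report."""
    mag = float(np.linalg.norm(z))
    return SIGNAL_MIN <= mag <= SIGNAL_MAX
```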
Another way to detect outliers is by comparing the received measurement zt with the expected one h(
Another approach used by some embodiments for state estimation is an Extended Information Filter (EIF). The EIF is similar to the Extended Kalman Filter in that it models a Gaussian distribution over the state space and processes motion and signal sensor data on-line. Its parameterization, often called a dual representation, differs from that used by EKF. The parameterization consists of an information vector $\eta_t$ and an information matrix $\Lambda_t$ that are related to the mean $\mu_t$ and covariance $\Sigma_t$ of the EKF in the following way:
$\eta_t = \Sigma_t^{-1} \mu_t$
$\Lambda_t = \Sigma_t^{-1}$  (41)
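Equation (41) translates directly into code; a minimal sketch of both directions of the mapping:

```python
import numpy as np

def to_information_form(mu, Sigma):
    """Eq. (41): the information matrix is the inverse covariance; the
    information vector is that inverse applied to the mean."""
    Lambda = np.linalg.inv(Sigma)
    return Lambda @ mu, Lambda

def to_moment_form(eta, Lambda):
    """Inverse mapping back to mean and covariance."""
    Sigma = np.linalg.inv(Lambda)
    return Sigma @ eta, Sigma
```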
The EIF-SLAM algorithm processes data from the motion sensors 190 and signal sensors 170 in the same way as EKF-SLAM described above. The computation of information vector and information matrix on object motion and sensor measurement can be derived from Eqs. (26) to (40) by inserting Eq. (41) and simplifying the resulting equations.
In general a direct application of the EIF-SLAM algorithm does not provide a greater advantage than EKF-SLAM. Under some approximations, however, it is possible to keep the information matrix sparse, i.e. many elements are zero, allowing for a more compact storage and more efficient updates in terms of time and computational resources.
EIF-SLAM has the property that when inserting a signal sensor 170 measurement, only those elements in the state that the measurement depends on need to be updated in the information matrix. For Vector Field SLAM this means that only elements related to the device 100's pose and rotational variability and to the four nodes of the current cell are updated. All other elements in the information matrix stay unchanged. Therefore, the update on signal sensor 170 information turns only a few elements from zero into non-zero and generally preserves the sparsity of the information matrix.
However, the update on device motion (e.g., when new data from the motion sensors 190 is received) causes a full update of the whole information matrix in the general case. This causes the information matrix to become non-zero in most if not all elements, which may destroy any sparsity that was present before the motion update.
Some embodiments may use strategies for approximating the update of the information matrix on device motion that preserve the sparsity of the information matrix. Two such methods are the Sparse Extended Information Filter (SEIF) and the Exactly Sparse Extended Information Filter (ESEIF).
Yet another approach available to some embodiments for state estimation is ESEIF. The principle of the ESEIF algorithm is maintaining a set of “active features”. In the original context, “features” refer to landmarks. In the case of Vector Field SLAM, the features are the nodes. The active features are a subset of all features. Typically those features that are currently observed by the mobile device 100 are the active ones. Other features are called “passive”.
Only the active features contain cross-information between the pose of the device 100 and the feature (where the cross-information between device pose and feature is non-zero for active features, whereas for passive features this cross-information is zero). A feature can change its state from passive to active at any time without the need of special operations. The cross-information between device pose and feature starts as zero and becomes non-zero when updating the system on device motion.
Changing an active feature to a passive one requires computationally non-trivial operations that approximate the actual information matrix by a sparsification. ESEIF-SLAM conceptually integrates out the device pose and then re-localizes the device 100 using observations from only those features (nodes) that should stay or become active. By integrating out the device pose, the state becomes free of the pose. Any uncertainty in the device pose is moved into the feature estimates through the cross-information between device pose and feature. When re-localizing the device 100, only the features used in the signal sensor 170 observation then establish non-zero cross information. This way the sparseness of the information matrix is preserved.
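Integrating out the device pose in information form is a marginalization, which for a Gaussian reduces to a Schur complement. A hedged sketch, assuming the pose occupies the first three entries of the state:

```python
import numpy as np

def marginalize_pose(eta, Lambda, pose_dim=3):
    """Marginalize the device pose (the first pose_dim entries) out of the
    information form via the Schur complement; the remaining entries
    (rotational variability and nodes) absorb the pose uncertainty."""
    a = slice(0, pose_dim)
    b = slice(pose_dim, len(eta))
    L_aa_inv = np.linalg.inv(Lambda[a, a])
    Lambda_marg = Lambda[b, b] - Lambda[b, a] @ L_aa_inv @ Lambda[a, b]
    eta_marg = eta[b] - Lambda[b, a] @ L_aa_inv @ eta[a]
    return eta_marg, Lambda_marg
```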
The following describes an implementation of the ESEIF algorithm in the context of Vector Field SLAM.
In an example embodiment, as long as the object stays within this initial cell, the system updates the complete information matrix using all 4 nodes as active features. Eventually the matrix becomes fully dense (most if not all elements become non-zero), as illustrated in
When the mobile device 100 moves out of the current cell and enters a different cell, the procedure of integrating out the device pose, initializing new nodes, and re-localizing the device takes place. First, the uncertainty of the device pose is integrated out. This moves information from the object pose into the rotational variability and the 4 nodes through their cross information. The result is an information matrix as shown in
Next, new nodes are initialized and added to the state. For example, two new nodes m5 and m6 may be added, as shown in the accompanying figure.
The initial values for the information vector and matrix are obtained similarly to Equations (32)-(36), but in the information form as set out in Equation (41). The new information matrix then takes the form shown in the accompanying figure.
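A minimal sketch of this augmentation is given below, assuming Equation (41) is the standard conversion from mean/covariance to information form (Λ = Σ⁻¹, η = Λμ); that reading is an assumption of the sketch. New nodes enter with zero cross-information to the existing state, so the information matrix grows block-diagonally.

```python
import numpy as np

def to_information_form(mu, Sigma):
    """Moments-to-information conversion (assumed here to be what
    Equation (41) denotes): Lam = inv(Sigma), eta = Lam @ mu."""
    Lam = np.linalg.inv(Sigma)
    return Lam @ mu, Lam

def add_nodes(eta, Lam, mu_new, Sigma_new):
    """Append newly initialized nodes to an information-form state.
    The new nodes start with zero cross-information to the existing
    state, so the information matrix grows block-diagonally."""
    eta_n, Lam_n = to_information_form(mu_new, Sigma_new)
    n, k = eta.size, eta_n.size
    eta_out = np.concatenate([eta, eta_n])
    Lam_out = np.zeros((n + k, n + k))
    Lam_out[:n, :n] = Lam
    Lam_out[n:, n:] = Lam_n
    return eta_out, Lam_out
```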
The pose of the device 100 is then reintroduced. In the original ESEIF algorithm, an object is localized through observations of active features. In this application of the Vector Field SLAM algorithm, this is performed in two steps. First, the state is augmented with the new device pose, as shown in the accompanying figure.
The entries for the new device pose in the information vector and matrix are computed using Equation (41) and the mean and covariance given by Equations (42) and (43), where R0 is a parameter that increases the uncertainty of the new device pose. Thus, the new device pose stays unchanged but becomes less certain. At this point there are no active nodes, since all cross-information between device pose and nodes is zero. Any four nodes can be chosen as the next active set of features. Since the device 100 is in the cell defined by nodes m3 . . . m6, those nodes are chosen as the next set of active features.
$\mu_t = \mu_{t-1}$ (42)
$\Sigma_t = \Sigma_{t-1} + R_0$ (43)
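By way of a non-limiting illustration, the re-introduction of the pose per Equations (42) and (43) may be sketched as follows; the conversion to information form again assumes the standard moments-to-information reading of Equation (41).

```python
import numpy as np

def reintroduce_pose(mu_pose, Sigma_pose, R0):
    """Re-localization per Equations (42) and (43): the pose mean is
    kept while its covariance is inflated by R0, so the re-inserted
    pose is unchanged but less certain.  The result, returned in
    information form, is appended to the state with zero
    cross-information, leaving all nodes passive until the next
    signal sensor update."""
    mu_t = mu_pose.copy()                  # Equation (42)
    Sigma_t = Sigma_pose + R0              # Equation (43)
    Lam_pose = np.linalg.inv(Sigma_t)
    return Lam_pose @ mu_t, Lam_pose
```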
Upon a signal sensor measurement zt, the uncertainty of the device pose is reduced, and the elements related to the rotational variability and the four active nodes m3 . . . m6 are updated. This creates new cross-information between device pose, rotational variability, and active nodes, as shown in the accompanying figure.
As the device 100 moves within the current cell, in this example embodiment optionally only the device pose, the rotational variability, and the active nodes m3 . . . m6 are updated, as was noted during the discussion of the initial situation. When the device 100 moves into another cell, the state is extended and the information vector and matrix are augmented with new nodes as described above. If the new cell has been visited before, no new nodes need to be added to the state. In either case, the same procedure of integrating out the device pose followed by re-localization takes place.
The mathematical equations for the motion update (e.g., from the dead reckoning motion sensors 190), the signal sensor update (e.g., from the sensors 170), and the sparsification can be formulated directly in the information space, i.e., using only the information vector and the information matrix Λ for storing the state between motion and sensor updates. In addition, an estimate of the mean μ is needed for computing the Jacobians of the motion and sensor models.
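For instance, a mean estimate consistent with the information form can be recovered by solving a linear system, as in this sketch (a dense solve is shown for brevity; a sparse solver would be used in practice):

```python
import numpy as np

def recover_mean(eta, Lam):
    """Recover the mean mu from the information form by solving
    Lam @ mu = eta.  The mean is needed only for evaluating the
    Jacobians of the motion and sensor models; for a sparse Lam a
    sparse solver would replace the dense solve shown here."""
    return np.linalg.solve(Lam, eta)
```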
A flow chart of an example implementation of the ESEIF-SLAM algorithm for object localization is shown in the accompanying figure.
The state vector as defined in Equations (20) and (21) contains only one field for rotational variability. This reflects the assumption that rotational variability does not change with location and can therefore be shared among all nodes. There are, however, situations where this is not the case, e.g., when the error βε in Equation (5) is significant and the approximations in Equations (7)-(9) introduce a larger error, or when the sensor 170 is tilted due to an uneven floor. There are different ways to deal with changing rotational variability.
In one embodiment, each node contains its own estimate of rotational variability. The state vector of full SLAM in Equation (20), containing the full object path, changes into Equation (44), with similar changes for the state of on-line SLAM in Equation (21).
The rotational variability is computed similarly to the expected node values, by using bilinear interpolation per Equation (45), where ci0, ci1, ci2 and ci3 are the rotational variability estimates at the four cell nodes.
$c = w_0 c_{i0} + w_1 c_{i1} + w_2 c_{i2} + w_3 c_{i3}$ (45)
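The weights w0 . . . w3 are the usual bilinear interpolation weights. The following sketch computes them from the device position within the current cell and evaluates Equation (45); the corner ordering (lower-left, lower-right, upper-left, upper-right) is an assumption of the sketch.

```python
import numpy as np

def interpolated_rotational_variability(pos, cell_origin, cell_size, c_nodes):
    """Evaluate Equation (45) at position pos inside the current cell.
    c_nodes holds the estimates ci0..ci3 at the four corner nodes; the
    ordering (lower-left, lower-right, upper-left, upper-right) is an
    assumption of this sketch."""
    dx = (pos[0] - cell_origin[0]) / cell_size
    dy = (pos[1] - cell_origin[1]) / cell_size
    w = [(1 - dx) * (1 - dy),  # w0
         dx * (1 - dy),        # w1
         (1 - dx) * dy,        # w2
         dx * dy]              # w3
    return sum(wi * np.asarray(ci) for wi, ci in zip(w, c_nodes))
```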
Initial estimates of rotational variability are zero, with a covariance representing total uncertainty. When initializing new nodes, the same techniques described for the initial mean and covariance of the node signal values apply to rotational variability.
The cost of storing rotational variability with each node is an increase in the number of state variables and therefore higher memory and run-time consumption. This can limit the application of this solution when computational resources are constrained.
In another embodiment, only one instance of rotational variability is kept, as originally defined in Equations (20) and (21), but it is allowed to change when the mobile device 100 moves. For EKF-SLAM this means that in the motion model in Equations (28)-(30), a component Vt is added to the sub-matrix of the rotational variability in the state covariance. Vt is an additive covariance matrix modeling how much the rotational variability is allowed to change per move; it is usually a diagonal matrix of constant values.
In another embodiment, Vt = 0 as long as the device 100 stays within a cell, and Vt is set to a diagonal matrix with constant non-zero values on the diagonal only when the device 100 moves between cells. This has the advantage that while the device 100 stays within a cell, rotational variability is assumed to be constant and is only allowed to change when the device moves into another cell. In some situations this may offer a better approximation, at the cost of additional computation time but no significant additional memory.
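By way of example, this switching behavior can be expressed as follows; both the dimensionality of the rotational-variability state and the magnitude of the diagonal entries are illustrative assumptions, not values from the source.

```python
import numpy as np

def rotational_variability_noise(changed_cell, dim=2, v=1e-4):
    """Additive covariance Vt for the rotational variability block of
    the state covariance: zero while the device stays inside a cell,
    and a constant diagonal matrix when it crosses into another cell."""
    return v * np.eye(dim) if changed_cell else np.zeros((dim, dim))
```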
In another embodiment, Vt is used to allow a change in rotational variability when moving between cells in the ESEIF-SLAM system. In the sparsification step, the rotational variability is integrated out and re-localized in the same way as the device pose. This is done because adding Vt directly in the information space would otherwise fully populate the information matrix, destroying or reducing its sparseness. The states for sparsification with rotational variability included are analogous to the previously described method. An additional advantage of this approach is the removal of cross-information between rotational variability and passive nodes, which further reduces memory requirements and saves computation, at least partially offsetting the additional computation the extra sparsification requires.
These methods and systems may also be used for detecting and estimating “drift” on, for example, carpet. When a mobile device 100 moves on a carpeted surface, the carpet exerts a force on the mobile device 100 that tends to slide or shift it in a certain direction. This effect is caused by the directional grain, material, or other properties of the carpet. Other surfaces, such as lawns or artificial turf, may exhibit similar properties.
The amount of this drift can be estimated by the localization filter in different ways. In one embodiment, the filter state in Equation (24) is augmented with two additional variables, driftx and drifty, that represent the amount of carpet drift in the x and y directions of the global coordinate frame. The motion model in Equation (11) then takes these new parameters into account, and the filter estimates their values at the same time as it estimates the other state variables.
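A hedged sketch of such an augmented motion model is shown below. The actual form of Equation (11) is not reproduced in this portion of the text, so the pose composition used here is a generic planar odometry model with the drift terms added in the global frame.

```python
import numpy as np

def motion_with_drift(pose, odometry, drift, dt=1.0):
    """Planar motion model augmented with carpet-drift states.

    pose is (x, y, theta); odometry is the relative motion
    (dx, dy, dtheta) in the device frame; drift is (drift_x, drift_y),
    the estimated drift per unit time in the global frame."""
    x, y, th = pose
    dx, dy, dth = odometry
    c, s = np.cos(th), np.sin(th)
    return np.array([x + c * dx - s * dy + drift[0] * dt,
                     y + s * dx + c * dy + drift[1] * dt,
                     th + dth])
```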
In another embodiment, the mobile device 100 may be configured to move a certain distance forward followed by the same distance backward. The amount of carpet drift can then be estimated from the difference between the position output of the localization system at the beginning and at the end of this sequence, because the carpet drift may be proportional to this position difference. Typically, such a distance is small enough to be traversed rapidly but large enough that an appreciable difference can be detected without being obscured by noise. Some embodiments may use distances in the range of 10 cm to 2 meters; others may use smaller or larger distances.
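In code, the forward-and-back maneuver reduces to a simple difference; the proportionality gain k below is an assumed calibration constant, not a value from the source.

```python
import numpy as np

def estimate_drift(pose_before, pose_after, k=1.0):
    """Estimate carpet drift from the forward-then-backward maneuver:
    the net displacement reported by the localization system over the
    maneuver is taken as proportional to the drift, with an assumed
    calibration gain k."""
    delta = np.asarray(pose_after[:2]) - np.asarray(pose_before[:2])
    return k * delta
```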
The systems and methods described above were evaluated by moving an indoor localization sensor 170, configured to detect infrared patterns 180 projected from a beacon 160, along a rail. Ground truth information—the actual pose of the sensor 170—was directly available from position and orientation sensors on the rail motor. Every 50 cm, sensed signal strength and other measurements were recorded with the sensor 170 in 8 different directions (every 45°), and approximately 50 readings were taken for each of those directions. Once the sensor 170 reached the end of the rail, it was moved 50 cm parallel to the previous rail line and another round of measurements was taken. This was repeated until a total of eight parallel tracks were completed.
Using the recorded data, a path for a virtual mobile device 100 through the grid was generated. Starting in the lower left corner, the device moves along the rows and changes between rows on the left and right sides. This results in a theoretically straightforward motion: along a row, a 90° turn at the end of the row, a brief movement to reach the next row, and then another 90° turn before traversing that next row. In practice, when zero-mean Gaussian noise is added to the motion information (simulating real-world error after extended use of dead-reckoning sensors), the odometry path shown in the accompanying figure is obtained.
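The corruption of the ideal path can be reproduced with a few lines of Python; the noise magnitudes below are illustrative, as the values used in the evaluation are not stated.

```python
import numpy as np

def noisy_odometry(true_steps, sigma_trans=0.01, sigma_rot=0.005, seed=0):
    """Corrupt ideal relative motions (dx, dy, dtheta) with zero-mean
    Gaussian noise, as done to produce the simulated odometry path.
    Integrating the corrupted steps yields the drifting path typical
    of dead reckoning."""
    rng = np.random.default_rng(seed)
    return [(dx + rng.normal(0.0, sigma_trans),
             dy + rng.normal(0.0, sigma_trans),
             dth + rng.normal(0.0, sigma_rot))
            for dx, dy, dth in true_steps]
```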
The simulated relative pose data and the resulting odometry path are plausible examples of internal motion estimates. Mobile devices such as autonomous vacuum cleaners and other consumer products can show a similar degradation of pose estimates when, for example, the integration of wheel encoder counts is the only method used for pose estimation.
For testing the Vector Field SLAM system, one of the approximately 50 sensor measurements recorded at the ground truth pose was chosen at random each time the device reached a grid position. This measurement was then provided to the SLAM method for object localization. The cell size for Vector Field SLAM was set to 1×1 meters.
In another series of experiments, the accuracy of the individual Vector Field SLAM implementations was compared to ground truth. In general, all three methods provide higher accuracy than methods that use only linear sensor models. The GraphSLAM method usually provided slightly better accuracy than EKF-SLAM and ESEIF-SLAM, which usually provided similar accuracy to each other. The absolute position error was found to depend on several factors, such as ceiling height and the size of the environment. In the test environment, the overall mean position error was about 6 cm. In general, the sources of error may vary depending on the signal sources 180 used. For example, ceiling height may not be a significant contributor to error if the background signal is generated by magnetic coils suspended over the operating environment.
Vector Field SLAM also provides information about the learned sensor model or map—the signal strength throughout the environment.
A typical embodiment runs asynchronously, in that a new time step is considered to occur whenever new data is available from the signal sensor 170, which may be as often as six or seven times a second. In some embodiments, new sensor data may be ignored if the embodiment is still integrating previously available data and generating new pose information. In some embodiments the localization processor may request data from the signal sensor 170 or otherwise indicate that it is available to process that data. Some embodiments may run synchronously, with new data provided at fixed and regular time intervals.
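A minimal sketch of such an asynchronous loop appears below; slam_update and busy are hypothetical callables standing in for the localization processor's interfaces, which the source does not specify.

```python
import queue

def localization_loop(measurements: queue.Queue, slam_update, busy):
    """Asynchronous operation: a new time step occurs whenever the
    signal sensor delivers a reading (up to roughly six or seven per
    second).  Readings that arrive while a previous update is still
    being integrated are dropped, as some embodiments do."""
    while True:
        z = measurements.get()   # block until a new reading arrives
        if busy():               # still integrating the previous one?
            continue             # some embodiments drop such readings
        slam_update(z)           # integrate and produce a new pose
```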
The systems and methods disclosed herein can be implemented in hardware, software, firmware, or a combination thereof. Software can include computer-readable instructions stored in memory (e.g., non-transitory memory, such as solid state memory (e.g., ROM, EEPROM, FLASH, RAM), optical memory (e.g., a CD, DVD, Blu-ray disc, etc.), or magnetic memory (e.g., a hard disc drive)) configured to implement the algorithms on a general purpose computer, special purpose processors, or combinations thereof.
While certain embodiments may be illustrated or discussed as having certain example components, additional, fewer, or different components may be used. Further, with respect to the processes discussed herein, various states may be performed in a different order, not all states are required to be reached, and fewer, additional, or different states may be utilized.
Various aspects and advantages of the embodiments have been described where appropriate. It is to be understood that not necessarily all such aspects or advantages may be achieved in accordance with any particular embodiment. Thus, for example, it should be recognized that the various embodiments may be carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other aspects or advantages as may be taught or suggested herein. Further, embodiments may include several novel features, no single one of which is solely responsible for the embodiment's desirable attributes or which is essential to practicing the systems, devices, methods, and techniques described herein.
This application claims the benefit under 35 U.S.C. §119(e) of U.S. Provisional Application No. 61/280,677, filed Nov. 6, 2009, the entirety of which is hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
1755054 | Darst | Apr 1930 | A |
1780221 | Buchmann | Nov 1930 | A |
1970302 | Gerhardt | Aug 1934 | A |
2136324 | John | Nov 1938 | A |
2302111 | Dow et al. | Nov 1942 | A |
2353621 | Sav et al. | Jul 1944 | A |
2770825 | Pullen | Nov 1956 | A |
2930055 | Fallen et al. | Mar 1960 | A |
3119369 | Harland et al. | Jan 1964 | A |
3166138 | Dunn | Jan 1965 | A |
3333564 | Waters | Aug 1967 | A |
3375375 | Robert et al. | Mar 1968 | A |
3381652 | Schaefer et al. | May 1968 | A |
3457575 | Bienek | Jul 1969 | A |
3550714 | Bellinger | Dec 1970 | A |
3569727 | Aggarwal et al. | Mar 1971 | A |
3649981 | Woodworth | Mar 1972 | A |
3674316 | De Bray | Jul 1972 | A |
3678882 | Kinsella | Jul 1972 | A |
3690559 | Rudloff | Sep 1972 | A |
3744586 | Leinauer | Jul 1973 | A |
3756667 | Bombardier et al. | Sep 1973 | A |
3809004 | Leonheart | May 1974 | A |
3816004 | Bignardi | Jun 1974 | A |
3845831 | James | Nov 1974 | A |
RE28268 | Autrand | Dec 1974 | E |
3851349 | Lowder | Dec 1974 | A |
3853086 | Asplund | Dec 1974 | A |
3863285 | Hukuba | Feb 1975 | A |
3888181 | Kups | Jun 1975 | A |
3937174 | Haaga | Feb 1976 | A |
3952361 | Wilkins | Apr 1976 | A |
3989311 | Debrey | Nov 1976 | A |
3989931 | Phillips | Nov 1976 | A |
4004313 | Capra | Jan 1977 | A |
4012681 | Finger et al. | Mar 1977 | A |
4070170 | Leinfelt | Jan 1978 | A |
4099284 | Shinozaki et al. | Jul 1978 | A |
4119900 | Kremnitz | Oct 1978 | A |
4175589 | Nakamura et al. | Nov 1979 | A |
4175892 | De bray | Nov 1979 | A |
4196727 | Verkaart et al. | Apr 1980 | A |
4198727 | Farmer | Apr 1980 | A |
4199838 | Simonsson | Apr 1980 | A |
4209254 | Reymond et al. | Jun 1980 | A |
D258901 | Keyworth | Apr 1981 | S |
4297578 | Carter | Oct 1981 | A |
4305234 | Pichelman | Dec 1981 | A |
4306329 | Yokoi | Dec 1981 | A |
4309758 | Halsall et al. | Jan 1982 | A |
4328545 | Halsall et al. | May 1982 | A |
4367403 | Miller | Jan 1983 | A |
4369543 | Chen et al. | Jan 1983 | A |
4401909 | Gorsek | Aug 1983 | A |
4416033 | Specht | Nov 1983 | A |
4445245 | Lu | May 1984 | A |
4465370 | Yuasa et al. | Aug 1984 | A |
4477998 | You | Oct 1984 | A |
4481692 | Kurz | Nov 1984 | A |
4482960 | Pryor | Nov 1984 | A |
4492058 | Goldfarb et al. | Jan 1985 | A |
4513469 | Godfrey et al. | Apr 1985 | A |
D278732 | Ohkado | May 1985 | S |
4518437 | Sommer | May 1985 | A |
4534637 | Suzuki et al. | Aug 1985 | A |
4556313 | Miller et al. | Dec 1985 | A |
4575211 | Matsumura et al. | Mar 1986 | A |
4580311 | Kurz | Apr 1986 | A |
4601082 | Kurz | Jul 1986 | A |
4618213 | Chen | Oct 1986 | A |
4620285 | Perdue | Oct 1986 | A |
4624026 | Olson et al. | Nov 1986 | A |
4626995 | Lofgren et al. | Dec 1986 | A |
4628454 | Ito | Dec 1986 | A |
4638445 | Mattaboni | Jan 1987 | A |
4644156 | Takahashi et al. | Feb 1987 | A |
4649504 | Krouglicof et al. | Mar 1987 | A |
4652917 | Miller | Mar 1987 | A |
4654492 | Koerner et al. | Mar 1987 | A |
4654924 | Getz et al. | Apr 1987 | A |
4660969 | Sorimachi et al. | Apr 1987 | A |
4662854 | Fang | May 1987 | A |
4674048 | Okumura | Jun 1987 | A |
4679152 | Perdue | Jul 1987 | A |
4680827 | Hummel | Jul 1987 | A |
4696074 | Cavalli | Sep 1987 | A |
D292223 | Trumbull | Oct 1987 | S |
4700301 | Dyke | Oct 1987 | A |
4700427 | Knepper | Oct 1987 | A |
4703820 | Reinaud | Nov 1987 | A |
4709773 | Clement et al. | Dec 1987 | A |
4710020 | Maddox et al. | Dec 1987 | A |
4712740 | Duncan et al. | Dec 1987 | A |
4716621 | Zoni | Jan 1988 | A |
4728801 | O'Connor | Mar 1988 | A |
4733343 | Yoneda et al. | Mar 1988 | A |
4733431 | Martin | Mar 1988 | A |
4735136 | Lee et al. | Apr 1988 | A |
4735138 | Gawler et al. | Apr 1988 | A |
4748336 | Fujie et al. | May 1988 | A |
4748833 | Nagasawa | Jun 1988 | A |
4756049 | Uehara | Jul 1988 | A |
4767213 | Hummel | Aug 1988 | A |
4769700 | Pryor | Sep 1988 | A |
4777416 | George et al. | Oct 1988 | A |
D298766 | Tanno et al. | Nov 1988 | S |
4782550 | Jacobs | Nov 1988 | A |
4796198 | Boultinghouse et al. | Jan 1989 | A |
4806751 | Abe et al. | Feb 1989 | A |
4811228 | Hyyppa | Mar 1989 | A |
4813906 | Matsuyama et al. | Mar 1989 | A |
4815157 | Tsuchiya | Mar 1989 | A |
4817000 | Eberhardt | Mar 1989 | A |
4818875 | Weiner | Apr 1989 | A |
4829442 | Kadonoff et al. | May 1989 | A |
4829626 | Harkonen et al. | May 1989 | A |
4832098 | Palinkas et al. | May 1989 | A |
4846297 | Field et al. | Jul 1989 | A |
4851661 | Everett | Jul 1989 | A |
4854000 | Takimoto | Aug 1989 | A |
4854006 | Nishimura et al. | Aug 1989 | A |
4855915 | Dallaire | Aug 1989 | A |
4857912 | Everett et al. | Aug 1989 | A |
4858132 | Holmquist | Aug 1989 | A |
4867570 | Sorimachi et al. | Sep 1989 | A |
4880474 | Koharagi et al. | Nov 1989 | A |
4887415 | Martin | Dec 1989 | A |
4891762 | Chotiros | Jan 1990 | A |
4893025 | Lee | Jan 1990 | A |
4901394 | Nakamura et al. | Feb 1990 | A |
4905151 | Weiman et al. | Feb 1990 | A |
4909972 | Britz | Mar 1990 | A |
4912643 | Beirne | Mar 1990 | A |
4918441 | Bohman | Apr 1990 | A |
4919224 | Shyu et al. | Apr 1990 | A |
4919489 | Kopsco | Apr 1990 | A |
4920060 | Parrent et al. | Apr 1990 | A |
4920605 | Takashima | May 1990 | A |
4933864 | Evans et al. | Jun 1990 | A |
4937912 | Kurz | Jul 1990 | A |
4953253 | Fukuda et al. | Sep 1990 | A |
4954962 | Evans et al. | Sep 1990 | A |
4955714 | Stotler et al. | Sep 1990 | A |
4956891 | Wulff | Sep 1990 | A |
4961303 | McCarty et al. | Oct 1990 | A |
4961304 | Ovsborn et al. | Oct 1990 | A |
4962453 | Pong et al. | Oct 1990 | A |
4967862 | Pong et al. | Nov 1990 | A |
4971591 | Raviv et al. | Nov 1990 | A |
4973912 | Kaminski et al. | Nov 1990 | A |
4974283 | Holsten et al. | Dec 1990 | A |
4977618 | Allen | Dec 1990 | A |
4977639 | Takahashi et al. | Dec 1990 | A |
4986663 | Cecchi et al. | Jan 1991 | A |
5001635 | Yasutomi et al. | Mar 1991 | A |
5002145 | Wakaumi et al. | Mar 1991 | A |
5002501 | Tucker | Mar 1991 | A |
5012886 | Jonas et al. | May 1991 | A |
5018240 | Holman | May 1991 | A |
5020186 | Lessig et al. | Jun 1991 | A |
5022812 | Coughlan et al. | Jun 1991 | A |
5023788 | Kitazume et al. | Jun 1991 | A |
5024529 | Svetkoff et al. | Jun 1991 | A |
D318500 | Malewicki et al. | Jul 1991 | S |
5032775 | Mizuno et al. | Jul 1991 | A |
5033151 | Kraft et al. | Jul 1991 | A |
5033291 | Podoloff et al. | Jul 1991 | A |
5040116 | Evans et al. | Aug 1991 | A |
5045769 | Everett | Sep 1991 | A |
5049802 | Mintus et al. | Sep 1991 | A |
5051906 | Evans et al. | Sep 1991 | A |
5062819 | Mallory | Nov 1991 | A |
5070567 | Holland | Dec 1991 | A |
5084934 | Lessig et al. | Feb 1992 | A |
5086535 | Grossmeyer et al. | Feb 1992 | A |
5090321 | Abouav | Feb 1992 | A |
5093955 | Blehert et al. | Mar 1992 | A |
5094311 | Akeel | Mar 1992 | A |
5098262 | Wecker et al. | Mar 1992 | A |
5105502 | Takashima | Apr 1992 | A |
5105550 | Shenoha | Apr 1992 | A |
5109566 | Kobayashi et al. | May 1992 | A |
5111401 | Everett, Jr. et al. | May 1992 | A |
5115538 | Cochran et al. | May 1992 | A |
5127128 | Lee | Jul 1992 | A |
5136675 | Hodson | Aug 1992 | A |
5136750 | Takashima et al. | Aug 1992 | A |
5142985 | Stearns et al. | Sep 1992 | A |
5144471 | Takanashi et al. | Sep 1992 | A |
5144714 | Mori et al. | Sep 1992 | A |
5144715 | Matsuyo et al. | Sep 1992 | A |
5152028 | Hirano | Oct 1992 | A |
5152202 | Strauss | Oct 1992 | A |
5154617 | Suman et al. | Oct 1992 | A |
5155684 | Burke et al. | Oct 1992 | A |
5163202 | Kawakami et al. | Nov 1992 | A |
5163320 | Goshima et al. | Nov 1992 | A |
5164579 | Pryor et al. | Nov 1992 | A |
5165064 | Mattaboni | Nov 1992 | A |
5170352 | McTamaney et al. | Dec 1992 | A |
5173881 | Sindle | Dec 1992 | A |
5182833 | Yamaguchi et al. | Feb 1993 | A |
5187662 | Kamimura et al. | Feb 1993 | A |
5202742 | Frank et al. | Apr 1993 | A |
5204814 | Noonan et al. | Apr 1993 | A |
5206500 | Decker et al. | Apr 1993 | A |
5208521 | Aoyama | May 1993 | A |
5216777 | Moro et al. | Jun 1993 | A |
5222786 | Sovis et al. | Jun 1993 | A |
5227985 | DeMenthon | Jul 1993 | A |
5233682 | Abe et al. | Aug 1993 | A |
5239720 | Wood et al. | Aug 1993 | A |
5251358 | Moro et al. | Oct 1993 | A |
5258822 | Nakamura et al. | Nov 1993 | A |
5261139 | Lewis | Nov 1993 | A |
5276618 | Everett | Jan 1994 | A |
5276939 | Uenishi | Jan 1994 | A |
5277064 | Knigga et al. | Jan 1994 | A |
5279672 | Betker et al. | Jan 1994 | A |
5284452 | Corona | Feb 1994 | A |
5284522 | Kobayashi et al. | Feb 1994 | A |
5293955 | Lee | Mar 1994 | A |
D345707 | Alister | Apr 1994 | S |
5303448 | Hennessey et al. | Apr 1994 | A |
5307273 | Oh et al. | Apr 1994 | A |
5309592 | Hiratsuka | May 1994 | A |
5310379 | Hippely et al. | May 1994 | A |
5315227 | Pierson et al. | May 1994 | A |
5319827 | Yang | Jun 1994 | A |
5319828 | Waldhauser et al. | Jun 1994 | A |
5321614 | Ashworth | Jun 1994 | A |
5323483 | Baeg | Jun 1994 | A |
5324948 | Dudar et al. | Jun 1994 | A |
5331713 | Tipton | Jul 1994 | A |
5341186 | Kato | Aug 1994 | A |
5341540 | Soupert et al. | Aug 1994 | A |
5341549 | Wirtz et al. | Aug 1994 | A |
5345649 | Whitlow | Sep 1994 | A |
5352901 | Poorman | Oct 1994 | A |
5353224 | Lee et al. | Oct 1994 | A |
5363305 | Cox et al. | Nov 1994 | A |
5363935 | Schempf et al. | Nov 1994 | A |
5369347 | Yoo | Nov 1994 | A |
5369838 | Wood et al. | Dec 1994 | A |
5386862 | Glover et al. | Feb 1995 | A |
5399951 | Lavallee et al. | Mar 1995 | A |
5400244 | Watanabe et al. | Mar 1995 | A |
5404612 | Ishikawa | Apr 1995 | A |
5410479 | Coker | Apr 1995 | A |
5435405 | Schempf et al. | Jul 1995 | A |
5440216 | Kim | Aug 1995 | A |
5442358 | Keeler et al. | Aug 1995 | A |
5444965 | Colens | Aug 1995 | A |
5446356 | Kim | Aug 1995 | A |
5446445 | Bloomfield et al. | Aug 1995 | A |
5451135 | Schempf et al. | Sep 1995 | A |
5454129 | Kell | Oct 1995 | A |
5455982 | Armstrong et al. | Oct 1995 | A |
5465525 | Mifune et al. | Nov 1995 | A |
5465619 | Sotack et al. | Nov 1995 | A |
5467273 | Faibish et al. | Nov 1995 | A |
5471560 | Allard et al. | Nov 1995 | A |
5491670 | Weber | Feb 1996 | A |
5497529 | Boesi | Mar 1996 | A |
5498948 | Bruni et al. | Mar 1996 | A |
5502638 | Takenaka | Mar 1996 | A |
5505072 | Oreper | Apr 1996 | A |
5507067 | Hoekstra et al. | Apr 1996 | A |
5510893 | Suzuki | Apr 1996 | A |
5511147 | Abdel | Apr 1996 | A |
5515572 | Hoekstra et al. | May 1996 | A |
5534762 | Kim | Jul 1996 | A |
5535476 | Kresse et al. | Jul 1996 | A |
5537017 | Feiten et al. | Jul 1996 | A |
5537711 | Tseng | Jul 1996 | A |
5539953 | Kurz | Jul 1996 | A |
5542146 | Hoekstra et al. | Aug 1996 | A |
5542148 | Young | Aug 1996 | A |
5546631 | Chambon | Aug 1996 | A |
5548511 | Bancroft | Aug 1996 | A |
5551119 | Wörwag | Sep 1996 | A |
5551525 | Pack et al. | Sep 1996 | A |
5553349 | Kilstrom et al. | Sep 1996 | A |
5555587 | Guha | Sep 1996 | A |
5560077 | Crotchett | Oct 1996 | A |
5568589 | Hwang | Oct 1996 | A |
D375592 | Ljunggren | Nov 1996 | S |
5608306 | Rybeck et al. | Mar 1997 | A |
5608894 | Kawakami et al. | Mar 1997 | A |
5608944 | Gordon | Mar 1997 | A |
5610488 | Miyazawa | Mar 1997 | A |
5611106 | Wulff | Mar 1997 | A |
5611108 | Knowlton et al. | Mar 1997 | A |
5613261 | Kawakami et al. | Mar 1997 | A |
5613269 | Miwa | Mar 1997 | A |
5621291 | Lee | Apr 1997 | A |
5622236 | Azumi et al. | Apr 1997 | A |
5634237 | Paranjpe | Jun 1997 | A |
5634239 | Tuvin et al. | Jun 1997 | A |
5636402 | Kubo et al. | Jun 1997 | A |
5642299 | Hardin et al. | Jun 1997 | A |
5646494 | Han | Jul 1997 | A |
5647554 | Ikegami et al. | Jul 1997 | A |
5650702 | Azumi | Jul 1997 | A |
5652489 | Kawakami | Jul 1997 | A |
5682313 | Edlund et al. | Oct 1997 | A |
5682839 | Grimsley et al. | Nov 1997 | A |
5696675 | Nakamura et al. | Dec 1997 | A |
5698861 | Oh | Dec 1997 | A |
5709007 | Chiang | Jan 1998 | A |
5710506 | Broell et al. | Jan 1998 | A |
5714119 | Kawagoe et al. | Feb 1998 | A |
5717169 | Liang et al. | Feb 1998 | A |
5717484 | Hamaguchi et al. | Feb 1998 | A |
5720077 | Nakamura et al. | Feb 1998 | A |
5722109 | Delmas et al. | Mar 1998 | A |
5732401 | Conway | Mar 1998 | A |
5735017 | Barnes et al. | Apr 1998 | A |
5735959 | Kubo et al. | Apr 1998 | A |
5742975 | Knowlton et al. | Apr 1998 | A |
5745235 | Vercammen et al. | Apr 1998 | A |
5752871 | Tsuzuki | May 1998 | A |
5756904 | Oreper et al. | May 1998 | A |
5761762 | Kubo | Jun 1998 | A |
5764888 | Bolan et al. | Jun 1998 | A |
5767437 | Rogers | Jun 1998 | A |
5767960 | Orman | Jun 1998 | A |
5770936 | Hirai et al. | Jun 1998 | A |
5777596 | Herbert | Jul 1998 | A |
5778486 | Kim | Jul 1998 | A |
5781697 | Jeong | Jul 1998 | A |
5781960 | Kilstrom et al. | Jul 1998 | A |
5784755 | Karr et al. | Jul 1998 | A |
5786602 | Pryor et al. | Jul 1998 | A |
5787545 | Colens | Aug 1998 | A |
5793900 | Nourbakhsh et al. | Aug 1998 | A |
5794297 | Muta | Aug 1998 | A |
5802665 | Knowlton et al. | Sep 1998 | A |
5812267 | Everett et al. | Sep 1998 | A |
5814808 | Takada et al. | Sep 1998 | A |
5815880 | Nakanishi | Oct 1998 | A |
5815884 | Imamura et al. | Oct 1998 | A |
5819008 | Asama et al. | Oct 1998 | A |
5819360 | Fujii | Oct 1998 | A |
5819936 | Saveliev et al. | Oct 1998 | A |
5820821 | Kawagoe et al. | Oct 1998 | A |
5821730 | Drapkin | Oct 1998 | A |
5825981 | Matsuda | Oct 1998 | A |
5828770 | Leis et al. | Oct 1998 | A |
5831597 | West et al. | Nov 1998 | A |
5836045 | Anthony et al. | Nov 1998 | A |
5839156 | Park et al. | Nov 1998 | A |
5839532 | Yoshiji et al. | Nov 1998 | A |
5841259 | Kim et al. | Nov 1998 | A |
5844232 | Pezant | Dec 1998 | A |
5867800 | Leif | Feb 1999 | A |
5867861 | Kasen et al. | Feb 1999 | A |
5869910 | Colens | Feb 1999 | A |
5894621 | Kubo | Apr 1999 | A |
5896611 | Haaga | Apr 1999 | A |
5903124 | Kawakami | May 1999 | A |
5905209 | Oreper | May 1999 | A |
5907886 | Buscher | Jun 1999 | A |
5910700 | Crotzer | Jun 1999 | A |
5911260 | Suzuki | Jun 1999 | A |
5916008 | Wong | Jun 1999 | A |
5924167 | Wright et al. | Jul 1999 | A |
5926909 | McGee | Jul 1999 | A |
5933102 | Miller et al. | Aug 1999 | A |
5933913 | Wright et al. | Aug 1999 | A |
5935179 | Kleiner et al. | Aug 1999 | A |
5935333 | Davis | Aug 1999 | A |
5940170 | Berg et al. | Aug 1999 | A |
5940346 | Sadowsky et al. | Aug 1999 | A |
5940927 | Haegermarck et al. | Aug 1999 | A |
5940930 | Oh et al. | Aug 1999 | A |
5942869 | Katou et al. | Aug 1999 | A |
5943730 | Boomgaarden | Aug 1999 | A |
5943733 | Tagliaferri | Aug 1999 | A |
5943933 | Evans et al. | Aug 1999 | A |
5947225 | Kawakami et al. | Sep 1999 | A |
5950408 | Schaedler | Sep 1999 | A |
5959423 | Nakanishi et al. | Sep 1999 | A |
5968281 | Wright et al. | Oct 1999 | A |
5974348 | Rocks | Oct 1999 | A |
5974365 | Mitchell | Oct 1999 | A |
5983448 | Wright et al. | Nov 1999 | A |
5984880 | Lander et al. | Nov 1999 | A |
5987383 | Keller et al. | Nov 1999 | A |
5989700 | Krivopal | Nov 1999 | A |
5991951 | Kubo et al. | Nov 1999 | A |
5995883 | Nishikado | Nov 1999 | A |
5995884 | Allen et al. | Nov 1999 | A |
5996167 | Close | Dec 1999 | A |
5998953 | Nakamura et al. | Dec 1999 | A |
5998971 | Corbridge | Dec 1999 | A |
6000088 | Wright et al. | Dec 1999 | A |
6009358 | Angott et al. | Dec 1999 | A |
6012618 | Matsuo | Jan 2000 | A |
6021545 | Delgado et al. | Feb 2000 | A |
6023813 | Thatcher et al. | Feb 2000 | A |
6023814 | Imamura | Feb 2000 | A |
6025687 | Himeda et al. | Feb 2000 | A |
6026539 | Mouw et al. | Feb 2000 | A |
6030464 | Azevedo | Feb 2000 | A |
6030465 | Marcussen et al. | Feb 2000 | A |
6032327 | Oka et al. | Mar 2000 | A |
6032542 | Warnick et al. | Mar 2000 | A |
6036572 | Sze | Mar 2000 | A |
6038501 | Kawakami | Mar 2000 | A |
6040669 | Hog | Mar 2000 | A |
6041471 | Charky et al. | Mar 2000 | A |
6041472 | Kasen et al. | Mar 2000 | A |
6046800 | Ohtomo et al. | Apr 2000 | A |
6049620 | Dickinson et al. | Apr 2000 | A |
6050648 | Keleny | Apr 2000 | A |
6052821 | Chouly et al. | Apr 2000 | A |
6055042 | Sarangapani | Apr 2000 | A |
6055702 | Imamura et al. | May 2000 | A |
6061868 | Moritsch et al. | May 2000 | A |
6065182 | Wright et al. | May 2000 | A |
6070290 | Schwarze et al. | Jun 2000 | A |
6073432 | Schaedler | Jun 2000 | A |
6076025 | Ueno et al. | Jun 2000 | A |
6076026 | Jambhekar et al. | Jun 2000 | A |
6076226 | Reed | Jun 2000 | A |
6076227 | Schallig et al. | Jun 2000 | A |
6081257 | Zeller | Jun 2000 | A |
6088020 | Mor | Jul 2000 | A |
6094775 | Behmer | Aug 2000 | A |
6099091 | Campbell | Aug 2000 | A |
6101670 | Song | Aug 2000 | A |
6101671 | Wright et al. | Aug 2000 | A |
6108031 | King et al. | Aug 2000 | A |
6108067 | Okamoto | Aug 2000 | A |
6108269 | Kabel | Aug 2000 | A |
6108597 | Kirchner et al. | Aug 2000 | A |
6108859 | Burgoon | Aug 2000 | A |
6112143 | Allen et al. | Aug 2000 | A |
6112996 | Matsuo | Sep 2000 | A |
6119057 | Kawagoe | Sep 2000 | A |
6122798 | Kobayashi et al. | Sep 2000 | A |
6124694 | Bancroft et al. | Sep 2000 | A |
6125498 | Roberts et al. | Oct 2000 | A |
6131237 | Kasper et al. | Oct 2000 | A |
6138063 | Himeda | Oct 2000 | A |
6142252 | Kinto et al. | Nov 2000 | A |
6146041 | Chen et al. | Nov 2000 | A |
6146278 | Kobayashi | Nov 2000 | A |
6154279 | Thayer | Nov 2000 | A |
6154694 | Aoki et al. | Nov 2000 | A |
6160479 | Ahlen et al. | Dec 2000 | A |
6167332 | Kurtzberg et al. | Dec 2000 | A |
6167587 | Kasper et al. | Jan 2001 | B1 |
6192548 | Huffman | Feb 2001 | B1 |
6192549 | Kasen et al. | Feb 2001 | B1 |
6202243 | Beaufoy et al. | Mar 2001 | B1 |
6205380 | Bauer et al. | Mar 2001 | B1 |
6216307 | Kaleta et al. | Apr 2001 | B1 |
6220865 | Macri et al. | Apr 2001 | B1 |
6226830 | Hendriks et al. | May 2001 | B1 |
6230362 | Kasper et al. | May 2001 | B1 |
6237741 | Guidetti | May 2001 | B1 |
6240342 | Fiegert et al. | May 2001 | B1 |
6243913 | Frank et al. | Jun 2001 | B1 |
6255793 | Peless et al. | Jul 2001 | B1 |
6259979 | Holmquist | Jul 2001 | B1 |
6261379 | Conrad et al. | Jul 2001 | B1 |
6263539 | Baig | Jul 2001 | B1 |
6263989 | Won | Jul 2001 | B1 |
6272936 | Oreper et al. | Aug 2001 | B1 |
6276478 | Hopkins et al. | Aug 2001 | B1 |
6278917 | Bauer et al. | Aug 2001 | B1 |
6278918 | Dickson et al. | Aug 2001 | B1 |
6279196 | Kasen et al. | Aug 2001 | B2 |
6282526 | Ganesh | Aug 2001 | B1 |
6283034 | Miles | Sep 2001 | B1 |
6285778 | Nakajima et al. | Sep 2001 | B1 |
6285930 | Dickson et al. | Sep 2001 | B1 |
6286181 | Kasper et al. | Sep 2001 | B1 |
6300737 | Bergvall et al. | Oct 2001 | B1 |
6321337 | Reshef et al. | Nov 2001 | B1 |
6321515 | Colens | Nov 2001 | B1 |
6323570 | Nishimura et al. | Nov 2001 | B1 |
6324714 | Walz et al. | Dec 2001 | B1 |
6327741 | Reed | Dec 2001 | B1 |
6332400 | Meyer | Dec 2001 | B1 |
6339735 | Peless et al. | Jan 2002 | B1 |
6362875 | Burkley | Mar 2002 | B1 |
6370453 | Sommer | Apr 2002 | B2 |
6374155 | Wallach et al. | Apr 2002 | B1 |
6374157 | Takamura | Apr 2002 | B1 |
6381802 | Park | May 2002 | B2 |
6385515 | Dickson et al. | May 2002 | B1 |
6388013 | Saraf et al. | May 2002 | B1 |
6389329 | Colens | May 2002 | B1 |
6397429 | Legatt et al. | Jun 2002 | B1 |
6400048 | Nishimura et al. | Jun 2002 | B1 |
6401294 | Kasper | Jun 2002 | B2 |
6408226 | Byrne et al. | Jun 2002 | B1 |
6412141 | Kasper et al. | Jul 2002 | B2 |
6415203 | Inoue et al. | Jul 2002 | B1 |
6418586 | Fulghum | Jul 2002 | B2 |
6421870 | Basham et al. | Jul 2002 | B1 |
6427285 | Legatt et al. | Aug 2002 | B1 |
6430471 | Kintou et al. | Aug 2002 | B1 |
6431296 | Won | Aug 2002 | B1 |
6437227 | Theimer | Aug 2002 | B1 |
6437465 | Nishimura et al. | Aug 2002 | B1 |
6438456 | Feddema et al. | Aug 2002 | B1 |
6438793 | Miner et al. | Aug 2002 | B1 |
6442476 | Poropat | Aug 2002 | B1 |
6442789 | Legatt et al. | Sep 2002 | B1 |
6443509 | Levin et al. | Sep 2002 | B1 |
6444003 | Sutcliffe | Sep 2002 | B1 |
6446302 | Kasper et al. | Sep 2002 | B1 |
6454036 | Airey et al. | Sep 2002 | B1 |
D464091 | Christianson | Oct 2002 | S |
6457206 | Judson | Oct 2002 | B1 |
6459955 | Bartsch et al. | Oct 2002 | B1 |
6463368 | Feiten et al. | Oct 2002 | B1 |
6465982 | Bergvall et al. | Oct 2002 | B1 |
6473167 | Odell | Oct 2002 | B1 |
6480762 | Uchikubo et al. | Nov 2002 | B1 |
6481515 | Kirkpatrick et al. | Nov 2002 | B1 |
6482252 | Conrad et al. | Nov 2002 | B1 |
6490539 | Dickson et al. | Dec 2002 | B1 |
6491127 | Holmberg et al. | Dec 2002 | B1 |
6493612 | Bisset et al. | Dec 2002 | B1 |
6493613 | Peless et al. | Dec 2002 | B2 |
6496754 | Song et al. | Dec 2002 | B2 |
6496755 | Wallach et al. | Dec 2002 | B2 |
6502657 | Kerrebrock et al. | Jan 2003 | B2 |
6504610 | Bauer et al. | Jan 2003 | B1 |
6507773 | Parker et al. | Jan 2003 | B2 |
6519808 | Legatt et al. | Feb 2003 | B2 |
6525509 | Petersson et al. | Feb 2003 | B1 |
D471243 | Cioffi et al. | Mar 2003 | S |
6530102 | Pierce et al. | Mar 2003 | B1 |
6530117 | Peterson | Mar 2003 | B2 |
6532404 | Colens | Mar 2003 | B2 |
6535793 | Allard | Mar 2003 | B2 |
6540424 | Hall et al. | Apr 2003 | B1 |
6540607 | Mokris et al. | Apr 2003 | B2 |
6548982 | Papanikolopoulos et al. | Apr 2003 | B1 |
6553612 | Dyson et al. | Apr 2003 | B1 |
6556722 | Russell et al. | Apr 2003 | B1 |
6556892 | Kuroki et al. | Apr 2003 | B2 |
6557104 | Vu et al. | Apr 2003 | B2 |
D474312 | Stephens et al. | May 2003 | S |
6563130 | Dworkowski et al. | May 2003 | B2 |
6571415 | Gerber et al. | Jun 2003 | B2 |
6571422 | Gordon et al. | Jun 2003 | B1 |
6572711 | Sclafani et al. | Jun 2003 | B2 |
6574536 | Kawagoe et al. | Jun 2003 | B1 |
6580246 | Jacobs | Jun 2003 | B2 |
6584376 | Van Kommer | Jun 2003 | B1 |
6586908 | Petersson et al. | Jul 2003 | B2 |
6587573 | Stam et al. | Jul 2003 | B1 |
6590222 | Bisset et al. | Jul 2003 | B1 |
6594551 | McKinney et al. | Jul 2003 | B2 |
6594844 | Jones | Jul 2003 | B2 |
6597076 | Scheible et al. | Jul 2003 | B2 |
D478884 | Slipy et al. | Aug 2003 | S |
6601265 | Burlington | Aug 2003 | B1 |
6604021 | Imai et al. | Aug 2003 | B2 |
6604022 | Parker et al. | Aug 2003 | B2 |
6605156 | Clark et al. | Aug 2003 | B1 |
6609269 | Kasper | Aug 2003 | B2 |
6611120 | Song et al. | Aug 2003 | B2 |
6611734 | Parker et al. | Aug 2003 | B2 |
6611738 | Ruffner | Aug 2003 | B2 |
6615108 | Peless et al. | Sep 2003 | B1 |
6615434 | Davis et al. | Sep 2003 | B1 |
6615885 | Ohm | Sep 2003 | B1 |
6622465 | Jerome et al. | Sep 2003 | B2 |
6624744 | Wilson et al. | Sep 2003 | B1 |
6625843 | Kim et al. | Sep 2003 | B2 |
6629028 | Paromtchik et al. | Sep 2003 | B2 |
6633150 | Wallach et al. | Oct 2003 | B1 |
6637546 | Wang | Oct 2003 | B1 |
6639659 | Granger | Oct 2003 | B2 |
6654482 | Parent et al. | Nov 2003 | B1 |
6658325 | Zweig | Dec 2003 | B2 |
6658354 | Lin | Dec 2003 | B2 |
6658692 | Lenkiewicz et al. | Dec 2003 | B2 |
6658693 | Reed | Dec 2003 | B1 |
6661239 | Ozick | Dec 2003 | B1 |
6662889 | De Fazio et al. | Dec 2003 | B2 |
6670817 | Fournier et al. | Dec 2003 | B2 |
6671592 | Bisset et al. | Dec 2003 | B1 |
6671925 | Field et al. | Jan 2004 | B2 |
6677938 | Maynard | Jan 2004 | B1 |
6687571 | Byrne et al. | Feb 2004 | B1 |
6688951 | Kashiwaya et al. | Feb 2004 | B2 |
6690134 | Jones et al. | Feb 2004 | B1 |
6690993 | Foulke et al. | Feb 2004 | B2 |
6697147 | Ko et al. | Feb 2004 | B2 |
6705332 | Field et al. | Mar 2004 | B2 |
6711280 | Stafsudd et al. | Mar 2004 | B2 |
6732826 | Song et al. | May 2004 | B2 |
6735811 | Field et al. | May 2004 | B2 |
6735812 | Hekman et al. | May 2004 | B2 |
6737591 | Lapstun et al. | May 2004 | B1 |
6741054 | Koselka et al. | May 2004 | B2 |
6741364 | Lange et al. | May 2004 | B2 |
6748297 | Song et al. | Jun 2004 | B2 |
6756703 | Chang | Jun 2004 | B2 |
6760647 | Nourbakhsh et al. | Jul 2004 | B2 |
6764373 | Osawa et al. | Jul 2004 | B1 |
6769004 | Barrett | Jul 2004 | B2 |
6774596 | Bisset | Aug 2004 | B1 |
6779380 | Nieuwkamp | Aug 2004 | B1 |
6781338 | Jones et al. | Aug 2004 | B2 |
6809490 | Jones et al. | Oct 2004 | B2 |
6810305 | Kirkpatrick | Oct 2004 | B2 |
6810350 | Blakley | Oct 2004 | B2 |
6830120 | Yashima et al. | Dec 2004 | B1 |
6832407 | Salem et al. | Dec 2004 | B2 |
6836701 | McKee | Dec 2004 | B2 |
6841963 | Song et al. | Jan 2005 | B2 |
6845297 | Allard | Jan 2005 | B2 |
6848146 | Wright et al. | Feb 2005 | B2 |
6854148 | Rief et al. | Feb 2005 | B1 |
6856811 | Burdue et al. | Feb 2005 | B2 |
6859010 | Jeon et al. | Feb 2005 | B2 |
6859682 | Naka et al. | Feb 2005 | B2 |
6860206 | Rudakevych et al. | Mar 2005 | B1 |
6865447 | Lau et al. | Mar 2005 | B2 |
6870792 | Chiappetta | Mar 2005 | B2 |
6871115 | Huang et al. | Mar 2005 | B2 |
6883201 | Jones et al. | Apr 2005 | B2 |
6886651 | Slocum et al. | May 2005 | B1 |
6888333 | Laby | May 2005 | B2 |
6901624 | Mori et al. | Jun 2005 | B2 |
6906702 | Tanaka et al. | Jun 2005 | B1 |
6914403 | Tsurumi | Jul 2005 | B2 |
6917854 | Bayer | Jul 2005 | B2 |
6925357 | Wang et al. | Aug 2005 | B2 |
6925679 | Wallach et al. | Aug 2005 | B2 |
6929548 | Wang | Aug 2005 | B2 |
D510066 | Hickey et al. | Sep 2005 | S |
6938298 | Aasen | Sep 2005 | B2 |
6940291 | Ozick | Sep 2005 | B1 |
6941199 | Bottomley et al. | Sep 2005 | B1 |
6956348 | Landry et al. | Oct 2005 | B2 |
6957712 | Song et al. | Oct 2005 | B2 |
6960986 | Asama et al. | Nov 2005 | B2 |
6965209 | Jones et al. | Nov 2005 | B2 |
6965211 | Tsurumi | Nov 2005 | B2 |
6968592 | Takeuchi et al. | Nov 2005 | B2 |
6971140 | Kim | Dec 2005 | B2 |
6975246 | Trudeau | Dec 2005 | B1 |
6980229 | Ebersole | Dec 2005 | B1 |
6985556 | Shanmugavel et al. | Jan 2006 | B2 |
6993954 | George et al. | Feb 2006 | B1 |
6999850 | McDonald | Feb 2006 | B2 |
7013527 | Thomas et al. | Mar 2006 | B2 |
7024278 | Chiappetta et al. | Apr 2006 | B2 |
7024280 | Parker et al. | Apr 2006 | B2 |
7027893 | Perry et al. | Apr 2006 | B2 |
7030768 | Wanie | Apr 2006 | B2 |
7031805 | Lee et al. | Apr 2006 | B2 |
7032469 | Bailey | Apr 2006 | B2 |
7040869 | Beenker | May 2006 | B2 |
7041029 | Fulghum et al. | May 2006 | B2 |
7051399 | Field et al. | May 2006 | B2 |
7053578 | Diehl et al. | May 2006 | B2 |
7054716 | McKee et al. | May 2006 | B2 |
7055210 | Keppler et al. | Jun 2006 | B2 |
7057120 | Ma et al. | Jun 2006 | B2 |
7057643 | Iida et al. | Jun 2006 | B2 |
7059012 | Song et al. | Jun 2006 | B2 |
7065430 | Naka et al. | Jun 2006 | B2 |
7066291 | Martins et al. | Jun 2006 | B2 |
7069124 | Whittaker et al. | Jun 2006 | B1 |
7075661 | Petty et al. | Jul 2006 | B2 |
7079923 | Abramson et al. | Jul 2006 | B2 |
7085623 | Siegers | Aug 2006 | B2 |
7085624 | Aldred et al. | Aug 2006 | B2 |
7113847 | Chmura et al. | Sep 2006 | B2 |
7117067 | McLurkin et al. | Oct 2006 | B2 |
7133746 | Abramson et al. | Nov 2006 | B2 |
7142198 | Lee | Nov 2006 | B2 |
7148458 | Schell et al. | Dec 2006 | B2 |
7155308 | Jones | Dec 2006 | B2 |
7167775 | Abramson et al. | Jan 2007 | B2 |
7171285 | Kim et al. | Jan 2007 | B2 |
7173391 | Jones et al. | Feb 2007 | B2 |
7174238 | Zweig | Feb 2007 | B1 |
7188000 | Chiappetta et al. | Mar 2007 | B2 |
7193384 | Norman et al. | Mar 2007 | B1 |
7196487 | Jones et al. | Mar 2007 | B2 |
7201786 | Wegelin et al. | Apr 2007 | B2 |
7206677 | Hulden | Apr 2007 | B2 |
7211980 | Bruemmer et al. | May 2007 | B1 |
7225500 | Diehl et al. | Jun 2007 | B2 |
7246405 | Yan | Jul 2007 | B2 |
7248951 | Hulden | Jul 2007 | B2 |
7254464 | McLurkin et al. | Aug 2007 | B1 |
7275280 | Haegermarck et al. | Oct 2007 | B2 |
7283892 | Boillot et al. | Oct 2007 | B1 |
7288912 | Landry et al. | Oct 2007 | B2 |
7318248 | Yan et al. | Jan 2008 | B1 |
7320149 | Huffman et al. | Jan 2008 | B1 |
7321807 | Laski | Jan 2008 | B2 |
7324870 | Lee | Jan 2008 | B2 |
7328196 | Peters | Feb 2008 | B2 |
7332890 | Cohen et al. | Feb 2008 | B2 |
7346428 | Huffman et al. | Mar 2008 | B1 |
7352153 | Yan | Apr 2008 | B2 |
7359766 | Jeon et al. | Apr 2008 | B2 |
7360277 | Moshenrose et al. | Apr 2008 | B2 |
7363108 | Noda et al. | Apr 2008 | B2 |
7388879 | Sabe et al. | Jun 2008 | B2 |
7389156 | Ziegler et al. | Jun 2008 | B2 |
7389166 | Harwig et al. | Jun 2008 | B2 |
7408157 | Yan | Aug 2008 | B2 |
7418762 | Arai et al. | Sep 2008 | B2 |
7430455 | Casey et al. | Sep 2008 | B2 |
7430462 | Chiu et al. | Sep 2008 | B2 |
7441298 | Svendsen et al. | Oct 2008 | B2 |
7444206 | Abramson et al. | Oct 2008 | B2 |
7448113 | Jones et al. | Nov 2008 | B2 |
7459871 | Landry et al. | Dec 2008 | B2 |
7467026 | Sakagami et al. | Dec 2008 | B2 |
7474941 | Kim et al. | Jan 2009 | B2 |
7503096 | Lin | Mar 2009 | B2 |
7515991 | Egawa et al. | Apr 2009 | B2 |
7539557 | Yamauchi | May 2009 | B2 |
7546891 | Won | Jun 2009 | B2 |
7555363 | Augenbraun et al. | Jun 2009 | B2 |
7556108 | Won | Jul 2009 | B2 |
7557703 | Yamada et al. | Jul 2009 | B2 |
7568259 | Yan | Aug 2009 | B2 |
7571511 | Jones et al. | Aug 2009 | B2 |
7578020 | Jaworski et al. | Aug 2009 | B2 |
7597162 | Won | Oct 2009 | B2 |
7600521 | Woo | Oct 2009 | B2 |
7603744 | Reindle | Oct 2009 | B2 |
7611583 | Buckley et al. | Nov 2009 | B2 |
7617557 | Reindle | Nov 2009 | B2 |
7620476 | Morse et al. | Nov 2009 | B2 |
7636928 | Uno | Dec 2009 | B2 |
7636982 | Jones et al. | Dec 2009 | B2 |
7647144 | Haegermarck | Jan 2010 | B2 |
7650666 | Jang | Jan 2010 | B2 |
7660650 | Kawagoe et al. | Feb 2010 | B2 |
7663333 | Jones et al. | Feb 2010 | B2 |
7693605 | Park | Apr 2010 | B2 |
7706917 | Chiappetta et al. | Apr 2010 | B1 |
7720554 | DiBernardo et al. | May 2010 | B2 |
7720572 | Ziegler et al. | May 2010 | B2 |
7761954 | Ziegler et al. | Jul 2010 | B2 |
7765635 | Park | Aug 2010 | B2 |
7784147 | Burkholder et al. | Aug 2010 | B2 |
7801645 | Taylor et al. | Sep 2010 | B2 |
7805220 | Taylor et al. | Sep 2010 | B2 |
7809944 | Kawamoto | Oct 2010 | B2 |
7832048 | Harwig et al. | Nov 2010 | B2 |
7849555 | Hahm et al. | Dec 2010 | B2 |
7853645 | Brown et al. | Dec 2010 | B2 |
7860680 | Arms et al. | Dec 2010 | B2 |
7920941 | Park et al. | Apr 2011 | B2 |
7937800 | Yan | May 2011 | B2 |
7957836 | Myeong et al. | Jun 2011 | B2 |
7996097 | DiBernardo et al. | Aug 2011 | B2 |
8035255 | Kurs et al. | Oct 2011 | B2 |
8087117 | Kapoor et al. | Jan 2012 | B2 |
8106539 | Schatz et al. | Jan 2012 | B2 |
8295955 | DiBernardo et al. | Oct 2012 | B2 |
8304935 | Karalis et al. | Nov 2012 | B2 |
8324759 | Karalis et al. | Dec 2012 | B2 |
8380350 | Ozick et al. | Feb 2013 | B2 |
8396592 | Jones et al. | Mar 2013 | B2 |
8400017 | Kurs et al. | Mar 2013 | B2 |
8410636 | Kurs et al. | Apr 2013 | B2 |
8412377 | Casey et al. | Apr 2013 | B2 |
8428778 | Landry et al. | Apr 2013 | B2 |
8441154 | Karalis et al. | May 2013 | B2 |
8461719 | Kesler et al. | Jun 2013 | B2 |
8461720 | Kurs et al. | Jun 2013 | B2 |
8461721 | Karalis et al. | Jun 2013 | B2 |
8461722 | Kurs et al. | Jun 2013 | B2 |
8466583 | Karalis et al. | Jun 2013 | B2 |
8471410 | Karalis et al. | Jun 2013 | B2 |
8476788 | Karalis et al. | Jul 2013 | B2 |
8482158 | Kurs et al. | Jul 2013 | B2 |
8487480 | Kesler et al. | Jul 2013 | B1 |
8497601 | Hall et al. | Jul 2013 | B2 |
8552592 | Schatz et al. | Oct 2013 | B2 |
8569914 | Karalis et al. | Oct 2013 | B2 |
8587153 | Schatz et al. | Nov 2013 | B2 |
8587155 | Giler et al. | Nov 2013 | B2 |
8594840 | Chiappetta et al. | Nov 2013 | B1 |
8598743 | Hall et al. | Dec 2013 | B2 |
8618696 | Kurs et al. | Dec 2013 | B2 |
20010004719 | Sommer | Jun 2001 | A1 |
20010013929 | Torsten | Aug 2001 | A1 |
20010020200 | Das et al. | Sep 2001 | A1 |
20010025183 | Shahidi | Sep 2001 | A1 |
20010037163 | Allard | Nov 2001 | A1 |
20010043509 | Green et al. | Nov 2001 | A1 |
20010045883 | Holdaway et al. | Nov 2001 | A1 |
20010047231 | Peless et al. | Nov 2001 | A1 |
20010047895 | De Fazio et al. | Dec 2001 | A1 |
20020011367 | Kolesnik | Jan 2002 | A1 |
20020011813 | Koselka et al. | Jan 2002 | A1 |
20020016649 | Jones | Feb 2002 | A1 |
20020021219 | Edwards | Feb 2002 | A1 |
20020027652 | Paromtchik et al. | Mar 2002 | A1 |
20020036779 | Kiyoi et al. | Mar 2002 | A1 |
20020081937 | Yamada et al. | Jun 2002 | A1 |
20020095239 | Wallach et al. | Jul 2002 | A1 |
20020097400 | Jung et al. | Jul 2002 | A1 |
20020104963 | Mancevski | Aug 2002 | A1 |
20020108209 | Peterson | Aug 2002 | A1 |
20020112742 | Bredo et al. | Aug 2002 | A1 |
20020113973 | Ge | Aug 2002 | A1 |
20020116089 | Kirkpatrick | Aug 2002 | A1 |
20020120364 | Colens | Aug 2002 | A1 |
20020124343 | Reed | Sep 2002 | A1 |
20020153185 | Song et al. | Oct 2002 | A1 |
20020156556 | Ruffner | Oct 2002 | A1 |
20020159051 | Guo | Oct 2002 | A1 |
20020166193 | Kasper | Nov 2002 | A1 |
20020169521 | Goodman et al. | Nov 2002 | A1 |
20020173877 | Zweig | Nov 2002 | A1 |
20020189871 | Won | Dec 2002 | A1 |
20030009259 | Hattori et al. | Jan 2003 | A1 |
20030015232 | Nguyen | Jan 2003 | A1 |
20030019071 | Field et al. | Jan 2003 | A1 |
20030023356 | Keable | Jan 2003 | A1 |
20030024986 | Mazz et al. | Feb 2003 | A1 |
20030025472 | Jones et al. | Feb 2003 | A1 |
20030028286 | Glenn et al. | Feb 2003 | A1 |
20030030399 | Jacobs | Feb 2003 | A1 |
20030058262 | Sato et al. | Mar 2003 | A1 |
20030060928 | Abramson et al. | Mar 2003 | A1 |
20030067451 | Tagg et al. | Apr 2003 | A1 |
20030097875 | Lentz et al. | May 2003 | A1 |
20030120389 | Abramson et al. | Jun 2003 | A1 |
20030124312 | Autumn | Jul 2003 | A1 |
20030126352 | Barrett | Jul 2003 | A1 |
20030137268 | Papanikolopoulos et al. | Jul 2003 | A1 |
20030146384 | Logsdon et al. | Aug 2003 | A1 |
20030159232 | Hekman et al. | Aug 2003 | A1 |
20030168081 | Lee et al. | Sep 2003 | A1 |
20030175138 | Beenker | Sep 2003 | A1 |
20030192144 | Song et al. | Oct 2003 | A1 |
20030193657 | Uomori et al. | Oct 2003 | A1 |
20030208304 | Peless et al. | Nov 2003 | A1 |
20030216834 | Allard | Nov 2003 | A1 |
20030221114 | Hino et al. | Nov 2003 | A1 |
20030229421 | Chmura et al. | Dec 2003 | A1 |
20030229474 | Suzuki et al. | Dec 2003 | A1 |
20030233171 | Heiligensetzer | Dec 2003 | A1 |
20030233177 | Johnson et al. | Dec 2003 | A1 |
20030233870 | Mancevski | Dec 2003 | A1 |
20030233930 | Ozick | Dec 2003 | A1 |
20040016077 | Song et al. | Jan 2004 | A1 |
20040020000 | Jones | Feb 2004 | A1 |
20040030448 | Solomon | Feb 2004 | A1 |
20040030449 | Solomon | Feb 2004 | A1 |
20040030450 | Solomon | Feb 2004 | A1 |
20040030451 | Solomon | Feb 2004 | A1 |
20040030570 | Solomon | Feb 2004 | A1 |
20040030571 | Solomon | Feb 2004 | A1 |
20040031113 | Wosewick et al. | Feb 2004 | A1 |
20040049877 | Jones et al. | Mar 2004 | A1 |
20040055163 | McCambridge et al. | Mar 2004 | A1 |
20040068351 | Solomon | Apr 2004 | A1 |
20040068415 | Solomon | Apr 2004 | A1 |
20040068416 | Solomon | Apr 2004 | A1 |
20040074038 | Im et al. | Apr 2004 | A1 |
20040074044 | Diehl et al. | Apr 2004 | A1 |
20040076324 | Burl et al. | Apr 2004 | A1 |
20040083570 | Song et al. | May 2004 | A1 |
20040085037 | Jones et al. | May 2004 | A1 |
20040088079 | Lavarec et al. | May 2004 | A1 |
20040093122 | Galibraith | May 2004 | A1 |
20040098167 | Yi et al. | May 2004 | A1 |
20040111184 | Chiappetta et al. | Jun 2004 | A1 |
20040111821 | Lenkiewicz et al. | Jun 2004 | A1 |
20040113777 | Matsuhira et al. | Jun 2004 | A1 |
20040117064 | McDonald | Jun 2004 | A1 |
20040117846 | Karaoguz et al. | Jun 2004 | A1 |
20040118998 | Wingett et al. | Jun 2004 | A1 |
20040125461 | Kawamura | Jul 2004 | A1 |
20040128028 | Miyamoto et al. | Jul 2004 | A1 |
20040133316 | Dean | Jul 2004 | A1 |
20040134336 | Solomon | Jul 2004 | A1 |
20040134337 | Solomon | Jul 2004 | A1 |
20040143919 | Wilder | Jul 2004 | A1 |
20040148419 | Chen et al. | Jul 2004 | A1 |
20040148731 | Damman et al. | Aug 2004 | A1 |
20040153212 | Profio et al. | Aug 2004 | A1 |
20040156541 | Jeon et al. | Aug 2004 | A1 |
20040158357 | Lee et al. | Aug 2004 | A1 |
20040168148 | Goncalves et al. | Aug 2004 | A1 |
20040181706 | Chen et al. | Sep 2004 | A1 |
20040187249 | Jones et al. | Sep 2004 | A1 |
20040187457 | Colens | Sep 2004 | A1 |
20040196451 | Aoyama | Oct 2004 | A1 |
20040200505 | Taylor et al. | Oct 2004 | A1 |
20040201361 | Koh et al. | Oct 2004 | A1 |
20040204792 | Taylor et al. | Oct 2004 | A1 |
20040204804 | Lee et al. | Oct 2004 | A1 |
20040210345 | Noda et al. | Oct 2004 | A1 |
20040210347 | Sawada et al. | Oct 2004 | A1 |
20040211444 | Taylor et al. | Oct 2004 | A1 |
20040221790 | Sinclair et al. | Nov 2004 | A1 |
20040236468 | Taylor et al. | Nov 2004 | A1 |
20040244138 | Taylor et al. | Dec 2004 | A1 |
20040255425 | Arai et al. | Dec 2004 | A1 |
20050000543 | Taylor et al. | Jan 2005 | A1 |
20050010330 | Abramson et al. | Jan 2005 | A1 |
20050010331 | Taylor et al. | Jan 2005 | A1 |
20050015920 | Kim et al. | Jan 2005 | A1 |
20050021181 | Kim et al. | Jan 2005 | A1 |
20050028316 | Thomas et al. | Feb 2005 | A1 |
20050033124 | Kelly et al. | Feb 2005 | A1 |
20050053912 | Roth et al. | Mar 2005 | A1 |
20050055796 | Wright et al. | Mar 2005 | A1 |
20050067994 | Jones et al. | Mar 2005 | A1 |
20050081782 | Buckley et al. | Apr 2005 | A1 |
20050085947 | Aldred et al. | Apr 2005 | A1 |
20050091782 | Gordon et al. | May 2005 | A1 |
20050091786 | Wright et al. | May 2005 | A1 |
20050137749 | Jeon et al. | Jun 2005 | A1 |
20050144751 | Kegg et al. | Jul 2005 | A1 |
20050150074 | Diehl et al. | Jul 2005 | A1 |
20050150519 | Keppler et al. | Jul 2005 | A1 |
20050154795 | Kuz et al. | Jul 2005 | A1 |
20050156562 | Cohen et al. | Jul 2005 | A1 |
20050162119 | Landry et al. | Jul 2005 | A1 |
20050163119 | Ito et al. | Jul 2005 | A1 |
20050165508 | Kanda et al. | Jul 2005 | A1 |
20050166354 | Uehigashi | Aug 2005 | A1 |
20050166355 | Tani | Aug 2005 | A1 |
20050172445 | Diehl et al. | Aug 2005 | A1 |
20050183229 | Uehigashi | Aug 2005 | A1 |
20050183230 | Uehigashi | Aug 2005 | A1 |
20050187678 | Myeong et al. | Aug 2005 | A1 |
20050192707 | Park et al. | Sep 2005 | A1 |
20050194973 | Kwon et al. | Sep 2005 | A1 |
20050204717 | Colens | Sep 2005 | A1 |
20050209736 | Kawagoe | Sep 2005 | A1 |
20050211880 | Schell et al. | Sep 2005 | A1 |
20050212929 | Schell et al. | Sep 2005 | A1 |
20050213082 | DiBernardo et al. | Sep 2005 | A1 |
20050213109 | Schell et al. | Sep 2005 | A1 |
20050217042 | Reindle | Oct 2005 | A1 |
20050218852 | Landry et al. | Oct 2005 | A1 |
20050222933 | Wesby | Oct 2005 | A1 |
20050229340 | Sawalski et al. | Oct 2005 | A1 |
20050229355 | Crouch et al. | Oct 2005 | A1 |
20050235451 | Yan | Oct 2005 | A1 |
20050251292 | Casey et al. | Nov 2005 | A1 |
20050255425 | Pierson | Nov 2005 | A1 |
20050258154 | Blankenship et al. | Nov 2005 | A1 |
20050273967 | Taylor et al. | Dec 2005 | A1 |
20050288819 | De Guzman | Dec 2005 | A1 |
20050289527 | Illowsky et al. | Dec 2005 | A1 |
20060000050 | Cipolla et al. | Jan 2006 | A1 |
20060009879 | Lynch et al. | Jan 2006 | A1 |
20060010638 | Shimizu et al. | Jan 2006 | A1 |
20060020369 | Taylor et al. | Jan 2006 | A1 |
20060020370 | Abramson | Jan 2006 | A1 |
20060021168 | Nishikawa | Feb 2006 | A1 |
20060025134 | Cho et al. | Feb 2006 | A1 |
20060037170 | Shimizu | Feb 2006 | A1 |
20060042042 | Mertes et al. | Mar 2006 | A1 |
20060044546 | Lewin et al. | Mar 2006 | A1 |
20060060216 | Woo | Mar 2006 | A1 |
20060061657 | Rew et al. | Mar 2006 | A1 |
20060064828 | Stein et al. | Mar 2006 | A1 |
20060087273 | Ko et al. | Apr 2006 | A1 |
20060089765 | Pack et al. | Apr 2006 | A1 |
20060095169 | Minor et al. | May 2006 | A1 |
20060100741 | Jung | May 2006 | A1 |
20060107894 | Buckley et al. | May 2006 | A1 |
20060119839 | Bertin et al. | Jun 2006 | A1 |
20060143295 | Costa-Requena et al. | Jun 2006 | A1 |
20060146776 | Kim | Jul 2006 | A1 |
20060150361 | Aldred et al. | Jul 2006 | A1 |
20060184293 | Konandreas et al. | Aug 2006 | A1 |
20060185690 | Song et al. | Aug 2006 | A1 |
20060190133 | Konandreas et al. | Aug 2006 | A1 |
20060190134 | Ziegler et al. | Aug 2006 | A1 |
20060190146 | Morse et al. | Aug 2006 | A1 |
20060196003 | Song et al. | Sep 2006 | A1 |
20060200281 | Ziegler et al. | Sep 2006 | A1 |
20060220900 | Ceskutti et al. | Oct 2006 | A1 |
20060229774 | Park et al. | Oct 2006 | A1 |
20060259194 | Chiu | Nov 2006 | A1 |
20060259494 | Watson et al. | Nov 2006 | A1 |
20060278161 | Burkholder et al. | Dec 2006 | A1 |
20060288519 | Jaworski et al. | Dec 2006 | A1 |
20060293787 | Kanda et al. | Dec 2006 | A1 |
20060293808 | Qian | Dec 2006 | A1 |
20070006404 | Cheng et al. | Jan 2007 | A1 |
20070016328 | Ziegler et al. | Jan 2007 | A1 |
20070017061 | Yan | Jan 2007 | A1 |
20070028574 | Yan | Feb 2007 | A1 |
20070032904 | Kawagoe et al. | Feb 2007 | A1 |
20070042716 | Goodall et al. | Feb 2007 | A1 |
20070043459 | Abbott et al. | Feb 2007 | A1 |
20070045018 | Carter et al. | Mar 2007 | A1 |
20070061041 | Zweig | Mar 2007 | A1 |
20070061043 | Ermakov et al. | Mar 2007 | A1 |
20070097832 | Koivisto et al. | May 2007 | A1 |
20070114975 | Cohen et al. | May 2007 | A1 |
20070142964 | Abramson | Jun 2007 | A1 |
20070150096 | Yeh et al. | Jun 2007 | A1 |
20070156286 | Yamauchi | Jul 2007 | A1 |
20070157415 | Lee et al. | Jul 2007 | A1 |
20070157420 | Lee et al. | Jul 2007 | A1 |
20070179670 | Chiappetta et al. | Aug 2007 | A1 |
20070226949 | Hahm et al. | Oct 2007 | A1 |
20070234492 | Svendsen et al. | Oct 2007 | A1 |
20070244610 | Ozick et al. | Oct 2007 | A1 |
20070245511 | Hahm et al. | Oct 2007 | A1 |
20070250212 | Halloran et al. | Oct 2007 | A1 |
20070261193 | Gordon et al. | Nov 2007 | A1 |
20070266508 | Jones et al. | Nov 2007 | A1 |
20070267230 | Won | Nov 2007 | A1 |
20070267998 | Cohen et al. | Nov 2007 | A1 |
20080007203 | Cohen et al. | Jan 2008 | A1 |
20080009965 | Bruemmer et al. | Jan 2008 | A1 |
20080039974 | Sandin et al. | Feb 2008 | A1 |
20080047092 | Schnittman et al. | Feb 2008 | A1 |
20080052846 | Kapoor et al. | Mar 2008 | A1 |
20080058987 | Ozick et al. | Mar 2008 | A1 |
20080063400 | Hudson et al. | Mar 2008 | A1 |
20080086241 | Phillips et al. | Apr 2008 | A1 |
20080091304 | Ozick et al. | Apr 2008 | A1 |
20080109126 | Sandin et al. | May 2008 | A1 |
20080121097 | Rudakevych et al. | May 2008 | A1 |
20080133052 | Jones et al. | Jun 2008 | A1 |
20080134458 | Ziegler et al. | Jun 2008 | A1 |
20080140255 | Ziegler et al. | Jun 2008 | A1 |
20080143063 | Won | Jun 2008 | A1 |
20080143064 | Won | Jun 2008 | A1 |
20080155768 | Ziegler et al. | Jul 2008 | A1 |
20080184518 | Taylor et al. | Aug 2008 | A1 |
20080236907 | Won | Oct 2008 | A1 |
20080266254 | Robbins et al. | Oct 2008 | A1 |
20080266748 | Lee | Oct 2008 | A1 |
20080276407 | Schnittman et al. | Nov 2008 | A1 |
20080276408 | Gilbert, Jr. et al. | Nov 2008 | A1 |
20080281470 | Gilbert, Jr. et al. | Nov 2008 | A1 |
20080282494 | Won et al. | Nov 2008 | A1 |
20080294288 | Yamauchi | Nov 2008 | A1 |
20080302586 | Yan | Dec 2008 | A1 |
20080307590 | Jones et al. | Dec 2008 | A1 |
20090007366 | Svendsen et al. | Jan 2009 | A1 |
20090038089 | Landry et al. | Feb 2009 | A1 |
20090048727 | Hong et al. | Feb 2009 | A1 |
20090049640 | Lee et al. | Feb 2009 | A1 |
20090055022 | Casey et al. | Feb 2009 | A1 |
20090065271 | Won | Mar 2009 | A1 |
20090102296 | Greene et al. | Apr 2009 | A1 |
20090107738 | Won | Apr 2009 | A1 |
20090173553 | Won | Jul 2009 | A1 |
20090232506 | Hudson et al. | Sep 2009 | A1 |
20090292393 | Casey et al. | Nov 2009 | A1 |
20100001991 | Jeong et al. | Jan 2010 | A1 |
20100006028 | Buckley et al. | Jan 2010 | A1 |
20100011529 | Won et al. | Jan 2010 | A1 |
20100049365 | Jones et al. | Feb 2010 | A1 |
20100063628 | Landry et al. | Mar 2010 | A1 |
20100076600 | Cross et al. | Mar 2010 | A1 |
20100082193 | Chiappetta | Apr 2010 | A1 |
20100107355 | Won et al. | May 2010 | A1 |
20100139995 | Rudakevych | Jun 2010 | A1 |
20100257690 | Jones et al. | Oct 2010 | A1 |
20100257691 | Jones et al. | Oct 2010 | A1 |
20100263158 | Jones et al. | Oct 2010 | A1 |
20100268384 | Jones et al. | Oct 2010 | A1 |
20100274387 | Pitzer | Oct 2010 | A1 |
20100293742 | Chung et al. | Nov 2010 | A1 |
20100312429 | Jones et al. | Dec 2010 | A1 |
20130245937 | DiBernardo et al. | Sep 2013 | A1 |
Number | Date | Country |
---|---|---|
2128842 | Dec 1980 | DE |
3317376 | Dec 1987 | DE |
3536907 | Feb 1989 | DE |
3404202 | Dec 1992 | DE |
199311014 | Oct 1993 | DE |
4338841 | May 1995 | DE |
4414683 | Oct 1995 | DE |
19849978 | Feb 2001 | DE |
102004038074 | Jun 2005 | DE |
10357636 | Jul 2005 | DE |
102004041021 | Aug 2005 | DE |
102005046813 | Apr 2007 | DE |
338988 | Dec 1988 | DK |
0265542 | May 1988 | EP |
0281085 | Sep 1988 | EP |
0286328 | Oct 1988 | EP |
0294101 | Dec 1988 | EP |
0358628 | Sep 1989 | EP |
0352045 | Jan 1990 | EP |
0433697 | Jun 1991 | EP |
0437024 | Jul 1991 | EP |
0479273 | Apr 1992 | EP |
0554978 | Aug 1993 | EP |
0615719 | Sep 1994 | EP |
0792726 | Sep 1997 | EP |
0798567 | Oct 1997 | EP |
0930040 | Jul 1999 | EP |
0845237 | Apr 2000 | EP |
0861629 | Sep 2001 | EP |
1228734 | Aug 2002 | EP |
1380245 | Jan 2004 | EP |
1380246 | Jan 2004 | EP |
1018315 | Nov 2004 | EP |
1553472 | Jul 2005 | EP |
1557730 | Jul 2005 | EP |
1642522 | Apr 2006 | EP |
1836941 | Sep 2007 | EP |
2238196 | Aug 2005 | ES |
722755 | Mar 1932 | FR |
2601443 | Jan 1998 | FR |
2828589 | Feb 2003 | FR |
702426 | Jan 1954 | GB |
2128842 | May 1984 | GB |
2213047 | Aug 1989 | GB |
2225221 | May 1990 | GB |
2267360 | Dec 1993 | GB |
2283838 | May 1995 | GB |
2284957 | Jun 1995 | GB |
2300082 | Oct 1996 | GB |
2344747 | Jun 2000 | GB |
2404330 | Feb 2005 | GB |
2417354 | Feb 2006 | GB |
53021869 | Feb 1978 | JP |
53110257 | Sep 1978 | JP |
57064217 | Apr 1982 | JP |
59005315 | Jan 1984 | JP |
59033511 | Mar 1984 | JP |
59094005 | May 1984 | JP |
59099308 | Jun 1984 | JP |
59112311 | Jun 1984 | JP |
59120124 | Jul 1984 | JP |
59131668 | Sep 1984 | JP |
59164973 | Sep 1984 | JP |
59184917 | Oct 1984 | JP |
2283343 | Nov 1984 | JP |
59212924 | Dec 1984 | JP |
59226909 | Dec 1984 | JP |
60089213 | May 1985 | JP |
60211510 | Oct 1985 | JP |
60259895 | Dec 1985 | JP |
61023221 | Jan 1986 | JP |
61097712 | May 1986 | JP |
61160366 | Jul 1986 | JP |
62070709 | Apr 1987 | JP |
62074018 | Apr 1987 | JP |
62120510 | Jun 1987 | JP |
62154008 | Jul 1987 | JP |
62164431 | Jul 1987 | JP |
62263507 | Nov 1987 | JP |
62263508 | Nov 1987 | JP |
62189057 | Dec 1987 | JP |
63079623 | Apr 1988 | JP |
63158032 | Jul 1988 | JP |
63203483 | Aug 1988 | JP |
63241610 | Oct 1988 | JP |
1118752 | Aug 1989 | JP |
2-6312 | Jan 1990 | JP |
3051023 | Mar 1991 | JP |
4019586 | Jan 1992 | JP |
4074285 | Mar 1992 | JP |
4084921 | Mar 1992 | JP |
5023269 | Feb 1993 | JP |
5042076 | Feb 1993 | JP |
05046239 | Feb 1993 | JP |
5046246 | Feb 1993 | JP |
5060049 | Mar 1993 | JP |
5091604 | Apr 1993 | JP |
5095879 | Apr 1993 | JP |
5150827 | Jun 1993 | JP |
5150829 | Jun 1993 | JP |
5054620 | Jul 1993 | JP |
5040519 | Oct 1993 | JP |
05257527 | Oct 1993 | JP |
5257533 | Oct 1993 | JP |
05285861 | Nov 1993 | JP |
5302836 | Nov 1993 | JP |
5312514 | Nov 1993 | JP |
5341904 | Dec 1993 | JP |
6003251 | Jan 1994 | JP |
6038912 | Feb 1994 | JP |
6105781 | Apr 1994 | JP |
6137828 | May 1994 | JP |
6154143 | Jun 1994 | JP |
6293095 | Oct 1994 | JP |
06327598 | Nov 1994 | JP |
7047046 | Feb 1995 | JP |
07129239 | May 1995 | JP |
7059702 | Jun 1995 | JP |
7222705 | Aug 1995 | JP |
7270518 | Oct 1995 | JP |
7281752 | Oct 1995 | JP |
7311041 | Nov 1995 | JP |
7313417 | Dec 1995 | JP |
7319542 | Dec 1995 | JP |
8000393 | Jan 1996 | JP |
8016241 | Jan 1996 | JP |
8016776 | Jan 1996 | JP |
8063229 | Mar 1996 | JP |
8084696 | Apr 1996 | JP |
8089449 | Apr 1996 | JP |
08089451 | Apr 1996 | JP |
8123548 | May 1996 | JP |
8152916 | Jun 1996 | JP |
8256960 | Oct 1996 | JP |
8263137 | Oct 1996 | JP |
8286741 | Nov 1996 | JP |
8286744 | Nov 1996 | JP |
8286745 | Nov 1996 | JP |
8286747 | Nov 1996 | JP |
8322774 | Dec 1996 | JP |
8335112 | Dec 1996 | JP |
8339297 | Dec 1996 | JP |
9044240 | Feb 1997 | JP |
9047413 | Feb 1997 | JP |
9066855 | Mar 1997 | JP |
9145309 | Jun 1997 | JP |
09160644 | Jun 1997 | JP |
9179625 | Jul 1997 | JP |
09185410 | Jul 1997 | JP |
9192069 | Jul 1997 | JP |
2555263 | Aug 1997 | JP |
9204223 | Aug 1997 | JP |
9204224 | Aug 1997 | JP |
09206258 | Aug 1997 | JP |
9233712 | Sep 1997 | JP |
9265319 | Oct 1997 | JP |
9269807 | Oct 1997 | JP |
9269810 | Oct 1997 | JP |
9319431 | Dec 1997 | JP |
9319432 | Dec 1997 | JP |
9319434 | Dec 1997 | JP |
9325812 | Dec 1997 | JP |
10-27018 | Jan 1998 | JP |
10055215 | Feb 1998 | JP |
10117973 | May 1998 | JP |
10118963 | May 1998 | JP |
10165738 | Jun 1998 | JP |
10177414 | Jun 1998 | JP |
10214114 | Aug 1998 | JP |
10240342 | Sep 1998 | JP |
10240343 | Sep 1998 | JP |
10260727 | Sep 1998 | JP |
10295595 | Nov 1998 | JP |
10314088 | Dec 1998 | JP |
11015941 | Jan 1999 | JP |
11065655 | Mar 1999 | JP |
11102219 | Apr 1999 | JP |
11102220 | Apr 1999 | JP |
11162454 | Jun 1999 | JP |
11174145 | Jul 1999 | JP |
11175149 | Jul 1999 | JP |
11178764 | Jul 1999 | JP |
11178765 | Jul 1999 | JP |
11212642 | Aug 1999 | JP |
11213157 | Aug 1999 | JP |
11508810 | Aug 1999 | JP |
11510935 | Sep 1999 | JP |
11282532 | Oct 1999 | JP |
11282533 | Oct 1999 | JP |
11295412 | Oct 1999 | JP |
11346964 | Dec 1999 | JP |
2000047728 | Feb 2000 | JP |
2000056006 | Feb 2000 | JP |
2000056831 | Feb 2000 | JP |
2000060782 | Feb 2000 | JP |
2000066722 | Mar 2000 | JP |
2000075925 | Mar 2000 | JP |
2000102499 | Apr 2000 | JP |
2000275321 | Oct 2000 | JP |
2000279353 | Oct 2000 | JP |
2000353014 | Dec 2000 | JP |
2001022443 | Jan 2001 | JP |
2001067588 | Mar 2001 | JP |
2001087182 | Apr 2001 | JP |
2001121455 | May 2001 | JP |
2001125641 | May 2001 | JP |
2001508572 | Jun 2001 | JP |
2001197008 | Jul 2001 | JP |
3197758 | Aug 2001 | JP |
3201903 | Aug 2001 | JP |
2001216482 | Aug 2001 | JP |
2001258807 | Sep 2001 | JP |
2001265437 | Sep 2001 | JP |
2001275908 | Oct 2001 | JP |
2001289939 | Oct 2001 | JP |
2001-522079 | Nov 2001 | JP |
2001306170 | Nov 2001 | JP |
2001525567 | Dec 2001 | JP |
2002-82720 | Mar 2002 | JP |
2002073170 | Mar 2002 | JP |
2002078650 | Mar 2002 | JP |
2002204768 | Jul 2002 | JP |
2002204769 | Jul 2002 | JP |
2002247510 | Aug 2002 | JP |
2002532178 | Oct 2002 | JP |
2002532180 | Oct 2002 | JP |
2002323925 | Nov 2002 | JP |
2002333920 | Nov 2002 | JP |
03356170 | Dec 2002 | JP |
2002355206 | Dec 2002 | JP |
2002360471 | Dec 2002 | JP |
2002360482 | Dec 2002 | JP |
2002366227 | Dec 2002 | JP |
2002369778 | Dec 2002 | JP |
2003005296 | Jan 2003 | JP |
2003010076 | Jan 2003 | JP |
2003010088 | Jan 2003 | JP |
2003028528 | Jan 2003 | JP |
03375843 | Feb 2003 | JP |
2003036116 | Feb 2003 | JP |
2003038401 | Feb 2003 | JP |
2003038402 | Feb 2003 | JP |
2003047579 | Feb 2003 | JP |
2003505127 | Feb 2003 | JP |
2003061882 | Mar 2003 | JP |
2003084994 | Mar 2003 | JP |
2003-515210 | Apr 2003 | JP |
2003167628 | Jun 2003 | JP |
2003180586 | Jul 2003 | JP |
2003180587 | Jul 2003 | JP |
2003186539 | Jul 2003 | JP |
2003190064 | Jul 2003 | JP |
2003241836 | Aug 2003 | JP |
2003262520 | Sep 2003 | JP |
2003304992 | Oct 2003 | JP |
2003310509 | Nov 2003 | JP |
2003330543 | Nov 2003 | JP |
2004123040 | Apr 2004 | JP |
2004148021 | May 2004 | JP |
2004160102 | Jun 2004 | JP |
2004166968 | Jun 2004 | JP |
2004198330 | Jul 2004 | JP |
2004219185 | Aug 2004 | JP |
2004351234 | Dec 2004 | JP |
2005118354 | May 2005 | JP |
2005211360 | Aug 2005 | JP |
2005224265 | Aug 2005 | JP |
2005230032 | Sep 2005 | JP |
2005245916 | Sep 2005 | JP |
2005346700 | Dec 2005 | JP |
2005352707 | Dec 2005 | JP |
2006043071 | Feb 2006 | JP |
2006155274 | Jun 2006 | JP |
2006164223 | Jun 2006 | JP |
2006227673 | Aug 2006 | JP |
2006247467 | Sep 2006 | JP |
2006260161 | Sep 2006 | JP |
2006293662 | Oct 2006 | JP |
2006296697 | Nov 2006 | JP |
2007034866 | Feb 2007 | JP |
2007213180 | Aug 2007 | JP |
2009015611 | Jan 2009 | JP |
2010198552 | Sep 2010 | JP |
9526512 | Oct 1995 | WO |
9530887 | Nov 1995 | WO |
9617258 | Jun 1996 | WO |
9715224 | May 1997 | WO |
9740734 | Nov 1997 | WO |
9741451 | Nov 1997 | WO |
9853456 | Nov 1998 | WO |
9905580 | Feb 1999 | WO |
9916078 | Apr 1999 | WO |
9923543 | May 1999 | WO |
9938056 | Jul 1999 | WO |
9938237 | Jul 1999 | WO |
9943250 | Sep 1999 | WO |
0038026 | Jun 2000 | WO |
0038028 | Jun 2000 | WO |
0038029 | Jun 2000 | WO |
0004430 | Oct 2000 | WO |
0078410 | Dec 2000 | WO |
0106904 | Feb 2001 | WO |
0106905 | Feb 2001 | WO |
0137060 | May 2001 | WO |
01080703 | Nov 2001 | WO |
0191623 | Dec 2001 | WO |
0224292 | Mar 2002 | WO |
0239864 | May 2002 | WO |
0239868 | May 2002 | WO |
02062194 | Aug 2002 | WO |
02067744 | Sep 2002 | WO |
02067745 | Sep 2002 | WO |
02067752 | Sep 2002 | WO |
02069774 | Sep 2002 | WO |
02069775 | Sep 2002 | WO |
02071175 | Sep 2002 | WO |
02074150 | Sep 2002 | WO |
02075356 | Sep 2002 | WO |
02075469 | Sep 2002 | WO |
02075470 | Sep 2002 | WO |
2002075350 | Sep 2002 | WO |
02081074 | Oct 2002 | WO |
02101477 | Dec 2002 | WO |
03015220 | Feb 2003 | WO |
03024292 | Mar 2003 | WO |
03040546 | May 2003 | WO |
03040845 | May 2003 | WO |
03040846 | May 2003 | WO |
03062850 | Jul 2003 | WO |
03062852 | Jul 2003 | WO |
2004004533 | Jan 2004 | WO |
2004004534 | Jan 2004 | WO |
2004006034 | Jan 2004 | WO |
2004025947 | Mar 2004 | WO |
2004058028 | Jul 2004 | WO |
2004059409 | Jul 2004 | WO |
2005006935 | Jan 2005 | WO |
2005037496 | Apr 2005 | WO |
2005055795 | Jun 2005 | WO |
2005055796 | Jun 2005 | WO |
2005076545 | Aug 2005 | WO |
2005077243 | Aug 2005 | WO |
2005077244 | Aug 2005 | WO |
2005081074 | Sep 2005 | WO |
2005082223 | Sep 2005 | WO |
2005083541 | Sep 2005 | WO |
2005098475 | Oct 2005 | WO |
2005098476 | Oct 2005 | WO |
2006046400 | May 2006 | WO |
2006061133 | Jun 2006 | WO |
2006068403 | Jun 2006 | WO |
2006073248 | Jul 2006 | WO |
2006089307 | Aug 2006 | WO |
2007028049 | Mar 2007 | WO |
2007036490 | Apr 2007 | WO |
2007065033 | Jun 2007 | WO |
2007137234 | Nov 2007 | WO |
Entry |
---|
Gutmann et al., A Constant-Time Algorithm for Vector Field SLAM using an Exactly Sparse Extended Information Filter, Evolution Robotics, 8 pages. |
Gutmann et al., Vector Field SLAM, Evolution Robotics, 7 pages. |
Becker, C.; Salas, J.; Tokusei, K.; Latombe, J.-C., "Reliable navigation using landmarks," Proceedings of the 1995 IEEE International Conference on Robotics and Automation, vol. 1, pp. 401-406, May 21-27, 1995. |
International Search Report for PCT/US05/010200, dated Aug. 2, 2005. |
International Search Report for PCT/US05/010244, dated Aug. 2, 2005. |
Japanese Office Action, JP Patent Application No. 2007-506413, dated May 26, 2010, English Translation and Japanese Office Action. |
Andersen et al., “Landmark based navigation strategies,” SPIE Conference on Mobile Robots XIII, SPIE vol. 3525, pp. 170-181, Jan. 8, 1999. |
Ascii, Mar. 25, 2002, http://ascii.jp/elem/000/000/330/330024/, accessed Nov. 2011, 15 pages (with English translation). |
Barker, “Navigation by the Stars—Ben Barker 4th Year Project,” Nov. 2004, 20 pages. |
Benayad-Cherif et al., "Mobile Robot Navigation Sensors," SPIE vol. 1831 Mobile Robots VII, pp. 378-387, 1992. |
Betke et al., "Mobile robot localization using landmarks," Proceedings of the IEEE/RSJ/GI International Conference on Intelligent Robots and Systems '94, 'Advanced Robotic Systems and the Real World' (IROS '94), accessed via IEEE Xplore, 1994, 8 pages. |
Bison et al., “Using a structured beacon for cooperative position estimation,” Robotics and Autonomous Systems, 29(1):33-40, Oct. 1999. |
Blaasvaer et al., “AMOR—An Autonomous Mobile Robot Navigation System,” Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, pp. 2266-2271, 1994. |
Borges et al., “Optimal Mobile Robot Pose Estimation Using Geometrical Maps,” IEEE Transactions on Robotics and Automation, 18(1): 87-94, Feb. 2002. |
Braunstingl et al., “Fuzzy Logic Wall Following of a Mobile Robot Based on the Concept of General Perception,” ICAR '95, 7th International Conference on Advanced Robotics, Sant Feliu De Guixols, Spain, pp. 367-376, Sep. 1995. |
Bulusu et al., "Self Configuring Localization systems: Design and Experimental Evaluation," ACM Transactions on Embedded Computing Systems, 3(1):24-60, 2003. |
Caccia et al., "Bottom-Following for Remotely Operated Vehicles," 5th IFAC Conference, Aalborg, Denmark, pp. 245-250, Aug. 2000. |
Chae et al., “StarLITE: A new artificial landmark for the navigation of mobile robots,” http://www.irc.atr.jp/jk-nrs2005/pdf/Starlite.pdf, 4 pages, 2005. |
Chamberlin et al., “Team 1: Robot Locator Beacon System,” NASA Goddard SFC, Design Proposal, 15 pages, Feb. 2006. |
Champy, "Physical management of IT assets in Data Centers using RFID technologies," RFID 2005 University, Oct. 12-14, 2005, 19 pages. |
Chiri, "Joystick Control for Tiny OS Robot," http://www.eecs.berkeley.edu/Programs/ugrad/superb/papers2002/chiri.pdf, 12 pages, Aug. 2002. |
Christensen et al., "Theoretical Methods for Planning and Control in Mobile Robotics," 1997 First International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 81-86, May 1997. |
CleanMate 365, Intelligent Automatic Vacuum Cleaner, Model No. QQ-1, User Manual, www.metapo.com/support/user_manual.pdf, Dec. 2005, 11 pages. |
Clerentin et al., “A localization method based on two omnidirectional perception systems cooperation,” Proc of IEEE International Conference on Robotics & Automation, San Francisco, CA vol. 2, pp. 1219-1224, Apr. 2000. |
Corke, "High Performance Visual Servoing for robots end-point control," SPIE vol. 2056, Intelligent Robots and Computer Vision, 1993, 10 pages. |
Cozman et al., “Robot Localization using a Computer Vision Sextant,” IEEE International Midwest Conference on Robotics and Automation, pp. 106-111, 1995. |
D'Orazio et al., “Model based Vision System for mobile robot position estimation”, SPIE, vol. 2058 Mobile Robots VIII, pp. 38-49, 1992. |
De Bakker et al., "Smart PSD-array for sheet of light range imaging," Proc. of SPIE, vol. 3965, pp. 1-12, May 2000. |
Denning Roboscrub image (1989), 1 page. |
Desaulniers et al., "An Efficient Algorithm to find a shortest path for a car-like Robot," IEEE Transactions on Robotics and Automation, 11(6):819-828, Dec. 1995. |
Dorfmüller-Ulhaas, “Optical Tracking From User Motion to 3D Interaction,” http://www.cg.tuwien.ac.at/research/publications/2002/Dorfmueller-Ulhaas-thesis, 182 pages, 2002. |
Dorsch et al., “Laser Triangulation: Fundamental uncertainty in distance measurement,” Applied Optics, 33(7):1306-1314, Mar. 1994. |
Doty et al., “Sweep Strategies for a Sensory-Driven, Behavior-Based Vacuum Cleaning Agent,” AAAI 1993 Fall Symposium Series, Instantiating Real-World Agents, pp. 1-6, Oct. 22-24, 1993. |
Dudek et al., "Localizing a Robot with Minimum Travel," Proceedings of the sixth annual ACM-SIAM symposium on Discrete Algorithms, 27(2):583-604, Apr. 1998. |
Dulimarta et al., “Mobile Robot Localization in Indoor Environment”, Pattern Recognition, 30(1):99-111, 1997. |
Dyson's Robot Vacuum Cleaner—the DC06, May 2004, Retrieved from the Internet: URL< http://www.gizmag.com/go/1282/>. Accessed Nov. 2011, 3 pages. |
EBay, "Roomba Timer -> Timed Cleaning-Floorvac Robotic Vacuum," Retrieved from the Internet: URL<cgi.ebay.com/ws/eBayISAPI.dll?viewitem&category=43526&item=4375198387&rd=1>. 5 pages, Apr. 2005. |
Electrolux Trilobite, “Time to enjoy life,” Retrieved from the Internet: URL<http://www.robocon.co.kr/trilobite/Presentation—Trilobite—Kor—030104.ppt, 26 pages, accessed Dec. 2011. |
Electrolux Trilobite, Jan. 12, 2001, http://www.electroluxui.com:8080/2002%5C822%5C833102EN.pdf, accessed Jul. 2, 2012, 10 pages. |
Electrolux, “Designed for the well-lived home,” Retrieved from the Internet: URL<http://www.electroluxusa.com/node57.as[?currentURL=node142.asp%3F >. Accessed Mar. 2005, 2 pages. |
Eren et al., “Accuracy in position estimation of mobile robots based on coded infrared signal transmission,” Proceedings: Integrating Intelligent Instrumentation and Control, Instrumentation and Measurement Technology Conference, 1995, IMTC/95. pp. 548-551, 1995. |
Eren et al., “Operation of Mobile Robots in a Structured Infrared Environment,” Proceedings ‘Sensing, Processing, Networking’, IEEE Instrumentation and Measurement Technology Conference, 1997 (IMTC/97), Ottawa, Canada vol. 1, pp. 20-25, May 1997. |
Euroflex Intelligente Monstre (English excerpt only), 2006, 15 pages. |
Euroflex, Jan. 2006, Retrieved from the Internet: URL<http://www.euroflex.tv/novita_dett.php?id=15>, accessed Nov. 2011, 1 page. |
eVac Robotic Vacuum S1727 Instruction Manual, Sharper Image Corp, Copyright 2004, 16 pages. |
Everyday Robots, “Everyday Robots: Reviews, Discussion and News for Consumers,” Aug. 2004, Retrieved from the Internet: URL<www.everydayrobots.com/index.php?option=content&task=view&id=9> (Sep. 2012), 4 pages. |
Evolution Robotics, "NorthStar: Low-cost Indoor Localization—How it Works," Evolution Robotics, 2 pages, 2005. |
Facchinetti Claudio et al., “Self-Positioning Robot Navigation Using Ceiling Images Sequences,” ACCV '95, 5 pages, Dec. 1995. |
Facchinetti Claudio et al., “Using and Learning Vision-Based Self-Positioning for Autonomous Robot Navigation,” ICARCV '94, vol. 3, pp. 1694-1698, 1994. |
Facts on Trilobite, webpage, Retrieved from the Internet: URL<http://trilobite.electrolux.se/presskit_en/model11335.asp?print=yes&pressID=>. 2 pages, accessed Dec. 2003. |
Fairfield et al., “Mobile Robot Localization with Sparse Landmarks,” SPIE vol. 4573, pp. 148-155, 2002. |
Favre-Bulle, "Efficient tracking of 3D-Robot Position by Dynamic Triangulation," IEEE Instrumentation and Measurement Technology Conference IMTC 98, Session on Instrumentation and Measurement in Robotics, vol. 1, pp. 446-449, May 1998. |
Fayman, “Exploiting Process Integration and Composition in the context of Active Vision,” IEEE Transactions on Systems, Man, and Cybernetics—Part C: Application and reviews, vol. 29, No. 1, pp. 73-86, Feb. 1999. |
Floorbot GE Plastics—IMAGE, available at http://www.fuseid.com/, 1989-1990, Accessed Sep. 2012, 1 page. |
Floorbotics, VR8 Floor Cleaning Robot, Product Description for Manufacturing, URL: <http://www.consensus.sem.au/SoftwareAwards/CSAarchive/CSA2004/CSAart04/FloorBot/F>. Mar. 2004, 11 pages. |
Franz et al., "Biomimetic robot navigation," Robotics and Autonomous Systems, vol. 30, pp. 133-153, 2000. |
Friendly Robotics, “Friendly Robotics—Friendly Vac, Robotic Vacuum Cleaner,” Retrieved from the Internet: URL< www.friendlyrobotics.com/vac.htm > 5 pages, Apr. 2005. |
Friendly Robotics, Retrieved from the Internet: URL<http://www.robotsandrelax.com/PDFs/RV400Manual.pdf>. 18 pages, accessed Dec. 2011. |
Fuentes et al., “Mobile Robotics 1994,” University of Rochester. Computer Science Department, TR 588, 44 pages, Dec. 1994. |
Fukuda et al., "Navigation System based on Ceiling Landmark Recognition for Autonomous mobile robot," 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems '95, 'Human Robot Interaction and Cooperative Robots', Pittsburgh, PA, pp. 1466-1471, Aug. 1995. |
Gat, “Robust Low-Computation Sensor-driven Control for Task-Directed Navigation,” Proc of IEEE International Conference on Robotics and Automation , Sacramento, CA pp. 2484-2489, Apr. 1991. |
Gionis, “A hand-held optical surface scanner for environmental Modeling and Virtual Reality,” Virtual Reality World, 16 pages, 1996. |
Goncalves et al., “A Visual Front-End for Simultaneous Localization and Mapping”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 44-49, Apr. 2005. |
Gregg et al., “Autonomous Lawn Care Applications,” 2006 Florida Conference on Recent Advances in Robotics, Miami, Florida, May 25-26, 2006, Florida International University, 5 pages. |
Grumet, “Robots Clean House,” Popular Mechanics, Nov. 2003, 3 pages. |
Hamamatsu “Si PIN Diode S5980, S5981 S5870- Multi-element photodiodes for surface mounting,” Hamatsu Photonics, 2 pages, Apr. 2004. |
Hammacher Schlemmer, "Electrolux Trilobite Robotic Vacuum," Retrieved from the Internet: URL<www.hammacher.com/publish/71579.asp?promo=xsells>. 3 pages, Mar. 2005. |
Haralick et al., "Pose Estimation from Corresponding Point Data," IEEE Transactions on Systems, Man, and Cybernetics, 19(6):1426-1446, Nov. 1989. |
Hausler, “About the Scaling Behaviour of Optical Range Sensors,” Fringe '97, Proceedings of the 3rd International Workshop on Automatic Processing of Fringe Patterns, Bremen, Germany, pp. 147-155, Sep. 1997. |
Hitachi, http://www.hitachi.co.jp/New/cnews/hi_030529_hi_030529.pdf, 15 pages, May 29, 2003 (with English translation). |
Hitachi: News release: "The home cleaning robot of the autonomous movement type (experimental machine)," Retrieved from the Internet: URL<www.i4u.com/japanreleases/hitachirobot.htm>. 5 pages, Mar. 2005. |
Hoag et al., "Navigation and Guidance in interstellar space," ACTA Astronautica, vol. 2, pp. 513-533, Feb. 1975. |
Home Robot—UBOT; Microbotusa.com, retrieved from the WWW at www.microrobotusa.com, accessed Dec. 2, 2008, 2 pages. |
Huntsberger et al., “CAMPOUT: A Control Architecture for Tightly Coupled Coordination of Multirobot Systems for Planetary Surface Exploration,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, 33(5):550-559, Sep. 2003. |
Iirobotics.com, "Samsung Unveils Its Multifunction Robot Vacuum," Retrieved from the Internet: URL<www.iirobotics.com/webpages/hotstuff.php?ubre=111>. 3 pages, Mar. 2005. |
InMach “Intelligent Machines,” Retrieved from the Internet: URL<www.inmach.de/inside.html>. 1 page , Nov. 2008. |
Innovation First, “2004 EDU Robot Controller Reference Guide,” Retrieved from the Internet: URL<http://www.ifirobotics.com>. 13 pages, Mar. 2004. |
IT media, Retrieved from the Internet: URL<http://www.itmedia.co.jp/news/0111/16/robofesta—m.html>. Accessed Nov. 1, 2011, 8 pages (with English translation). |
It's eye, Retrieved from the Internet: URL<www.hitachi.co.jp/rd/pdf/topics/hitac2003_10.pdf>. 11 pages, 2003. |
Jarosiewicz et al., "Final Report—Lucid," University of Florida, Department of Electrical and Computer Engineering, EEL 5666—Intelligent Machine Design Laboratory, 50 pages, Aug. 1999. |
Jensfelt et al., "Active Global Localization for a mobile robot using multiple hypothesis tracking," IEEE Transactions on Robotics and Automation, 17(5): 748-760, Oct. 2001. |
Jeong et al., “An intelligent map-building system for indoor mobile robot using low cost photo sensors,” SPIE, vol. 6042, 6 pages, 2005. |
Kahney, "Robot Vacs are in the House," Retrieved from the Internet: URL<www.wired.com/news/technology/0,1282,59237,00.html>. 6 pages, Jun. 2003. |
Karcher “Karcher RoboCleaner RC 3000,” Retrieved from the Internet: URL<www.robocleaner.de/english/screen3.html>. 4 pages, Dec. 2003. |
Karcher RC 3000 Cleaning Robot user manual, Manufacturer: Alfred-Karcher GmbH & Co, Cleaning Systems, Alfred Karcher-Str 28-40, PO Box 160, D-71349 Winnenden, Germany, Dec. 2002, 8 pages. |
Karcher RC3000 RoboCleaner—IMAGE, accessed at <http://www.karcher.de/versions/int/assets/video/2_4_robo_en.swf>, Sep. 2009, 1 page. |
Karcher USA, RC3000 Robotic Cleaner, website: http://www.karcher-usa.com/showproducts.php?op=view_prod&param1=143&param2=&param3=, 3 pages, accessed Mar. 2005. |
Karcher, “Product Manual Download Karch”, available at www.karcher.com, 16 pages, 2004. |
Karlsson et al., "Core Technologies for Service Robotics," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 3, pp. 2979-2984, Sep. 2004. |
Karlsson et al., "The vSLAM Algorithm for Robust Localization and Mapping," Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, pp. 24-29, Apr. 2005. |
King and Weiman, "Helpmate™ Autonomous Mobile Robots Navigation Systems," SPIE vol. 1388 Mobile Robots, pp. 190-198, 1990. |
Kleinberg, The Localization Problem for Mobile Robots, Laboratory for Computer Science, Massachusetts Institute of Technology, 1994 IEEE, pp. 521-531, 1994. |
Knights, et al., “Localization and Identification of Visual Landmarks,” Journal of Computing Sciences in Colleges, 16(4):312-313, May 2001. |
Kolodko et al., “Experimental System for Real-Time Motion Estimation,” Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), pp. 981-986, 2003. |
Komoriya et al., "Planning of Landmark Measurement for the Navigation of a Mobile Robot," Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC, pp. 1476-1481, Jul. 1992. |
Koolvac Robotic Vacuum Cleaner Owner's Manual, Koolatron, 2004, 13 pages. |
Krotkov et al., "Digital Sextant," Downloaded from the internet at: http://www.cs.cmu.edu/~epk/, 1 page, 1995. |
Krupa et al., "Autonomous 3-D Positioning of Surgical Instruments in Robotized Laparoscopic Surgery Using Visual Servoing," IEEE Transactions on Robotics and Automation, 19(5):842-853, Oct. 2003. |
Kuhl et al., “Self Localization in Environments using Visual Angles,” VRCAI '04 Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, pp. 472-475, 2004. |
Kurs et al., "Wireless Power Transfer via Strongly Coupled Magnetic Resonances," downloaded from www.sciencemag.org, Aug. 2007, 5 pages. |
Kurth, “Range-Only Robot Localization and SLAM with Radio”, http://www.ri.cmu.edu/pub—files/pub4/kurth—derek—2004—1/kurth—derek—2004—1.pdf. 60 pages, May 2004, accessed Jul. 27, 2012. |
Kwon et al., “Table Recognition through Range-based Candidate Generation and Vision based Candidate Evaluation,” ICAR 2007, The 13th International Conference on Advanced Robotics Aug. 21-24, 2007, Jeju, Korea, pp. 918-923, 2007. |
Lambrinos et al., "A mobile robot employing insect strategies for navigation," Retrieved from the Internet: URL<http://www8.cs.umu.se/kurser/TDBD17/VT04/dl/Assignment%20Papers/lambrinos-RAS-2000.pdf>. 38 pages, Feb. 1999. |
Lang et al., “Visual Measurement of Orientation Using Ceiling Features”, 1994 IEEE, pp. 552-555, 1994. |
Lapin, “Adaptive position estimation for an automated guided vehicle,” SPIE, vol. 1831 Mobile Robots VII, pp. 82-94, 1992. |
LaValle et al., “Robot Motion Planning in a Changing, Partially Predictable Environment,” 1994 IEEE International Symposium on Intelligent Control, Columbus, OH, pp. 261-266, Aug. 1994. |
Lee et al., “Development of Indoor Navigation system for Humanoid Robot Using Multi-sensors Integration”, ION NTM, San Diego, CA pp. 798-805, Jan. 2007. |
Lee et al., “Localization of a Mobile Robot Using the Image of a Moving Object,” IEEE Transaction on Industrial Electronics, 50(3):612-619, Jun. 2003. |
Leonard et al., "Mobile Robot Localization by tracking Geometric Beacons," IEEE Transactions on Robotics and Automation, 7(3):376-382, Jun. 1991. |
Li et al. “Robust Statistical Methods for Securing Wireless Localization in Sensor Networks,” Information Processing in Sensor Networks, 2005, Fourth International Symposium on, pp. 91-98, Apr. 2005. |
Li et al., “Making a Local Map of Indoor Environments by Swiveling a Camera and a Sonar,” Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 954-959, 1999. |
Lin et al., "Mobile Robot Navigation Using Artificial Landmarks," Journal of Robotic Systems, 14(2): 93-106, 1997. |
Linde, Dissertation, "On Aspects of Indoor Localization," Available at: https://eldorado.tu-dortmund.de/handle/2003/22854, University of Dortmund, 138 pages, Aug. 2006. |
Lumelsky et al., “An Algorithm for Maze Searching with Azimuth Input”, 1994 IEEE International Conference on Robotics and Automation, San Diego, CA vol. 1, pp. 111-116, 1994. |
Luo et al., “Real-time Area-Covering Operations with Obstacle Avoidance for Cleaning Robots,” IEEE, pp. 2359-2364, 2002. |
Ma, Thesis—“Documentation on Northstar,” California Institute of Technology, 14 pages, May 2006. |
Madsen et al., "Optimal landmark selection for triangulation of robot position," Journal of Robotics and Autonomous Systems, vol. 13, pp. 277-292, 1998. |
Malik et al., “Virtual Prototyping for Conceptual Design of a Tracked Mobile Robot,” Electrical and Computer Engineering, Canadian Conference on, IEEE, PI. pp. 2349-2352, May 2006. |
Martishevcky, “The Accuracy of point light target coordinate determination by dissectoral tracking system”, SPIE vol. 2591, pp. 25-30, Oct. 23, 2005. |
Maschinenmarkt Würzburg 105, No. 27, pp. 3, 30, Jul. 5, 1999 (with English translation). |
Matsumura Camera Online Shop: Retrieved from the Internet: URL<http://www.rakuten.co.jp/matsucame/587179/711512/>. Accessed Nov. 2011, 15 pages (with English translation). |
Matsutek Enterprises Co. Ltd, “Automatic Rechargeable Vacuum Cleaner,” http://matsutek.manufacturer.globalsources.com/si/6008801427181/pdtl/Home-vacuum/10 . . . , Apr. 2007, 3 pages. |
McGillem et al., "Infra-red Location System for Navigation and Autonomous Vehicles," 1988 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1236-1238, Apr. 1988. |
McGillem et al., "A Beacon Navigation Method for Autonomous Vehicles," IEEE Transactions on Vehicular Technology, 38(3):132-139, Aug. 1989. |
McLurkin “Stupid Robot Tricks: A Behavior-based Distributed Algorithm Library for Programming Swarms of Robots,” Paper submitted for requirements of BSEE at MIT, May 2004, 127 pages. |
McLurkin, “The Ants: A community of Microrobots,” Paper submitted for requirements of BSEE at MIT, May 1995, 60 pages. |
Michelson, “Autonomous navigation,” McGraw-Hill—Access Science, Encyclopedia of Science and Technology Online, 2007, 4 pages. |
Miro et al., “Towards Vision Based Navigation in Large Indoor Environments,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 2096-2102, Oct. 2006. |
MobileMag, “Samsung Unveils High-tech Robot Vacuum Cleaner,” Retrieved from the Internet: URL<http://www.mobilemag.com/content/100/102/C2261/>. 4 pages, Mar. 2005. |
Monteiro et al., "Visual Servoing for Fast Mobile Robot: Adaptive Estimation of Kinematic Parameters," Proceedings of the IECON '93, International Conference on Industrial Electronics, Maui, HI, pp. 1588-1593, Nov. 1993. |
Moore et al., "A simple Map-based Localization strategy using range measurements," SPIE, vol. 5804, pp. 612-620, 2005. |
Morland,“Autonomous Lawnmower Control”, Downloaded from the internet at: http://cns.bu.edu/˜cjmorlan/robotics/lawnmower/report.pdf, 10 pages, Jul. 2002. |
Munich et al., “ERSP: A Software Platform and Architecture for the Service Robotics Industry,” Intelligent Robots and Systems, 2005. (IROS 2005), pp. 460-467, Aug. 2005. |
Munich et al., “SIFT-ing Through Features with ViPR”, IEEE Robotics & Automation Magazine, pp. 72-77, Sep. 2006. |
Nam et al., “Real-Time Dynamic Visual Tracking Using PSD Sensors and extended Trapezoidal Motion Planning”, Applied Intelligence 10, pp. 53-70, 1999. |
Nitu et al., "Optomechatronic System for Position Detection of a Mobile Mini-Robot," IEEE Transactions on Industrial Electronics, 52(4):969-973, Aug. 2005. |
On Robo, "Robot Reviews Samsung Robot Vacuum (VC-RP30W)," Retrieved from the Internet: URL<www.onrobo.com/reviews/AT_Home/vacuum_cleaners/on00vcrb30rosam/index.htm>. 2 pages, 2005. |
OnRobo “Samsung Unveils Its Multifunction Robot Vacuum,” Retrieved from the Internet: URL <www.onrobo.com/enews/0210/samsung—vacuum.shtml>. 3 pages, Mar. 2005. |
Pages et al., “A camera-projector system for robot positioning by visual serving,” Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW06), 8 pages, Jun. 2006. |
Pages et al., “Optimizing Plane-to-Plane Positioning Tasks by Image-Based Visual Servoing and Structured Light,” IEEE Transactions on Robotics, 22(5):1000-1010, Oct. 2006. |
Pages et al., “Robust decoupled visual servoing based on structured light,” 2005 IEEE/RSJ, Int. Conf. on Intelligent Robots and Systems, pp. 2676-2681, 2005. |
Park et al., “A Neural Network Based Real-Time Robot Tracking Controller Using Position Sensitive Detectors,” IEEE World Congress on Computational Intelligence., 1994 IEEE International Conference on Neutral Networks, Orlando, Florida pp. 2754-2758, Jun./Jul. 1994. |
Park et al., “Dynamic Visual Servo Control of Robot Manipulators using Neutral Networks,” The Korean Institute Telematics and Electronics, 29-B(10):771-779, Oct. 1992. |
Paromtchik “Toward Optical Guidance of Mobile Robots,” Proceedings of the Fourth World Multiconference on Systemics, Cybermetics and Informatics, Orlando, FL, USA, Jul. 23, 2000, vol. IX, pp. 44-49, available at http://emotion.inrialpes.fr/˜paromt/infos/papers/paromtchik:asama:sci:2000.ps.gz, accessed Jul. 3, 2012, 6 pages. |
Paromtchik et al., “Optical Guidance System for Multiple mobile Robots,” Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation, vol. 3, pp. 2935-2940, May 2001. |
Penna et al., "Models for Map Building and Navigation," IEEE Transactions on Systems, Man, and Cybernetics, 23(5):1276-1301, Sep./Oct. 1993. |
Pirjanian et al. “Representation and Execution of Plan Sequences for Multi-Agent Systems,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, pp. 2117-2123, Oct. 2001. |
Pirjanian et al., “A decision-theoretic approach to fuzzy behavior coordination”, 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999. CIRA '99., Monterey, CA, pp. 101-106, Nov. 1999. |
Pirjanian et al., “Distributed Control for a Modular, Reconfigurable Cliff Robot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 4083-4088, May 2002. |
Pirjanian et al., “Improving Task Reliability by Fusion of Redundant Homogeneous Modules Using Voting Schemes,” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 425-430, Apr. 1997. |
Pirjanian et al., “Multi-Robot Target Acquisition using Multiple Objective Behavior Coordination,” Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, pp. 2696-2702, Apr. 2000. |
Pirjanian, “Challenges for Standards for consumer Robotics,” IEEE Workshop on Advanced Robotics and its Social impacts, pp. 260-264, Jun. 2005. |
Pirjanian, “Reliable Reaction,” Proceedings of the 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 158-165, 1996. |
Prassler et al., “A Short History of Cleaning Robots,” Autonomous Robots 9, 211-226, 2000, 16 pages. |
Put Your Roomba . . . On Automatic, webpages: http://www.acomputeredge.com/roomba, 5 pages, accessed Apr. 2005. |
Remazeilles et al., “Image based robot navigation in 3D environments,” Proc. of SPIE, vol. 6052, pp. 1-14, Dec. 2005. |
Rives et al., “Visual servoing based on ellipse features,” SPIE, vol. 2056 Intelligent Robots and Computer Vision pp. 356-367, 1993. |
Roboking—not just a vacuum cleaner, a robot!, Jan. 21, 2004, infocom.uz/2004/01/21/robokingne-prosto-pyilesos-a-robot/, accessed Oct. 10, 2011, 5 pages. |
RoboMaid Sweeps Your Floors So You Won't Have to, the Official Site, website: Retrieved from the Internet: URL<http://therobomaid.com/>. 2 pages, accessed Mar. 2005. |
Robot Buying Guide, "LG announces the first robotic vacuum cleaner for Korea," Retrieved from the Internet: URL<http://robotbg.com/news/2003/04/22/lg_announces_the_first_robotic_vacu>. 1 page, Apr. 2003. |
Robotics World, “A Clean Sweep,” 5 pages, Jan. 2001. |
Ronnback, “On Methods for Assistive Mobile Robots,” Retrieved from the Internet: URL<http://www.openthesis.org/documents/methods-assistive-mobile-robots-595019.html>. 218 pages, Jan. 2006. |
Roth-Tabak et al., “Environment Model for mobile Robots Indoor Navigation,” SPIE, vol. 1388 Mobile Robots, pp. 453-463, 1990. |
Sahin et al., “Development of a Visual Object Localization Module for Mobile Robots,” 1999 Third European Workshop on Advanced Mobile Robots, (Eurobot '99), pp. 65-72, 1999. |
Salomon et al., “Low-Cost Optical Indoor Localization system for Mobile Objects without Image Processing,” IEEE Conference on Emerging Technologies and Factory Automation, 2006. (ETFA '06), pp. 629-632, Sep. 2006. |
Sato, “Range Imaging Based on Moving Pattern Light and Spatio-Temporal Matched Filter,” Proceedings International Conference on Image Processing, vol. 1., Lausanne, Switzerland, pp. 33-36, Sep. 1996. |
Schenker et al., “Lightweight rovers for Mars science exploration and sample return,” Intelligent Robots and Computer Vision XVI, SPIE Proc. 3208, pp. 24-36, 1997. |
Schofield, "Neither Master nor Slave—A Practical Study in the Development and Employment of Cleaning Robots," Emerging Technologies and Factory Automation, Proceedings ETFA '99, 7th IEEE International Conference, Barcelona, Spain, pp. 1427-1434, Oct. 1999. |
Shimoga et al., “Touch and Force Reflection for Telepresence Surgery,” Engineering in Medicine and Biology Society, 1994. Engineering Advances: New Opportunities for Biomedical Engineers. Proceedings of the 16th Annual International Conference of the IEEE, Baltimore, MD, pp. 1049-1050, 1994. |
Sim et al., "Learning Visual Landmarks for Pose Estimation," IEEE International Conference on Robotics and Automation, vol. 3, Detroit, MI, pp. 1972-1978, May 1999. |
Sobh et al., “Case Studies in Web-Controlled Devices and Remote Manipulation,” Automation Congress, 2002 Proceedings of the 5th Biannual World, pp. 435-440, Dec. 2002. |
Special Reports, "Vacuum Cleaner Robot Operated in Conjunction with 3G Cellular Phone," 59(9): 3 pages, Retrieved from the Internet: URL<http://www.toshiba.co.jp/tech/review/2004/09/59_0>. 2004. |
Stella et al., “Self-Location for Indoor Navigation of Autonomous Vehicles,” Part of the SPIE conference on Enhanced and Synthetic Vision SPIE vol. 3364, pp. 298-302, 1998. |
Summet, “Tracking Locations of Moving Hand-held Displays Using Projected Light,” Pervasive 2005, LNCS 3468, pp. 37-46, 2005. |
Svedman et al., “Structure from Stereo Vision using Unsynchronized Cameras for Simultaneous Localization and Mapping,” 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2993-2998, 2005. |
SVET Computers—New Technologies—Robot Vacuum Cleaner, Oct. 1999, available at http://www.sk.rs/1999/10/sknt01.html, 1 page, accessed Nov. 1, 2011. |
Taipei Times, "Robotic vacuum by Matsushita about to undergo testing," Retrieved from the Internet: URL<http://www.taipeitimes.com/News/worldbiz/archives/2002/03/26/0000129338>. accessed Mar. 2002, 2 pages. |
Takio et al., “Real-Time Position and Pose Tracking Method of Moving Object Using Visual Servo System,” 47th IEEE International Symposium on Circuits and Systems, pp. 167-170, 2004. |
Tech-on!, Retrieved from the Internet: URL<http://techon.nikkeibp.co.jp/members/01db/200203/1006501/>. 7 pages, accessed Nov. 2011 (with English translation). |
Teller, “Pervasive pose awareness for people, Objects and Robots,” http://www.ai.mit.edu/lab/dangerous-ideas/Spring2003/teller-pose.pdf, 6 pages, Apr. 2003. |
Terada et al., “An Acquisition of the Relation between Vision and Action using Self-Organizing Map and Reinforcement Learning,” 1998 Second International Conference on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia, pp. 429-434, Apr. 1998. |
The Sharper Image, eVac Robotic Vacuum—Product Details, www.sharperimage.com/us/en/templates/products/pipmoreworklprintable.jhtml, 1 page, accessed Mar. 2005. |
TheRobotStore.com, “Friendly Robotics Robotic Vacuum RV400—The Robot Store,” www.therobotstore.com/s.nl/sc.9/category.-109/it.A/id.43/.f, 1 page, Apr. 2005. |
Thrun, Sebastian, “Learning Occupancy Grid Maps With Forward Sensor Models,” Autonomous Robots 15, 28 pages, Sep. 1, 2003. |
TotalVac.com, RC3000 RoboCleaner website, 2004, Accessed at http://www.totalvac.com/robot_vacuum.htm (Mar. 2005), 3 pages. |
Trebi-Ollennu et al., “Mars Rover Pair Cooperatively Transporting a Long Payload,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C. pp. 3136-3141, May 2002. |
Tribelhorn et al., “Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education,” IEEE, pp. 1393-1399, 2007. |
Tse et al., “Design of a Navigation System for a Household Mobile Robot Using Neural Networks,” Department of Manufacturing Engg. & Engg. Management, City University of Hong Kong, pp. 2151-2156, 1998. |
UAMA (Asia) Industrial Co., Ltd., “RobotFamily,” 2005, 1 page. |
UBOT, cleaning robot capable of wiping with a wet duster, Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=23031>. 4 pages, accessed Nov. 2011. |
Watanabe et al., “Position Estimation of Mobile Robots With Internal and External Sensors Using Uncertainty Evolution Technique,” 1990 IEEE International Conference on Robotics and Automation, Cincinnati, OH, pp. 2011-2016, May 1990. |
Watts, "Robot, boldly goes where no man can," The Times, p. 20, Jan. 1985. |
Wijk et al., “Triangulation-Based Fusion of Sonar Data with Application in Robot Pose Tracking,” IEEE Transactions on Robotics and Automation, 16(6):740-752, Dec. 2000. |
Wolf et al., “Robust Vision-Based Localization by Combining an Image-Retrieval System with Monte Carol Localization,”, IEEE Transactions on Robotics, 21(2):208-216, Apr. 2005. |
Wolf et al., “Robust Vision-based Localization for Mobile Robots Using an Image Retrieval System Based on Invariant Features,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, Washington, D.C., pp. 359-365, May 2002. |
Wong, “EIED Online>> Robot Business”, ED Online ID# 13114, 17 pages, Jul. 2006. |
Yamamoto et al., “Optical Sensing for Robot Perception and Localization,” 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 14-17, 2005. |
Yata et al., “Wall Following Using Angle Information Measured by a Single Ultrasonic Transducer,” Proceedings of the 1998 IEEE, International Conference on Robotics & Automation, Leuven, Belgium, pp. 1590-1596, May 1998. |
Yujin Robotics, "An intelligent cleaning robot," Retrieved from the Internet: URL<http://us.aving.net/news/view.php?articleId=7257>. 8 pages, accessed Nov. 2011. |
Yun et al., “Image-Based Absolute Positioning System for Mobile Robot Navigation,” IAPR International Workshops SSPR, Hong Kong, pp. 261-269, Aug. 2006. |
Yun et al., “Robust Positioning a Mobile Robot with Active Beacon Sensors,” Lecture Notes in Computer Science, 2006, vol. 4251, pp. 890-897, 2006. |
Yuta et al., "Implementation of an Active Optical Range sensor Using Laser Slit for In-Door Intelligent Mobile Robot," IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS 91), vol. 1, Osaka, Japan, pp. 415-420, Nov. 3-5, 1991. |
Zha et al., "Mobile Robot Localization Using Incomplete Maps for Change Detection in a Dynamic Environment," Advanced Intelligent Mechatronics '97, Final Program and Abstracts, IEEE/ASME International Conference, p. 110, Jun. 1997. |
Zhang et al., “A Novel Mobile Robot Localization Based on Vision,” SPIE vol. 6279, 6 pages, Jan. 2007. |
Zoombot Remote Controlled Vacuum-RV-500 New Roomba 2, website: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=43526&item=4373497618&rd=1, accessed Apr. 20, 2005, 7 pages. |
Written Opinion of the Searching Authority, PCT/US2004/001504, Aug. 20, 2004, 9 pages. |
United States Office Action issued in U.S. Appl. No. 11/633,869, mailed Sep. 16, 2010, 43 pages. |
Number | Date | Country | |
---|---|---|---|
20110125323 A1 | May 2011 | US |
Number | Date | Country | |
---|---|---|---|
61280677 | Nov 2009 | US |