1. Field of the Invention
The present invention relates generally to a vehicle control system and method using optical tracking, and in particular to such a system and method for an agricultural vehicle including a towed component comprising an implement.
2. Description of the Related Art
Automatic control of steering (“autosteering”) of vehicles is becoming more widespread, especially in agricultural and mining applications. Most commercially available automatic steering systems include a controller that has means for determining, among other things, the position and heading of a vehicle, a computer-based system for comparing the position and heading of the vehicle with a desired position and heading, and a steering control responsive to a control signal issued by the controller when the position and/or heading of the vehicle deviates from the desired position and/or heading.
As used herein, “attitude” generally refers to the heading or orientation (pitch with respect to the Y axis, roll with respect to the X axis and yaw with respect to the Z axis) of the vehicle, or of an implement associated with the vehicle. Other vehicle/implement-related parameters of interest include groundspeed or velocity and position. Position can be defined absolutely in relation to a geo-reference system, or relatively in relation to a fixed position at a known location, such as a base station. A change in one or both of the position and orientation of the vehicle (which can include a towed component, such as an implement or a trailer) can be considered a change in the vehicle's “pose.” This includes changes (e.g. different order time derivatives) in attitude and/or position. Attitude and position are generally measured relatively with respect to a particular reference frame that is fixed relative to the area that the vehicle is operating in, or globally with respect to a geo-reference system.
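By way of illustration only, the pose terminology above can be captured in a small data structure. The following is a minimal Python sketch; the type and field names are illustrative assumptions and do not come from the specification.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Vehicle (or implement) pose in the sense used above: position plus attitude.

    Position may be relative to a local reference frame (e.g. a base station)
    or global with respect to a geo-reference system; angles are in radians."""
    x: float      # position along the X (longitudinal) axis
    y: float      # position along the Y (transverse) axis
    z: float      # position along the Z (vertical) axis
    roll: float   # rotation about the X axis
    pitch: float  # rotation about the Y axis
    yaw: float    # rotation about the Z axis (heading)
```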
Automatic control systems for controlling steering of a vehicle may include a global navigation satellite system (GNSS, including the global positioning system (GPS)) based system. GNSS-based systems typically include a GNSS receiver mounted on the vehicle that receives signals from constellations of GNSS satellites that orbit the earth. The GNSS receiver can then determine or estimate a location of the vehicle. A number of early automatic steering control systems were GNSS-only systems. These systems suffered from limitations in that signals from the constellation of GNSS satellites are received at a relatively low rate, meaning that the location of the vehicle was also determined at a relatively low rate. As the vehicle is continually moving, there were significant periods during which the location of the vehicle was not being determined. Accordingly, the vehicles would often deviate from the desired path of travel.
Significant work has also been conducted in respect of using inertial sensors to attempt to control the steering of the vehicle. Inertial sensors include accelerometers and/or gyroscopes that can be used to provide indications as to the attitude and speed (or changes thereto) of the vehicle. Unfortunately, inertial sensors such as accelerometers and gyroscopes suffer from time-varying errors. This is particularly marked in the less expensive inertial sensors used in commercially available vehicle steering control systems, which are used principally to keep those systems affordable.
U.S. Pat. No. 6,876,920, which is assigned to a common assignee herewith and incorporated herein by reference, describes a vehicle guidance apparatus for guiding a vehicle over a paddock or field along a number of paths, the paths being offset from each other by a predetermined distance. The vehicle guidance apparatus includes a GNSS receiver for periodically receiving data regarding the vehicle's location, and an inertial relative location determining means for generating relative location data along a current path during time periods between receipt of vehicle position data from the GNSS receiver. The apparatus also includes data entry means to enable the entry by an operator of an initial path and a desired offset distance between the paths. Processing means are arranged to generate a continuous guidance signal indicative of errors in the attitude and position of the vehicle relative to one of the paths, the attitude and position being determined by combining corrected GNSS vehicle location data with the relative location data from the inertial relative location determining means.
In the system described in U.S. Pat. No. 6,876,920, the inertial sensor is used to provide a higher data rate than that obtainable from GNSS alone. Although the inertial navigation system (INS) part of the steering control system suffers from errors, in particular a yaw bias, the signals received from the GNSS system are used to correct these errors. Thus, the combination of a GNSS based system and a relatively inexpensive INS allows for quite accurate control of the position of the vehicle. Although this system allows for accurate vehicle positioning and sound control of the vehicle's steering, difficulties may be experienced if there are prolonged periods of GNSS outage. GNSS outages may occur due to unsuitable weather conditions, the vehicle operating in an area where GNSS signals cannot be accessed, or problems with the GNSS receiver. If a period of prolonged GNSS outage occurs, the steering system relies solely upon the INS. Unfortunately, a yaw bias in a relatively inexpensive inertial sensor used in the commercial embodiment of that steering control system can result in errors being introduced into the steering of the vehicle.
Optical computer mice are widely used to control the position of a cursor on a computer screen. Optical computer mice incorporate an optoelectronic sensor that takes successive pictures of the surface on which the mouse operates. Most optical computer mice use a light source to illuminate the surface that is being tracked (i.e. the surface over which the mouse is moving). Changes between one frame and the next are processed using the image processing ability of the chip that is embedded in the mouse. A digital correlation algorithm is used so that the movement of the mouse is translated into corresponding movement of the mouse cursor on the computer screen.
The optical movement sensors used in optical computer mice have high processing capabilities. A number of commercially available optical computer mice include optical mouse sensors that can process successive images of the surface over which the mouse is moving at speeds in excess of 1500 frames per second. The mouse has a small light emitting source that bounces light off the surface and onto a complementary metal oxide semiconductor (CMOS) sensor. The CMOS sensor sends each image to a digital signal processor (DSP) for analysis. The DSP is able to detect patterns in images and see how those patterns have moved since the previous image. Based on the change in patterns over a sequence of images, the digital signal processor determines how far the mouse has moved in X and Y directions, and sends these corresponding distances to the computer. The computer moves the cursor on the screen based upon the coordinates received from the mouse. This happens hundreds to thousands of times each second, making the cursor appear to move very smoothly.
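By way of illustration only, the frame-to-frame correlation step can be sketched as follows. This is a simplified brute-force Python rendering of the general technique, not the algorithm of any particular mouse sensor; the function name and parameters are assumptions made for the example.

```python
import numpy as np

def estimate_shift(prev_frame: np.ndarray, curr_frame: np.ndarray, max_shift: int = 4):
    """Estimate the (dx, dy) displacement of the image content from
    prev_frame to curr_frame by exhaustive correlation over small shifts.

    Both frames are equal-sized 2-D grayscale arrays; max_shift bounds the
    search window, mirroring the small inter-frame motion that a high frame
    rate guarantees."""
    h, w = prev_frame.shape
    best_score, best = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Overlapping regions of the two frames under this candidate shift
            a = curr_frame[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
            b = prev_frame[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
            a0, b0 = a - a.mean(), b - b.mean()
            score = (a0 * b0).sum() / a.size   # zero-mean correlation score
            if score > best_score:
                best_score, best = score, (dx, dy)
    return best   # pixel displacement of the surface image between frames
```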
The chips incorporated into optical computer mice often include photodetectors and an embedded integrated circuit that is used to analyse the digital signals received from the photodetectors. The photodetectors may include an array of photosensors, such as an array of charge coupled devices (CCDs).
U.S. Pat. No. 5,786,804 (incorporated herein by reference), which is assigned to Hewlett-Packard Company, describes a method and system for tracking attitude of a device. The system includes fixing a two-dimensional (2D) array of photosensors to the device and using the array to form a reference frame and a sample frame of images. The fields of view of the sample and reference frames largely overlap, so that there are common image features from frame to frame. Several frames are correlated with the reference frame to detect differences in location of the common features. Based upon detection of correlations of features, an attitudinal signal indicative of pitch, yaw and/or roll is generated. The attitudinal signal is used to manipulate a screen cursor of a display system, such as a remote interactive video system.
In a first aspect, the present invention provides a control system for controlling movement of a vehicle characterized in that the control system includes an optical movement sensor which scans a surface over which the vehicle is moving and generates a signal indicative of relative movement along an axis of the vehicle and relative movement across an axis of the vehicle, said signal being provided to a controller.
In a second aspect, the present invention provides a control system for controlling movement of a vehicle comprising a controller having a computer memory for storing or generating a desired path of travel, the controller being adapted to receive position and/or heading signals from one or more sensors, the position and/or heading signals enabling the controller to determine a position and/or heading of the vehicle relative to a desired path of travel, the controller sending control signals to a steering control mechanism in response to the determined position and/or heading of the vehicle, wherein the position and/or heading signals from the one or more sensors include a signal generated by an optical movement sensor configured to scan a surface during travel of the vehicle, the optical movement sensor generating a signal indicative of relative movement along an axis of the vehicle and relative movement across an axis of the vehicle. The surface that is scanned by the optical movement sensor is suitably a surface over which the vehicle is travelling. Suitably, the optical movement sensor scans a surface that is close to or under the vehicle, during travel of the vehicle over the surface.
The optical movement sensor may comprise the operative part of an optical computer mouse. Therefore, where the optical movement sensor comprises the operative part of an optical computer mouse, in saying that the optical movement sensor “scans” the surface over which the vehicle moves it will be understood that the optical movement sensor receives successive images of the surface over which the vehicle is moving. One part or other of the control system will then detect patterns in the images, and use the changes in the patterns between successive images to obtain information regarding the movement of the vehicle.
The optical movement sensor may comprise an illumination source and an illumination detector. The optical movement sensor may comprise an optical movement sensor integrated circuit.
As noted above, the optical movement sensor may comprise the operative part from an optical computer mouse. Alternatively, the optical movement sensor may be adapted from or derived from the operative part of an optical computer mouse. The optical movement sensor may use a light source to illuminate the surface that is being tracked (i.e. the surface over which the vehicle is moving).
Changes between one frame and the next may be processed by an image processing part of a chip embedded in the optical movement sensor, and this may translate the movement of the optical movement sensor (which will generally be mounted to the vehicle) across the surface into movement along two axes. Alternatively, the image processing may be performed by processing means separate from the optical movement sensor. For example, the signals received by the optical movement sensor may be conveyed to a separate microprocessor with graphics processing capabilities for processing.
The optical movement sensor may include an optical movement sensing circuit that tracks movement in a fashion similar to the optical movement sensing circuits used to track movement in computer mice. The person skilled in the art will readily appreciate how such optical movement sensing circuits analyse data and provide signals indicative of movement of the sensor across the surface. For this reason, further discussion as to the actual algorithms used in the optical movement sensing circuits need not be provided. Suitably, the optical movement sensing circuit may comprise an optical movement sensing integrated circuit. Such optical movement sensing integrated circuits are readily available from a number of suppliers.
In some embodiments, the control system of the present invention may further comprise one or more inertial sensors for providing further signals regarding the vehicle's attitude and position (or changes thereto) to the controller. Accelerometers and rate gyroscopes are examples of inertial sensors that may be used. The inertial sensors may form part of or comprise an inertial navigation system (INS), a dynamic measurement unit (DMU), an inertial sensor assembly (ISA), or an attitude heading reference system (AHRS). These are well known to persons skilled in the art and need not be described further. The inertial sensors may be used in conjunction with other navigation sensors, such as magnetometers, or vehicle-based sensors such as steering angle sensors or wheel speed encoders.
Inertial sensors, such as rate gyroscopes and accelerometers, can suffer from time-varying errors that can propagate through to create errors in the vehicle's calculated attitude and/or position. These errors can be sufficiently acute that, to avoid providing the controller with significantly inaccurate measures of the vehicle's attitude and/or position, it is preferable (and often necessary) for the control system to also receive signals regarding the vehicle's attitude and/or position (or changes thereto) from a source that is independent of the inertial sensors. These separate signals can be used to compensate for the errors in the inertial sensor signals using known signal processing techniques.
It is common to use GNSS signals (which provide information regarding the vehicle's location) to compensate for the errors in the inertial sensor signals. However, the present invention opens up the possibility of providing a control system that includes the optical movement sensor and one or an assembly of inertial sensors (and possibly including one or more other vehicle sensors as well). In other words, in some embodiments of the present invention, the signals provided by the optical movement sensor may be used to compensate for the errors in the inertial sensor signals instead of or in addition to the GNSS signals.
In embodiments such as those described in the previous paragraph, a single optical movement sensor may generally be sufficient to compensate for the errors in inertial sensors such as accelerometers which measure rates of change in linear displacement. However, a single optical movement sensor may not be sufficient to compensate for errors in inertial sensors such as gyroscopes which measure rates of change in angular displacement because the optical movement sensor will often be fixedly mounted to the vehicle such that the orientation of the optical movement sensor is fixed to, and changes with, the orientation of the vehicle.
A single optical movement sensor of the kind used in optical computer mice is able to detect and measure movement of the optical movement sensor along the X (roll) and Y (pitch) axes (in the present context this means the X (roll) and Y (pitch) axes of the vehicle, because the optical movement sensor is fixed to the vehicle). However, this kind of optical movement sensor is not generally able to detect and measure rotation about the Z (yaw) axis. Consequently, if it is desired to compensate for errors in inertial sensors such as gyroscopes using optical movement sensors that are fixedly mounted to the vehicle, two or more optical movement sensors will generally need to be provided and mounted at different locations on the vehicle.
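By way of illustration only, the following Python sketch shows how two fixed optical movement sensors spaced along the vehicle allow a yaw increment to be recovered; the sensor placement and names are assumptions made for the example.

```python
def yaw_increment(d_front, d_rear, baseline_m):
    """Small-angle estimate of the change in yaw (radians) between frames.

    d_front, d_rear: (dx, dy) ground displacements reported by optical
    movement sensors mounted at the front and rear of the vehicle;
    baseline_m: distance between the two mounting points in metres.

    A pure translation moves both sensors identically; a rotation about the
    Z (yaw) axis moves them by different amounts across the vehicle axis,
    and that difference divided by the baseline is the yaw increment."""
    return (d_front[1] - d_rear[1]) / baseline_m
```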
Alternatively, a single optical movement sensor can be used to compensate for the errors in gyroscopes and the like, which measure rates of change in rotational displacement, if the optical movement sensor is not fixed with respect to the vehicle. Rather, the optical movement sensor could be mounted so that even when the vehicle turns (i.e. rotates about its Z (yaw) axis), the orientation of the optical movement sensor remains unchanged, meaning that the optical movement sensor would effectively translate but not rotate with respect to the surface over which the vehicle is moving. A single optical movement sensor might thus be used to compensate for the errors in both accelerometers and gyroscopes, but some system or mechanism (e.g., gimbal-mounting) would need to be provided to maintain the constant orientation of the optical movement sensor.
The embodiments of the invention described above where the control system incorporates one or more inertial sensors, one or more optical movement sensors, and where the optical movement sensor(s) are used (instead of GNSS signals) to compensate for the errors in the inertial sensor(s) can generally be described as relative measurement control systems. This is because the optical movement sensor(s) and the inertial sensor(s) can only measure changes in vehicle attitude and/or position. They are unable to fix the geographic position and attitude of the vehicle in absolute “global” coordinates. References in this document to relative movement of the vehicle, or of an implement associated with the vehicle, or relative attitude/position/heading/pose information should be understood in this context.
However, the relative coordinate system established by relative measurement control systems such as those described above can relate to absolute geographic space if the vehicle can be moved sequentially to at least two, and preferably three or more, locations whose absolute geographic locations are known. This leads to the possibility of calibrating a control system having only optical, inertial, and possibly other vehicle sensors, in the total absence of GNSS. For example, during power up (initialization), the inertial navigation system positions of the vehicle could be arbitrarily set on a map whose origin and orientation is known. To relate this map to absolute geographic space, the vehicle could be located at the first known location, the internal coordinates noted, then moved to a second location and the new internal coordinates likewise noted. The line between the two points could be fitted from the internal map onto the real world map to arrive at the XY offset between the two map origins, the orientation difference between the two map origins, and the linear scaling difference between the two maps.
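By way of illustration only, fitting the internal map onto the real-world map from two known locations can be sketched as follows; this assumes planar coordinates and is an example, not a prescribed calibration routine.

```python
import math

def fit_two_point_transform(internal, geographic):
    """Recover the XY offset, orientation difference and linear scale that
    map two points in the controller's internal (relative) map onto their
    known geographic coordinates.

    internal, geographic: each a pair of (x, y) tuples for the same two
    physical locations. Returns (ox, oy, rotation, scale) such that
    g = scale * R(rotation) * p + (ox, oy)."""
    (ax, ay), (bx, by) = internal
    (gx1, gy1), (gx2, gy2) = geographic
    # Vector between the two locations, in each map
    vix, viy = bx - ax, by - ay
    vgx, vgy = gx2 - gx1, gy2 - gy1
    scale = math.hypot(vgx, vgy) / math.hypot(vix, viy)
    rotation = math.atan2(vgy, vgx) - math.atan2(viy, vix)
    c, s = math.cos(rotation), math.sin(rotation)
    # Offset chosen so the first internal point lands on the first geographic point
    ox = gx1 - scale * (c * ax - s * ay)
    oy = gy1 - scale * (s * ax + c * ay)
    return ox, oy, rotation, scale
```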
Thus, in one embodiment, the present invention may comprise a control system including one or more optical movement sensors and one or more inertial sensors. Suitably, the control system may include one or more optical movement sensors and an assembly of inertial sensors. In one embodiment, the control system of the present invention may further comprise an assembly of sensors including accelerometers and rate gyroscopes for providing further position and/or attitude signals to the controller. The assembly may comprise between one and three orthogonally mounted sensor sets, with each sensor set comprising no more than one of each of the above-mentioned sensor types (though not necessarily one of each). Such inertial sensors are well known to persons skilled in the art and need not be described further.
In another embodiment, the present invention may comprise a control system including one or more optical movement sensors and one or more other sensors. The other sensors may comprise navigation sensors such as magnetometers, or vehicle sensors such as wheel speed encoders, and steering angle encoders. Control systems in accordance with this embodiment of the invention would also be described as relative measurement control systems, and the relative coordinate system established by such a system can relate to absolute geographic space in generally the same way as described above.
In yet another embodiment, the control system of the present invention, which incorporates one or more optical movement sensors, may be integrated with a GNSS system. In this system, the GNSS system provides absolute measurement in geographic space and the optical movement sensor provides relative movement data that can be used to control the vehicle during periods of GNSS signal outage, or during normal operation in the intervals between receipt of GNSS position updates. Thus, in a further embodiment, the present invention provides a control system including one or more optical movement sensors and a GNSS system.
In a further still embodiment, the control system of the present invention may incorporate one or more optical movement sensors, a GNSS system and one or more inertial sensors, suitably an assembly of inertial sensors. In this embodiment, the optical movement sensor is configured to look at the ground near or under the vehicle. The output signal generated by the optical movement sensor comprises the relative movement along the axis of the vehicle and the relative movement across the axis of the vehicle. This information can be used as an additional source for compensating for the errors in the inertial sensors, giving a combined GNSS/INS/optical movement sensor system with the capability of operating over sustained periods of GNSS outage. Thus, in another embodiment, the present invention may provide a control system including one or more optical movement sensors, a GNSS system and one or more inertial sensors, such as an assembly of inertial sensors.
GPS (global positioning system) is the name of the satellite-based navigation system originally developed by the United States Department of Defense. GNSS (including GPS and other satellite-based navigation systems) is now used in a wide range of applications. A number of systems also exist for increasing the accuracy of the location readings obtained using GNSS receivers. Some of these systems operate by taking supplementary readings from additional satellites and using these supplementary readings to “correct” the original GNSS location readings. These systems are commonly referred to as “Satellite Based Augmentation Systems” (SBAS); examples include the Wide Area Augmentation System (WAAS) in the United States, the European Geostationary Navigation Overlay Service (EGNOS) and the Japanese MTSAT Satellite Augmentation System (MSAS).
A number of “Ground Based Augmentation Systems” (GBASs) also exist which help to increase the accuracy of GNSS location readings by taking additional readings from beacons located at known locations on the ground. It will be understood that throughout this specification, all references to GNSS include GNSS when augmented by supplementary systems such as SBASs, GBASs and the like.
In embodiments of the present invention where the optical movement sensor is used in combination with one or more other sensors, the datastream from the optical movement sensor may be combined with a datastream from another sensor. This may be done using known signal processing techniques to obtain a stream of statistically optimal estimates of the vehicle's current position and/or attitude. Suitably, the signal processing techniques may utilize a statistically optimised filter or estimator. The optimal filter or estimator could usefully, but not necessarily, comprise a Kalman filter.
The optical sensor used in the control system in accordance with the present invention may comprise an optical movement sensing integrated circuit that receives raw data from a lens assembly mounted on a vehicle or on an implement towed by a vehicle. The lens assembly may be configured such that an image of the ground immediately below the lens assembly is formed on a photosensor plane of the optical movement sensing integrated chip by the lens assembly. Usefully, the lens may be a telecentric lens. Furthermore, the lens may be an object space telecentric lens. An object space telecentric lens is one that achieves dimensional and geometric invariance of images within a range of different distances from the lens and across the whole field of view. Telecentric lenses will be known to those skilled in the art and therefore need not be described any further.
The lens assembly may be chosen so that the extent of the image on the optical movement sensing integrated chip represents a physical extent in the object plane which is commensurate with both the anticipated maximum speed of the vehicle and the processing rate of the optical movement sensing integrated circuit. For example, if the maximum speed of the vehicle is 5 m per second and the desired overlap of successive images is 99%, an image representing 0.5 m in extent will require a processing speed of 1000 frames per second.
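The arithmetic in the example above can be checked with a one-line calculation; this sketch simply restates the relationship between speed, image extent and overlap.

```python
def required_frame_rate(max_speed_mps, image_extent_m, overlap):
    """Frames per second needed so that successive ground images overlap by
    the given fraction at the vehicle's maximum speed."""
    travel_per_frame_m = image_extent_m * (1.0 - overlap)  # ground motion allowed between frames
    return max_speed_mps / travel_per_frame_m

# The example from the text: 5 m/s, 0.5 m image extent, 99% overlap
print(required_frame_rate(5.0, 0.5, 0.99))  # -> 1000.0 frames per second
```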
The optical movement sensor may include an illumination source of sufficient power such that the image of the ground beneath the vehicle is rendered with optimum contrast. This can usefully, but not necessarily, be implemented as an array of high-intensity light emitting diodes chosen to emit light at the wavelength of peak sensitivity of the optical movement sensor.
Desirably, the optical movement sensor may be provided with a mechanism to keep the entrance pupil of the optical assembly free of dust. This could usefully be implemented by means of a high velocity air curtain passing over the entrance pupil. Other mechanisms may be used, such as those that spray a cleaning fluid over the pupil. The cleaning fluid in those embodiments may comprise a cleaning liquid, such as water. Other means or mechanisms suitable for keeping the lens, or at least the entrance pupil of the optical assembly, free of dust will be known to those skilled in the art and may also be used with the present invention.
In another embodiment, the present invention provides a control system for controlling a position of an implement associated with a vehicle, characterised in that the control system includes an optical movement sensor which scans a surface over which the implement is moving and generates a signal indicative of relative movement along an axis of the implement and relative movement across an axis of the implement, said signal being provided to a controller.
In another aspect, the present invention provides a control system for maintaining a position and/or heading (attitude) of an implement close to a desired path of travel, the control system comprising a controller having a computer memory for storing or generating the desired path of travel, the controller being adapted to receive position and/or heading signals relating to a position and/or heading of the implement from one or more sensors, the position and/or heading signals enabling the controller to determine the position and/or heading of the implement relative to the desired path of travel, the controller sending control signals to a position and/or heading control mechanism in response to the determined position and/or heading, wherein the position and/or heading signals from the one or more sensors include a signal generated by an optical movement sensor configured to scan a surface over which the implement is travelling, the optical movement sensor generating a signal indicative of relative movement along an axis of the implement and relative movement across an axis of the implement. Suitably, in this aspect, the optical movement sensor is mounted to the implement. The optical movement sensor may scan the surface close to the implement or underneath the implement as the implement traverses the surface.
In this aspect, the control algorithms and the position control mechanisms may be as described in U.S. Pat. No. 7,460,942, which is assigned to a common assignee herewith and incorporated herein by reference. In embodiments of this aspect of the invention, the position of the implement may be controlled by controlling the steering of the vehicle associated with the implement (this is especially useful if the implement is rigidly and fixedly connected to the vehicle), or by moving the position of the implement (or at least a working part of the implement) relative to the vehicle, which may be achieved by adjusting the lateral offset between the working part of the implement and the vehicle, or by using the working part of the implement to “steer” the implement.
In this aspect, the control system may further include one or more of a GNSS system, inertial sensors, navigation sensors and vehicle-based sensors. These various systems and sensors are described above with reference to other aspects of the invention.
Certain embodiments, aspects and features of the invention will now be described and explained by way of example and with reference to the drawings. However, it will be clearly appreciated that these descriptions and examples are provided to assist in understanding the invention only, and the invention is not necessarily limited to or by any of the embodiments, aspects or features described or exemplified.
As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention in virtually any appropriately detailed structure.
Certain terminology will be used in the following description for convenience in reference only and will not be limiting. For example, up, down, front, back, right and left refer to the invention as oriented in the view being referred to. The words “inwardly” and “outwardly” refer to directions toward and away from, respectively, the geometric center of the embodiment being described and designated parts thereof. Global navigation satellite systems (GNSS) are broadly defined to include GPS (U.S.), Galileo (proposed), GLONASS (Russia), Beidou (China), Compass (proposed), IRNSS (India, proposed), QZSS (Japan, proposed) and other current and future positioning technology using signals from satellites, using single or multiple antennae, with or without augmentation from terrestrial sources. Inertial navigation systems (INS) include gyroscopic (gyro) sensors, accelerometers and similar technologies for providing output corresponding to the inertia of moving components in all axes, i.e. through six degrees of freedom (positive and negative directions along longitudinal X, transverse Y and vertical Z axes). Yaw, pitch and roll refer to moving component rotation about the Z, Y and X axes respectively. Said terminology will include the words specifically mentioned, derivatives thereof and words of similar meaning.
The control system 2 includes a controller 14. The controller 14 suitably includes a computer memory that is capable of having an initial path of travel entered therein. The computer memory is also adapted to store or generate a desired path of travel. The controller 14 receives position and attitude signals from one or more sensors (to be described later) and the data received from the sensors are used by the controller to determine or calculate the position and attitude of the tractor. The controller 14 then compares the position and attitude of the tractor 10 with the desired position and attitude. If the determined or calculated position and attitude of the tractor deviates from the desired position and attitude, the controller 14 issues a steering correction signal that interacts with a steering control mechanism, e.g., a steering valve block 15. In response to the steering correction signal, the steering control mechanism makes adjustments to the angle of steering of the tractor 10, to thereby assist in moving the tractor back towards the desired path of travel. The steering control mechanism 15 may comprise one or more mechanical or electrical controllers or devices that can automatically adjust the steering angle of the vehicle. These devices may act upon the steering pump, the steering column and/or steering linkages. U.S. Pat. No. 6,711,501, which is incorporated herein by reference, shows a GNSS-based navigation system including a graphical user interface (GUI) for use by an operator in guiding a vehicle in swathing operations.
In one embodiment of the present invention, the steering control algorithm may be similar to that described in U.S. Pat. No. 6,876,920, which is incorporated herein by reference and discloses a steering control algorithm that involves entering an initial path of travel (often referred to as a wayline). GNSS ranging signals can be received at antennas 20 mounted on the tractor 10 and the implement 12 and connected to a GNSS receiver 13, which in turn provides positioning information, typically in an absolute or relative geo-reference frame, to the controller 14. The computer in the controller 14 then determines or calculates the desired path of travel, for example, by determining the offset of the implement 12 being towed by the tractor 10 and generating a series of parallel paths spaced apart from each other by the offset of the implement 12. This ensures that an optimal working of the field is obtained. The vehicle then commences moving along the desired path of travel. One or more sensors provide position and attitude signals to the controller 14, which uses these position and attitude signals to determine or calculate the position and attitude of the vehicle 11. This position and attitude is then compared with the desired position and attitude. If the vehicle 11 is spaced away from the desired path of travel, or is pointing away from the desired path, the controller 14 generates a steering correction signal. The steering correction signal may be generated, for example, by using the difference between the determined position and attitude of the vehicle 11 and the desired position and attitude to generate an error signal, with the magnitude of the error signal being dependent upon the difference between the determined position and attitude and the desired position and attitude of the vehicle 11. The error signal may take the form of a curvature demand signal that acts to steer the vehicle 11 back onto the desired path of travel. Steering angle sensors in the steering control mechanism, which includes a steering valve block 15, can monitor the angle of the steering wheels of the tractor 10 and send the data back to the controller 14, to thereby allow for correction of understeering or oversteering.
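By way of illustration only, the mapping from position and heading errors to a curvature demand signal can be sketched as follows. The simple proportional form and the gain values are assumptions made for the example; the control laws actually used are those of the patents cited above.

```python
def curvature_demand(cross_track_error_m, heading_error_rad,
                     k_xte=0.2, k_heading=1.0):
    """Combine cross-track and heading error into a curvature demand (1/m).

    The sign convention assumes positive cross-track error means the vehicle
    is to the left of the wayline; gains would be tuned per vehicle."""
    return -(k_xte * cross_track_error_m + k_heading * heading_error_rad)
```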
In an alternative embodiment, the error signal may result in generation of a steering guidance arrow on a visual display unit to thereby enable the driver of the vehicle to properly steer the vehicle back onto the desired path of travel. For example, see U.S. Pat. No. 6,711,501. This manual control indicator may also be provided in conjunction with the steering controls 15 as described above.
It will be appreciated that the invention is by no means limited to the particular algorithm described, and that a wide variety of other steering control algorithms may also be used.
In general terms, most, if not all, steering control algorithms operate by comparing a determined or calculated position and attitude of the vehicle with a desired position and attitude of the vehicle. The desired position and attitude of the vehicle is typically determined from the path of travel that is entered into, or stored in, or generated by, the controller. The determined or calculated position and attitude of the vehicle is, in most, if not all, cases determined by having input data from one or more sensors being used to determine or calculate the position and attitude of the vehicle. In U.S. Pat. No. 6,876,920, which is incorporated herein by reference, GNSS sensors, accelerometers, wheel angle sensors and gyroscopes are used as the sensors in preferred embodiments of that patent.
Returning now to
The tractor 10 shown in
In the embodiment shown in
The optical tracking movement sensor 16 may comprise the operative part of an optical computer mouse. Optical computer mice incorporate an optoelectronic sensor that takes successive pictures of the surface on which the mouse operates. Most optical computer mice use a light source to illuminate the surface that is being tracked. Changes between one frame and the next are processed by an image processing part of a chip embedded in the mouse and this translates the movement of the mouse into movement on two axes using a digital correlation algorithm. The optical movement sensor 16 may include an illumination source for emitting light therefrom. The illumination source may comprise one or more LEDs. The optical movement sensor may also include an illumination detector for detecting light reflected from the ground or the surface over which the vehicle is travelling. Appropriate optical components, such as a lens (preferably a telecentric lens), may be utilized to properly focus the emitted or detected light. A cleaning system, such as a stream of air or other cleaning fluid, may be used to keep the optical path clean. The optical movement sensor may comprise a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor. The optical movement sensor 16 may also include an integrated chip that can rapidly determine the relative movement along an axis of the vehicle and the relative movement across an axis of the vehicle by analysing successive frames captured by the illumination detector. The optical movement sensor can complete hundreds to thousands of calculations per second.
The optical movement sensor 16 generates signals that are indicative of the relative movement of the vehicle along the vehicle's axis and the relative movement of the vehicle across the vehicle's axis. The signals are sent to the controller 14. The signals received by the controller 14 are used to progressively calculate or determine changes in the position and attitude of the vehicle. In the embodiment shown in
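By way of illustration only, the progressive calculation of position from the optical movement sensor's relative-movement signals can be sketched as a dead-reckoning update. In this Python sketch the heading increment is assumed to come from another source (a second sensor or a gyroscope), as discussed above; the function and parameter names are illustrative.

```python
import math

def propagate_pose(x, y, heading, d_along, d_across, d_heading):
    """Advance the estimated pose by one optical movement sensor report.

    d_along, d_across: displacements along and across the vehicle axis (m)
    reported by the sensor; d_heading: heading change (rad) from another
    source. Rotates the body-frame displacement into the ground frame and
    accumulates it."""
    x += d_along * math.cos(heading) - d_across * math.sin(heading)
    y += d_along * math.sin(heading) + d_across * math.cos(heading)
    heading += d_heading
    return x, y, heading
```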
Only one optical movement sensor 16 is illustrated in
The embodiment shown in
The GNSS receiver(s) 113 on the tractor 110 receives GNSS signals from the constellation of GNSS satellites via GNSS antenna 120 mounted on the tractor 110. The signals are sent to the controller 114. The signals received from GNSS receiver(s) 113 on tractor 110 are corrected by the error correction signal sent from the transmitter 136. Thus, an accurate determination of position of the tractor 110 can be obtained from the differential GNSS system. The differential GNSS positioning system thus described can comprise a real-time kinematic (RTK) system using the carrier phase of the satellite ranging signals with the rover receiver(s) 113 in motion on the vehicle 111. RTK systems tend to be relatively accurate, and are capable of achieving sub-centimeter precision guidance.
The controller 114 also receives position signals from the optical movement sensor 116. As described above with reference to the embodiment in
The embodiment shown in
The inertial sensors 221 provide relative position and attitude information to the controller 214. Similarly, the optical movement sensor 216 also provides relative position and attitude information to controller 214. The controller 214 uses both sets of information to obtain a more accurate determination of the position and attitude of the vehicle. This will be described in greater detail hereunder. Also, as described above with reference to the embodiments in
The embodiment shown in
In
The optical movement sensor (OMS) 16 of
In state space representations, the variables or parameters used to mathematically model the motion of the vehicle, or aspects of its operation, are referred to as “states” xi. In the present case, the states may include the vehicle's position (x, y), velocity (dx/dt, dy/dt), heading h, radius of curvature r, etc. Hence the states may include x1 = x, x2 = y, x3 = h, x4 = dh/dt, etc. However, it will be appreciated that the choice of states is never unique, and the meaning and implications of this will be well understood by those skilled in the art.
The values for the individual states at a given time are represented as the individual entries in an n×1 “state vector”:
X(t) = [x1(t) x2(t) x3(t) x4(t) . . . xn(t)]^T
where n is the number of states.
In general, the mathematical model used to model the vehicle's motion and aspects of its operation will comprise a series of differential equations. The number of equations will be the same as the number of states. In some cases, the differential equations will be linear in terms of the states, whereas in other situations the equations may be nonlinear in which case they must generally be “linearized” about a point in the “state space”. Linearization techniques that may be used to do this will be well known to those skilled in this area.
Next, by noting that any jth order linear differential equation can be re-written equivalently as a set of j first order linear differential equations, the linear (or linearized) equations that represent the model can be expressed using the following “state” equation:

dX(t)/dt = AX(t) + BU(t) + Ew(t)

where A is the n×n “state matrix” relating the states to their rates of change, B is the “input matrix” relating the inputs U(t) to the rates of change of the states, w(t) is a vector of (generally unknown) process noise or disturbance inputs, and E is the matrix that couples those noise inputs into the states.
The quantities that are desired to be known about the vehicle (the real values for which are generally also measured from the vehicle itself, if possible) are the outputs y, from the model. Each of the outputs generated by the linear (or linearized) model comprises a linear combination of the states xi and inputs ui, and so the outputs can be defined by the “output” or “measurement” equation:
Y(t) = CX(t) + DU(t) + Mv(t)

where C is the “output matrix” relating the states to the outputs, D is the “feedthrough matrix” relating the inputs directly to the outputs, v(t) is a vector of (generally unknown) measurement noise, and M is the matrix that couples the measurement noise into the outputs.
Next, it will be noted that both the state equation and the measurement equation defined above are continuous functions of time. However, continuous time functions do not often lend themselves to easy digital implementation (such as will generally be required in implementing the present invention) because digital control systems generally operate as recursively repeating algorithms. Therefore, for the purpose of implementing the equations digitally, the continuous time equations may be converted into the following recursive discrete time equations by making the substitutions set out below and noting that (according to the principle of superposition) the overall response of a linear system is the sum of the free (unforced) response of that system and the responses of that system due to forcing/driving inputs. The recursive discrete time equations are:
Xk+1 = FXk + GUk+1 + Lwk+1

Yk+1 = ZXk+1 + JUk+1 + Nvk+1
where F is the discrete time state transition matrix (for a fixed time step Δt, F = exp(AΔt)), G is the discrete time input matrix, Z and J play the same roles in discrete time as C and D do in continuous time, and Lwk+1 and Nvk+1 are the discrete time process noise and measurement noise terms respectively.
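By way of illustration only, the discrete time matrices can be obtained numerically from the continuous time model; the Python sketch below uses the standard augmented matrix exponential for the deterministic part (the stochastic term Lwk+1 is what the statistical filtering discussed next accounts for).

```python
import numpy as np
from scipy.linalg import expm

def discretize(A, B, dt):
    """Convert dX/dt = AX + BU into X[k+1] = F X[k] + G U[k+1] for a fixed
    time step dt, assuming the input is held constant over each step.

    Uses exp([[A, B], [0, 0]] * dt) = [[F, G], [0, I]]."""
    n, m = B.shape
    M = np.zeros((n + m, n + m))
    M[:n, :n], M[:n, n:] = A, B
    Md = expm(M * dt)
    return Md[:n, :n], Md[:n, n:]   # F, G
```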
However, as noted above, the quantity Ew(t) is not deterministic and so the integral defining Lwk+1 cannot be performed (even numerically). It is for this reason that it is preferable to use statistical filtering techniques. The optimal estimators shown in
In general, a Kalman filter operates as a “predictor-corrector” algorithm. Hence, the algorithm operates by first using the mathematical model to “predict” the value of each of the states at time step k+1 based on the known inputs at time step k+1 and the known value of the states from the previous time step k. It then “corrects” the predicted value using actual measurements taken from the vehicle at time step k+1 and the optimised statistical properties of the model. In summary, the Kalman filter comprises the following equations, each of which is computed in the following order for each time step:

Xk+1|k = FXk|k + GUk+1 (predicted state)

Pk+1|k = FPk|kF^T + Q (predicted covariance)

Kk+1 = Pk+1|kZ^T(ZPk+1|kZ^T + R)^−1 (Kalman gain)

Xk+1|k+1 = Xk+1|k + Kk+1(Yk+1 − ZXk+1|k − JUk+1) (corrected state)

Pk+1|k+1 = (I − Kk+1Z)Pk+1|k (corrected covariance)

where P is the covariance of the state estimate, Q and R are the covariances of the process noise and measurement noise respectively, Kk+1 is the Kalman gain and I is the identity matrix.
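By way of illustration only, one predictor-corrector step in the notation above can be written as the following Python sketch; the matrix names follow the equations just given, and the implementation details (e.g. the covariance update form) are standard textbook choices rather than anything prescribed here.

```python
import numpy as np

def kalman_step(x, P, u, y, F, G, Z, J, Q, R):
    """One discrete-time Kalman filter step.

    x, P: state estimate and covariance at step k;
    u, y: input and measurement vectors at step k+1;
    F, G, Z, J: discrete model matrices; Q, R: noise covariances."""
    # Predict: propagate the model one step forward
    x_pred = F @ x + G @ u
    P_pred = F @ P @ F.T + Q
    # Correct: blend in the measurement via the Kalman gain
    S = Z @ P_pred @ Z.T + R                # innovation covariance
    K = P_pred @ Z.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ (y - (Z @ x_pred + J @ u))
    P_new = (np.eye(len(x)) - K @ Z) @ P_pred
    return x_new, P_new
```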
The operation of the discrete time Kalman filter which may be used in the optimal estimator 60 of the present invention is schematically illustrated in
Returning now to
The error calculation module 62 uses the statistically optimal estimate of the position and attitude of the tractor obtained from the estimator 60 and the desired position and attitude of the tractor 10 determined from the required control path to calculate the error in position and attitude. This may be calculated as an error in the x-coordinate, an error in the y-coordinate and/or an error in the heading of the position and attitude of the tractor 10. These error values are represented as “Ex”, “Ey” and “Eh” in
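By way of illustration only, the error terms Ex, Ey and Eh can be computed as simple differences between the estimated and desired pose, with the heading error wrapped to avoid spurious full-turn jumps; the Python function below is a sketch made for the example, not the error calculation module itself.

```python
import math

def pose_errors(est_x, est_y, est_h, ref_x, ref_y, ref_h):
    """Errors between the estimated pose and the desired pose on the
    control path (headings in radians)."""
    ex = est_x - ref_x
    ey = est_y - ref_y
    # Wrap the heading error into [-pi, pi) so small deviations stay small
    eh = (est_h - ref_h + math.pi) % (2 * math.pi) - math.pi
    return ex, ey, eh
```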
In cases where a GNSS outage occurs, the optical movement sensor 116 continues to provide position and attitude data to the optimal estimator 160. In such circumstances, control of the vehicle 111 can be effected by the information received from the optical movement sensor alone.
As a further benefit arising from the system shown in
The embodiments shown in
The present invention provides control systems that can be used to control the movement of the vehicle and/or an implement associated with the vehicle. The control system includes an optical movement sensor that may be the operative part of an optical computer mouse. These optical movement sensors are relatively inexpensive, provide a high processing rate and utilise proven technology. Due to the high processing rate of such optical movement sensors, the control system has a high clock speed and therefore a high frequency of updating of the determined or calculated position of the vehicle or implement. The optical movement sensor may be used by itself or it may be used in conjunction with a GNSS system, one or more inertial sensors, or one or more vehicle based sensors. The optical movement sensor can be used to augment the accuracy of inertial and/or other sensors. In particular, the optical movement sensor can be used to debias yaw drift that is often inherent in inertial sensors.
Those skilled in the art will appreciate that the present invention may be susceptible to variations and modifications other than those specifically described. It is to be understood that the present invention encompasses all such variations and modifications that fall within its spirit and scope.
Number | Name | Date | Kind |
---|---|---|---|
3585537 | Rennick et al. | Jun 1971 | A |
3596228 | Reed, Jr. et al. | Jul 1971 | A |
3727710 | Sanders et al. | Apr 1973 | A |
3815272 | Marleau | Jun 1974 | A |
3899028 | Morris et al. | Aug 1975 | A |
3987456 | Gelin | Oct 1976 | A |
4132272 | Holloway et al. | Jan 1979 | A |
4170776 | MacDoran et al. | Oct 1979 | A |
4180133 | Collogan et al. | Dec 1979 | A |
4398162 | Nagai | Aug 1983 | A |
4453614 | Allen et al. | Jun 1984 | A |
4529990 | Brunner | Jul 1985 | A |
4637474 | Leonard | Jan 1987 | A |
4667203 | Counselman, III | May 1987 | A |
4689556 | Cedrone | Aug 1987 | A |
4694264 | Owens et al. | Sep 1987 | A |
4710775 | Coe | Dec 1987 | A |
4714435 | Stipanuk et al. | Dec 1987 | A |
4739448 | Rowe et al. | Apr 1988 | A |
4751512 | Longaker | Jun 1988 | A |
4769700 | Pryor | Sep 1988 | A |
4785463 | Janc et al. | Nov 1988 | A |
4802545 | Nystuen et al. | Feb 1989 | A |
4812991 | Hatch | Mar 1989 | A |
4858132 | Holmquist | Aug 1989 | A |
4864320 | Munson et al. | Sep 1989 | A |
4894662 | Counselman | Jan 1990 | A |
4916577 | Dawkins | Apr 1990 | A |
4918607 | Wible | Apr 1990 | A |
4963889 | Hatch | Oct 1990 | A |
5031704 | Fleischer et al. | Jul 1991 | A |
5100229 | Lundberg et al. | Mar 1992 | A |
5134407 | Lorenz et al. | Jul 1992 | A |
5148179 | Allison | Sep 1992 | A |
5152347 | Miller | Oct 1992 | A |
5155490 | Spradley et al. | Oct 1992 | A |
5155493 | Thursby et al. | Oct 1992 | A |
5156219 | Schmidt et al. | Oct 1992 | A |
5165109 | Han et al. | Nov 1992 | A |
5173715 | Rodal et al. | Dec 1992 | A |
5177489 | Hatch | Jan 1993 | A |
5185610 | Ward et al. | Feb 1993 | A |
5191351 | Hofer et al. | Mar 1993 | A |
5202829 | Geier | Apr 1993 | A |
5207239 | Schwitalia | May 1993 | A |
5239669 | Mason et al. | Aug 1993 | A |
5255756 | Follmer et al. | Oct 1993 | A |
5268695 | Dentinger et al. | Dec 1993 | A |
5293170 | Lorenz et al. | Mar 1994 | A |
5294970 | Dornbusch et al. | Mar 1994 | A |
5296861 | Knight | Mar 1994 | A |
5311149 | Wagner et al. | May 1994 | A |
5323322 | Mueller et al. | Jun 1994 | A |
5334987 | Teach | Aug 1994 | A |
5343209 | Sennott et al. | Aug 1994 | A |
5345245 | Ishikawa et al. | Sep 1994 | A |
5359332 | Allison et al. | Oct 1994 | A |
5361212 | Class et al. | Nov 1994 | A |
5365447 | Dennis | Nov 1994 | A |
5369589 | Steiner | Nov 1994 | A |
5375059 | Kyrtsos et al. | Dec 1994 | A |
5390124 | Kyrtsos | Feb 1995 | A |
5390125 | Sennott et al. | Feb 1995 | A |
5390207 | Fenton et al. | Feb 1995 | A |
5416712 | Geier et al. | May 1995 | A |
5442363 | Remondi | Aug 1995 | A |
5444453 | Lalezari | Aug 1995 | A |
5451964 | Babu | Sep 1995 | A |
5467282 | Dennis | Nov 1995 | A |
5471217 | Hatch et al. | Nov 1995 | A |
5476147 | Fixemer | Dec 1995 | A |
5477228 | Tiwari et al. | Dec 1995 | A |
5477458 | Loomis | Dec 1995 | A |
5490073 | Kyrtsos | Feb 1996 | A |
5491636 | Robertson | Feb 1996 | A |
5495257 | Loomis | Feb 1996 | A |
5504482 | Schreder | Apr 1996 | A |
5511623 | Frasier | Apr 1996 | A |
5519620 | Talbot et al. | May 1996 | A |
5521610 | Rodal | May 1996 | A |
5523761 | Gildea | Jun 1996 | A |
5534875 | Diefes et al. | Jul 1996 | A |
5543804 | Buchler et al. | Aug 1996 | A |
5546093 | Gudat et al. | Aug 1996 | A |
5548293 | Cohen et al. | Aug 1996 | A |
5561432 | Knight | Oct 1996 | A |
5563786 | Torii | Oct 1996 | A |
5568152 | Janky et al. | Oct 1996 | A |
5568162 | Samsel et al. | Oct 1996 | A |
5583513 | Cohen | Dec 1996 | A |
5589835 | Gildea et al. | Dec 1996 | A |
5592382 | Colley | Jan 1997 | A |
5596328 | Stangeland et al. | Jan 1997 | A |
5600670 | Turney | Feb 1997 | A |
5604506 | Rodal | Feb 1997 | A |
5608393 | Hartman | Mar 1997 | A |
5610522 | Locatelli et al. | Mar 1997 | A |
5610616 | Vallot et al. | Mar 1997 | A |
5610845 | Slabinski | Mar 1997 | A |
5612883 | Shaffer et al. | Mar 1997 | A |
5615116 | Gudat et al. | Mar 1997 | A |
5617100 | Akiyoshi et al. | Apr 1997 | A |
5617317 | Ignagni | Apr 1997 | A |
5621646 | Enge et al. | Apr 1997 | A |
5638077 | Martin | Jun 1997 | A |
5644139 | Allen et al. | Jul 1997 | A |
5664632 | Frasier | Sep 1997 | A |
5673491 | Brenna et al. | Oct 1997 | A |
5680140 | Loomis | Oct 1997 | A |
5684696 | Rao et al. | Nov 1997 | A |
5706015 | Chen et al. | Jan 1998 | A |
5717593 | Gvili | Feb 1998 | A |
5725230 | Walkup | Mar 1998 | A |
5731786 | Abraham et al. | Mar 1998 | A |
5739785 | Allison et al. | Apr 1998 | A |
5757316 | Buchler | May 1998 | A |
5765123 | Nimura et al. | Jun 1998 | A |
5777578 | Chang et al. | Jul 1998 | A |
5810095 | Orbach et al. | Sep 1998 | A |
5828336 | Yunck et al. | Oct 1998 | A |
5838562 | Gudat et al. | Nov 1998 | A |
5854987 | Sekine et al. | Dec 1998 | A |
5862501 | Talbot et al. | Jan 1999 | A |
5864315 | Welles et al. | Jan 1999 | A |
5864318 | Cozenza et al. | Jan 1999 | A |
5875408 | Bendett et al. | Feb 1999 | A |
5877725 | Kalafus | Mar 1999 | A |
5890091 | Talbot et al. | Mar 1999 | A |
5899957 | Loomis | May 1999 | A |
5906645 | Kagawa et al. | May 1999 | A |
5912798 | Chu | Jun 1999 | A |
5914685 | Kozlov et al. | Jun 1999 | A |
5917448 | Mickelson | Jun 1999 | A |
5918558 | Susag | Jul 1999 | A |
5919242 | Greatline et al. | Jul 1999 | A |
5923270 | Sampo et al. | Jul 1999 | A |
5926079 | Heine et al. | Jul 1999 | A |
5927603 | McNabb | Jul 1999 | A |
5928309 | Korver et al. | Jul 1999 | A |
5929721 | Munn et al. | Jul 1999 | A |
5933110 | Tang | Aug 1999 | A |
5935183 | Sahm et al. | Aug 1999 | A |
5936573 | Smith | Aug 1999 | A |
5940026 | Popeck | Aug 1999 | A |
5941317 | Mansur | Aug 1999 | A |
5943008 | Van Dusseldorp | Aug 1999 | A |
5944770 | Enge et al. | Aug 1999 | A |
5945917 | Harry | Aug 1999 | A |
5949371 | Nichols | Sep 1999 | A |
5955973 | Anderson | Sep 1999 | A |
5956250 | Gudat et al. | Sep 1999 | A |
5969670 | Kalafus et al. | Oct 1999 | A |
5987383 | Keller et al. | Nov 1999 | A |
6014101 | Loomis | Jan 2000 | A |
6014608 | Seo | Jan 2000 | A |
6018313 | Englemayer et al. | Jan 2000 | A |
6023239 | Kovach | Feb 2000 | A |
6052647 | Parkinson et al. | Apr 2000 | A |
6055477 | McBurney et al. | Apr 2000 | A |
6057800 | Yang et al. | May 2000 | A |
6061390 | Meehan et al. | May 2000 | A |
6061632 | Dreier | May 2000 | A |
6062317 | Gharsalli | May 2000 | A |
6069583 | Silvestrin et al. | May 2000 | A |
6076612 | Carr et al. | Jun 2000 | A |
6081171 | Ella | Jun 2000 | A |
6100842 | Dreier et al. | Aug 2000 | A |
6104978 | Harrison et al. | Aug 2000 | A |
6122595 | Varley et al. | Sep 2000 | A |
6128574 | Diekhans | Oct 2000 | A |
6144335 | Rogers | Nov 2000 | A |
6191730 | Nelson, Jr. | Feb 2001 | B1 |
6191733 | Dizchavez | Feb 2001 | B1 |
6198430 | Hwang et al. | Mar 2001 | B1 |
6198992 | Winslow | Mar 2001 | B1 |
6199000 | Keller et al. | Mar 2001 | B1 |
6205401 | Pickhard et al. | Mar 2001 | B1 |
6215828 | Signell et al. | Apr 2001 | B1 |
6229479 | Kozlov et al. | May 2001 | B1 |
6230097 | Dance et al. | May 2001 | B1 |
6233511 | Berger et al. | May 2001 | B1 |
6236916 | Staub et al. | May 2001 | B1 |
6236924 | Motz | May 2001 | B1 |
6253160 | Hanseder | Jun 2001 | B1 |
6256583 | Sutton | Jul 2001 | B1 |
6259398 | Riley | Jul 2001 | B1 |
6266595 | Greatline et al. | Jul 2001 | B1 |
6275705 | Drane et al. | Aug 2001 | B1 |
6285320 | Olster et al. | Sep 2001 | B1 |
6292132 | Wilson | Sep 2001 | B1 |
6307505 | Green | Oct 2001 | B1 |
6313788 | Wilson | Nov 2001 | B1 |
6314348 | Winslow | Nov 2001 | B1 |
6325684 | Knight | Dec 2001 | B1 |
6336066 | Pellenc et al. | Jan 2002 | B1 |
6345231 | Quincke | Feb 2002 | B2 |
6356602 | Rodal et al. | Mar 2002 | B1 |
6377889 | Soest | Apr 2002 | B1 |
6380888 | Kucik | Apr 2002 | B1 |
6389345 | Phelps | May 2002 | B2 |
6392589 | Rogers et al. | May 2002 | B1 |
6397147 | Whitehead | May 2002 | B1 |
6415229 | Diekhans | Jul 2002 | B1 |
6418031 | Archambeault | Jul 2002 | B1 |
6421003 | Riley et al. | Jul 2002 | B1 |
6424915 | Fukuda et al. | Jul 2002 | B1 |
6431576 | Viaud et al. | Aug 2002 | B1 |
6434462 | Bevly et al. | Aug 2002 | B1 |
6445983 | Dickson et al. | Sep 2002 | B1 |
6445990 | Manring | Sep 2002 | B1 |
6449558 | Small | Sep 2002 | B1 |
6463091 | Zhodzicshsky et al. | Oct 2002 | B1 |
6463374 | Keller et al. | Oct 2002 | B1 |
6466871 | Reisman et al. | Oct 2002 | B1 |
6469663 | Whitehead et al. | Oct 2002 | B1 |
6484097 | Fuchs et al. | Nov 2002 | B2 |
6501422 | Nichols | Dec 2002 | B1 |
6515619 | McKay, Jr. | Feb 2003 | B1 |
6516271 | Upadhyaya et al. | Feb 2003 | B2 |
6539303 | McClure et al. | Mar 2003 | B2 |
6542077 | Joao | Apr 2003 | B2 |
6549835 | Deguchi | Apr 2003 | B2 |
6553299 | Keller et al. | Apr 2003 | B1 |
6553300 | Ma et al. | Apr 2003 | B2 |
6553311 | Ahearn et al. | Apr 2003 | B2 |
6570534 | Cohen et al. | May 2003 | B2 |
6577952 | Geier et al. | Jun 2003 | B2 |
6587761 | Kumar | Jul 2003 | B2 |
6606542 | Hauwiller et al. | Aug 2003 | B2 |
6611228 | Toda et al. | Aug 2003 | B2 |
6611754 | Klein | Aug 2003 | B2 |
6611755 | Coffee et al. | Aug 2003 | B1 |
6622091 | Perlmutter et al. | Sep 2003 | B2 |
6631394 | Ronkka et al. | Oct 2003 | B1 |
6631916 | Miller | Oct 2003 | B1 |
6643576 | O'Connor et al. | Nov 2003 | B1 |
6646603 | Dooley et al. | Nov 2003 | B2 |
6657875 | Zeng et al. | Dec 2003 | B1 |
6671587 | Hrovat et al. | Dec 2003 | B2 |
6686878 | Lange | Feb 2004 | B1 |
6688403 | Bernhardt et al. | Feb 2004 | B2 |
6703973 | Nichols | Mar 2004 | B1 |
6711501 | McClure et al. | Mar 2004 | B2 |
6721638 | Zeitler | Apr 2004 | B2 |
6732024 | Rekow et al. | May 2004 | B2 |
6744404 | Whitehead et al. | Jun 2004 | B1 |
6754584 | Pinto et al. | Jun 2004 | B2 |
6774843 | Takahashi | Aug 2004 | B2 |
6792380 | Toda | Sep 2004 | B2 |
6819269 | Flick | Nov 2004 | B2 |
6822314 | Beasom | Nov 2004 | B2 |
6865465 | McClure | Mar 2005 | B2 |
6865484 | Miyasaka et al. | Mar 2005 | B2 |
6879283 | Bird et al. | Apr 2005 | B1 |
6900992 | Kelly et al. | May 2005 | B2 |
6922635 | Rorabaugh | Jul 2005 | B2 |
6931233 | Tso et al. | Aug 2005 | B1 |
6961018 | Heppe et al. | Nov 2005 | B2 |
6967538 | Woo | Nov 2005 | B2 |
6990399 | Hrazdera et al. | Jan 2006 | B2 |
7006032 | King et al. | Feb 2006 | B2 |
7026982 | Toda et al. | Apr 2006 | B2 |
7027918 | Zimmerman et al. | Apr 2006 | B2 |
7031725 | Rorabaugh | Apr 2006 | B2 |
7089099 | Shostak et al. | Aug 2006 | B2 |
7142956 | Heiniger et al. | Nov 2006 | B2 |
7155335 | Rennels | Dec 2006 | B2 |
7162348 | McClure et al. | Jan 2007 | B2 |
7191061 | McKay et al. | Mar 2007 | B2 |
7221314 | Brabec et al. | May 2007 | B2 |
7231290 | Steichen et al. | Jun 2007 | B2 |
7248211 | Hatch et al. | Jul 2007 | B2 |
7271766 | Zimmerman et al. | Sep 2007 | B2 |
7277784 | Weiss | Oct 2007 | B2 |
7292186 | Miller et al. | Nov 2007 | B2 |
7324915 | Altmann | Jan 2008 | B2 |
7358896 | Gradincic et al. | Apr 2008 | B2 |
7373231 | McClure et al. | May 2008 | B2 |
7388539 | Whitehead et al. | Jun 2008 | B2 |
7395769 | Jensen | Jul 2008 | B2 |
7400956 | Feller et al. | Jul 2008 | B1 |
7428259 | Wang et al. | Sep 2008 | B2 |
7437230 | McClure et al. | Oct 2008 | B2 |
7451030 | Eglington et al. | Nov 2008 | B2 |
7479900 | Horstemeyer | Jan 2009 | B2 |
7505848 | Flann et al. | Mar 2009 | B2 |
7522099 | Zhodzishsky et al. | Apr 2009 | B2 |
7522100 | Yang et al. | Apr 2009 | B2 |
7571029 | Dai et al. | Aug 2009 | B2 |
7689354 | Heiniger et al. | Mar 2010 | B2 |
20010004601 | Drane et al. | Jun 2001 | A1 |
20030014171 | Ma et al. | Jan 2003 | A1 |
20030093210 | Kondo et al. | May 2003 | A1 |
20030187560 | Keller et al. | Oct 2003 | A1 |
20030208319 | Ell et al. | Nov 2003 | A1 |
20040039514 | Steichen et al. | Feb 2004 | A1 |
20040186644 | McClure et al. | Sep 2004 | A1 |
20040212533 | Whitehead et al. | Oct 2004 | A1 |
20050080559 | Ishibashi et al. | Apr 2005 | A1 |
20050225955 | Grebenkemper et al. | Oct 2005 | A1 |
20050265494 | Goodlings | Dec 2005 | A1 |
20060031664 | Wilson et al. | Feb 2006 | A1 |
20060095172 | Abramovitch et al. | May 2006 | A1 |
20060167600 | Nelson et al. | Jul 2006 | A1 |
20060206246 | Walker | Sep 2006 | A1 |
20060215739 | Williamson et al. | Sep 2006 | A1 |
20070069924 | Goren | Mar 2007 | A1 |
20070078570 | Dai et al. | Apr 2007 | A1 |
20070088447 | Stothert et al. | Apr 2007 | A1 |
20070112700 | Den Haan et al. | May 2007 | A1 |
20070121708 | Simpson | May 2007 | A1 |
20070205940 | Yang et al. | Sep 2007 | A1 |
20070285308 | Bauregger et al. | Dec 2007 | A1 |
20080129586 | Martin | Jun 2008 | A1 |
20080204312 | Euler | Aug 2008 | A1 |
20080269988 | Feller et al. | Oct 2008 | A1 |
20080284643 | Scherzinger et al. | Nov 2008 | A1 |
20090093959 | Scherzinger et al. | Apr 2009 | A1 |
20090160951 | Anderson et al. | Jun 2009 | A1 |
20090164067 | Whitehead et al. | Jun 2009 | A1 |
20090171583 | DiEsposti | Jul 2009 | A1 |
20090174597 | DiLellio et al. | Jul 2009 | A1 |
20090174622 | Kanou | Jul 2009 | A1 |
20090177395 | Stelpstra | Jul 2009 | A1 |
20090177399 | Park et al. | Jul 2009 | A1 |
20090259397 | Stanton | Oct 2009 | A1 |
20090259707 | Martin et al. | Oct 2009 | A1 |
20090262014 | DiEsposti | Oct 2009 | A1 |
20090262018 | Vasilyev et al. | Oct 2009 | A1 |
20090262974 | Lithopoulos | Oct 2009 | A1 |
20090265054 | Basnayake | Oct 2009 | A1 |
20090265101 | Jow | Oct 2009 | A1 |
20090265104 | Shroff | Oct 2009 | A1 |
20090273372 | Brenner | Nov 2009 | A1 |
20090273513 | Huang | Nov 2009 | A1 |
20090274079 | Bhatia et al. | Nov 2009 | A1 |
20090274113 | Katz | Nov 2009 | A1 |
20090276155 | Jeerage et al. | Nov 2009 | A1 |
20090295633 | Pinto et al. | Dec 2009 | A1 |
20090295634 | Yu et al. | Dec 2009 | A1 |
20090299550 | Baker | Dec 2009 | A1 |
20090322597 | Medina Herrero et al. | Dec 2009 | A1 |
20090322598 | Fly et al. | Dec 2009 | A1 |
20090322600 | Whitehead et al. | Dec 2009 | A1 |
20090322601 | Ladd et al. | Dec 2009 | A1 |
20090322606 | Gronemeyer | Dec 2009 | A1 |
20090326809 | Colley et al. | Dec 2009 | A1 |
20100013703 | Tekawy et al. | Jan 2010 | A1 |
20100026569 | Amidi | Feb 2010 | A1 |
20100030470 | Wang et al. | Feb 2010 | A1 |
20100039316 | Gronemeyer et al. | Feb 2010 | A1 |
20100039318 | Kmiecik | Feb 2010 | A1 |
20100039320 | Boyer et al. | Feb 2010 | A1 |
20100039321 | Abraham | Feb 2010 | A1 |
20100060518 | Bar-Sever et al. | Mar 2010 | A1 |
20100063649 | Wu et al. | Mar 2010 | A1 |
20100084147 | Aral | Apr 2010 | A1 |
20100085249 | Ferguson et al. | Apr 2010 | A1 |
20100085253 | Ferguson et al. | Apr 2010 | A1 |
20100103033 | Roh | Apr 2010 | A1 |
20100103034 | Tobe et al. | Apr 2010 | A1 |
20100103038 | Yeh et al. | Apr 2010 | A1 |
20100103040 | Broadbent | Apr 2010 | A1 |
20100106414 | Whitehead | Apr 2010 | A1 |
20100106445 | Kondoh | Apr 2010 | A1 |
20100109944 | Whitehead et al. | May 2010 | A1 |
20100109945 | Roh | May 2010 | A1 |
20100109947 | Rintanen | May 2010 | A1 |
20100109948 | Razoumov et al. | May 2010 | A1 |
20100109950 | Roh | May 2010 | A1 |
20100111372 | Zheng et al. | May 2010 | A1 |
20100114483 | Heo et al. | May 2010 | A1 |
20100117894 | Velde et al. | May 2010 | A1 |
20100117899 | Papadimitratos et al. | May 2010 | A1 |
20100117900 | van Diggelen et al. | May 2010 | A1 |
20100121577 | Zhang et al. | May 2010 | A1 |
20100124210 | Lo | May 2010 | A1 |
20100124212 | Lo | May 2010 | A1 |
20100134354 | Lennen | Jun 2010 | A1 |
20100149025 | Meyers et al. | Jun 2010 | A1 |
20100149030 | Verma et al. | Jun 2010 | A1 |
20100149033 | Abraham | Jun 2010 | A1 |
20100149034 | Chen | Jun 2010 | A1 |
20100149037 | Cho | Jun 2010 | A1 |
20100150284 | Fielder et al. | Jun 2010 | A1 |
20100152949 | Nunan et al. | Jun 2010 | A1 |
20100156709 | Zhang et al. | Jun 2010 | A1 |
20100156712 | Pisz et al. | Jun 2010 | A1 |
20100156718 | Chen | Jun 2010 | A1 |
20100159943 | Salmon | Jun 2010 | A1 |
20100161179 | McClure et al. | Jun 2010 | A1 |
20100161211 | Chang | Jun 2010 | A1 |
20100161568 | Xiao | Jun 2010 | A1 |
20100171660 | Shyr et al. | Jul 2010 | A1 |
20100171757 | Melamed | Jul 2010 | A1 |
20100185364 | McClure | Jul 2010 | A1 |
20100185366 | Heiniger et al. | Jul 2010 | A1 |
20100185389 | Woodard | Jul 2010 | A1 |
20100188285 | Collins | Jul 2010 | A1 |
20100188286 | Bickerstaff et al. | Jul 2010 | A1 |
20100189163 | Burgi et al. | Jul 2010 | A1 |
20100201829 | Skoskiewicz et al. | Aug 2010 | A1 |
20100207811 | Lackey | Aug 2010 | A1 |
20100210206 | Young | Aug 2010 | A1 |
20100211248 | Craig et al. | Aug 2010 | A1 |
20100211315 | Toda | Aug 2010 | A1 |
20100211316 | DaSilva | Aug 2010 | A1 |
20100220004 | Malkos et al. | Sep 2010 | A1 |
20100220008 | Conover et al. | Sep 2010 | A1 |
20100222076 | Poon et al. | Sep 2010 | A1 |
20100225537 | Abraham | Sep 2010 | A1 |
20100228408 | Ford | Sep 2010 | A1 |
20100228480 | Lithgow et al. | Sep 2010 | A1 |
20100231443 | Whitehead | Sep 2010 | A1 |
20100231446 | Marshall et al. | Sep 2010 | A1 |
20100232351 | Chansarkar et al. | Sep 2010 | A1 |
20100235093 | Chang | Sep 2010 | A1 |
20100238976 | Young | Sep 2010 | A1 |
20100241347 | King et al. | Sep 2010 | A1 |
20100241353 | Park | Sep 2010 | A1 |
20100241441 | Page et al. | Sep 2010 | A1 |
20100241864 | Kelley et al. | Sep 2010 | A1 |
Number | Date | Country |
---|---|---|
07244150 | Sep 1995 | JP |
WO9836288 | Aug 1998 | WO |
WO0024239 | May 2000 | WO |
WO03019430 | Mar 2003 | WO |
WO2005119386 | Dec 2005 | WO |
WO2009066183 | May 2009 | WO |
WO2009082745 | Jul 2009 | WO |
WO2009126587 | Oct 2009 | WO |
WO2009148638 | Dec 2009 | WO |
WO2010005945 | Jan 2010 | WO |
WO2010104782 | Sep 2010 | WO |
WO2011014431 | Feb 2011 | WO |
Number | Date | Country | |
---|---|---|---|
20110015817 A1 | Jan 2011 | US |