Integrated multi-sensor control system and method

Information

  • Patent Grant
    RE47648
  • Patent Number
    RE47,648
  • Date Filed
    Thursday, February 11, 2016
  • Date Issued
    Tuesday, October 15, 2019
Abstract
A GNSS integrated multi-sensor guidance system for a vehicle assembly includes a suite of sensor units, including a global navigation satellite system (GNSS) sensor unit comprising a receiver and an antenna. An inertial measurement unit (IMU) outputs vehicle dynamic information for combining with the output of the GNSS unit. A controller with a processor receives the outputs of the sensor suite and computes steering solutions, which are utilized by vehicle actuators, including an automatic steering control unit connected to the vehicle steering for guiding the vehicle. The processor is programmed to define multiple behavior-based automatons comprising self-operating entities in the guidance system, which perform respective behaviors using data output from one or more sensor units for achieving the behaviors. A GNSS integrated multi-sensor vehicle guidance method is also disclosed.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


The present invention relates generally to a versatile integrated multi-sensor apparatus which combines positional data from a variety of sensor types, including a GNSS system. The various sensor data are ranked according to confidence level, and the ranked data are used to automatically create a planned path and steer a vehicle along that planned path. Elements of the present invention allow the system to be easily interchanged among a multitude of vehicles and to communicate with other vehicles to allow for autonomous cooperative vehicle behavior building and task delegation.


2. Description of the Related Art


Global navigation satellite system (GNSS) guidance and control are widely used for vehicle and personal navigation and a variety of other uses involving precision location in geodesic reference systems. GNSS, which includes the Global Positioning System (GPS) and other satellite-based positioning systems, has progressed to sub-centimeter accuracy with known correction techniques, including a number of commercial satellite based augmentation systems (SBASs).


For even more accurate information, higher frequency signals with shorter wavelengths are required. It is known in the art that by using GNSS satellites' carrier phase transmissions, and possibly carrier phase signal components from base reference stations or satellite based augmentation systems (SBAS), including the Wide Area Augmentation System (WAAS) (U.S.), and similar systems such as EGNOS (European Union) and MSAS (Japan), a position may readily be determined to within millimeters. When accomplished with two antennas at a fixed spacing, an angular rotation may be computed using the position differences. In an exemplary embodiment, two antennas placed in the horizontal plane may be employed to compute a heading (rotation about a vertical axis) from a position displacement. Heading information, combined with position, either differentially corrected (DGPS) or carrier phase corrected real-time kinematic (RTK), provides the feedback information desired for a proper control of the vehicle direction.
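
As an illustration of the dual-antenna heading computation described above, the sketch below derives a heading from the horizontal displacement between two antenna positions. It is a minimal example under the stated assumption that both fixes have already been converted to local east/north coordinates; the function name and sample coordinates are illustrative, not part of the disclosed system.

```python
import math

def heading_from_antennas(east_a, north_a, east_b, north_b):
    """Compute heading (rotation about the vertical axis) from the
    horizontal displacement between two antennas at a fixed spacing.

    Inputs are local east/north coordinates in meters (e.g., derived from
    carrier-phase GNSS fixes). Returns degrees clockwise from north.
    """
    d_east = east_b - east_a
    d_north = north_b - north_a
    heading_rad = math.atan2(d_east, d_north)  # atan2(E, N) gives a compass bearing
    return math.degrees(heading_rad) % 360.0

# Example: antenna B mounted 2 m ahead of antenna A and offset 0.2 m east.
print(heading_from_antennas(0.0, 0.0, 0.2, 2.0))  # roughly 5.7 degrees
```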


Another benefit achieved by incorporating a GNSS-based heading sensor is the elimination or reduction of drift and biases resultant from a gyro-only or other inertial sensor approach. Yet another advantage is that heading may be computed while movable equipment is stopped or moving slowly, which is not possible in a single-antenna, GNSS-based approach that requires a velocity vector to derive a heading. Yet another advantage of incorporating a GNSS-based heading sensor is independence from a host vehicle's sensors or additional external sensors. Thus, such a system is readily maintained as equipment-independent and may be moved from one vehicle to another with minimal effort. Yet another exemplary embodiment of the sensor employs global navigation satellite system (GNSS) sensors and measurements to provide accurate, reliable positioning information. GNSS sensors include, but are not limited to, GPS, the Global Navigation Satellite System (GLONASS), the Wide Area Augmentation System (WAAS) and the like, as well as combinations including at least one of the foregoing.


An example of a GNSS is the Global Positioning System (GPS) established by the United States government, which employs a constellation of 24 or more satellites in well-defined orbits at an altitude of approximately 20,200 km. These satellites continually transmit microwave L-band radio signals in two frequency bands, centered at 1575.42 MHz and 1227.6 MHz, denoted as L1 and L2 respectively. These signals include timing patterns relative to the satellite's onboard precision clock (which is kept synchronized by a ground station) as well as a navigation message giving the precise orbital positions of the satellites, an ionosphere model and other useful information. GPS receivers process the radio signals, computing ranges to the GPS satellites, and by triangulating these ranges, the GPS receiver determines its position and its internal clock error.
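
The patent does not spell out the receiver-side computation, but the standard approach is an iterative least-squares solution for position and clock bias from four or more pseudoranges. The sketch below is a textbook Gauss-Newton version of that idea, offered only as background; the array inputs and iteration count are assumptions.

```python
import numpy as np

def solve_position(sat_positions, pseudoranges, iterations=10):
    """Estimate receiver ECEF position (m) and clock bias (m) from
    satellite ECEF positions (N x 3 array) and measured pseudoranges (N,)
    using Gauss-Newton least squares.
    """
    x = np.zeros(4)                              # [X, Y, Z, clock_bias_m]
    for _ in range(iterations):
        ranges = np.linalg.norm(sat_positions - x[:3], axis=1)
        predicted = ranges + x[3]                # geometric range plus clock bias
        residuals = pseudoranges - predicted
        # Jacobian rows: negative unit line-of-sight vectors, and 1 for the bias term.
        H = np.hstack([-(sat_positions - x[:3]) / ranges[:, None],
                       np.ones((len(pseudoranges), 1))])
        dx, *_ = np.linalg.lstsq(H, residuals, rcond=None)
        x += dx
    return x[:3], x[3]
```

With four or more satellites in view the iteration typically converges in a handful of steps, even from a zero initial guess.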


In standalone GPS systems that determine a receiver's antenna position coordinates without reference to a nearby reference receiver, the process of position determination is subject to errors from a number of sources. These include errors in the GPS satellite's clock reference, the location of the orbiting satellite, ionosphere induced propagation delay errors, and troposphere refraction errors. The overall positional signal is weakened with each satellite target lost. These targets may be lost due to obstructions such as trees, hills, or merely because the satellite has orbited out of view.


To overcome these positioning errors of standalone GPS systems, many positioning applications have made use of data from multiple GPS receivers. Typically, in such applications, a reference or base receiver, located at a reference site having known coordinates, receives the GPS satellite signals simultaneously with the receipt of signals by a remote or rover receiver. Depending on the separation distance between the two GPS receivers, many of the errors mentioned above will affect the satellite signals equally for the two receivers. By taking the difference between signals received both at the reference site and the remote location, these errors are effectively eliminated. This facilitates an accurate determination of the remote receiver's coordinates relative to the reference receiver's coordinates. Additional sensors, such as an inertial measurement unit which may include a gyroscope, may also be used to support weak GNSS positional data. Such additional sensors are, however, prone to losing calibration and must then be corrected.


Differential global navigation satellite system (DGNSS) guidance utilizes a localized base receiver of known location in combination with a rover receiver on a moving vehicle for obtaining accurate vehicle positions from GNSS data. Differential positioning, using base and rover receivers, provides more accurate positioning information than standalone systems because the satellite ranging signal transmission errors tend to affect the base and rover receivers equally and therefore can be cancelled out in computing position solutions. In other words, the base-rover position signal “differential” accurately places the rover receiver “relative” to the base receiver. Because the “absolute” geo-reference location of the fixed-position base receiver is precisely known, the absolute position of the rover receiver can be computed using the base receiver's known absolute position and the position of the rover receiver relative thereto.
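
As a deliberately simplified, position-domain sketch of the differencing idea described above (real DGNSS implementations typically difference the ranging measurements themselves), the base receiver's error, observed against its surveyed coordinates, is applied as a correction to the rover. The function and variable names are illustrative assumptions.

```python
def dgnss_rover_position(base_known, base_measured, rover_measured):
    """Apply a simple position-domain differential correction.

    base_known    : surveyed (absolute) base station coordinates
    base_measured : base position as currently reported by its receiver
    rover_measured: rover position as reported by its receiver

    Errors common to both receivers over a short baseline (satellite clock
    and orbit errors, ionosphere, troposphere) appear in both measured
    positions and largely cancel in the correction.
    """
    correction = tuple(k - m for k, m in zip(base_known, base_measured))
    return tuple(r + c for r, c in zip(rover_measured, correction))
```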


Differential GPS is well known and exhibits many forms. GPS applications have been improved and enhanced by employing a broader array of satellites such as GNSS and WAAS. For example, see commonly assigned U.S. Pat. No. 6,469,663 to Whitehead et al. titled Method and System for GPS and WAAS Carrier Phase Measurements for Relative Positioning, dated Oct. 22, 2002, the disclosures of which are incorporated by reference herein in their entirety. Additionally, multiple receiver DGPS has been enhanced by utilizing a single receiver to perform differential corrections. For example, see commonly assigned U.S. Pat. No. 6,397,147 to Whitehead titled Relative GPS Positioning Using A Single GPS Receiver With Internally Generated Differential Correction Terms, dated May 28, 2002, the disclosures of which are incorporated by reference herein in their entireties.


It is not uncommon to utilize a GNSS system in combination with an automatic-steering module linked to a vehicle's steering manifold through a steering controller unit. The guidance unit receives positional information from the GNSS unit and compares it with a pre-planned path or map. Because the GNSS positional information allows the guidance unit to know exactly where the vehicle is located along a path, it can use this information to automatically guide and steer the vehicle along this path.


A steering controller is required to accept instructions from the guidance unit and actually perform the steering controls on the vehicle. This device connects to the vehicle steering manifold and/or hydraulic steering valves. Signals from the guidance unit are delivered to the steering controller, which then commands hydraulic valves to open or close depending on the desired results.


Automatic steering systems relying on GNSS data alone tend to lose accuracy. If the system calibration is off, the steering controller may tend to over-correct, resulting in erratic turns. Additionally, loss of the GNSS signal can disrupt the automatic steering function.


SUMMARY OF THE INVENTION

Disclosed herein is a method for providing accurate and precise vehicle positioning guidance and control with automatic steering capabilities. The present invention utilizes a series of separate sensors which may serve as temporary reliable guidance devices when GNSS signals are weak, and are recalibrated when GNSS signals are strong. This reliable positioning information gathering allows multiple vehicles to operate in cooperation with each other using autonomous task delegation and control. A versatile system is described that facilitates a number of precise steering tasks for a variety of functions using proportional hydraulic control and state-of-the-art GNSS positional systems.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings illustrate the principles of the present invention and an exemplary embodiment thereof.



FIG. 1 is an isometric view of a tractor demonstrating the preferred embodiment.



FIG. 1A is an isometric view of a tractor demonstrating the three axes of orientation (X, Y and Z) and three possible directions of rotation (pitch, roll, and yaw).



FIG. 2 is an alternative line diagram demonstrating the relationship between devices in an embodiment of the invention.



FIG. 3 is an alternative line diagram demonstrating the relationship between devices in an embodiment of the invention.



FIG. 4 is an alternative line diagram demonstrating the relationship between devices in an embodiment of the invention.



FIG. 5 is an alternative line diagram demonstrating the relationship between devices in an embodiment of the invention.



FIG. 6 is an alternative line diagram demonstrating the relationship between devices in an embodiment of the invention.



FIG. 7 is an alternative line diagram demonstrating the relationship between devices in an embodiment of the invention.



FIG. 8 is an alternative line diagram demonstrating the relationship between devices in an embodiment of the invention.



FIG. 9 is an alternative line diagram demonstrating the relationship between devices in an embodiment of the invention.



FIG. 10 is a line diagram demonstrating the step-by-step method by which the sensor suite determines confidence levels of various sensors.



FIGS. 11A-G demonstrate various path-finding, path-creating, and object avoidance possibilities available when a tractor is equipped with the present invention.





DETAILED DESCRIPTION OF THE PREFERRED ASPECTS

I. Introduction, Environment, and Preferred Embodiment


Generally, a preferred embodiment of the present invention consists of components which allow a farming vehicle, with or without an attached farming implement, to automatically guide itself around a field and perform a plurality of functions, leading to precision farming. Said vehicle may be in constant communication with other vehicles similarly equipped for the same or different tasks. The vehicles within such a network are capable of making decisions amongst themselves about where to go and what to do to best perform assigned tasks based on the global position of each vehicle relative to each other and the location of said tasks.


The preferred embodiment or the integrated multi-sensor guidance system (guidance system) 2, as shown in FIG. 1, includes a vehicle 4, which may be equipped with a farming implement 6, a sensor suite 7, a guidance unit 10 capable of versatile path guidance, a steering controller 12 providing proportional hydraulic control, a hydraulic steering manifold 14, a wheel angle/speed sensor (WAS) 16, and an implement steering manifold 50. Additionally, the guidance system 2 includes a base station antenna 18 to communicate with a differential receiver 20, a GNSS receiver 22 connected to a plurality of antennas 24 located on the vehicle 4, and an inertial measurement unit (IMU) 26 for providing additional information to said guidance unit 10. Also included is a user interface 28 within the cab of the vehicle 4 allowing the driver of that vehicle to manually input commands into the guidance system 2 and override automatic control.


The preferred embodiment of the present invention has at least four particular applications. First, there is a command center approach that can be applied, where the guidance system is a one-time capital investment that can be moved and used with each piece of farming equipment, regardless of the season or the task being performed. Second, a highly accurate yet economical automatic steering application is available. Such an application can allow for high accuracy work to be performed 24 hours a day, 7 days a week with limited stress on human drivers. The third particular application of the present invention deals with sectional control of implements; that is, the guidance unit can selectively shut off portions of the working implement where overlap would otherwise occur. Finally, site-specific farming using variable rate control can be applied. Depending on the site and the crop being grown, the system can fluctuate how much work the implement does, whether that be spraying, seeding, or tilling.



FIGS. 11A-G demonstrate the versatility of the automatic steering capabilities of the present invention on straight, contour, and circle pivot driving paths. These figures demonstrate a vehicle being self-driven around a series of different path-types 64 which are automatically generated by the guidance unit 10 of the preferred embodiment based on information both manually input and gathered by the sensor suite 7. These figures demonstrate how the guidance system 2 will recognize field borders 62 and obstacles 66. When the planned path 64 encounters an obstacle 66, the system will either automatically create an alternative path 72 or return manual control to the vehicle driver. When the planned path 64 encounters the field border 62, the system will automatically shut off all implement controls and either perform an automatic turn in the headlands 68 or return manual vehicle control to the vehicle operator.
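
The branching behavior described for FIGS. 11A-G can be pictured as simple dispatch logic like the sketch below. The event names and the `guidance` object with its methods are hypothetical stand-ins for the patent's reference numerals (planned path 64, field border 62, obstacle 66, alternative path 72, headland turn 68), not an actual API.

```python
def handle_path_event(event, guidance):
    """Illustrative dispatch for the path events described above."""
    if event == "obstacle":                              # obstacle 66 detected on path 64
        alternative = guidance.plan_alternative_path()   # alternative path 72
        if alternative is not None:
            guidance.follow(alternative)
        else:
            guidance.return_manual_control()
    elif event == "field_border":                        # planned path reaches border 62
        guidance.shut_off_implement()                    # stop all implement controls
        if guidance.can_turn_in_headlands():
            guidance.execute_headland_turn()             # automatic headland turn 68
        else:
            guidance.return_manual_control()
```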


II. Sensor Suite


The sensor suite 7 is comprised of a plurality of sensors, including at least a GNSS system 8, a wheel angle sensor (WAS) 16 and an inertial measurement unit (IMU) 26. Additional sensors may include a video camera unit oriented in the vehicle towards the direction of travel. For example, the video camera unit can be oriented towards a landmark on the horizon, which can provide an aiming point or point of reference corresponding to a predetermined geo-reference location. Other sensors in the sensor suite 7 can include a radar unit for ranging and direction finding, e.g., to a particular radar target. A laser unit, radio input, telemetry, and other sensor units capable of aiding in precision position and trajectory mapping can also be utilized. This suite of sensors gathers position and heading data and relays this information to the guidance unit 10 discussed in detail in section III.


In the preferred embodiment of this invention, the GNSS system 8 will be assigned the highest confidence level as a default, and is thus a primary and important element of this guidance system 2. Global navigation satellite systems (GNSS) are broadly defined to include GPS (U.S.), Galileo (proposed), GLONASS (Russia), Beidou/Compass (China, proposed), IRNSS (India, proposed), QZSS (Japan, proposed) and other current and future positioning technology using signals from satellites, with or without augmentation from terrestrial sources. Inertial navigation systems (INS) include gyroscopic (gyro) sensors, accelerometers and similar technologies for providing output corresponding to the inertia of moving components in all axes, i.e., through six degrees of freedom (positive and negative directions along transverse X, longitudinal Y and vertical Z axes). Yaw, pitch and roll refer to moving component rotation about the Z, X and Y axes respectively. Said terminology will include the words specifically mentioned, derivatives thereof and words of similar meaning.


Disclosed herein in an exemplary embodiment is a sensor system for vehicle guidance. The sensor system can utilize a plurality of GNSS code or carrier phase differenced antennas to derive attitude information, herein referred to as a GNSS attitude system. Moreover, the GNSS attitude system may optionally be combined with one or more rate gyro(s) used to measure turn, roll or pitch rates and to further calibrate bias and scale factor errors within these gyros. In an exemplary embodiment, the rate gyros and GNSS receiver/antenna are integrated together within the same unit, to provide multiple mechanisms to characterize a vehicle's motion and position to make a robust vehicle steering control mechanism.


The preferred embodiment of the present invention includes a vehicle 4, an implement 6, and a sensor suite 7. The sensor suite is comprised of a plurality of sensors, containing at least a GNSS system 8, a WAS 16, and an IMU 26. Said GNSS system 8 is further comprised of a receiver 22, a differential receiver 20, a base station antenna 18, and a plurality of antennas 24 located on said vehicle 4 and implement 6. The GNSS system provides position information to the guidance unit 10. This information can be used for creating a path 64 around a field 60 and establishing alternatives 72 to said path when obstacles 66 are encountered.


The sensor suite 7 will integrate all connected sensors with the ultimate result being robust tight wheel control; that is, wheel and vehicle control at a very precise level. This sensor integration implements a confidence level or reliance level checklist by which certain sensors are given higher priority when position information is used, unless those sensors report a weak signal or no signal. Higher-priority sensor systems are used to recalibrate lower-priority systems while said higher-priority systems remain at their default signal levels. This ensures that when the higher-priority systems lose signal, the lower-priority systems have been calibrated in time to compensate for the higher-priority system during the short period of reduced signal.


III. Guidance Unit 10


A guidance unit 10, otherwise known as an electronic control unit (ECU), can be put to several different uses on an agricultural vehicle. One common use is to provide heading data based on a pre-planned or calculated path 64. The guidance unit might have the path manually input into the unit, or it might be capable of receiving GNSS positional data and information regarding a particular piece of land and calculating a path based on this information. The guidance unit 10 can display information to the vehicle's driver through a user interface (UI) 28 and allow the driver to manually steer the vehicle along the displayed path. A more precise application of such a guidance unit 10 is to introduce automatic steering to a farming vehicle 4. The vehicle 4 will then guide itself along said calculated or pre-planned path 64 with greater precision than manual steering could provide.


The guidance unit 10 can be put to additional uses as well, including automated implement control and advanced mapping and data management. The automated implement control comprises sectional implement control, including application rate control and variable rate control. The advanced mapping and data management, as mentioned above, includes the system's ability to take known landscape information from the GNSS system and store that information for processing during jobs. This leads to real-time map creation as the vehicle guides itself over the piece of land to be worked.


The preferred embodiment of the present invention includes the sensor suite 7 mentioned above which is connected to the guidance unit 10. The guidance unit 10 interprets positional data received from the sensor suite 7 and puts it to use in several ways. The guidance unit 10 is further divided into at least a logic portion 30 and a guidance portion 32. The guidance unit receives data from the sensor suite 7, determines what to do with the data in the logic portion 30, including computing a path 64 or selectively controlling the implement, and then transmits that data through the guidance portion 32 to the steering controller 12 and the implement steering controller 50.


As demonstrated in FIG. 10, a confidence loop 100 is employed by the guidance unit 10 across multiple sensors in the sensor suite 7 to determine which sensor systems should be relied on when determining position and heading information. The confidence loop 100 is comprised of several steps. The start step 102 is initiated when the guidance system 2 is booted up. This can either be directly connected to the start-up of the vehicle 4 to which the system is attached, or completely independent of that vehicle. The system is then initialized at 104 and a default process 106 is begun. In this default process, the sensor systems are placed in a default reliance list whereby a particular sensor is given a higher confidence than other sensors and that high-confidence sensor is used to calibrate all other sensors. This highest confidence sensor is also used for initial position and heading information gathering and to instruct the guidance unit 10. In a preferred embodiment of the invention, the GNSS system 8 could serve by default as the initial highest-confidence system, followed by the IMU 26 and the WAS 16.


Once the default process 106 is begun, the loop 100 begins a sensor signal check 108. During this step, each sensor's signal is checked internally to determine whether it is communicating properly with the rest of the guidance system 2 and whether incoming signals are present. For example, the GNSS system 8 will be checked several times per second to determine the strength of the satellite signal being received by the antennas 24 and receiver 22. These sensor signal levels are then compared 110 with the default signal levels that are expected. If these detected signals are equal to or exceed the strength of the expected signal, a “yes” command is entered and the sensor signal check begins again.


If, however, the detected signal is lower than the expected default signal, a “no” command is reported and the loop 100 enters a confidence level reduction step 112 whereby the particular sensor's confidence level is reduced according to the strength of the detected signal. A confidence level comparison step 114 is then performed, comparing the updated confidence levels of all sensors in the sensor suite 7. If the result of the sensor-reliance reordering step 116 is a change in reliance levels, a “yes” command is returned and the reliance priority list is reordered at 118. This occurs when the confidence level of a particular sensor drops so low, due to a weak or lost signal, that its information is no longer reliable. That sensor drops down in the reliance list and the new most reliable sensor is used to produce position and heading information until a sensor signal check 108 results in the original sensor regaining its signal and thus its priority level. If the result of the sensor-reliance reordering step 116 is “no,” then the reliance list is not reordered and the confidence loop 100 returns to the sensor signal checking step 108.
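
A compact sketch of one pass through the confidence loop 100 (steps 108-118) follows. The `read_signal` callback, the signal scale, and the three-sensor default list are assumptions made only to keep the control flow concrete; they are not the claimed implementation.

```python
DEFAULT_PRIORITY = ["GNSS", "IMU", "WAS"]                # default reliance list (step 106)
EXPECTED_SIGNAL = {"GNSS": 1.0, "IMU": 1.0, "WAS": 1.0}  # expected default signal levels

def confidence_step(read_signal, confidence, priority):
    """Run one sensor-signal check and reliance-list update.

    read_signal(name) is an assumed callback returning the named sensor's
    current signal quality on the same scale as EXPECTED_SIGNAL. Returns
    the sensor currently trusted for position/heading and the priority list.
    """
    for name in priority:                                # sensor signal check (108)
        signal = read_signal(name)
        if signal >= EXPECTED_SIGNAL[name]:              # compare with default level (110)
            confidence[name] = EXPECTED_SIGNAL[name]
        else:
            confidence[name] = signal                    # confidence level reduction (112)
    reordered = sorted(priority, key=lambda n: confidence[n], reverse=True)  # comparison (114)
    if reordered != priority:                            # reordering decision (116)
        priority[:] = reordered                          # reorder the reliance list (118)
    return priority[0], priority                         # most reliable sensor drives guidance

# Typical use: initialize once (steps 102-106), then call confidence_step() several
# times per second, recalibrating lower-priority sensors against priority[0].
```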


This process of steps ensures that only the most reliable sensors are used to determine current vehicle position and heading and to recalibrate less reliable sensors. The listed steps are an example of such a confidence loop 100 and are not intended to be the only means to achieve the desired results. Additional or fewer steps may be used to return an appropriate confidence or reliance level list.


As an example of this process, the guidance unit 10 is connected to the steering controller 12 and the WAS 16. The guidance unit can relay correction information from the GNSS positioning system 8 to the WAS for calibration purposes. The WAS 16 is initially calibrated with a zero-heading and receives information from the steering controller 12 regarding turn data, and in turn relays actual data back to the steering controller and the guidance unit. The guidance unit knows exact position and heading information because of data received from the GNSS system 8 and other sensors high on the reliability list. By comparing the highly reliable GNSS information with the less reliable WAS information, the guidance unit can tell whether the WAS is correct or not. If it is determined that the WAS information is incorrect, the guidance unit can recalibrate the WAS and create a new zero-heading. In the alternative, if the confidence loop 100 were to determine that the GNSS system 8 had a weak signal at a particular point, the guidance unit 10 could rely on data from the IMU 26 and/or WAS 16 until the GNSS signal returns. These additional sensors are better suited for short-term accurate guidance, but quickly degrade and must be recalibrated.
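
The WAS recalibration described in this example might look like the following sketch, which treats the GNSS-derived heading rate as truth while GNSS confidence is high and infers the expected wheel angle from a simple bicycle model. The model, tolerance, and parameter names are assumptions for illustration only.

```python
import math

def recalibrate_was(was_reading_deg, zero_offset_deg, gnss_heading_rate_dps,
                    speed_mps, wheelbase_m, tolerance_deg=0.5):
    """Re-zero the wheel angle sensor (WAS) against GNSS-derived heading data.

    Uses the bicycle-model relation heading_rate = (v / L) * tan(steer_angle)
    to compute what the wheel angle should be, then shifts the WAS zero
    offset when the disagreement exceeds the tolerance.
    """
    if speed_mps < 0.5:
        return zero_offset_deg                       # too slow for a reliable comparison
    yaw_rate_rps = math.radians(gnss_heading_rate_dps)
    expected_angle_deg = math.degrees(math.atan(yaw_rate_rps * wheelbase_m / speed_mps))
    error_deg = (was_reading_deg - zero_offset_deg) - expected_angle_deg
    if abs(error_deg) > tolerance_deg:
        return zero_offset_deg + error_deg           # establish a new zero-heading offset
    return zero_offset_deg
```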


IV. Steering Controller 12


The steering controller 12 is the third major component of the guidance system 2. The steering controller is designed to accept guidance inputs and transform those inputs into outputs that result in actual motion and steering of the vehicle 4.


The steering controller 12 portion of the guidance system 2 is designed to transmit and receive steering information from all associated parts and to provide the means for actually controlling the direction of the vehicle 4 based upon position and guidance data gathered by the sensor suite 7 and interpreted by the guidance unit 10. The steering controller is directly connected to the guidance unit 10, the WAS 16, the hydraulic steering manifold 14, and the implement controller 50. The steering controller 12 is the primary step for transforming data from the guidance system into actual movement of the vehicle itself.


Although the WAS 16 is part of the sensor suite 7 as discussed above, there is a direct connection between the WAS 16 and the steering controller 12. This results in a “wheel loop” whereby the steering controller 12 transmits steering commands to the hydraulic steering manifold 14, which proceeds to turn the wheels of the vehicle 4 in a given direction. The angle of the turn is reported back to the steering controller, which may order further steering corrections depending on the pre-planned path 64. This angle can also be reported to the guidance unit 10, where it is compared with other sensors in the confidence loop 100. Assuming another sensor, such as the GNSS system 8, is currently at the top of the reliance list, the WAS may be recalibrated if it turns out that the applied turning angle was incorrect when applied to the calculated path 64.
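
One way to picture the “wheel loop” is a proportional correction running between the steering controller, the hydraulic manifold, and the WAS, as in the hedged sketch below; the gain, saturation limit, and valve-command callback are illustrative assumptions rather than the product's control law.

```python
def wheel_loop_step(commanded_angle_deg, measured_angle_deg,
                    send_valve_command, gain=0.8, max_flow=1.0):
    """Single iteration of a simple wheel loop.

    commanded_angle_deg : wheel angle requested by the guidance unit
    measured_angle_deg  : angle reported back by the WAS
    send_valve_command  : assumed callback driving the hydraulic manifold with a
                          signed flow demand clipped to [-max_flow, max_flow]
    """
    error = commanded_angle_deg - measured_angle_deg
    demand = max(-max_flow, min(max_flow, gain * error))
    send_valve_command(demand)      # open or close the proportional valves
    return error                    # reported upstream for path-level correction
```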


V. Automaton Control


The process of controlling several machines as automatons in a smart and accurate system, such as the one presented herein, is accomplished by combining the above-described units into a single autonomous system, allowing one system to control the positioning, guidance, and workload of a fleet of agricultural vehicles.


VI. Alternative Examples of a Guidance System 2


The above sections discuss the preferred embodiment of the invention, comprising generally a sensor suite 7, a guidance unit 10 and a steering controller 12. Several alternative methods of forming the guidance system 2 exist. A primary example is using the GNSS system 8 to completely replace the sensor suite 7, and moving the IMU 26 to the guidance unit 10. Other examples of said guidance system 2 follow.


As shown in FIG. 3, the IMU 26 and an optional “smart” antenna 34 may be directly connected to the guidance unit 10 providing direct information used to compare position and heading information with data received from the sensor suite 7. The smart antenna 34 is a combination receiver 22 and antenna 24. The user interface (UI) 28 is connected directly to the GNSS system 8. Additionally, the WAS 16 is connected separately and entirely to the steering controller 12.



FIG. 4 provides another detailed breakdown of the guidance unit 10 and its relationship with a GNSS system 8 and the steering controller 12. The IMU 26 is composed of a plurality of Kalman filters 36 which relay information regarding the various degrees of pitch, roll, and heading of the vehicle. The guidance portion 32 is further composed of a cross-track proportional-integral-derivative (PID) component 38, a state-dependent Riccati equation (SDRE) component 40 and a heading PID component 42. The steering controller 12 is again in direct communication with the WAS 16.
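
To make the guidance portion 32 more concrete, the sketch below blends a cross-track term and a heading term into a single wheel-angle command. It is a simplified proportional blend under assumed sign conventions and gains, not the SDRE/PID structure of the actual unit.

```python
def steering_command(cross_track_m, heading_error_deg,
                     k_xt=10.0, k_hdg=0.5, max_steer_deg=35.0):
    """Blend cross-track and heading errors into a wheel-angle command.

    cross_track_m     : signed lateral offset from the planned path 64 (m, + = right of path)
    heading_error_deg : vehicle heading minus path heading (deg, + = pointed right of path)
    k_xt              : degrees of desired heading change per meter of offset
    k_hdg             : wheel-angle degrees per degree of heading error
    """
    # Convert the lateral offset into a desired heading correction, capped so a
    # large offset commands at most a 45-degree approach angle back to the path.
    desired_heading_change = max(-45.0, min(45.0, -k_xt * cross_track_m))
    total_error = desired_heading_change - heading_error_deg
    steer = k_hdg * total_error
    return max(-max_steer_deg, min(max_steer_deg, steer))
```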



FIG. 5 demonstrates another alternative example of the relationship between the guidance unit 10 and the other elements of the guidance system 2. In this example the guidance unit 10 is independently connected to several unique elements, including a GNSS system 8, an implement control system 50, a variable rate transfer controller 52, personal computer office software 44, and a steering controller 12. The steering controller is separately connected to steering sensors 46 and the steering interface 48. The steering sensors may in turn contain WAS 16 or other sensor types. An important aspect demonstrated in this figure is the relationship between the guidance unit 10 and cooperative PC office software 44. This relationship is a key element because it allows the guidance unit 10 to be updated, controlled, and calibrated through a connection with a standard office PC. This allows the end-user to create paths, identify field boundaries, and update equipment software while using familiar PC technology instead of new, single-application user interfaces associated solely with the guidance unit.



FIG. 6 demonstrates another alternative example of the present invention. In this example, the guidance unit 10 contains a majority of the standard system elements, including the GNSS system 8, a UI 28, a variable rate controller 52, guidance 32 and steering 46 sensors, an accessory input 54, a mapping device 56, and a section controller 58 containing input/output connections. The accessory input device 54 allows the guidance controller to connect to external devices such as a weather radio, wireless service, video sensors, and monitoring devices. A wireless receiver 22 is connected to the GNSS 8 portion of the guidance unit 10 externally. A steering controller 12 is also connected to the guidance 32 and steering 46 sensors externally. The steering controller has additional connections to control the vehicle steering manifold 14.



FIG. 8 demonstrates another alternative example of the present invention. As is typical, the guidance unit 10 is the central element. Two steering controllers 12 are connected to the guidance unit, providing options to the guidance unit. A smart antenna 34, UI 28, and GNSS system 8 are connected to the guidance unit 10 along a first connection, and a second GNSS system 8, along with a virtual terminal (VT) and/or an original equipment manufacturer (OEM) terminal, is connected along a second connection. These two connections again provide redundant inputs on which the guidance unit 10 can base its path-making and steering decisions.



FIG. 9 demonstrates another example of the present invention generally comprising a kit for installation in an existing vehicle, such as a tractor with a hydraulic steering system. This example is again divided into three main components; the GNSS system 8, the guidance unit 10, and the steering controller 12. The GNSS system 8 includes an internal receiver 22 and at least one external antenna 24, along with various input/output connections. A CAN connection links the GNSS system 8 to the guidance unit 10. The guidance unit includes an internal IMU and an optional connection to an external smart antenna 34. The guidance unit 10 connects to the steering controller 12 through another CAN connection. The steering controller 12 is connected to and controls the vehicle steering valve/manifold 14. An analog connection links the WAS to the steering controller 12. Additionally, several switches are connected to the steering controller that will cancel auto-steer and return the vehicle to manual steering, stopping the vehicle immediately unless the driver is ready to continue. These switches include, but are not limited to, a steering wheel switch that detects when the operator's hand touches the steering wheel, a magnetic shutoff switch that is attached to the operator's seat and can determine if and when the operator stands to leave the seat, and a manual shut-off switch.
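
A short sketch of the shutoff-switch logic described for this kit configuration is given below; the switch names mirror the description, while the boolean polling interface is an assumption.

```python
def autosteer_permitted(steering_wheel_touched, operator_seated, manual_shutoff_on):
    """Return True only when every safety interlock allows automatic steering.

    steering_wheel_touched : steering-wheel switch senses the operator's hand
    operator_seated        : magnetic seat switch reports the operator present
    manual_shutoff_on      : manual shut-off switch has been thrown
    """
    if steering_wheel_touched or manual_shutoff_on or not operator_seated:
        return False    # cancel auto-steer and return the vehicle to manual steering
    return True
```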



An integrated multi-sensor guidance system for a vehicle assembly including a steering subsystem, which guidance system includes: said vehicle assembly having a dynamic attitude comprising a geo-reference location, vehicle assembly orientation and vehicle assembly speed; a processor with multiple sensor inputs and actuator outputs; a suite of sensor units each connected to a respective sensor input.



Said sensor unit suite includes a GNSS unit with an antenna and a receiver connected to said antenna, said GNSS unit providing output signals corresponding to the GNSS-defined locations of said vehicle assembly dynamic attitude to a respective processor input.



A guidance controller is adapted for receiving signal input and generating control output based on said signal input; a data storage device including memory storage; and a suite of actuator units each connected to a respective actuator output.



Said sensor unit suite includes an inertial measurement unit (IMU) sensor providing output signals corresponding to an inertial aspect of a dynamic attitude of said vehicle assembly to a respective processor input.



Said guidance controller is adapted for receiving inertial measurement signals and integrating said inertial measurement signals with said GNSS-based positioning signals. Said processor is programmed to determine variable confidence levels in real time for each said sensor unit based on its current relative performance; and said processor is programmed to utilize said sensor unit outputs proportionally based on their respective confidence levels in generating said control output signals.
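
One simple reading of using sensor outputs “proportionally based on their respective confidence levels” is a confidence-weighted average, sketched below; the weighting scheme is an illustrative assumption rather than the claimed implementation.

```python
def fuse_estimates(estimates):
    """Confidence-weighted fusion of scalar estimates (e.g., heading in degrees).

    estimates: list of (value, confidence) pairs with confidence >= 0.
    """
    total = sum(conf for _, conf in estimates)
    if total == 0:
        raise ValueError("no sensor reports a usable confidence level")
    return sum(value * conf for value, conf in estimates) / total

# Example: GNSS heading at full confidence, IMU and WAS-derived headings downweighted.
fused_heading = fuse_estimates([(90.2, 1.0), (91.0, 0.3), (89.0, 0.1)])
```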



Said processor is programmed to define multiple behavior-based automatons comprising self-operating entities in said guidance system, said automatons performing respective behaviors using data output from one or more sensor units for achieving said behaviors and wherein one or more sensor units provide the same or similar data.



Each said automaton has an accepting interface for accepting requests from other automatons; a requesting interface for making requests to another automaton; a knowledge input for receiving a behavioral definition for affecting the behavior of the automatons; a data input for receiving input data; and a data output for sending out the data.
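
The four automaton interfaces listed above can be pictured as a small class; the class name, queue-based accepting interface, and callable behavioral definition are assumptions used only to illustrate the structure.

```python
from collections import deque

class Automaton:
    """Self-operating entity with the interfaces described above."""

    def __init__(self, behavior=None):
        self.behavior = behavior             # knowledge input: behavioral definition
        self.inbox = deque()                 # accepting interface: queued requests
        self.latest_data = None              # data input

    def accept_request(self, request):       # accepting interface
        self.inbox.append(request)

    def request_from(self, other, request):  # requesting interface
        other.accept_request(request)

    def receive_data(self, data):            # data input
        self.latest_data = data

    def step(self):                          # data output: result of performing the behavior
        if self.behavior is None or self.latest_data is None:
            return None
        request = self.inbox.popleft() if self.inbox else None
        return self.behavior(self.latest_data, request)
```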



Said actuator unit suite includes a steering unit connected to said steering subsystem and receiving said control output signals from said processor. Said steering subsystem includes a steering controller, which includes a steering processor and is connected to said guidance controller. Said steering controller receives guidance signals as inputs from said guidance controller and computes steering signals as outputs; and said steering actuator receives said steering signals from said steering controller and steers said vehicle in response thereto.



Said sensor suite includes sensor units chosen from among the group comprising: a video camera unit oriented in the vehicle assembly direction of travel; a radar unit; a laser unit; radio input; telemetry; material application exclusion areas input; satellite image inputs; contour/elevation overlay inputs; prescription mapping; and a wheel angle sensor (WAS).



Said actuator suite includes actuator units chosen from among the group comprising: an implement steering unit, an implement sectional control unit, personal computer (PC) office software, material application rate control, secondary vehicle control, mapping, crop yield, and mapping skips and overlaps.



Said guidance controller is adapted for receiving and storing in said memory storage device GNSS-based positioning signals. Said processor is adapted for computing a GNSS-based guide pattern. Said guidance controller is adapted for providing output signals to a display device for displaying vehicle motion relative to guide patterns and contrasting displays of areas treated by said vehicle along previously-traveled portions of said guide patterns. Said guidance controller is adapted for calibrating and storing in said memory multiple vehicle profiles, each said profile including multiple, independent vehicle-specific automatons.



A method of vehicle control and guidance comprises the steps: providing a vehicle assembly including a steering subsystem and a dynamic attitude comprising a geo-reference location, vehicle assembly orientation, and vehicle assembly speed; providing a guidance system including a processor with multiple sensor inputs and actuator outputs, a suite of sensor units connected to a respective sensor input, a suite of actuator units connected to a respective actuator output, and a data storage device including memory storage; providing a guidance controller; inputting signal input data to said guidance controller; and generating control output signals with said guidance controller based on said signal input.



The sensor unit suite includes an inertial measurement unit (IMU) sensor providing output signals corresponding to an inertial aspect of a dynamic attitude of said vehicle assembly to a respective processor input. The method of vehicle control and guidance also includes generating inertial measurement signals with said IMU sensor; receiving the inertial measurement signals with said guidance controller; and integrating said inertial measurement signals with said GNSS-based positioning signals.



The method of vehicle control and guidance also includes determining variable confidence levels with the processor in real time for each said sensor unit based on current relative performance; and utilizing said sensor unit outputs proportionally based on the respective confidence levels in generating said control output signals.



The method of vehicle control and guidance also includes defining multiple behavior-based automatons comprising self-operating entities in said guidance system; and instructing said automatons to perform respective behaviors using data output from one or more sensor units for achieving said behaviors wherein one or more sensor units provide the same or similar data.



The method of vehicle control and guidance also includes providing each automaton with an accepting interface for accepting requests from other automatons; providing each automaton with a requesting interface for making requests to another automaton; providing each automaton with a knowledge input for receiving a behavioral definition for affecting the behavior of the automatons; providing each automaton with a data input for receiving input data; and providing each automaton with a data output for sending data.



The method of vehicle control and guidance also includes providing a steering unit connected to said steering subsystem; and receiving said control output signals at said steering unit as steering control instructions. The method of vehicle control and guidance also includes providing a steering processor connected to said guidance controller; receiving guidance signals at said steering controller as inputs from said guidance controller; computing steering signals as outputs from said steering controller; receiving said steering signals with said steering actuator; and steering said vehicle assembly in response to said steering signals.



An integrated multi-sensor guidance system for a vehicle assembly including a steering subsystem, includes: said vehicle assembly having a dynamic attitude comprising a geo-reference location, vehicle assembly orientation and vehicle assembly speed; a processor with multiple sensor inputs and actuator outputs; a suite of sensor units each connected to a respective sensor input.



Said sensor unit suite includes a GNSS unit with an antenna and a receiver connected to said antenna, said GNSS unit provides output signals corresponding to the GNSS-defined locations of said vehicle assembly dynamic attitude to a respective processor input; said sensor unit suite includes an inertial measurement unit (IMU) sensor providing output signals corresponding to an inertial aspect of a dynamic attitude of said vehicle assembly to a respective processor input.



Said guidance controller is adapted for receiving inertial measurement signals and integrating said inertial measurement signals with said GNSS-based positioning signals; said processor is programmed to determine variable confidence levels in real time for each said sensor unit based on its current relative performance.



Said processor is programmed to utilize said sensor unit outputs proportionally based on their respective confidence levels in generating said steering signals. A suite of actuator units is provided, each connected to a respective actuator output; said actuator unit suite includes a steering unit connected to said steering subsystem and receiving said steering signals from said processor.



Said processor is programmed to define multiple behavior-based automatons comprising self-operating entities in said guidance system, said automatons performing respective behaviors using data output from one or more sensor units for achieving said behaviors and wherein one or more sensor units provide the same or similar data.



Each said automaton has an accepting interface for accepting requests from other automatons; a requesting interface for making requests to another automaton; a knowledge input for receiving a behavioral definition for affecting the behavior of the automatons; a data input for receiving input data; and a data output for sending out the data.


It will be appreciated that the components of the system 2 can be used for various other applications. Moreover, the subsystems, units and components of the system 2 can be combined in various configurations within the scope of the present invention. For example, the various units could be combined or subdivided as appropriate for particular applications. The system 2 is scalable as necessary for applications of various complexities. It is to be understood that while certain aspects of the disclosed subject matter have been shown and described, the disclosed subject matter is not limited thereto and encompasses various other embodiments and aspects.

Claims
  • 1. An integrated multi-sensor guidance system for a vehicle assembly including a steering subsystem, which guidance system includes: said vehicle assembly having a dynamic attitude comprising a geo-reference location, vehicle assembly orientation and vehicle assembly speed; a processor with multiple sensor inputs and actuator outputs; a suite of sensor units each connected to a respective sensor input; said sensor unit suite includes a GNSS unit with an antenna and a receiver connected to said antenna, a wheel angle sensor unit (WAS), and an inertial measurement unit (IMU) sensor, said GNSS unit providing output signals corresponding to the GNSS-defined locations of said vehicle assembly dynamic attitude to a respective processor input; a guidance controller adapted for receiving signal input and generating control output based on said signal input; a data storage device including memory storage; a suite of actuator units each connected to a respective actuator output; said guidance controller being adapted for receiving and storing in said memory storage device GNSS-based positioning signals; said processor being adapted for computing a GNSS-based guide pattern; said guidance controller being adapted for providing output signals to a display device for displaying vehicle motion relative to guide patterns and contrasting displays of areas treated by said vehicle along previously-traveled portions of said guide patterns; said guidance controller being adapted for calibrating and storing in said memory multiple vehicle profiles, each said profile including multiple, independent vehicle-specific automatons; an accepting interface for accepting requests from other automatons; a requesting interface for making requests to another automaton; a knowledge input for receiving a behavioral definition for affecting the behavior of the automatons; a data input for receiving input data; a data output for sending output data; said processor programmed to determine different variable confidence levels in real time for each of said GNSS unit, WAS, and IMU sensor based on its current relative performance; said processor programmed to utilize said GNSS unit, WAS, and IMU sensor outputs proportionally based on their respective confidence levels in generating said control output signals; and wherein said processor is programmed to define multiple behavior-based automatons comprising self-operating entities in said guidance system, said automatons performing respective behaviors using data output from said GNSS unit, said WAS, and said IMU sensor for achieving said behaviors and wherein said GNSS unit, WAS, and IMU sensor provide at least some of the same or similar data.
  • 2. The guidance system as claimed in claim 1, wherein said guidance controller sends steering control instructions instructing said automatons to perform respective behaviors using data output from one or more of the GNSS unit, WAS, and IMU sensor.
  • 3. The guidance system as claimed in claim 1, wherein said guidance controller is adapted for receiving inertial measurement signals from the IMU and integrating said inertial measurement signals with said GNSS-based positioning signals.
  • 4. The guidance system as claimed in claim 1, wherein said actuator unit suite includes a steering unit connected to said steering subsystem and receiving said control output signals from said processor.
  • 5. The guidance system as claimed in claim 4, wherein said steering subsystem includes: a steering controller including a steering processor and connected to said guidance controller; said steering controller receiving guidance signals as inputs from said guidance controller and computing steering signals as outputs from said steering controller; and said steering actuator receiving said steering signals from said steering controller and steering said vehicle in response thereto.
  • 6. The guidance system as claimed in claim 1, wherein said sensor suite includes sensor units chosen from among the group comprising: a video camera unit oriented in the vehicle assembly direction of travel; a radar unit; a laser unit; radio input; telemetry; material application exclusion areas input; satellite image inputs; contour/elevation overlay inputs; prescription mapping; and a wheel angle sensor (WAS).
  • 7. The guidance system as claimed in claim 1, wherein said actuator suite includes actuator units chosen from among the group comprising: an implement steering unit, an implement sectional control unit, personal computer (PC) office software, material application rate control, secondary vehicle control, mapping, crop yield, and mapping skips and overlaps.
  • 8. A method of vehicle control and guidance, which method comprises the steps: providing a vehicle assembly including a steering subsystem and dynamic attitude comprising a geo-reference location, vehicle assembly orientation, and vehicle assembly speed; providing a guidance system including a processor with multiple sensor inputs and actuator outputs, a suite of sensor units connected to a respective sensor input, a suite of actuator units connected to a respective actuator output, and a data storage device including memory storage; providing a guidance controller; inputting signal input data to said guidance controller; generating control output signals with said guidance controller based on said signal input; receiving and storing in said memory storage device GNSS-based positioning signals with said guidance controller; computing a GNSS-based guide pattern with said processor; providing output signals with said guidance controller to a display device for displaying vehicle motion relative to guide patterns and contrasting displays of areas treated by said vehicle along previously-traveled portions of said guide patterns; calibrating and storing in said memory multiple vehicle profiles with said guidance controller, each said profile including multiple, independent vehicle-specific automatons; wherein said sensor unit suite includes an inertial measurement unit (IMU) sensor providing output signals corresponding to an inertial aspect of a dynamic attitude of said vehicle assembly to a respective processor input; generating inertial measurement signals with said IMU sensor; receiving the inertial measurement signals with said guidance controller; integrating said inertial measurement signals with said GNSS-based positioning signals; defining multiple behavior-based automatons comprising self-operating entities in said guidance system; instructing said automatons to perform respective behaviors using data output from one or more sensor units for achieving said behaviors wherein one or more sensor units provide the same or similar data; providing each automaton with an accepting interface for accepting requests from other automatons; providing each automaton with a requesting interface for making requests to another automaton; providing each automaton with a knowledge input for receiving a behavioral definition for affecting the behavior of the automatons; providing each automaton with a data input for receiving input data; providing each automaton with a data output for sending data; requesting instructions by each automaton from each other automaton; and accepting instructions by each automaton provided from each other automaton.
  • 9. A method of vehicle control and guidance as claimed by claim 8, including the steps: determining variable confidence levels with the processor in real time for each said sensor unit based on current relative performance; and utilizing said sensor unit outputs proportionally based on the respective confidence levels in generating said control output signals.
  • 10. A method of vehicle control and guidance as claimed by claim 8, including the steps: providing a steering unit connected to said steering system; and receiving said control output signals at said steering unit as steering control instructions.
  • 11. A method of vehicle control and guidance as claimed by claim 10, including the steps: providing a steering processor connected to said guidance controller; receiving guidance signals at said steering controller as inputs from said guidance controller; computing steering signals as outputs from said steering controller; receiving said steering signals with said steering actuator; and steering said vehicle assembly in response to said steering signals.
  • 12. An integrated multi-sensor guidance system for a vehicle assembly including a steering subsystem, comprising: said vehicle assembly having a dynamic attitude comprising a geo-reference location, vehicle assembly orientation and vehicle assembly speed; a processor with multiple sensor inputs and actuator outputs; a suite of sensor units each connected to a respective sensor input; said sensor unit suite including a GNSS unit with an antenna and a receiver connected to said antenna, said GNSS unit providing output signals corresponding to the GNSS-defined locations of said vehicle assembly dynamic attitude to a respective processor input; said sensor unit suite including an inertial measurement unit (IMU) sensor providing output signals corresponding to an inertial aspect of a dynamic attitude of said vehicle assembly to a respective processor input; said guidance controller being adapted for receiving said inertial measurement signals and integrating said inertial measurement signals with said GNSS-based positioning signals; said processor being programmed to determine variable confidence levels in real time for each said sensor unit based on its current relative performance; said processor being programmed to utilize said sensor unit outputs proportionally based on their respective confidence levels in generating said steering signals; a suite of actuator units each connected to a respective actuator output; said actuator unit suite including a steering unit connected to said steering subsystem and receiving said steering signals from said processor; said processor being programmed to define multiple behavior-based automatons comprising self-operating entities in said guidance system, said automatons performing respective behaviors using data output from one or more said sensor units for achieving said behaviors and wherein said one or more sensor units provide the same or similar data; each said automaton having: an accepting interface for accepting requests from other automatons; a requesting interface for making requests to another automaton; a knowledge input for receiving a behavioral definition for affecting the behavior of the automatons; a data input for receiving input data; and a data output for sending out the data; the same or another processor further configured to: receive and store in a memory storage device global navigation satellite system (GNSS)-based positioning signals and inertial measurement signals from an inertial measurement unit (IMU) sensor; compute a GNSS-based guide pattern; provide output signals to a display device to display vehicle motion for the vehicle relative to the guide pattern and to contrast displays of areas treated by the vehicle along previously-traveled portions of the guide pattern; and calibrate and store in the memory storage device multiple vehicle profiles, each said profile including multiple, independent vehicle-specific automatons.
  • 13. A guidance system for operating multiple vehicles, the guidance system including a guidance controller to: receive and store in a memory storage device data from sensor units located on a first vehicle including velocity, acceleration and GNSS-based position data derived from a wheel angle sensor (WAS), inertial measurement unit (IMU), and global navigation satellite system (GNSS); determine variable confidence levels in real time for said sensor units based on their current relative performance; define multiple behavior-based automatons comprising self-operating entities in said guidance system, said automatons performing respective behaviors using data output from said one or more sensor units for achieving said behaviors and wherein said one or more sensor units provide the same or similar data; each said automaton having: an accepting interface for accepting requests from other automatons; a requesting interface for making requests to another automaton; a knowledge input for receiving a behavioral definition for affecting the behavior of the automatons; a data input for receiving input data; and a data output for sending out the data; calibrate and store in the memory storage device multiple vehicle profiles, each said profile including multiple, independent vehicle-specific automatons; compute a guide path for the first vehicle from at least the GNSS-based position data; send the GNSS-based position data to a display device to display movement of the first vehicle relative to the guide path; provide output signals to the display device to display vehicle motion for the first vehicle relative to the guide path and to contrast displays of areas treated by the first vehicle along previously-traveled portions of the guide path; store in the memory storage device a vehicle profile for a second vehicle; send guidance instructions to the second vehicle to use data output from one or more of the sensor units that provide the same position data as derived from the GNSS, WAS, and IMU to control movement of the second vehicle in cooperation with movement of the first vehicle along the guide path; and update the guidance instructions to the second vehicle based on computed differences in the position data of the first vehicle.
  • 14. An integrated multi-sensor guidance system comprising a guidance controller to:
    receive signals from different sensor units, including a global navigation satellite system (GNSS)-based sensor unit, a wheel angle sensor (WAS), and an inertial measurement unit (IMU) sensor, providing positions for a vehicle;
    define multiple behavior-based automatons comprising self-operating entities in said guidance system performing respective behaviors using data output from the sensor units, and wherein said sensor units provide at least some same or similar data, each said automaton having:
        an accepting interface for accepting requests from other automatons;
        a requesting interface for making requests to other automatons;
        a knowledge input for receiving a behavioral definition for affecting the behavior of the automaton;
        a data input for receiving input data; and
        a data output for sending out the data;
    calibrate and store in a memory storage device multiple vehicle profiles, each said profile including multiple, independent vehicle-specific automatons;
    compute a GNSS-based guide path for the vehicle;
    determine variable confidence levels in real time for the GNSS-based sensor unit, the WAS, and the IMU sensor based on current relative performance of the sensor units;
    use output signals from a first one of the sensor units with a highest one of the confidence levels to determine movements of the vehicle relative to the guide path;
    use output signals from the first one of the sensor units with the highest one of the confidence levels to calibrate a second one of the sensor units with a lower one of the confidence levels; and
    provide output signals to a display device to display vehicle motion for the vehicle relative to the guide path and to contrast displays of areas treated by the vehicle along previously-traveled portions of the guide path.
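
The confidence-weighted sensor use recited in claim 12 (determining variable confidence levels in real time and using the sensor outputs proportionally to those levels when generating steering signals) can be illustrated with a short sketch. The code below is an illustrative assumption, not the patented implementation; the names (SensorReading, blended_heading, steering_command), the 0-to-1 confidence scale and the proportional steering law are invented for the example.

```python
# Minimal sketch (assumed, not the patented implementation): blend heading estimates
# from a GNSS unit, an IMU and a wheel angle sensor (WAS) in proportion to their
# current confidence levels, then form a simple proportional steering command.
from dataclasses import dataclass

@dataclass
class SensorReading:
    name: str           # e.g. "GNSS", "IMU", "WAS" (illustrative labels)
    heading_deg: float  # heading estimate reported by this sensor
    confidence: float   # 0..1, updated in real time from current relative performance

def blended_heading(readings: list) -> float:
    """Combine sensor outputs proportionally to their respective confidence levels."""
    total = sum(r.confidence for r in readings)
    if total == 0.0:
        raise ValueError("no usable sensor data")
    return sum(r.heading_deg * (r.confidence / total) for r in readings)

def steering_command(desired_heading_deg: float, readings: list, gain: float = 0.5) -> float:
    """Proportional steering signal toward the guide-path heading."""
    error = desired_heading_deg - blended_heading(readings)
    error = (error + 180.0) % 360.0 - 180.0  # wrap to [-180, 180) so the turn is the short way
    return gain * error

if __name__ == "__main__":
    readings = [SensorReading("GNSS", 91.2, 0.8),
                SensorReading("IMU", 93.0, 0.5),
                SensorReading("WAS", 95.5, 0.2)]
    print(steering_command(90.0, readings))
```

A degraded sensor simply sees its confidence (and hence its weight) fall toward zero, so the blend degrades gracefully rather than switching abruptly between sources.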
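
The behavior-based automaton recited in claims 12-14 has five elements: an accepting interface, a requesting interface, a knowledge input carrying a behavioral definition, a data input and a data output. The minimal sketch below maps those elements onto a simple message-passing class; the Automaton class, its queue-based interfaces and the example behaviors are assumptions made purely for illustration.

```python
# Minimal sketch of the claimed automaton interface: a self-operating entity with an
# accepting interface, a requesting interface, a knowledge input (behavioral definition),
# a data input and a data output. All names here are illustrative assumptions.
from collections import deque
from typing import Any, Callable

class Automaton:
    def __init__(self, name: str, behavior: Callable[["Automaton", Any], Any]):
        self.name = name
        self.behavior = behavior   # knowledge input: behavioral definition for this automaton
        self.inbox = deque()       # accepting interface: requests from other automatons
        self.data_in = deque()     # data input: input data awaiting processing
        self.data_out = deque()    # data output: data produced by the behavior

    def accept(self, request: Any) -> None:
        """Accepting interface: queue a request made by another automaton."""
        self.inbox.append(request)

    def request(self, other: "Automaton", payload: Any) -> None:
        """Requesting interface: make a request to another automaton."""
        other.accept({"from": self.name, "payload": payload})

    def step(self) -> None:
        """Apply the behavioral definition to pending requests and input data."""
        while self.inbox:
            self.data_in.append(self.inbox.popleft())
        while self.data_in:
            result = self.behavior(self, self.data_in.popleft())
            if result is not None:
                self.data_out.append(result)

# Example: a planner automaton requests steering from a steering automaton.
steer = Automaton("steering",
                  lambda a, msg: {"steer_cmd_deg": msg["payload"]["heading_error_deg"]})
planner = Automaton("planner", lambda a, msg: None)
planner.request(steer, {"heading_error_deg": 2.5})
steer.step()
print(steer.data_out.popleft())  # -> {'steer_cmd_deg': 2.5}
```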
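
Claim 13's cooperative operation, in which guidance instructions sent to a second vehicle are updated from computed differences in the first vehicle's position data, might be sketched as a leader-follower correction. The Pose fields, the straight east-west guide line, the fixed swath offset and the instruction keys below are illustrative assumptions rather than the claimed protocol.

```python
# Minimal leader-follower sketch (assumed, not the claimed protocol): compute the lead
# vehicle's deviation from a shared guide path and fold it into the guidance instruction
# sent to the second vehicle.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float            # easting, meters
    y: float            # northing, meters
    heading_deg: float  # heading, degrees

def cross_track_error(pose: Pose, path_y: float) -> float:
    """Signed lateral deviation from a straight east-west guide line at northing path_y."""
    return pose.y - path_y

def guidance_for_follower(leader: Pose, path_y: float, swath_offset: float) -> dict:
    """Instruction for the second vehicle: track the guide path shifted by one swath,
    corrected by the leader's currently observed deviation."""
    correction = cross_track_error(leader, path_y)
    return {"target_y": path_y + swath_offset + correction,
            "target_heading_deg": leader.heading_deg}

if __name__ == "__main__":
    leader = Pose(x=120.0, y=0.4, heading_deg=90.0)  # leader has drifted 0.4 m off the line
    print(guidance_for_follower(leader, path_y=0.0, swath_offset=6.0))
```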
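
Claim 14 additionally uses the sensor unit with the highest confidence level to calibrate one with a lower confidence level. One plausible reading is a running bias estimate for the lower-confidence sensor, trimmed against the reference sensor; the exponential-smoothing form and the alpha constant below are assumptions, not the patented calibration method.

```python
# Minimal sketch of cross-sensor calibration (assumed reading of claim 14): the sensor
# with the highest current confidence serves as a reference, and a slowly varying bias
# estimate for a lower-confidence sensor is trimmed against it.
def update_bias(reference_value: float, sensor_value: float,
                current_bias: float, alpha: float = 0.05) -> float:
    """Blend the instantaneous offset into a running bias estimate."""
    instantaneous_bias = sensor_value - reference_value
    return (1.0 - alpha) * current_bias + alpha * instantaneous_bias

def corrected(sensor_value: float, bias: float) -> float:
    """Apply the current bias estimate to a raw reading from the lower-confidence sensor."""
    return sensor_value - bias

if __name__ == "__main__":
    bias = 0.0
    # GNSS-derived heading (high confidence) trims an IMU-derived heading (lower confidence).
    for gnss_heading, imu_heading in [(90.0, 92.1), (90.5, 92.7), (91.0, 93.0)]:
        bias = update_bias(gnss_heading, imu_heading, bias)
    print(corrected(93.0, bias))
```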
CROSS REFERENCE TO RELATED APPLICATION

The present application is a reissue application of U.S. Pat. No. 8,649,930, issued Feb. 11, 2014, entitled GNSS INTEGRATED MULTI-SENSOR CONTROL SYSTEM AND METHOD, which claims the benefit of U.S. Provisional Patent Application Ser. No. 61/243,417, filed Sep. 17, 2009, the contents and disclosures of which are hereby incorporated by reference in their entireties.

US Referenced Citations (414)
Number Name Date Kind
3585537 Rennick et al. Jun 1971 A
3596228 Reed, Jr. et al. Jul 1971 A
3727710 Sanders et al. Apr 1973 A
3815272 Marleau Jun 1974 A
3899028 Morris et al. Aug 1975 A
3987456 Gelin Oct 1976 A
4132272 Holloway et al. Jan 1979 A
4170776 MacDoran et al. Oct 1979 A
4180133 Collogan et al. Dec 1979 A
4398162 Nagai Aug 1983 A
4453614 Allen et al. Jun 1984 A
4529990 Brunner Jul 1985 A
4637474 Leonard Jan 1987 A
4667203 Counselman, III May 1987 A
4689556 Cedrone Aug 1987 A
4694264 Owens et al. Sep 1987 A
4710775 Coe Dec 1987 A
4714435 Stipanuk et al. Dec 1987 A
4739448 Rowe et al. Apr 1988 A
4751512 Longaker Jun 1988 A
4769700 Pryor Sep 1988 A
4785463 Janc et al. Nov 1988 A
4802545 Nystuen et al. Feb 1989 A
4812991 Hatch Mar 1989 A
4813991 Hale Mar 1989 A
4858132 Holmquist Aug 1989 A
4864320 Munson et al. Sep 1989 A
4894662 Counselman Jan 1990 A
4916577 Dawkins Apr 1990 A
4918607 Wible Apr 1990 A
4963889 Hatch Oct 1990 A
5031704 Fleischer et al. Jul 1991 A
5100229 Lundberg et al. Mar 1992 A
5134407 Lorenz et al. Jul 1992 A
5148179 Allison Sep 1992 A
5152347 Miller Oct 1992 A
5155490 Spradley et al. Oct 1992 A
5155493 Thursby et al. Oct 1992 A
5156219 Schmidt et al. Oct 1992 A
5165109 Han et al. Nov 1992 A
5173715 Rodal et al. Dec 1992 A
5177489 Hatch Jan 1993 A
5185610 Ward et al. Feb 1993 A
5191351 Hofer et al. Mar 1993 A
5194851 Kraning et al. Mar 1993 A
5202829 Geier Apr 1993 A
5207239 Schwitalla May 1993 A
5239669 Mason et al. Aug 1993 A
5255756 Follmer et al. Oct 1993 A
5268695 Dentinger et al. Dec 1993 A
5293170 Lorenz et al. Mar 1994 A
5294970 Dornbusch et al. Mar 1994 A
5296861 Knight Mar 1994 A
5311149 Wagner et al. May 1994 A
5323322 Mueller et al. Jun 1994 A
5334987 Teach Aug 1994 A
5343209 Sennott et al. Aug 1994 A
5345245 Ishikawa et al. Sep 1994 A
5359332 Allison et al. Oct 1994 A
5361212 Class et al. Nov 1994 A
5365447 Dennis Nov 1994 A
5369589 Steiner Nov 1994 A
5375059 Kyrtsos et al. Dec 1994 A
5390124 Kyrtsos Feb 1995 A
5390125 Sennott et al. Feb 1995 A
5390207 Fenton et al. Feb 1995 A
5416712 Geier et al. May 1995 A
5430654 Kyrtsos et al. Jul 1995 A
5442363 Remondi Aug 1995 A
5444453 Lalezari Aug 1995 A
5451964 Babu Sep 1995 A
5467282 Dennis Nov 1995 A
5471217 Hatch et al. Nov 1995 A
5476147 Fixemer Dec 1995 A
5477228 Tiwari et al. Dec 1995 A
5477458 Loomis Dec 1995 A
5490073 Kyrtsos Feb 1996 A
5491636 Robertson Feb 1996 A
5495257 Loomis Feb 1996 A
5504482 Schreder Apr 1996 A
5511623 Frasier Apr 1996 A
5519620 Talbot et al. May 1996 A
5521610 Rodal May 1996 A
5523761 Gildea Jun 1996 A
5534875 Diefes et al. Jul 1996 A
5543804 Buchler et al. Aug 1996 A
5546093 Gudat et al. Aug 1996 A
5548293 Cohen et al. Aug 1996 A
5561432 Knight Oct 1996 A
5563786 Torii Oct 1996 A
5568152 Janky et al. Oct 1996 A
5568162 Samsel et al. Oct 1996 A
5583513 Cohen Dec 1996 A
5589835 Gildea et al. Dec 1996 A
5592382 Colley Jan 1997 A
5596328 Stangeland et al. Jan 1997 A
5600670 Turney Feb 1997 A
5604506 Rodal Feb 1997 A
5608393 Hartman Mar 1997 A
5610522 Locatelli et al. Mar 1997 A
5610616 Vallot et al. Mar 1997 A
5610845 Slabinski et al. Mar 1997 A
5612883 Shaffer et al. Mar 1997 A
5615116 Gudat et al. Mar 1997 A
5617100 Akiyoshi et al. Apr 1997 A
5617317 Ignagni Apr 1997 A
5621646 Enge et al. Apr 1997 A
5638077 Martin Jun 1997 A
5644139 Allen et al. Jul 1997 A
5646844 Gudat et al. Jul 1997 A
5663879 Trovato et al. Sep 1997 A
5664632 Frasier Sep 1997 A
5673491 Brenna et al. Oct 1997 A
5680140 Loomis Oct 1997 A
5684476 Anderson Nov 1997 A
5684696 Rao et al. Nov 1997 A
5706015 Chen et al. Jan 1998 A
5717593 Gvili Feb 1998 A
5725230 Walkup Mar 1998 A
5731786 Abraham et al. Mar 1998 A
5739785 Allison et al. Apr 1998 A
5757316 Buchler May 1998 A
5765123 Nimura et al. Jun 1998 A
5777578 Chang et al. Jul 1998 A
5810095 Orbach et al. Sep 1998 A
5828336 Yunck et al. Oct 1998 A
5838562 Gudat et al. Nov 1998 A
5854987 Sekine et al. Dec 1998 A
5862501 Talbot et al. Jan 1999 A
5864315 Welles et al. Jan 1999 A
5864318 Cosenza et al. Jan 1999 A
5875408 Bendett et al. Feb 1999 A
5877725 Kalafus Mar 1999 A
5890091 Talbot et al. Mar 1999 A
5899957 Loomis May 1999 A
5906645 Kagawa et al. May 1999 A
5912798 Chu Jun 1999 A
5914685 Kozlov et al. Jun 1999 A
5917448 Mickelson Jun 1999 A
5918558 Susag Jul 1999 A
5919242 Greatline et al. Jul 1999 A
5923270 Sampo et al. Jul 1999 A
5926079 Heine et al. Jul 1999 A
5927603 McNabb Jul 1999 A
5928309 Korver et al. Jul 1999 A
5929721 Munn et al. Jul 1999 A
5933110 Tang Aug 1999 A
5935183 Sahm et al. Aug 1999 A
5936573 Smith Aug 1999 A
5940026 Popeck Aug 1999 A
5941317 Mansur Aug 1999 A
5943008 Van Dusseldorp Aug 1999 A
5944770 Enge et al. Aug 1999 A
5945917 Harry Aug 1999 A
5948043 Mathis Sep 1999 A
5949371 Nichols Sep 1999 A
5955973 Anderson Sep 1999 A
5956250 Gudat et al. Sep 1999 A
5969670 Kalafus et al. Oct 1999 A
5987383 Keller et al. Nov 1999 A
6014101 Loomis Jan 2000 A
6014608 Seo Jan 2000 A
6018313 Engelmayer et al. Jan 2000 A
6023239 Kovach Feb 2000 A
6052647 Parkinson et al. Apr 2000 A
6055477 McBurney et al. Apr 2000 A
6057800 Yang et al. May 2000 A
6061390 Meehan et al. May 2000 A
6061632 Dreier May 2000 A
6062317 Gharsalli May 2000 A
6069583 Silvestrin et al. May 2000 A
6070673 Wendte Jun 2000 A
6076612 Carr et al. Jun 2000 A
6081171 Ella Jun 2000 A
6100842 Dreier et al. Aug 2000 A
6122595 Varley et al. Sep 2000 A
6128574 Diekhans Oct 2000 A
6144335 Rogers Nov 2000 A
6191730 Nelson, Jr. Feb 2001 B1
6191733 Dizchavez Feb 2001 B1
6198430 Hwang et al. Mar 2001 B1
6198992 Winslow Mar 2001 B1
6199000 Keller et al. Mar 2001 B1
6205401 Pickhard et al. Mar 2001 B1
6212453 Kawagoe et al. Apr 2001 B1
6215828 Signell et al. Apr 2001 B1
6229479 Kozlov et al. May 2001 B1
6230097 Dance et al. May 2001 B1
6233511 Berger et al. May 2001 B1
6236916 Staub et al. May 2001 B1
6236924 Motz May 2001 B1
6253160 Hanseder Jun 2001 B1
6256583 Sutton Jul 2001 B1
6259398 Riley Jul 2001 B1
6266595 Greatline et al. Jul 2001 B1
6285320 Olster et al. Sep 2001 B1
6292132 Wilson Sep 2001 B1
6307505 Green Oct 2001 B1
6313788 Wilson Nov 2001 B1
6314348 Winslow Nov 2001 B1
6325684 Knight Dec 2001 B1
6336066 Pellenc et al. Jan 2002 B1
6345231 Quincke Feb 2002 B2
6356602 Rodal et al. Mar 2002 B1
6377889 Soest Apr 2002 B1
6380888 Kucik Apr 2002 B1
6389345 Phelps May 2002 B2
6392589 Rogers et al. May 2002 B1
6397147 Whitehead May 2002 B1
6415229 Diekhans Jul 2002 B1
6418031 Archambeault Jul 2002 B1
6421003 Riley et al. Jul 2002 B1
6424915 Fukuda et al. Jul 2002 B1
6431576 Viaud et al. Aug 2002 B1
6434462 Bevly et al. Aug 2002 B1
6445983 Dickson et al. Sep 2002 B1
6445990 Manring Sep 2002 B1
6449558 Small Sep 2002 B1
6463091 Zhodzicshsky et al. Oct 2002 B1
6463374 Keller et al. Oct 2002 B1
6466871 Reisman et al. Oct 2002 B1
6469663 Whitehead et al. Oct 2002 B1
6484097 Fuchs et al. Nov 2002 B2
6501422 Nichols Dec 2002 B1
6515619 McKay, Jr. Feb 2003 B1
6516271 Upadhyaya et al. Feb 2003 B2
6539303 McClure et al. Mar 2003 B2
6542077 Joao Apr 2003 B2
6549835 Deguchi Apr 2003 B2
6553299 Keller et al. Apr 2003 B1
6553300 Ma et al. Apr 2003 B2
6553311 Aheam et al. Apr 2003 B2
6570534 Cohen et al. May 2003 B2
6577952 Strother et al. Jun 2003 B2
6587761 Kumar Jul 2003 B2
6606542 Hauwiller et al. Aug 2003 B2
6611228 Toda et al. Aug 2003 B2
6611754 Klein Aug 2003 B2
6611755 Coffee et al. Aug 2003 B1
6622091 Perlmutter et al. Sep 2003 B2
6631916 Miller Oct 2003 B1
6643576 O Connor et al. Nov 2003 B1
6646603 Dooley et al. Nov 2003 B2
6657875 Zeng et al. Dec 2003 B1
6671587 Hrovat et al. Dec 2003 B2
6688403 Bernhardt et al. Feb 2004 B2
6703973 Nichols Mar 2004 B1
6711501 McClure et al. Mar 2004 B2
6721638 Zeitler Apr 2004 B2
6732024 Wilhelm Rekow et al. May 2004 B2
6744404 Whitehead et al. Jun 2004 B1
6754584 Pinto et al. Jun 2004 B2
6774843 Takahashi Aug 2004 B2
6789014 Rekow et al. Sep 2004 B1
6792380 Toda Sep 2004 B2
6819269 Flick Nov 2004 B2
6819780 Benson et al. Nov 2004 B2
6822314 Beasom Nov 2004 B2
6865465 McClure Mar 2005 B2
6865484 Miyasaka et al. Mar 2005 B2
6876920 Mailer Apr 2005 B1
6900992 Kelly et al. May 2005 B2
6922635 Rorabaugh Jul 2005 B2
6931233 Tso et al. Aug 2005 B1
6967538 Woo Nov 2005 B2
6990399 Hrazdera et al. Jan 2006 B2
7006032 King et al. Feb 2006 B2
7026982 Toda et al. Apr 2006 B2
7027918 Zimmerman et al. Apr 2006 B2
7031725 Rorabaugh Apr 2006 B2
7089099 Shostak et al. Aug 2006 B2
7142956 Heiniger et al. Nov 2006 B2
7162348 McClure et al. Jan 2007 B2
7191061 McKay et al. Mar 2007 B2
7225060 O'Connor et al. May 2007 B2
7225068 Schick et al. May 2007 B2
7231290 Steichen et al. Jun 2007 B2
7248211 Hatch et al. Jul 2007 B2
7271766 Zimmerman et al. Sep 2007 B2
7277784 Weiss Oct 2007 B2
7277792 Overschie Oct 2007 B2
7292186 Miller et al. Nov 2007 B2
7324915 Altman et al. Jan 2008 B2
7358896 Gradincic et al. Apr 2008 B2
7373231 McClure et al. May 2008 B2
7388539 Whitehead et al. Jun 2008 B2
7395769 Jensen Jul 2008 B2
7428259 Wang et al. Sep 2008 B2
7437230 McClure et al. Oct 2008 B2
7451030 Eglington et al. Nov 2008 B2
7454290 Alban et al. Nov 2008 B2
7460942 Mailer Dec 2008 B2
7479900 Horstemeyer Jan 2009 B2
7505848 Flann et al. Mar 2009 B2
7522100 Yang et al. Apr 2009 B2
7571029 Dai et al. Aug 2009 B2
7580783 Dix Aug 2009 B2
7689354 Heiniger et al. Mar 2010 B2
7904226 Dix Mar 2011 B2
8160765 Morselli et al. Apr 2012 B2
8190337 McClure May 2012 B2
8437901 Anderson May 2013 B2
8649930 Reeve et al. Feb 2014 B2
20020004691 Kinashi et al. Jan 2002 A1
20020072850 McClure et al. Jun 2002 A1
20030014171 Ma et al. Jan 2003 A1
20030187560 Keller et al. Oct 2003 A1
20030208319 Ell et al. Nov 2003 A1
20040039514 Steichen et al. Feb 2004 A1
20040186644 McClure et al. Sep 2004 A1
20040212533 Whitehead et al. Oct 2004 A1
20050080559 Ishibashi et al. Apr 2005 A1
20050114023 Williamson et al. May 2005 A1
20050165546 Aral Jul 2005 A1
20050225955 Grebenkemper et al. Oct 2005 A1
20050265494 Goodings Dec 2005 A1
20060167600 Nelson et al. Jul 2006 A1
20060206246 Walker Sep 2006 A1
20060215739 Williamson et al. Sep 2006 A1
20070078570 Dai et al. Apr 2007 A1
20070088447 Stothert et al. Apr 2007 A1
20070121708 Simpson May 2007 A1
20070205940 Yang et al. Sep 2007 A1
20070285308 Bauregger et al. Dec 2007 A1
20080039991 May et al. Feb 2008 A1
20080059068 Strelow et al. Mar 2008 A1
20080129586 Martin Jun 2008 A1
20080195268 Sapilewski et al. Aug 2008 A1
20080204312 Euler Aug 2008 A1
20090171583 DiEsposti Jul 2009 A1
20090174597 DiLellio et al. Jul 2009 A1
20090174622 Kanou Jul 2009 A1
20090177395 Stelpstra Jul 2009 A1
20090177399 Park et al. Jul 2009 A1
20090259397 Stanton Oct 2009 A1
20090259707 Martin et al. Oct 2009 A1
20090262014 DiEsposti Oct 2009 A1
20090262018 Vasilyev et al. Oct 2009 A1
20090262974 Lithopoulos Oct 2009 A1
20090265054 Basnayake Oct 2009 A1
20090265101 Jow Oct 2009 A1
20090265104 Shroff Oct 2009 A1
20090273372 Brenner Nov 2009 A1
20090273513 Huang Nov 2009 A1
20090274079 Bhatia et al. Nov 2009 A1
20090274113 Katz Nov 2009 A1
20090276155 Jeerage et al. Nov 2009 A1
20090295633 Pinto et al. Dec 2009 A1
20090295634 Yu et al. Dec 2009 A1
20090299550 Baker Dec 2009 A1
20090322597 Medina Herrero et al. Dec 2009 A1
20090322598 Fly et al. Dec 2009 A1
20090322600 Whitehead et al. Dec 2009 A1
20090322601 Ladd et al. Dec 2009 A1
20090322606 Gronemeyer Dec 2009 A1
20090326809 Colley et al. Dec 2009 A1
20100013703 Tekawy et al. Jan 2010 A1
20100026569 Amidi Feb 2010 A1
20100030470 Wang et al. Feb 2010 A1
20100039316 Gronemeyer et al. Feb 2010 A1
20100039318 Kmiecik Feb 2010 A1
20100039320 Boyer et al. Feb 2010 A1
20100039321 Abraham Feb 2010 A1
20100060518 Bar-Sever et al. Mar 2010 A1
20100063649 Wu Mar 2010 A1
20100084147 Aral Apr 2010 A1
20100085249 Ferguson et al. Apr 2010 A1
20100085253 Ferguson et al. Apr 2010 A1
20100103033 Roh Apr 2010 A1
20100103034 Tobe et al. Apr 2010 A1
20100103038 Yeh et al. Apr 2010 A1
20100103040 Broadbent Apr 2010 A1
20100106414 Whitehead Apr 2010 A1
20100106445 Kondoh Apr 2010 A1
20100109944 Whitehead et al. May 2010 A1
20100109945 Roh May 2010 A1
20100109947 Rintanen May 2010 A1
20100109948 Razoumov et al. May 2010 A1
20100109950 Roh May 2010 A1
20100111372 Zheng et al. May 2010 A1
20100114483 Heo et al. May 2010 A1
20100117894 Velde et al. May 2010 A1
20100117899 Papadimitratos et al. May 2010 A1
20100117900 Van Diggelen et al. May 2010 A1
20100124210 Lo May 2010 A1
20100124212 Lo May 2010 A1
20100134354 Lennen Jun 2010 A1
20100149025 Meyers et al. Jun 2010 A1
20100149030 Verma et al. Jun 2010 A1
20100149033 Abraham Jun 2010 A1
20100149034 Chen Jun 2010 A1
20100149037 Cho Jun 2010 A1
20100150284 Fielder et al. Jun 2010 A1
20100152949 Nunan et al. Jun 2010 A1
20100156709 Zhang et al. Jun 2010 A1
20100156712 Pisz et al. Jun 2010 A1
20100156718 Chen Jun 2010 A1
20100159943 Salmon Jun 2010 A1
20100161179 McClure et al. Jun 2010 A1
20100161211 Chang Jun 2010 A1
20100161568 Xiao Jun 2010 A1
20100171660 Shyr et al. Jul 2010 A1
20100171757 Melamed Jul 2010 A1
20100185364 McClure Jul 2010 A1
20100185366 Heiniger et al. Jul 2010 A1
20100185389 Woodard Jul 2010 A1
20100188285 Collins Jul 2010 A1
20100188286 Bickerstaff et al. Jul 2010 A1
20100189163 Burgi et al. Jul 2010 A1
20100207811 Lackey Aug 2010 A1
20100210206 Young Aug 2010 A1
20100211248 Craig et al. Aug 2010 A1
20100211315 Toda Aug 2010 A1
20100211316 DaSilva et al. Aug 2010 A1
Foreign Referenced Citations (8)
Number Date Country
07244150 Sep 1995 JP
WO9836288 Aug 1998 WO
WO0024239 May 2000 WO
WO03019430 Mar 2003 WO
WO2005119386 Dec 2005 WO
WO2009066183 May 2009 WO
WO2009126587 Oct 2009 WO
WO2009148638 Dec 2009 WO
Non-Patent Literature Citations (35)
Entry
Information Disclosure Statement, Listing of Related Cases, Aug. 2, 2016, 1 page.
Noh, Kwang-Mo, Self-tuning controller for farm tractor guidance, Iowa State University Retrospective Theses and Dissertations, Paper 9874, (1990).
Van Zuydam, R.P., Centimeter-Precision Guidance of Agricultural Implements in the Open Field by Means of Real Time Kinematic DGPS, ASA-CSSA-SSSA, pp. 1023-1034 (1999).
Parkinson, Bradford W., et al., “Global Positioning System: Theory and Applications, vol. II”, Bradford W. Parkinson and James J. Spilker, Jr., eds., AIAA, Reston, VA, USA, pp. 3-50 (1995).
“Orthman Manufacturing Co., www.orthman.com/htm;guidance.htm”, 2004 regarding the “Tracer Quick-Hitch”.
Lin, Dai et al., “Real-time Attitude Determination for Microsatellite by LAMBDA Method Combined with Kalman Filtering”, A Collection of the 22nd AIAA International Communications Satellite Systems Conference and Exhibit Technical Papers vol. 1, Monterey, California, American Institute of Aeronautics and Astronautics, Inc., (May 2004), 136-143.
Xu, Jiangning et al., “An EHW Architecture for Real-Time GPS Attitude Determination Based on Parallel Genetic Algorithm”, The Computer Society, Proceedings of the 2002 NASA/DOD Conference on Evolvable Hardware (EH'02), (2002).
Han, Shaowei et al., “Single-Epoch Ambiguity Resolution for Real-Time GPS Attitude Determination with the Aid of One-Dimensional Optical Fiber Gyro”, GPS Solutions, vol. 3, No. 1, pp. 5-12 (1999), John Wiley & Sons, Inc.
Park, Chansik et al., “Integer Ambiguity Resolution for GPS Based Attitude Determination System”, SICE 1998, Jul. 29-31, Chiba, 1115-1120.
Last, J. D., et al., “Effect of skywave interference on coverage of radiobeacon DGPS Stations”, IEEE Proc.—Radar, Sonar Navig., vol. 144, No. 3, Jun. 1997, pp. 163-168.
“International Search Report and Written Opinion”, PCT/US2004/015678, filed May 17, 2004, Jun. 21, 2005.
“ISO”, 11783 Part 7 Draft Amendment 1 Annex, Paragraphs B.6 and B.7, ISO 11783-7 2004 DAM1, ISO: Mar. 8, 2004.
Kaplan, E D., “Understanding GPS: Principles and Applications”, Artech House, MA, 1996.
Irsigler, M. et al., “PLL Tracking Performance in the Presence of Oscillator Phase Noise”, GPS Solutions, vol. 5, No. 4, pp. 45-57 (2002).
Ward, Phillip W., “Performance Comparisons Between FLL, PLL and a Novel FLL-Assisted-PLL Carrier Tracking Loop Under RF Interference Conditions”, 11th Int. Tech Meeting of the Satellite Division of the U.S. Inst. of Navigation, Nashville, TN, Sep. 15-18, 783-795, 1998.
Bevly, David M., “Comparison of INS v. Carrier-Phase DGPS for Attitude Determination in the Control of Off-Road Vehicles”, ION 55th Annual Meeting; Jun. 28-30, 1999; Cambridge, Massachusetts; pp. 497-504.
“International Search Report and Written Opinion”, International Searching Authority, PCT/US08/88070, Feb. 9, 2009.
Keicher, R. et al., “Automatic Guidance for Agricultural Vehicles in Europe”, Computers and Electronics in Agriculture, vol. 25, (Jan. 2000),169-194.
Takac, Frank et al., “SmartRTK: A Novel Method of Processing Standardised RTCM Network RTK Information For High Precision Positioning”, Proceedings of ENC GNSS 2008, Toulouse, France, (Apr. 22, 2008).
“International Search Report”, PCT/US09/33567, (Feb. 9, 2009).
“International Search Report”, PCT/US09/49776, (Aug. 11, 2009).
“International Search Report”, PCT/AU/2008/000002, (Feb. 28, 2008).
“International Search Report and Written Opinion”, PCT/IB2008/003796 (Jul. 15, 2009).
“International Search Report”, PCT/US09/33693, (Mar. 30, 2009).
“International Search Report”, PCT/US09/039686, (May 26, 2009).
“International Search Report”, PCT/US09/34376, (Nov. 2, 2009).
“International Search Report / Written Opinion”, PCT/US09/63594, (Jan. 11, 2010).
“International Search Report”, PCT/US09/60668, (Dec. 9, 2009).
“International Search Report”, PCT/US09/067693, (Jan. 26, 2010).
“International Search Report and Written Opinion”, PCT/US10/21334, (Mar. 12, 2010).
Rho, Hyundho et al., “Dual-Frequency GPS Precise Point Positioning with WADGPS Corrections”, [retrieved on May 18, 2010]. Retrieved from the Internet: <URL: http://gauss.gge.unb.ca/papers.pdf/iongnss2005.rho.wadgps.pdf>, (Jul. 12, 2006).
“Eurocontrol, Pegasus Technical Notes on SBAS”, report [online], Dec. 7, 2004 [retrieved on May 18, 2010]. Retrieved from the Internet: <URL: http://vvww.icao.int/icao/en/ro/nacc/meetings/2004/gnss/documentation/Pegasus/tn.pdf>, (Dec. 7, 2004), p. 89, paras. [0001]-[0004].
“ARINC Engineering Services, Interface Specification IS-GPS-200, Revision D”, Online [retrieved on May 18, 2010]. Retrieved from the Internet: <URL: http://www.navcen.uscg.gov/gps/geninfo/IS-GPS-200D.pdf>, (Dec. 7, 2004), p. 168, para. [0001].
Schaer, et al., “Determination and Use of GPS Differential Code Bias Values”, Presentation [online]. Retrieved May 18, 2010. Retrieved from the Internet: <http://nng.esoc.esa.de/ws2006/REPR2.pdf> (May 8, 2006).
“International Search Report”, PCT/US10/26509 (Apr. 20, 2010).
Provisional Applications (1)
Number Date Country
61243417 Sep 2009 US
Reissues (1)
Number Date Country
Parent 12884038 Sep 2010 US
Child 15041784 US