Visual, GNSS and gyro autosteering control

Information

  • Patent Number
    9,002,566
  • Date Filed
    Tuesday, February 10, 2009
  • Date Issued
    Tuesday, April 7, 2015
Abstract
A visual, GNSS and INS (gyro) system for autosteering control uses crop row and furrow row edge visual detection in an agricultural application in order to closely track the actual crop rows. Alternatively, previous vehicle tracks can be visually detected and followed in a tramline following operating mode. GNSS and inertial (gyroscopic) input subsystems are also provided for supplementing the video input subsystem, for example when visual references are lost. Crop damage is avoided or at least minimized by avoiding driving over the existing crops. Other applications include equipment control in logistics operations.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


The present invention relates generally to automated equipment control using video and other positioning inputs, and in particular to visually, automatically guiding between crop rows and against furrow row edges in agricultural applications.


2. Description of the Related Art


GNSS technology has advanced vehicle and machine guidance and control in various technical fields, including agricultural guidance, by enabling reliable, accurate systems which are relatively easy to use. GNSS guidance systems are adapted for displaying directional guidance information to assist operators with manually steering the vehicles. For example, the OUTBACK S™ steering guidance system, which is available from Hemisphere GPS LLC of Scottsdale, Ariz. and Hiawatha, Kans. and is covered by U.S. Pat. No. 6,539,303 and No. 6,711,501, which are incorporated herein by reference, includes an on-board computer capable of storing various straight-line and curved (“contour”) patterns. An advantage of this system is its ability to retain field-specific cultivating, planting, spraying, fertilizing, harvesting and other patterns in memory. This feature enables operators to accurately retrace such patterns. Another advantage relates to the ability to interrupt operations for subsequent resumption by referring to system-generated logs of previously treated areas.


Another type of GNSS vehicle guidance equipment automatically steers the vehicle along all or part of its travel path and can also control an agricultural procedure or operation, such as spraying, planting, tilling, harvesting, etc. Examples of such equipment are shown in U.S. Pat. No. 7,142,956, which is incorporated herein by reference. U.S. Patent Application Publication No. 2004/0186644 shows satellite-based vehicle guidance control in straight and contour modes, and is also incorporated herein by reference. U.S. Pat. No. 7,162,348 is incorporated herein by reference and discloses an articulated equipment position control system and method whereby a working component, such as an implement, can be guided independently of a motive component, such as a tractor. The implement can optionally be equipped with its own GNSS antenna and/or receiver for interacting with a tractor-mounted GNSS system.


Ideally crops would be planted in perfectly straight, evenly-spaced rows. Guidance through such fields would consist of following relatively simple straight-line patterns. Such guidance modes are commonly referred to as straight line or “A-B” in reference to the equipment traveling in a straight line between point A and point B in a repeating pattern in order to cover an entire field, which is typically flat and rectangular and therefore efficiently divided into multiple, parallel swaths. However, field conditions in many areas are not suitable for A-B guidance. For example, hilly terrain sometimes requires the formation of constant-elevation terraces.
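
By way of illustration only, A-B guidance reduces to tracking a straight line between two logged points: the steering-relevant quantity is the signed cross-track error from the A-B line, from which the current parallel swath can also be derived. The following Python sketch assumes a local planar east/north frame in meters; the function names are illustrative and not part of the disclosure.

```python
import math

def cross_track_error(a, b, p):
    """Signed lateral offset (m) of vehicle position p from the A-B line.

    a, b, p are (x, y) tuples in a local planar frame (meters east/north).
    Positive values mean the vehicle is left of the A->B direction.
    """
    abx, aby = b[0] - a[0], b[1] - a[1]
    apx, apy = p[0] - a[0], p[1] - a[1]
    # 2D cross product divided by |AB| gives the perpendicular distance.
    return (abx * apy - aby * apx) / math.hypot(abx, aby)

def swath_offset(error, swath_width):
    """Which parallel swath the vehicle is on, and the residual error within it."""
    n = round(error / swath_width)
    return n, error - n * swath_width
```

In a repeating A-B pattern the swath index advances by one per pass, and only the residual error within the current swath is fed to the steering controller.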


Guidance systems accommodate such irregular conditions by operating in “contour following” modes consisting of curvilinear tracks defined by multiple GNSS points along which the equipment is guided. Initial planting passes made with manual and visually-guided navigation, which may or may not be supplemented with GNSS navigational aids, can cause crop rows to deviate from straight lines. Accommodating such irregular crop rows in subsequent operations (e.g., spraying and harvesting) may require the equipment to deviate from straight-line passes.


“Tramline” (sometimes referred to as “match tracks”) is another operating mode available with some modern GNSS guidance systems. In tramline operating mode the existing crop rows are relatively well protected because the equipment follows or “matches” the previously-driven passes. The equipment wheels or tracks are thus confined between the crop rows. Machine damage from running over crops is thus avoided, or at least minimized.


Notwithstanding recent advances in GNSS-based guidance accuracy, the natural irregularities of row crop cultivation tend to compromise the effectiveness of navigation based solely on location-finding from satellite signals. Moreover, satellite signals are occasionally lost due to interference from atmospheric conditions, weather and electromagnetic fields (EMF). There are various levels of differential accuracy available for GNSS. The use of these can cause offsets and drifts, especially over the crop growth season from field preparation to harvesting. In order to compensate for such lapses in GNSS reception, inertial navigation systems (INS) with gyroscopes have been utilized for relatively short-term, supplemental guidance input. Many systems accommodate operators overriding the automated functions. For example, an operator may respond to observed, actual field conditions in order to maintain the equipment on course. A system integrating input signals from GNSS, inertial and visual guidance subsystems could optimize guidance solutions in various conditions. Moreover, visually guiding with cameras directed at the crop rows or the furrow row edges can provide relatively accurate positioning solutions, supplemented by GNSS and gyro inputs. The GNSS receivers and inertial devices (i.e. gyroscopes) can be less accurate, and hence less expensive, in such systems where the most precise positioning inputs are from visual references. Highly accurate (i.e. centimeter level) positioning with GNSS signals alone typically involves one or more relatively sophisticated and expensive receivers, and often involves subscription-based broadcast corrections or localized broadcasts from real-time kinematic (RTK) base station GNSS equipment. Custom applicators, who use their equipment on multiple farms, need guidance equipment capable of universal operation for optimizing their productivity while minimizing crop damage.
Such equipment should be usable by operators with minimal training operating at optimal speeds and should have the capacity for storing and recalling field data for reuse, for example from season-to-season. Higher equipment speeds also tend to create autosteering discrepancies, which can lead to crop damage from equipment overruns. Hence, visual referencing can accommodate faster equipment even with relatively basic GNSS/INS guidance receivers and sensors. Fields are sometimes planted using a variety of guidance methods, and guidance equipment used in subsequent operations should be responsive to actual field conditions, such as crop locations, without undue reliance on previous equipment and data recorded thereby, which may or may not be sufficiently accurate for subsequent operations.


Heretofore there has not been available a GNSS, inertial and visual guidance and control system and method with the advantages and features of the present invention.


SUMMARY OF THE INVENTION

In the practice of the present invention, a system and method are provided for automatically controlling vehicles and equipment using video, GNSS and inertial input subsystems. For example, agricultural equipment comprising a tractor and an implement can be equipped with a vector position and heading sensor subsystem including a GNSS receiver and antennas and an inertial (gyroscopic) subsystem with X, Y and Z axis sensors for sensing equipment attitude changes through six degrees of freedom. The GNSS and INS/gyroscopic input subsystems can be housed in a common enclosure for mounting on the tractor roof. A video input subsystem can comprise a pair of cameras each mounted on a respective side at the front of the tractor and directed at crop rows, swath edges or previous tracks (tramlines) in the forward path of movement. A microprocessor-based controller processes the inputs and automatically controls a vehicle steering system in response thereto. Depending on the crop growth cycle and the feasibility of edge detection, either the visual or the GNSS/inertial subsystem may be better suited as the primary guidance mode. This invention allows manual or transparent switching between these modes. Calibration of the recent line curvatures and offsets from previously logged GNSS tracks can be used to switch between modes while minimizing any crop damage should visual edge detection be lost. The edges can be defined by furrows, physical plants visible against soil or touching plants from adjacent rows. Other aspects of the invention include logistics equipment applications and machine control.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a vehicle control system with GNSS, inertial and video input subsystems and embodying an aspect of the present invention.



FIG. 2 is a top plan view of a tractor equipped with the control system and coupled to an implement.



FIG. 3 is a side elevational view of the tractor and implement.



FIG. 4 is a top plan view of the tractor and implement, shown working a field planted in row crops.



FIG. 5 is a top plan view of the tractor and implement, shown working a field planted in row crops and encountering another field condition with an interrupted crop row.



FIG. 6 is a top plan view of the tractor and implement, shown working a field planted in emerging row crops.



FIG. 7 is a top plan view of the tractor and implement, shown working a field planted in row crops in a contour configuration.



FIG. 8 is a top plan view of the tractor and implement, shown following a tramline comprising previous vehicle tracks.



FIG. 9 is a top plan view of the tractor and an implement equipped with steering coulters.



FIG. 10 is a flowchart of a GNSS, INS and video vehicle guidance and control method embodying another aspect of the present invention.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

I. Introduction and Environment


As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention in virtually any appropriately detailed structure.


Certain terminology will be used in the following description for convenience in reference only and will not be limiting. For example, up, down, front, back, right and left refer to the invention as oriented in the view being referred to. The words “inwardly” and “outwardly” refer to directions toward and away from, respectively, the geometric center of the embodiment being described and designated parts thereof. Global navigation satellite systems (GNSS) are broadly defined to include GPS (U.S.), Galileo (proposed), GLONASS (Russia), Beidou (China), Compass (proposed), IRNSS (India, proposed), QZSS (Japan, proposed) and other current and future positioning technology using signals from satellites, with or without augmentation from terrestrial sources. Inertial navigation systems (INS) include gyroscopic (gyro) sensors, accelerometers and similar technologies for providing output corresponding to the inertia of moving components in all axes, i.e. through six degrees of freedom (positive and negative directions along transverse X, longitudinal Y and vertical Z axes). Yaw, pitch and roll refer to moving component rotation about the Z, X and Y axes respectively. Said terminology will include the words specifically mentioned, derivatives thereof and words of similar meaning.


II. Preferred Embodiment System 2.


Referring to the drawings in more detail, the reference numeral 2 generally designates a GNSS, inertial and video control system embodying the present invention. Without limitation on the generality of useful applications of the control system 2, a motive component 6 connected to a working component 8 through an optional articulated connection or hitch 10 is shown (collectively a vehicle 4). Also by way of example, the motive component 6 can comprise a tractor and the working component 8 can comprise a ground-working implement. However, the position control system 2 can be applied to other equipment configurations for a wide range of other applications. Such applications include equipment and components used in road construction, road maintenance, earthworking, mining, transportation, industry, manufacturing, logistics, etc.


The control system 2 can be implemented with a tractor 6 including a microprocessor 12 connected to a graphical user interface (GUI) 14, which can be original equipment manufacture (OEM) general-purpose components, or special-purpose for the system 2. The tractor 6 also includes a steering wheel 16 for operating a hydraulic steering system 18. A position sensor 20 is connected to the steering wheel 16 and provides an output corresponding to its position. The components can be connected, and external communications can be provided, by suitable networks, buses, hardwired and wireless connections, controller area network (CAN) 58 (shown), serial connections and VT.


A position/heading (vector) sensor 28 can be mounted externally on the tractor 6, e.g. on its roof, and includes a pair of antennas 30 connected to a GNSS receiver 32. The GNSS receiver 32 disclosed herein can be adapted for various satellite navigational systems, and can utilize a variety of satellite based augmentation systems (SBAS). Technology is also available for continuing operation through satellite signal interruptions, and can be utilized with the system 2. The antennas 30 can be horizontally aligned transversely with respect to a direction of travel of the tractor 6, i.e. parallel to its X axis. The relative positions of the antennas 30 with respect to each other can thus be processed for determining yaw, i.e. rotation with respect to the vertical Z axis. The sensor 28 also includes a direction sensor 34 and inertial sensors 36, 38 and 40 for detecting and measuring inertial movement with respect to the X, Y and Z axes, corresponding to pitch, roll and yaw movements respectively, in six degrees of freedom. Signals from the receiver 32 and the sensors 34, 36, 38 and 40 are received and processed by the microprocessor 12 based on how the system 2 is configured and programmed.
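
The yaw computation described above can be illustrated as follows. This Python sketch is a reconstruction under stated assumptions, not the disclosed implementation: it assumes the two antennas lie on the transverse (X) axis, antenna 1 on the left and antenna 2 on the right, with positions given in a local east/north frame in meters.

```python
import math

def heading_from_antennas(east1, north1, east2, north2):
    """Vehicle yaw (degrees, clockwise from north) from two GNSS antenna fixes.

    Assumes a transversely mounted antenna pair: antenna 1 left, antenna 2
    right, so the baseline is perpendicular to the direction of travel.
    """
    # Baseline vector from the left antenna to the right antenna.
    be, bn = east2 - east1, north2 - north1
    # Rotating the left->right baseline 90 degrees counterclockwise gives the
    # forward direction; atan2(east, north) yields a compass-style heading.
    fwd_e, fwd_n = -bn, be
    return math.degrees(math.atan2(fwd_e, fwd_n)) % 360.0
```

With a fixed baseline length, the same antenna pair also gives a consistency check: a computed baseline much shorter or longer than the mounting separation indicates a degraded GNSS fix.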


The implement (working component) 8 can optionally be equipped with an implement GNSS receiver 46 connected to an implement microprocessor 48 for steering the implement 8 independently of the tractor 6 via an implement steer subsystem 50. An optional articulated connection 10 can be provided between the tractor 6 and the implement 8. Examples of such an articulated connection and an implement steering system are described in U.S. Pat. No. 6,865,465 and No. 7,162,348, which are incorporated herein by reference. The implement 8 can comprise any of a wide range of suitable implements, such as planting, cultivating, harvesting and spraying equipment. For example, spraying applications are commonly performed with a boom 52, which can be equipped for automatic, selective control of multiple nozzles 54 and other boom operating characteristics, such as height, material dispensed, etc. Automatic boom control 56 can be utilized, for example, to selectively activate and deactivate individual spray nozzles 54 whereby overspraying previously treated areas can be avoided by the system 2 keeping track of previously treated areas and turning off the nozzles 54 when those areas are reached in an overlapping swath situation, which occasionally occurs in connection with irregularly shaped parcels, near field boundaries and in other operating situations.
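
The selective nozzle control described above amounts to logging treated ground and commanding a nozzle off when its projected position falls on a previously treated area. A minimal Python sketch, assuming a simple grid-cell coverage map; the class and method names are illustrative and not from the disclosure.

```python
class BoomController:
    """Per-nozzle shutoff over previously treated ground (illustrative sketch).

    Treated area is tracked on a coarse grid (cell_size in meters); nozzle
    positions are (east, north) tuples computed by the caller from the
    vehicle pose and boom geometry.
    """

    def __init__(self, cell_size=0.5):
        self.cell = cell_size
        self.treated = set()

    def _key(self, pos):
        return (int(pos[0] // self.cell), int(pos[1] // self.cell))

    def update(self, nozzle_positions):
        """Return per-nozzle on/off commands and log newly treated cells."""
        commands = []
        for pos in nozzle_positions:
            k = self._key(pos)
            on = k not in self.treated   # off when overlapping a prior swath
            if on:
                self.treated.add(k)
            commands.append(on)
        return commands
```

In an overlapping-swath situation near a field boundary, nozzles whose positions land in already-treated cells are switched off while the remainder of the boom continues spraying.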


A video guidance input subsystem 60 includes one or more cameras 62. In the agricultural application of the present invention described herein, the cameras 62 are adjustably mounted on each side of the front of the tractor 6 and can be oriented towards crop rows at predetermined distances ahead of the tractor 6 in a look-ahead, forward-predictive configuration. The output of the cameras 62 is received, converted and processed by the microprocessor 12 whereby the detected visual references are utilized for guidance. Without limitation on the generality of useful visual references, agricultural guidance can be based on edge detection using several methodologies depending on the growth state of the crop and rows in the soil. These include: 1) central row using the crop, soil ridge or straw residue for guidance; 2) edge row using edges on either side of the vehicle; 3) tramline following, using previous vehicle tire or tread tracks; and 4) combinations thereof.
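
By way of illustration only, central-row and edge-row detection can be grounded in a vegetation index such as “excess green” (2G − R − B), scanning each image row for the vegetation-to-soil transition. The Python sketch below assumes the crop lies to the left of the camera view and pixels arrive as RGB tuples; it is not the disclosed implementation, which is not limited to any particular methodology.

```python
def crop_edge_column(rgb_rows, threshold=20):
    """Locate the crop/soil edge in each image row of a look-ahead frame.

    rgb_rows is a list of rows, each a list of (r, g, b) pixel tuples, with
    the crop assumed on the left.  The "excess green" index 2g - r - b
    separates vegetation from soil; the edge is the first column falling
    below threshold.  A real system would smooth and validate these
    estimates frame to frame.
    """
    edges = []
    for row in rgb_rows:
        col = len(row)   # default: no edge found in this row
        for i, (r, g, b) in enumerate(row):
            if 2 * g - r - b < threshold:
                col = i
                break
        edges.append(col)
    return edges
```

A line fitted through the per-row edge columns then provides the look-ahead reference against which cross-track and heading errors are computed.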


III. Agricultural Applications


In operation, various guidance modes are available for adapting to particular field conditions. As used herein, guidance includes a graphical (visual, acoustic, etc.) interface with an operator in order to assist him or her in steering the tractor 6. Guidance also includes autosteering without operator intervention, except possibly through end-of-row turns, which can also be automated. The system 2 is initialized to select operating modes and provide various information about the equipment, such as antenna height, swath width (generally corresponding to the width of the implement 8) and other operating variables. Crop edge detection can also be used for guidance in non-row crops, such as wheat. For example, a combine creates a swath edge, which provides a visual positioning reference for the system 2.



FIG. 4 shows the equipment 4 comprising a tractor 6 and a sprayer 8 operating in a straight-line (A-B) mode with the cameras 62 oriented towards the edges 70 of the crop rows 72 located on either side of the equipment path. In addition to guidance, the system 2 can control the individual operation of the spray boom nozzles 54 whereby the crop rows 72 are properly treated. The microprocessor 12 can be preprogrammed to prioritize the inputs from the GNSS/INS input subsystem 28 and the video input subsystems 60. For example, the video feed can respond more directly to actual (observed) crop row conditions and locations and the microprocessor 12 can be preprogrammed to override the GNSS and INS guidance input accordingly. The inertial guidance input can be particularly helpful for maintaining the equipment on course when both GNSS and visual signals are lost or interfered with. Relatively accurate gyro guidance can be provided until GNSS and visual signals are locked and normal operations are reestablished. Inertial guidance accuracy tends to degrade and cause course drift, which can be corrected by GNSS and/or visual reference position fixes.
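
The prioritization described above can be sketched as a simple fallback chain: the visual input is preferred while valid because it responds to actual crop locations, then GNSS, then short-term inertial guidance. The dictionary-based interface in this Python sketch is purely illustrative.

```python
def select_guidance(visual, gnss, ins):
    """Pick the highest-priority valid guidance input.

    Each input is a dict with 'valid' (bool) and 'steer' (steering command)
    keys; the names are illustrative, not from the disclosure.
    """
    for source, fix in (("visual", visual), ("gnss", gnss), ("ins", ins)):
        if fix.get("valid"):
            return source, fix["steer"]
    return "hold", 0.0   # no valid input: hold the current steering command
```

Because inertial accuracy degrades over time, a fuller implementation would also age out the INS solution and apply position fixes from GNSS or visual references whenever those inputs return.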



FIG. 5 shows a field condition with an interrupted crop row condition 74, which is detected by the camera 62 and causes the microprocessor 12 to alert the operator. The system 2 can be preprogrammed to automatically prioritize GNSS/inertial guidance, or continue in a visual guidance mode by guiding off of the left side crop row edge 70. FIG. 6 shows emerging, individual plants 76, which are detected by the video guidance subsystem 60 and avoided by the equipment 4. FIG. 7 shows a contour mode of operation with visual guidance being provided by the crop row edges 70 whereby the vehicle 4 is guided along a contour guide path 78. FIG. 8 shows a “tramline following” or “match tracks” mode whereby the video guidance subsystem 60 detects and causes the vehicle 4 to follow previous tire tracks 80. FIG. 9 shows a modified vehicle 82 including an implement steering configuration whereby coulters 84 interactively guide an implement 86 and adjust for crosstrack errors of the tractor 6. U.S. Pat. No. 6,865,465 shows such an implement steering system and is incorporated herein by reference. Interactive implement guidance can also be accomplished through a power-articulated hitch tractor-implement connection 10, as described in U.S. Pat. No. 7,162,348, which is also incorporated herein by reference.



FIG. 10 shows a flowchart of a method embodying an aspect of the present invention and commencing with Start 100, whereafter the system 2 is initialized at 102 with various operating parameters, such as field course, pre-existing guidance information, swath width, etc. The track followed will be modeled as a real-time radius of curvature, both for long-term nominally straight (A-B) lines and for short-term contour following operations. Visual lock on a crop row edge 70 is confirmed at step 106 and edge following commences at 108 with autosteering enabled at 110. The GNSS/gyro input subsystems 28 provide nominal turn radius calibration and an X, Y offset from a previously logged track at step 111. Offsets from the measured crop row edge 70 are used to generate steering commands to optimize on-line performance. A field can be completely treated in visual edge-following mode. An affirmative decision at “Field Complete?” decision box 112 leads to an End at 122 and the operation is terminated. Otherwise (negative decision at 112), the operation continues with sampling of the video input subsystem 60 determining if a crop row edge 70 is visible or not at “Edge Visible?” decision box 114, with a positive decision looping back to continue autosteering in a visual guidance mode 110. A negative decision at decision box 114 leads to an “Edge Loss” warning at 116 whereafter GNSS/INS guidance is prioritized at 118 and autosteering continues based on GNSS/INS guidance using the last XY offset if a track log is available or curvature if a track log is not available at 120. If visual lock on a crop row edge 70 is lost, the track will be forward-projected and used by the GNSS/gyro system 28 to enable continuing tracking on this path. “Field Complete?” decision box 112 leads to either operation termination at 122 (affirmative decision) or continuation of GNSS/inertial autosteering 120 (negative decision). The system 2 also continues to look for a crop row edge 70.
When detected (affirmative decision at 114) the system 2 resumes autosteering from an edge visual at 110. Original planting operations may require GNSS/inertial guidance for lack of visual crop row edges, unless previous season tracks can be visually followed. Emerging crop operations can utilize visual, GNSS and inertial guidance, all integrated by the controller 12. In full crop operations guidance within the crop rows will often be possible with visual input only, supplemented as necessary by GNSS/INS guidance in end-of-row (e.g., headlands) turns.
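
The FIG. 10 logic can be summarized as a small mode switch: visual edge following while the edge is visible, with GNSS/INS fallback using the last logged XY offset when a track log exists, or the recent line curvature otherwise. The following Python sketch uses illustrative names and is not the patented method itself.

```python
class GuidanceModeSwitch:
    """Sketch of the FIG. 10 mode logic (names illustrative).

    While the crop row edge is visible, steer visually and keep calibrating
    the XY offset from the logged GNSS track.  On edge loss, fall back to
    GNSS/INS guidance seeded from the last offset if a track log exists,
    otherwise from the recent line curvature.
    """

    def __init__(self):
        self.mode = "visual"
        self.last_offset = None   # XY offset from a previously logged track

    def step(self, edge_visible, gnss_offset=None, curvature=0.0):
        if edge_visible:
            self.mode = "visual"
            if gnss_offset is not None:
                self.last_offset = gnss_offset   # calibrate while locked
            return ("visual", None)
        self.mode = "gnss_ins"
        if self.last_offset is not None:
            return ("gnss_ins", ("offset", self.last_offset))
        return ("gnss_ins", ("curvature", curvature))
```

Once the edge is redetected, a call with edge_visible=True returns the switch to visual guidance, matching the loop back to step 110 in the flowchart.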


Other applications can benefit from the system and method of the present invention. For example, another exemplary application involves machine control in logistics operations using visual references for controlling such operations as storage and retrieval. In warehousing and dockside environments, GNSS signals are often compromised by structures and cargo containers in which the equipment operates. Visual references can therefore provide primary guidance for navigating and controlling logistics vehicles (e.g., forklifts and cranes), with GNSS and inertial supplementation. Such references can comprise, for example, painted line edges for fine positioning, character recognition for identifying slots for cargo containers and other markings on structures, shelving, containers, etc. As with the agricultural applications discussed above, relatively basic, low-end GNSS/gyro equipment may provide acceptable performance when combined with the relative precision of a video input subsystem. Databases can be maintained with information associating reference images with GPS-defined positioning information.


It is to be understood that the invention can be embodied in various forms, and is not to be limited to the examples discussed above.

Claims
  • 1. A method of guiding a vehicle using visual references along a guide path thereof and GNSS positioning, which method comprises the steps of:
    providing said vehicle with a GNSS positioning subsystem including a GNSS antenna and a GNSS receiver connected to said antenna;
    providing said GNSS subsystem with a position and heading sensor including multiple antennas;
    providing a visual reference subsystem including a camera mounted on said vehicle and oriented in its direction of travel;
    providing said vehicle with a microprocessor connected to said GNSS and visual reference subsystems;
    providing said GNSS subsystem with an inertial direction sensor connected to said microprocessor;
    inputting to said microprocessor GNSS positioning information from said receiver;
    inputting to said microprocessor visual reference information from said visual reference subsystem;
    providing an inertial navigation subsystem (INS);
    generating signals corresponding to inertial forces on said vehicle with said INS;
    inputting said INS signals to said microprocessor;
    utilizing with said microprocessor said INS signals in deriving a guidance solution;
    guiding said vehicle through a field of row crops;
    providing visual references for said vehicle comprising edges of said crop rows or furrows;
    visually locking on said crop row edges or furrows with said camera;
    outputting visual reference subsystem signals to said microprocessor corresponding to said camera lock on said crop row edges or furrows;
    logging vehicle track information with said microprocessor;
    providing said vehicle with an autosteering subsystem;
    calibrating with said microprocessor nominal turn radius information for said vehicle using information derived from said GNSS and INS subsystems and offsets from a previously logged track;
    testing a crop row edge or furrow visibility condition with said microprocessor based on visual reference subsystem signals from said camera;
    detecting a crop row edge or furrow visibility loss condition with said microprocessor;
    based on said visibility loss condition, prioritizing GNSS and INS guidance using an offset from previously logged GNSS tracks if a track log is available or recent line curvatures if a track log is not available;
    deriving a guidance solution with said microprocessor based on said GNSS, said INS signals, and visual reference subsystem inputs in conjunction with each other;
    guiding said vehicle with said guidance solution;
    providing said vehicle with a tractor and an implement connected thereto via an articulated hitch connection;
    providing said tractor and implement with independent steering subsystems;
    a controller providing independent steering commands to said tractor and implement steering subsystems;
    providing said implement with an implement GNSS guidance subsystem;
    inputting GNSS-derived positioning information from said implement GNSS guidance subsystem to said microprocessor;
    independently guiding said implement with said microprocessor and said implement steering subsystem;
    providing said vehicle with a spray boom including multiple spray nozzles;
    programming said microprocessor to independently and selectively control said spray nozzles;
    independently and selectively controlling said spray nozzles with said microprocessor based on the positions of said spray nozzles derived from GNSS, visual reference and INS positioning information;
    deriving a nominal turn radius for said vehicle; and
    guiding said vehicle through an end-of-row turn using said nominal turn radius.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority in U.S. Provisional Patent Application No. 61/027,478, filed Feb. 10, 2008, which is incorporated herein by reference.

US Referenced Citations (412)
Number Name Date Kind
3585537 Rennick et al. Jun 1971 A
3596228 Reed, Jr. et al. Jul 1971 A
3727710 Sanders et al. Apr 1973 A
3815272 Marleau Jun 1974 A
3899028 Morris et al. Aug 1975 A
3987456 Gelin Oct 1976 A
4132272 Holloway et al. Jan 1979 A
4170776 MacDoran Oct 1979 A
4180133 Collogan et al. Dec 1979 A
4398162 Nagai Aug 1983 A
4453614 Allen et al. Jun 1984 A
4529990 Brunner Jul 1985 A
4637474 Leonard Jan 1987 A
4667203 Counselman, III May 1987 A
4689556 Cedrone Aug 1987 A
4694264 Owens et al. Sep 1987 A
4710775 Coe Dec 1987 A
4714435 Stipanuk et al. Dec 1987 A
4739448 Rowe et al. Apr 1988 A
4751512 Longaker Jun 1988 A
4769700 Pryor Sep 1988 A
4785463 Janc et al. Nov 1988 A
4802545 Nystuen et al. Feb 1989 A
4812991 Hatch Mar 1989 A
4813991 Hale Mar 1989 A
4858132 Holmquist Aug 1989 A
4864320 Munson et al. Sep 1989 A
4894662 Counselman Jan 1990 A
4916577 Dawkins Apr 1990 A
4918607 Wible Apr 1990 A
4963889 Hatch Oct 1990 A
5031704 Fleischer et al. Jul 1991 A
5100229 Lundberg et al. Mar 1992 A
5134407 Lorenz et al. Jul 1992 A
5148179 Allison Sep 1992 A
5152347 Miller Oct 1992 A
5155490 Spradley et al. Oct 1992 A
5155493 Thursby et al. Oct 1992 A
5156219 Schmidt et al. Oct 1992 A
5165109 Han et al. Nov 1992 A
5173715 Rodal et al. Dec 1992 A
5177489 Hatch Jan 1993 A
5185610 Ward et al. Feb 1993 A
5191351 Hofer et al. Mar 1993 A
5202829 Geier Apr 1993 A
5207239 Schwitalia May 1993 A
5239669 Mason et al. Aug 1993 A
5255756 Follmer et al. Oct 1993 A
5268695 Dentinger et al. Dec 1993 A
5293170 Lorenz et al. Mar 1994 A
5294970 Dornbusch et al. Mar 1994 A
5296861 Knight Mar 1994 A
5311149 Wagner et al. May 1994 A
5323322 Mueller et al. Jun 1994 A
5334987 Teach Aug 1994 A
5343209 Sennott et al. Aug 1994 A
5345245 Ishikawa et al. Sep 1994 A
5359332 Allison et al. Oct 1994 A
5361212 Class et al. Nov 1994 A
5365447 Dennis Nov 1994 A
5369589 Steiner Nov 1994 A
5375059 Kyrtsos et al. Dec 1994 A
5390124 Kyrtsos Feb 1995 A
5390125 Sennott et al. Feb 1995 A
5390207 Fenton et al. Feb 1995 A
5416712 Geier et al. May 1995 A
5442363 Remondi Aug 1995 A
5444453 Lalezari Aug 1995 A
5451964 Babu Sep 1995 A
5467282 Dennis Nov 1995 A
5471217 Hatch et al. Nov 1995 A
5476147 Fixemer Dec 1995 A
5477228 Tiwari et al. Dec 1995 A
5477458 Loomis Dec 1995 A
5490073 Kyrtsos Feb 1996 A
5491636 Robertson et al. Feb 1996 A
5495257 Loomis Feb 1996 A
5504482 Schreder Apr 1996 A
5511623 Frasier Apr 1996 A
5519620 Talbot et al. May 1996 A
5521610 Rodal May 1996 A
5523761 Gildea Jun 1996 A
5534875 Diefes et al. Jul 1996 A
5543804 Buchler et al. Aug 1996 A
5546093 Gudat et al. Aug 1996 A
5548293 Cohen et al. Aug 1996 A
5561432 Knight Oct 1996 A
5563786 Torii Oct 1996 A
5568152 Janky et al. Oct 1996 A
5568162 Samsel et al. Oct 1996 A
5583513 Cohen Dec 1996 A
5589835 Gildea et al. Dec 1996 A
5592382 Colley Jan 1997 A
5596328 Stangeland et al. Jan 1997 A
5600670 Turney Feb 1997 A
5604506 Rodal Feb 1997 A
5608393 Hartman Mar 1997 A
5610522 Locatelli et al. Mar 1997 A
5610616 Vallot et al. Mar 1997 A
5610845 Slabinski Mar 1997 A
5612883 Shaffer et al. Mar 1997 A
5615116 Gudat et al. Mar 1997 A
5617100 Akiyoshi et al. Apr 1997 A
5617317 Ignagni Apr 1997 A
5621646 Enge et al. Apr 1997 A
5638077 Martin Jun 1997 A
5644139 Allen Jul 1997 A
5664632 Frasier Sep 1997 A
5673491 Brenna et al. Oct 1997 A
5680140 Loomis Oct 1997 A
5684696 Rao et al. Nov 1997 A
5706015 Chen et al. Jan 1998 A
5717593 Gvilli Feb 1998 A
5725230 Walkup Mar 1998 A
5731786 Abraham et al. Mar 1998 A
5739785 Allison Apr 1998 A
5757316 Buchler May 1998 A
5765123 Nimura et al. Jun 1998 A
5777578 Chang et al. Jul 1998 A
5810095 Orbach et al. Sep 1998 A
5828336 Yunck et al. Oct 1998 A
5838562 Gudat et al. Nov 1998 A
5854987 Sekine et al. Dec 1998 A
5862501 Talbot et al. Jan 1999 A
5864315 Welles et al. Jan 1999 A
5864318 Cozenza et al. Jan 1999 A
5875408 Bendett et al. Feb 1999 A
5877725 Kalafus Mar 1999 A
5890091 Talbot et al. Mar 1999 A
5899957 Loomis May 1999 A
5906645 Kagawa et al. May 1999 A
5912798 Chu Jun 1999 A
5914685 Kozlov et al. Jun 1999 A
5917448 Mickelson Jun 1999 A
5918558 Susag Jul 1999 A
5919242 Greatline et al. Jul 1999 A
5923270 Sampo et al. Jul 1999 A
5926079 Heine et al. Jul 1999 A
5927603 McNabb Jul 1999 A
5928309 Korver et al. Jul 1999 A
5929721 Munn et al. Jul 1999 A
5933110 Tang Aug 1999 A
5935183 Sahm et al. Aug 1999 A
5936573 Smith Aug 1999 A
5940026 Popeck Aug 1999 A
5941317 Mansur Aug 1999 A
5943008 Van Dusseldorp Aug 1999 A
5944770 Enge et al. Aug 1999 A
5945917 Harry Aug 1999 A
5949371 Nichols Sep 1999 A
5955973 Anderson Sep 1999 A
5956250 Gudat et al. Sep 1999 A
5969670 Kalafus et al. Oct 1999 A
5987383 Keller Nov 1999 A
6014101 Loomis Jan 2000 A
6014608 Seo Jan 2000 A
6018313 Engelmayer et al. Jan 2000 A
6023239 Kovach Feb 2000 A
6052647 Parkinson et al. Apr 2000 A
6055477 McBurney et al. Apr 2000 A
6057800 Yang et al. May 2000 A
6061390 Meehan et al. May 2000 A
6061632 Dreier May 2000 A
6062317 Gharsalli May 2000 A
6069583 Silvestrin et al. May 2000 A
6076612 Carr et al. Jun 2000 A
6081171 Ella Jun 2000 A
6100842 Dreier et al. Aug 2000 A
6104978 Harrison et al. Aug 2000 A
6122595 Varley et al. Sep 2000 A
6128574 Diekhans Oct 2000 A
6144335 Rogers Nov 2000 A
6191730 Nelson, Jr. Feb 2001 B1
6191733 Dizchavez Feb 2001 B1
6198430 Hwang et al. Mar 2001 B1
6198992 Winslow Mar 2001 B1
6199000 Keller et al. Mar 2001 B1
6205401 Pickhard et al. Mar 2001 B1
6215828 Signell et al. Apr 2001 B1
6229479 Kozlov et al. May 2001 B1
6230097 Dance et al. May 2001 B1
6233511 Berger et al. May 2001 B1
6236916 Staub et al. May 2001 B1
6236924 Motz May 2001 B1
6253160 Hanseder Jun 2001 B1
6256583 Sutton Jul 2001 B1
6259398 Riley Jul 2001 B1
6266595 Greatline et al. Jul 2001 B1
6285320 Olster et al. Sep 2001 B1
6292132 Wilson Sep 2001 B1
6307505 Green Oct 2001 B1
6313788 Wilson Nov 2001 B1
6314348 Winslow Nov 2001 B1
6325684 Knight Dec 2001 B1
6336066 Pellenc et al. Jan 2002 B1
6345231 Quincke Feb 2002 B2
6356602 Rodal et al. Mar 2002 B1
6377889 Soest Apr 2002 B1
6380888 Kucik Apr 2002 B1
6389345 Phelps May 2002 B2
6392589 Rogers et al. May 2002 B1
6397147 Whitehead May 2002 B1
6415229 Diekhans Jul 2002 B1
6418031 Archambeault Jul 2002 B1
6421003 Riley et al. Jul 2002 B1
6424915 Fukuda et al. Jul 2002 B1
6431576 Viaud et al. Aug 2002 B1
6434462 Bevly et al. Aug 2002 B1
6445983 Dickson et al. Sep 2002 B1
6445990 Manring Sep 2002 B1
6449558 Small Sep 2002 B1
6463091 Zhodzicshsky et al. Oct 2002 B1
6463374 Keller et al. Oct 2002 B1
6466871 Reisman et al. Oct 2002 B1
6469663 Whitehead et al. Oct 2002 B1
6484097 Fuchs et al. Nov 2002 B2
6501422 Nichols Dec 2002 B1
6515619 McKay, Jr. Feb 2003 B1
6516271 Upadhyaya et al. Feb 2003 B2
6539303 McClure et al. Mar 2003 B2
6542077 Joao Apr 2003 B2
6549835 Deguchi Apr 2003 B2
6553299 Keller et al. Apr 2003 B1
6553300 Ma et al. Apr 2003 B2
6553311 Ahearn et al. Apr 2003 B2
6570534 Cohen et al. May 2003 B2
6577952 Geier et al. Jun 2003 B2
6587761 Kumar Jul 2003 B2
6606542 Hauwiller et al. Aug 2003 B2
6611228 Toda et al. Aug 2003 B2
6611754 Klein Aug 2003 B2
6611755 Coffee et al. Aug 2003 B1
6622091 Perlmutter et al. Sep 2003 B2
6631394 Ronkka et al. Oct 2003 B1
6631916 Miller Oct 2003 B1
6643576 O'Connor et al. Nov 2003 B1
6646603 Dooley Nov 2003 B2
6657875 Zeng et al. Dec 2003 B1
6671587 Hrovat et al. Dec 2003 B2
6686878 Lange Feb 2004 B1
6688403 Bernhardt et al. Feb 2004 B2
6703973 Nichols Mar 2004 B1
6711501 McClure et al. Mar 2004 B2
6721638 Zeitler Apr 2004 B2
6732024 Rekow et al. May 2004 B2
6744404 Whitehead et al. Jun 2004 B1
6754584 Pinto et al. Jun 2004 B2
6774843 Takahashi Aug 2004 B2
6792380 Toda Sep 2004 B2
6819269 Flick Nov 2004 B2
6822314 Beasom Nov 2004 B2
6865465 McClure Mar 2005 B2
6865484 Miyasaka et al. Mar 2005 B2
6879283 Bird et al. Apr 2005 B1
6900992 Kelly et al. May 2005 B2
6922635 Rorabaugh Jul 2005 B2
6931233 Tso et al. Aug 2005 B1
6961018 Heppe et al. Nov 2005 B2
6967538 Woo Nov 2005 B2
6990399 Hrazdera et al. Jan 2006 B2
7006032 King et al. Feb 2006 B2
7026982 Toda et al. Apr 2006 B2
7027918 Zimmerman et al. Apr 2006 B2
7031725 Rorabaugh Apr 2006 B2
7089099 Shostak et al. Aug 2006 B2
7142956 Heiniger et al. Nov 2006 B2
7155335 Rennels Dec 2006 B2
7162348 McClure et al. Jan 2007 B2
7191061 McKay et al. Mar 2007 B2
7221314 Brabec et al. May 2007 B2
7231290 Steichen et al. Jun 2007 B2
7248211 Hatch et al. Jul 2007 B2
7271766 Zimmerman et al. Sep 2007 B2
7277784 Weiss Oct 2007 B2
7292186 Miller et al. Nov 2007 B2
7324915 Altmann Jan 2008 B2
7358896 Gradincic et al. Apr 2008 B2
7373231 McClure et al. May 2008 B2
7388539 Whitehead et al. Jun 2008 B2
7395769 Jensen Jul 2008 B2
7428259 Wang et al. Sep 2008 B2
7437230 McClure et al. Oct 2008 B2
7451030 Eglington et al. Nov 2008 B2
7479900 Horstemeyer Jan 2009 B2
7505848 Flann et al. Mar 2009 B2
7522099 Zhodzishsky et al. Apr 2009 B2
7522100 Yang et al. Apr 2009 B2
7571029 Dai et al. Aug 2009 B2
7689354 Heiniger et al. Mar 2010 B2
20030014171 Ma et al. Jan 2003 A1
20030093210 Kondo et al. May 2003 A1
20030187560 Keller et al. Oct 2003 A1
20030208319 Ell et al. Nov 2003 A1
20040039514 Steichen et al. Feb 2004 A1
20040124605 McClure et al. Jul 2004 A1
20040212533 Whitehead Oct 2004 A1
20050080559 Ishibashi et al. Apr 2005 A1
20050225955 Grebenkemper et al. Oct 2005 A1
20050265494 Goodlings Dec 2005 A1
20060031664 Wilson et al. Feb 2006 A1
20060149472 Han et al. Jul 2006 A1
20060167600 Nelson et al. Jul 2006 A1
20060206246 Walker Sep 2006 A1
20060215739 Williamson et al. Sep 2006 A1
20070078570 Dai et al. Apr 2007 A1
20070088447 Stothert et al. Apr 2007 A1
20070121708 Simpson May 2007 A1
20070205940 Yang et al. Sep 2007 A1
20070285308 Bauregger et al. Dec 2007 A1
20080129586 Martin Jun 2008 A1
20080204312 Euler Aug 2008 A1
20090171583 DiEsposti Jul 2009 A1
20090174597 DiLellio et al. Jul 2009 A1
20090174622 Kanou Jul 2009 A1
20090177395 Stelpstra Jul 2009 A1
20090177399 Park et al. Jul 2009 A1
20090259397 Stanton Oct 2009 A1
20090259707 Martin et al. Oct 2009 A1
20090262014 DiEsposti Oct 2009 A1
20090262018 Vasilyev et al. Oct 2009 A1
20090262974 Lithopoulos Oct 2009 A1
20090265054 Basnayake Oct 2009 A1
20090265101 Jow Oct 2009 A1
20090265104 Shroff Oct 2009 A1
20090273372 Brenner Nov 2009 A1
20090273513 Huang Nov 2009 A1
20090274079 Bhatia et al. Nov 2009 A1
20090274113 Katz Nov 2009 A1
20090276155 Jeerage et al. Nov 2009 A1
20090295633 Pinto et al. Dec 2009 A1
20090295634 Yu et al. Dec 2009 A1
20090299550 Baker Dec 2009 A1
20090322597 Medina Herrero et al. Dec 2009 A1
20090322598 Fly et al. Dec 2009 A1
20090322600 Whitehead et al. Dec 2009 A1
20090322601 Ladd et al. Dec 2009 A1
20090322606 Gronemeyer Dec 2009 A1
20090326809 Colley et al. Dec 2009 A1
20100013703 Tekawy et al. Jan 2010 A1
20100026569 Amidi Feb 2010 A1
20100030470 Wang et al. Feb 2010 A1
20100039316 Gronemeyer et al. Feb 2010 A1
20100039318 Kmiecik Feb 2010 A1
20100039320 Boyer et al. Feb 2010 A1
20100039321 Abraham Feb 2010 A1
20100060518 Bar-Sever et al. Mar 2010 A1
20100063649 Wu et al. Mar 2010 A1
20100084147 Aral Apr 2010 A1
20100085249 Ferguson et al. Apr 2010 A1
20100085253 Ferguson et al. Apr 2010 A1
20100103033 Roh Apr 2010 A1
20100103034 Tobe et al. Apr 2010 A1
20100103038 Yeh et al. Apr 2010 A1
20100103040 Broadbent Apr 2010 A1
20100106414 Whitehead Apr 2010 A1
20100106445 Kondoh Apr 2010 A1
20100109944 Whitehead et al. May 2010 A1
20100109945 Roh May 2010 A1
20100109947 Rintanen May 2010 A1
20100109948 Razoumov et al. May 2010 A1
20100109950 Roh May 2010 A1
20100111372 Zheng et al. May 2010 A1
20100114483 Heo et al. May 2010 A1
20100117894 Velde et al. May 2010 A1
20100117899 Papadimitratos et al. May 2010 A1
20100117900 Van Diggelen et al. May 2010 A1
20100121577 Zhang et al. May 2010 A1
20100124210 Lo May 2010 A1
20100124212 Lo May 2010 A1
20100134354 Lennen Jun 2010 A1
20100149025 Meyers et al. Jun 2010 A1
20100149030 Verma et al. Jun 2010 A1
20100149033 Abraham Jun 2010 A1
20100149034 Chen Jun 2010 A1
20100149037 Cho Jun 2010 A1
20100150284 Fielder et al. Jun 2010 A1
20100152949 Nunan et al. Jun 2010 A1
20100156709 Zhang et al. Jun 2010 A1
20100156712 Pisz et al. Jun 2010 A1
20100156718 Chen Jun 2010 A1
20100159943 Salmon Jun 2010 A1
20100161179 McClure et al. Jun 2010 A1
20100161211 Chang Jun 2010 A1
20100161568 Xiao Jun 2010 A1
20100171660 Shyr et al. Jul 2010 A1
20100171757 Melamed Jul 2010 A1
20100185364 McClure Jul 2010 A1
20100185366 Heiniger et al. Jul 2010 A1
20100185389 Woodard Jul 2010 A1
20100188285 Collins Jul 2010 A1
20100188286 Bickerstaff et al. Jul 2010 A1
20100189163 Burgi et al. Jul 2010 A1
20100207811 Lackey Aug 2010 A1
20100210206 Young Aug 2010 A1
20100211248 Craig et al. Aug 2010 A1
20100211315 Toda Aug 2010 A1
20100211316 DaSilva Aug 2010 A1
20100220004 Malkos et al. Sep 2010 A1
20100220008 Conover et al. Sep 2010 A1
20100222076 Poon et al. Sep 2010 A1
20100225537 Abraham Sep 2010 A1
20100228408 Ford Sep 2010 A1
20100228480 Lithgow et al. Sep 2010 A1
20100231443 Whitehead Sep 2010 A1
20100231446 Marshall et al. Sep 2010 A1
20100232351 Chansarkar et al. Sep 2010 A1
20100235093 Chang Sep 2010 A1
20100238976 Young Sep 2010 A1
20100241347 King et al. Sep 2010 A1
20100241353 Park Sep 2010 A1
20100241441 Page et al. Sep 2010 A1
20100241864 Kelley et al. Sep 2010 A1
Foreign Referenced Citations (9)
Number Date Country
WO9836288 Aug 1998 WO
WO2009066183 May 2009 WO
WO2009082745 Jul 2009 WO
WO2005119386 Oct 2009 WO
WO2009126587 Oct 2009 WO
WO2009148638 Dec 2009 WO
WO2010005945 Jan 2010 WO
WO2010104782 Sep 2010 WO
WO2011014431 Feb 2011 WO
Non-Patent Literature Citations (39)
Entry
“International Search Report and Written Opinion”, International Searching Authority, PCT/US08/88070, Feb. 9, 2009.
“International Search Report”, PCT/US09/039686, (May 26, 2009).
“International Search Report”, PCT/US09/33693, (Mar. 30, 2009).
“International Search Report”, PCT/AU/2008/000002, (Feb. 28, 2008).
“International Search Report”, PCT/US09/49776, (Aug. 11, 2009).
Keicher, R. et al., “Automatic Guidance for Agricultural Vehicles in Europe”, Computers and Electronics in Agriculture, vol. 25, (Jan. 2000),169-194.
Bevly, David M., “Comparison of Ins v. Carrier-Phase DGPS for Attitude Determination in the Control of Off-Road Vehicles”, ION 55th Annual Meeting; Jun. 28-30, 1999; Cambridge, Massachusetts; pp. 497-504.
Ward, Phillip W., “Performance Comparisons Between FLL, PLL and a Novel FLL-Assisted-PLL Carrier Tracking Loop Under RF Interference Conditions”, 11th Int. Tech Meeting of the Satellite Division of the U.S. Inst. of Navigation, Nashville, TN, Sep. 15-18, 783-795, 1998.
Irsigler, M. et al., “PLL Tracking Performance in the Presence of Oscillator Phase Noise”, GPS Solutions, vol. 5, No. 4, pp. 45-57, 2002.
Kaplan, E D., “Understanding GPS: Principles and Applications”, Artech House, MA, 1996.
“ISO 11783, Part 7, Draft Amendment 1, Annex, Paragraphs B.6 and B.7”, ISO 11783-7:2004 DAM 1, ISO, Mar. 8, 2004.
“International Search Report and Written Opinion”, PCT/US2004/015678, filed May 17, 2004, (Jun. 21, 2005).
Last, J. D., et al., “Effect of skywave interference on coverage of radiobeacon DGPS stations”, IEEE Proc.-Radar, Sonar Navig., vol. 144, No. 3, Jun. 1997, pp. 163-168.
Park, Chansik et al., “Integer Ambiguity Resolution for GPS Based Attitude Determination System”, SICE 1998, Jul. 29-31, Chiba, 1115-1120.
Han, Shaowei et al., “Single-Epoch Ambiguity Resolution for Real-Time GPS Attitude Determination with the Aid of One-Dimensional Optical Fiber Gyro”, GPS Solutions, vol. 3, No. 1, pp. 5-12 (1999), John Wiley & Sons, Inc.
Xu, Jiangning et al., “An EHW Architecture for Real-Time GPS Attitude Determination Based on Parallel Genetic Algorithm”, The Computer Society, Proceedings of the 2002 NASA/DoD Conference on Evolvable Hardware (EH'02), (2002).
Lin, Dai et al., “Real-time Attitude Determination for Microsatellite by Lambda Method Combined with Kalman Filtering”, A Collection of the 22nd AIAA International Communications Satellite Systems Conference and Exhibit Technical Papers, vol. 1, Monterey, California, American Institute of Aeronautics and Astronautics, Inc., (May 2004), 136-143.
“Orthman Manufacturing Co., www.orthman.com/htm;guidance.htm”, 2004, regarding the “Tracer Quick-Hitch”.
Parkinson, Bradford W., et al., “Global Positioning System: Theory and Applications, vol. II”, Bradford W. Parkinson and James J. Spilker, Jr., eds., AIAA, Reston, VA, USA, (1995), pp. 3-50.
“International Search Report,”, PCT/US09/34376, (Nov. 2, 2009).
Takac, Frank et al., “SmartRTK: A Novel Method of Processing Standardised RTCM Network RTK Information for High Precision Positioning”, Proceedings of ENC GNSS 2008, Toulouse, France, (Apr. 22, 2008).
“International Search Report / Written Opinion”, PCT/US09/63594.
“International Search Report”, PCT/US09/60668.
“ISR Notification & Written Opinion”, PCT/US10/26509, (Apr. 20, 2010),1-7.
“Notification Concerning Transmittal of International Report on Patentability (PCT)”, PCT/US2009/049776, (Jan. 20, 2011).
“Notification of Transmittal of International Preliminary Report on Patentability”, International Application No. PCT/US09/039686, (Oct. 21, 2010).
“International Search Report and Written Opinion”, PCT/US2010/043094, (Sep. 17, 2010).
“Notification of Publication of International Application”, WO 2011/014431, (Feb. 3, 2011).
“International Search Report and Written Opinion”, PCT/US08/81727, (Dec. 23, 2008).
“International Search Report”, PCT/US09/33567, (Feb. 9, 2009).
“International Search Report and Written Opinion”, PCT/IB2008/003796, (Jul. 15, 2009).
“International Search Report”, PCT/US09/067693, (Jan. 26, 2010).
“International Search Report and Written Opinion”, PCT/US10/21334, (Mar. 12, 2010).
Rho, Hyundho et al., “Dual-Frequency GPS Precise Point Positioning with WADGPS Corrections”, [retrieved on May 18, 2010]. Retrieved from the Internet: <URL: http://gauss.gge.unb.ca/papers.pdf/iongnss2005.rho.wadgps.pdf>, (Jul. 12, 2006).
“Eurocontrol, Pegasus Technical Notes on SBAS”, report [online], Dec. 7, 2004 [retrieved on May 18, 2010]. Retrieved from the Internet: <URL: http://www.icao.int/icao/en/ro/nacc/meetings/2004/gnss/documentation/Pegasus/tn.pdf>, (Dec. 7, 2004), p. 89, paras [0001]-[0004].
“Arinc Engineering Services, Interface Specification IS-GPS-200, Revision D”, Online [retrieved on May 18, 2010]. Retrieved from the Internet: <URL: http://www.navcen.uscg.gov/gps/geninfo/IS-GPS-200D.pdf>, (Dec. 7, 2004), p. 168, para [0001].
Schaer, et al., “Determination and Use of GPS Differential Code Bias Values”, Presentation [online]. Retrieved May 18, 2010. Retrieved from the Internet: <URL: http://nng.esoc.esa.de/ws2006/REPR2.pdf>, (May 8, 2006).
“International Search Report”, PCT/US10/26509, (Apr. 20, 2010).
“International Preliminary Report on Patentability and Written Opinion”, PCT/US2009/033693, Aug. 19, 2010.
Related Publications (1)
Number Date Country
20090204281 A1 Aug 2009 US
Provisional Applications (1)
Number Date Country
61027478 Feb 2008 US