The present invention relates generally to auto-steering systems based on Global Navigation Satellite System (GNSS) navigation, and in particular to automatically calibrating the auto-steering system for accurate control.
Current automated agricultural vehicle guidance and steering systems use a combination of GNSS receivers and inertial sensors mounted on the vehicle to automatically steer a specific reference point (i.e., a control point) on the vehicle. In one example, the vehicle guidance system steers a control point located at the center of a vehicle rear axle along a desired path projected onto the ground.
Guidance systems need accurate ground position information. However, the guidance system measures the GNSS receiver antenna position not the vehicle control point. Therefore, the guidance system needs an accurate way to project the measured GNSS position down to the ground control point. This is commonly referred to as terrain compensation. This projection can be estimated from the inertial sensors or from a fusion process of the inertial sensors and GNSS receiver. Part of the projection process requires knowledge of the mounting orientation of the inertial sensors relative to the vehicle body. The process of measuring or estimating the inertial sensor offsets relative to the vehicle is commonly part of the inertial sensor calibration process.
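As an illustration of the terrain compensation step described above, projecting the measured antenna position down to the ground control point can be sketched as follows. This is a hypothetical sketch: the coordinate frames, function names, and rotation convention are assumptions for illustration, not taken from this document.

```python
import numpy as np

def project_antenna_to_ground(p_ant, roll, pitch, heading, lever_arm):
    """Project a GNSS antenna position down to a ground control point.

    p_ant: antenna position in a local (north, east, down) frame, metres.
    roll, pitch, heading: vehicle body attitude in radians.
    lever_arm: vector from control point to antenna in the body frame
               (forward, right, down), metres.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(heading), np.sin(heading)
    # Body-to-navigation rotation matrix (ZYX yaw-pitch-roll convention).
    R = np.array([
        [cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
        [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
        [-sp,   cp*sr,            cp*cr],
    ])
    # Subtract the rotated lever arm to move from the antenna
    # position to the ground control point.
    return p_ant - R @ lever_arm
```

If the attitude used here carries an uncorrected mounting bias, the rotated lever arm, and therefore the projected control point, is wrong, which is why accurate mounting offsets matter.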
The inertial sensor offsets are assumed to be fixed (or constant) and are typically measured once at the time of installation. The inertial sensors are typically part of the steering control unit which is installed at some fixed location on the vehicle.
If the mounting orientation of the steering control unit and the included inertial sensors change, the new mounting offsets would need to be updated. Otherwise the GNSS projection process produces erroneous projections. Thus, changing the steering control unit orientation typically requires re-running the calibration process.
There are various approaches to measuring the mounting offsets of the inertial sensors (typically referred to as roll and pitch biases). One calibration method records the estimated attitude measurements from the inertial sensors while stationary or traveling straight in one direction and then repeats the same process but facing 180 degrees in the other direction over the same section of the terrain. Comparing these two sets of readings allows the attitude component due to the terrain to be eliminated, leaving only the attitude component due to the mounting offsets (i.e. the roll and pitch bias). This calibration process carries out a distinct series of vehicle operational steps to identify the mounting biases. This means that the biases cannot be estimated during typical operation, and a separate dedicated process needs to be carried out.
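The two-direction method above can be sketched in a few lines: the terrain contribution to the measured attitude changes sign when the vehicle faces the opposite direction, while the mounting bias does not, so the average of the two readings isolates the bias. Function and variable names are illustrative, not from the source.

```python
def mounting_bias_from_two_passes(attitude_fwd, attitude_rev):
    """Estimate (roll, pitch) mounting bias from two attitude readings
    taken over the same terrain, facing 180 degrees apart.

    Measured attitude = terrain component + mounting bias. Reversing
    direction flips the sign of the terrain component only, so the
    average of the two readings is the bias.
    """
    roll_f, pitch_f = attitude_fwd
    roll_r, pitch_r = attitude_rev
    roll_bias = (roll_f + roll_r) / 2.0
    pitch_bias = (pitch_f + pitch_r) / 2.0
    return roll_bias, pitch_bias
```

For example, a 2-degree terrain roll with a 0.5-degree bias reads 2.5 degrees one way and -1.5 degrees the other; the average recovers the 0.5-degree bias.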
Some automatic steering systems combine the steering actuator and controller, including the inertial sensor subsystem, in a single unit which is directly mounted to the steering wheel or column. In some vehicles, the steering column rake angle can be adjusted. It may not be desirable to have to manually re-run the calibration process in these vehicles every time the steering column rake angle is adjusted.
The included drawings are for illustrative purposes and serve to provide examples of possible structures and operations for the disclosed inventive systems, apparatus, methods and computer-readable storage media. These drawings in no way limit any changes in form and detail that may be made by one skilled in the art without departing from the spirit and scope of the disclosed implementations.
A steering control system automatically estimates and compensates for a relative attitude offset between a vehicle chassis or body and a steering system inertial sensor subsystem. The steering system automatically re-calibrates the mounting offsets during normal vehicle operation without user intervention. This is particularly useful with steering control units mounted on steering wheels or columns, where a user can freely adjust the steering column rake angle on the fly.
An inertial sensor subsystem 108 may include any combination of gyroscopes and accelerometers that are usually mounted at a second location on vehicle 102 different from the first location of GNSS subsystem 106. Inertial sensor subsystem 108 is typically mounted to vehicle 102 with an unknown and/or variable pitch mounting offset relative to the orientation of vehicle 102.
A steering control unit 118 may include one or more processors coupled to both GNSS subsystem 106 and inertial subsystem 108. Steering control unit 118 uses position data received from GNSS subsystem 106 and attitude data received from inertial sensor subsystem 108 to steer reference point 114 on vehicle 102 along a desired path over ground surface 116. In one embodiment, inertial sensor subsystem 108 is located in steering control unit 118 and mounted to vehicle 102 at a fixed position and orientation. In another embodiment, inertial sensor subsystem 108 is located in steering control unit 118 and mounted to a steering column of vehicle 102 which has an adjustable rake angle.
Control unit 118 needs to know the attitude of vehicle body axis 112 in order to accurately project the GNSS position measured by GNSS subsystem 106 down to reference point 114. In order to accurately project the position of GNSS subsystem 106 down to control point 114, control unit 118 calculates the attitude mounting offset between inertial axis 110 of inertial sensor subsystem 108 and vehicle body axis 112 of vehicle 102.
Control unit 118 automatically calibrates inertial sensor subsystem 108 for inertial sensor mounting attitude offsets without an operator having to manually measure or manually run specific calibration steps. Control unit 118 estimates the inertial mounting offset of inertial sensor subsystem 108 any time its position or orientation is changed, such as during vehicle use or at time of installation.
Referring now to
Second map 136 includes an x-axis that corresponds to distance (d) and a y-axis that corresponds to height or altitude (h). Control unit 118 generates a one-dimensional terrain profile 136 by calculating distances 134 between the locations of each altitude measurement 128. Control unit 118 measures the slope of curve 138 by calculating changes in height 128 vs. changes in associated distances 134. Control unit 118 uses the slope of curve 138 to estimate the attitude of vehicle 102.
Control unit 118 uses the slope of curve 138 in
Control unit 118 in operation 140A uses the GNSS positional measurements to generate the height map of the local terrain as shown above in
hi=h(xi,yi),
where hi is the height at the position coordinates (xi, yi) and h(⋅, ⋅) is the terrain model function. There are numerous methods known to those skilled in the art for estimating the terrain model h(⋅, ⋅) from a dataset of GNSS measurements. It is assumed that an appropriate method has been used and that the terrain model is available for the rest of the process.
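One simple choice of terrain model, shown here purely as an assumed illustration since the document leaves the method open, is a local least-squares plane fit h ≈ a·x + b·y + c through nearby GNSS samples; the coefficients a and b then directly estimate the partial derivatives ∂h/∂x and ∂h/∂y used below.

```python
import numpy as np

def fit_local_plane(xs, ys, hs):
    """Least-squares plane h ~ a*x + b*y + c through nearby GNSS samples.

    Returns (a, b, c), where a and b estimate the terrain partial
    derivatives dh/dx and dh/dy around the sampled area. One of many
    possible terrain models; an illustrative sketch only.
    """
    A = np.column_stack([xs, ys, np.ones(len(xs))])
    coeffs, *_ = np.linalg.lstsq(A, hs, rcond=None)
    return coeffs  # (a, b, c)
```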
The partial derivatives of the height map, ∂h/∂x and ∂h/∂y, calculated or estimated around the current vehicle location, are then evaluated and transformed to be aligned with the vehicle body heading ψ. The ground slopes aligned with the vehicle are converted into roll ϕg and pitch θg angles of the ground.
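The conversion from the map-frame gradient to vehicle-aligned ground angles can be sketched as below. The exact equations are not reproduced in this text, so the trigonometric form and sign conventions here are a standard formulation and should be treated as assumptions.

```python
import math

def ground_roll_pitch(dh_dx, dh_dy, heading):
    """Rotate the terrain gradient into the vehicle frame and convert
    the aligned slopes into ground roll and pitch angles.

    heading (psi) is measured from the map x axis. Sign conventions
    are assumed, not taken from the source document.
    """
    # Slope along the direction of travel and across it.
    slope_along = dh_dx * math.cos(heading) + dh_dy * math.sin(heading)
    slope_across = -dh_dx * math.sin(heading) + dh_dy * math.cos(heading)
    pitch_g = math.atan(slope_along)   # climbing terrain ahead
    roll_g = math.atan(slope_across)   # cross slope under the vehicle
    return roll_g, pitch_g
```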
Operation 140B uses the inertial sensors, or a fusion process of the inertial sensors and other sensors, to estimate the attitude of the inertial sensor subsystem 108 (ϕh, θh) or of vehicle 102 (ϕ, θ) if the inertial mounting biases (ϕb, θb) are assumed to be known.
Operation 140C detects the mounting biases, and determines how the mounting biases need to be changed if the current mounting biases are wrong. This depends on whether or not the attitude of inertial sensor subsystem 108 is used in operation 140B or if the estimated vehicle attitude (corrected with a set of mounting biases) is used.
In either case, control unit 118 may low pass filter the difference between the estimated attitude and the ground attitude to measure the mean and remove any transient effects. If this mean value is significantly different from the current mounting biases, the mounting biases can be updated to this new mean value. In the case when the vehicle attitude is used, when the mean shifts significantly from zero, the current mounting biases of inertial sensor subsystem 108 can be incremented by the mean.
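The filter-and-update logic above can be sketched for a single axis as follows. The filter gain, threshold value, and class structure are assumptions chosen for illustration, not the document's implementation.

```python
class BiasEstimator:
    """Track the mean difference between the inertially measured attitude
    and the terrain-derived ground attitude; update the stored mounting
    bias when the filtered mean drifts past a threshold."""

    def __init__(self, initial_bias=0.0, alpha=0.05, threshold=0.01):
        self.bias = initial_bias
        self.alpha = alpha            # first-order low pass gain (assumed)
        self.threshold = threshold    # update threshold, radians (assumed)
        self.mean_diff = 0.0

    def update(self, measured_attitude, ground_attitude):
        # The steady-state mean of (inertial - ground) is the mounting bias.
        diff = measured_attitude - ground_attitude
        self.mean_diff += self.alpha * (diff - self.mean_diff)
        # Only adopt the new value once it differs noticeably from the
        # current bias, so transients do not cause spurious updates.
        if abs(self.mean_diff - self.bias) > self.threshold:
            self.bias = self.mean_diff
        return self.bias
```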
The general method may generate a fairly accurate terrain map so that the roll bias can be accurately estimated. The accuracy of the roll bias may have a large effect on the ground cross-track error, especially when traversing pass to pass or when traversing the same control path with vehicle 102 driven in the opposite direction. To estimate the roll from the terrain, the terrain on the sides of vehicle 102 should already be mapped. One example simplified process for estimating the pitch bias is described below in
Operation 140A can be modified so the GNSS positional measurements are parameterized into a 1-dimensional height profile along the direction of travel. This reduction allows the pitch bias to be identified in a more computationally efficient way without the need to map the terrain height profile in 2-dimensions.
Let the height profile be a function such that:
hi=h(di)
where hi is the height at displacement distance di and h(⋅) is the height profile function. To convert GNSS positional measurements Pkn into this parameterized form, the following mapping may be used. The displacement between GNSS samples is calculated using:
Δdk = α√((Px,k − Px,k-1)² + (Py,k − Py,k-1)²),

where Δdk is the change in the parameterized distance for GNSS measurement k, (Px, Py)kn are the x and y positional components for GNSS measurement k, (Px, Py)k-1n are the x and y positional components for the previous GNSS measurement k−1, and α is a direction scaling factor that is α=1 if traveling forward (V>0) or α=−1 if traveling in reverse (V<0).
The current displacement distance dk is obtained by accumulating all the displacement differences for the previous measurements.
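The displacement mapping above can be sketched as follows: each GNSS step contributes its planar distance, signed by the direction of travel, and the signed steps are accumulated into the displacement distance dk. Names are illustrative.

```python
import math

def accumulate_displacement(positions, velocities):
    """Build signed displacement distances d_k from successive GNSS
    positions (x, y) and signed forward speeds V.

    Reverse travel (V < 0) subtracts distance, matching the direction
    scaling factor alpha in the text. Illustrative sketch only.
    """
    d = 0.0
    out = [0.0]
    for k in range(1, len(positions)):
        (x0, y0), (x1, y1) = positions[k - 1], positions[k]
        step = math.hypot(x1 - x0, y1 - y0)       # planar distance
        alpha = 1.0 if velocities[k] >= 0 else -1.0
        d += alpha * step                          # accumulate signed steps
        out.append(d)
    return out
```

Driving forward and then reversing over the same ground brings the displacement back toward zero, which keeps the profile parameterization consistent with the vehicle's actual position along the path.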
The change distance Δdk is approximated from the velocity for a smooth profile, such that:
Δdk=(tk−tk-1)V,
where tk is the time of the current GNSS measurement, tk-1 is the time of the previous GNSS measurement, and V is the signed forward speed of the vehicle. The altitude measurement hk is taken from the z component of the GNSS positional measurement Pkn, which is paired with the displacement to form the height profile data point (dk, hk).
The slope of terrain 116 around the current vehicle position may be estimated from this dataset. Control unit 118 may fit a mathematical feature, such as a first order polynomial, to the data set, and the first derivative is evaluated around the current point of interest (current location 114 of vehicle 102).
A discrete implementation of this evaluates the fit over the most recent samples, where n is the smallest number (to keep the region of interest around the local position) which satisfies n > 2 and n > w, where w is the wheelbase of vehicle 102, to help account for the length of the vehicle.
Control unit 118 may extract the slope and hence ground pitch θg from the least squares straight line fit.
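The least-squares step can be sketched as below; `np.polyfit` is an assumed stand-in for whatever fitting routine the control unit actually uses.

```python
import math
import numpy as np

def ground_pitch_from_profile(ds, hs):
    """Fit a first order polynomial (straight line) to the recent
    (distance, height) pairs; the slope is the local terrain gradient
    and the ground pitch is its arctangent. Illustrative sketch of
    the least-squares step described in the text.
    """
    slope, _intercept = np.polyfit(ds, hs, 1)
    return math.atan(slope)
```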
Operation 140B is unmodified from before; however, only the pitch attitude is considered rather than both roll and pitch.
Control unit 118 in operation 140C may monitor a time history of the difference between the ground pitch θg and inertial sensor pitch θh (or vehicle pitch θ=θh−θb). Control unit 118 may filter this difference to estimate the mean and when there is a detectable change in mean, adjust the pitch bias to the identified value.
If the difference between ground and inertial pitch is monitored γ=θh−θg, then when the low pass filtered value
Control unit 118 may monitor the difference between the ground and vehicle pitch, γ=θ−θg where θ=θh−θb. When the low pass filtered value
In operation 150A, control unit 118 incorporates new GNSS measurements into a terrain model. As explained above, the terrain model is alternatively referred to as a terrain map or terrain profile and may be data stored in computer memory that identifies GNSS measured altitudes for associated longitude and latitude positions. In operation 150B, control unit 118 calculates a slope of the terrain model at a current location. As explained above, control unit 118 may calculate a change in altitude vs. a change in distance from a previously measured location to calculate the slope.
In operation 150C, control unit 118 converts the calculated terrain slope into a vehicle body roll and pitch. For example, control unit 118 calculates the partial derivative of the height map to convert the terrain slope into the vehicle roll and pitch angles. In operation 150D, control unit 118 measures the roll and pitch from inertial sensor subsystem 108. For example, control unit 118 reads the roll and pitch attitude measurements from inertial sensor subsystem 108.
In operation 150E, control unit 118 estimates the mounting biases of inertial sensor subsystem 108. For example, control unit 118 calculates the differences of the calculated terrain/vehicle roll and pitch compared with the measured roll and pitch from inertial sensor subsystem 108 to determine the mounting biases between inertial sensor subsystem 108 and vehicle 102.
In operation 150F, control unit 118 may filter the calculated mounting biases to estimate a mean mounting bias. Any filtering method that smoothes the time history while maintaining the relevant mean profile may be used. A simple first order low pass filter can be implemented.
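A minimal sketch of such a first order low pass filter is shown below; the recurrence form and the gain name `alpha` are assumptions, since the document's equation is not reproduced here.

```python
def low_pass(prev, sample, alpha):
    """First order low pass filter: y_k = y_{k-1} + alpha * (x_k - y_{k-1}),
    with 0 < alpha <= 1. Smaller alpha smooths more strongly but
    responds more slowly to genuine changes in the mean.
    """
    return prev + alpha * (sample - prev)
```

Repeatedly applying the filter to a noisy bias estimate converges toward its mean, which is the quantity compared against the update threshold.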
In operation 160C, control unit 118 accumulates a total displacement from the previous measurement point and pairs the accumulated total distance with a current height measurement. In operation 160D, control unit 118 adds the distance-height pair into a height profile set as shown above in
In operation 160E, control unit 118 uses a fit curve feature at a current displacement position in the height profile as discussed above. In operation 160F, control unit 118 estimates the terrain pitch from the slope of the fitted curve. In operation 160G, control unit 118 measures the vehicle pitch from inertial sensor subsystem 108. In operation 160H, control unit 118 estimates the pitch bias as the difference between the calculated terrain pitch and the measured inertial sensor subsystem pitch.
In operation 160I, control unit 118 filters the calculated pitch bias to estimate a mean pitch bias. If the mean pitch bias has changed by some threshold amount in operation 160J, control unit 118 in operation 160K updates the pitch bias to the latest calculated mean pitch bias.
A global navigation satellite system (GNSS) is broadly defined to include GPS (U.S.), Galileo (European Union, proposed), GLONASS (Russia), Beidou (China), Compass (China, proposed), IRNSS (India, proposed), QZSS (Japan, proposed), and other current and future positioning technology using signals from satellites, with or without augmentation from terrestrial sources.
IMUs may include gyroscopic (gyro) sensors, accelerometers, and similar technologies for providing outputs corresponding to the inertia of moving components in all axes, i.e., through six degrees of freedom (positive and negative directions along transverse X, longitudinal Y, and vertical Z axes). Yaw, pitch, and roll refer to moving component rotation about the Z, X, and Y axes, respectively. Said terminology will include the words specifically mentioned, derivatives thereof, and words of similar meaning.
Guidance processor 6 includes a graphical user interface (GUI) 26, a microprocessor 24, and a media element 22, such as a memory storage drive. Guidance processor 6 electrically communicates with, and provides control data to, steering controller 166. Steering controller 166 includes a wheel movement detection switch 28 and an encoder 30 for interpreting guidance and steering commands from CPU 6.
Steering controller 166 may interface mechanically with the vehicle's steering column 34, which is mechanically attached to steering wheel 32. A control line 42 may transmit guidance data from CPU 6 to steering controller 166. An electrical subsystem 44, which powers the electrical needs of vehicle 102, may interface directly with steering controller 166 through a power cable 46. Steering controller 166 can be mounted to steering column 34 near the floor of the vehicle, and in proximity to the vehicle's control pedals 36. Alternatively, steering controller 166 can be mounted at other locations along steering column 34.
As explained above, some auto-steering systems 100 may include an inertial sensor subsystem 108 that attaches to steering column 34. Of course, inertial sensor subsystem 108 may be attached to any location in vehicle 102.
Steering controller 166 physically drives and steers vehicle 102 by actively turning the steering wheel 32 via steering column 34. A motor 45 powered by vehicle electrical subsystem 44 may power a worm drive which powers a worm gear 48 affixed to steering controller 166. These components are preferably enclosed in an enclosure. In other embodiments, steering controller 166 is integrated directly into the vehicle drive control system independently of steering column 34.
The diagram below shows a computing device 1000 used for operating the control unit 118 that includes guidance processor 6 discussed above. The computing device 1000 may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. In other examples, computing device 1000 may be a personal computer (PC), a tablet, a Personal Digital Assistant (PDA), a cellular telephone, a smart phone, a web appliance, or any other machine or device capable of executing instructions 1006 (sequential or otherwise) that specify actions to be taken by that machine.
While only a single computing device 1000 is shown, the computing device 1000 may include any collection of devices or circuitry that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the operations discussed above. Computing device 1000 may be part of an integrated control system or system manager, or may be provided as a portable electronic device configured to interface with a networked system either locally or remotely via wireless transmission.
Processors 1004 may comprise a central processing unit (CPU), a graphics processing unit (GPU), programmable logic devices, dedicated processor systems, micro controllers, or microprocessors that may perform some or all of the operations described above. Processors 1004 may also include, but may not be limited to, an analog processor, a digital processor, a microprocessor, multi-core processor, processor array, network processor, etc.
Some of the operations described above may be implemented in software and other operations may be implemented in hardware. One or more of the operations, processes, or methods described herein may be performed by an apparatus, device, or system similar to those as described herein and with reference to the illustrated figures.
Processors 1004 may execute instructions or “code” 1006 stored in any one of memories 1008, 1010, or 1020. The memories may store data as well. Instructions 1006 and data can also be transmitted or received over a network 1014 via a network interface device 1012 utilizing any one of a number of well-known transfer protocols.
Memories 1008, 1010, and 1020 may be integrated together with processing device 1000, for example RAM or FLASH memory disposed within an integrated circuit microprocessor or the like. In other examples, the memory may comprise an independent device, such as an external disk drive, storage array, or any other storage devices used in database systems. The memory and processing devices may be operatively coupled together, or in communication with each other, for example by an I/O port, network connection, etc. such that the processing device may read a file stored on the memory.
Some memory may be "read only" by design (ROM) or by virtue of permission settings. Other examples of memory may include, but may not be limited to, WORM, EPROM, EEPROM, FLASH, etc., which may be implemented in solid state semiconductor devices. Other memories may comprise moving parts, such as a conventional rotating disk drive. All such memories may be "machine-readable" in that they may be readable by a processing device.
“Computer-readable storage medium” (or alternatively, “machine-readable storage medium”) may include all of the foregoing types of memory, as well as new technologies that may arise in the future, as long as they may be capable of storing digital information in the nature of a computer program or other data, at least temporarily, in such a manner that the stored information may be “read” by an appropriate processing device. The term “computer-readable” may not be limited to the historical usage of “computer” to imply a complete mainframe, mini-computer, desktop, wireless device, or even a laptop computer. Rather, “computer-readable” may comprise storage medium that may be readable by a processor, processing device, or any computing system. Such media may be any available media that may be locally and/or remotely accessible by a computer or processor, and may include volatile and non-volatile media, and removable and non-removable media.
Computing device 1000 can further include a video display 1016, such as a liquid crystal display (LCD), light emitting diode (LED), organic light emitting diode (OLED), or a cathode ray tube (CRT) and a user interface 1018, such as a keyboard, mouse, touch screen, etc. All of the components of computing device 1000 may be connected together via a bus 1002 and/or network.
The system described above can use dedicated processor systems, micro controllers, programmable logic devices, or microprocessors that perform some or all of the operations. Some of the operations described above may be implemented in software, such as computer readable instructions contained on a storage media, or the same or other operations may be implemented in hardware.
For the sake of convenience, the operations are described as various interconnected functional blocks or distinct software modules. This is not necessary, however, and there may be cases where these functional blocks or modules are equivalently aggregated into a single logic device, program or operation with unclear boundaries. In any event, the functional blocks and software modules or features of the flexible interface can be implemented by themselves, or in combination with other operations in either hardware or software.
Reference has been made in detail to preferred embodiments, examples of which were illustrated in the referenced drawings. While preferred embodiments were described, it should be understood that this is not intended to limit the invention to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims.
Having described and illustrated the principles of the invention in a preferred embodiment thereof, it should be apparent that the invention may be modified in arrangement and detail without departing from such principles. Claim is made to all modifications and variations coming within the spirit and scope of the following claims.
The present application claims priority to U.S. Provisional Patent Application Ser. No. 62/652,239 filed on Apr. 3, 2018, entitled: METHOD FOR AUTOMATIC PITCH MOUNTING COMPENSATION IN AN AUTOMATIC STEERING SYSTEM which is incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5194851 | Kraning et al. | Mar 1993 | A |
5390125 | Sennott et al. | Feb 1995 | A |
5663879 | Trovato et al. | Sep 1997 | A |
5923270 | Sampo et al. | Jul 1999 | A |
6052647 | Parkinson et al. | Apr 2000 | A |
6070673 | Wendte | Jun 2000 | A |
6212453 | Kawagoe et al. | Apr 2001 | B1 |
6373432 | Rabinowitz et al. | Apr 2002 | B1 |
6377889 | Soest | Apr 2002 | B1 |
6445983 | Dickson et al. | Sep 2002 | B1 |
6539303 | McClure et al. | Mar 2003 | B2 |
6711501 | McClure et al. | Mar 2004 | B2 |
6789014 | Rekow et al. | Sep 2004 | B1 |
6819780 | Benson et al. | Nov 2004 | B2 |
6865465 | McClure | Mar 2005 | B2 |
6876920 | Mailer | Apr 2005 | B1 |
7142956 | Heiniger et al. | Nov 2006 | B2 |
7162348 | McClure et al. | Jan 2007 | B2 |
7277792 | Overschie | Oct 2007 | B2 |
7373231 | McClure et al. | May 2008 | B2 |
7400956 | Feller et al. | Jul 2008 | B1 |
7437230 | McClure | Oct 2008 | B2 |
7460942 | Mailer | Dec 2008 | B2 |
7689354 | Heiniger et al. | Mar 2010 | B2 |
RE41358 | Heiniger et al. | May 2010 | E |
7835832 | Macdonald | Nov 2010 | B2 |
7885745 | McClure et al. | Feb 2011 | B2 |
8018376 | McClure et al. | Sep 2011 | B2 |
8190337 | McClure | May 2012 | B2 |
8214111 | Heiniger et al. | Jul 2012 | B2 |
8311696 | Reeve | Nov 2012 | B2 |
8386129 | Collins et al. | Feb 2013 | B2 |
8401704 | Pollock et al. | Mar 2013 | B2 |
8489291 | Dearborn et al. | Jul 2013 | B2 |
8521372 | Hunt et al. | Aug 2013 | B2 |
8548649 | Guyette et al. | Oct 2013 | B2 |
8583315 | Whitehead et al. | Nov 2013 | B2 |
8583326 | Collins et al. | Nov 2013 | B2 |
8589013 | Pieper et al. | Nov 2013 | B2 |
8594879 | Roberge et al. | Nov 2013 | B2 |
8634993 | McClure | Jan 2014 | B2 |
8639416 | Jones et al. | Jan 2014 | B2 |
8649930 | Reeve et al. | Feb 2014 | B2 |
8676620 | Hunt et al. | Mar 2014 | B2 |
8718874 | McClure et al. | May 2014 | B2 |
8768558 | Reeve et al. | Jul 2014 | B2 |
8781685 | McClure | Jul 2014 | B2 |
8803735 | McClure | Aug 2014 | B2 |
8897973 | Hunt et al. | Nov 2014 | B2 |
8924152 | Hunt et al. | Dec 2014 | B2 |
9002565 | Jones et al. | Apr 2015 | B2 |
9002566 | McClure et al. | Apr 2015 | B2 |
9141111 | Webber et al. | Sep 2015 | B2 |
9162703 | Miller et al. | Oct 2015 | B2 |
9173337 | Guyette et al. | Nov 2015 | B2 |
9223314 | McClure et al. | Dec 2015 | B2 |
9255992 | McClure | Feb 2016 | B2 |
9389615 | Webber et al. | Jul 2016 | B2 |
9703290 | Vandapel | Jul 2017 | B1 |
9731744 | Carter | Aug 2017 | B2 |
9791279 | Somieski | Oct 2017 | B1 |
10232869 | Carter | Mar 2019 | B2 |
10255670 | Wu | Apr 2019 | B1 |
20020072850 | McClure et al. | Jun 2002 | A1 |
20040186644 | McClure et al. | Sep 2004 | A1 |
20060167600 | Nelson, Jr. et al. | Jul 2006 | A1 |
20060208169 | Breed | Sep 2006 | A1 |
20070250261 | Soehren | Oct 2007 | A1 |
20080039991 | May | Feb 2008 | A1 |
20100274452 | Ringwald et al. | Oct 2010 | A1 |
20110022267 | Murphy | Jan 2011 | A1 |
20130041549 | Reeve | Feb 2013 | A1 |
20130046439 | Anderson | Feb 2013 | A1 |
20130332105 | McKown | Dec 2013 | A1 |
20140266877 | McClure | Sep 2014 | A1 |
20140277676 | Gattis | Sep 2014 | A1 |
20140336970 | Troni-Peralta | Nov 2014 | A1 |
20150175194 | Gattis | Jun 2015 | A1 |
20160039454 | Mortimer | Feb 2016 | A1 |
20160147225 | Sights | May 2016 | A1 |
20160154108 | McClure et al. | Jun 2016 | A1 |
20160205864 | Gattis et al. | Jul 2016 | A1 |
20160214643 | Joughin et al. | Jul 2016 | A1 |
20160216116 | Kourogi | Jul 2016 | A1 |
20160252909 | Webber et al. | Sep 2016 | A1 |
20160334804 | Webber et al. | Nov 2016 | A1 |
20160364678 | Cao | Dec 2016 | A1 |
20160364679 | Cao | Dec 2016 | A1 |
20170356757 | Bourret | Dec 2017 | A1 |
20190128673 | Faragher | May 2019 | A1 |
20190361457 | Johnson | Nov 2019 | A1 |
Entry |
---|
Noh, Kwang-Mo, Self-tuning controller for farm tractor guidance, Iowa State University Retrospective Theses and Dissertations, Paper 9874, (1990), 25 Pages. |
Van Zuydam, R.P., Centimeter-Precision Guidance of Agricultural Implements in the Open Field by Means of Real Time Kinematic DGPS, ASA-CSSA-SSSA, pp. 1023-1034 (1999). |
PCT, International Search Report and Written Opinion of the International Searching Authority for International Application No. PCT/US2019/022743, dated Sep. 10, 2019, pp. 12. |
Vinande, E., et al.: "Mounting-Angle Estimation for Personal Navigation Devices", IEEE Transactions on Vehicular Technology, IEEE Service Center, Piscataway, NJ, US, vol. 59, No. 3, Mar. 1, 2010, pp. 1129-1138, XP011296528, ISSN: 0018-9545. |
USPTO, International Preliminary Report on Patentability for International Application No. PCT/US2019/022743, dated Oct. 15, 2020, pp. 11. |
Number | Date | Country | |
---|---|---|---|
20190299966 A1 | Oct 2019 | US |
Number | Date | Country | |
---|---|---|---|
62652239 | Apr 2018 | US |