Determining an angle between a tow vehicle and a trailer

Information

  • Patent Grant
  • Patent Number
    11,714,192
  • Date Filed
    Friday, February 26, 2021
  • Date Issued
    Tuesday, August 1, 2023
Abstract
The angle of a trailer with respect to a tow vehicle is an important parameter for the stability of the vehicle and trailer. A tow vehicle pulling a trailer in a straight line is generally more stable than when the vehicle is turning. While turning, the tow vehicle and the trailer no longer lie in a straight line; the angle between them depends on how sharply the tow vehicle is turning. To safely operate a vehicle towing a trailer, for a given steering input and speed, there is a maximum angle between the tow vehicle and the trailer beyond which the combination becomes unstable and the trailer or tow vehicle may roll over or jackknife. Accordingly, the angle between the trailer and tow vehicle must be determined to ensure the vehicle and trailer remain in control.
Description
TECHNICAL FIELD

This document relates to determining the angle between a tow vehicle and a trailer being towed.


BACKGROUND

Many different types of vehicles are used to tow a variety of different types of trailers. For example, commercial semi-trailer trucks, also known as semis, tractor-trailers, big rigs, eighteen-wheelers, or transports include a tractor towing one or more trailers. Other non-commercial vehicles such as pick-up trucks, motorhomes, recreational vehicles, and sport-utility vehicles also tow trailers, boats, campers, and other types of trailers. In each of the above examples, the stability of the trailer and tow vehicle depends on multiple factors such as the speed of the trailer and tow vehicle, the weather conditions such as wind and rain, the length of the trailer, the number of axles, the angle between the tow vehicle and the trailer, and others. New techniques are needed to measure the angle between the trailer and tow vehicle that are reliable, accurate, have a long service life, and are inexpensive.


SUMMARY

Disclosed are devices, systems and methods for determining an angle such as the angle that a trailer is towed behind a vehicle. In one aspect, a system for determining a trailer angle between a trailer and a vehicle is disclosed. The system includes one or more ultrasonic sensors, wherein each ultrasonic sensor is mountable to the vehicle to determine a distance from the ultrasonic sensor to a front-end of a trailer attached to the vehicle. The system further includes an ultrasonic control unit configured to receive the distance from each of the one or more ultrasonic sensors via a communications interface, wherein the ultrasonic control unit determines one or more angles, each angle corresponding to a distance received from the one or more ultrasonic sensors, wherein each angle is between the vehicle and the trailer, and wherein the ultrasonic control unit determines the trailer angle from the one or more angles.


In another aspect, a method for determining a trailer angle between a trailer and a vehicle is disclosed. The method includes receiving, from each of one or more ultrasonic sensors attached to the vehicle, information about a distance between each ultrasonic sensor and a front-end of the trailer attached to the vehicle, and determining one or more angles, each angle corresponding to the information about the distance received from the one or more ultrasonic sensors, wherein each angle is an estimate of alignment between the vehicle and the trailer, and wherein a trailer angle is determined from the one or more angles.


In another aspect, a non-transitory computer readable medium is disclosed. The non-transitory computer readable medium stores executable instructions for determining a trailer angle between a trailer and a vehicle that when executed by at least one processor perform at least the following: receiving, from each of one or more ultrasonic sensors, a distance between each ultrasonic sensor and a front-end of the trailer attached to the vehicle, and determining one or more angles, each angle corresponding to a distance received from the one or more ultrasonic sensors, wherein each angle is between the vehicle and the trailer, and wherein a trailer angle is determined from the one or more angles.


The following features can be included in various combinations. A first angle corresponding to a first ultrasonic sensor can be determined based on one or more geometrical relationships between a position of the first ultrasonic sensor and the front-end of the trailer. The trailer angle can be determined as an average angle of the one or more angles. The average angle can be weighted based on a standard deviation for distance values received at the ultrasonic control unit from each of the one or more ultrasonic sensors. The ultrasonic control unit can determine an error interval and a confidence level based on the standard deviation for distance values received at the ultrasonic control unit from each of the one or more ultrasonic sensors. The first angle can be determined from: a first neutral distance between the first ultrasonic sensor and the front-end of the trailer when the trailer is in line with the vehicle, a first angled distance when the trailer is angled with respect to the vehicle, and/or a first offset distance between the center of the first ultrasonic sensor and the center of the tractor. The first neutral distance can be determined when a steering angle of the vehicle is about zero degrees and the vehicle is travelling at about 10 kilometers per hour or more. 
The ultrasonic control unit can include at least one processor and at least one memory storing executable instructions that when executed by the at least one processor perform at least the following: determining the first angle corresponding to a first ultrasonic sensor as: A1=arctangent((D1-1−D1-2)/Dis1), wherein A1 is the first angle between the vehicle and the trailer, wherein D1-1 is a first neutral distance between the first ultrasonic sensor and the front-end of the trailer, wherein the first neutral distance is determined when the trailer is in line with the vehicle, wherein D1-2 is a first angled distance between the first ultrasonic sensor and the front-end of the trailer, wherein the first angled distance is determined when the trailer is angled with respect to the vehicle, and wherein Dis1 is a first offset distance, wherein the first offset distance is between the center of the first ultrasonic sensor and the center of the vehicle.


The above and other aspects and features of the disclosed technology are described in greater detail in the drawings, the description and the claims.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 depicts an example of an angle measurement system installed on a tractor, in accordance with some example embodiments;



FIG. 2 depicts an example of an arrangement of ultrasonic sensors for measuring an angle between a tow vehicle and a trailer, in accordance with some example embodiments;



FIG. 3A depicts an illustration showing a trailer at an angle to a tow vehicle that is not a straight line, and various distances, in accordance with some example embodiments;



FIG. 3B depicts another illustration showing a trailer at an angle to a tow vehicle, in accordance with some example embodiments;



FIG. 4 depicts an example of an apparatus, in accordance with some example embodiments;



FIG. 5 depicts an example of an apparatus where two sensors are not operational, in accordance with some example embodiments;



FIG. 6 depicts an example of an apparatus where four sensors are not operational, in accordance with some example embodiments;



FIG. 7 depicts an example of a process, in accordance with some example embodiments; and



FIG. 8 depicts an example of an apparatus, in accordance with some example embodiments.





DETAILED DESCRIPTION

The angle of a trailer with respect to a tow vehicle is an important parameter for the stability of the vehicle and trailer. A tow vehicle pulling a trailer in a straight line is generally more stable than when the vehicle is turning. While turning, the tow vehicle and the trailer no longer lie in a straight line; the angle between them depends on how sharply the tow vehicle is turning. To safely operate a vehicle towing a trailer, for a given steering input and speed, there is a maximum angle between the tow vehicle and the trailer beyond which the combination becomes unstable and the trailer or tow vehicle may roll over or jackknife. Accordingly, the angle between the trailer and tow vehicle may be determined to ensure the vehicle and trailer will remain in control. A trailer angle sensor system should provide high accuracy and redundancy so that the system remains operable even when one or more sensors are inoperable. This determination becomes especially important when the vehicle is an autonomous vehicle and therefore lacks a human driver's judgment regarding the stability of the vehicle-and-trailer combination when driving around curves.


Current trailer angle sensors have several limitations including: 1) Current trailer angle sensors need to be installed in trailers or at least have to be fixed to the trailer in order to determine the relative motion between trailer and tractor. This means the sensor needs to be reinstalled or recalibrated each time after changing the trailer; 2) Current techniques offer no redundancy since they use Hall effect devices or resistive position sensors; 3) The lifecycle of current trailer angle sensors is limited because they are typically mounted near the 5th wheel which is often a harsh environment due to vibration, exposure to the elements, and chemicals such as lubricating grease.



FIG. 1 depicts an example of an angle measurement system installed on a tow vehicle (also referred to herein as a “tractor”), in accordance with some example embodiments. Tractor 110 is physically coupled to trailer 120. A side-view is shown at 100A, a top-view at 100B, a front-view at 100C, and a back-view at 100D. The tractor 110 includes a plurality of ultrasonic sensors 130 mounted to the back of the tractor with the ultrasonic beams directed to the front surface of the trailer. Each of the ultrasonic sensors determines or helps to determine the distance from the ultrasonic sensor to the front of the trailer 120. Using the distances, ultrasonic control unit 125 determines the angle between the tractor and the trailer. In FIG. 1, the tractor and trailer lie in a straight line. When the tractor trailer is turning, the tractor and trailer may no longer be in line and there would be an angle between the tractor and trailer (also referred to as the “trailer angle”). UCU 125 may include a processor and memory, and various interfaces such as one or more communications interfaces.



FIG. 2 depicts an example of an arrangement of ultrasonic sensors for measuring an angle between a tow vehicle and a trailer. In some example embodiments, the ultrasonic sensors 130 may be arranged as shown at 200. The sensor arrangement shown is an “inverted V” arrangement. For example, on the back of tractor 110 in FIG. 1, the five ultrasonic sensors 131-135 may be mounted in the “inverted V” configuration with ultrasonic sensor 133 highest above the roadway in the center (left-to-right) of the tractor, and ultrasonic sensors 131 and 135 mounted on the back of the tractor closest to the roadway compared to ultrasonic sensors 132-134. Although five sensors are shown at 200, a greater or smaller number of sensors may be used. Although the ultrasonic sensors 131-135 are shown in an “inverted V” configuration, the sensors may be attached in a different configuration, such as a straight line (flat or tilted), a “W” or “inverted W” configuration, or another pattern. Organizing the ultrasonic sensors in a two-dimensional pattern, in a plane perpendicular to the axis between the tractor and the trailer, may lead to a more robust determination of the trailer angle. For example, placing the ultrasonic sensors in such a pattern can avoid objects between the tow vehicle and trailer and can avoid surfaces that are not normal to the sensors' ultrasound when the trailer is pulled in a straight line. Non-perpendicular surfaces may not be suitable for ultrasonic distance determination. For example, an object with an angled surface (non-perpendicular to the illuminating ultrasound) attached to the trailer may be at an obtuse angle to an ultrasonic sensor over a range of trailer angles, causing the distance determined by the ultrasonic sensor to be inaccurate, or preventing a distance from being determined due to little or no ultrasonic signal reflected from the angled object.
In some example embodiments, an ultrasonic sensor may be oriented to illuminate the trailer with ultrasound at a perpendicular angle when the trailer angle is not zero degrees. By doing so, some ultrasonic sensors may provide better distance determinations at different trailer angles. In some example embodiments, the ultrasonic sensors may provide timing information such as a transmit time and a reception time from which a transit time from the sensor to the trailer can be determined, or some ultrasonic sensors may determine the transit time directly. From the transit time, the distance can be determined based on the speed of the ultrasound.
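The transit-time-to-distance conversion described above can be sketched as follows. This is a minimal illustration, assuming a round-trip (echo) measurement and a nominal speed of sound of 343 m/s; the patent does not specify either value, and the function name is illustrative.

```python
# Assumed nominal speed of sound in air at roughly 20 °C; a real system
# might compensate for temperature.
SPEED_OF_SOUND_M_S = 343.0

def distance_from_transit_time(round_trip_time_s: float) -> float:
    """Convert a round-trip ultrasonic transit time to a one-way distance.

    The pulse travels from the sensor to the trailer and back, so the
    total acoustic path is halved to get the sensor-to-trailer distance.
    """
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0
```

For a one-way transit time (rather than round trip), the division by two would simply be dropped.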


At 210 is an example illustration of a top-view of a tractor trailer showing ultrasonic sensors 131-135. Also shown are distances 201-205 from each ultrasonic sensor to the front of the trailer 120. Distances 201-205 may be referred to as “neutral distances” because the trailer is in a straight line with the tow vehicle. Each ultrasonic sensor determines the distance, or information indicative of the distance, between the sensor and the front of the trailer. As noted above, information indicative of the distance includes a transit time for the ultrasound to propagate to the trailer (one way or round trip) or timing information such as time stamps from which the transit time can be determined. In the following, “distance” is used, but as noted above timing information can be provided by the sensor instead of distance. For example, ultrasonic sensor 131 determines that the trailer is distance 201 from ultrasonic sensor 131. Ultrasonic sensor 132 determines that the trailer is distance 202 from ultrasonic sensor 132. Ultrasonic sensor 133 determines that the trailer is distance 203 from ultrasonic sensor 133. Ultrasonic sensor 134 determines that the trailer is distance 204 from ultrasonic sensor 134, and ultrasonic sensor 135 determines that the trailer is distance 205 from ultrasonic sensor 135. With the tractor 110 and trailer 120 in a straight line as shown at 210, the distances 201-205 are approximately equal. In some embodiments, a fairing, wind deflector, or equipment such as cooling equipment for the trailer 120 may cause the distances 201-205 to be unequal even when the trailer and tractor lie in a straight line. The unequal distances can be corrected in the UCU. The distances are provided to the UCU. The UCU may control the ultrasonic sensors and may receive health and status information from the ultrasonic sensors.
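One hypothetical way to realize the correction mentioned above is to store a per-sensor bias (each neutral reading's deviation from the common mean) and subtract it from later readings. The patent does not state how the UCU performs the correction; the helpers and the dict keyed by sensor id below are illustrative assumptions.

```python
from statistics import fmean

def neutral_biases(neutral_distances: dict) -> dict:
    """Per-sensor bias: how far each neutral reading deviates from the mean
    neutral distance (e.g. due to a fairing or trailer-mounted equipment).

    neutral_distances: sensor id -> neutral distance in meters
    """
    avg = fmean(neutral_distances.values())
    return {sid: d - avg for sid, d in neutral_distances.items()}

def corrected_distance(raw_m: float, bias_m: float) -> float:
    """Remove the stored bias from a raw distance reading."""
    return raw_m - bias_m
```

After this correction, all sensors would report approximately equal distances when the trailer is in line with the tractor, as the angle equations assume.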



FIG. 3A depicts an example illustration showing an example of a trailer at an angle to a tow vehicle that is not a straight line, and various distances. At 310 is an illustration of a tractor 110 with a trailer 120 at an angle, trailer angle 350, to the tractor, with ultrasonic sensors 131-135 attached to tractor 110. Each ultrasonic sensor determines the distance between the sensor and the front of the trailer as described above. In the example of FIG. 3A at 310, distance 301 is greater than distance 302, distance 302 is greater than distance 303, distance 303 is greater than distance 304, and distance 304 is greater than distance 305. Distances 301-305 may be referred to as “angled distances” because the trailer is not in a straight line with the tow vehicle. The distances are provided by ultrasonic sensors 131-135 to the UCU 125 and are used to determine the angle between the tractor 110 and trailer 120.


Shown at 320 is an illustration depicting distances 311-314 between the center of the center ultrasonic sensor 133 and the other ultrasonic sensors 131, 132, 134, and 135. Distances 311-314 are used in determining the angle between tractor 110 and trailer 120. Although distances 311-314 are shown from the center of one sensor to the center of another, other distances related to the spacing of the ultrasonic sensors may be used instead.


The UCU coordinates distance measuring by the sensors and determines the trailer angle based on the distances from the sensors. Coordination may include turning on the sensors 131-135 individually at different times to prevent the interference that could occur if multiple sensors were operating at the same time. In some example embodiments, the ultrasonic sensors may include a signature such as a pseudo-random noise (PN) code or other code, or the different sensors may modulate the emitted ultrasound to be orthogonal to the other sensors. The UCU may also receive information including vehicle speed information and steering angle information, and the UCU may perform zero clearing based on an adapted control algorithm. The trailer angle may be sent to a vehicle dynamics controller, electronic stability controller (ESC), or vehicle stability controller (VSC) of the tow vehicle.



FIG. 3B depicts another illustration showing a trailer at an angle to a tow vehicle, in accordance with some example embodiments. FIG. 3B shows distances and angles related to determining the trailer angle and is further described below with respect to Equation 1.



FIG. 4 at 400 depicts an apparatus, in accordance with some example embodiments. FIG. 4 includes ultrasonic sensors 131-135, ultrasonic control unit (UCU) 125, vehicle speed sensor 410, steering angle sensor 420, and vehicle dynamics controller 430.


Ultrasonic sensors 131-135 include a communications interface to communicate with UCU 125. For example, commands such as powering-up or down each ultrasonic sensor, commanding each sensor to take a distance measurement, commands related to averaging distance values at each sensor, and others may be sent from UCU 125 to the ultrasonic sensors 131-135 individually or together. Data may be sent from each ultrasonic sensor to the UCU such as distance data, and sensor status and health information. UCU 125 may perform a process to determine the trailer angle. The UCU may receive vehicle speed information from the vehicle speed sensor 410 and/or steering angle information from steering angle sensor 420 via a dedicated or standard interface such as an on-board diagnostics (OBD) or another interface. The UCU may interface to a vehicle dynamics controller such as ESC or VSC or other stability management device in the vehicle.



FIG. 5 depicts an example of an apparatus where two sensors are not operational, in accordance with some example embodiments. In the example of FIG. 5, ultrasonic sensors 132A and 134A are not operational due to one or more types of failures. The UCU may use data from the remaining sensors 131, 133, and 135 to determine the trailer angle. Use of fewer ultrasonic sensors (3 instead of 5) may cause a reduction in the accuracy or resolution of the trailer angle, but may be sufficient to operate the tractor trailer safely. FIG. 5 is an illustrative example where two ultrasonic sensors are inoperable. Other combinations of sensors may fail resulting in similar reduced performance as well.



FIG. 6 depicts an example of an apparatus where four sensors are not operational, in accordance with some example embodiments. In the example of FIG. 6, ultrasonic sensors 132A, 133A, 134A, and 135A are not operational due to one or more types of failures. The UCU may use data from the remaining sensor 131 to determine the trailer angle. Use of fewer ultrasonic sensors (1 instead of 5) may cause a reduction in the accuracy or resolution of the trailer angle, but may be sufficient to operate the tractor trailer safely. FIG. 6 is an illustrative example where four ultrasonic sensors are inoperable. Other combinations of sensors may fail resulting in reduced performance as well.
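The degraded-mode behavior of FIGS. 5 and 6 can be sketched as follows. This is a hypothetical illustration, not the patent's stated method: it drops sensors whose health flag is false and takes a plain average of the remaining angle estimates (the patent's standard-deviation weighting, described later, could be substituted).

```python
def usable_angles(angles: dict, healthy: dict) -> dict:
    """Keep only angle estimates from sensors reporting healthy status.

    angles: sensor id -> angle estimate in degrees
    healthy: sensor id -> bool health flag reported to the UCU
    """
    return {sid: a for sid, a in angles.items() if healthy.get(sid, False)}

def fallback_trailer_angle(angles: dict, healthy: dict) -> float:
    """Average of the remaining angles. Fewer sensors means coarser output,
    but the system stays operable as long as one sensor works."""
    usable = usable_angles(angles, healthy)
    if not usable:
        raise RuntimeError("no operational ultrasonic sensors")
    return sum(usable.values()) / len(usable)
```

With one working sensor, the function degenerates to that sensor's angle, matching the single-sensor case of FIG. 6.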



FIG. 7 depicts a process 700, in accordance with some example embodiments. The trailer angle may be determined using the process 700.


At 710, the process includes determining, by the ultrasonic sensors 131-135, the distances 201-205 to the trailer when the trailer is in a neutral position (in line with the tractor). These values may be sent to the UCU and stored in memory such as a nonvolatile memory in the UCU. The neutral position may be determined when the steering angle is near zero and the tractor trailer velocity is greater than about 10 km/h. The distance values may be averaged or filtered over a period of time such as 1-5 seconds (or another time period). These stored values may be used as long as the tractor is attached to the same trailer. If a new trailer is attached to the tractor or the trailer is removed and then re-attached, the neutral distances may be determined again.
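The neutral-distance capture at 710 can be sketched as follows. The steering tolerance, the dict keyed by sensor id, and the function name are illustrative assumptions; the speed threshold and averaging follow the description above.

```python
from statistics import mean

def update_neutral_distances(samples: dict, steering_angle_deg: float,
                             speed_kmh: float, steering_tol_deg: float = 0.5,
                             min_speed_kmh: float = 10.0):
    """Average recent per-sensor distance samples into neutral distances,
    but only when the vehicle is driving straight fast enough.

    samples: sensor id -> list of distance readings gathered over the
             averaging window (e.g. 1-5 seconds of readings)
    Returns sensor id -> neutral distance, or None if the neutral
    condition (steering near zero, speed above threshold) is not met.
    """
    if abs(steering_angle_deg) > steering_tol_deg or speed_kmh < min_speed_kmh:
        return None  # not a straight-line (neutral) driving condition
    return {sid: mean(readings) for sid, readings in samples.items()}
```

The returned values would be stored in nonvolatile memory and reused until the trailer is changed or re-attached.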


At 720, the trailer angle may be determined based on the current distances 301-305, the geometrical relationships between the sensors and the tractor and trailer, and the distances determined at 710. As the tractor trailer travels, the ultrasonic sensors 131-135 periodically, intermittently, or continuously determine the distances between the tractor and trailer. When the tractor turns, the determined distances change. For example, when the tractor 110 and trailer 120 of FIG. 3A turn left (as viewed from the top), distance 301 is greatest, followed by 302, 303, and 304, with 305 the smallest distance. If the tractor were turning right, distance 305 would be greatest, 301 the smallest, and so on. In some example embodiments, angles may be determined for ultrasonic sensors 131, 132, 134, and 135 but not sensor 133, which may be used for initial angle determination.


Referring to FIGS. 3A and 3B, the following equation may be used to determine the trailer angle at a first ultrasonic sensor:

A1=arctan((D1-1−D1-2)/Dis1)  Equation 1

where A1 refers to the trailer angle 350B in FIG. 3B at ultrasonic sensor 131, distance D1-1 is distance 201, distance D1-2 is distance 301, distance D1-1−D1-2 is distance 330, and distance Dis1 is distance 311. By symmetry, angles 350A and 350B have the same value, which is equal to the trailer angle. Similar equations may be expressed for the angles at the additional ultrasonic sensors as A2, A4, and A5, where A2 refers to the trailer angle at ultrasonic sensor 132, D2-1 is distance 202, D2-2 is distance 302, and Dis2 is distance 312; likewise, Dis4 is distance 313 and Dis5 is distance 314.
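Equation 1 can be sketched as a short Python helper. The function and parameter names are illustrative; the patent specifies only the arctangent relationship between the neutral distance, the current (angled) distance, and the sensor's lateral offset.

```python
import math

def sensor_angle_deg(neutral_m: float, angled_m: float, offset_m: float) -> float:
    """Equation 1: angle estimate at one sensor, in degrees.

    neutral_m: stored neutral distance for this sensor (trailer in line)
    angled_m:  current distance for this sensor (trailer at an angle)
    offset_m:  lateral offset (Dis) associated with this sensor
    """
    return math.degrees(math.atan((neutral_m - angled_m) / offset_m))
```

For example, a sensor whose distance shortens by 0.5 m relative to neutral, with a 1 m offset, yields arctan(0.5) ≈ 26.6 degrees.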


At 730, for each angle value (A1, A2, A4, and A5) a standard deviation SD1, SD2, SD4, SD5, may be determined. The noise from each ultrasonic sensor may be used to determine the standard deviation.


An average value may be determined for 100 angle determinations of A1 from:

Ā1=(A1,1+A1,2+ . . . +A1,100)/100  Equation 2
The standard deviation may then be expressed as:

SD1=√(((A1,1−Ā1)²+ . . . +(A1,100−Ā1)²)/100)  Equation 3

At 740, a weighting of the determined angle values from the different ultrasonic sensors based on their standard deviations may be expressed as:

W1=1−SD1/(SD1+SD2+SD4+SD5)
W2=1−W1−SD2/(SD2+SD4+SD5)
W4=1−W1−W2−SD4/(SD4+SD5)
W5=1−W1−W2−W4  Equations 4


Because the back of the tractor and the front of the trailer are structural and essentially rigid, in a perfect world without noise and imperfections, angles A1, A2, A4, and A5 would have the same value, but because of noise and imperfections they may differ in average and the standard deviation of each is a measure of the “noisiness” of each.
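The per-sensor averaging of Equation 2 and the standard deviation of Equation 3 can be sketched as follows. Note that Equation 3 divides by the sample count (population standard deviation), which matches `statistics.pstdev`; the function name is illustrative.

```python
from statistics import fmean, pstdev

def angle_statistics(angle_samples):
    """Equations 2 and 3: mean and population standard deviation of
    repeated angle determinations for one sensor (e.g. 100 samples).

    Returns (average angle, standard deviation); the deviation serves
    as the per-sensor noise measure used in the weighting step.
    """
    return fmean(angle_samples), pstdev(angle_samples)
```

A noisier sensor produces a larger standard deviation and, per the weighting described at 740, contributes less to the fused trailer angle.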


At 760, a weighted trailer angle may be expressed as:

AO=W1*A1+W2*A2+W4*A4+W5*A5  Equation 5


At 770, an error and confidence level of the weighted trailer angle may be expressed as:

AError=W1*SD1+W2*SD2+W4*SD4+W5*SD5  Equation 6
ACL=[AO−AError,AO+AError]  Equation 7
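Equations 4 through 7 can be sketched as one fusion step. The cascaded weighting is implemented exactly as printed in Equations 4 (which, by construction of W5, guarantees the weights sum to 1); the function name and tuple interfaces are illustrative assumptions.

```python
def fuse_angles(angles, sds):
    """Equations 4-7: weight the per-sensor angles (A1, A2, A4, A5) by
    their standard deviations, then form the weighted trailer angle AO,
    its error AError, and the confidence interval ACL.
    """
    a1, a2, a4, a5 = angles
    sd1, sd2, sd4, sd5 = sds
    # Equations 4, as written in the patent
    w1 = 1 - sd1 / (sd1 + sd2 + sd4 + sd5)
    w2 = 1 - w1 - sd2 / (sd2 + sd4 + sd5)
    w4 = 1 - w1 - w2 - sd4 / (sd4 + sd5)
    w5 = 1 - w1 - w2 - w4                                # weights sum to 1
    ao = w1 * a1 + w2 * a2 + w4 * a4 + w5 * a5           # Equation 5
    aerr = w1 * sd1 + w2 * sd2 + w4 * sd4 + w5 * sd5     # Equation 6
    return ao, aerr, (ao - aerr, ao + aerr)              # Equation 7
```

Because the weights sum to 1, four identical angle inputs fuse to that same angle regardless of the standard deviations.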


In some example embodiments, zero detection of the trailer angle may be performed. Zero detection may begin when the error between the last angle output and the current angle value is 5 degrees or larger (or another value, such as 4 degrees). Ultrasonic sensor 133 may be used as an initial reference of a zero angle for the other ultrasonic sensors. During zero detection, each angle may be calculated using the following formulas:

A1=arctan((D1-2−D3-1)/Dis1)
A2=arctan((D2-2−D3-1)/Dis2)
A4=arctan((D4-2−D3-1)/Dis4)
A5=arctan((D5-2−D3-1)/Dis5),  Equations 8

where D1-2, D2-2, D4-2, D5-2 are the current distance measurements of sensors 131, 132, 134, and 135, respectively. Sensor 133 may be mounted in the middle of the tractor as shown in FIGS. 3A and 3B. The relative movement of the trailer and/or trailer angle will not influence, or will negligibly influence, the distance measured by sensor 133. Accordingly, sensor 133 may be used as a reference for the system and to compare to the distances from sensors 131, 132, 134, and 135.
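Equations 8 can be sketched as follows, using the center sensor's current distance D3-1 as the shared reference. The dict interfaces and function name are illustrative assumptions.

```python
import math

def zero_detection_angles(current: dict, center_ref_m: float,
                          offsets: dict) -> dict:
    """Equations 8: per-sensor angles during zero detection.

    current:      sensor id -> current distance (D1-2, D2-2, D4-2, D5-2)
    center_ref_m: current distance of center sensor 133 (D3-1)
    offsets:      sensor id -> lateral offset Dis from sensor 133
    Returns sensor id -> angle in degrees.
    """
    return {sid: math.degrees(math.atan((d - center_ref_m) / offsets[sid]))
            for sid, d in current.items()}
```

Because sensor 133 sits at the tractor's centerline, its distance is nearly unaffected by the trailer angle, which is what makes it usable as the reference here.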


In some example embodiments, a process 700 may be performed to determine a trailer angle between a trailer and a vehicle. At 710, the process includes receiving, from each of one or more ultrasonic sensors, a distance between each ultrasonic sensor and a front-end of the trailer attached to the vehicle. At 720, the process includes determining one or more angles, each angle corresponding to a distance received from the one or more ultrasonic sensors, wherein each angle is between the vehicle and the trailer, and wherein a trailer angle is determined from the one or more angles. The process may further include features described above in various combinations.



FIG. 8 depicts an example of an apparatus 800 that can be used to implement some of the techniques described in the present document. For example, the hardware platform 800 may implement the process 700, or other processes described above, and/or may implement the various modules described herein. The hardware platform 800 may include a processor 802 that can execute code to implement a method. The hardware platform 800 may include a memory 804 that may be used to store processor-executable code and/or store data. The hardware platform 800 may further include a communication interface 806. For example, the communication interface 806 may implement one or more wired or wireless communication protocols (Ethernet, LTE, Wi-Fi, Bluetooth, and so on).


Implementations of the subject matter and the functional operations described in this patent document can be implemented in various systems, semiconductor devices, ultrasonic devices, digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of aspects of the subject matter described in this specification can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a tangible and non-transitory computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing unit” or “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.


A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.


The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).


Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random-access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of nonvolatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.


While this patent document contains many specifics, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this patent document in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.


Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Moreover, the separation of various system components in the embodiments described in this patent document should not be understood as requiring such separation in all embodiments.


Only a few implementations and examples are described, and other implementations, enhancements and variations can be made based on what is described and illustrated in this patent document.

Claims
  • 1. A system for determining a trailer angle between a trailer and a vehicle, comprising: one or more ultrasonic sensors, wherein each ultrasonic sensor is mountable to the vehicle to determine a distance from the ultrasonic sensor to a front-end surface of the trailer attached to the vehicle, and wherein the distance is oriented normal to the vehicle; and an ultrasonic control unit configured to receive the distance from each of the one or more ultrasonic sensors via a communications interface, wherein the ultrasonic control unit determines one or more angles, each angle corresponding to a distance received from the one or more ultrasonic sensors, wherein each angle is between the vehicle and the trailer, and wherein the ultrasonic control unit determines the trailer angle from the one or more angles, wherein a first angle corresponding to a first ultrasonic sensor is determined based on one or more geometrical relationships between a position of the first ultrasonic sensor and the front-end surface of the trailer.
  • 2. The system of claim 1, wherein the first angle is determined from: a first neutral distance between the first ultrasonic sensor and the front-end surface of the trailer when the trailer is in line with the vehicle, a first angled distance when the trailer is angled with respect to the vehicle, and a first offset distance between a center of the first ultrasonic sensor and a center of the vehicle.
  • 3. The system of claim 2, wherein the first neutral distance is determined when a steering angle of the vehicle is about zero degrees and the first neutral distance is averaged or filtered over a period of time.
  • 4. The system of claim 3, wherein the ultrasonic control unit comprises: at least one processor; and at least one memory storing executable instructions that when executed by the at least one processor perform at least: determining the first angle corresponding to a first ultrasonic sensor as: A1 = arctangent((D1-1 − D1-2)/Dis1), wherein A1 is the first angle between the vehicle and the trailer, wherein D1-1 is a first neutral distance between the first ultrasonic sensor and the front-end surface of the trailer, wherein the first neutral distance is determined when the trailer is in line with the vehicle, wherein D1-2 is a first angled distance between the first ultrasonic sensor and the front-end surface of the trailer, wherein the first angled distance is determined when the trailer is angled with respect to the vehicle, and wherein Dis1 is a first offset distance, wherein the first offset distance is between the center of the first ultrasonic sensor and the center of the vehicle.
  • 5. The system of claim 2, wherein the first neutral distance is determined when: a steering angle of the vehicle is about zero degrees, and the vehicle is travelling about 10 kilometers per hour or more.
  • 6. The system of claim 1, wherein the trailer angle is determined as an average angle of the one or more angles.
  • 7. The system of claim 6, wherein the average angle is weighted based on a standard deviation for distance values received at the ultrasonic control unit from each of the one or more ultrasonic sensors.
  • 8. The system of claim 7, wherein the ultrasonic control unit determines an error interval and a confidence level based on the standard deviation for distance values received at the ultrasonic control unit from each of the one or more ultrasonic sensors.
  • 9. The system of claim 1, wherein the one or more ultrasonic sensors comprise a plurality of ultrasonic transducers mounted to the vehicle in an inverted “V” pattern, a “W” pattern, or an inverted “W” pattern.
  • 10. A method for determining a trailer angle between a trailer and a vehicle, comprising: receiving, from each of one or more ultrasonic sensors attached to the vehicle, information about a distance between each ultrasonic sensor and a front-end surface of the trailer attached to the vehicle, and wherein the distance is oriented normal to the vehicle; and determining one or more angles, each angle corresponding to the information about the distance received from the one or more ultrasonic sensors, wherein each angle is an estimate of alignment between the vehicle and the trailer, and wherein a trailer angle is determined from the one or more angles.
  • 11. The method of claim 10, wherein a first angle corresponding to a first ultrasonic sensor is determined based on one or more geometrical relationships between a position of the first ultrasonic sensor and the front-end surface of the trailer.
  • 12. The method of claim 11, wherein the first angle is determined from: a first neutral distance between the first ultrasonic sensor and the front-end surface of the trailer when the trailer is in line with the vehicle; a first angled distance when the trailer is angled with respect to the vehicle; and a first offset distance between a center of the first ultrasonic sensor and a center of the vehicle.
  • 13. The method of claim 12, wherein the first neutral distance is determined when a steering angle of the vehicle is about zero degrees and the vehicle is travelling at about 10 kilometers per hour or more.
  • 14. The method of claim 11, further comprising: determining the first angle corresponding to a first ultrasonic sensor as: A1 = arctangent((D1-1 − D1-2)/Dis1),
  • 15. The method of claim 11, wherein the one or more ultrasonic sensors are in a two-dimensional pattern in a plane that is perpendicular to an axis between the vehicle and the trailer.
  • 16. The method of claim 10, wherein the trailer angle is determined as an average angle of the one or more angles.
  • 17. The method of claim 16, wherein the average angle is weighted based on a standard deviation for distance values received from each of the one or more ultrasonic sensors.
  • 18. The method of claim 17, wherein an error interval and a confidence level is determined based on the standard deviation for the distance values received from each of the one or more ultrasonic sensors.
  • 19. A non-transitory computer readable medium storing executable instructions for determining a trailer angle between a trailer and a vehicle that when executed by at least one processor perform at least: receiving, from each of one or more ultrasonic sensors, a distance between each ultrasonic sensor and a front-end surface of the trailer attached to the vehicle, and wherein the distance is oriented normal to the vehicle; determining one or more angles, each angle corresponding to a distance received from the one or more ultrasonic sensors, wherein each angle is between the vehicle and the trailer; and determining a first angle corresponding to a first ultrasonic sensor based on one or more geometrical relationships between a position of the first ultrasonic sensor and the front-end surface of the trailer.
  • 20. The non-transitory computer readable medium of claim 19, wherein the first angle is determined from: a first neutral distance between the first ultrasonic sensor and the front-end surface of the trailer when the trailer is in line with the vehicle, a first angled distance when the trailer is angled with respect to the vehicle, and a first offset distance between a center of the first ultrasonic sensor and a center of the vehicle, wherein a trailer angle is determined from the one or more angles.
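The per-sensor geometry recited in claims 4 and 14 (A1 = arctangent((D1-1 − D1-2)/Dis1)) and the variance-weighted averaging of claims 6 and 7 can be sketched as follows. This is an illustrative sketch only, not part of the claimed subject matter: the function names, the use of degrees, and the inverse-variance weighting constant are assumptions not specified by the claims.

```python
import math
import statistics

def sensor_angle(neutral_dist, angled_dist, offset_dist):
    """Angle (degrees) for one sensor per the claimed relationship:
    A1 = arctangent((D1-1 - D1-2) / Dis1), where neutral_dist is the
    distance with the trailer in line with the vehicle, angled_dist the
    current distance, and offset_dist the sensor's lateral offset from
    the vehicle centerline."""
    return math.degrees(math.atan((neutral_dist - angled_dist) / offset_dist))

def trailer_angle(readings):
    """Combine per-sensor angles into one trailer angle.

    readings: list of (neutral_dist, recent_samples, offset_dist) tuples,
    one per ultrasonic sensor. Each sensor's recent distance samples are
    averaged, and the per-sensor angles are weighted by the inverse of the
    sample variance, so noisier sensors contribute less (one plausible
    reading of the standard-deviation weighting in claim 7)."""
    angles, weights = [], []
    for neutral, samples, offset in readings:
        angled = statistics.mean(samples)
        sigma = statistics.stdev(samples) if len(samples) > 1 else 0.0
        angles.append(sensor_angle(neutral, angled, offset))
        weights.append(1.0 / (sigma ** 2 + 1e-6))  # regularized inverse variance
    return sum(a * w for a, w in zip(angles, weights)) / sum(weights)
```

For example, a sensor whose neutral distance is 2.0 m, current distance 1.5 m, and centerline offset 0.5 m yields arctangent(0.5/0.5) = 45 degrees; the neutral distance itself would be captured while the steering angle is about zero, as in claims 3 and 5.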
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/174,980, titled “DETERMINING AN ANGLE BETWEEN A TOW VEHICLE AND A TRAILER,” filed on Oct. 30, 2018, published as U.S. Pre-Grant Publication 2020-0132835 on Apr. 30, 2020, the disclosure of which is hereby incorporated by reference in its entirety herein.

US Referenced Citations (231)
Number Name Date Kind
6084870 Wooten et al. Jul 2000 A
6263088 Crabtree Jul 2001 B1
6594821 Banning et al. Jul 2003 B1
6777904 Degner Aug 2004 B1
6975923 Spriggs Dec 2005 B2
7103460 Breed Sep 2006 B1
7689559 Canright Mar 2010 B2
7742841 Sakai et al. Jun 2010 B2
7783403 Breed Aug 2010 B2
7844595 Canright Nov 2010 B2
8041111 Wilensky Oct 2011 B1
8064643 Stein Nov 2011 B2
8082101 Stein Dec 2011 B2
8164628 Stein Apr 2012 B2
8175376 Marchesotti May 2012 B2
8271871 Marchesotti Sep 2012 B2
8346480 Trepagnier et al. Jan 2013 B2
8378851 Stein Feb 2013 B2
8392117 Dolgov Mar 2013 B2
8401292 Park Mar 2013 B2
8412449 Trepagnier Apr 2013 B2
8478072 Aisaka Jul 2013 B2
8532870 Hoetzer Sep 2013 B2
8553088 Stein Oct 2013 B2
8706394 Trepagnier et al. Apr 2014 B2
8718861 Montemerlo et al. May 2014 B1
8788134 Litkouhi Jul 2014 B1
8908041 Stein Dec 2014 B2
8917169 Schofield Dec 2014 B2
8917170 Padula Dec 2014 B2
8963913 Baek Feb 2015 B2
8965621 Urmson Feb 2015 B1
8981966 Stein Mar 2015 B2
8983708 Choe et al. Mar 2015 B2
8993951 Schofield Mar 2015 B2
9002632 Emigh Apr 2015 B1
9008369 Schofield Apr 2015 B2
9025880 Perazzi May 2015 B2
9042648 Wang May 2015 B2
9081385 Ferguson et al. Jul 2015 B1
9088744 Grauer et al. Jul 2015 B2
9111444 Kaganovich Aug 2015 B2
9117133 Barnes Aug 2015 B2
9118816 Stein Aug 2015 B2
9120485 Dolgov Sep 2015 B1
9122954 Srebnik Sep 2015 B2
9134402 Sebastian Sep 2015 B2
9145116 Clarke Sep 2015 B2
9147255 Zhang Sep 2015 B1
9156473 Clarke Oct 2015 B2
9176006 Stein Nov 2015 B2
9179072 Stein Nov 2015 B2
9183447 Gdalyahu Nov 2015 B1
9185360 Stein Nov 2015 B2
9191634 Schofield Nov 2015 B2
9214084 Grauer et al. Dec 2015 B2
9219873 Grauer et al. Dec 2015 B2
9233659 Rosenbaum Jan 2016 B2
9233688 Clarke Jan 2016 B2
9248832 Huberman Feb 2016 B2
9248835 Tanzmeister Feb 2016 B2
9251708 Rosenbaum Feb 2016 B2
9277132 Berberian Mar 2016 B2
9280711 Stein Mar 2016 B2
9282144 Tebay et al. Mar 2016 B2
9286522 Stein Mar 2016 B2
9297641 Stein Mar 2016 B2
9299004 Lin Mar 2016 B2
9315192 Zhu Apr 2016 B1
9317033 Ibanez-Guzman Apr 2016 B2
9317776 Honda Apr 2016 B1
9330334 Lin May 2016 B2
9342074 Dolgov May 2016 B2
9347779 Lynch May 2016 B1
9355635 Gao May 2016 B2
9365214 Ben Shalom Jun 2016 B2
9399397 Mizutani Jul 2016 B2
9418549 Kang et al. Aug 2016 B2
9428192 Schofield Aug 2016 B2
9436880 Bos Sep 2016 B2
9438878 Niebla Sep 2016 B2
9443163 Springer Sep 2016 B2
9446765 Ben Shalom Sep 2016 B2
9459515 Stein Oct 2016 B2
9466006 Duan Oct 2016 B2
9476970 Fairfield Oct 2016 B1
9483839 Kwon Nov 2016 B1
9490064 Hirosawa Nov 2016 B2
9494935 Okumura et al. Nov 2016 B2
9507346 Levinson et al. Nov 2016 B1
9513634 Pack et al. Dec 2016 B2
9531966 Stein Dec 2016 B2
9535423 Debreczeni Jan 2017 B1
9538113 Grauer et al. Jan 2017 B2
9547985 Tuukkanen Jan 2017 B2
9549158 Grauer et al. Jan 2017 B2
9555803 Pawlicki Jan 2017 B2
9568915 Berntorp Feb 2017 B1
9587952 Slusar Mar 2017 B1
9599712 Van Der Tempel et al. Mar 2017 B2
9600889 Boisson et al. Mar 2017 B2
9602807 Crane et al. Mar 2017 B2
9612123 Levinson et al. Apr 2017 B1
9620010 Grauer et al. Apr 2017 B2
9625569 Lange Apr 2017 B2
9628565 Stenneth et al. Apr 2017 B2
9649999 Amireddy et al. May 2017 B1
9652860 Maali May 2017 B1
9669827 Ferguson et al. Jun 2017 B1
9672446 Vallesi-Gonzalez Jun 2017 B1
9690290 Prokhorov Jun 2017 B2
9701023 Zhang et al. Jul 2017 B2
9712754 Grauer et al. Jul 2017 B2
9720418 Stenneth Aug 2017 B2
9723097 Harris Aug 2017 B2
9723099 Chen Aug 2017 B2
9723233 Grauer et al. Aug 2017 B2
9726754 Massanell et al. Aug 2017 B2
9729860 Cohen et al. Aug 2017 B2
9738280 Rayes Aug 2017 B2
9739609 Lewis Aug 2017 B1
9746550 Nath Aug 2017 B2
9753128 Schweizer et al. Sep 2017 B2
9753141 Grauer et al. Sep 2017 B2
9754490 Kentley et al. Sep 2017 B2
9760837 Nowozin et al. Sep 2017 B1
9766625 Boroditsky et al. Sep 2017 B2
9769456 You et al. Sep 2017 B2
9773155 Shotton et al. Sep 2017 B2
9779276 Todeschini et al. Oct 2017 B2
9785149 Wang et al. Oct 2017 B2
9805294 Liu et al. Oct 2017 B2
9810785 Grauer et al. Nov 2017 B2
9823339 Cohen Nov 2017 B2
9953236 Huang Apr 2018 B1
10147193 Huang Dec 2018 B2
10223806 Yi et al. Mar 2019 B1
10223807 Yi et al. Mar 2019 B1
10410055 Wang et al. Sep 2019 B2
10670479 Reed Jun 2020 B2
11221262 Reed Jan 2022 B2
20030114980 Klausner et al. Jun 2003 A1
20030174773 Comaniciu Sep 2003 A1
20040264763 Mas et al. Dec 2004 A1
20070067077 Liu Mar 2007 A1
20070183661 El-Maleh Aug 2007 A1
20070183662 Wang Aug 2007 A1
20070230792 Shashua Oct 2007 A1
20070286526 Abousleman Dec 2007 A1
20080249667 Horvitz Oct 2008 A1
20090040054 Wang Feb 2009 A1
20090087029 Coleman Apr 2009 A1
20100049397 Lin Feb 2010 A1
20100111417 Ward May 2010 A1
20100226564 Marchesotti Sep 2010 A1
20100281361 Marchesotti Nov 2010 A1
20110142283 Huang Jun 2011 A1
20110206282 Aisaka Aug 2011 A1
20110247031 Jacoby Oct 2011 A1
20110257860 Getman Oct 2011 A1
20120041636 Johnson et al. Feb 2012 A1
20120105639 Stein May 2012 A1
20120140076 Rosenbaum Jun 2012 A1
20120274629 Baek Nov 2012 A1
20120314070 Zhang et al. Dec 2012 A1
20130051613 Bobbitt et al. Feb 2013 A1
20130083959 Owechko Apr 2013 A1
20130182134 Grundmann et al. Jul 2013 A1
20130204465 Phillips et al. Aug 2013 A1
20130266187 Bulan Oct 2013 A1
20130329052 Chew Dec 2013 A1
20140072170 Zhang Mar 2014 A1
20140104051 Breed Apr 2014 A1
20140142799 Ferguson et al. May 2014 A1
20140143839 Ricci May 2014 A1
20140145516 Hirosawa May 2014 A1
20140198184 Stein Jul 2014 A1
20140321704 Partis Oct 2014 A1
20140334668 Saund Nov 2014 A1
20150062304 Stein Mar 2015 A1
20150269438 Samarsekera et al. Sep 2015 A1
20150310370 Burry Oct 2015 A1
20150353082 Lee et al. Dec 2015 A1
20160008988 Kennedy Jan 2016 A1
20160026787 Nairn et al. Jan 2016 A1
20160037064 Stein Feb 2016 A1
20160094774 Li Mar 2016 A1
20160118080 Chen Apr 2016 A1
20160129907 Kim May 2016 A1
20160165157 Stein Jun 2016 A1
20160210528 Duan Jul 2016 A1
20160275766 Venetianer et al. Sep 2016 A1
20160280261 Kyrtsos Sep 2016 A1
20160321381 English Nov 2016 A1
20160334230 Ross et al. Nov 2016 A1
20160342837 Hong et al. Nov 2016 A1
20160347322 Clarke et al. Dec 2016 A1
20160375907 Erban Dec 2016 A1
20170053169 Cuban et al. Feb 2017 A1
20170061632 Linder et al. Mar 2017 A1
20170080928 Wasiek Mar 2017 A1
20170124476 Levinson et al. May 2017 A1
20170134631 Zhao et al. May 2017 A1
20170177951 Yang et al. Jun 2017 A1
20170301104 Qian Oct 2017 A1
20170305423 Green Oct 2017 A1
20170318407 Meister Nov 2017 A1
20170334484 Koravadi Nov 2017 A1
20180057052 Dodd Mar 2018 A1
20180151063 Pun May 2018 A1
20180158197 Dasgupta Jun 2018 A1
20180260956 Huang Sep 2018 A1
20180283892 Behrendt Oct 2018 A1
20180373980 Huval Dec 2018 A1
20190025853 Julian Jan 2019 A1
20190065863 Luo et al. Feb 2019 A1
20190066329 Yi et al. Feb 2019 A1
20190066330 Yi et al. Feb 2019 A1
20190066344 Yi et al. Feb 2019 A1
20190084477 Gomez-Mendoza Mar 2019 A1
20190108384 Wang et al. Apr 2019 A1
20190132391 Thomas May 2019 A1
20190132392 Liu May 2019 A1
20190170867 Wang Jun 2019 A1
20190210564 Han Jul 2019 A1
20190210613 Sun Jul 2019 A1
20190225286 Schutt Jul 2019 A1
20190236950 Li Aug 2019 A1
20190266420 Ge Aug 2019 A1
20200331441 Sielhorst Oct 2020 A1
20220146285 Daiz May 2022 A1
Foreign Referenced Citations (45)
Number Date Country
106340197 Jan 2017 CN
106781591 May 2017 CN
108010360 May 2018 CN
108016444 May 2018 CN
2608513 Sep 1977 DE
102016105259 Sep 2016 DE
102017125662 May 2018 DE
0433858 Jun 1991 EP
890470 Jan 1999 EP
1754179 Feb 2007 EP
2448251 May 2012 EP
2448251 May 2012 EP
2463843 Jun 2012 EP
2761249 Aug 2014 EP
2946336 Nov 2015 EP
2993654 Mar 2016 EP
3081419 Oct 2016 EP
2470610 Dec 2010 GB
2513392 Oct 2014 GB
2010117207 May 2010 JP
100802511 Feb 2008 KR
1991009375 Jun 1991 WO
2005098739 Oct 2005 WO
2005098751 Oct 2005 WO
2005098782 Oct 2005 WO
2010109419 Sep 2010 WO
2013045612 Apr 2013 WO
2014111814 Jul 2014 WO
2014166245 Oct 2014 WO
2014201324 Dec 2014 WO
2015083009 Jun 2015 WO
2015103159 Jul 2015 WO
2015125022 Aug 2015 WO
2015186002 Dec 2015 WO
2016090282 Jun 2016 WO
2016135736 Sep 2016 WO
2017079349 May 2017 WO
2017079460 May 2017 WO
2017013875 May 2018 WO
2019040800 Feb 2019 WO
2019084491 May 2019 WO
2019084494 May 2019 WO
2019101848 May 2019 WO
2019140277 Jul 2019 WO
2019168986 Sep 2019 WO
Non-Patent Literature Citations (64)
Entry
Van Prooijen, Tom. European Application No. 21179854.1, Extended European Search Report, dated Nov. 10, 2021, pp. 1-8.
US Patent & Trademark Office, Non-final Office Action dated Aug. 26, 2022, in U.S. Appl. No. 17/346,567, 27 pages.
Office Action from corresponding Chinese Patent Application No. 201980072036.1, dated Dec. 8, 2022 (7 pages).
Chinese Search Report from corresponding Chinese Patent Application No. 201980072036.1, dated Dec. 4, 2022 (2 pages).
Carle, Patrick J.F. et al. “Global Rover Localization by Matching Lidar and Orbital 3D Maps.” IEEE, Anchorage Convention District, pp. 1-6, May 3-8, 2010. (Anchorage Alaska, US).
Caselitz, T. et al., “Monocular camera localization in 3D LiDAR maps,” European Conference on Computer Vision (2014) Computer Vision—ECCV 2014. ECCV 2014. Lecture Notes in Computer Science, vol. 8690. Springer, Cham.
Mur-Artal, R. et al., “ORB-SLAM: A Versatile and Accurate Monocular SLAM System,” IEEE Transaction on Robotics, Oct. 2015, pp. 1147-1163, vol. 31, No. 5, Spain.
Sattler, T. et al., “Are Large-Scale 3D Models Really Necessary for Accurate Visual Localization?” CVPR, IEEE, 2017, pp. 1-10.
Engel, J. et al., "LSD-SLAM: Large-Scale Direct Monocular SLAM," pp. 1-16, Munich.
Levinson, Jesse et al., Experimental Robotics, Unsupervised Calibration for Multi-Beam Lasers, pp. 179-194, 12th Ed., Oussama Khatib, Vijay Kumar, Gaurav Sukhatme (Eds.) Springer-Verlag Berlin Heidelberg 2014.
International Application No. PCT/US2019/013322, International Search Report and Written Opinion dated Apr. 2, 2019.
International Application No. PCT/US19/12934, International Search Report and Written Opinion dated Apr. 29, 2019.
International Application No. PCT/US18/53795, International Search Report and Written Opinion dated Dec. 31, 2018.
International Application No. PCT/US18/57484, International Search Report and Written Opinion dated Jan. 7, 2019.
International Application No. PCT/US2018/057851, International Search Report and Written Opinion dated Feb. 1, 2019.
International Application No. PCT/US2019/019839, International Search Report and Written Opinion dated May 23, 2019.
International Application No. PCT/US19/25995, International Search Report and Written Opinion dated Jul. 9, 2019.
Geiger, Andreas et al., “Automatic Camera and Range Sensor Calibration using a single Shot”, Robotics and Automation (ICRA), pp. 1-8, 2012 IEEE International Conference.
Zhang, Z. et al. A Flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence (vol. 22, Issue: 11, Nov. 2000).
International Application No. PCT/US2018/047830, International Search Report and Written Opinion dated Apr. 27, 2017.
Bar-Hillel, Aharon et al. “Recent progress in road and lane detection: a survey.” Machine Vision and Applications 25 (2011): 727-745.
Schindler, Andreas et al. “Generation of high precision digital maps using circular arc splines,” 2012 IEEE Intelligent Vehicles Symposium, Alcala de Henares, 2012, pp. 246-251. doi: 10.1109/IVS.2012.6232124.
International Application No. PCT/US2018/047608, International Search Report and Written Opinion dated Dec. 28, 2018.
Hou, Xiaodi and Zhang, Liqing, “Saliency Detection: A Spectral Residual Approach”, Computer Vision and Pattern Recognition, CVPR'07—IEEE Conference, pp. 1-8, 2007.
Hou, Xiaodi and Harel, Jonathan and Koch, Christof, “Image Signature: Highlighting Sparse Salient Regions”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, No. 1, pp. 194-201, 2012.
Hou, Xiaodi and Zhang, Liqing, “Dynamic Visual Attention: Searching for Coding Length Increments”, Advances in Neural Information Processing Systems, vol. 21, pp. 681-688, 2008.
Li, Yin and Hou, Xiaodi and Koch, Christof and Rehg, James M. and Yuille, Alan L., “The Secrets of Salient Object Segmentation”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 280-287, 2014.
Zhou, Bolei and Hou, Xiaodi and Zhang, Liqing, “A Phase Discrepancy Analysis of Object Motion”, Asian Conference on Computer Vision, pp. 225-238, Springer Berlin Heidelberg, 2010.
Hou, Xiaodi and Yuille, Alan and Koch, Christof, “Boundary Detection Benchmarking Beyond F-Measures”, Computer Vision and Pattern Recognition, CVPR'13, vol. 2013, pp. 1-8, IEEE, 2013.
Hou, Xiaodi and Zhang, Liqing, “Color Conceptualization”, Proceedings of the 15th ACM International Conference on Multimedia, pp. 265-268, ACM, 2007.
Hou, Xiaodi and Zhang, Liqing, “Thumbnail Generation Based on Global Saliency”, Advances in Cognitive Neurodynamics, ICCN 2007, pp. 999-1003, Springer Netherlands, 2008.
Hou, Xiaodi and Yuille, Alan and Koch, Christof, “A Meta-Theory of Boundary Detection Benchmarks”, arXiv preprint arXiv: 1302.5985, 2013.
Li, Yanghao and Wang, Naiyan and Shi, Jianping and Liu, Jiaying and Hou, Xiaodi, “Revisiting Batch Normalization for Practical Domain Adaptation”, arXiv preprint arXiv: 1603.04779, 2016.
Li, Yanghao and Wang, Naiyan and Liu, Jiaying and Hou, Xiaodi, “Demystifying Neural Style Transfer”, arXiv preprint arXiv: 1701.01036, 2017.
Hou, Xiaodi and Zhang, Liqing, “A Time-Dependent Model of Information Capacity of Visual Attention”, International Conference on Neural Information Processing, pp. 127-136, Springer Berlin Heidelberg, 2006.
Wang, Panqu and Chen, Pengfei and Yuan, Ye and Liu, Ding and Huang, Zehua and Hou, Xiaodi and Cottrell, Garrison, “Understanding Convolution for Semantic Segmentation”, arXiv preprint arXiv: 1702.08502, 2017.
Li, Yanghao and Wang, Naiyan and Liu, Jiaying and Hou, Xiaodi, “Factorized Bilinear Models for Image Recognition”, arXiv preprint arXiv: 1611.05709, 2016.
Hou, Xiaodi, “Computational Modeling and Psychophysics in Low and Mid-Level Vision”, California Institute of Technology, 2014.
Spinello, Luciano, Triebel, Rudolph, Siegwart, Roland, “Multiclass Multimodal Detection and Tracking in Urban Environments”, Sage Journals, vol. 29 Issue 12, pp. 1498-1515 Article first published online: Oct. 7, 2010; Issue published: Oct. 1, 2010.
Matthew Barth, Carrie Malcolm, Theodore Younglove, and Nicole Hill, “Recent Validation Efforts for a Comprehensive Modal Emissions Model”, Transportation Research Record 1750, Paper No. 01-0326, College of Engineering, Center for Environmental Research and Technology, University of California, Riverside, CA 92521, date unknown.
Kyoungho Ahn, Hesham Rakha, “The Effects of Route Choice Decisions on Vehicle Energy Consumption and Emissions”, Virginia Tech Transportation Institute, Blacksburg, VA 24061, date unknown.
Ramos, Sebastian, Gehrig, Stefan, Pinggera, Peter, Franke, Uwe, Rother, Carsten, "Detecting Unexpected Obstacles for Self-Driving Cars: Fusing Deep Learning and Geometric Modeling", arXiv: 1612.06573v1 [cs.CV] Dec. 20, 2016.
Schroff, Florian, Dmitry Kalenichenko, James Philbin, (Google), “FaceNet: A Unified Embedding for Face Recognition and Clustering”, CVPR 2015.
Dai, Jifeng, Kaiming He, Jian Sun, (Microsoft Research), “Instance-aware Semantic Segmentation via Multi-task Network Cascades”, CVPR 2016.
Huval, Brody, Tao Wang, Sameep Tandon, Jeff Kiske, Will Song, Joel Pazhayampallil, Mykhaylo Andriluka, Pranav Rajpurkar, Toki Migimatsu, Royce Cheng-Yue, Fernando Mujica, Adam Coates, Andrew Y. Ng, “An Empirical Evaluation of Deep Learning on Highway Driving”, arXiv: 1504.01716v3 [cs.RO] Apr. 17, 2015.
Tian Li, “Proposal Free Instance Segmentation Based on Instance-aware Metric”, Department of Computer Science, Cranberry-Lemon University, Pittsburgh, PA., date unknown.
Mohammad Norouzi, David J. Fleet, Ruslan Salakhutdinov, “Hamming Distance Metric Learning”, Departments of Computer Science and Statistics, University of Toronto, date unknown.
Jain, Suyog Dutt, Grauman, Kristen, "Active Image Segmentation Propagation", In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, Jun. 2016.
MacAodha, Oisin, Campbell, Neill D.F., Kautz, Jan, Brostow, Gabriel J., “Hierarchical Subquery Evaluation for Active Learning on a Graph”, In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014.
Kendall, Alex, Gal, Yarin, “What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision”, arXiv: 1703.04977v1 [cs.CV] Mar. 15, 2017.
Wei, Junqing, John M. Dolan, Bakhtiar Litkhouhi, “A Prediction- and Cost Function-Based Algorithm for Robust Autonomous Freeway Driving”, 2010 IEEE Intelligent Vehicles Symposium, University of California, San Diego, CA, USA, Jun. 21-24, 2010.
Peter Welinder, Steve Branson, Serge Belongie, Pietro Perona, “The Multidimensional Wisdom of Crowds” http://www.vision.caltech.edu/visipedia/papers/WelinderEtaINIPS10.pdf, 2010.
Kai Yu, Yang Zhou, Da Li, Zhang Zhang, Kaiqi Huang, "Large-scale Distributed Video Parsing and Evaluation Platform", Center for Research on Intelligent Perception and Computing, Institute of Automation, Chinese Academy of Sciences, China, arXiv: 1611.09580v1 [cs.CV] Nov. 29, 2016.
P. Guarneri, G. Rocca and M. Gobbi, “A Neural-Network-Based Model for the Dynamic Simulation of the Tire/Suspension System While Traversing Road Irregularities,” in IEEE Transactions on Neural Networks, vol. 19, No. 9, pp. 1549-1563, Sep. 2008.
C. Yang, Z. Li, R. Cui and B. Xu, “Neural Network-Based Motion Control of an Underactuated Wheeled Inverted Pendulum Model,” in IEEE Transactions on Neural Networks and Learning Systems, vol. 25, No. 11, pp. 2004-2016, Nov. 2014.
Stephan R. Richter, Vibhav Vineet, Stefan Roth, Vladlen Koltun, “Playing for Data: Ground Truth from Computer Games”, Intel Labs, European Conference on Computer Vision (ECCV), Amsterdam, the Netherlands, 2016.
Thanos Athanasiadis, Phivos Mylonas, Yannis Avrithis, and Stefanos Kollias, “Semantic Image Segmentation and Object Labeling”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 17, No. 3, Mar. 2007.
Marius Cordts, Mohamed Omran, Sebastian Ramos, Timo Rehfeld, Markus Enzweiler Rodrigo Benenson, Uwe Franke, Stefan Roth, and Bernt Schiele, “The Cityscapes Dataset for Semantic Urban Scene Understanding”, Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, Nevada, 2016.
Adhiraj Somani, Nan Ye, David Hsu, and Wee Sun Lee, “DESPOT: Online POMDP Planning with Regularization”, Department of Computer Science, National University of Singapore, date unknown.
Adam Paszke, Abhishek Chaurasia, Sangpil Kim, and Eugenio Culurciello. Enet: A deep neural network architecture for real-time semantic segmentation. CoRR, abs/1606.02147, 2016.
Szeliski, Richard, “Computer Vision: Algorithms and Applications” http://szeliski.org/Book/, 2010.
Office Action Mailed in Chinese Application No. 201810025516.X, dated Sep. 3, 2019.
Luo, Yi et al. U.S. Appl. No. 15/684,389 Notice of Allowance dated Oct. 9, 2019.
International Application No. PCT/US19/58863, International Search Report and Written Opinion dated Feb. 14, 2020, pp. 1-11.
Related Publications (1)
Number Date Country
20210181338 A1 Jun 2021 US
Continuations (1)
Number Date Country
Parent 16174980 Oct 2018 US
Child 17187088 US