This document relates to systems, apparatus, and methods to measure angles and/or orientations on a vehicle with multiple drivable sections.
Autonomous vehicle navigation is a technology that can allow a vehicle to sense the position and movement of vehicles around an autonomous vehicle and, based on the sensing, control the autonomous vehicle to safely navigate towards a destination. An autonomous vehicle may control the steering angle, a throttle amount to control the speed of the autonomous vehicle, gear changes, and/or a braking amount to control the extent to which the brakes are engaged. An autonomous vehicle may operate in several modes. In some cases, an autonomous vehicle may allow a driver to operate the autonomous vehicle as a conventional vehicle by controlling the steering, throttle, clutch, gear shifter, and/or other devices. In other cases, a driver may engage the autonomous vehicle navigation technology to allow the vehicle to drive itself.
This patent document describes systems, apparatus, and methods to measure angles and/or orientations (e.g., directions of rotations) of a rear drivable section of a vehicle relative to a front drivable section of the vehicle.
In an exemplary embodiment, a vehicle comprises a front drivable section that comprises a first connector and a rotary encoder assembly. The first connector is coupled to a chassis of the front drivable section, where the first connector is located towards a rear region of the front drivable section. The rotary encoder assembly comprises a base surface, a housing, and a rotary encoder. The base surface of the rotary encoder assembly is coupled to a surface located below the first connector. The housing of the rotary encoder assembly includes a first end that is at least partially open and a second end that is opposite to the first end, where the second end of the housing is connected to the base surface, and where the first end of the housing is coupled to a housing cap. The rotary encoder assembly includes the rotary encoder that is located in the housing in between the base surface and the housing cap, where the rotary encoder includes a rotatable shaft that protrudes from a first hole located in the housing cap, and where a top of the rotatable shaft located away from the rotary encoder is coupled to one or more magnets.
In some embodiments, the rotary encoder is coupled to the housing cap via a plurality of non-rigid compressible couplings that include a plurality of shoulder screws, and at least some portion of each of the plurality of shoulder screws are located in one of a plurality of springs having first ends that are located below the housing cap and second ends that are opposite to the first ends that are located above the rotary encoder. In some embodiments, the housing cap includes a first set of holes that are along a first perimeter of a first imaginary circle away from an edge of the housing cap, where each hole in the first set of holes includes a low-friction grommet through which a shoulder screw is coupled to the rotary encoder via a spring, where each shoulder screw includes a screw head on one end, a threaded surface on another opposite end, and a smooth shaft between the screw head and the threaded surface, and where the screw head of each shoulder screw is located at or above the housing cap, the smooth shaft of each shoulder screw is located in one spring, and the threaded surface of each shoulder screw is located in a body of the rotary encoder.
In some embodiments, the housing cap includes a second set of holes that are along a second perimeter of a second imaginary circle that is closer to the edge of the housing cap than the first perimeter of the first imaginary circle that includes the first set of holes, where the second set of holes include screws that couple the housing cap to a flange located at the first end of the housing. In some embodiments, a center region of the first connector includes a third hole, and where a top region of the rotary encoder assembly is accessible via the third hole in the first connector. In some embodiments, the plurality of shoulder screws and the plurality of springs are structured to have the rotary encoder retract in a first position away from the housing cap in response to an absence of a metallic material at the third hole in the first connector. In some embodiments, the plurality of shoulder screws and the plurality of springs are structured to have the rotary encoder extend in a second position towards the housing cap in response to a presence of a magnetic material at the third hole in the first connector. In some embodiments, the first connector comprises a groove located towards a rear of the front drivable section, and where the center region of the first connector where the groove ends includes the third hole.
In some embodiments, the vehicle further comprises a rear drivable section located behind the front drivable section, where the rear drivable section comprises a second connector coupled to the first connector, and where the second connector is magnetically coupled to the one or more magnets located on the top of the rotatable shaft of the rotary encoder assembly via the third hole in the first connector. In some embodiments, the rotatable shaft of the rotary encoder is structured to have a rotational movement that corresponds to a circular movement of the second connector of the rear drivable section, and where the circular movement of the second connector of the rear drivable section is translated to the rotational movement of the rotatable shaft via the one or more magnets. In some embodiments, the first connector includes a fifth wheel and where the second connector includes a king pin. In some embodiments, the vehicle includes a semi-trailer truck, where the front drivable section includes a tractor unit, and where the rear drivable section includes a trailer unit.
In some embodiments, the front drivable section includes a computer comprising one or more processors and a memory configured to store one or more programs, where the one or more programs upon execution configure the one or more processors to: receive, from the rotary encoder and while the vehicle is operated on a road, information that indicates an angle or a direction of rotation of the rear drivable section relative to the front drivable section; and cause the vehicle to perform an autonomous driving operation based on the angle or the direction of rotation of the rear drivable section. In some embodiments, the one or more processors are configured to cause the vehicle to perform the autonomous driving operation by being configured to: determine that the angle of the rear drivable section is outside a range of angles allowed for the rear drivable section when the vehicle is operated on the road at a speed greater than or equal to a threshold value; and send an instruction to a motor in a steering system of the vehicle to cause the vehicle to steer to move the trailer unit within the range of angles allowed for the rear drivable section. In some embodiments, the one or more processors are further configured to: display, on a monitor located in the vehicle, the front drivable section and the rear drivable section, where an orientation of the rear drivable section relative to the front drivable section is displayed based on the angle received from the rotary encoder.
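The determine-then-steer logic described above can be illustrated with a minimal sketch. The function name, the threshold value, and the allowed range below are illustrative assumptions for clarity and are not part of the disclosed system.

```python
# Hypothetical sketch of the corrective-steering check described in the text.
# SPEED_THRESHOLD_MPH and ALLOWED_RANGE_DEG are assumed example values.

SPEED_THRESHOLD_MPH = 40.0
ALLOWED_RANGE_DEG = (179.0, 181.0)  # allowed trailer angle relative to the tractor

def corrective_steering_needed(trailer_angle_deg: float, speed_mph: float) -> bool:
    """Return True when the trailer angle falls outside the allowed range
    while the vehicle is operated at or above the threshold speed."""
    low, high = ALLOWED_RANGE_DEG
    return speed_mph >= SPEED_THRESHOLD_MPH and not (low <= trailer_angle_deg <= high)
```

When this check returns True, the control computer would send an instruction to the steering motor, as described above; at lower speeds the check is suppressed.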
In some embodiments, the rotary encoder is connected to a movable cable, and the base surface includes a second hole located within a region where the housing is connected to the base surface such that at least some portion of the movable cable enters the housing via the second hole.
In an exemplary embodiment, a rotary encoder assembly comprises a base surface, a housing, and a rotary encoder. The base surface of the rotary encoder assembly includes a plurality of holes located close to an edge of the base surface. The housing of the rotary encoder assembly includes a first end that is at least partially open and a second end that is opposite to the first end, where the second end of the housing is connected to the base surface, and where the first end of the housing is coupled to a housing cap. The rotary encoder assembly includes the rotary encoder that is located in the housing in between the base surface and the housing cap, where the rotary encoder includes a rotatable shaft that protrudes from a first hole located in the housing cap, and where a top of the rotatable shaft located away from the rotary encoder is coupled to one or more magnets.
In some embodiments, the top of the rotatable shaft is coupled to the one or more magnets via a shaft adapter that is coupled to the rotatable shaft. In some embodiments, the base surface includes a second hole located within a region where the housing is connected to the base surface such that the rotary encoder is connectable to a movable cable via the second hole, and the second hole in the base surface includes a low-friction grommet through which the movable cable is connectable to the rotary encoder. In some embodiments, the rotary encoder is coupled to the housing cap via a plurality of non-rigid compressible couplings. In some embodiments, the first hole located in the housing cap includes a low-friction grommet through which at least some of the rotatable shaft protrudes from the first hole in the housing cap.
In yet another exemplary aspect, the above-described methods and the methods described in this patent document are embodied in a computer readable program stored on a non-transitory computer readable media. The computer readable program includes code that when executed by a processor, causes the processor to perform the methods described in this patent document.
In yet another exemplary embodiment, a device is disclosed that is configured or operable to perform the above-described methods and/or methods described in this patent document.
The above and other aspects and their implementations are described in greater detail in the drawings, the descriptions, and the claims.
Development in autonomous driving technology has led to the development of passenger vehicles that can autonomously drive passengers to a destination. However, certain unique challenges need to be addressed when autonomous driving technology is employed in a vehicle with multiple drivable sections (e.g., a semi-trailer truck). For example, semi-trailer trucks can have multiple drivable sections where, for example, a tractor unit where a driver may sit moves differently than a trailer unit where goods may be located, where the trailer unit is connected to the tractor unit. Unlike semi-trailer trucks, passenger vehicles can be more easily maneuvered on roads at least because passenger vehicles tend to have a single rigid body. This patent document describes technology that can enable a vehicle with multiple drivable sections (e.g., a semi-trailer truck with a tractor unit and a trailer unit, or a truck or a car with a fifth wheel camper) to measure an angle and/or orientation (e.g., direction of rotation) of a rear drivable section (e.g., trailer unit or fifth wheel camper) relative to a front drivable section (e.g., a tractor unit, truck, or car) so that the vehicle can be autonomously driven by taking into account, for example, the angle and/or orientation of the rear drivable section relative to the front drivable section.
As shown below, in Section I, this patent document describes the devices located on or in a vehicle that can use angle and/or orientation measurements for autonomous driving operations. In Section II of this patent document, technologies are described to enable measurement of the angle and/or orientation of a rear drivable section of a vehicle relative to a front drivable section of the vehicle. The example headings for the various sections below are used to facilitate the understanding of the disclosed subject matter and do not limit the scope of the claimed subject matter in any way. Accordingly, one or more features of one example section can be combined with one or more features of another example section.
I. Example Autonomous Vehicle Technology for Using Angle and/or Orientation Measurements
The vehicle 105 may include various vehicle subsystems that support the operation of vehicle 105. The vehicle subsystems may include a vehicle drive subsystem 142, a vehicle sensor subsystem 144, and/or a vehicle control subsystem 146. The vehicle drive subsystem 142 may include components operable to provide powered motion for the vehicle 105. In an example embodiment, the vehicle drive subsystem 142 may include an engine or motor, wheels/tires, a transmission, an electrical subsystem, and a power source.
The vehicle sensor subsystem 144 may include a number of sensors configured to sense information about an environment or condition of the vehicle 105. For example, the vehicle sensor subsystem 144 may include a rotary encoder assembly, an inertial measurement unit (IMU), a Global Positioning System (GPS) transceiver, a RADAR unit, a laser range finder/LIDAR unit, and/or one or more cameras or image capture devices. As further explained in Section II of this patent document, the rotary encoder assembly is designed or configured to provide one or more measurements related to angle and/or orientation of a rear drivable section (e.g., trailer unit) relative to the front drivable section (e.g., tractor unit). In some embodiments, the rotary encoder assembly may be an absolute encoder that can provide angle and/or orientation of the rear drivable section relative to the front drivable section. The vehicle sensor subsystem 144 may also include sensors configured to monitor internal systems of the vehicle 105 (e.g., an O2 monitor, a fuel gauge, an engine oil temperature).
The IMU may include any combination of sensors (e.g., accelerometers and gyroscopes) configured to sense position and orientation changes of the vehicle 105 based on inertial acceleration. The GPS transceiver may be any sensor configured to estimate a geographic location of the vehicle 105. For this purpose, the GPS transceiver may include a receiver/transmitter operable to provide information regarding the position of the vehicle 105 with respect to the Earth. The RADAR unit may represent a system that utilizes radio signals to sense objects within the local environment of the vehicle 105. In some embodiments, in addition to sensing the objects, the RADAR unit may additionally be configured to sense the speed and the heading of the objects proximate to the vehicle 105. The laser range finder or LIDAR unit may be any sensor configured to sense objects in the environment in which the vehicle 105 is located using lasers. The cameras may include one or more devices configured to capture a plurality of images of the environment of the vehicle 105. The cameras may be still image cameras or motion video cameras.
The vehicle control subsystem 146 may be configured to control operation of the vehicle 105 and its components. Accordingly, the vehicle control subsystem 146 may include various elements such as a throttle, a brake unit, a navigation unit, and/or a steering system.
The throttle may be configured to control, for instance, the operating speed of the engine and, in turn, control the speed of the vehicle 105. The brake unit can include any combination of mechanisms configured to decelerate the vehicle 105. The brake unit can use friction to slow the wheels in a standard manner. The navigation unit may be any system configured to determine a driving path or route for the vehicle 105. The navigation unit may additionally be configured to update the driving path dynamically while the vehicle 105 is in operation. In some embodiments, the navigation unit may be configured to incorporate data from the GPS transceiver and one or more predetermined maps so as to determine the driving path for the vehicle 105. The steering system may represent any combination of mechanisms that may be operable to adjust the heading of vehicle 105 in an autonomous mode or in a driver-controlled mode.
Many or all of the functions of the vehicle 105 can be controlled by the in-vehicle control computer 150. The in-vehicle control computer 150 may include at least one data processor 170 (which can include at least one microprocessor) that executes processing instructions stored in a non-transitory computer readable medium, such as the data storage device 175. The in-vehicle control computer 150 may also represent a plurality of computing devices that may serve to control individual components or subsystems of the vehicle 105 in a distributed fashion. In some embodiments, the data storage device 175 may contain processing instructions (e.g., program logic) executable by the data processor 170 to perform various methods and/or functions of the vehicle 105, including those described in this patent document. For instance, as further explained in Section II of this patent document, the data processor 170 executes the operations associated with autonomous driving module 165 for using the information provided by the rotary encoder assembly (e.g., the angle and/or orientation of the rear drivable section relative to the front drivable section of a vehicle 105) for operating the various vehicle subsystems (e.g., the vehicle drive subsystem 142, the vehicle sensor subsystem 144, and the vehicle control subsystem 146) of the vehicle 105 to autonomously operate the vehicle 105. The data storage device 175 may contain additional instructions as well, including instructions to transmit data to, receive data from, interact with, or control one or more of the vehicle drive subsystem 142, the vehicle sensor subsystem 144, and the vehicle control subsystem 146. The in-vehicle control computer 150 can be configured to include a data processor 170 and a data storage device 175.
The in-vehicle control computer 150 may control the function of the vehicle 105 based on inputs received from various vehicle subsystems (e.g., the vehicle drive subsystem 142, the vehicle sensor subsystem 144, and the vehicle control subsystem 146). For example, the in-vehicle control computer 150 may use data from the rotary encoder assembly in order to control the steering system to turn the vehicle 105 by taking into account the orientation and angle of the rear drivable section relative to the front drivable section. In an example embodiment, the in-vehicle control computer 150 can be operable to provide control over many aspects of the vehicle 105 and its subsystems.
II. Example Rotary Encoder Assembly for Performing Angle and/or Orientation Measurements
One of the technical benefits of having the rotary encoder assembly 304 coupled to the mounting surface 316 below the fifth wheel 302 is that the rotary encoder assembly 304 can be installed on a single tractor unit rather than being installed on multiple trailer units. Furthermore, the rotary encoder assembly 304 can be installed on the tractor unit without involving the driver. Another technical benefit of the rotary encoder assembly being located on or in the tractor unit is that the rotary encoder assembly 304 can be communicably coupled (e.g., via a cable 314 in
The base surface 404 may include four holes as shown in
The flange 410 of the housing 402 includes a plurality of holes that are located around the perimeter of the flange so that a housing cap 414 can be coupled to the housing 402 via screws. For example, as shown in
The housing cap 414 can protect the rotary encoder 408 from the environment, such as from grease or debris that may fall from the trailer unit. The housing cap 414 may have a flat circular shape which can correspond to the cylindrical shape of the housing 402 in some embodiments. The shape of the housing cap 414 may extend up to an edge of the flange 410 or up to an outer wall of the housing 402. The housing cap 414 includes two sets of holes. A first set of holes in the housing cap 414 are located at a first perimeter that is close to the outer edge of the housing cap 414 so that a first set of a plurality of screws (e.g., four screws 412a-412d in
Each shoulder screw is inserted through a hole in the housing cap 414 through a low-friction grommet 420 or a smooth grommet 420 and then through a compression spring 418 to couple with the rotary encoder 408. The low-friction grommets or smooth grommets may be low-friction nylon shaft grommets located at the second set of holes in between the shoulder screws and the second set of holes. The ends of the compression springs 418 are located in between the rotary encoder 408 and the housing cap 414 so that when the compression springs 418 are mostly extended, the rotary encoder 408 is in a retracted position away from the housing cap 414 within the housing 402 (as shown in
At least some portion of the smooth shaft of each shoulder screw is located inside the compression spring 418 so that the shoulder screws 416a-416d function as guide rods for the compression springs 418 which can create non-rigid compressible connections or non-rigid compressible couplings between the rotary encoder 408 and the housing 402 and/or between the rotary encoder 408 and the housing cap 414. As shown in
The rotary encoder 408 may be a commercial off the shelf (COTS) absolute encoder that may provide angle and/or orientation (e.g., direction of rotation) information to the in-vehicle control computer. In some embodiments, the rotary encoder 408 may be a mechanical rotary encoder, an optical rotary encoder, or an electrical rotary encoder. A top region of the rotary encoder 408 includes a rotatable shaft 424. The rotary encoder 408 measures an angle and/or orientation of the rotatable shaft 424 relative to the base surface 404 or the housing 402 of the rotary encoder assembly 400 so that the rotary encoder 408 provides electrical signals to the in-vehicle control computer indicating an angle and/or orientation of the rotatable shaft 424 relative to the base surface 404 or the housing 402 of the rotary encoder assembly 400.
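An absolute encoder of the kind described above typically reports shaft position as a raw count at some fixed resolution. The following minimal sketch shows one common way such a count could be mapped to an angle in degrees; the function name and the 12-bit resolution are illustrative assumptions, not details of any particular COTS encoder.

```python
def counts_to_angle_deg(raw_counts: int, resolution_bits: int = 12) -> float:
    """Map a raw absolute-encoder count to an angle in [0, 360) degrees.

    Assumes a hypothetical encoder that divides one shaft revolution into
    2**resolution_bits equal positions (e.g., 4096 positions for 12 bits).
    """
    max_counts = 1 << resolution_bits
    return (raw_counts % max_counts) * 360.0 / max_counts
```

The in-vehicle control computer could apply such a conversion to the electrical signals from the encoder before using the angle for autonomous driving operations.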
After the rotatable shaft 424 extends with the rotary encoder 408 to couple to the king pin via the one or more magnets 430 and the shaft adapter 428 (as further explained below), the rotatable shaft 424 rotates together with the king pin as the trailer turns. Since the base surface 404 of the rotary encoder assembly 400 is coupled to the mounting surface below the fifth wheel and since the rotary encoder 408 is coupled to the housing cap via the shoulder screws, when the rotary encoder 408 with the rotatable shaft 424 extends to magnetically couple to the king pin, the rotatable shaft 424 of the rotary encoder 408 turns with the king pin while the body of the rotary encoder 408 mostly does not turn with the king pin. Thus, the circular movement of the king pin is translated to the rotatable movement or rotational movement of the rotatable shaft 424, which is used by the rotary encoder 408 to measure the angle and/or orientation of the trailer unit relative to the tractor unit. The rotatable shaft 424 protrudes or extends from a hole 426 in the center of the housing cap 414. As shown in
A top region of the rotatable shaft 424 includes a shaft adapter 428 which can be coupled to the rotatable shaft by press fit or by a screw. The shaft adapter 428 may have a bottom region (e.g., a cylindrical shape region) that can couple to the rotatable shaft 424. Above the bottom region, the shaft adapter 428 may have a top region (e.g., a cylindrical shape region) that extends outward from the bottom region of the shaft adapter 428. The top region of the shaft adapter 428 has a larger width than the bottom region of the shaft adapter 428 so that the top region of the shaft adapter 428 can be coupled to one or more magnets 430. As shown in
The base surface 404, the housing 402, the housing cap 414, and/or the shaft adapter 428 can be made of machined aluminum or other non-magnetic metal so that at least these parts do not interfere with the magnetic coupling operation of the one or more magnets 430 with the king pin of the trailer unit.
In another example, when the vehicle 105 is stopped at a traffic sign and is about to make a turn, the autonomous driving module 165 can determine, using the video provided by the one or more cameras and the angle and/or orientation information from the rotary encoder assembly, that the trailer unit may hit another vehicle that is located close to the trailer unit of the vehicle 105. In this example, the autonomous driving module 165 can determine a trajectory for the vehicle 105 to steer the vehicle 105 in a way to avoid having the trailer unit hit the other vehicle, or the autonomous driving module 165 may keep applying brakes until the other vehicle has driven away so that the autonomous driving module 165 can safely turn the vehicle 105.
In yet another example, when the autonomous driving module 165 determines that the vehicle 105 is being driven on a road and that the trailer unit is not within a range of angles from a position of the tractor unit (e.g., within ±2 degrees of 180 degrees from the position of the tractor unit, or between 179 degrees and 181 degrees from the position of the tractor unit), then the autonomous driving module 165 may apply corrective steering to move the trailer unit to be within the range of angles from the position of the tractor unit. In this example, a gust of wind may have moved the trailer unit relative to the tractor unit so that if the trailer unit moves to the left (assuming the vehicle 105 is driven in a north direction), then the autonomous driving module 165 can instruct the motors in the steering system to turn to the left to move the trailer unit back so that the angle formed by the trailer unit relative to the tractor unit is within the range of angles. In some embodiments, the range of angles may be pre-determined, or the range of angles may be a function of the speed of the vehicle (e.g., the range of angles is ±2 degrees of 180 degrees if the speed of the vehicle 105 is greater than or equal to a threshold value (e.g., 40 mph) and is ±1 degree of 180 degrees if the speed of the vehicle is greater than or equal to another threshold value (e.g., 60 mph)).
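The speed-dependent range of angles described above can be sketched as a simple lookup. The example thresholds (40 mph and 60 mph) mirror the text; the function name and the looser low-speed bound are illustrative assumptions.

```python
def allowed_range_deg(speed_mph: float) -> tuple:
    """Return the allowed trailer-angle range (low, high) in degrees as a
    function of vehicle speed, mirroring the example thresholds in the text:
    +/-2 degrees of 180 at >= 40 mph, tightening to +/-1 degree at >= 60 mph."""
    if speed_mph >= 60.0:
        half_width = 1.0
    elif speed_mph >= 40.0:
        half_width = 2.0
    else:
        half_width = 5.0  # assumed looser low-speed bound; not stated in the text
    return (180.0 - half_width, 180.0 + half_width)
```

Such a lookup could be consulted before the corrective-steering decision, so that tighter tolerances are enforced at higher speeds.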
In some embodiments, the autonomous driving module 165 can update an estimate of the center of gravity to establish a stable boundary around the vehicle 105 based on the angle and/or orientation information provided by the rotary encoder assembly. Knowing the center of gravity of the trailer unit as well as the tractor unit (also known as a bobtail), the angle subtended by the trailer unit and the tractor unit can determine where an instantaneous center of gravity of the truck as a whole may be located. Unlike rigid vehicles such as cars and buses, vehicles with multiple drivable sections (e.g., a semi-trailer truck) have a moving center of gravity since such a vehicle can be composed of at least two objects that are joined by the fifth wheel. In some embodiments, the autonomous driving module 165 may refer to a standard equation that provides an estimate of the center of gravity based on an angle measured by the rotary encoder assembly.
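One standard way to combine the two centers of gravity is a mass-weighted average after rotating the trailer's center of gravity about the fifth-wheel pivot by the measured articulation angle. The sketch below illustrates this under stated simplifying assumptions (planar geometry, pivot at the tractor-frame origin); the function name and coordinate conventions are illustrative, not taken from the document.

```python
import math

def combined_center_of_gravity(tractor_cg, tractor_mass,
                               trailer_cg_local, trailer_mass,
                               hitch_angle_rad):
    """Mass-weighted combined CG of tractor + trailer in the tractor frame.

    Assumptions (illustrative): planar (x, y) geometry; the fifth-wheel pivot
    sits at the tractor-frame origin; trailer_cg_local is the trailer CG
    expressed relative to that pivot; hitch_angle_rad is the articulation
    angle measured by the rotary encoder assembly.
    """
    c, s = math.cos(hitch_angle_rad), math.sin(hitch_angle_rad)
    # Rotate the trailer CG into the tractor frame by the articulation angle.
    tx = c * trailer_cg_local[0] - s * trailer_cg_local[1]
    ty = s * trailer_cg_local[0] + c * trailer_cg_local[1]
    m = tractor_mass + trailer_mass
    return ((tractor_mass * tractor_cg[0] + trailer_mass * tx) / m,
            (tractor_mass * tractor_cg[1] + trailer_mass * ty) / m)
```

As the hitch angle changes, the returned point shifts, which is consistent with the moving center of gravity described above for articulated vehicles.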
The following section describes example features as described in this document:
Feature 1: A truck, comprising: a tractor, comprising: a first connector; and an angle measuring device coupled to the first connector; and a trailer, comprising: a second connector, connecting to the first connector; and at least one magnetic device coupled to the second connector, wherein the angle measuring device measures a motion of the at least one magnetic device, wherein an angle between the tractor and the trailer is determined based on a measurement conducted by the angle measuring device.
Feature 2: The truck of feature 1, wherein the first connector comprises a fifth-wheel, and the second connector comprises a kingpin.
Feature 3: The truck of feature 1, wherein the angle measuring device comprises a rotary encoder.
Feature 4: The truck of feature 3, wherein the motion of the at least one magnetic device is translated to a shaft of the rotary encoder.
Feature 5: The truck of feature 1, wherein the angle measuring device measures a rotation of the second connector by measuring the at least one magnetic device.
Feature 6: The truck of feature 1, wherein a boundary of the truck is depicted based on the angle between the tractor and the trailer.
Feature 7: The truck of feature 1, wherein the at least one magnetic device comprises at least one magnet attached on the second connector, wherein the at least one magnet is detachable.
Feature 8: The truck of feature 7, wherein at least one sensor is attached to the second connector via the at least one magnet.
Feature 9: A tractor, configured to tow a trailer of a truck, comprising: a first connector, connecting to a second connector of the trailer; and an angle measuring device coupled to the first connector, wherein the angle measuring device measures a motion of at least one magnetic device coupled to the second connector, wherein an angle between the tractor and the trailer is determined based on a measurement conducted by the angle measuring device.
Feature 10: A trailer, configured to be towed by a tractor of a truck, wherein the tractor comprises a first connector and an angle measuring device coupled to the first connector, the trailer comprises: a second connector, connecting to the first connector; and at least one magnetic device coupled to the second connector, wherein the angle measuring device measures a motion of the at least one magnetic device, wherein an angle between the tractor and the trailer is determined based on a measurement conducted by the angle measuring device.
Feature 11: A system, comprising: an internet server, comprising: an I/O port, configured to transmit and receive electrical signals to and from a client device; a memory; one or more processing units; and one or more programs stored in the memory, the one or more programs configured to cause the one or more processing units to perform at least: measuring, by an angle measuring device coupled to a first connector of a tractor, a motion of at least one magnetic device, wherein the at least one magnetic device is coupled to a second connector of a trailer, wherein the first connector is connected to the second connector; and determining an angle between the tractor and the trailer based on a measurement conducted by the angle measuring device.
Feature 12: A method, comprising: measuring, by an angle measuring device coupled to a first connector of a tractor, a motion of at least one magnetic device, wherein the at least one magnetic device is coupled to a second connector of a trailer, wherein the first connector is connected to the second connector; and determining an angle between the tractor and the trailer based on a measurement conducted by the angle measuring device.
Feature 13: A non-transitory computer-readable medium storing a program causing a computer to execute a process, the process comprising: measuring, by an angle measuring device coupled to a first connector of a tractor, a motion of at least one magnetic device, wherein the at least one magnetic device is coupled to a second connector of a trailer, wherein the first connector is connected to the second connector; and determining an angle between the tractor and the trailer based on a measurement conducted by the angle measuring device.
In this document the term “exemplary” is used to mean “an example of” and, unless otherwise stated, does not imply an ideal or a preferred embodiment. In this document, while the techniques to measure angle and/or orientation (e.g., direction of rotation) of a rear drivable section relative to a front drivable section are described in the context of a semi-trailer truck, the rotary encoder assembly may be installed on other types of vehicles with multiple drivable sections (e.g., on or in a hitch on a truck or on a car with a fifth wheel camper).
Some of the embodiments described herein are described in the general context of methods or processes, which may be implemented in one embodiment by a computer program product, embodied in a computer-readable medium, including computer-executable instructions, such as program code, executed by computers in networked environments. A computer-readable medium may include removable and non-removable storage devices including, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), compact discs (CDs), digital versatile discs (DVDs), etc. Therefore, the computer-readable media can include non-transitory storage media. Generally, program modules may include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Computer- or processor-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps or processes.
Some of the disclosed embodiments can be implemented as devices or modules using hardware circuits, software, or combinations thereof. For example, a hardware circuit implementation can include discrete analog and/or digital components that are, for example, integrated as part of a printed circuit board. Alternatively, or additionally, the disclosed components or modules can be implemented as an Application Specific Integrated Circuit (ASIC) and/or as a Field Programmable Gate Array (FPGA) device. Some implementations may additionally or alternatively include a digital signal processor (DSP) that is a specialized microprocessor with an architecture optimized for the operational needs of digital signal processing associated with the disclosed functionalities of this application. Similarly, the various components or sub-components within each module may be implemented in software, hardware or firmware. The connectivity between the modules and/or components within the modules may be provided using any one of the connectivity methods and media that is known in the art, including, but not limited to, communications over the Internet, wired, or wireless networks using the appropriate protocols.
While this document contains many specifics, these should not be construed as limitations on the scope of an invention that is claimed or of what may be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this document in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or a variation of a sub-combination. Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results.
Only a few implementations and examples are described and other implementations, enhancements and variations can be made based on what is described and illustrated in this disclosure.
This patent document claims priority to and the benefit of U.S. Provisional Application No. 63/040,662, entitled “TRAILER ANGLE MEASUREMENT USING A ROTARY ENCODER TO THE FIFTH WHEEL,” filed on Jun. 18, 2020. The entire disclosure of the aforementioned application is hereby incorporated by reference as part of the disclosure of this application.
Number | Name | Date | Kind |
---|---|---|---|
6084870 | Wooten et al. | Jul 2000 | A |
6263088 | Crabtree et al. | Jul 2001 | B1 |
6594821 | Banning et al. | Jul 2003 | B1 |
6777904 | Degner et al. | Aug 2004 | B1 |
6975923 | Spriggs | Dec 2005 | B2 |
7103460 | Breed | Sep 2006 | B1 |
7689559 | Canright et al. | Mar 2010 | B2 |
7742841 | Sakai et al. | Jun 2010 | B2 |
7783403 | Breed | Aug 2010 | B2 |
7844595 | Canright et al. | Nov 2010 | B2 |
8041111 | Wilensky | Oct 2011 | B1 |
8064643 | Stein et al. | Nov 2011 | B2 |
8082101 | Stein et al. | Dec 2011 | B2 |
8164628 | Stein et al. | Apr 2012 | B2 |
8175376 | Marchesotti | May 2012 | B2 |
8271871 | Marchesotti | Sep 2012 | B2 |
8346480 | Trepagnier et al. | Jan 2013 | B2 |
8378851 | Stein et al. | Feb 2013 | B2 |
8392117 | Dolgov et al. | Mar 2013 | B2 |
8401292 | Park et al. | Mar 2013 | B2 |
8412449 | Trepagnier et al. | Apr 2013 | B2 |
8478072 | Aisaka et al. | Jul 2013 | B2 |
8532870 | Hoetzer et al. | Sep 2013 | B2 |
8553088 | Stein et al. | Oct 2013 | B2 |
8706394 | Trepagnier et al. | Apr 2014 | B2 |
8718861 | Montemerlo et al. | May 2014 | B1 |
8788134 | Litkouhi et al. | Jul 2014 | B1 |
8908041 | Stein et al. | Dec 2014 | B2 |
8917169 | Schofield et al. | Dec 2014 | B2 |
8917170 | Padula | Dec 2014 | B2 |
8963913 | Baek | Feb 2015 | B2 |
8965621 | Urmson et al. | Feb 2015 | B1 |
8981966 | Stein et al. | Mar 2015 | B2 |
8983708 | Choe et al. | Mar 2015 | B2 |
8993951 | Schofield | Mar 2015 | B2 |
9002632 | Emigh | Apr 2015 | B1 |
9008369 | Schofield | Apr 2015 | B2 |
9025880 | Perazzi et al. | May 2015 | B2 |
9042648 | Wang et al. | May 2015 | B2 |
9081385 | Ferguson et al. | Jul 2015 | B1 |
9088744 | Grauer et al. | Jul 2015 | B2 |
9111444 | Kaganovich | Aug 2015 | B2 |
9117133 | Barnes et al. | Aug 2015 | B2 |
9118816 | Stein | Aug 2015 | B2 |
9120485 | Dolgov | Sep 2015 | B1 |
9122954 | Srebnik et al. | Sep 2015 | B2 |
9134402 | Sebastian et al. | Sep 2015 | B2 |
9145116 | Clarke et al. | Sep 2015 | B2 |
9147255 | Zhang et al. | Sep 2015 | B1 |
9156473 | Clarke et al. | Oct 2015 | B2 |
9176006 | Stein | Nov 2015 | B2 |
9179072 | Stein et al. | Nov 2015 | B2 |
9183447 | Gdalyahu et al. | Nov 2015 | B1 |
9185360 | Stein et al. | Nov 2015 | B2 |
9191634 | Schofield et al. | Nov 2015 | B2 |
9214084 | Grauer et al. | Dec 2015 | B2 |
9219873 | Grauer et al. | Dec 2015 | B2 |
9233659 | Rosenbaum et al. | Jan 2016 | B2 |
9233688 | Clarke et al. | Jan 2016 | B2 |
9248832 | Huberman | Feb 2016 | B2 |
9248835 | Tanzmeister | Feb 2016 | B2 |
9251708 | Rosenbaum et al. | Feb 2016 | B2 |
9277132 | Berberian | Mar 2016 | B2 |
9280711 | Stein | Mar 2016 | B2 |
9282144 | Tebay et al. | Mar 2016 | B2 |
9286522 | Stein et al. | Mar 2016 | B2 |
9297641 | Stein | Mar 2016 | B2 |
9299004 | Lin et al. | Mar 2016 | B2 |
9315192 | Zhu et al. | Apr 2016 | B1 |
9317033 | Ibanez-guzman et al. | Apr 2016 | B2 |
9317776 | Honda et al. | Apr 2016 | B1 |
9330334 | Lin et al. | May 2016 | B2 |
9342074 | Dolgov et al. | May 2016 | B2 |
9347779 | Lynch | May 2016 | B1 |
9355635 | Gao et al. | May 2016 | B2 |
9365214 | Ben Shalom et al. | Jun 2016 | B2 |
9399397 | Mizutani et al. | Jul 2016 | B2 |
9418549 | Kang et al. | Aug 2016 | B2 |
9428192 | Schofield et al. | Aug 2016 | B2 |
9436880 | Bos et al. | Sep 2016 | B2 |
9438878 | Niebla, Jr. et al. | Sep 2016 | B2 |
9443163 | Springer | Sep 2016 | B2 |
9446765 | Ben Shalom | Sep 2016 | B2 |
9459515 | Stein | Oct 2016 | B2 |
9466006 | Duan | Oct 2016 | B2 |
9476970 | Fairfield et al. | Oct 2016 | B1 |
9483839 | Kwon et al. | Nov 2016 | B1 |
9490064 | Hirosawa et al. | Nov 2016 | B2 |
9494935 | Okumura et al. | Nov 2016 | B2 |
9507346 | Levinson et al. | Nov 2016 | B1 |
9513634 | Pack et al. | Dec 2016 | B2 |
9531966 | Stein et al. | Dec 2016 | B2 |
9535423 | Debreczeni | Jan 2017 | B1 |
9538113 | Grauer et al. | Jan 2017 | B2 |
9547985 | Tuukkanen | Jan 2017 | B2 |
9549158 | Grauer et al. | Jan 2017 | B2 |
9555803 | Pawlicki et al. | Jan 2017 | B2 |
9568915 | Berntorp et al. | Feb 2017 | B1 |
9587952 | Slusar | Mar 2017 | B1 |
9599712 | Van Der Tempel et al. | Mar 2017 | B2 |
9600889 | Boisson et al. | Mar 2017 | B2 |
9602807 | Crane et al. | Mar 2017 | B2 |
9612123 | Levinson et al. | Apr 2017 | B1 |
9620010 | Grauer et al. | Apr 2017 | B2 |
9625569 | Lange | Apr 2017 | B2 |
9628565 | Stenneth et al. | Apr 2017 | B2 |
9649999 | Amireddy et al. | May 2017 | B1 |
9652860 | Maali et al. | May 2017 | B1 |
9669827 | Ferguson et al. | Jun 2017 | B1 |
9672446 | Vallesi-Gonzalez | Jun 2017 | B1 |
9690290 | Prokhorov | Jun 2017 | B2 |
9701023 | Zhang et al. | Jul 2017 | B2 |
9712754 | Grauer et al. | Jul 2017 | B2 |
9720418 | Stenneth | Aug 2017 | B2 |
9723097 | Harris et al. | Aug 2017 | B2 |
9723099 | Chen et al. | Aug 2017 | B2 |
9723233 | Grauer et al. | Aug 2017 | B2 |
9726754 | Massanell et al. | Aug 2017 | B2 |
9729860 | Cohen et al. | Aug 2017 | B2 |
9738280 | Rayes | Aug 2017 | B2 |
9739609 | Lewis | Aug 2017 | B1 |
9746550 | Nath et al. | Aug 2017 | B2 |
9753128 | Schweizer et al. | Sep 2017 | B2 |
9753141 | Grauer et al. | Sep 2017 | B2 |
9754490 | Kentley et al. | Sep 2017 | B2 |
9760837 | Nowozin et al. | Sep 2017 | B1 |
9766625 | Boroditsky et al. | Sep 2017 | B2 |
9769456 | You et al. | Sep 2017 | B2 |
9773155 | Shotton et al. | Sep 2017 | B2 |
9779276 | Todeschini et al. | Oct 2017 | B2 |
9785149 | Wang et al. | Oct 2017 | B2 |
9805294 | Liu et al. | Oct 2017 | B2 |
9810785 | Grauer et al. | Nov 2017 | B2 |
9823339 | Cohen | Nov 2017 | B2 |
9953236 | Huang et al. | Apr 2018 | B1 |
10147193 | Huang et al. | Dec 2018 | B2 |
10223806 | Yi et al. | Mar 2019 | B1 |
10223807 | Yi et al. | Mar 2019 | B1 |
10410055 | Wang et al. | Sep 2019 | B2 |
10670479 | Reed | Jun 2020 | B2 |
10942271 | Han et al. | Mar 2021 | B2 |
11221262 | Reed | Jan 2022 | B2 |
20030114980 | Klausner et al. | Jun 2003 | A1 |
20030174773 | Comaniciu et al. | Sep 2003 | A1 |
20040264763 | Mas et al. | Dec 2004 | A1 |
20070034787 | Mutschler | Feb 2007 | A1 |
20070067077 | Liu et al. | Mar 2007 | A1 |
20070183661 | El-Maleh et al. | Aug 2007 | A1 |
20070183662 | Wang et al. | Aug 2007 | A1 |
20070230792 | Shashua et al. | Oct 2007 | A1 |
20070286526 | Abousleman et al. | Dec 2007 | A1 |
20080249667 | Horvitz et al. | Oct 2008 | A1 |
20090040054 | Wang et al. | Feb 2009 | A1 |
20090087029 | Coleman et al. | Apr 2009 | A1 |
20100049397 | Liu et al. | Feb 2010 | A1 |
20100111417 | Ward et al. | May 2010 | A1 |
20100226564 | Marchesotti et al. | Sep 2010 | A1 |
20100281361 | Marchesotti | Nov 2010 | A1 |
20110142283 | Huang et al. | Jun 2011 | A1 |
20110206282 | Aisaka et al. | Aug 2011 | A1 |
20110247031 | Jacoby | Oct 2011 | A1 |
20110257860 | Getman et al. | Oct 2011 | A1 |
20120041636 | Johnson et al. | Feb 2012 | A1 |
20120105639 | Stein et al. | May 2012 | A1 |
20120140076 | Rosenbaum et al. | Jun 2012 | A1 |
20120274629 | Baek | Nov 2012 | A1 |
20120314070 | Zhang et al. | Dec 2012 | A1 |
20130051613 | Bobbitt et al. | Feb 2013 | A1 |
20130083959 | Owechko et al. | Apr 2013 | A1 |
20130182134 | Grundmann et al. | Jul 2013 | A1 |
20130204465 | Phillips et al. | Aug 2013 | A1 |
20130266187 | Bulan et al. | Oct 2013 | A1 |
20130329052 | Chew | Dec 2013 | A1 |
20140072170 | Zhang et al. | Mar 2014 | A1 |
20140104051 | Breed | Apr 2014 | A1 |
20140142799 | Ferguson et al. | May 2014 | A1 |
20140143839 | Ricci | May 2014 | A1 |
20140145516 | Hirosawa et al. | May 2014 | A1 |
20140198184 | Stein et al. | Jul 2014 | A1 |
20140321704 | Partis | Oct 2014 | A1 |
20140334668 | Saund | Nov 2014 | A1 |
20150062304 | Stein et al. | Mar 2015 | A1 |
20150269438 | Samarsekera et al. | Sep 2015 | A1 |
20150310370 | Burry et al. | Oct 2015 | A1 |
20150353082 | Lee et al. | Dec 2015 | A1 |
20160008988 | Kennedy et al. | Jan 2016 | A1 |
20160026787 | Nairn et al. | Jan 2016 | A1 |
20160037064 | Stein et al. | Feb 2016 | A1 |
20160094774 | Li et al. | Mar 2016 | A1 |
20160118080 | Chen | Apr 2016 | A1 |
20160129907 | Kim et al. | May 2016 | A1 |
20160165157 | Stein et al. | Jun 2016 | A1 |
20160210528 | Duan | Jul 2016 | A1 |
20160275766 | Venetianer et al. | Sep 2016 | A1 |
20160280261 | Kyrtsos et al. | Sep 2016 | A1 |
20160321381 | English et al. | Nov 2016 | A1 |
20160334230 | Ross et al. | Nov 2016 | A1 |
20160342837 | Hong et al. | Nov 2016 | A1 |
20160347322 | Clarke et al. | Dec 2016 | A1 |
20160375907 | Erban | Dec 2016 | A1 |
20170053169 | Cuban et al. | Feb 2017 | A1 |
20170061632 | Linder et al. | Mar 2017 | A1 |
20170080928 | Wasiek et al. | Mar 2017 | A1 |
20170124476 | Levinson et al. | May 2017 | A1 |
20170134631 | Zhao et al. | May 2017 | A1 |
20170177951 | Yang et al. | Jun 2017 | A1 |
20170301104 | Qian | Oct 2017 | A1 |
20170305423 | Green | Oct 2017 | A1 |
20170318407 | Meister | Nov 2017 | A1 |
20170334484 | Koravadi | Nov 2017 | A1 |
20180057052 | Dodd | Mar 2018 | A1 |
20180151063 | Pun et al. | May 2018 | A1 |
20180158197 | Dasgupta et al. | Jun 2018 | A1 |
20180260956 | Huang et al. | Sep 2018 | A1 |
20180283892 | Behrendt | Oct 2018 | A1 |
20180373980 | Huval | Dec 2018 | A1 |
20190025853 | Julian | Jan 2019 | A1 |
20190065863 | Luo et al. | Feb 2019 | A1 |
20190066329 | Luo et al. | Feb 2019 | A1 |
20190066330 | Luo et al. | Feb 2019 | A1 |
20190066344 | Luo et al. | Feb 2019 | A1 |
20190084477 | Gomez-mendoza et al. | Mar 2019 | A1 |
20190108384 | Wang et al. | Apr 2019 | A1 |
20190132391 | Thomas et al. | May 2019 | A1 |
20190132392 | Liu et al. | May 2019 | A1 |
20190170867 | Wang et al. | Jun 2019 | A1 |
20190210564 | Han et al. | Jul 2019 | A1 |
20190210613 | Sun et al. | Jul 2019 | A1 |
20190225286 | Schutt | Jul 2019 | A1 |
20190236950 | Li et al. | Aug 2019 | A1 |
20190266420 | Ge et al. | Aug 2019 | A1 |
20200331441 | Sielhorst | Oct 2020 | A1 |
20220146285 | Dölz | May 2022 | A1 |
Number | Date | Country |
---|---|---|
106340197 | Jan 2017 | CN |
106781591 | May 2017 | CN |
108010360 | May 2018 | CN |
2608513 | Sep 1977 | DE |
102016105259 | Sep 2016 | DE |
102017125662 | May 2018 | DE |
0433858 | Jun 1991 | EP |
890470 | Jan 1999 | EP |
1754179 | Feb 2007 | EP |
2448251 | May 2012 | EP |
2463843 | Jun 2012 | EP |
2761249 | Aug 2014 | EP |
2946336 | Nov 2015 | EP |
2993654 | Mar 2016 | EP |
3081419 | Oct 2016 | EP |
2470610 | Dec 2010 | GB |
2513392 | Oct 2014 | GB |
2010117207 | May 2010 | JP |
100802511 | Feb 2008 | KR |
1991009375 | Jun 1991 | WO |
2005098739 | Oct 2005 | WO |
2005098751 | Oct 2005 | WO |
2005098782 | Oct 2005 | WO |
2010109419 | Sep 2010 | WO |
2013045612 | Apr 2013 | WO |
2014111814 | Jul 2014 | WO |
2014166245 | Oct 2014 | WO |
2014201324 | Dec 2014 | WO |
2015083009 | Jun 2015 | WO |
2015103159 | Jul 2015 | WO |
2015125022 | Aug 2015 | WO |
2015186002 | Dec 2015 | WO |
2016090282 | Jun 2016 | WO |
2016135736 | Sep 2016 | WO |
2017079349 | May 2017 | WO |
2017079460 | May 2017 | WO |
2017013875 | May 2018 | WO |
2019040800 | Feb 2019 | WO |
2019084491 | May 2019 | WO |
2019084494 | May 2019 | WO |
2019101848 | May 2019 | WO |
2019140277 | Jul 2019 | WO |
2019168986 | Sep 2019 | WO |
2020092563 | May 2020 | WO |
Entry |
---|
Adam Paszke, Abhishek Chaurasia, Sangpil Kim, and Eugenio Culurciello. Enet: A deep neural network architecture for real-time semantic segmentation. CoRR, abs/1606.02147, 2016. |
Adhiraj Somani, Nan Ye, David Hsu, and Wee Sun Lee, “DESPOT: Online POMDP Planning with Regularization”, Department of Computer Science, National University of Singapore, date unknown. |
Bar-Hillel, Aharon et al. “Recent progress in road and lane detection: a survey.” Machine Vision and Applications 25 (2011): 727-745. |
C. Yang, Z. Li, R. Cui and B. Xu, “Neural Network-Based Motion Control of an Underactuated Wheeled Inverted Pendulum Model,” in IEEE Transactions on Neural Networks and Learning Systems, vol. 25, No. 11, pp. 2004-2016, Nov. 2014. |
Carle, Patrick J.F. et al. “Global Rover Localization by Matching Lidar and Orbital 3D Maps.” IEEE, Anchorage Convention District, pp. 1-6, May 3-8, 2010. (Anchorage Alaska, US). |
Caselitz, T. et al., “Monocular camera localization in 3D LiDAR maps,” Computer Vision—ECCV 2014, Lecture Notes in Computer Science, vol. 8690, Springer, Cham. |
Dai, Jifeng, Kaiming He, Jian Sun, (Microsoft Research), “Instance-aware Semantic Segmentation via Multi-task Network Cascades”, CVPR 2016. |
Engel, J. et al., “LSD-SLAM: Large-Scale Direct Monocular SLAM,” pp. 1-16, Munich. |
Geiger, Andreas et al., “Automatic Camera and Range Sensor Calibration using a single Shot”, Robotics and Automation (ICRA), pp. 1-8, 2012 IEEE International Conference. |
Hou, Xiaodi and Harel, Jonathan and Koch, Christof, “Image Signature: Highlighting Sparse Salient Regions”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, No. 1, pp. 194-201, 2012. |
Hou, Xiaodi and Yuille, Alan and Koch, Christof, “A Meta-Theory of Boundary Detection Benchmarks”, arXiv preprint arXiv:1302.5985, 2013. |
Hou, Xiaodi and Yuille, Alan and Koch, Christof, “Boundary Detection Benchmarking: Beyond F-Measures”, Computer Vision and Pattern Recognition, CVPR'13, vol. 2013, pp. 1-8, IEEE, 2013. |
Hou, Xiaodi and Zhang, Liqing, “A Time-Dependent Model of Information Capacity of Visual Attention”, International Conference on Neural Information Processing, pp. 127-136, Springer Berlin Heidelberg, 2006. |
Hou, Xiaodi and Zhang, Liqing, “Color Conceptualization”, Proceedings of the 15th ACM International Conference on Multimedia, pp. 265-268, ACM, 2007. |
Hou, Xiaodi and Zhang, Liqing, “Dynamic Visual Attention: Searching For Coding Length Increments”, Advances in Neural Information Processing Systems, vol. 21, pp. 681-688, 2008. |
Hou, Xiaodi and Zhang, Liqing, “Saliency Detection: A Spectral Residual Approach”, Computer Vision and Pattern Recognition, CVPR'07—IEEE Conference, pp. 1-8, 2007. |
Hou, Xiaodi and Zhang, Liqing, “Thumbnail Generation Based on Global Saliency”, Advances in Cognitive Neurodynamics, ICCN 2007, pp. 999-1003, Springer Netherlands, 2008. |
Hou, Xiaodi, “Computational Modeling and Psychophysics in Low and Mid-Level Vision”, California Institute of Technology, 2014. |
Huval, Brody, Tao Wang, Sameep Tandon, Jeff Kiske, Will Song, Joel Pazhayampallil, Mykhaylo Andriluka, Pranav Rajpurkar, Toki Migimatsu, Royce Cheng-Yue, Fernando Mujica, Adam Coates, Andrew Y. Ng, “An Empirical Evaluation of Deep Learning on Highway Driving”, arXiv:1504.01716v3 [cs.RO] Apr. 17, 2015. |
International Application No. PCT/US18/53795, International Search Report and Written Opinion dated Dec. 31, 2018. |
International Application No. PCT/US18/57848, International Search Report and Written Opinion dated Jan. 7, 2019. |
International Application No. PCT/US19/12934, International Search Report and Written Opinion dated Apr. 29, 2019. |
International Application No. PCT/US19/25995, International Search Report and Written Opinion dated Jul. 9, 2019. |
International Application No. PCT/US19/58863, International Search Report and Written Opinion dated Feb. 14, 2020. |
International Application No. PCT/US2018/047608, International Search Report and Written Opinion dated Dec. 28, 2018. |
International Application No. PCT/US2018/047830, International Search Report and Written Opinion dated Apr. 27, 2017. |
International Application No. PCT/US2018/057851, International Search Report and Written Opinion dated Feb. 1, 2019. |
International Application No. PCT/US2019/013322, International Search Report and Written Opinion dated Apr. 2, 2019. |
International Application No. PCT/US2019/019839, International Search Report and Written Opinion dated May 23, 2019. |
Jain, Suyong Dutt, Grauman, Kristen, “Active Image Segmentation Propagation”, In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, Jun. 2016. |
Kai Yu, Yang Zhou, Da Li, Zhang Zhang, Kaiqi Huang, “Large-scale Distributed Video Parsing and Evaluation Platform”, Center for Research on Intelligent Perception and Computing, Institute of Automation, Chinese Academy of Sciences, China, arXiv:1611.09580v1 [cs.CV] Nov. 29, 2016. |
Kendall, Alex, Gal, Yarin, “What Uncertainties do we Need in Bayesian Deep Learning for Computer Vision”, arXiv:1703.04977v1 [cs.CV] Mar. 15, 2017. |
Kyoungho Ahn, Hesham Rakha, “The Effects of Route Choice Decisions on Vehicle Energy Consumption and Emissions”, Virginia Tech Transportation Institute, Blacksburg, VA 24061, date unknown. |
Levinson, Jesse et al., Experimental Robotics, Unsupervised Calibration for Multi-Beam Lasers, pp. 179-194, 12th Ed., Oussama Khatib, Vijay Kumar, Gaurav Sukhatme (Eds.) Springer-Verlag Berlin Heidelberg 2014. |
Li, Yanghao and Wang, Naiyan and Liu, Jiaying and Hou, Xiaodi, “Demystifying Neural Style Transfer”, arXiv preprint arXiv:1701.01036, 2017. |
Li, Yanghao and Wang, Naiyan and Liu, Jiaying and Hou, Xiaodi, “Factorized Bilinear Models for Image Recognition”, arXiv preprint arXiv:1611.05709, 2016. |
Li, Yanghao and Wang, Naiyan and Shi, Jianping and Liu, Jiaying and Hou, Xiaodi, “Revisiting Batch Normalization for Practical Domain Adaptation”, arXiv preprint arXiv:1603.04779, 2016. |
Li, Yin and Hou, Xiaodi and Koch, Christof and Rehg, James M. and Yuille, Alan L., “The Secrets of Salient Object Segmentation”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 280-287, 2014. |
Luo, Yi et al. U.S. Appl. No. 15/684,389 Notice of Allowance dated Oct. 9, 2019. |
MacAodha, Oisin, Campbell, Neill D.F., Kautz, Jan, Brostow, Gabriel J., “Hierarchical Subquery Evaluation for Active Learning on a Graph”, In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014. |
Marius Cordts, Mohamed Omran, Sebastian Ramos, Timo Rehfeld, Markus Enzweiler Rodrigo Benenson, Uwe Franke, Stefan Roth, and Bernt Schiele, “The Cityscapes Dataset for Semantic Urban Scene Understanding”, Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, Nevada, 2016. |
Matthew Barth, Carrie Malcolm, Theodore Younglove, and Nicole Hill, “Recent Validation Efforts for a Comprehensive Modal Emissions Model”, Transportation Research Record 1750, Paper No. 01-0326, College of Engineering, Center for Environmental Research and Technology, University of California, Riverside, CA 92521, date unknown. |
Mohammad Norouzi, David J. Fleet, Ruslan Salakhutdinov, “Hamming Distance Metric Learning”, Departments of Computer Science and Statistics, University of Toronto, date unknown. |
Mur-Artal, R. et al., “ORB-SLAM: A Versatile and Accurate Monocular SLAM System,” IEEE Transaction on Robotics, Oct. 2015, pp. 1147-1163, vol. 31, No. 5, Spain. |
Office Action Mailed in Chinese Application No. 201810025516.X, dated Sep. 3, 2019. |
P. Guarneri, G. Rocca and M. Gobbi, “A Neural-Network-Based Model for the Dynamic Simulation of the Tire/Suspension System While Traversing Road Irregularities,” in IEEE Transactions on Neural Networks, vol. 19, No. 9, pp. 1549-1563, Sep. 2008. |
Peter Welinder, Steve Branson, Serge Belongie, Pietro Perona, “The Multidimensional Wisdom of Crowds”; http://www.vision.caltech.edu/visipedia/papers/WelinderEtalNIPS10.pdf, 2010. |
Ramos, Sebastian, Gehrig, Stefan, Pinggera, Peter, Franke, Uwe, Rother, Carsten, “Detecting Unexpected Obstacles for Self-Driving Cars: Fusing Deep Learning and Geometric Modeling”, arXiv:1612.06573v1 [cs.CV] Dec. 20, 2016. |
Sattler, T. et al., “Are Large-Scale 3D Models Really Necessary for Accurate Visual Localization?” CVPR, IEEE, 2017, pp. 1-10. |
Schindler, Andreas et al. “Generation of high precision digital maps using circular arc splines,” 2012 IEEE Intelligent Vehicles Symposium, Alcala de Henares, 2012, pp. 246-251. doi: 10.1109/IVS.2012.6232124. |
Schroff, Florian, Dmitry Kalenichenko, James Philbin, (Google), “FaceNet: A Unified Embedding for Face Recognition and Clustering”, CVPR 2015. |
Spinello, Luciano, Triebel, Rudolph, Siegwart, Roland, “Multiclass Multimodal Detection and Tracking in Urban Environments”, Sage Journals, vol. 29 Issue 12, pp. 1498-1515 Article first published online: Oct. 7, 2010; Issue published: Oct. 1, 2010. |
Stephan R. Richter, Vibhav Vineet, Stefan Roth, Vladlen Koltun, “Playing for Data: Ground Truth from Computer Games”, Intel Labs, European Conference on Computer Vision (ECCV), Amsterdam, the Netherlands, 2016. |
Szeliski, Richard, “Computer Vision: Algorithms and Applications” http://szeliski.org/Book/, 2010. |
Thanos Athanasiadis, Phivos Mylonas, Yannis Avrithis, and Stefanos Kollias, “Semantic Image Segmentation and Object Labeling”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 17, No. 3, Mar. 2007. |
Tian Li, “Proposal Free Instance Segmentation Based on Instance-aware Metric”, Department of Computer Science, Cranberry-Lemon University, Pittsburgh, PA., date unknown. |
Wang, Panqu and Chen, Pengfei and Yuan, Ye and Liu, Ding and Huang, Zehua and Hou, Xiaodi and Cottrell, Garrison, “Understanding Convolution for Semantic Segmentation”, arXiv preprint arXiv:1702.08502, 2017. |
Wei, Junqing, John M. Dolan, Bakhtiar Litkhouhi, “A Prediction- and Cost Function-Based Algorithm for Robust Autonomous Freeway Driving”, 2010 IEEE Intelligent Vehicles Symposium, University of California, San Diego, CA, USA, Jun. 21-24, 2010. |
Zhang, Z. et al. A Flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence (vol. 22, Issue: 11, Nov. 2000). |
Zhou, Bolei and Hou, Xiaodi and Zhang, Liqing, “A Phase Discrepancy Analysis of Object Motion”, Asian Conference on Computer Vision, pp. 225-238, Springer Berlin Heidelberg, 2010. |
Van Prooijen, Tom. European Application No. 21179854.1, Extended European Search Report, dated Nov. 10, 2021, pp. 1-8. |
Number | Date | Country | |
---|---|---|---|
20210394570 A1 | Dec 2021 | US |
Number | Date | Country | |
---|---|---|---|
63040662 | Jun 2020 | US |