This disclosure relates to vehicle operation, including routing and navigation.
A vehicle may include a control system that generates and maintains a route of travel and may control the vehicle to traverse the route of travel. For example, an autonomous vehicle may be controlled autonomously, without direct human intervention, to traverse a route of travel from an origin to a destination. With both autonomous vehicles and non-autonomous vehicles, it is desirable to know the distance between vehicles for identifying intersections, producing warnings, etc.
Disclosed herein are aspects, features, elements, implementations, and embodiments of determining a path length between vehicles along a curve.
An aspect of the disclosed embodiments is a method of generating projected vehicle transportation network information for use in traversing a vehicle transportation network. The method may include receiving, from a remote vehicle, a remote vehicle message, the remote vehicle message including remote vehicle information, the remote vehicle information indicating remote vehicle geospatial state information for the remote vehicle and remote vehicle kinematic state information for the remote vehicle, the remote vehicle kinematic state information including at least a remote heading angle for the remote vehicle. The remote vehicle message may be received at a host vehicle via a wireless electronic communication link. The method may also include identifying host vehicle information for the host vehicle, the host vehicle information including host vehicle geospatial state information and host vehicle kinematic state information for the host vehicle, the host vehicle kinematic state information including at least a host heading angle for the host vehicle. Projected vehicle transportation network information representing a portion of the vehicle transportation network may then be generated based on the remote vehicle information and the host vehicle information, the portion including a curved path having a substantially constant radius. The method also includes determining an expected host vehicle route for the host vehicle in response to generating the projected vehicle transportation network information representing the curved path, and traversing, by the host vehicle, the curved path using the expected host vehicle route.
Another aspect of the disclosed embodiments is a host vehicle that includes a processor configured to execute instructions stored on a non-transitory computer readable medium to generate projected vehicle transportation network information for use in traversing a vehicle transportation network. The processor generates the projected vehicle transportation network information for use in traversing a vehicle transportation network by receiving, from a remote vehicle, via a wireless electronic communication link, a remote vehicle message, the remote vehicle message including remote vehicle information, the remote vehicle information indicating remote vehicle geospatial state information for the remote vehicle and remote vehicle kinematic state information for the remote vehicle, the remote vehicle kinematic state information including at least a remote heading angle for the remote vehicle, identifying host vehicle information for the host vehicle, the host vehicle information including host vehicle geospatial state information and host vehicle kinematic state information for the host vehicle, the host vehicle kinematic state information including at least a host heading angle for the host vehicle; generating projected vehicle transportation network information representing a portion of the vehicle transportation network based on the remote vehicle information and the host vehicle information, the portion including a curved path having a substantially constant radius, determining an expected host vehicle route for the host vehicle in response to generating the projected vehicle transportation network information representing the curved path, and traversing the curved path using the expected host vehicle route.
Variations in these and other aspects, features, elements, implementations and embodiments of the methods, apparatus, procedures and algorithms disclosed herein are described in further detail hereafter.
The various aspects of the methods and apparatuses disclosed herein will become more apparent by referring to the examples provided in the following description and drawings in which:
A vehicle may travel from a point of origin to a destination in a vehicle transportation network. When vehicles are traveling in the vehicle transportation network, collisions are possible.
Commercially available collision warning systems rely on detection and ranging systems to determine whether a collision is imminent.
Collision warning systems may be based on vehicle-to-vehicle (V2V) exchange of a basic safety message containing location and trajectory information, which does not require line of sight between vehicles. One way in which to use the information within the message is to first correlate the information to map data to determine an actual location before making a potential collision determination. However, correlation to map data is time and resource intensive, and accurate and timely collision warnings may be hampered by a lack of resolution.
An alternative technique involves a converging path determination. While still using global positioning system (GPS) coordinate location data from the message, a determination may be made as to whether first and second vehicles (e.g., a host vehicle and a remote vehicle) are on converging paths—paths that intersect in the direction of their trajectories—or are on diverging paths—paths that do not intersect in the direction of their trajectories. Using a predetermined coordinate layout, a straight line between the host vehicle and remote vehicle is determined, as well as an angle between the host vehicle trajectory and the straight line. The determination of whether the vehicles are on converging or diverging paths may then be made knowing the vehicle locations, trajectories, the straight line and the angle.
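The converging-path determination described above can be sketched as follows. This is a non-limiting illustration in a local planar coordinate frame with hypothetical function and parameter names; it treats each trajectory as a ray and checks whether the rays intersect ahead of both vehicles.

```python
import math

def paths_converge(hx, hy, h_heading, rx, ry, r_heading):
    """Return True if the two straight-line trajectories intersect ahead
    of both vehicles (converging paths), False otherwise (diverging paths).

    Positions are planar coordinates in a local tangent frame; headings
    are in radians, measured clockwise from north (the +y axis).
    """
    # Unit direction vectors for each trajectory.
    hdx, hdy = math.sin(h_heading), math.cos(h_heading)
    rdx, rdy = math.sin(r_heading), math.cos(r_heading)

    # Solve h_pos + t*h_dir = r_pos + s*r_dir for parameters t and s.
    denom = hdx * rdy - hdy * rdx
    if abs(denom) < 1e-12:
        return False  # parallel trajectories never intersect
    dx, dy = rx - hx, ry - hy
    t = (dx * rdy - dy * rdx) / denom  # distance parameter along HV ray
    s = (dx * hdy - dy * hdx) / denom  # distance parameter along RV ray
    # Paths converge only if the intersection lies ahead of both vehicles.
    return t > 0.0 and s > 0.0
```

In this sketch, the sign of the two ray parameters encodes whether the geometric intersection lies in the direction of each vehicle's trajectory, which is the converging/diverging distinction described above.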
Vehicles may be traveling along a curved path within the vehicle transportation network. When determining a distance between vehicles (also called a path length herein), it is desirable to take the radius of curvature into account. As compared to a straight-path distance, the true path length along the curve is larger. As the radius of curvature decreases (that is, as the curve sharpens), this difference becomes larger. A warning application on the host vehicle that ignores the curvature thus estimates that the remote vehicle is closer than it actually is. This may produce false alarms. Conventionally, the radius of curvature may be determined by dividing speed by yaw rate. This calculation may be more inaccurate the smaller the radius of curvature.
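The two points above can be illustrated numerically. The sketch below (Python, with illustrative names) computes the conventional yaw-rate-based radius estimate, R = v/ω, and compares the arc length of a curved path against the straight-line chord between its endpoints, showing that the chord underestimates the true path length.

```python
import math

def radius_of_curvature(speed_mps, yaw_rate_rps):
    """Conventional radius estimate: speed divided by yaw rate.

    Valid only for a nonzero yaw rate; noisy instantaneous values make
    this estimate unstable, as discussed above.
    """
    return speed_mps / yaw_rate_rps

def arc_vs_chord(radius, central_angle):
    """Return (arc length, chord length) for a circular arc of the given
    radius subtending the given central angle (radians)."""
    arc = radius * central_angle
    chord = 2.0 * radius * math.sin(central_angle / 2.0)
    return arc, chord
```

For example, at 100 m radius over a quarter circle, the arc is about 157.1 m while the chord is about 141.4 m, so a straight-path distance understates the separation between the vehicles.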
In contrast, the teachings herein describe how data from the host vehicle and remote vehicle safety message may be used to identify the radius of curvature from a plurality of possible scenarios involving direction of the curve and heading angles for the vehicles. The path length between the vehicles can be determined using the radius of curvature, thus more accurately timing possible warnings or threat mitigation operations if needed. The information may also be used to identify vehicle transportation network information, such as the curved shape of the roadway. Based on the calculated curve, the host vehicle can easily determine future points along the path and then compare its current position with these predicted positions. The teachings herein provide an accurate measure of distance between the host and remote vehicles without relying on momentary yaw rates or complicated and time-consuming calculations to correlate with map data.
Further, using GPS position and heading provides a more stable prediction of vehicle heading than methods that use instantaneous values of yaw rate and speed. Details are described below in conjunction with a description of the environments in which the teachings herein may be implemented.
As used herein, the terminology “computer” or “computing device” includes any unit, or combination of units, capable of performing any method, or any portion or portions thereof, disclosed herein.
As used herein, the terminology “processor” indicates one or more processors, such as one or more special purpose processors, one or more digital signal processors, one or more microprocessors, one or more controllers, one or more microcontrollers, one or more application processors, one or more Application Specific Integrated Circuits, one or more Application Specific Standard Products, one or more Field Programmable Gate Arrays, any other type or combination of integrated circuits, one or more state machines, or any combination thereof.
As used herein, the terminology “memory” indicates any computer-usable or computer-readable medium or device that can tangibly contain, store, communicate, or transport any signal or information that may be used by or in connection with any processor. For example, a memory may be one or more read only memories (ROM), one or more random access memories (RAM), one or more registers, low power double data rate (LPDDR) memories, one or more cache memories, one or more semiconductor memory devices, one or more magnetic media, one or more optical media, one or more magneto-optical media, or any combination thereof.
As used herein, the terminology “instructions” may include directions or expressions for performing any method, or any portion or portions thereof, disclosed herein, and may be realized in hardware, software, or any combination thereof. For example, instructions may be implemented as information, such as a computer program, stored in memory that may be executed by a processor to perform any of the respective methods, algorithms, aspects, or combinations thereof, as described herein. In some embodiments, instructions, or a portion thereof, may be implemented as a special purpose processor, or circuitry, that may include specialized hardware for carrying out any of the methods, algorithms, aspects, or combinations thereof, as described herein. In some implementations, portions of the instructions may be distributed across multiple processors on a single device, on multiple devices, which may communicate directly or across a network such as a local area network, a wide area network, the Internet, or a combination thereof.
As used herein, the terminology “example”, “embodiment”, “implementation”, “aspect”, “feature” or “element” indicates serving as an example, instance or illustration. Unless expressly indicated, any example, embodiment, implementation, aspect, feature or element is independent of each other example, embodiment, implementation, aspect, feature or element and may be used in combination with any other example, embodiment, implementation, aspect, feature or element.
As used herein, the terminology “determine” and “identify”, or any variations thereof, includes selecting, ascertaining, computing, looking up, receiving, determining, establishing, obtaining, or otherwise identifying or determining in any manner whatsoever using one or more of the devices shown and described herein.
As used herein, the terminology “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to indicate any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then “X includes A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
Further, for simplicity of explanation, although the figures and descriptions herein may include sequences or series of steps or stages, elements of the methods disclosed herein may occur in various orders or concurrently. Additionally, elements of the methods disclosed herein may occur with other elements not explicitly presented and described herein. Furthermore, not all elements of the methods described herein may be required to implement a method in accordance with this disclosure. Although aspects, features, and elements are described herein in particular combinations, each aspect, feature or element may be used independently or in various combinations with or without other aspects, features and elements.
The powertrain 1200 may include a power source 1210, a transmission 1220, a steering unit 1230, an actuator 1240, or any other element or combination of elements of a powertrain, such as a suspension, a drive shaft, axles or an exhaust system. Although shown separately, the wheels 1400 may be included in the powertrain 1200.
The power source 1210 may include an engine, a battery, or a combination thereof. The power source 1210 may be any device or combination of devices operative to provide energy, such as electrical energy, thermal energy, or kinetic energy. For example, the power source 1210 may include an engine, such as an internal combustion engine, an electric motor, or a combination of an internal combustion engine and an electric motor, and may be operative to provide kinetic energy as a motive force to one or more of the wheels 1400. In some embodiments, the power source 1210 may include a potential energy unit, such as one or more dry cell batteries, such as nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion); solar cells; fuel cells; or any other device capable of providing energy.
The transmission 1220 may receive energy, such as kinetic energy, from the power source 1210, and may transmit the energy to the wheels 1400 to provide a motive force. The transmission 1220 may be controlled by the controller 1300, the actuator 1240, or both. The steering unit 1230 may be controlled by the controller 1300, the actuator 1240, or both, and may control the wheels 1400 to steer the vehicle 1000. The actuator 1240 may receive signals from the controller 1300 and may actuate or control the power source 1210, the transmission 1220, the steering unit 1230, or any combination thereof to operate the vehicle 1000.
In some embodiments, the controller 1300 may include a location unit 1310, an electronic communication unit 1320, a processor 1330, a memory 1340, a user interface 1350, a sensor 1360, an electronic communication interface 1370, or any combination thereof. Although shown as a single unit, any one or more elements of the controller 1300 may be integrated into any number of separate physical units. For example, the user interface 1350 and processor 1330 may be integrated in a first physical unit and the memory 1340 may be integrated in a second physical unit.
In some embodiments, the processor 1330 may include any device or combination of devices capable of manipulating or processing a signal or other information now-existing or hereafter developed, including optical processors, quantum processors, molecular processors, or a combination thereof. For example, the processor 1330 may include one or more special purpose processors, one or more digital signal processors, one or more microprocessors, one or more controllers, one or more microcontrollers, one or more integrated circuits, one or more Application Specific Integrated Circuits, one or more Field Programmable Gate Arrays, one or more programmable logic arrays, one or more programmable logic controllers, one or more state machines, or any combination thereof. The processor 1330 may be operatively coupled with the location unit 1310, the memory 1340, the electronic communication interface 1370, the electronic communication unit 1320, the user interface 1350, the sensor 1360, the powertrain 1200, or any combination thereof. For example, the processor may be operatively coupled with the memory 1340 via a communication bus 1380.
The memory 1340 may include any tangible non-transitory computer-usable or computer-readable medium capable of, for example, containing, storing, communicating, or transporting machine-readable instructions, or any information associated therewith, for use by or in connection with the processor 1330. The memory 1340 may be, for example, one or more solid state drives, one or more memory cards, one or more removable media, one or more read only memories, one or more random access memories, one or more disks, including a hard disk, a floppy disk, an optical disk, a magnetic or optical card, or any type of non-transitory media suitable for storing electronic information, or any combination thereof.
The communication interface 1370 may be a wireless antenna, as shown, a wired communication port, an optical communication port, or any other wired or wireless unit capable of interfacing with a wired or wireless electronic communication medium 1500.
The communication unit 1320 may be configured to transmit or receive signals via the wired or wireless medium 1500, such as via the communication interface 1370.
The location unit 1310 may determine geolocation information, such as longitude, latitude, elevation, direction of travel, or speed, of the vehicle 1000. For example, the location unit may include a global positioning system (GPS) unit, such as a Wide Area Augmentation System (WAAS) enabled National Marine Electronics Association (NMEA) unit, a radio triangulation unit, or a combination thereof. The location unit 1310 can be used to obtain information that represents, for example, a current heading of the vehicle 1000, a current position of the vehicle 1000 in two or three dimensions, a current angular orientation of the vehicle 1000, or a combination thereof.
The user interface 1350 may include any unit capable of interfacing with a person, such as a virtual or physical keypad, a touchpad, a display, a touch display, a speaker, a microphone, a video camera, a sensor, a printer, or any combination thereof. The user interface 1350 may be operatively coupled with the processor 1330, as shown, or with any other element of the controller 1300. Although shown as a single unit, the user interface 1350 may include one or more physical units. For example, the user interface 1350 may include an audio interface for performing audio communication with a person and/or a touch display for performing visual and touch-based communication with the person.
The sensor 1360 may include one or more sensors, such as an array of sensors, which may be operable to provide information that may be used to control the vehicle 1000. The sensor 1360 may provide information regarding current operating characteristics of the vehicle 1000. When multiple sensors 1360 are included, they can include, for example, a speed sensor, an acceleration sensor, a steering angle sensor, traction-related sensors, braking-related sensors, or any sensor, or combination of sensors, operable to report information regarding some aspect of the current dynamic situation of the vehicle 1000.
In some embodiments, the sensors 1360 may include one or more sensors that are operable to obtain information regarding the physical environment surrounding the vehicle 1000. For example, one or more sensors 1360 may detect road geometry and obstacles, such as fixed obstacles, vehicles and pedestrians. In some embodiments, the sensors 1360 can be or include one or more video cameras, laser-sensing systems, infrared-sensing systems, acoustic-sensing systems, or any other suitable type of on-vehicle environmental sensing device, or combination of devices, now known or later developed. In some embodiments, the sensors 1360 and the location unit 1310 may be combined.
Although not shown separately, in some embodiments, the vehicle 1000 may include a trajectory controller. The trajectory controller may be operable to obtain information describing a current state of the vehicle 1000 and a route planned for the vehicle 1000, and, based on this information, to determine and optimize a trajectory for the vehicle 1000. In some embodiments, the trajectory controller may output signals operable to control the vehicle 1000 such that the vehicle 1000 follows the trajectory that is determined by the trajectory controller. For example, the output of the trajectory controller can be an optimized trajectory that may be supplied to the powertrain 1200, the wheels 1400, or both. In some embodiments, the optimized trajectory can be control inputs such as a set of steering angles, with each steering angle corresponding to a point in time or a position. In some embodiments, the optimized trajectory can be one or more paths, lines, curves, or a combination thereof. The trajectory controller may be implemented, at least in part, using one or more elements of the controller 1300.
One or more of the wheels 1400 may be a steered wheel, which may be pivoted to a steering angle under control of the steering unit 1230, a propelled wheel, which may be torqued to propel the vehicle 1000 under control of the transmission 1220, or a steered and propelled wheel that may steer and propel the vehicle 1000.
In some embodiments, the electronic communication network 2300 may be, for example, a multiple access system and may provide for communication, such as voice communication, data communication, video communication, messaging communication, or a combination thereof, between each vehicle 2100/2110 and one or more communicating devices 2400. For example, a vehicle 2100/2110 may receive information, such as information representing the vehicle transportation network 2200, from a communicating device 2400 via the network 2300. In certain embodiments described herein, the electronic communication network 2300 can be used in vehicle-to-vehicle communication of the basic safety message containing location and trajectory information of the vehicle 2100. Each vehicle 2100/2110 may also communicate this information directly to one or more other vehicles as discussed in more detail below.
In some embodiments, a vehicle 2100/2110 may communicate via a wired communication link (not shown), a wireless communication link 2310/2320/2370, or a combination of any number of wired or wireless communication links. For example, as shown, a vehicle 2100/2110 may communicate via a terrestrial wireless communication link 2310, via a non-terrestrial wireless communication link 2320, or via a combination thereof. In some implementations, a terrestrial wireless communication link 2310 may include an Ethernet link, a serial link, a Bluetooth link, an infrared (IR) link, an ultraviolet (UV) link, or any link capable of providing for electronic communication.
In some embodiments, a vehicle 2100/2110 may communicate with another vehicle 2100/2110. For example, a host, or subject, vehicle (HV) 2100 may receive one or more automated inter-vehicle messages, such as the basic safety message, from a remote, or target, vehicle (RV) 2110, via a direct communication link 2370, or via the network 2300. For example, the remote vehicle 2110 may broadcast the message to host vehicles within a defined broadcast range, such as 300 meters. In some embodiments, the host vehicle 2100 may receive a message via a third party, such as a signal repeater (not shown) or another remote vehicle (not shown). In some embodiments, a vehicle 2100/2110 may transmit one or more automated inter-vehicle messages periodically based on a defined interval, such as 100 milliseconds.
Automated inter-vehicle messages may include vehicle identification information, geospatial state information, such as longitude, latitude and/or elevation information, geospatial location accuracy information, kinematic state information, such as vehicle acceleration information, yaw rate information, speed information, vehicle heading information, braking system status information, throttle information, steering wheel angle information, or vehicle routing information, or vehicle operating state information, such as vehicle size information, headlight state information, turn signal information, wiper status information, transmission information, or any other information, or combination of information, relevant to the transmitting vehicle state. For example, transmission state information may indicate whether the transmitting vehicle is in a neutral state, a parked state, a forward state or a reverse state.
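The content of such an automated inter-vehicle message can be sketched as a simple record. The field names below are illustrative only and do not reflect any standardized wire format (for example, the SAE J2735 basic safety message encoding); they merely group the categories of information enumerated above.

```python
from dataclasses import dataclass

@dataclass
class BasicSafetyMessage:
    """Illustrative grouping of automated inter-vehicle message content:
    identification, geospatial state, and kinematic/operating state."""
    vehicle_id: str
    longitude_deg: float
    latitude_deg: float
    elevation_m: float
    speed_mps: float
    heading_rad: float        # heading angle, clockwise from north
    yaw_rate_rps: float
    transmission_state: str   # e.g., "neutral", "park", "forward", "reverse"
```

A receiving host vehicle would extract the geospatial state (longitude, latitude, elevation) and kinematic state (speed, heading, yaw rate) from each message to perform the determinations described herein.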
In some embodiments, the vehicle 2100 may communicate with the communication network 2300 via an access point 2330. The access point 2330, which may include a computing device, may be configured to communicate with a vehicle 2100, with a communication network 2300, with one or more communication devices 2400, or with a combination thereof via wired or wireless communication links 2310/2340. For example, an access point 2330 may be a base station, a base transceiver station (BTS), a Node-B, an enhanced Node-B (eNode-B), a Home Node-B (HNode-B), a wireless router, a wired router, a hub, a relay, a switch, or any similar wired or wireless device. Although shown as a single unit, an access point may include any number of interconnected elements.
In some embodiments, the vehicle 2100 may communicate with the communication network 2300 via a satellite 2350, or other non-terrestrial communication device. The satellite 2350, which may include a computing device, may be configured to communicate with the vehicle 2100, with the communication network 2300, with one or more communication devices 2400, or with a combination thereof via one or more communication links 2320/2360. Although shown as a single unit, a satellite may include any number of interconnected elements.
The vehicle 2110 may similarly communicate with the communication network 2300 via the access point 2330 and/or the satellite 2350.
An electronic communication network 2300 may be any type of network configured to provide for voice, data, or any other type of electronic communication. For example, the electronic communication network 2300 may include a local area network (LAN), a wide area network (WAN), a virtual private network (VPN), a mobile or cellular telephone network, the Internet, or any other electronic communication system. The electronic communication network 2300 may use a communication protocol, such as the transmission control protocol (TCP), the user datagram protocol (UDP), the internet protocol (IP), the real-time transport protocol (RTP), the Hyper Text Transport Protocol (HTTP), or a combination thereof. Although shown as a single unit, an electronic communication network may include any number of interconnected elements.
In some embodiments, a vehicle 2100 may identify a portion or condition of the vehicle transportation network 2200. For example, the vehicle may include one or more on-vehicle sensors 2150, such as the sensor 1360.
In some embodiments, a vehicle 2100 may traverse a portion or portions of one or more vehicle transportation networks 2200 using information communicated via the network 2300, such as information representing the vehicle transportation network 2200, information identified by one or more on-vehicle sensors 2150, or a combination thereof.
Although the vehicle 2100 is shown communicating with the communication device 2400 via the network 2300, the vehicle 2100 and/or the vehicle 2110 may communicate with the communication device 2400 via any number of direct or indirect communication links. For example, each vehicle 2100/2110 may communicate with the communication device 2400 via a direct communication link, such as a Bluetooth communication link.
In some embodiments, the host vehicle HV may traverse a portion of a vehicle transportation network (not shown in
In some embodiments, the automated inter-vehicle messages may indicate information such as geospatial location information and heading information. In some embodiments, the host vehicle HV may transmit one or more automated inter-vehicle messages including host vehicle information, such as host vehicle heading information. Similarly, the remote vehicle RV may transmit one or more automated inter-vehicle messages including remote vehicle information, such as remote vehicle heading information.
In some embodiments, the host vehicle HV may identify a host vehicle expected path for the host vehicle HV based on host vehicle information, such as host vehicle geospatial state information and host vehicle kinematic state information. In some embodiments, the host vehicle HV may identify a remote vehicle expected path for a remote vehicle RV based on the automated inter-vehicle messages, which may include remote vehicle information, such as remote vehicle geospatial state information and remote vehicle kinematic state information. This information may be used to generate projected vehicle transportation network information.
According to the description of the teachings herein, the host vehicle HV is assumed to be on a curved path following a remote vehicle RV on the curved path. The curved path is an example of projected vehicle transportation network information.
The defined set of orientation sectors may be identified in the geospatial domain relative to the host vehicle HV and a reference direction, which is north in this example. The reference direction may alternatively correspond to the host vehicle heading angle δHV.
The geodesic may be determined based on host vehicle information, such as a geospatial location of the host vehicle, remote vehicle information, such as a geospatial location of the remote vehicle, or a combination thereof. For example, the host vehicle information may indicate a longitude (θHV) for the host vehicle, a latitude (φHV) for the host vehicle, or both, the remote vehicle information may indicate a longitude (θRV) for the remote vehicle, a latitude (φRV) for the remote vehicle, or both, σ may indicate a very small value used to avoid dividing by zero, and determining the convergence angle β1 may be expressed as Equation (1) below:
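Equation (1) itself is not reproduced in this excerpt. As an assumed, non-limiting stand-in, the convergence angle β1 can be sketched as the approximate bearing of the geodesic from the host vehicle to the remote vehicle, using a flat-earth approximation that is reasonable over short V2V ranges; the function and variable names below are illustrative.

```python
import math

SIGMA = 1e-12  # very small value used to avoid dividing by zero

def convergence_angle(lat_hv, lon_hv, lat_rv, lon_rv):
    """Approximate convergence angle beta1: bearing of the geodesic from
    the host vehicle to the remote vehicle, in radians clockwise from
    north. Inputs are latitudes/longitudes in radians; valid for small
    separations between the vehicles."""
    dlat = lat_rv - lat_hv
    # Shrink the east-west span by the cosine of latitude so that both
    # components are expressed in comparable angular units.
    dlon = (lon_rv - lon_hv) * math.cos(lat_hv)
    beta = math.atan2(dlon, dlat + SIGMA)
    return beta % (2.0 * math.pi)  # normalize to [0, 2*pi)
```

A remote vehicle due north of the host yields β1 = 0, and a remote vehicle due east yields β1 ≈ π/2, consistent with the north reference direction used for the orientation sectors.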
A length of the geodesic, or instantaneous distance D between the host vehicle HV and the remote vehicle RV, may be determined based on the host vehicle information, the remote vehicle information, or a combination thereof. For example, f may indicate an earth flattening value, such as f=1/298.257223563, re may indicate a measure of the earth's equatorial radius, such as re=6,378,137 meters, and determining the distance D may be expressed as Equation (2) below:
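Equation (2) is likewise not reproduced in this excerpt. As an assumed stand-in, the instantaneous distance D can be sketched with an equirectangular approximation on a sphere of equatorial radius re, which is adequate over the roughly 300-meter V2V broadcast range discussed above; the names below are illustrative, and the flattening constant f is shown for completeness even though this simplified sketch does not apply an ellipsoidal correction.

```python
import math

F = 1.0 / 298.257223563   # earth flattening value (unused in this sketch)
RE = 6378137.0            # earth's equatorial radius, meters

def approx_distance(lat_hv, lon_hv, lat_rv, lon_rv):
    """Approximate length D of the geodesic between the host and remote
    vehicles, in meters. Inputs are latitudes/longitudes in radians."""
    # Equirectangular projection about the mean latitude.
    x = (lon_rv - lon_hv) * math.cos((lat_hv + lat_rv) / 2.0)
    y = lat_rv - lat_hv
    return RE * math.hypot(x, y)
```

As a sanity check, one degree of latitude corresponds to roughly 111 km on this model.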
In the example of the first orientation sector Q1, the host vehicle HV has a host vehicle heading angle δHV where 0≤δHV<π/2 (i.e., from 0 to 90 degrees from the reference direction).
In a similar manner, the orientation of the remote vehicle RV may be defined. The remote vehicle may have a remote vehicle heading angle δRV where 0≤δRV<π/2 (i.e., from 0 to 90 degrees from the reference direction). The remote vehicle may have a remote vehicle heading angle δRV where π/2≤δRV<π (i.e., from 90 to 180 degrees from the reference direction). The remote vehicle may have a remote vehicle heading angle δRV where π≤δRV<3π/2 (i.e., from 180 to 270 degrees from the reference direction). Finally, the remote vehicle may have a remote vehicle heading angle δRV where 3π/2≤δRV<2π (i.e., from 270 to 360 degrees from the reference direction).
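The four quadrant ranges above can be captured by a small helper that maps a heading angle to its orientation sector; the function name is illustrative.

```python
import math

def orientation_sector(heading_rad):
    """Map a heading angle to an orientation sector 1..4, where sector 1
    covers [0, pi/2), sector 2 covers [pi/2, pi), and so on. The lower
    bound of each sector is inclusive, matching the ranges above."""
    h = heading_rad % (2.0 * math.pi)
    return 1 + int(h // (math.pi / 2.0))
```

This mapping is applied to both the host vehicle heading angle δHV and the remote vehicle heading angle δRV, giving the combinations of heading-angle ranges used in the geometric scenarios developed below.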
A radius of curvature of the curved path P and an angle extending along the curved path between the host vehicle HV and the remote vehicle RV (also called a facing angle herein) can be used to determine an expected host vehicle route for the host vehicle. Determining an expected host vehicle route can include determining the path length of the curved path P. The path length is the distance between the host vehicle HV and the remote vehicle RV along the curved path P.
The calculation, however, is complicated because vehicle locations and heading angles do not indicate the direction of the curve forming the curved path. That is, the calculation of path length requires knowing the direction of the curve, and hence knowing which vehicle is following which, but a remote vehicle message that includes remote vehicle geospatial state information for the remote vehicle and remote vehicle kinematic state information for the remote vehicle including at least a remote heading angle for the remote vehicle does not provide this information, even in conjunction with similar information identified for the host vehicle. This deficiency is addressed herein by assuming that the host vehicle HV is following the remote vehicle RV along a curved path P of constant radius R in known directions. From the developed geometric relationships, generating the projected vehicle transportation network information can be achieved by generating a common predictor for the path length whether the curved path is curving left or curving right.
The development of the geometric relationships starts with the four different ranges of values for each of the host vehicle heading angle δHV and the remote vehicle heading angle δRV associated with every measured position of the vehicle as described above with reference to
Referring first to
To determine the angle extending along the curved path P between the host vehicle HV and the remote vehicle RV, the radius of curvature R is first determined. More specifically, the host vehicle heading angle δHV defines a host vehicle trajectory at a particular geospatial location along the curved path P. Similarly, the remote vehicle heading angle δRV defines a remote vehicle trajectory at a particular geospatial location along the curved path P. Lines perpendicular to these trajectories, each extending from its vehicle to the left relative to the trajectory, intersect at an intersect angle α3. In
Together with the geodesic D, lines R form a triangle. Accordingly, by determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV, the intersect angle α3 can be calculated as α3=π−(α1+α2). Determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV can be achieved using geometric equations. Acute angle α1 between the geodesic D and line RHV may also be referred to herein as a host angle as it is generated from intersecting line segments at the host vehicle. Acute angle α2 between the geodesic D and line RRV may also be referred to herein as a remote angle as it is generated from intersecting line segments at the remote vehicle.
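The triangle relationship above may be sketched as a small helper (illustrative only; all angles are in radians):

```python
import math

def intersect_angle(alpha_host: float, alpha_remote: float) -> float:
    """Intersect angle a3 at which the two perpendiculars meet, computed
    from the host angle a1 and the remote angle a2: the three interior
    angles of the triangle formed with the geodesic D sum to pi, so
    a3 = pi - (a1 + a2)."""
    return math.pi - (alpha_host + alpha_remote)
```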
In
π/2−δHV=α1−β1; and
β1+π+α2+π/2−δRV=2π.
Solving for acute angle α1 results in:
α1=π/2−δHV+β1.
Solving for acute angle α2 results in:
α2=π/2+δRV−β1.
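As an illustrative sketch of this first case only (the other heading-angle combinations use the other case equations derived herein), the two solutions may be expressed as:

```python
import math

def case1_acute_angles(delta_hv: float, delta_rv: float, beta1: float):
    """Acute angles a1 (host angle) and a2 (remote angle) for the case
    above; beta1 is the bearing angle of the geodesic D (radians).
    a1 = pi/2 - dHV + b1 and a2 = pi/2 + dRV - b1, per the derivation."""
    a1 = math.pi / 2 - delta_hv + beta1
    a2 = math.pi / 2 + delta_rv - beta1
    return a1, a2
```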
In
β1+π+α2+π/2=δRV.
Solving for acute angle α2 results in:
α2=−(3π/2−δRV+β1).
In
β1−α1+π/2−δHV=2π; and
β1−π+α2+π/2=δRV.
Solving for acute angle α1 results in:
α1=−(3π/2+δHV−β1).
Solving for acute angle α2 results in:
α2=π/2+δRV−β1.
In
β1+α1+π/2−δHV=2π; and
β1−π−α2=δRV−π/2.
Solving for acute angle α1 results in:
α1=3π/2+δHV−β1.
Solving for acute angle α2 results in:
α2=−(π/2+δRV−β1).
Referring next to
To determine the angle extending along the curved path P between the host vehicle HV and the remote vehicle RV, the radius of curvature R is first determined. More specifically, the host vehicle heading angle δHV defines a host vehicle trajectory at a particular geospatial location along the curved path P. Similarly, the remote vehicle heading angle δRV defines a remote vehicle trajectory at a particular geospatial location along the curved path P. Lines perpendicular to these trajectories, each extending from its vehicle to the left relative to the trajectory, intersect at an intersect angle α3. As in
Together with the geodesic D, lines R form a triangle. Accordingly, by determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV, the intersect angle α3 can be calculated as α3=π−(α1+α2). Determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV can be achieved using geometric equations.
In
δHV−β1=π/2−α1; and
β1−π+α2+π/2=δRV.
Solving for acute angle α1 results in:
α1=π/2−δHV+β1.
Solving for acute angle α2 results in:
α2=π/2+δRV−β1.
In
In
In
δHV=β1+α1+π/2; and
β1+π−α2+π/2−δRV=2π.
Solving for acute angle α1 results in:
α1=−(π/2−δHV+β1).
Solving for acute angle α2 results in:
α2=−(π/2+δRV−β1).
Referring next to
To determine the angle extending along the curved path P between the host vehicle HV and the remote vehicle RV, the radius of curvature R is first determined. More specifically, the host vehicle heading angle δHV defines a host vehicle trajectory at a particular geospatial location along the curved path P. Similarly, the remote vehicle heading angle δRV defines a remote vehicle trajectory at a particular geospatial location along the curved path P. Lines perpendicular to these trajectories, each extending from its vehicle to the left relative to the trajectory, intersect at an intersect angle α3. As in
Together with the geodesic D, lines R form a triangle. Accordingly, by determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV, the intersect angle α3 can be calculated as α3=π−(α1+α2). Determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV can be achieved using geometric equations.
In
δHV−β1=π/2−α1; and
β1−π+α2+π/2=δRV.
Solving for acute angle α1 results in:
α1=π/2−δHV+β1.
Solving for acute angle α2 results in:
α2=π/2+δRV−β1.
In
In
β1+π+α2+π/2−δRV=2π.
Solving for acute angle α2 results in:
α2=π/2+δRV−β1.
In
β1+α1+π/2=δHV; and
β1+π−α2=δRV−π/2.
Solving for acute angle α1 results in:
α1=−(π/2−δHV+β1).
Solving for acute angle α2 results in:
α2=3π/2−δRV+β1.
Referring next to
To determine the angle extending along the curved path P between the host vehicle HV and the remote vehicle RV, the radius of curvature R is first determined. More specifically, the host vehicle heading angle δHV defines a host vehicle trajectory at a particular geospatial location along the curved path P. Similarly, the remote vehicle heading angle δRV defines a remote vehicle trajectory at a particular geospatial location along the curved path P. Lines perpendicular to these trajectories, each extending from its vehicle to the left relative to the trajectory, intersect at an intersect angle α3. As in
Together with the geodesic D, lines R form a triangle. Accordingly, by determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV, the intersect angle α3 can be calculated as α3=π−(α1+α2). Determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV can be achieved using geometric equations.
In
δHV−β1=π/2−α1; and
β1+π+α2+π/2−δRV=2π.
Solving for acute angle α1 results in:
α1=π/2−δHV+β1.
Solving for acute angle α2 results in:
α2=π/2+δRV−β1.
In
In
β1+π+α2+π/2=δRV.
Solving for acute angle α2 results in:
α2=−(3π/2−δRV+β1).
In
β1+α1+π/2−δHV=2π; and
δRV−(β1−π)=π/2−α2.
Solving for acute angle α1 results in:
α1=3π/2+δHV−β1.
Solving for acute angle α2 results in:
α2=−(π/2+δRV−β1).
As mentioned above, the cases described with reference to
To calculate path length L, an angle extending along the curved path P between the host vehicle HV and the remote vehicle RV is desirable. This is called the path angle herein. In most cases shown in
A similar analysis applies to
Specifically, and referring first to
To determine the angle extending along the curved path P between the host vehicle HV and the remote vehicle RV, the radius of curvature R is first determined. More specifically, the host vehicle heading angle δHV defines a host vehicle trajectory at a particular geospatial location along the curved path P. Similarly, the remote vehicle heading angle δRV defines a remote vehicle trajectory at a particular geospatial location along the curved path P. Lines perpendicular to these trajectories, each extending from its vehicle to the left relative to the trajectory, intersect at an intersect angle α3. In
Together with the geodesic D, lines R form a triangle. Accordingly, by determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV, the intersect angle α3 can be calculated as α3=π−(α1+α2). Determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV can be achieved using geometric equations.
In
π/2+δHV=α1+β1; and
δRV−β1+π/2+α2=π.
Solving for acute angle α1 results in:
α1=π/2+δHV−β1.
Solving for acute angle α2 results in:
α2=π/2−δRV+β1.
In
In
In
δHV+π/2+α1=β1; and
δRV+π/2−α2−(β1−π)=2π.
Solving for acute angle α1 results in:
α1=−(π/2+δHV−β1).
Solving for acute angle α2 results in:
α2=−(π/2−δRV+β1).
Referring next to
To determine the angle extending along the curved path P between the host vehicle HV and the remote vehicle RV, the radius of curvature R is first determined. More specifically, the host vehicle heading angle δHV defines a host vehicle trajectory at a particular geospatial location along the curved path P. Similarly, the remote vehicle heading angle δRV defines a remote vehicle trajectory at a particular geospatial location along the curved path P. Lines perpendicular to these trajectories, each extending from its vehicle to the left relative to the trajectory, intersect at an intersect angle α3. As in
Together with the geodesic D, lines R form a triangle. Accordingly, by determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV, the intersect angle α3 can be calculated as α3=π−(α1+α2). Determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV can be achieved using geometric equations.
In
π/2+δHV=α1+β1; and
δRV−β1+π/2+α2=π.
Solving for acute angle α1 results in:
α1=π/2+δHV−β1.
Solving for acute angle α2 results in:
α2=π/2−δRV+β1.
In
In
δRV+π/2−(β1−π−α2)=2π.
Solving for acute angle α2 results in:
α2=π/2−δRV+β1.
In
δHV+π/2+α1=β1; and
δRV+π/2=β1−π+α2.
Solving for acute angle α1 results in:
α1=−(π/2+δHV−β1).
Solving for acute angle α2 results in:
α2=3π/2+δRV−β1.
Referring next to
To determine the angle extending along the curved path P between the host vehicle HV and the remote vehicle RV, the radius of curvature R is first determined. More specifically, the host vehicle heading angle δHV defines a host vehicle trajectory at a particular geospatial location along the curved path P. Similarly, the remote vehicle heading angle δRV defines a remote vehicle trajectory at a particular geospatial location along the curved path P. Lines perpendicular to these trajectories, each extending from its vehicle to the left relative to the trajectory, intersect at an intersect angle α3. As in
Together with the geodesic D, lines R form a triangle. Accordingly, by determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV, the intersect angle α3 can be calculated as α3=π−(α1+α2). Determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV can be achieved using geometric equations.
In
π/2+δHV=α1+β1; and
δRV+π/2+α2−(β1−π)=2π.
Solving for acute angle α1 results in:
α1=π/2+δHV−β1.
Solving for acute angle α2 results in:
α2=π/2−δRV+β1.
In
δRV+π/2−(β1−π−α2)=2π.
Solving for acute angle α2 results in:
α2=π/2−δRV+β1.
In
δHV+π/2=β1+α1; and
δRV+π/2+α2=β1−π.
Solving for acute angle α1 results in:
α1=π/2+δHV−β1.
Solving for acute angle α2 results in:
α2=−(3π/2+δRV−β1).
In
δHV+π/2+α1−β1=2π; and
δRV+π/2=β1+π+α2.
Solving for acute angle α1 results in:
α1=3π/2−δHV+β1.
Solving for acute angle α2 results in:
α2=−(π/2−δRV+β1).
Referring next to
To determine the angle extending along the curved path P between the host vehicle HV and the remote vehicle RV, the radius of curvature R is first determined. More specifically, the host vehicle heading angle δHV defines a host vehicle trajectory at a particular geospatial location along the curved path P. Similarly, the remote vehicle heading angle δRV defines a remote vehicle trajectory at a particular geospatial location along the curved path P. Lines perpendicular to these trajectories, each extending from its vehicle to the left relative to the trajectory, intersect at an intersect angle α3. As in
Together with the geodesic D, lines R form a triangle. Accordingly, by determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV, the intersect angle α3 can be calculated as α3=π−(α1+α2). Determining the acute angle α1 between the geodesic D and line RHV and the acute angle α2 between the geodesic D and line RRV can be achieved using geometric equations.
In
β1−δHV=π/2−α1; and
δRV+π/2+α2−(β1−π)=2π.
Solving for acute angle α1 results in:
α1=π/2+δHV−β1.
Solving for acute angle α2 results in:
α2=π/2−δRV+β1.
In
δRV+π/2+α2=β1−π.
Solving for acute angle α2 results in:
α2=−(3π/2+δRV−β1).
In
δHV+π/2−β1−α1=2π; and
δRV+π/2+α2−β1=π.
Solving for acute angle α1 results in:
α1=−(3π/2−δHV+β1).
Solving for acute angle α2 results in:
α2=π/2−δRV+β1.
In
δHV+π/2+α1−β1=2π; and
β1+π+α2=δRV+π/2.
Solving for acute angle α1 results in:
α1=3π/2−δHV+β1.
Solving for acute angle α2 results in:
α2=−(π/2−δRV+β1).
As mentioned above, the cases described with reference to
To calculate path length L, an angle extending along the curved path P between the host vehicle HV and the remote vehicle RV is desirable. This is called the path angle herein. In most cases shown in
While there are sixteen possible values for the angles α1, α2 and α′3 for each of a left curve and a right curve, it is desirable to have a single equation for each angle. To that end, a combination matrix Fm,n of normalizing values H1 through H4 associated with the host vehicle heading angle δHV and normalizing values R1 through R4 associated with the remote vehicle heading angle δRV may be expressed as shown in Table 7.
In Table 7, H1 through H4 are calculated as in Table 8 below.
In Table 7, R1 through R4 are calculated as in Table 9 below.
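Because Tables 8 and 9 are not reproduced here, the following sketch assumes that H1 through H4 and R1 through R4 are 0/1 indicator values that mark which quarter range contains the respective heading angle; under that assumption, the combination matrix Fm,n = Hm·Rn has exactly one nonzero entry and acts as the filter described below:

```python
import math

def indicator_vector(delta: float):
    """One-hot vector [X1..X4] marking which quarter range [0, pi/2),
    [pi/2, pi), [pi, 3*pi/2), [3*pi/2, 2*pi) contains the heading angle.
    Assumed to be what H1..H4 / R1..R4 in Tables 8 and 9 encode."""
    q = int((delta % (2 * math.pi)) // (math.pi / 2))
    return [1 if i == q else 0 for i in range(4)]

def combination_matrix(delta_hv: float, delta_rv: float):
    """F[m][n] = H_(m+1) * R_(n+1): exactly one entry equals 1, so
    summing F[m][n] * formula[m][n] filters out all but the single
    applicable case formula."""
    H = indicator_vector(delta_hv)
    R = indicator_vector(delta_rv)
    return [[h * r for r in R] for h in H]
```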
The combination matrix Fm,n is effectively a filter. As such, it can be used to filter out all but the relevant values for the value of acute angle α1 while in a left curve (referred to as α1L) or while in a right curve (referred to as α1R). The filtered value for α1L is represented by the matrix in Table 10 below where:
Similarly, the filtered value for α1R is represented by the matrix in Table 11 below where:
The combination matrix Fm,n can also be used to filter out all but the relevant values for the value of acute angle α2 while in a left curve (referred to as α2L) or while in a right curve (referred to as α2R). The filtered value for α2L is represented by the matrix in Table 12 below where:
Similarly, the filtered value for α2R is represented by the matrix in Table 13 below where:
Finally, the combination matrix Fm,n can be used to filter out all but the relevant values for the value of facing angle α′3 while in a left curve (referred to as α′3L) or while in a right curve (referred to as α′3R). The filtered value for α′3L is represented by the matrix in Table 14 below where:
Similarly, the filtered value for α′3R is represented by the matrix in Table 15 below where:
The matrices above produce two values each for α1, α2 and α′3—one for left-hand curves and one for right-hand curves. Whether the curved path P includes a left-hand curve or a right-hand curve may be determined by calculating one or both of two operators L and R. Operators L and R may also be called, respectively, left and right probability operators. Operator L may be calculated by:
where each value of Lm,n is calculated according to Equation (3) below:
Operator R may be calculated by:
where each value of Rm,n is calculated according to Equation (3) below:
If L=1, the host vehicle is in a left-hand curve and hence the curved path is curved in the left direction relative to the reference direction. Otherwise, if R=1, the host vehicle is in a right-hand curve and hence the curved path is curved in the right direction relative to the reference direction. For this reason, the angles α1, α2 and α′3 can be generally defined using operators L and R and the results of Tables 10 to 15 as follows:
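The selection step may be sketched as follows (illustrative only; it assumes exactly one of the operators L and R equals 1, as described above):

```python
def select_by_curve_direction(L: int, R: int,
                              value_left: float, value_right: float) -> float:
    """Combine the left- and right-filtered values with the probability
    operators: since exactly one of L, R is 1, the sum reduces to the
    applicable value, e.g. a1 = L*a1L + R*a1R."""
    return L * value_left + R * value_right
```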
Knowing the angles α1, α2 and α′3, the law of sines can be used to determine the radius of curvature. Namely, the law of sines relates the lengths of the sides of a triangle to the sines of its angles. According to the triangles formed in
Similarly, the following results from the law of sines:
In each case, the length of the geodesic D may be determined according to Equation (2). Recall, however, that the radius of curvature determined at the host vehicle RHV and the radius of curvature determined at the remote vehicle RRV should be equal or substantially equal. Although not strictly necessary, determining both RHV and RRV can indicate the quality of the data. That is, if the difference between RHV and RRV is greater than a minimum defined value (determined by experimentation, for example), it could indicate that the input data cannot be trusted to yield a correct calculation of the path length L of the curved path P. This could occur due to faulty sensors, a bad communication link, etc. However, it could also result from a sudden change in trajectory by one or both of the host vehicle and the remote vehicle to avoid an obstacle.
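An illustrative sketch of the law-of-sines recovery of the radii and the data-quality comparison described above (it assumes the side D is opposite the intersect angle α3, RHV is opposite the remote angle α2, and RRV is opposite the host angle α1; Equation (2) for D is not reproduced here):

```python
import math

def radii_from_law_of_sines(D: float, a1: float, a2: float):
    """Apply the law of sines to the triangle with sides R_HV, R_RV and
    the geodesic D: side D is opposite a3 = pi - (a1 + a2), R_HV is
    opposite the remote angle a2, and R_RV is opposite the host angle a1."""
    a3 = math.pi - (a1 + a2)
    r_hv = D * math.sin(a2) / math.sin(a3)
    r_rv = D * math.sin(a1) / math.sin(a3)
    return r_hv, r_rv

def radii_consistent(r_hv: float, r_rv: float, tolerance: float) -> bool:
    """Quality gate: the two radii should agree within a tolerance
    determined by experimentation; a larger gap flags untrustworthy input
    (faulty sensors, a bad link, or a sudden trajectory change)."""
    return abs(r_hv - r_rv) <= tolerance
```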
Path length L, that is, the length of the curved path P between the host vehicle and the remote vehicle may be determined by either of the following calculations using Equation (2) in place of D.
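Those calculations are not reproduced here, but for a circular arc of substantially constant radius the path length is simply the radius multiplied by the facing angle in radians, which may be sketched as:

```python
import math

def arc_path_length(radius: float, facing_angle: float) -> float:
    """Path length L along a circular curve of the given radius
    subtending the facing angle a'3 (radians): L = R * a'3."""
    return radius * facing_angle
```

For example, a 200 m radius and a 45-degree (π/4) facing angle give a path length of roughly 157 m.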
As described above, the geometric relationships developed using vehicle positions and heading angle combinations for left-hand and right-hand curves generate projected vehicle transportation network information in the form of a calculation of the path length L for vehicles traveling along a curved path, without requiring knowledge of whether the curved path is curving left or curving right.
In some embodiments, traversing a vehicle transportation network including generating projected vehicle transportation network information includes receiving remote vehicle information at 6000, identifying host vehicle information at 6100, generating projected vehicle transportation network information at 6200, determining an expected route at 6300 and traversing a curved path using the expected route at 6400.
A host vehicle receives remote vehicle information while traversing a portion of a vehicle transportation network at 6000. Remote vehicle information received by a host vehicle at 6000 may be in the form of a remote vehicle message that indicates remote vehicle geospatial state information for the remote vehicle and remote vehicle kinematic state information for the remote vehicle. The remote vehicle geospatial state information may include, for example, geospatial coordinates for the remote vehicle. These coordinates may be GPS coordinates for a latitude and a longitude of the remote vehicle in some embodiments. The remote vehicle kinematic state information includes at least a remote heading angle for the remote vehicle, or information from which the remote heading angle may be determined. The remote heading angle for the remote vehicle is provided with reference to a reference direction (e.g., north). Other remote vehicle kinematic state information may be provided within the remote vehicle message as discussed initially, including remote vehicle speed, remote vehicle acceleration, etc. The remote vehicle information may be received via any of the wireless electronic communication links discussed with reference to
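As a non-limiting illustration, the contents of such a remote vehicle message might be modeled as follows (all field names are illustrative assumptions, not part of the disclosure):

```python
from dataclasses import dataclass

@dataclass
class RemoteVehicleMessage:
    """Sketch of the remote vehicle message contents described above."""
    latitude: float       # geospatial state (e.g., GPS latitude)
    longitude: float      # geospatial state (e.g., GPS longitude)
    heading_angle: float  # radians from the reference direction
    speed: float = 0.0          # optional kinematic state
    acceleration: float = 0.0   # optional kinematic state
```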
Identifying host vehicle information occurs at 6100. For example, the host vehicle may identify host vehicle information by reading various sensors throughout the host vehicle as discussed with reference to
The host vehicle kinematic state information includes at least a host heading angle for the host vehicle. The host heading angle is provided with reference to a reference direction. Generally, this direction is north and is the same as the reference direction used to identify the remote vehicle heading angle, but this is not required. The vehicles may use different reference directions as long as the host vehicle knows the reference directions so as to convert them to a common reference direction. The reference direction does not need to be the same reference direction as used to generate the projected vehicle transportation network information. The host vehicle kinematic state information may include additional information such as host vehicle speed, host vehicle acceleration, etc.
Upon receipt of remote vehicle information and identification of host vehicle information, projected vehicle transportation network information may be generated at 6200. For example, the host vehicle may generate projected vehicle transportation network information representing a portion of the vehicle transportation network based on the remote vehicle information received at 6000 and the host vehicle information identified at 6100. This portion may include a curved path having a substantially constant radius.
In some embodiments, generating the projected vehicle transportation network information includes generating a remote radius of curvature of a travel path for the remote vehicle using the remote heading angle and the remote vehicle geospatial state information, generating a host radius of curvature of a travel path for the host vehicle using the host heading angle and the host vehicle geospatial state information, and generating the curved path using the remote radius of curvature, the host radius of curvature, and an angle between the remote radius of curvature and the host radius of curvature. The angle between the remote radius of curvature and the host radius of curvature may be, for example, facing angle α′3.
In some embodiments, generating the projected vehicle transportation network information includes generating a remote radius of curvature of a travel path for the remote vehicle using the remote heading angle, generating a host radius of curvature of a travel path for the host vehicle using the host heading angle, and generating a length of the curved path between the host vehicle and the remote vehicle using the remote radius of curvature, the host radius of curvature, the remote vehicle geospatial state information, the host vehicle geospatial state information, and an angle between the remote radius of curvature and the host radius of curvature. The angle between the remote radius of curvature and the host radius of curvature may be, for example, facing angle α′3.
In some embodiments, bad data can be detected and optionally omitted from the generation of the projected vehicle transportation network information. For example, values for the remote heading angle and the remote vehicle geospatial state information may be periodically received and/or identified. Then, the remote radius of curvature for the remote vehicle for each of the periodically received values of the remote heading angle and the remote vehicle geospatial state information with a respective host radius of curvature generated from concurrent values of the host heading angle and the host vehicle geospatial state information may be compared. When the pairs of values including a remote radius of curvature and a host radius of curvature have values that are different by more than a predetermined amount, they may be ignored or omitted. Essentially, such a difference may indicate the data is bad. If used to calculate path length, different results would occur, making it unclear what calculated path length value was correct.
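The screening of mismatched radius pairs may be sketched as follows (illustrative; the tolerance corresponds to the predetermined amount described above):

```python
def filter_radius_pairs(pairs, max_difference):
    """Keep only (R_remote, R_host) samples whose radii agree to within
    the predetermined amount; mismatched pairs are omitted as bad data."""
    return [(rr, rh) for rr, rh in pairs if abs(rr - rh) <= max_difference]
```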
In some embodiments, generating the projected vehicle transportation network information includes generating the equation for path length L discussed above with reference to
In some embodiments, generating the projected vehicle transportation network information includes generating the path length L using the equation for the path length L discussed above with reference to
Once the projected vehicle transportation network information is generated at 6200, it can be used to determine an expected route of the host vehicle and optionally of the remote vehicle at 6300. For example, the current position of the host vehicle along the curved path and/or the path length can be used, together with the speed of the host vehicle, to generate an expected position of the host vehicle along the curved path at a future point in time. Even when the speed and/or acceleration of the remote vehicle is not received as part of the remote vehicle information, the path length over time can be used to determine if the host vehicle is gaining on the remote vehicle and, if so, how long before they intersect on the curved path without further changes in the system.
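An illustrative sketch of estimating when the host vehicle would reach the remote vehicle along the curve (the names and the finite-difference fallback for an unknown remote speed are assumptions for illustration):

```python
def time_to_intersect(path_length, host_speed, remote_speed=None,
                      prev_path_length=None, dt=None):
    """Rough time for the host vehicle to close the along-curve gap.
    If the remote speed is unknown, estimate the closing rate from how
    the path length changed over the last interval dt. Returns None
    when the host vehicle is not gaining on the remote vehicle."""
    if remote_speed is not None:
        closing_rate = host_speed - remote_speed
    elif prev_path_length is not None and dt:
        closing_rate = (prev_path_length - path_length) / dt
    else:
        return None
    return path_length / closing_rate if closing_rate > 0 else None
```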
The expected route may be used to traverse the curved path at 6400. For example, the expected route may indicate a possible collision at a future point in time with no changes in trajectory, speed and/or acceleration. In this case, traversing the curved path could involve issuing alarms to the driver of the host vehicle or taking corrective actions such as issuing braking instructions to a braking system of the host vehicle. Other corrective actions may be taken while traversing the curved path using the expected route.
The method as described by example in
Vehicles may be traveling along a curved path within a vehicle transportation network. The teachings herein describe a relatively fast and simple way to take the radius of curvature into account when determining a distance between vehicles. For example, for a path with radius of curvature R of 200 meters, such as a freeway on-ramp, the difference between a calculated true path length L and a straight-line distance D between the host vehicle and the remote vehicle is four meters (157 meters vs. 153 meters) when the facing angle α′3 is 45 degrees. By eliminating this degree of difference, false alarms may be minimized, particularly where the radius of curvature is small.
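This example can be checked numerically, comparing the arc length L = R·α′3 against the straight-line chord D = 2R·sin(α′3/2):

```python
import math

# Worked check of the freeway on-ramp example above:
# radius R = 200 m, facing angle 45 degrees (pi/4 radians).
R = 200.0
theta = math.pi / 4
arc = R * theta                      # true path length along the curve
chord = 2 * R * math.sin(theta / 2)  # straight-line (geodesic) distance

print(round(arc))    # -> 157
print(round(chord))  # -> 153, i.e. about a 4 m difference
```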
The above aspects, examples and implementations have been described in order to allow easy understanding of the disclosure and are not limiting. On the contrary, the disclosure covers various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structure as is permitted under the law.
Publication: US 20170018186 A1, Jan. 2017, United States.