This disclosure generally relates to systems and methods of operating automated vehicles.
Partially and fully-automated or autonomous vehicles have been proposed. However, the systems and methods necessary to control the vehicle can be improved.
In accordance with one embodiment, an autonomous guidance system that operates a vehicle in an autonomous mode is provided. The system includes a camera module, a radar module, and a controller. The camera module outputs an image signal indicative of an image of an object in an area about a vehicle. The radar module outputs a reflection signal indicative of a reflected signal reflected by the object. The controller determines an object-location of the object on a map of the area based on a vehicle-location of the vehicle on the map, the image signal, and the reflection signal. The controller classifies the object as small when a magnitude of the reflection signal associated with the object is less than a signal-threshold.
In accordance with one embodiment, an autonomous guidance system that operates a vehicle in an autonomous mode is provided. The system includes a camera module, a radar module, and a controller. The camera module outputs an image signal indicative of an image of an object in an area about a vehicle. The radar module outputs a reflection signal indicative of a reflected signal reflected by the object. The controller generates a map of the area based on a vehicle-location of the vehicle, the image signal, and the reflection signal, wherein the controller classifies the object as small when a magnitude of the reflection signal associated with the object is less than a signal-threshold.
In accordance with an embodiment of the invention, a method of operating an autonomous vehicle is provided. The method includes the step of receiving a message from roadside infrastructure via an electronic receiver and the step of providing, by a computer system in communication with the electronic receiver, instructions based on the message to automatically implement countermeasure behavior by a vehicle system.
According to a first example, the roadside infrastructure is a traffic signaling device and data contained in the message includes a device location, a signal phase, and a phase timing. The vehicle system is a braking system. The step of providing instructions includes the sub-steps of:
According to a second example, the roadside infrastructure is a construction zone warning device and data contained in the message includes a zone location, a zone direction, a zone length, a zone speed limit, and/or lane closures. The vehicle system may be a braking system, a steering system, and/or a powertrain system. The step of providing instructions may include the sub-steps of:
According to a third example, the roadside infrastructure is a stop sign and data contained in the message includes a sign location and a stop direction. The vehicle system is a braking system. The step of providing instructions may include the sub-steps of:
According to a fourth example, the roadside infrastructure is a railroad crossing warning device and data contained in the message includes device location and warning state. The vehicle system is a braking system. The step of providing instructions includes the sub-steps of:
According to a fifth example, the roadside infrastructure is an animal crossing zone warning device and data contained in the message includes zone location, zone direction, and zone length. The vehicle system is a forward looking sensor. The step of providing instructions includes the sub-step of providing, by the computer system, instructions to the forward looking sensor to widen a field of view so as to include at least both road shoulders within the field of view.
According to a sixth example, the roadside infrastructure is a pedestrian crossing warning device and data contained in the message may be crossing location and/or warning state. The vehicle system may be a braking system and/or a forward looking sensor. The step of providing instructions may include the sub-steps of:
According to a seventh example, the roadside infrastructure is a school crossing warning device and data contained in the message includes a device location and a warning state. The vehicle system is a braking system. The step of providing instructions includes the sub-steps of:
According to an eighth example, the roadside infrastructure is a lane direction indicating device and data contained in the message is a lane location and a lane direction. The vehicle system is a roadway mapping system. The step of providing instructions includes the sub-step of providing, by the computer system, instructions to the roadway mapping system to dynamically update the roadway mapping system's lane direction information.
According to a ninth example, the roadside infrastructure is a speed limiting device and data contained in the message includes a speed zone location, a speed zone direction, a speed zone length, and a zone speed limit. The vehicle system is a powertrain system. The step of providing instructions includes the sub-steps of:
According to a tenth example, the roadside infrastructure is a no passing zone device and data contained in the message includes a no passing zone location, a no passing zone direction, and a no passing zone length. The vehicle system includes a powertrain system, a forward looking sensor and/or a braking system. The step of providing instructions may include the sub-steps of:
In accordance with another embodiment, another method of operating an autonomous vehicle is provided. The method comprises the step of receiving a message from another vehicle via an electronic receiver, and the step of providing, by a computer system in communication with said electronic receiver, instructions based on the message to automatically implement countermeasure behavior by a vehicle system.
According to a first example, the other vehicle is a school bus and data contained in the message includes school bus location and stop signal status. The vehicle system is a braking system. The step of providing instructions includes the sub-steps of:
According to a second example, the other vehicle is a maintenance vehicle and data contained in the message includes a maintenance vehicle location and a safe following distance. The vehicle system is a powertrain system and/or a braking system. The step of providing instructions may include the sub-steps of:
According to a third example, the other vehicle is an emergency vehicle and data contained in the message may include information regarding an emergency vehicle location, an emergency vehicle speed, and a warning light status. The vehicle system is a braking system, a steering system, a forward looking sensor, and/or a powertrain system. The step of providing instructions may include the sub-steps of:
In accordance with an embodiment of the invention, a method of automatically operating a vehicle is provided. The method includes the steps of:
In the case wherein the vehicle system is a braking system, the method may further include the steps of:
In the case wherein the vehicle system is a steering system, the method may include the steps of:
In the case wherein the vehicle system is a powertrain system, the method may further include the steps of:
In the case wherein the vehicle system is a powertrain system and the cellular telephone is carried by another vehicle, the method may include the steps of:
The cellular telephone may be carried by a pedestrian or may be carried by another vehicle.
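By way of illustration only, the infrastructure-to-vehicle and vehicle-to-vehicle methods above share a receive-message, dispatch-countermeasure structure. The following is a minimal sketch in Python; the message fields and countermeasure names are assumptions introduced here for illustration and are not part of the disclosed message formats.

```python
# Minimal sketch of dispatching received messages to countermeasure
# behaviors; message types and countermeasure names are illustrative only.

def countermeasures(message):
    """Map a received message to countermeasure instructions."""
    kind = message["type"]
    if kind == "traffic_signal":
        # Red phase: instruct the braking system to stop at the device.
        if message["phase"] == "red":
            return ["brake_to_stop_at", message["location"]]
    elif kind == "construction_zone":
        # Slow to the zone speed limit via the powertrain system.
        return ["limit_speed", message["zone_speed_limit"]]
    elif kind == "animal_crossing":
        # Widen the forward looking sensor's field of view to cover
        # at least both road shoulders.
        return ["widen_field_of_view", "both_shoulders"]
    return ["no_action"]

print(countermeasures({"type": "construction_zone", "zone_speed_limit": 45}))
```

A production system would of course dispatch on a standardized message set rather than the hypothetical dictionary keys shown here.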
The present disclosure provides an LED V2V Communication System for an on-road vehicle. The LED V2V Communication System includes LED arrays for transmitting encoded data; optical receivers for receiving encoded data; a central-processing-unit (CPU) for processing and managing data flow between the LED arrays and optical receivers; and a control bus routing communication between the CPU and the vehicle's systems such as a satellite-based positioning system, driver infotainment system, and safety systems. The safety systems may include audio or visual driver alerts, active braking, seat belt pre-tensioners, air bags, and the like.
The present disclosure also provides a method using pulsed LEDs for vehicle-to-vehicle communication. The method includes the steps of receiving input information from an occupant or vehicle system of a transmitting vehicle; generating output information based on the input information of the transmitting vehicle; generating a digital signal based on the output information of the transmitting vehicle; and transmitting the digital signal in the form of luminous digital pulses to a receiving vehicle. The receiving vehicle then receives the digital signal in the form of luminous digital pulses; generates a received message based on the received digital signal; generates an action signal based on the received information; and relays the action signal to the occupant or vehicle system of the receiving vehicle. The step of transmitting the digital signal to the receiving vehicle includes generating luminous digital pulses in the infrared or ultraviolet frequencies invisible to the human eye.
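The encode-and-decode steps above can be sketched as follows; the 8-bit-per-character framing is an assumption chosen for illustration, not the disclosed pulse protocol or timing.

```python
# Sketch of encoding a text message as luminous digital pulses (1 = LED
# on, 0 = LED off) and decoding it on the receiving vehicle. The framing
# (8 ASCII bits per character) is illustrative only.

def to_pulses(message):
    """Encode each character as 8 on/off pulses (its ASCII bits)."""
    bits = []
    for ch in message:
        bits.extend(int(b) for b in format(ord(ch), "08b"))
    return bits

def from_pulses(bits):
    """Decode 8-bit groups of pulses back into characters."""
    chars = []
    for i in range(0, len(bits), 8):
        byte = bits[i:i + 8]
        chars.append(chr(int("".join(str(b) for b in byte), 2)))
    return "".join(chars)

pulses = to_pulses("BRAKE")
print(from_pulses(pulses))  # the received message round-trips
```

A real implementation would add synchronization, error detection, and the infrared or ultraviolet modulation described above.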
One aspect of the disclosure involves a method comprising controlling by one or more computing devices an autonomous vehicle in accordance with a first control strategy; developing by one or more computing devices said first control strategy based at least in part on data contained on a first map; receiving by one or more computing devices sensor data from said vehicle corresponding to a first set of data contained on said first map; comparing said sensor data to said first set of data on said first map on a periodic basis; developing a first correlation rate between said sensor data and said first set of data on said first map; and adopting a second control strategy when said correlation rate drops below a predetermined value.
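The correlation check described above can be sketched as follows; the matching tolerance, the correlation threshold, and the scalar representation of sensor and map data are assumptions introduced here for illustration.

```python
# Sketch of the map-correlation check: periodically compare sensor
# readings against the corresponding map data and adopt a second control
# strategy when the correlation rate drops below a predetermined value.
# The tolerance and threshold values are illustrative assumptions.

def correlation_rate(sensor_data, map_data, tolerance=0.5):
    """Fraction of sensor readings that match the map within tolerance."""
    matches = sum(
        1 for s, m in zip(sensor_data, map_data) if abs(s - m) <= tolerance
    )
    return matches / len(sensor_data)

def select_strategy(sensor_data, map_data, threshold=0.8):
    rate = correlation_rate(sensor_data, map_data)
    return "first_strategy" if rate >= threshold else "second_strategy"

# Sensor readings largely agree with the map, so the first strategy holds:
print(select_strategy([10.0, 20.1, 30.0], [10.0, 20.0, 30.2]))
```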
Another aspect of the disclosure involves a method comprising controlling by one or more computing devices an autonomous vehicle in accordance with a first control strategy; receiving by one or more computing devices map data corresponding to a route of said vehicle; developing by one or more computing devices a lane selection strategy; receiving by one or more computing devices sensor data from said vehicle corresponding to objects in the vicinity of said vehicle; and changing said lane selection strategy based on changes to at least one of said sensor data and said map data.
Another aspect of the disclosure involves a method comprising controlling by one or more computing devices an autonomous vehicle in accordance with a first control strategy; receiving by one or more computing devices sensor data from said vehicle corresponding to moving objects in the vicinity of said vehicle; receiving by one or more computing devices road condition data; determining by one or more computing devices undesirable locations for said vehicle relative to said moving objects; and wherein said step of determining undesirable locations for said vehicle is based at least in part on said road condition data.
Another aspect of the disclosure involves a method comprising controlling by one or more computing devices an autonomous vehicle in accordance with a first control strategy; developing by one or more computing devices said first control strategy based at least in part on data contained on a first map, wherein said first map is simultaneously accessible by more than one vehicle; receiving by one or more computing devices sensor data from said vehicle corresponding to objects in the vicinity of said vehicle; and updating by said one or more computing devices said first map to include information about at least one of said objects.
Another aspect of the disclosure involves a method comprising controlling by one or more computing devices an autonomous vehicle; activating a visible signal on said autonomous vehicle when said vehicle is being controlled by said one or more computing devices; and keeping said visible signal activated during the entire time that said vehicle is being controlled by said one or more computing devices.
Another aspect of the disclosure involves a method comprising controlling by one or more computing devices an autonomous vehicle in accordance with a first control strategy; receiving by one or more computing devices sensor data corresponding to a first location; detecting a first moving object at said first location; changing said first control strategy based on said sensor data relating to said first moving object; and wherein said sensor data is obtained from a first sensor that is not a component of said autonomous vehicle.
Another aspect of the disclosure involves a method comprising controlling by one or more computing devices an autonomous vehicle in accordance with a first control strategy; approaching an intersection with said vehicle; receiving by one or more computing devices sensor data from said autonomous vehicle corresponding to objects in the vicinity of said vehicle; determining whether another vehicle is at said intersection based on said sensor data; determining by said one or more computing devices whether said other vehicle or said autonomous vehicle has priority to proceed through said intersection; and activating a yield signal to indicate to said other vehicle that said autonomous vehicle is yielding said intersection.
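The intersection-priority decision above can be sketched as follows; the arrival-time and yield-to-the-right tie-breaking rules are assumptions introduced for illustration, not the disclosed priority logic.

```python
# Sketch of the intersection-priority decision: determine whether the
# other vehicle or the autonomous vehicle has priority, and activate the
# yield signal when yielding. The tie-breaking rules are illustrative.

def decide_at_intersection(own_arrival, other_arrival, other_on_right):
    """Return (action, yield_signal_active) for the autonomous vehicle."""
    if other_arrival is None:
        return "proceed", False      # no other vehicle at the intersection
    if own_arrival < other_arrival:
        return "proceed", False      # arrived first: priority to proceed
    if own_arrival == other_arrival and not other_on_right:
        return "proceed", False      # simultaneous arrival: yield to right
    return "yield", True             # yielding: activate the yield signal

print(decide_at_intersection(own_arrival=2.0, other_arrival=1.0,
                             other_on_right=False))
```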
The present disclosure also provides an autonomously driven car in which the sensors used to provide the 360 degrees of sensing do not extend beyond the pre-existing, conventional outer surface or skin of the vehicle.
The present disclosure provides an integrated active cruise control and lane keeping assist system. The potential exists for a car attempting to pass a leading car to fail in that pass attempt and be returned to the lane in which the leading car travels, but too close to the leading car, or at least closer than the predetermined threshold that an active cruise control system would normally maintain.
In the preferred embodiment disclosed, the active cruise control system includes an additional and alternative deceleration scheme. If the vehicle fails in an attempt to pass a leading-vehicle, and makes a lane reentry behind the leading-vehicle that puts it at a following-distance less than the predetermined threshold normally maintained by the cruise control system, a more aggressive deceleration of the vehicle is imposed, such as by harder and longer braking, to return the vehicle quickly to the predetermined threshold-distance.
In another preferred embodiment a method of operating an adaptive cruise control system for use in a vehicle configured to actively maintain a following-distance behind a leading-vehicle at no less than a predetermined threshold-distance is provided. The method includes determining when a following-distance of a trailing-vehicle behind a leading-vehicle is less than a threshold-distance. The method also includes maintaining the following-distance when the following-distance is not less than the threshold-distance. The method also includes determining when the following-distance is less than a minimum-distance that is less than the threshold-distance. The method also includes decelerating the trailing-vehicle at a normal-deceleration-rate when the following-distance is less than the threshold-distance and not less than the minimum-distance. The method also includes decelerating the trailing-vehicle at an aggressive-deceleration-rate when the following-distance is less than the minimum-distance.
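The two-tier deceleration scheme above can be sketched as follows; the particular distances and deceleration rates are assumptions chosen for illustration, not values from the disclosure.

```python
# Sketch of the two-tier deceleration scheme: decelerate at a normal rate
# when the following-distance falls below the threshold-distance, and at
# an aggressive rate when it falls below the smaller minimum-distance
# (as after a failed pass attempt). All numeric values are illustrative.

def deceleration_command(following_distance, threshold_distance=40.0,
                         minimum_distance=20.0):
    """Return the deceleration rate (m/s^2) for the trailing-vehicle."""
    if following_distance < minimum_distance:
        return 6.0   # aggressive-deceleration-rate: harder, longer braking
    if following_distance < threshold_distance:
        return 2.0   # normal-deceleration-rate maintained by cruise control
    return 0.0       # at or beyond the threshold: maintain the distance

print(deceleration_command(15.0))  # closer than minimum: aggressive braking
```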
Further features and advantages will appear more clearly on a reading of the following detailed description of the preferred embodiment, which is given by way of non-limiting example only and with reference to the accompanying drawings.
The present invention will now be described, by way of example with reference to the accompanying drawings, in which:
Described herein are various systems, methods, and apparatus for controlling or operating an automated vehicle. While the teachings presented herein are generally directed to fully-automated or autonomous vehicles where the operator of the vehicle does little more than designate a destination, it is contemplated that the teachings presented herein are applicable to partially-automated vehicles or vehicles that are generally manually operated with some incremental amount of automation that merely assists the operator with driving.
Autonomous Guidance System
Autonomous guidance systems that operate vehicles in an autonomous mode have been proposed. However, many of these systems rely on detectable markers in the roadway so the system can determine where to steer the vehicle. Vision based systems that do not rely on detectable markers but rather rely on image processing to guide the vehicle have also been proposed. However, image-based systems require critical alignment of the camera in order to reliably determine distances to objects.
The vehicle 10A is equipped with a sensor assembly, hereafter the assembly 20A, which is shown in this example located in an interior compartment of the vehicle 10A behind a window 12A of the vehicle 10A. While an automobile is illustrated, it will be evident that the assembly 20A may also be suitable for use on other vehicles such as heavy duty on-road vehicles like semi-tractor-trailers, and off-road vehicles such as construction equipment. In this non-limiting example, the assembly 20A is located behind the windshield and forward of a rearview mirror 14A so is well suited to detect an object 16A in an area 18A forward of the vehicle 10A. Alternatively, the assembly 20A may be positioned to ‘look’ through a side or rear window of the vehicle 10A to observe other areas about the vehicle 10A, or the assembly may be integrated into a portion of the vehicle body in an unobtrusive manner. It is emphasized that the assembly 20A is advantageously configured to be mounted on the vehicle 10A in such a way that it is not readily noticed. That is, the assembly 20A is more aesthetically pleasing than previously proposed autonomous systems that mount a sensor unit in a housing that protrudes above the roofline of the vehicle on which it is mounted. As will become apparent in the description that follows, the assembly 20A includes features particularly directed to overcoming problems with detecting small objects.
The controller 120A includes a radar module 30A for transmitting radar signals through the window 12A to detect an object 16A through the window 12A and in an area 18A about the vehicle 10A. The radar module 30A outputs a reflection signal 112A indicative of a reflected signal 114A reflected by the object 16A. In the example, the area 18A is shown as generally forward of the vehicle 10A and includes a radar field of view defined by dashed lines 150A. The radar module 30A receives reflected signal 114A reflected by the object 16A when the object 16A is located in the radar field of view.
The controller 120A also includes a camera module 22A for capturing images through the window 12A in a camera field of view defined by dashed line 160A. The camera module 22A outputs an image signal 116A indicative of an image of the object 16A in the area about the vehicle. The controller 120A is generally configured to detect one or more objects relative to the vehicle 10A. Additionally, the controller 120A may have further capabilities to estimate the parameters of the detected object(s) including, for example, the object position and velocity vectors, target size, and classification, e.g., vehicle versus pedestrian. In addition to autonomous driving, the assembly 20A may be employed onboard the vehicle 10A for automotive safety applications including adaptive cruise control (ACC), forward collision warning (FCW), and collision mitigation or avoidance via autonomous braking and lane departure warning (LDW).
The controller 120A or the assembly 20A advantageously integrates both radar module 30A and the camera module 22A into a single housing. The integration of the camera module 22A and the radar module 30A into a common single assembly (the assembly 20A) advantageously provides a reduction in sensor costs. Additionally, the camera module 22A and radar module 30A integration advantageously employs common or shared electronics and signal processing as shown in
The assembly 20A may advantageously employ a housing 100A comprising a plurality of walls as shown in
The controller 120A may also incorporate or combine the radar module 30A, the camera module 22A, the radar-camera processing unit 50A, and a vehicle control unit 72A. The radar module 30A and camera module 22A both communicate with the radar-camera processing unit 50A to process the received radar signals and camera generated images so that the sensed radar and camera signals are useful for various radar and vision functions. The vehicle control unit 72A may be integrated within the radar-camera processing unit or may be separate therefrom. The vehicle control unit 72A may execute any of a number of known applications that utilize the processed radar and camera signals including, but not limited to autonomous vehicle control, ACC, FCW, and LDW.
The camera module 22A is shown in
The radar module 30A may include a transceiver 32A coupled to an antenna 48A. The transceiver 32A and antenna 48A operate to transmit radar signals within the desired coverage zone or beam defined by the dashed lines 150A and to receive reflected radar signals reflected from objects within the coverage zone defined by the dashed lines 150A. The radar module 30A may transmit a single fan-shaped radar beam and form multiple receive beams by receive digital beam-forming, according to one embodiment. The antenna 48A may include a vertical polarization antenna for providing vertical polarization of the radar signal which provides good propagation over incidence (rake) angles of interest for the windshield, such as a seventy degree (70°) incidence angle. Alternately, a horizontal polarization antenna may be employed; however, the horizontal polarization is more sensitive to the RF properties and parameters of the windshield for high incidence angle.
The radar module 30A may also include a switch driver 34A coupled to the transceiver 32A and further coupled to a programmable logic device (PLD 36A). The programmable logic device (PLD) 36A controls the switch driver in a manner synchronous with the analog-to-digital converter (ADC 38A) which, in turn, samples and digitizes signals received from the transceiver 32A. The radar module 30A also includes a waveform generator 40A and a linearizer 42A. The radar module 30A may generate a fan-shaped output which may be achieved using electronic beam forming techniques. One example of a suitable radar sensor operates at a frequency of 76.5 gigahertz. It should be appreciated that the automotive radar may operate in one of several other available frequency bands, including 24 GHz ISM, 24 GHz UWB, 76.5 GHz, and 79 GHz.
The radar-camera processing unit 50A is shown employing a video microcontroller 52A, which includes processing circuitry, such as a microprocessor. The video microcontroller 52A communicates with memory 54A which may include SDRAM and flash memory, amongst other available memory devices. A device 56A characterized as a debugging USB2 device is also shown communicating with the video microcontroller 52A. The video microcontroller 52A communicates data and control with each of the radar module 30A and camera module 22A. This may include the video microcontroller 52A controlling the radar module 30A and camera module 22A, and receiving images from the camera module 22A and digitized samples of the received reflected radar signals from the radar module 30A. The video microcontroller 52A may process the received radar signals and camera images and provide various radar and vision functions. For example, the radar functions executed by the video microcontroller 52A may include radar detection 60A, tracking 62A, and threat assessment 64A, each of which may be implemented via a routine, or algorithm. Similarly, the video microcontroller 52A may implement vision functions including a lane tracking function 66A, vehicle detection 68A, and pedestrian detection 70A, each of which may be implemented via routines or algorithms. It should be appreciated that the video microcontroller 52A may perform various functions related to either radar or vision utilizing one or both of the outputs of the radar module 30A and camera module 22A.
The vehicle control unit 72A is shown communicating with the video microcontroller 52A by way of a controller area network (CAN) bus and a vision output line. The vehicle control unit 72A includes an application microcontroller 74A coupled to memory 76A which may include electronically erasable programmable read-only memory (EEPROM), amongst other memory devices. The memory 76A may also be used to store a map 122A of roadways that the vehicle 10A may travel. As will be explained in more detail below, the map 122A may be created and/or modified using information obtained from the radar module 30A and/or the camera module 22A so that the autonomous control of the vehicle 10A is improved. The vehicle control unit 72A is also shown including an RTC watchdog 78A, a temperature monitor 80A, an input/output interface for diagnostics 82A, and a CAN/HW interface 84A. The vehicle control unit 72A includes a twelve volt (12V) power supply 86A which may be a connection to the vehicle battery. Further, the vehicle control unit 72A includes a private CAN interface 88A and a vehicle CAN interface 90A, both shown connected to an electronic control unit (ECU) that is connected to an ECU connector 92A. Those in the art will recognize that vehicle speed, braking, steering, and other functions necessary for autonomous operation of the vehicle 10A can be performed by way of the ECU connector 92A.
The vehicle control unit 72A may be implemented as a separate unit integrated within the assembly 20A or may be located remote from the assembly 20A and may be implemented with other vehicle control functions, such as a vehicle engine control unit. It should further be appreciated that functions performed by the vehicle control unit 72A may be performed by the video microcontroller 52A, without departing from the teachings of the present invention.
The camera module 22A generally captures camera images of an area in front of the vehicle 10A. The radar module 30A may emit a fan-shaped radar beam so that objects generally in front of the vehicle reflect the emitted radar back to the sensor. The radar-camera processing unit 50A processes the radar and vision data collected by the corresponding camera module 22A and radar module 30A and may process the information in a number of ways. One example of processing of radar and camera information is disclosed in U.S. Patent Application Publication No. 2007/0055446, which is assigned to the assignee of the present application, the disclosure of which is hereby incorporated herein by reference.
Referring to
The assembly 20A has the camera module 22A generally shown mounted near an upper end and the radar module 30A mounted below. However, the camera module 22A and radar module 30A may be located at other locations relative to each other. The radar module 30A may include an antenna 48A that is vertically oriented and mounted generally at the forward side of the radar module 30A for providing a vertically polarized signal. The antenna 48A may be a planar antenna such as a patch antenna. A glare shield 28A is further provided, shown as a lower wall of the housing 100A generally below the camera module 22A. The glare shield 28A generally shields light reflection or glare from adversely affecting the light images received by the camera module 22A. This includes preventing glare from reflecting off of the vehicle dash or other components within the vehicle and into the imaging view of the camera module 22A. Additionally or alternately, an electromagnetic interference (EMI) shield may be located in front of or below the radar module 30A. The EMI shield may generally be configured to constrain the radar signals to a generally forward direction passing through the window 12A, and to prevent or minimize radar signals that may otherwise pass into the vehicle 10A. It should be appreciated that the camera module 22A and radar module 30A may be mounted onto a common circuit board which, in turn, communicates with the radar-camera processing unit 50A, all housed together within the housing 100A.
Described above is an autonomous guidance system (the system 110A) that operates a vehicle 10A in an autonomous mode. The system 110A includes a camera module 22A and a radar module 30A. The camera module 22A outputs an image signal 116A indicative of an image of an object 16A in the area 18A about a vehicle 10A. The radar module 30A outputs a reflection signal 112A indicative of a reflected signal 114A reflected by the object 16A. The controller 120A may be used to generate from scratch and store a map 122A of roadways traveled by the vehicle 10A, and/or update a previously stored/generated version of the map 122A. The controller 120A may include a global-positioning-unit, hereafter the GPS 124A to provide a rough estimate of a vehicle-location 126A of the vehicle 10A relative to selected satellites (not shown).
As will become clear in the description that follows, the system 110A advantageously is able to accurately determine an object-location 128A of the object 16A relative to the vehicle 10A so that small objects that are not normally included in typical GPS based maps can be avoided by the vehicle when being autonomously operated. By way of example and not limitation, the object 16A illustrated in
In one embodiment, the controller 120A is configured to generate the map 122A of the area 18A based on the vehicle-location 126A of the vehicle 10A. That is, the controller 120A is not preloaded with a predetermined map such as those provided with a typical commercially available navigation assistance device. Instead, the controller 120A builds or generates the map 122A from scratch based on the image signal 116A, the reflection signal 112A, and global position coordinates provided by the GPS 124A. For example, the width of the roadways traveled by the vehicle 10A may be determined from the image signal 116A, and various objects such as signs, bridges, buildings, and the like may be recorded or classified by a combination of the image signal 116A and the reflection signal 112A.
Typically, vehicle radar systems ignore small objects detected by the radar module 30A. By way of example and not limitation, small objects include curbs, lamp-posts, mail-boxes, and the like. For general navigation systems, these small objects are typically not relevant to determining when the next turn should be made by an operator of the vehicle. However, for an autonomous guidance system like the system 110A described herein, prior knowledge of small targets can help the system keep the vehicle 10A centered in a roadway, and can indicate an unexpected small object as a potential threat when one is detected by the system 110A. Accordingly, the controller 120A may be configured to classify the object 16A as small when a magnitude of the reflection signal 112A associated with the object 16A is less than a signal-threshold. The system may also be configured to ignore an object classified as small if the object is well away from the roadway, more than five meters (5 m) for example.
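The classification rule described above can be sketched as follows; the signal-threshold value and the scalar inputs are assumptions introduced for illustration, not values from the disclosure.

```python
# Sketch of the small-object classification rule: an object is classified
# as small when the magnitude of its reflection signal is below the
# signal-threshold, and ignored when it lies well away from the roadway
# (more than 5 m in this example). Threshold values are illustrative.

def classify_object(reflection_magnitude, distance_from_road_m,
                    signal_threshold=1.0):
    if distance_from_road_m > 5.0:
        return "ignored"        # well away from the roadway
    if reflection_magnitude < signal_threshold:
        return "small"          # e.g. a curb, lamp-post, or mail-box
    return "large"

# A weak reflection close to the roadway is classified as small:
print(classify_object(reflection_magnitude=0.3, distance_from_road_m=1.0))
```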
In an alternative embodiment, the controller 120A may be preprogrammed or preloaded with a predetermined map such as those provided with a typical commercially available navigation assistance device. However, as those in the art will recognize, such maps typically do not include information about all objects proximate to a roadway, for example, curbs, lamp-posts, mail-boxes, and the like. The controller 120A may be configured or programmed to determine the object-location 128A of the object 16A on the map 122A of the area 18A based on the vehicle-location 126A of the vehicle 10A on the map 122A, the image signal 116A, and the reflection signal 112A. That is, the controller 120A may add details to the preprogrammed map in order to identify various objects to assist the system 110A in avoiding collisions with various objects and keeping the vehicle 10A centered in the lane or roadway on which it is traveling. As mentioned before, prior radar-based systems may ignore small objects. However, in this example, the controller 120A classifies the object as small when the magnitude of the reflection signal 112A associated with the object 16A is less than a signal-threshold. Accordingly, small objects such as curbs, lamp-posts, mail-boxes, and the like can be remembered by the system 110A to help the system 110A safely navigate the vehicle 10A.
It is contemplated that the accumulation of small objects in the map 122A will help the system 110A more accurately navigate a roadway that is traveled more than once. That is, the more frequently a roadway is traveled, the more detailed the map 122A will become as small objects that were previously ignored by the radar module 30A are now noted and classified as small. It is recognized that some objects are so small that it may be difficult to distinguish an actual small target from noise. As such, the controller may be configured to keep track of each time a small object is detected, but not add that small object to the map 122A until the small object has been detected multiple times. In other words, the controller classifies the object 16A as verified if the object 16A is classified as small and the object 16A is detected on a plurality of occasions that the vehicle 10A passes through the area 18A. It follows that the controller 120A adds the object 16A to the map 122A after the object 16A, having been classified as small, is classified as verified.
Instead of merely counting the number of times an object that is classified as small is detected, the controller 120A may be configured or programmed to determine a size of the object 16A based on the image signal 116A and the reflection signal 112A, and then classify the object 16A as verified if the object is classified as small and a confidence level assigned to the object 16A is greater than a confidence-threshold, where the confidence-threshold is based on the magnitude of the reflection signal 112A and a number of occasions that the object is detected. For example, if the magnitude of the reflection signal 112A is only a few percent below the signal-threshold used to determine that an object is small, then the object 16A may be classified as verified after only two or three encounters. However, if the magnitude of the reflection signal 112A is more than fifty percent below the signal-threshold used to determine that an object is small, then the object 16A may be classified as verified only after many encounters, eight encounters for example. As before, the controller 120A then adds the object 16A to the map 122A after the object 16A is classified as verified.
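A minimal sketch of this confidence-based verification follows, assuming an illustrative mapping from the reflection-magnitude deficit to the number of required encounters; the specific tiers below are assumptions consistent with the examples given.

```python
SIGNAL_THRESHOLD = 1.0   # assumed "small object" threshold

def required_encounters(reflection_magnitude):
    """More encounters required the weaker the reflection (assumed tiers)."""
    deficit = (SIGNAL_THRESHOLD - reflection_magnitude) / SIGNAL_THRESHOLD
    if deficit <= 0.05:      # only a few percent below the threshold
        return 3
    if deficit <= 0.50:      # moderately below the threshold
        return 5
    return 8                 # more than fifty percent below: many encounters

def is_verified(reflection_magnitude, encounter_count):
    """An object must be small AND seen often enough to be verified."""
    return (reflection_magnitude < SIGNAL_THRESHOLD
            and encounter_count >= required_encounters(reflection_magnitude))
```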
Other objects may be classified based on when they appear. For example, if the vehicle autonomously travels the same roadway every weekday to, for example, convey a passenger to work, objects such as garbage cans may appear adjacent to the roadway on one particular day, Wednesday for example. The controller 120A may be configured to log the date, day of the week, and/or time of day that an object is encountered, and then look for a pattern so the presence of that object can be anticipated in the future and the system 110A can direct the vehicle 10A to give the garbage can a wide berth.
Accordingly, an autonomous guidance system (the system 110A) and a controller 120A for the system 110A are provided. The controller 120A learns the locations of small objects that are not normally part of navigation maps but are a concern when the vehicle 10A is being operated in an autonomous mode. If a weather condition such as snow obscures or prevents the detection of certain objects by the camera module 22A and/or the radar module 30A, the system 110A can still direct the vehicle 10A to avoid the object 16A because the object-location 128A relative to other un-obscured objects is present in the map 122A.
Method of Automatically Controlling an Autonomous Vehicle Based on Electronic Messages from Roadside Infrastructure or Other Vehicles
Some vehicles are configured to operate automatically so that the vehicle navigates through an environment with little or no input from a driver. Such vehicles are often referred to as “autonomous vehicles”. These autonomous vehicles typically include one or more sensors that are configured to sense information about the environment. The autonomous vehicle may use the sensed information to navigate through the environment. For example, if the sensors sense that the autonomous vehicle is approaching an intersection with a traffic signal, the sensors must determine the state of the traffic signal to determine whether the autonomous vehicle needs to stop at the intersection. The traffic signal may be obscured to the sensor by weather conditions, roadside foliage, or other vehicles between the sensor and the traffic signal. Therefore, a more reliable method of determining the status of roadside infrastructure is desired.
Because portions of the driving environment may be obscured to environmental sensors, such as forward looking sensors, it is desirable to supplement sensor inputs. Presented herein is a method of operating an automatically controlled or “autonomous” vehicle wherein the vehicle receives electronic messages from various elements of the transportation infrastructure, such as traffic signals, signage, or other vehicles. The infrastructure contains wireless transmitters that broadcast information about the state of each element of the infrastructure, such as location and operational state. The information may be broadcast by a separate transmitter associated with each element of infrastructure or it may be broadcast by a central transmitter. The infrastructure information is received by the autonomous vehicle and a computer system on-board the autonomous vehicle then determines whether countermeasures are required by the autonomous vehicle and sends instructions to the relevant vehicle system, e.g. the braking system, to perform the appropriate actions.
The environment in which the autonomous vehicle 10B operates may also include other vehicles with which the autonomous vehicle 10B may interact. The illustrated examples of other vehicles include:
The autonomous vehicle 10B includes a computer system connected to a wireless receiver that is configured to receive the electronic messages from the transmitters associated with the infrastructure and/or other vehicles. The transmitters and receivers may be configured to communicate using any of a number of protocols, including Dedicated Short Range Communication (DSRC) or WIFI (IEEE 802.11x). The transmitters and receivers may alternatively be transceivers allowing two-way communication between the infrastructure and/or other vehicles and the autonomous vehicle 10B. The computer system is interconnected to various sensors and actuators responsible for controlling the various systems in the autonomous vehicle 10B, such as the braking system, the powertrain system, and the steering system. The computer system may be a central processing unit or may be several distributed processors communicating over a communication bus, such as a Controller Area Network (CAN) bus.
The autonomous vehicle 10B further includes a locating device configured to determine both the geographical location of the autonomous vehicle 10B as well as the vehicle speed. An example of such a device is a Global Positioning System (GPS) receiver.
The autonomous vehicle 10B may also include a forward looking sensor 40B configured to identify objects in the forward path of the autonomous vehicle 10B. Such a sensor 40B may be a visible light camera, an infrared camera, a radio detection and ranging (RADAR) transceiver, and/or a laser imaging, detecting and ranging (LIDAR) transceiver.
The method 100B further includes STEP 104B, PROVIDE, BY A COMPUTER SYSTEM IN COMMUNICATION WITH THE ELECTRONIC RECEIVER, INSTRUCTIONS BASED ON THE MESSAGE TO AUTOMATICALLY IMPLEMENT COUNTERMEASURE BEHAVIOR BY A VEHICLE SYSTEM, that includes providing instructions to a vehicle system to automatically implement countermeasure behavior. The instructions are sent to the vehicle system by a computer system that is in communication with the electronic receiver and the instructions are based on the information contained within a message received from the roadside infrastructure by the receiver.
The method 200B further includes STEP 204B, PROVIDE, BY A COMPUTER SYSTEM IN COMMUNICATION WITH THE ELECTRONIC RECEIVER, INSTRUCTIONS BASED ON THE MESSAGE TO AUTOMATICALLY IMPLEMENT COUNTERMEASURE BEHAVIOR BY A VEHICLE SYSTEM, that includes providing instructions to a vehicle system to automatically implement countermeasure behavior. The instructions are sent to the vehicle system by a computer system that is in communication with the electronic receiver and the instructions are based on the information contained within a message received from the other vehicle by the receiver.
The embodiments described herein are described in terms of an autonomous vehicle 10B. However, elements of the embodiments may also be applied to warning systems that alert the driver to manually take these identified countermeasures.
Accordingly a method 100B of automatically operating an autonomous vehicle 10B is provided. The method 100B provides the benefit of allowing automatic control of the autonomous vehicle 10B in instances when the forward looking sensor 40B is obscured.
Method of Automatically Controlling an Autonomous Vehicle Based on Cellular Telephone Location Information
Some vehicles are configured to operate automatically so that the vehicle navigates through an environment with little or no input from a driver. Such vehicles are often referred to as “autonomous vehicles”. These autonomous vehicles typically include one or more forward looking sensors, such as visible light cameras, infrared cameras, radio detection and ranging (RADAR) transceivers, or laser imaging, detecting and ranging (LIDAR) transceivers, that are configured to sense information about the environment. The autonomous vehicle may use the information from the sensor(s) to navigate through the environment. For example, the sensor(s) may be used to determine whether pedestrians are located in the vicinity of the autonomous vehicle and to determine the speed and direction, i.e. the velocity, in which the pedestrians are traveling. However, the pedestrians may be obscured to the sensor by weather conditions, roadside foliage, or other vehicles. Because portions of the driving environment may be obscured to environmental sensors, such as forward looking sensors, it is desirable to supplement sensor inputs.
Autonomous vehicle systems have been proposed and implemented that supplement sensor inputs with data communicated over a short range radio network, such as a Dedicated Short Range Communication (DSRC) transceiver, from other nearby vehicles. The transmissions from these nearby vehicles include information regarding the location and velocity of the nearby vehicles. As used herein, velocity refers to both the speed and direction of travel. However, not all objects of interest in the driving environment include DSRC transceivers, e.g. pedestrians, cyclists, older vehicles. Therefore, a more reliable method of determining the velocity of nearby pedestrians, cyclists, and/or older vehicles is desired.
Presented herein is a method of operating an automatically controlled or “autonomous” vehicle wherein the autonomous vehicle receives electronic messages from nearby cellular telephones containing information regarding the location of the cellular telephone. The autonomous vehicle receives this information and a computer system on-board the autonomous vehicle then determines the location and velocity of the cellular telephone. Since the cellular telephone is likely carried by a pedestrian, cyclist, or another vehicle, the computer system thereby determines the location and velocity of nearby pedestrians, cyclists, and/or other vehicles. The computer system then determines whether countermeasures are required by the autonomous vehicle to avoid a collision and sends instructions to the relevant vehicle system, e.g. the braking system, to perform the appropriate actions. Countermeasures may be used to avoid a collision with another vehicle, pedestrian, or cyclist. Countermeasures may include activating the braking system to stop or slow the autonomous vehicle, adjusting the vehicle velocity via the powertrain system, or adjusting the vehicle path via the steering system.
The computer system is interconnected to various sensors and actuators (not shown) responsible for controlling the various systems in the autonomous vehicle 10C, such as the braking system, the powertrain system, and the steering system. The computer system may be a central processing unit or may be several distributed processors communicating over a communication bus, such as a Controller Area Network (CAN) bus.
The autonomous vehicle 10C further includes a locating device configured to determine both the current location 16C of the autonomous vehicle 10C as well as the vehicle velocity 18C. As used herein, vehicle velocity 18C indicates both vehicle speed and direction of vehicle travel. An example of such a device is a Global Positioning System (GPS) receiver. The autonomous vehicle 10C also includes a mapping system to determine the current location 16C of the autonomous vehicle 10C relative to the roadway. The design and function of these location devices and mapping systems are well known to those skilled in the art.
Receiving location information from a cellular telephone 14C provides some advantages over receiving location information from a dedicated short range transceiver, such as a Dedicated Short Range Communication (DSRC) transceiver in a scheme typically referred to as Vehicle to Vehicle (V2V) communication. One advantage is that cellular telephones with location capabilities are currently more ubiquitous than DSRC transceivers, since most vehicle drivers and/or vehicle passengers are in possession of a cellular telephone 14C. Cellular telephones 14C with location technology are also built into many vehicles, e.g. ONSTAR® communication systems in vehicles manufactured by the General Motors Company or MBRACE® communication systems in vehicles marketed by Mercedes-Benz USA, LLC. Another advantage is that cellular telephones 14C that report location information to the autonomous vehicle 10C may also be carried by a pedestrian 20C and/or a cyclist 22C, allowing the autonomous vehicle 10C to automatically take countermeasures based on their location. The pedestrian 20C and/or the cyclist 22C are unlikely to carry a dedicated transceiver, such as a DSRC transceiver. Location information from a cellular telephone 14C may also be reported from non-roadway vehicles. For example, the location and velocity of a locomotive train (not shown) crossing the path of the autonomous vehicle 10C at a railroad crossing may be detected from the transmissions of a cellular telephone carried by the engineer or conductor on the locomotive.
As shown in
STEP 104C, DETERMINE A VELOCITY OF THE CELLULAR TELEPHONE BASED ON CHANGES IN LOCATION OVER A PERIOD OF TIME, includes determining a velocity 28C of the cellular telephone 14C based on changes in location 26C over a period of time.
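STEP 104C can be illustrated with a simple flat-earth velocity estimate from two position fixes; the function name and the approximation are assumptions for illustration only.

```python
import math

def velocity_from_fixes(lat1, lon1, lat2, lon2, dt_s):
    """Estimate speed (m/s) and heading (degrees) from two position fixes
    taken dt_s seconds apart. Uses a flat-earth approximation, which is
    adequate over the short intervals involved here."""
    m_per_deg_lat = 111_320.0                            # approximate
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat1))
    dy = (lat2 - lat1) * m_per_deg_lat                   # northward displacement, m
    dx = (lon2 - lon1) * m_per_deg_lon                   # eastward displacement, m
    speed = math.hypot(dx, dy) / dt_s
    heading = math.degrees(math.atan2(dx, dy)) % 360.0   # 0 = north, 90 = east
    return speed, heading
```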
STEP 106C, PROVIDE, BY A COMPUTER SYSTEM IN COMMUNICATION WITH THE ELECTRONIC RECEIVER, INSTRUCTIONS BASED ON THE LOCATION AND VELOCITY OF THE CELLULAR TELEPHONE TO AUTOMATICALLY IMPLEMENT COUNTERMEASURE BEHAVIOR BY A VEHICLE SYSTEM, includes providing instructions to a vehicle system to automatically implement countermeasure behavior based on the location 26C and velocity 28C of the cellular telephone 14C and further based on the current location 16C and velocity 18C of the autonomous vehicle 10C. The instructions are sent to the vehicle system, e.g. the braking system, by a computer system that is in communication with the electronic receiver and the instructions are based on the location 26C and velocity 28C of the cellular telephone 14C and further based on the current location 16C and velocity 18C of the autonomous vehicle 10C.
STEP 114C, PROVIDE, BY THE COMPUTER SYSTEM, INSTRUCTIONS TO THE BRAKING SYSTEM TO APPLY VEHICLE BRAKES, includes providing instructions to the braking system to apply the brakes to slow or stop the autonomous vehicle 10C in order to avoid a collision between the autonomous vehicle 10C and the carrier (20C, 22C, 24C) of the cellular telephone 14C if it is determined in STEP 112C that the concurrence between the current location 16C and the cellular telephone location 26C will occur.
STEP 116C, PROVIDE, BY THE COMPUTER SYSTEM, INSTRUCTIONS TO THE POWERTRAIN SYSTEM TO ADJUST THE VEHICLE VELOCITY, includes providing instructions to the powertrain system to adjust the vehicle velocity 18C by slowing or accelerating the autonomous vehicle 10C in order to avoid a collision between the autonomous vehicle 10C and the carrier (20C, 22C, 24C) of the cellular telephone 14C if it is determined in STEP 112C that the concurrence between the current location 16C and the cellular telephone location 26C will occur.
STEP 118C, DETERMINE A STEERING ANGLE TO AVOID THE CONCURRENCE, includes determining a steering angle to avoid the concurrence if it is determined in STEP 112C that the concurrence between the current location 16C and the cellular telephone location 26C will occur. STEP 120C, PROVIDE, BY THE COMPUTER SYSTEM, INSTRUCTIONS TO THE STEERING SYSTEM TO ADJUST A VEHICLE PATH BASED ON THE STEERING ANGLE, includes providing instructions to the steering system to adjust a vehicle path to avoid the concurrence based on the steering angle determined in STEP 118C.
STEP 122C, DETERMINE WHETHER THE VEHICLE VELOCITY AND THE CELLULAR TELEPHONE VELOCITY ARE SUBSTANTIALLY PARALLEL AND IN A SAME DIRECTION, includes determining whether the vehicle velocity 18C determined in STEP 108C and the cellular telephone velocity 28C determined in STEP 104C are substantially parallel and in a same direction, indicating the autonomous vehicle 10C and the cellular telephone 14C are travelling on the same path in the same direction. As used herein, substantially parallel means within 15 degrees of absolutely parallel. STEP 124C, PROVIDE, BY THE COMPUTER SYSTEM, INSTRUCTIONS TO THE POWERTRAIN SYSTEM TO ADJUST THE VEHICLE VELOCITY TO MAINTAIN A FOLLOWING DISTANCE IF IT IS DETERMINED THAT THE VEHICLE VELOCITY AND THE CELLULAR TELEPHONE VELOCITY ARE SUBSTANTIALLY PARALLEL AND IN THE SAME DIRECTION, includes providing instructions to the powertrain system to adjust the vehicle velocity 18C to maintain a following distance if it is determined that the vehicle velocity 18C and the cellular telephone velocity 28C are substantially parallel and in the same direction. The following distance is based on the vehicle velocity 18C in order to allow a safe stopping distance, if required. STEP 124C may also include determining a velocity threshold for the cellular telephone velocity 28C so that the autonomous vehicle 10C does not automatically match the speed of a cellular telephone 14C that is moving too slowly, e.g. a cellular telephone 14C carried by a pedestrian 20C, or of a cellular telephone 14C that is moving too quickly, e.g. a cellular telephone 14C carried by an other vehicle 24C exceeding the posted speed limit.
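The "substantially parallel" test in STEP 122C reduces to comparing two headings modulo 360 degrees, as in this sketch:

```python
PARALLEL_TOLERANCE_DEG = 15.0   # "substantially parallel" per the description

def substantially_parallel(heading_vehicle_deg, heading_phone_deg):
    """True when the two headings differ by at most 15 degrees, meaning the
    vehicle and the cellular telephone travel in the same direction."""
    diff = abs(heading_vehicle_deg - heading_phone_deg) % 360.0
    diff = min(diff, 360.0 - diff)   # wrap the difference into [0, 180]
    return diff <= PARALLEL_TOLERANCE_DEG
```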
The embodiments described herein are described in terms of an autonomous vehicle 10C. However, elements of the embodiments may also be applied to warning systems that alert the driver to manually take these identified countermeasures.
Accordingly a method 100C of automatically operating an autonomous vehicle 10C is provided. The method 100C provides the benefits of allowing automatic control of the autonomous vehicle 10C when forward looking sensors are obscured. It also provides the benefit of receiving location information from cellular telephones 14C that are nearly ubiquitous in the driving environment rather than from dedicated transceivers.
Pulsed LED Vehicle to Vehicle Communication System
For autonomous vehicles traveling in a single file down a stretch of road, it is advantageous for the vehicles to be able to send messages and data up and down the chain of vehicles to ensure that the vehicles are traveling within a safe distance from one another. This is true even for occupant controlled vehicles traveling down a single lane road. For example, if a lead vehicle needs to make a sudden deceleration, the lead vehicle could send information to the rear vehicles to alert the occupants and/or to instruct the rear vehicles to decelerate accordingly or activate the rear vehicles' safety systems, such as automatic braking or seat belt pre-tensioners, if collision is imminent.
It is known to utilize radio frequency transmissions for relaying vehicle information such as distance between vehicles, speed, acceleration, and vehicle location from a lead vehicle to the rear vehicles. However, the use of radio frequency transmissions requires directional transmissions so that radio transmissions from vehicles in the adjacent lanes or opposing traffic do not interfere with the radio transmissions from the lead vehicle to the rear vehicles. Using radio frequency transmissions to communicate may also require additional hardware, such as radars, lasers, or other components known in the art, to measure the distance, speed, and acceleration between adjacent vehicles. This adds complexity to the hardware requirements and data management systems, resulting in a costly vehicle-to-vehicle communication system.
Based on the foregoing and other factors, there remains a need for a low cost, directional, interference resistant communication system for vehicles traveling in single file.
Shown in
A front facing LED array 102D configured to transmit an encoded digital signal in the form of light pulses and a front facing optical receiver 106D for receiving a digital signal in the form of light pulses are mounted to the front end of the vehicle. Similarly, mounted to the rear of the vehicle 10D are a rear facing LED array 104D configured to transmit a digital signal in the form of light pulses and a rear optical receiver 108D for receiving a digital signal in the form of light pulses.
Each of the front and rear LED arrays 102D, 104D may include a plurality of individual LEDs that may be activated independently of each other within the LED array. The advantage of this is that each LED may transmit its own separate and distinct encoded digital signal. The front LED array 102D is positioned where it would be able to transmit unobstructed light pulses to a receiving vehicle immediately in front of the vehicle 10D. Similarly, the rear LED array 104D is positioned where it would be able to transmit unobstructed light pulses to a receiving vehicle immediately behind the vehicle 10D. For aesthetic purposes, the front LED array 102D may be incorporated in the front headlamp assembly of the vehicle 10D and the rear LED array 104D may be incorporated in the brake lamp assembly of the vehicle 10D.
To avoid driver distraction, it is preferable that the LED arrays 102D, 104D emit light pulses outside of the spectrum visible to the human eye. A digital pulse signal is preferred over an analog signal since an analog signal may be subject to degradation as the light pulse is transmitted through harsh environmental conditions. It is preferable that the LED arrays 102D, 104D emit non-visible light in the infrared frequency range to cut through inclement weather conditions such as rain, fog, or snow. As an alternative, the LED arrays 102D, 104D may emit light in the ultra-violet frequency range.
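For illustration only, the encoded digital signal could use simple on-off keying, mapping each bit of a byte to a timed LED pulse; the bit ordering, bit time, and function names below are assumptions, not part of the disclosed system.

```python
def encode_byte_ook(byte, bit_time_s=1e-6):
    """Illustrative on-off keying: map one byte, MSB first, to a list of
    (led_on, duration_s) pulse intervals for an LED array driver."""
    return [((byte >> (7 - i)) & 1, bit_time_s) for i in range(8)]

def decode_byte_ook(pulses):
    """Inverse of encode_byte_ook: rebuild the byte from pulse intervals."""
    value = 0
    for led_on, _duration in pulses:
        value = (value << 1) | led_on
    return value
```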
The front optical receiver 106D is mounted onto the front of the vehicle 10D such that the front optical receiver 106D has an unobstructed line of sight to a transmitting vehicle immediately in front of the vehicle 10D. Similarly, the rear optical receiver 108D is mounted onto the rear of the vehicle 10D such that the rear optical receiver 108D has an unobstructed line of sight to a transmitting vehicle immediately behind the vehicle 10D. As an alternative, the front LED array 102D and front optical receiver 106D may be integrated into a single unit forming a front LED transceiver, which is capable of transmitting and receiving a luminous pulsed digital signal. Similarly, the rear LED array 104D and rear optical receiver 108D may be integrated as a rear LED transceiver. It should be recognized that each of the exemplary vehicles discussed above in front of and behind the vehicle 10D may function as both a receiving and transmitting vehicle, the relevance of which will be discussed below.
A CPU 110D is provided in the vehicle 10D and is configured to receive vehicle input information from a plurality of sources in the vehicle 10D, such as text or voice information from the occupants or data information from the vehicle's GPS 114D, and to generate corresponding output information based on the input information. The CPU 110D then sends the output information to the front LED array 102D, the rear LED array 104D, or both, which then transmit the output information as a coded digital signal in the form of light pulses directed to the immediately adjacent front and/or rear vehicles. The CPU 110D is also configured to receive and process incoming messages from the front and rear optical receivers 106D, 108D, and to generate an action signal based on the incoming message. A control bus 112D is provided to facilitate electronic communication between the CPU 110D and the vehicle's electronic features such as the GPS 114D, driver infotainment system 116D, and safety systems 118D.
Shown in
Referring to
As an additional safety measure for autonomous and/or driver controlled vehicles, the CPU of the first vehicle may receive vehicle location, direction, and speed information from the first vehicle's GPS system. The first vehicle transmits this information via the first vehicle's rear LED array directly to the second vehicle. The second vehicle's CPU may use algorithms to analyze the GPS data received from the first vehicle together with the second vehicle's own GPS data to determine if the two vehicles are traveling too close to one another or if a collision is imminent. This determination is compared with the distance information calculated from the time it takes to transmit and receive a pulse of light between vehicles to ensure accuracy and reliability of the data received from GPS. Just as the first vehicle passes its GPS information to the second vehicle, the second vehicle passes its GPS information to the third vehicle, and so on.
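The distance check mentioned above follows from the time of flight of a light pulse; assuming the pulse is echoed back by the neighboring vehicle, the separation is half the round-trip distance:

```python
C_M_PER_S = 299_792_458.0   # speed of light in vacuum, m/s

def separation_from_round_trip(round_trip_s):
    """Distance (m) between vehicles from the round-trip time of a light
    pulse, assuming the receiving vehicle echoes the pulse back."""
    return C_M_PER_S * round_trip_s / 2.0
```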
Utilizing the V2V Communication System 100D, direct audio or text communications between vehicles may be initiated by an occupant of a vehicle. For example, the occupant of the center vehicle may relay a message to the immediate vehicle in front or rear. As previously mentioned, the V2V Communication system 100D may transmit information down a string of vehicles traveling in single file down a road. If a vehicle ahead encounters an accident and/or road obstruction, information can be sent down in series through the string of vehicles to slow down or activate safety systems 118D of individual vehicles to ensure that the column of cars slows evenly to avoid vehicle-to-vehicle collisions. Emergency vehicles may utilize the V2V communication system 100D to warn a column of vehicles. For example, if an emergency vehicle is traveling up from behind, the emergency vehicle having a V2V communication system 100D may communicate the information up the column of vehicles to notify the drivers to pull their vehicles over to the side of the road to allow room for the emergency vehicle to pass.
Method and Apparatus for Controlling an Autonomous Vehicle
Autonomous vehicles typically utilize multiple data sources to determine their location, to identify other vehicles, to identify potential hazards, and to develop navigational routing strategies. These data sources can include a central map database that is preloaded with road locations and traffic rules corresponding to areas on the map. Data sources can also include a variety of sensors on the vehicle itself to provide real-time information relating to road conditions, other vehicles and transient hazards of the type not typically included on a central map database.
In many instances a mismatch can occur between the map information and the real-time information sensed by the vehicle. Various strategies have been proposed for dealing with such a mismatch. For example, U.S. Pat. No. 8,718,861 to Montemerlo et al. teaches detecting deviations between a detailed map and sensor data and alerting the driver to take manual control of the vehicle when the deviations exceed a threshold. U.S. Pub. No. 2014/0297093 to Murai et al. discloses a method of correcting an estimated position of the vehicle by detecting an error in the estimated position, in particular when a perceived mismatch exists between road location information from a map database and from vehicle sensors, and making adjustments to the estimated position.
A variety of data sources can be used for the central map database. For example, the Waze application provides navigational mapping for vehicles. Such navigational maps include transient information about travel conditions and hazards uploaded by individual users. Such maps can also extract location and speed information from computing devices located within the vehicle, such as a smart phone, and assess traffic congestion by comparing the speed of various vehicles to the posted speed limit for a designated section of roadway.
Strategies have also been proposed in which the autonomous vehicle will identify hazardous zones relative to other vehicles, such as blind spots. For example, U.S. Pat. No. 8,874,267 to Dolgov et al. discloses such a system. Strategies have also been developed for dealing with areas that are not detectable by the sensors on the vehicle. For example, the area behind a large truck will be mostly invisible to the sensors on an autonomous vehicle. U.S. Pat. No. 8,589,014 to Fairfield et al. teaches a method of calculating the size and shape of an area of sensor diminution caused by an obstruction and developing a new sensor field to adapt to the diminution.
Navigational strategies for autonomous vehicles typically include both a destination-based strategy and a position-based strategy. Destination strategies involve how to get from point ‘A’ to point ‘B’ on a map using known road location and travel rules. These involve determining a turn-by-turn path to direct the vehicle to the intended destination. Position strategies involve determining optimal locations for the vehicle (or alternatively, locations to avoid) relative to the road surface and to other vehicles. Changes to these strategies are generally made during the operation of the autonomous vehicle in response to changing circumstances, such as changes in the position of surrounding vehicles or changing traffic conditions that trigger a macro-level rerouting evaluation by the autonomous vehicle.
Position-based strategies have been developed that automatically detect key behaviors of surrounding vehicles. For example, U.S. Pat. No. 8,935,034 to Zhu et al. discloses a method for detecting when a surrounding vehicle has performed one of several pre-defined actions and altering the vehicle control strategy based on that action.
One of many challenges for controlling autonomous vehicles is managing interactions between autonomous vehicles and human-controlled vehicles in situations that are often handled by customs that are not easily translated into specific driving rules.
In an alternative embodiment, road features 330E and map elements 340E can relate to characteristics about the road surface such as the surface material (dirt, gravel, concrete, asphalt). In another alternative embodiment, road features 330E and map elements 340E can relate to transient conditions that apply to an area of the road such as traffic congestion or weather conditions (rain, snow, high winds).
In block 404E, computer 170E selects a preferred road feature 330E (such as lane lines 332E) and determines its respective location. In block 406E, computer 170E determines the location of the selected instance of the road feature 330E and in block 408E compares this with the location of a corresponding map element 340E. In block 410E, computer 170E determines a correlation rate between the location of road feature 330E and corresponding map element 340E. In block 412E, computer 170E determines whether the correlation rate exceeds a predetermined value. If not, computer 170E adopts an alternative control strategy according to block 414E and reverts to block 404E to repeat the process described above. If the correlation rate is above the predetermined value, computer 170E maintains the default control strategy according to block 416E and reverts to block 404E to repeat the process.
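Blocks 404E through 416E can be sketched as a simple comparison step; the one-dimensional location model, correlation formula, and threshold value below are illustrative assumptions, not the disclosed implementation.

```python
CORRELATION_THRESHOLD = 0.8   # assumed stand-in for the "predetermined value"

def select_control_strategy(feature_location_m, map_element_location_m, scale_m=1.0):
    """Sketch of blocks 404E-416E: compare a sensed road feature's location
    with its map element and pick a control strategy from the correlation
    rate (1.0 = perfect match, falling off linearly with position error)."""
    error_m = abs(feature_location_m - map_element_location_m)
    correlation = max(0.0, 1.0 - error_m / scale_m)
    return "default" if correlation > CORRELATION_THRESHOLD else "alternative"
```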
The correlation rate can be determined based on a wide variety of factors. For example, in reference to
In one embodiment of the disclosure, only one of the road features 330E, such as lane lines 332E, is used to determine the correlation between road features 330E and map elements 340E. In other embodiments of the disclosure, the correlation rate is determined based on multiple instances of the road features 330E such as lane lines 332E and pavement edges 336E. In yet another embodiment of the disclosure, the individual correlation between one type of road feature 330E and map element 340E, such as lane lines 332E, is weighted differently than the correlation between other road features 330E and map elements 340E, such as pavement edges 334E, when determining an overall correlation rate. This would apply in situations where the favored road feature (in this case, lane lines 332E) is deemed a more reliable tool for verification of the location of vehicle 100E relative to road network 310E.
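The weighted-combination embodiment could be sketched as a weighted average of per-feature correlation rates; the feature names and weight values below are illustrative assumptions, not values from the disclosure.

```python
def overall_correlation(rates, weights):
    """Combine per-feature correlation rates (e.g. lane lines and
    pavement edges) into one overall rate, weighting the favored
    feature more heavily."""
    total = sum(weights.values())
    return sum(rates[name] * weights[name] for name in rates) / total

# Example: lane lines weighted twice as heavily as pavement edges.
rate = overall_correlation(
    {"lane_lines": 1.0, "pavement_edges": 0.5},
    {"lane_lines": 2.0, "pavement_edges": 1.0},
)
```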
In the first protocol, computer 170E relies on a secondary road feature 330E (such as pavement edges 336E) for verification of the location of road network 310E relative to the vehicle 100E and for verification of the position of vehicle 100E within a lane on a roadway (such as the left lane 202E in highway 200E, as shown in
The second protocol is triggered when computer 170E is unable to reliably use information about alternative road features 330E to verify the position of the vehicle 100E. In this situation, computer 170E may use the position and trajectory of surrounding vehicles to verify the location of road network 310E and to establish the position of vehicle 100E. If adjacent vehicles have a trajectory consistent with road network 310E on map 300E, computer 170E will operate on the assumption that other vehicles are within designated lanes in a roadway. If traffic is not sufficiently dense (or is non-existent) for computer 170E to reliably use it for lane verification, computer 170E will rely solely on GPS location relative to the road network 310E for navigational control purposes.
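The fallback ordering across the two protocols can be summarized as a simple cascade; the function and return-value names below are illustrative, not terminology from the disclosure.

```python
def choose_verification_source(secondary_features_ok, traffic_dense_enough):
    """Cascade of verification sources: prefer secondary road features
    (first protocol); if those are unreliable, use surrounding-vehicle
    trajectories (second protocol); if traffic is too sparse, fall
    back to GPS location alone."""
    if secondary_features_ok:
        return "secondary_road_features"
    if traffic_dense_enough:
        return "surrounding_vehicle_trajectories"
    return "gps_only"
```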
In either control strategy discussed above, computer 170E will rely on typical hazard avoidance protocols to deal with unexpected lane closures, accidents, road hazards, etc. Computer 170E will also take directional cues from surrounding vehicles in situations where the detected road surface does not correlate with road network 310E but surrounding vehicles are following the detected road surface, or in situations where the path along road network 310E is blocked by a detected hazard but surrounding traffic is following a path off of the road network and off of the detected road surface.
In accordance with another aspect of the disclosure, referring back to
Computer 170E communicates with navigational database 160E regarding the location of hazards 650E, 670E detected by external sensor system 110E. Navigational database 160E is simultaneously accessible by computer 170E and other computers in other vehicles and is updated with hazard-location information received by such computers to provide a real-time map of transient hazards. In a further embodiment, navigational database 160E sends a request to computer 170E to validate the location of hazards 650E, 670E detected by another vehicle. Computer 170E uses external sensor system 110E to detect the presence or absence of hazards 650E, 670E and sends a corresponding message to navigational database 160E.
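The validation exchange between navigational database 160E and computer 170E might look like the following sketch; the message structure and field names are assumptions made for illustration.

```python
def handle_validation_request(detected_hazards, requested_hazard):
    """Respond to a request from the navigational database to confirm
    or deny a hazard reported by another vehicle, based on what this
    vehicle's external sensor system currently detects."""
    present = requested_hazard in detected_hazards
    return {"hazard": requested_hazard, "confirmed": present}

# Example: this vehicle's sensors currently detect hazards 650E and 670E.
reply = handle_validation_request({"650E", "670E"}, "650E")
```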
In accordance with another aspect of the disclosure,
Computer 170E adapts the lane selection strategy in real time based on information about surrounding vehicles 620E. Computer 170E calculates a traffic density measurement based on the number and spacing of surrounding vehicles 620E in the vicinity of vehicle 100E. Computer 170E also evaluates the number and complexity of potential lane change pathways in the vicinity of vehicle 100E to determine a freedom of movement factor for vehicle 100E. Depending upon the traffic density measurement, the freedom of movement factor, or both, computer 170E evaluates whether to accelerate the lane change maneuver. For example, when traffic density is heavy and freedom of movement limited for vehicle 100E, as shown in
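A minimal sketch of this evaluation is given below. The density formula, the 0..1 scaling, and the threshold values are all illustrative assumptions; the disclosure does not specify how the traffic density measurement or freedom of movement factor is computed.

```python
def traffic_density(surrounding_count, mean_spacing_m, max_spacing_m=100.0):
    """Map the number and spacing of surrounding vehicles to a 0..1
    density score (higher = denser traffic)."""
    spacing_term = max(0.0, 1.0 - mean_spacing_m / max_spacing_m)
    count_term = min(1.0, surrounding_count / 10.0)
    return 0.5 * (spacing_term + count_term)

def should_accelerate_lane_change(density, freedom_of_movement,
                                  density_limit=0.7, freedom_min=0.3):
    """Accelerate the lane change maneuver when traffic is heavy or
    when pathways for later lane changes are limited."""
    return density >= density_limit or freedom_of_movement <= freedom_min
```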
In another aspect of the disclosure as shown in
One of the complexities of autonomous control of vehicle 100E arises in negotiating the right-of-way between vehicles. Drivers of human-controlled vehicles often perceive ambiguity when following the rules for determining which vehicle has the right of way. For example, at a four-way stop two vehicles may each perceive that they arrived at an intersection first. Or one vehicle may believe that all vehicles arrived at the same time while another vehicle perceives that one of the vehicles was actually the first to arrive. These situations are often resolved by a driver giving a visual signal, such as a hand wave, indicating that the driver is yielding the right of way. To handle this situation when vehicle 100E is under autonomous control, yield signal 790E is included on vehicle 100E. Computer 170E follows a defined rule set for determining when to yield a right-of-way and activates yield signal 790E when it is waiting for the other vehicle(s) to proceed. Yield signal 790E can be a visual signal such as a light, an electronic signal (such as a radio-frequency signal) that can be detected by other vehicles, or a combination of both.
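The disclosure does not define the rule set; as one hedged example, a simplified four-way-stop rule (first to arrive proceeds; on a tie, the vehicle on the right proceeds) could be expressed as:

```python
def should_yield(own_arrival, other_arrival, own_on_right):
    """Return True when this vehicle should yield and activate its
    yield signal: yield if the other vehicle arrived first, and on a
    simultaneous arrival yield unless this vehicle is on the right.
    This rule set is illustrative, not taken from the disclosure."""
    if own_arrival < other_arrival:
        return False  # we arrived first; proceed
    if own_arrival > other_arrival:
        return True   # other vehicle arrived first; yield
    return not own_on_right  # tie: right-hand vehicle proceeds
```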
In accordance with another aspect of the disclosure,
Autonomous Vehicle with Unobtrusive Sensors
An autonomously driven vehicle requires that the surroundings of the vehicle be sensed more or less continually and, more importantly, for 360 degrees around the perimeter of the car.
A typical means for sensing is a relatively large LIDAR unit (a sensor unit using pulsed laser light rather than radio waves). An example of a known-vehicle 12F is shown in
Referring now to the
Referring first to
Referring next to
Still referring to
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Within the broad objective of providing 360 degree sensor coverage, while remaining within the exterior envelope of the car, other compact or improved sensors could be used.
Adaptive Cruise Control Integrated with Lane Keeping Assist System
Earlier cruise control systems, now decades old, allowed a driver to set a certain speed. They were typically used on highways in fairly low-traffic situations, where not a lot of stop-and-go traffic could be expected. This was necessary because the systems could not account for the closing of the distance behind a leading-vehicle. It was incumbent upon the driver to notice this and step on the brake, which would also cancel the cruise control setting, necessitating that it be reset. This was an obvious annoyance in stop-and-go traffic, so the system was unlikely to be used in that situation. The systems typically did not cancel the setting for mere acceleration, allowing for the passing of slower leading-vehicles and a return to the set speed when the passing car returned to its lane.
Newer cruise control systems, typically referred to as adaptive cruise control, use a combination of radar and camera sensing to actively hold a predetermined distance threshold behind the leading car. These vary in how actively they decelerate the car, if needed, to maintain the threshold. Some merely back off of the throttle, some provide a warning to the driver and pre-charge the brakes, and some actively brake while providing a warning.
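The tiered responses described above might be sketched as the following decision function. The tier names, the threshold distance, and the half-threshold braking point are illustrative assumptions, not parameters from the disclosure.

```python
def acc_command(gap_m, closing_speed_mps, threshold_m=40.0):
    """Tiered adaptive-cruise response to a leading vehicle:
    hold speed outside the distance threshold; inside it, back off
    the throttle if not closing, warn and pre-charge the brakes if
    closing with margin, and actively brake when the gap is small."""
    if gap_m >= threshold_m:
        return "hold_speed"
    if closing_speed_mps <= 0:
        return "throttle_off"
    if gap_m > threshold_m * 0.5:
        return "warn_and_precharge"
    return "brake_and_warn"
```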
Appearing on vehicles more recently have been so called lane keeping systems, to keep or help to keep a vehicle in the correct lane. These also vary in how active they are. Some systems merely provide audible or haptic warnings if it is sensed that the car is drifting out of its lane, or if an approaching car is sensed as a car attempts to pass a leading car. Others will actively return the car to the lane if an approaching car is sensed.
Referring first to
Referring next to
The temporarily more aggressive deceleration would be beneficial regardless of whether the abrupt return to the original lane was due to driver direct action or the action of an active lane keeping system. However, it is particularly beneficial when the two are integrated, as a driver inattentive to an approaching vehicle in the adjacent lane is likely to be equally inattentive to the proximity of a leading-vehicle in the original lane.
While this invention has been described in terms of the preferred embodiments thereof, it is not intended to be so limited, but rather only to the extent set forth in the claims that follow.
This application is a divisional of, and claims priority to, U.S. Utility patent application Ser. No. 15/792,960, filed Oct. 25, 2017, which is a divisional of U.S. Utility patent application Ser. No. 14/983,695, filed Dec. 30, 2015, which in turn claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application Nos. 62/112,770, 62/112,776, 62/112,786, 62/112,792, 62/112,771, 62/112,775, 62/112,783, 62/112,789, all of which were filed Feb. 6, 2015, the entire disclosures of which are hereby incorporated herein by reference in their entireties.
Number | Date | Country
---|---|---
62112770 | Feb 2015 | US
62112775 | Feb 2015 | US
62112776 | Feb 2015 | US
62112786 | Feb 2015 | US
62112789 | Feb 2015 | US
62112792 | Feb 2015 | US
62112771 | Feb 2015 | US
62112783 | Feb 2015 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 15792960 | Oct 2017 | US
Child | 16927859 | | US
Parent | 14983695 | Dec 2015 | US
Child | 15792960 | | US