The present disclosure relates generally to autonomous vehicles and, more particularly, to controlling the operation of an autonomous vehicle upon the detection of an unknown object on a carriageway along which the autonomous vehicle is to travel.
One aim of autonomous vehicle technologies is to provide vehicles that can safely navigate towards a destination with limited or no driver assistance. In some instances, unanticipated objects may be on the carriageway including, in some instances, within a driving lane along which the autonomous vehicle is to travel. In an instance in which the unanticipated object is identified to be a known type of object, the autonomous vehicle may be controlled so as to respond to the known object in a particular fashion that may be based upon the type of object that has been identified. However, in other instances, the unanticipated object is unknown and is not a recognized type of object. Current autonomous vehicle technologies may respond uniformly in response to the detection of an unknown object which may, in turn, lead to responses by the autonomous vehicle that are unnecessary or excessive for a particular unknown object, thereby potentially creating more disruption to the traffic flow than is required.
A control subsystem, method, computer program product and autonomous vehicle are provided in accordance with an example embodiment in order to respond to an unknown object on a carriageway along which an autonomous vehicle is to travel. Based upon characteristics of the object and the location of the object relative to the autonomous vehicle, a motion plan may be defined for the autonomous vehicle which not only ensures the safety of the unknown object, but also safely and efficiently navigates the autonomous vehicle relative to the object. As such, the control subsystem, method, computer program product and autonomous vehicle of an example embodiment recognize various problems and previously unmet needs related to the navigation of autonomous vehicles, which previously responded to the detection of an unknown object in a uniform manner regardless of the characteristics of the unknown object. In contrast, the control subsystem, method, computer program product and autonomous vehicle of an example embodiment tailor the response based upon characteristics, such as the size, of the unknown object and the location of the unknown object relative to the autonomous vehicle, so as to continue to ensure the safety of the object and the autonomous vehicle, but to respond in a more measured manner that may result in less disruption to the traffic flow about the object. As such, certain embodiments of the present disclosure provide unique solutions to technical problems of autonomous vehicle navigation techniques, including problems associated with safely navigating about an unknown object that is detected on a carriageway along which an autonomous vehicle is to travel.
In an example embodiment, a control subsystem of an autonomous vehicle is provided that may include processing circuitry configured to receive sensor data from at least one vehicle sensor of the autonomous vehicle. The sensor data may include location coordinates of an object on a carriageway along which the autonomous vehicle is to travel. The processing circuitry is also configured to evaluate the sensor data to determine whether the object is known or unknown. In an instance in which the object is unknown, the processing circuitry is configured to determine the size of the object. The processing circuitry is further configured to define a motion plan for the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. The motion plan that is defined is dependent upon the size of the object with different motion plans being defined for differently sized objects. Various motion plans may be defined including, for example, changing lanes, slowing and biasing away from the object or stopping the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. The processing circuitry is additionally configured to update driving instructions for the autonomous vehicle based upon the motion plan that is defined.
The processing circuitry of an example embodiment is configured to determine the size of the object by determining the height and width associated with the object. In this example embodiment, the processing circuitry is further configured to classify the object as a larger object or a smaller object based on the height of the object. The processing circuitry of this example embodiment may also be configured to define a motion plan for the autonomous vehicle such that a larger distance is maintained between the autonomous vehicle and the object in an instance in which the object is classified as the larger object than in an instance in which the object is classified as the smaller object. In another example embodiment, the processing circuitry is further configured to classify the object as a larger object, an intermediate object or a smaller object based on the height of the object. In this example embodiment, the processing circuitry is configured to define the motion plan for the autonomous vehicle such that the autonomous vehicle straddles the object in an instance in which the object is classified as the smaller object.
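For purposes of illustration only, the height-based classification described above may be sketched as follows; the numeric thresholds and class names in this sketch are assumptions introduced for readability and are not values specified by this disclosure.

```python
# Illustrative sketch only; the height thresholds and class names are
# hypothetical assumptions, not values specified by this disclosure.

def classify_object_by_height(height_m: float,
                              small_max_m: float = 0.15,
                              large_min_m: float = 0.75) -> str:
    """Classify an unknown object as smaller, intermediate, or larger."""
    if height_m <= small_max_m:
        return "smaller"      # e.g., low debris the vehicle may be able to straddle
    if height_m >= large_min_m:
        return "larger"       # e.g., keep an empty lane between the vehicle and the object
    return "intermediate"


if __name__ == "__main__":
    for h in (0.05, 0.4, 1.2):
        print(h, classify_object_by_height(h))
```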
In another example embodiment, a method is provided that includes receiving sensor data from at least one vehicle sensor of an autonomous vehicle. The sensor data may include location coordinates of an object on a carriageway along which the autonomous vehicle is to travel. The method also includes evaluating the sensor data to determine whether the object is known or unknown and, in an instance in which the object is unknown, determining the size of the object. The method further includes defining a motion plan for the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. The motion plan that is defined is dependent upon the size of the object with different motion plans being defined for differently sized objects. For example, the motion plan may be one of changing lanes, slowing and biasing away from the object or stopping the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. Regardless of the particular type of motion plan, the method further includes updating driving instructions for the autonomous vehicle based upon the motion plan that is defined.
The method of an example embodiment determines the size of the object by determining the height and width associated with the object. In this example embodiment, the method may further include classifying the object as a larger object or a smaller object based upon the height of the object. The method of this example embodiment may also include defining the motion plan for the autonomous vehicle such that a larger distance is maintained between the autonomous vehicle and the object in an instance in which the object is classified as a larger object than in an instance in which the object is classified as a smaller object. In another example embodiment, the method classifies the object as a larger object, an intermediate object or a smaller object based upon the height of the object. In this example embodiment, the method may define the motion plan for the autonomous vehicle such that the autonomous vehicle straddles the object in an instance in which the object is classified as the smaller object.
In a further example embodiment, a computer program product is provided that includes at least one non-transitory computer-readable storage medium having computer-executable program code instructions stored therein. The computer-executable program code instructions include program code instructions configured to receive sensor data from at least one vehicle sensor of an autonomous vehicle. The sensor data may include location coordinates of an object on a carriageway along which the autonomous vehicle is to travel. The computer-executable program code instructions also include program code instructions configured to evaluate the sensor data to determine whether the object is known or unknown and, in an instance in which the object is unknown, program code instructions configured to determine the size of the object. The computer-executable program code instructions further include program code instructions configured to define a motion plan for the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. The motion plan that is defined is dependent upon the size of the object with different motion plans being defined for differently sized objects. For example, the motion plan may be one of changing lanes, slowing and biasing away from the object or stopping the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. Regardless of the type of motion plan, the computer-executable program code instructions also include program code instructions configured to update the driving instructions for the autonomous vehicle based upon the motion plan that is defined.
The program code instructions that are configured to determine the size of the object include, in one example embodiment, program code instructions configured to determine the height and width associated with the object. In this example embodiment, the computer-executable program code instructions may also include program code instructions configured to classify the object as a larger object or a smaller object based upon the height of the object. In this example embodiment, the program code instructions configured to define the motion plan for the autonomous vehicle provide for a larger distance to be maintained between the autonomous vehicle and the object in an instance in which the object is classified as the larger object than in an instance in which the object is classified as a smaller object. In another example embodiment, the computer-executable program code instructions further include program code instructions configured to classify the object as a larger object, an intermediate object or a smaller object based on the height of the object. The program code instructions of this example embodiment may be configured to define the motion plan such that the autonomous vehicle straddles the object in an instance in which the object is classified as the smaller object.
In yet another example embodiment, an apparatus is provided that includes means for receiving sensor data from at least one vehicle sensor of an autonomous vehicle. The sensor data includes location coordinates of an object on a carriageway along which the autonomous vehicle is to travel. The apparatus also includes means for evaluating the sensor data to determine whether the object is known or unknown and, in an instance in which the object is unknown, means for determining the size of the object. The apparatus further includes means for defining a motion plan for the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. The motion plan that is defined is dependent upon the size of the object with different motion plans being defined for differently sized objects. For example, the motion plan may be one of changing lanes, slowing and biasing away from the object or stopping the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. The apparatus further includes means for updating driving instructions for the autonomous vehicle based upon the motion plan that is defined.
The means for determining the size of the object in accordance with an example embodiment includes means for determining the height and width associated with the object. In this example embodiment, the apparatus further includes means for classifying the object as a larger object or a smaller object based upon the height of the object. The means for defining the motion plan for the autonomous vehicle in this example embodiment may include means for defining the motion plan for the autonomous vehicle such that a larger distance is maintained between the autonomous vehicle and the object in an instance in which the object is classified as the larger object than in an instance in which the object is classified as the smaller object. In another example embodiment, the apparatus also includes means for classifying the object as a larger object, an intermediate object or a smaller object based on the height of the object. In this example embodiment, the means for defining the motion plan for the autonomous vehicle may include means for defining the motion plan for the autonomous vehicle such that the autonomous vehicle straddles the object in an instance in which the object is classified as the smaller object.
In another example embodiment, a system is provided that includes an autonomous vehicle. The autonomous vehicle may include at least one vehicle sensor that collects sensor data that includes location coordinates of an object on a carriageway along which the autonomous vehicle is to travel. The system also may include a control subsystem carried by or otherwise associated with the autonomous vehicle. The control subsystem may include processing circuitry configured to receive the sensor data from the at least one vehicle sensor. The sensor data may include location coordinates of the object. The processing circuitry is also configured to evaluate the sensor data to determine whether the object is known or unknown. In an instance in which the object is unknown, the processing circuitry is configured to determine the size of the object. The processing circuitry is further configured to define a motion plan for the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. The motion plan that is defined is dependent upon the size of the object with different motion plans being defined for differently sized objects. Various motion plans may be defined including, for example, changing lanes, slowing and biasing away from the object or stopping the autonomous vehicle depending upon the size of the object and the location coordinates of the object relative to the autonomous vehicle. The processing circuitry is additionally configured to update driving instructions for the autonomous vehicle based upon the motion plan that is defined.
The system and, more particularly, the processing circuitry of an example embodiment is configured to determine the size of the object by determining the height and width associated with the object. In this example embodiment, the processing circuitry is further configured to classify the object as a larger object or a smaller object based on the height of the object. The processing circuitry of this example embodiment may also be configured to define a motion plan for the autonomous vehicle such that a larger distance is maintained between the autonomous vehicle and the object in an instance in which the object is classified as the larger object than in an instance in which the object is classified as the smaller object. In another example embodiment, the processing circuitry is further configured to classify the object as a larger object, an intermediate object or a smaller object based on the height of the object. In this example embodiment, the processing circuitry is configured to define the motion plan for the autonomous vehicle such that the autonomous vehicle straddles the object in an instance in which the object is classified as the smaller object.
For a more complete understanding of this disclosure, reference is now made to the following brief description, taken in connection with the accompanying drawings and detailed description, wherein like reference numerals represent like parts.
Some embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, various embodiments of the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout.
A control subsystem of an autonomous vehicle, as well as a corresponding method and computer program product are provided in accordance with an example embodiment. The control subsystem, method and computer program product may be employed by or in conjunction with any of a variety of different types of autonomous vehicles including a motor vehicle, such as an automobile or a truck, and more particularly, a tractor trailer that is configured to operate autonomously or at least semi-autonomously. Regardless of the type of autonomous vehicle, the control subsystem, method and computer program product of an example embodiment are configured to update driving instructions for the autonomous vehicle based upon a motion plan that is defined in a manner that is dependent upon the size of an object that has been identified on or along the carriageway along which the autonomous vehicle is to travel. As such, the control subsystem, method and computer program product may provide for more efficient operation of the autonomous vehicle by safely navigating about the object in a controlled fashion, thereby creating less disruption to the traffic flow about the object.
Although the control subsystem, method and computer program product of an example embodiment may be employed in conjunction with a variety of different types of autonomous vehicles, the control subsystem, method and computer program product will be described in conjunction with a truck, such as a tractor trailer, that is configured to operate autonomously by way of example, but not of limitation. As shown in
An object may be on or near the carriageway along which the autonomous vehicle is to travel. As shown, for example, in
Referring now to
As shown in
As shown in
In some example embodiments, the processing circuitry 22 may include a processor, and in some embodiments, such as that illustrated in
Although the processing circuitry 22 may include a single processor, it will be appreciated that the processing circuitry may comprise a plurality of processors. The plurality of processors may be in operative communication with each other and may be collectively configured to perform one or more functionalities of the control subsystem 20 as described herein. The plurality of processors may be embodied on a single computing device or distributed across a plurality of computing devices collectively configured to function as the control subsystem. In some example embodiments, the processing circuitry may be configured to execute instructions stored in the memory 24 or otherwise accessible to the processing circuitry. As such, whether configured by hardware or by a combination of hardware and software, the processing circuitry may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present invention while configured accordingly. Thus, for example, when the processing circuitry is embodied as an ASIC, FPGA, or the like, the processing circuitry may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processing circuitry is embodied as an executor of software instructions, the instructions may specifically configure the processing circuitry to perform one or more operations described herein.
In some example embodiments, the memory 24 may include one or more non-transitory memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable. In this regard, the memory may comprise a non-transitory computer-readable storage medium. It will be appreciated that while the memory is illustrated as a single memory, the memory may comprise a plurality of memories. The plurality of memories may be embodied on a single computing device or may be distributed across a plurality of computing devices. The memory may be configured to store information, data, applications, computer program code, instructions and/or the like for enabling the control subsystem to carry out various functions in accordance with one or more example embodiments.
The memory 24 may be further configured to buffer input data for processing by the processing circuitry 22. Additionally or alternatively, the memory may be configured to store instructions for execution by the processing circuitry. Among the contents of the memory, applications may be stored for execution by the processing circuitry to carry out the functionality associated with each respective application. In some cases, the memory may be in communication with one or more of the processing circuitry and/or communication interface 26, for passing information among components of the control subsystem.
The communication interface 26, such as a network interface, may include one or more interface mechanisms for enabling communication with other devices and/or networks. In some cases, the communication interface may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device or module in communication with the processing circuitry 22. By way of example, the communication interface may be configured to communicate with any of one or more various external systems such as an operation server, the communication systems of law enforcement or other traffic safety personnel, or the like, in an embodiment in which the control subsystem is carried by the autonomous vehicle. Alternatively, the communication interface may be configured to communicate with the autonomous vehicle and, more particularly, the various subsystems of the autonomous vehicle in an embodiment in which the control subsystem is off board the autonomous vehicle.
In some example embodiments, the control subsystem 20 may include a sensor interface 28 configured to communicate with one or more vehicle sensors of the autonomous vehicle. Although the sensor interface may be embodied by the processing circuitry 22 and/or the communication interface 26, the sensor interface of an example embodiment is a discrete component configured to communicate with vehicle sensor(s). In an embodiment in which the sensor interface is a discrete component, the sensor interface is also in communication with the processing circuitry in order to provide the processing circuitry with at least some of the sensor data received via the sensor interface.
Referring now to
The sensor data may be provided by a variety of vehicle sensors configured to provide information regarding an object 16 on the carriageway 12 along which the autonomous vehicle 10 is to travel. For example, the vehicle sensor may be a LiDAR (light detection and ranging) sensor, a camera, e.g., an infrared camera, a radar system or the like. With respect to a LiDAR sensor, the LiDAR sensor may have a predefined range defining the distance in advance of the autonomous vehicle at which an object may be reliably detected. For example, a LiDAR sensor may have a range of 150 meters to 200 meters, a range of 250 meters to 300 meters or a range of approximately 500 meters. In an example embodiment in which the vehicle sensor is a LiDAR sensor, the sensor data may include distance measurements. For example, the distance measurements may include a distance traveled by an object (e.g., a displacement of an object), distances of an object from a LiDAR sensor at different times (t), etc.
In another example, the sensor data from the LiDAR sensor may include a cloud of point data representing obstacles or objects 16, which have been illuminated by the laser, within the environment surrounding the autonomous vehicle 10, that is, within the detection zones of the vehicle sensors. The cloud of point data may include points corresponding to light emitted from the LiDAR sensors and reflected from objects within the environment surrounding the autonomous vehicle. The time delay between the transmitted light and the reflected light bounced off an object corresponds to the distance between the LiDAR sensor and that object. The intensity of the reflected light bounced off an object may be indicative of a surface type of that object, e.g., metal, skin, plastic, fabric, concrete, etc.
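As a non-limiting illustration of the time-of-flight relationship noted above, the distance to a reflecting object may be computed from the round-trip time delay as sketched below; the function and variable names are illustrative only, and real LiDAR drivers typically report range directly.

```python
# Minimal sketch of the time-of-flight relationship described above.
# Names are illustrative assumptions, not part of any particular sensor API.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_range_from_time_delay(delay_s: float) -> float:
    """Distance to a reflecting object from the round-trip time delay."""
    return SPEED_OF_LIGHT_M_S * delay_s / 2.0

if __name__ == "__main__":
    # A 1 microsecond round trip corresponds to roughly 150 m.
    print(f"{lidar_range_from_time_delay(1e-6):.1f} m")
```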
In a further example, the vehicle sensors may include motion sensors, in which case the resulting sensor data may include motion measurements. For example, the motion measurements may include the motion of an object 16 from a first location to a second location. In such cases, the control subsystem 20 may determine the speed with which the object is moving and the direction of movement of that object. For example, the control subsystem may determine whether an object is moving towards the autonomous vehicle 10, away from the autonomous vehicle, across the path of the autonomous vehicle (e.g., a pedestrian crossing the road), etc.
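By way of a hedged illustration of such motion measurements, the speed and direction of movement of an object might be derived from two timestamped position fixes as in the following sketch; the planar coordinate convention and the names are assumptions for readability.

```python
# Hedged sketch of deriving an object's speed and heading from two
# timestamped position fixes, as the motion measurements above suggest.
# A flat, planar coordinate frame is assumed purely for illustration.
import math

def speed_and_heading(p1, t1, p2, t2):
    """Return (speed m/s, heading degrees) between positions p1 at t1 and p2 at t2."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("timestamps must be increasing")
    speed = math.hypot(dx, dy) / dt
    heading_deg = math.degrees(math.atan2(dy, dx)) % 360.0
    return speed, heading_deg

if __name__ == "__main__":
    print(speed_and_heading((0.0, 0.0), 0.0, (3.0, 4.0), 1.0))  # (5.0, ~53.1 degrees)
```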
Regardless of the type of vehicle sensor, the range of the vehicle sensor is sufficient such that the object 16 may be detected and the autonomous vehicle 10 may respond so as to efficiently and safely navigate about the object prior to reaching the object. In some example embodiments, the autonomous vehicle includes a plurality of sensors, such as a plurality of different types of sensors and/or a plurality of the same type of sensor having different ranges. In this regard, the control subsystem 20 may be configured to use two or more types of sensor data to determine whether an object is detected (e.g., by combining or fusing camera images, LiDAR data, and radar data as described below with respect to
As shown in block 32 of
However, in an instance in which the object 16 is unknown, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for determining the size of the object. See block 36 of
The control subsystem 20 also includes means, such as the processing circuitry 22 or the like, for defining a motion plan for the autonomous vehicle 10 depending upon the size of the object 16 and the location coordinates of the object relative to the autonomous vehicle. See block 38. As such, the motion plan that is defined may be dependent upon the size of the object with different motion plans being defined for differently sized objects. For example, in an instance in which the size of the object is determined to be that of a larger object or a smaller object, different motion plans may be defined for larger objects than for smaller objects. In another example embodiment in which the size of the object causes the object to be classified as a larger object, an intermediate object or a smaller object, different motion plans may be defined for larger objects, intermediate objects and smaller objects. By way of example of the motion plans that are defined based at least partly upon the size of the object, the control subsystem, such as the processing circuitry, may be configured to define a motion plan of the autonomous vehicle such that a larger distance is maintained between the autonomous vehicle and the object in an instance in which the object is classified as a larger object than in an instance in which the object is classified as a smaller object.
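As a simple illustration of how the size classification may influence the motion plan, a required lateral clearance may be keyed to the size class as sketched below; the meter values are hypothetical assumptions and not parameters of this disclosure.

```python
# Illustrative sketch: required lateral clearance as a function of the size
# class, so that a larger clearance is kept for a larger object.  The meter
# values are hypothetical assumptions, not values taken from this disclosure.

CLEARANCE_BY_CLASS_M = {
    "smaller": 0.5,        # a sufficiently small object may even be straddled
    "intermediate": 1.5,
    "larger": 3.5,         # roughly a full empty lane
}

def required_lateral_clearance_m(size_class: str) -> float:
    return CLEARANCE_BY_CLASS_M[size_class]
```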
As shown in block 40 of
By way of further illustration, the response of a control subsystem 20 and, in turn, the autonomous vehicle 10 directed by the control subsystem to the detection of a larger object will be described below with respect to
As noted above, the motion plan that is defined for the autonomous vehicle 10 depends not only upon the size of the object 16, but also the location coordinates of the object relative to the autonomous vehicle 10. In the illustrated embodiment in which a larger object has been identified, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for determining the location of the object relative to the autonomous vehicle, including the distance therebetween, based upon the respective location coordinates of the autonomous vehicle and the object. In this example embodiment, the control subsystem, such as the processing circuitry, determines whether the larger object is in a driving lane 14 of the carriageway 12 along which the autonomous vehicle is traveling, such as the same driving lane of the carriageway in which the autonomous vehicle is traveling or in a different driving lane of the carriageway. See block 52 of
In an instance in which the larger object 16 that is located within the driving lane 14 of the carriageway 12 is determined to not be within the predefined distance of a stopped vehicle, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for determining whether the autonomous vehicle is able to safely change lanes, if necessary, so as to have a lateral gap of at least one empty lane between the autonomous vehicle 10 and the object at the time at which the autonomous vehicle passes most closely to the object. See block 56. As mentioned herein, the control subsystem, such as the processing circuitry, is configured to determine the gap, if any, between the object and the autonomous vehicle when the autonomous vehicle passes most closely to the object based upon the relative locations of the object and the autonomous vehicle, the direction of movement of the object and the autonomous vehicle and the speed with which both the object and the autonomous vehicle are moving.
The control subsystem 20, such as the processing circuitry 22, may be configured to determine whether the autonomous vehicle 10 is able to change lanes 14 by evaluating the traffic conditions in the proximity of the autonomous vehicle. In this regard, the control subsystem, such as the processing circuitry, may be configured to evaluate information regarding the location of other vehicles or objects that have been detected relative to the autonomous vehicle and, after taking into account the relative speeds of the autonomous vehicle and the other vehicles in the proximity of the autonomous vehicle, to determine whether the autonomous vehicle can safely change lanes if necessary to ensure that at least one lane is vacant between the autonomous vehicle and the object 16 that has been detected at the time that the autonomous vehicle passes the object most closely.
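For illustration only, the gap at the closest approach and a coarse lane-change feasibility check might be approximated as in the following sketch, which assumes constant-velocity motion over a short horizon; the names and the minimum-gap threshold are hypothetical and not parameters of this disclosure.

```python
# Hedged sketch of a closest-approach check between the autonomous vehicle and
# another object, both modeled with constant velocity.  The constant-velocity
# assumption and all names are illustrative simplifications.
import math

def closest_approach_gap(p_av, v_av, p_obj, v_obj, horizon_s=10.0):
    """Minimum separation (m) within the horizon, given positions (x, y) in
    meters and velocities (vx, vy) in m/s."""
    rx, ry = p_obj[0] - p_av[0], p_obj[1] - p_av[1]   # relative position
    vx, vy = v_obj[0] - v_av[0], v_obj[1] - v_av[1]   # relative velocity
    v2 = vx * vx + vy * vy
    # time of closest approach, clamped to [0, horizon]
    t_star = 0.0 if v2 == 0 else max(0.0, min(horizon_s, -(rx * vx + ry * vy) / v2))
    return math.hypot(rx + vx * t_star, ry + vy * t_star)

def lane_change_is_safe(gaps_m, min_gap_m=30.0):
    """Very coarse check: every vehicle in the target lane keeps a minimum gap."""
    return all(g >= min_gap_m for g in gaps_m)
```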
In an instance in which the control subsystem 20, such as the processing circuitry 22, determines that the autonomous vehicle 10 is able to safely change lanes 14, the control subsystem includes means, such as the processing circuitry or the like, for defining a motion plan for the autonomous vehicle that causes the autonomous vehicle to change lanes, if necessary to have the lateral gap of at least one empty lane between the autonomous vehicle and the object 16 at the time in which the autonomous vehicle most closely passes the object. See block 58 of
Referring now to block 70 of
However, in an instance in which the autonomous vehicle 10 is determined not to be able to safely change lanes 14, such as due to other vehicles driving near the autonomous vehicle in the lane to which the autonomous vehicle would otherwise transition, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for determining whether the autonomous vehicle is able to pass the object 16 while remaining within the same lane with at least a minimum lateral distance between the autonomous vehicle and the object at the time at which the autonomous vehicle most closely passes the object. See block 74 of
In an instance in which the larger object 16 is not detected to be within a driving lane 14 of the carriageway 12, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for determining whether the larger object is located in a non-driving area of the carriageway, such as the shoulder of the road, a gore point or the like. See block 62 of
In an instance in which the autonomous vehicle 10 was determined to not be able to change lanes 14 to pass the object 16 with a predefined lateral distance, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for determining whether the autonomous vehicle is able to pass the object while remaining within the same lane with at least the minimum lateral distance being maintained relative to the object at the time that the autonomous vehicle is closest to the object. See block 74. In an instance in which the autonomous vehicle is determined to be able to pass the object with a gap of at least the minimum lateral distance, the control subsystem includes means, such as the processing circuitry or the like, for defining the motion plan to cause the autonomous vehicle to slow and to bias away from the object so as to pass the object with a gap of at least the minimum lateral distance. See block 76. However, if the autonomous vehicle is determined to not be able to pass the object with at least the minimum lateral distance, the control subsystem includes means, such as the processing circuitry or the like, for defining a motion plan to cause the autonomous vehicle to stop prior to reaching the object. See block 78.
In an instance in which the larger object 16 has been determined to not be within a driving lane 14 of the carriageway 12 and to not be in a non-driving area of the carriageway, such as an instance in which the larger object is off of the carriageway, but near the carriageway, such as within a predefined distance, e.g., 6 feet, of the carriageway, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for determining whether the autonomous vehicle 10 is able to maintain a minimum lateral distance with respect to the object that is off of the carriageway at the point at which the autonomous vehicle most closely passes the object. See block 64 of
As such, in an instance in which a larger object 16 is detected, the control subsystem 20 and an associated method and computer program product may be configured to define a variety of different motion plans for the autonomous vehicle 10 based not only on the object being classified as a larger object as a result of the size of the object, but also based upon the location coordinates of the object relative to the autonomous vehicle. In each of the different scenarios, however, the motion plan that is defined by the control subsystem provides for the efficient and safe navigation of the autonomous vehicle relative to the object that has been detected, thereby potentially creating less disruption to the traffic flow about the object.
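A simplified, non-limiting sketch of the larger-object decision flow described above is provided below; the condition names, thresholds and plan labels are assumptions that approximate, rather than reproduce, the flow of the figures.

```python
# Illustrative decision sketch for a detected larger object, loosely following
# the flow described above.  All field names and the plan labels are
# assumptions for readability, not the disclosure's exact parameters.
from dataclasses import dataclass

@dataclass
class Situation:
    in_driving_lane: bool
    near_stopped_vehicle: bool      # within a predefined distance of a stopped vehicle
    in_non_driving_area: bool       # e.g., shoulder of the road or a gore point
    can_change_lanes_safely: bool   # an empty lane can be kept beside the object
    can_keep_min_lateral_gap: bool  # minimum gap achievable while staying in lane

def plan_for_larger_object(s: Situation) -> str:
    if s.in_driving_lane:
        if s.near_stopped_vehicle:
            return "stop"                      # treat as a possible incident scene
        if s.can_change_lanes_safely:
            return "change_lanes"              # keep at least one empty lane
        if s.can_keep_min_lateral_gap:
            return "slow_and_bias_away"
        return "stop"
    if s.in_non_driving_area or s.can_keep_min_lateral_gap:
        return "slow_and_bias_away"
    return "stop"
```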
In an instance in which a smaller object 16 has been detected as shown in block 80 of
In an instance in which the autonomous vehicle 10 is able to safely change lanes 14, if necessary, in order to pass the object 16 with at least a predefined lateral distance therebetween, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for defining a motion plan that causes the autonomous vehicle to change lanes, if necessary, in a direction away from the object so as to allow the autonomous vehicle to pass the object with a gap of at least the predefined lateral distance. See block 88 of
In an instance in which the autonomous vehicle 10 is determined to be able to pass the object 16 with at least the minimum lateral distance, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for defining a motion plan that causes the autonomous vehicle to slow and to bias away from the object, while remaining within the same lane 14 in which the autonomous vehicle is traveling, so as to increase the gap between the autonomous vehicle and the object, thereby allowing the autonomous vehicle to pass the object with a gap of at least the minimum lateral distance. See block 92 of
As such, in an instance in which a smaller object 16 is identified, the control subsystem 20 as well as the associated method and computer program product also define a motion plan based not only upon the size of the object and the classification of the object as a smaller object, but also upon the location coordinates of the object relative to the autonomous vehicle 10. As such, a motion plan is defined to allow the autonomous vehicle to respond in an efficient and safe manner to the detection of the smaller object. However, as the foregoing examples illustrate, the motion plans that are defined are different for a larger object than for a smaller object, thereby tailoring the response of the autonomous vehicle based upon the size of the unknown object.
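A companion sketch of the smaller-object response, again with hypothetical condition names and plan labels, may take the following form; it is an approximation of the flow described above and not the claimed logic itself.

```python
# Companion sketch for a detected smaller object; the labels and conditions
# are illustrative assumptions, not the exact claimed logic.
def plan_for_smaller_object(can_change_lanes: bool,
                            can_keep_min_lateral_gap: bool,
                            can_straddle: bool) -> str:
    if can_change_lanes:
        return "change_lanes_away_from_object"
    if can_keep_min_lateral_gap:
        return "slow_and_bias_away"
    if can_straddle:
        return "straddle_object"   # pass over a sufficiently small object
    return "stop"
```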
As described above, other embodiments determine the size of the object 16 in other manners, such as by classifying the object not as a larger object or a smaller object, but as a larger object, an intermediate object or a smaller object. In this example embodiment, the control subsystem 20, method and computer program product may be configured to respond to the detection of a larger object as described above in conjunction with
Provision of an Alert and/or Updating of a Map
In some embodiments, in addition to defining a motion plan for the autonomous vehicle 10 in response to the detection of an object 16, the control subsystem 20 includes means, such as the processing circuitry 22, the communication interface 26 or the like, for communicating with another entity to alert the other entity of the presence of the object. For example, in an embodiment in which the control subsystem is carried by the autonomous vehicle, the control subsystem, such as the communication interface, may be configured to cause a signal to be transmitted to an operation server, e.g., an oversight system, a control center, etc., and/or to the communication system(s) of law enforcement and/or traffic safety personnel to alert the other entities as to the presence of the object. In addition to merely providing an alert as to the object, information regarding the location of the object and/or the size of the object may be provided. As such, the notification of law enforcement or safety personnel may also allow the law enforcement or safety personnel to be dispatched to address the object, such as by ensuring removal of the object from the carriageway.
In response to an alert, an operation server may alert other autonomous vehicles traveling along the carriageway 12 in the same direction as the autonomous vehicle 10 that detected the object 16. These other autonomous vehicles generally trail the autonomous vehicle that detected the object by some distance and, as a result, will potentially encounter the same object at a later point in time. In this regard, the operation server may be configured to redefine the motion plans for the other autonomous vehicles so as to allow the other autonomous vehicles not only to be alert to the detection of the object, but also to position the other autonomous vehicles in advance of reaching the object so as to be further away from the object, thereby reducing or eliminating the evasive maneuvers to be taken by the other autonomous vehicles upon detection of the object.
Although such an alert may be provided in response to the detection of a single object 16, the control subsystem 20, such as the processing circuitry 22, of an example embodiment is configured to identify an instance in which a predefined number of objects 16 are detected within a predefined distance or a predefined area and to issue an alert as described above in an instance in which at least the predefined number of objects are identified within the predefined distance or the predefined area, such as in the instance in which a debris field is identified on the carriageway 12.
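By way of illustration of the debris-field test described above, an alert may be raised when at least a predefined number of detections fall within a predefined span of the carriageway, as in the following sketch; the threshold values and names are hypothetical.

```python
# Hedged sketch of the debris-field test described above: raise an alert when
# at least a predefined number of detections fall within a predefined span of
# carriageway.  The threshold values are hypothetical assumptions.
def debris_field_detected(object_positions_m, window_m=50.0, min_count=3):
    """object_positions_m: longitudinal positions of detections along the road."""
    xs = sorted(object_positions_m)
    for i, start in enumerate(xs):
        # count detections inside the sliding window beginning at this object
        count = sum(1 for x in xs[i:] if x - start <= window_m)
        if count >= min_count:
            return True
    return False

if __name__ == "__main__":
    print(debris_field_detected([100.0, 112.0, 130.0, 400.0]))  # True
```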
The detection of the object 16 may also facilitate updating of a map that includes a representation of the carriageway 12. In this regard, the autonomous vehicle 10 may maintain a map of the roadways including the carriageway along which the autonomous vehicle is traveling. The map may be stored, for example, in memory 24 and accessible to the processing circuitry 22, such as in conjunction with navigation of the autonomous vehicle. In response to the detection of an object, the control subsystem may include means, such as the processing circuitry or the like, for updating the map to indicate the location of the object that has been detected and, in some embodiments, the size of the object. Once updated, the control subsystem, such as the processing circuitry, the communication interface 26 or the like, may be configured to share a copy of the map, such as with the operation server and/or other nearby autonomous vehicles, thereby providing an update as to the detected object.
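For illustration only, annotating a locally stored map with a detected object and sharing the update might be sketched as follows; the record layout and the serialization step are assumptions, not a description of any particular map format or communication protocol.

```python
# Minimal sketch of annotating a locally stored map with a detected object and
# sharing the update, as described above.  The record layout and the sharing
# step are illustrative assumptions only.
import json
import time

def annotate_map_with_object(map_store: dict, lat: float, lon: float,
                             size_class: str) -> dict:
    record = {
        "lat": lat,
        "lon": lon,
        "size_class": size_class,
        "detected_at": time.time(),
    }
    map_store.setdefault("detected_objects", []).append(record)
    return record

def share_map_update(record: dict) -> str:
    # In practice this would be sent via the communication interface to an
    # operation server or nearby vehicles; here the record is just serialized.
    return json.dumps(record)
```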
In other embodiments, a map of the roadways including the carriageway 12 along which the autonomous vehicle 10 is traveling is maintained off board the autonomous vehicle, such as by the operation server. In these embodiments, the control subsystem 20 includes means, such as the processing circuitry 22 or the like, for providing information, such as via the communication interface 26, in response to detecting the object. The information may include the location coordinates of the object and, in some embodiments, information regarding the size of the object. The operation server of this example embodiment may be configured to similarly update the map to include information regarding the object that has been detected and, in some embodiments, to then provide the updated map data to other autonomous vehicles in the vicinity of the object in order to further alert the other autonomous vehicles of the object that has been detected.
The autonomous vehicle 102 may include various vehicle subsystems that support the operation of the autonomous vehicle. The vehicle subsystems may include the control subsystem 20, a vehicle drive subsystem 110, a vehicle sensor subsystem 112, and/or a vehicle control subsystem 114. The components or devices of the vehicle drive subsystem, the vehicle sensor subsystem, and the vehicle control subsystem shown in
The vehicle sensor subsystem 112 may include a number of sensors 116 configured to sense information about an environment or condition of the autonomous vehicle 102. The vehicle sensor subsystem may include one or more cameras 116a or image capture devices, a radar unit 116b, one or more temperature sensors 116c, a wireless communication unit 116d (e.g., a cellular communication transceiver), an inertial measurement unit (IMU) 116e, a laser range finder/LiDAR unit 116f, a Global Positioning System (GPS) transceiver 116g, and/or a wiper control system 116h. The vehicle sensor subsystem may also include sensors configured to monitor internal systems of the autonomous vehicle (e.g., an O2 monitor, a fuel gauge, an engine oil temperature, etc.).
The IMU 116e may include any combination of sensors (e.g., accelerometers and gyroscopes) configured to sense position and orientation changes of the autonomous vehicle 102 based on inertial acceleration. The GPS transceiver 116g may be any sensor configured to estimate a geographic location of the autonomous vehicle. For this purpose, the GPS transceiver may include a receiver/transmitter operable to provide information regarding the position of the autonomous vehicle with respect to the Earth. The radar unit 116b may represent a system that utilizes radio signals to sense objects within the local environment of the autonomous vehicle. In some embodiments, in addition to sensing the objects, the radar unit may additionally be configured to sense the speed and the heading of the objects proximate to the autonomous vehicle. The laser range finder or LiDAR unit 116f may be any sensor configured to sense objects in the environment in which the autonomous vehicle is located using lasers. The cameras 116a may include one or more devices configured to capture a plurality of images of the environment of the autonomous vehicle. The cameras may be still image cameras or motion video cameras.
The vehicle control subsystem 114 may be configured to control the operation of the autonomous vehicle 102 and its components. Accordingly, the vehicle control subsystem may include various elements such as a throttle and gear 114a, a brake unit 114b, a navigation unit 114c, a steering system 114d, and/or an autonomous control unit 114e. The throttle may be configured to control, for instance, the operating speed of the engine and, in turn, control the speed of the autonomous vehicle. The gear may be configured to control the gear selection of the transmission. The brake unit can include any combination of mechanisms configured to decelerate the autonomous vehicle. The brake unit can use friction to slow the wheels in a standard manner. The brake unit may include an anti-lock braking system (ABS) that can prevent the brakes from locking up when the brakes are applied. The navigation unit may be any system configured to determine a driving path or route for the autonomous vehicle. The navigation unit may additionally be configured to update the driving path dynamically while the autonomous vehicle is in operation. In some embodiments, the navigation unit may be configured to incorporate data from the GPS transceiver 116g and one or more predetermined maps so as to determine the driving path for the autonomous vehicle. The steering system may represent any combination of mechanisms that may be operable to adjust the heading of the autonomous vehicle in an autonomous mode or in a driver-controlled mode.
The autonomous control unit 114e may represent a control system configured to identify, evaluate, and avoid or otherwise negotiate potential obstacles or obstructions in the environment of the autonomous vehicle 102. In general, the autonomous control unit may be configured to control the autonomous vehicle for operation without a driver or to provide driver assistance in controlling the autonomous vehicle. In some embodiments, the autonomous control unit may be configured to incorporate data from the GPS transceiver 116g, the radar 116b, the LiDAR unit 116f, the cameras 116a, and/or other vehicle subsystems to determine the driving path or trajectory for the autonomous vehicle.
Many or all of the functions of the autonomous vehicle 102 can be controlled by the in-vehicle control computer 104. The in-vehicle control computer may include at least one data processor 118 (which can include at least one microprocessor) that executes processing instructions 120 stored in a non-transitory computer readable medium, such as the data storage device 122 or memory. The in-vehicle control computer may also represent a plurality of computing devices that may serve to control individual components or subsystems of the autonomous vehicle in a distributed fashion. In some embodiments, the data storage device may contain processing instructions (e.g., program logic) executable by the data processor to perform various methods and/or functions of the autonomous vehicle, including those described with respect to
The data storage device 122 may contain additional instructions as well, including instructions to transmit data to, receive data from, interact with, or control one or more of the vehicle drive subsystem 110, the vehicle sensor subsystem 112, and the vehicle control subsystem 114. The in-vehicle control computer 104 can be configured to include a data processor 118 and a data storage device 122. The in-vehicle control computer may control the function of the autonomous vehicle 102 based on inputs received from various vehicle subsystems (e.g., the vehicle drive subsystem, the vehicle sensor subsystem, and the vehicle control subsystem).
The sensor fusion module 132 can perform instance segmentation 138 on an image and/or point cloud data item to identify an outline (e.g., boxes) around the objects and/or obstacles located around the autonomous vehicle 10. The sensor fusion module can perform temporal fusion 140 where objects and/or obstacles from one image and/or one frame of point cloud data item are correlated with or associated with objects and/or obstacles from one or more images or frames subsequently received in time.
The sensor fusion module 132 can fuse the objects and/or obstacles from the vehicle sensors, such as the images obtained from the camera and/or the point cloud data item obtained from the LiDAR sensors. For example, the sensor fusion module may determine, based on the locations of two cameras, that an image from one of the cameras comprising one half of a vehicle located in front of the autonomous vehicle 10 depicts the same vehicle as that captured by another camera. The sensor fusion module may send the fused object information to the inference module 142 and the fused obstacle information to the occupancy grid module 144. The in-vehicle control computer may include the occupancy grid module 144, which can retrieve landmarks from a map database 146 stored in the in-vehicle control computer. The occupancy grid module 144 can determine drivable areas and/or obstacles from the fused obstacles obtained from the sensor fusion module 132 and the landmarks stored in the map database 146. For example, the occupancy grid module 144 can determine that a drivable area may include a speed bump obstacle.
Below the sensor fusion module 132, the in-vehicle control computer 104 includes a LiDAR based object detection module 148 that can perform object detection 150 based on point cloud data item obtained from the LiDAR sensors 152 located on the autonomous vehicle 10. The object detection technique can provide a location (e.g., in 3D world coordinates) of objects from the point cloud data item. Below the LiDAR based object detection module, the in-vehicle control computer includes an image based object detection module 154 that can perform object detection 156 based on images obtained from cameras 158 located on the autonomous vehicle. The object detection technique can employ a deep machine learning technique to provide a location (e.g., in 3D world coordinates) of objects from the image provided by the cameras 158.
The radar 160 on the autonomous vehicle 10 can scan an area in front of the autonomous vehicle or an area towards which the autonomous vehicle is driven. The radar data may be sent to the sensor fusion module 132 that can use the radar data to correlate the objects and/or obstacles detected by the radar 160 with the objects and/or obstacles detected from both the LiDAR point cloud data item and the camera image. The radar data also may be sent to the inference module 142 that can perform data processing on the radar data to track objects 162 as further described below.
The in-vehicle control computer 104 includes an inference module 142 that may receive the locations of the objects from the point cloud and the objects from the image, and the fused objects from the sensor fusion module 132. The inference module 142 also receives the radar data with which the inference module can track objects 162 from one point cloud data item and one image obtained at one time instance to another (or the next) point cloud data item and another image obtained at another subsequent time instance.
The inference module 142 may perform object attribute estimation 164 to estimate one or more attributes of an object detected in an image or point cloud data item. The one or more attributes of the object may include a type of object (e.g., pedestrian, car, or truck, etc.). The inference module may perform behavior prediction 166 to estimate or predict a motion pattern of an object detected in an image and/or a point cloud. The behavior prediction can be performed to detect a location of an object in a set of images received at different points in time (e.g., sequential images) or in a set of point cloud data items received at different points in time (e.g., sequential point cloud data items). In some embodiments, the behavior prediction can be performed for each image received from a camera 158 and/or each point cloud data item received from the LiDAR sensor 152. In some embodiments, the inference module 142 can reduce its computational load by performing behavior prediction on every other image received from a camera 158 or point cloud data item received from the LiDAR sensor 152, or after every pre-determined number of images or point cloud data items (e.g., after every two images or after every three point cloud data items).
The behavior prediction 166 feature may determine the speed and direction of the objects that surround the autonomous vehicle 10 based on the radar data, where the speed and direction information can be used to predict or determine motion patterns of objects. A motion pattern may include predicted trajectory information of an object over a pre-determined length of time in the future after an image is received from a camera 158. Based on the motion pattern predicted, the inference module 142 may assign motion pattern situational tags to the objects (e.g., “located at coordinates (x,y),” “stopped,” “driving at 50 mph,” “speeding up” or “slowing down”). The situational tags can describe the motion pattern of the object. The inference module may send the one or more object attributes (e.g., types of the objects) and motion pattern situational tags to a planning module 170. The inference module 142 may perform an environment analysis 168 using any information acquired by the system 130 and any number and combination of its components.
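As a non-limiting illustration, motion pattern situational tags of the kind exemplified above might be assigned from an object's estimated speed and acceleration as in the following sketch; the thresholds and names are hypothetical assumptions.

```python
# Illustrative sketch of assigning motion-pattern situational tags from an
# object's estimated speed and acceleration, echoing the examples above.
# The thresholds are hypothetical assumptions.
def situational_tags(x: float, y: float, speed_mph: float,
                     accel_mph_s: float) -> list:
    tags = [f"located at coordinates ({x:.1f},{y:.1f})"]
    if speed_mph < 0.5:
        tags.append("stopped")
    else:
        tags.append(f"driving at {speed_mph:.0f} mph")
    if accel_mph_s > 0.5:
        tags.append("speeding up")
    elif accel_mph_s < -0.5:
        tags.append("slowing down")
    return tags

if __name__ == "__main__":
    print(situational_tags(12.3, -4.1, 50.0, -1.0))
```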
The in-vehicle control computer 104 includes the planning module 170 that may receive the object attributes and motion pattern situational tags from the inference module 142, the drivable area and/or obstacles, and the vehicle location and pose information from the fused localization module 172 (further described below).
The planning module 170 can perform navigation planning 174 to determine a set of trajectories on which the autonomous vehicle 10 can be driven. The set of trajectories can be determined based on the drivable area information, the one or more object attributes of the objects, the motion pattern situational tags of the objects, and the location of the obstacles. In some embodiments, the navigation planning may include determining an area next to the road where the autonomous vehicle can be safely parked in case of emergencies. The planning module may include behavioral decision making 176 to determine driving actions (e.g., steering, braking, throttle) in response to determining changing conditions on the road (e.g., a traffic light turned yellow, or the autonomous vehicle is in an unsafe driving condition because another vehicle drove in front of the autonomous vehicle and in a region within a pre-determined safe distance of the location of the autonomous vehicle). The planning module may perform trajectory generation 178 and may select a trajectory from the set of trajectories determined by the navigation planning operation. The selected trajectory information may be sent by the planning module to the control module 180.
The in-vehicle control computer 104 includes a control module 180 that may receive the proposed trajectory from the planning module 170 and the autonomous vehicle location and pose from the fused localization module 172. The control module may include a system identifier 182. The control module can perform a model based trajectory refinement 184 to refine the proposed trajectory. For example, the control module 180 can apply filtering (e.g., a Kalman filter) to make the proposed trajectory data smooth and/or to minimize noise. The control module 180 may perform robust control 186 by determining, based on the refined proposed trajectory information and the current location and/or pose of the autonomous vehicle, an amount of brake pressure to apply, a steering angle, a throttle amount to control the speed of the vehicle, and/or a transmission gear. The control module can send the determined brake pressure, steering angle, throttle amount, and/or transmission gear to one or more devices in the autonomous vehicle to control and facilitate precise driving operations of the autonomous vehicle 10.
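By way of a hedged illustration of such trajectory refinement, a noisy one-dimensional trajectory signal (e.g., a sequence of lateral offset samples) might be smoothed with a simple scalar Kalman-style filter as sketched below; the noise parameters are assumptions, and this sketch is not a description of the control module's actual implementation.

```python
# Hedged sketch of smoothing a noisy 1-D trajectory signal with a simple
# scalar Kalman-style filter, in the spirit of the trajectory refinement
# described above.  The noise parameters are illustrative assumptions.
def smooth_trajectory(samples, process_var=1e-3, meas_var=1e-1):
    estimate, variance = samples[0], 1.0
    smoothed = [estimate]
    for z in samples[1:]:
        variance += process_var                  # predict: uncertainty grows
        gain = variance / (variance + meas_var)  # update: blend in measurement
        estimate += gain * (z - estimate)
        variance *= (1.0 - gain)
        smoothed.append(estimate)
    return smoothed

if __name__ == "__main__":
    print(smooth_trajectory([0.0, 0.2, 0.1, 0.4, 0.3]))
```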
The deep image-based object detection 156 performed by the image based object detection module 154 can also be used to detect landmarks (e.g., stop signs, speed bumps, etc.) on the road. The in-vehicle control computer 104 may include a fused localization module 172 that obtains the landmarks detected from the images, the landmarks obtained from a map database 188 stored on the in-vehicle control computer, the landmarks detected from the point cloud data item by the LiDAR based object detection module 148, the speed and displacement from the odometer sensor 190, and/or the estimated location of the autonomous vehicle from the GPS/IMU sensor 194 (e.g., GPS sensor 196 and IMU sensor 198) located on or in the autonomous vehicle 10. Based on this information, the fused localization module can perform a localization operation 192 to determine a location of the autonomous vehicle, which can be sent to the planning module 170 and the control module 180.
The fused localization module 172 can estimate a pose 200 of the autonomous vehicle 10 based, e.g., on the GPS and/or IMU sensors 194, the landmarks detected from the images, the landmarks obtained from the map database 188, and/or the landmarks detected from the point cloud data item by the LiDAR based object detection module 148. The pose of the autonomous vehicle can be sent to the planning module 170 and the control module 180. The fused localization module can also estimate the status (e.g., location, possible angle of movement) of the trailer unit 202 based on, for example, the information provided by the IMU sensor 198 (e.g., angular rate and/or linear velocity). The fused localization module may also check the map content 204, e.g., to identify new objects perceived by the LiDARs 152, radar 160, and/or cameras 158 but that are not stored in the landmarks map 188.
Accordingly, blocks of the flowchart support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowchart, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
As used herein, in an instance in which the control subsystem 20 is described to receive data or other information from another device, it will be appreciated that the data or other information may be received directly from the other device and/or may be received indirectly via one or more intermediary devices, such as, for example, one or more servers, relays, routers, network access points, and/or the like. Similarly, in an instance in which the control subsystem is described herein to transmit data or other information to another device, it will be appreciated that the data or other information may be sent directly to the other device or may be sent to the other device via one or more interlinking devices, such as, for example, one or more servers, relays, routers, network access points, and/or the like.
In this regard, devices, including the control subsystem 20 or components thereof, shown or discussed as coupled or directly coupled or communicating with each other may be indirectly coupled or communicating through some interface, device, or intermediate component whether electrically, mechanically, or otherwise. In addition, techniques, systems, subsystems, and methods described and illustrated in the various embodiments as discrete or separate may be combined or integrated with other systems, modules, techniques, or methods without departing from the scope of this disclosure.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
This application claims benefit of U.S. Provisional Application No. 63/322,116 filed Mar. 21, 2022, the entire contents of which are incorporated herein by reference.