Autonomous vehicles, such as vehicles that do not require a human driver, can be used to aid in the transport of passengers or items from one location to another. Such vehicles may operate in a fully autonomous mode where passengers may provide some initial input, such as a pickup or destination location, and the vehicle maneuvers itself to that location, for instance, by determining and following a route which may require the vehicle to respond to and interact with other road users such as vehicles, pedestrians, bicyclists, etc. It is critical that the autonomous control software used by these vehicles to operate in the autonomous mode is tested and validated before such software is actually used to control the vehicles in areas where the vehicles are interacting with other objects.
Aspects of the disclosure provide for a method for simulating sensor data and evaluating sensor behavior in an autonomous vehicle. The method includes receiving, by one or more processors, log data collected for an environment along a given run for a given vehicle; performing, by the one or more processors using software for autonomous driving, a simulated run of the given run using logged sensor data from the log data and environment data constructed using the log data; determining, by the one or more processors, first details regarding detection of objects during the given run using logged sensor data; running, by the one or more processors using the software for autonomous driving, a simulation of one or more detection devices on a simulated vehicle driving along the given run to obtain simulated sensor data, the simulation including the environment data constructed using the log data; determining, by the one or more processors, second details regarding detection of objects using the simulated sensor data; extracting, by the one or more processors, one or more metrics from the first details and the second details; and evaluating, by the one or more processors, the simulation based on the one or more metrics.
In one example, the method also includes selecting, by the one or more processors, the given run based on the log data. In this example, the selecting of the given run is further based on a type of object appearing along a run in the log data. In another example, the method also includes constructing, by the one or more processors, the environment data using the log data. In this example, the constructing of the environment data includes representing objects in an area encompassing the given run in a scaled mesh.
In a further example, the determining of the first details includes determining a relationship between the logged sensor data and objects represented in the environment data. In yet another example, the running of the simulation includes retracing rays transmitted from the one or more detection devices and recomputing intensities of the rays off points in the constructed environment data. In a still further example, the running of the simulation includes modeling the one or more detection devices based on configuration characteristics or operational settings of a perception system of the given vehicle. In another example, the determining of the second details includes determining a relationship between the simulated sensor data and objects represented in the environment data. In a further example, the one or more extracted metrics include a first metric related to a precision of detected object types; a second metric related to an amount of recall of an object type; and a third metric related to an average detection time.
Other aspects of the disclosure provide for a non-transitory, tangible computer-readable medium on which computer-readable instructions of a program are stored. The instructions, when executed by one or more computing devices, cause the one or more computing devices to perform a method for implementing a simulation for sensor data for an autonomous vehicle. The method includes receiving log data collected for an environment along a given run for a given vehicle; performing, using software for autonomous driving, a simulated run of the given run using logged sensor data from the log data and environment data constructed using the log data; determining first details regarding detection of objects during the given run using logged sensor data; running, using the software for autonomous driving, a simulation of one or more detection devices on a simulated vehicle driving along the given run to obtain simulated sensor data, the simulation including the environment data constructed using the log data; determining second details regarding detection of objects using the simulated sensor data; extracting one or more metrics from the first details and the second details; and evaluating the simulation based on the one or more metrics.
In one example, the method also includes selecting the given run based on the log data. In this example, the selecting of the given run is further based on a type of object appearing along a run in the log data. In another example, the method also includes constructing the environment data using the log data. In this example, the constructing of the environment data includes representing objects in an area encompassing the given run in a scaled mesh.
In a further example, the determining of the first details includes determining a relationship between the logged sensor data and objects represented in the environment data. In yet another example, the running of the simulation includes retracing rays transmitted from the one or more detection devices and recomputing intensities of the rays off points in the constructed environment data. In a still further example, the running of the simulation includes modeling the one or more detection devices based on configuration characteristics or operational settings of a perception system of the given vehicle. In another example, the determining of the second details includes determining a relationship between the simulated sensor data and objects represented in the environment data. In a further example, the one or more extracted metrics include a first metric related to a precision of detected object types; a second metric related to an amount of recall of an object type; and a third metric related to an average detection time.
The technology relates to using simulations to model sensor behavior in an autonomous vehicle. In particular, the sensor behavior may be evaluated to determine effectiveness of a perception system of the autonomous vehicle. A simulated run may be performed using data collected in a run of the autonomous vehicle. Metrics may be extracted from the simulated run, which can indicate how one or more sensors behaved relative to certain types of objects or relative to previous simulations.
An autonomous vehicle may be maneuvered by one or more processors using software. The autonomous vehicle may also have a perception system configured to detect data related to objects in the vehicle's environment. A simulation system may be configured to run the software through different scenarios based at least in part on log data of the vehicle.
To model sensor behavior and evaluate it for realism, the simulation system may be configured to compare sensor data from log data for a given run and simulated sensor data from a simulation of the given run. The comparison may be based on resulting perception objects from perception logic that processes the sensor data and the simulated sensor data. The perception logic may be a portion of the software of the autonomous vehicle. The data and/or the resulting perception objects may be compared and evaluated using one or more metrics.
Modeling sensor behavior includes selecting a given run based on sensor data collected by a vehicle using a perception system. A time frame of about twenty seconds from the run in the log data may be selected for the given run. The one or more processors may construct environment data for a simulation using the log data. The constructed environment data may include a scaled mesh representing objects in the environment. The scaled mesh may include points from LIDAR data in the log data. The one or more processors may run the logged sensor data of the given run using the perception logic to determine details regarding detection of objects during the given run. The logged sensor data may be run in the constructed environment data to establish the relationship between the logged sensor data and objects represented in the environment. To obtain simulated sensor data, the one or more processors may run a simulation using one or more simulated detection devices of a perception system and the constructed environment data. The one or more simulated detection devices may be based on configuration characteristics or operational settings of the perception system of the vehicle during the given run. The simulation may build an environment for the given run using the constructed environment data and perform the given run using the one or more simulated detection devices on the vehicle moving through the environment. The one or more processors may then determine details regarding detection of objects in the simulated sensor data using the perception logic in a same or similar manner as described above for the logged sensor data.
The one or more processors may extract one or more metrics from the details of the logged sensor data and the details of the simulated sensor data. The one or more metrics may be measurements of how similar the simulated sensor data is to the logged sensor data. The more similar the simulated sensor data is to the logged sensor data, the more realistic the simulation is. Additionally or alternatively, the one or more metrics may compare the characteristics of a detected object in the simulation or the determined details with labels or other input by human reviewers of the logged sensor data or the constructed environment data. Based on the one or more metrics, the one or more processors may evaluate how the simulation performed. The evaluation may be for the simulated sensor or for the constructed environment. For example, the evaluation may be for realism, or how well the simulation matches what occurs on the vehicle. When the one or more metrics indicate that the simulated sensor data matches or nearly matches the logged sensor data, the simulation software may be utilized in future simulations for the vehicle.

The technology described herein allows for evaluation of sensor simulation that can be used to build simulation software for running future simulations. The evaluation techniques increase confidence in the sensor simulation, which results in increased confidence in simulating other aspects of autonomous vehicle navigation or developing improvements to autonomous vehicle navigation on the basis of the simulated sensors. Using the sensor simulation software validated in the manner described herein may result in a more realistic future simulation of autonomous vehicle navigation. More log data may be simulated rather than collected over many runs and many hours of driving on a roadway. More accurate tests of autonomous vehicle software may be performed in simulation, which may be more efficient and safer than running tests on a roadway. The autonomous vehicle software may be continually improved using the simulation technology.
As shown in FIG. 1, a vehicle in accordance with one aspect of the disclosure, such as vehicle 100, may include one or more computing devices 110 having one or more processors 120, memory 130, and other components typically present in general purpose computing devices.
The memory 130 stores information accessible by the one or more processors 120, including instructions 134 and data 132 that may be executed or otherwise used by the processor 120. The memory 130 may be of any type capable of storing information accessible by the processor, including a computing device-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories. Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
The instructions 134 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computing device code on the computing device-readable medium. In that regard, the terms “software,” “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.
The data 132 may be retrieved, stored or modified by processor 120 in accordance with the instructions 134. For instance, although the claimed subject matter is not limited by any particular data structure, the data may be stored in computing device registers, in a relational database as a table having a plurality of different fields and records, XML documents or flat files. The data may also be formatted in any computing device-readable format.
The one or more processors 120 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor. Although FIG. 1 functionally illustrates the processor, memory, and other elements of computing devices 110 as being within the same block, the processor, computing device, or memory may actually include multiple processors, computing devices, or memories that may or may not be stored within the same physical housing.
Computing devices 110 may have all of the components normally used in connection with a computing device such as the processor and memory described above as well as a user input 150 (e.g., a mouse, keyboard, touch screen and/or microphone) and various electronic displays (e.g., a monitor having a screen or any other electrical device that is operable to display information). In this example, the vehicle includes an internal electronic display 152 as well as one or more speakers 154 to provide information or audio-visual experiences. In this regard, internal electronic display 152 may be located within a cabin of vehicle 100 and may be used by computing devices 110 to provide information to passengers within the vehicle 100.
Computing devices 110 may also include one or more wireless network connections 156 to facilitate communication with other computing devices, such as the client computing devices and server computing devices described in detail below. The wireless network connections may include short range communication protocols such as Bluetooth, Bluetooth low energy (LE), cellular connections, as well as various configurations and protocols including the Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing.
In one example, computing devices 110 may be control computing devices of an autonomous driving computing system or incorporated into vehicle 100. The autonomous driving computing system may be capable of communicating with various components of the vehicle in order to control the movement of vehicle 100 according to the autonomous control software of memory 130 as discussed further below. For example, returning to FIG. 1, computing devices 110 may be in communication with various systems of vehicle 100, such as deceleration system 160, acceleration system 162, steering system 164, signaling system 166, routing system 168, positioning system 170, and perception system 172, in order to control the movement, speed, and direction of vehicle 100 in accordance with the instructions 134 of memory 130.
As an example, computing devices 110 may interact with one or more actuators of the deceleration system 160 and/or acceleration system 162, such as brakes, accelerator pedal, and/or the engine or motor of the vehicle, in order to control the speed of the vehicle. Similarly, one or more actuators of the steering system 164, such as a steering wheel, steering shaft, and/or pinion and rack in a rack and pinion system, may be used by computing devices 110 in order to control the direction of vehicle 100. For example, if vehicle 100 is configured for use on a road, such as a car or truck, the steering system may include one or more actuators to control the angle of wheels to turn the vehicle. Signaling system 166 may be used by computing devices 110 in order to signal the vehicle's intent to other drivers or vehicles, for example, by lighting turn signals or brake lights when needed.
Routing system 168 may be used by computing devices 110 in order to determine and follow a route to a location. In this regard, the routing system 168 and/or data 132 may store detailed map information, e.g., highly detailed maps identifying the shape and elevation of roadways, lane lines, intersections, crosswalks, speed limits, traffic signals, buildings, signs, real time traffic information, vegetation, or other such objects and information.
Positioning system 170 may be used by computing devices 110 in order to determine the vehicle's relative or absolute position on a map or on the earth. For example, the positioning system 170 may include a GPS receiver to determine the device's latitude, longitude and/or altitude position. Other location systems such as laser-based localization systems, inertial-aided GPS, or camera-based localization may also be used to identify the location of the vehicle. The location of the vehicle may include an absolute geographical location, such as latitude, longitude, and altitude, as well as relative location information, such as location relative to other cars immediately around it, which can often be determined with less noise than the absolute geographical location.
The positioning system 170 may also include other devices in communication with computing devices 110, such as an accelerometer, gyroscope or another direction/speed detection device to determine the direction and speed of the vehicle or changes thereto. By way of example only, an acceleration device may determine its pitch, yaw or roll (or changes thereto) relative to the direction of gravity or a plane perpendicular thereto. The device may also track increases or decreases in speed and the direction of such changes. The device may provide location and orientation data as set forth herein automatically to the computing devices 110, other computing devices, and combinations of the foregoing.
The perception system 172 also includes one or more components for detecting objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc. For example, the perception system 172 may include lasers, sonar, radar, cameras and/or any other detection devices that record data which may be processed by computing device 110. In the case where the vehicle is a passenger vehicle such as a minivan, the minivan may include a laser or other sensors mounted on the roof or other convenient location.
The computing devices 110 may control the direction and speed of the vehicle by controlling various components. By way of example, computing devices 110 may navigate the vehicle to a destination location completely autonomously using data from the detailed map information and routing system 168. Computing devices 110 may use the positioning system 170 to determine the vehicle's location and perception system 172 to detect and respond to objects when needed to reach the location safely. In order to do so, computing devices 110 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine by acceleration system 162), decelerate (e.g., by decreasing the fuel supplied to the engine, changing gears, and/or by applying brakes by deceleration system 160), change direction (e.g., by turning the front or rear wheels of vehicle 100 by steering system 164), and signal such changes (e.g., by lighting turn signals of signaling system 166). Thus, the acceleration system 162 and deceleration system 160 may be a part of a drivetrain that includes various components between an engine of the vehicle and the wheels of the vehicle. Again, by controlling these systems, computing devices 110 may also control the drivetrain of the vehicle in order to maneuver the vehicle autonomously.
Computing device 110 of vehicle 100 may also receive or transfer information to and from other computing devices, such as those computing devices that are a part of the transportation service as well as other computing devices.
As shown in FIG. 4, a system may include a plurality of computing devices 410, 420, 430, 440 and a storage system 450 connected via a network 460. The system may also include vehicle 100 and vehicle 100A, which may be configured the same as or similarly to vehicle 100.
The network 460, and intervening nodes, may include various configurations and protocols including short range communication protocols such as Bluetooth, Bluetooth LE, the Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing. Such communication may be facilitated by any device capable of transmitting data to and from other computing devices, such as modems and wireless interfaces.
In one example, one or more computing devices 410 may include one or more server computing devices having a plurality of computing devices, e.g., a load balanced server farm, that exchange information with different nodes of a network for the purpose of receiving, processing and transmitting the data to and from other computing devices. For instance, one or more computing devices 410 may include one or more server computing devices that are capable of communicating with computing device 110 of vehicle 100 or a similar computing device of vehicle 100A as well as computing devices 420, 430, 440 via the network 460. For example, vehicles 100, 100A, may be a part of a fleet of vehicles that can be dispatched by server computing devices to various locations. In this regard, the server computing devices 410 may function as a simulation system which can be used to validate autonomous control software which vehicles such as vehicle 100 and vehicle 100A may use to operate in an autonomous driving mode. The simulation system may additionally or alternatively be used to run simulations for the autonomous control software as further described below. In addition, server computing devices 410 may use network 460 to transmit and present information to a user, such as user 422, 432, 442 on a display, such as displays 424, 434, 444 of computing devices 420, 430, 440. In this regard, computing devices 420, 430, 440 may be considered client computing devices.
As shown in FIG. 4, each client computing device 420, 430, 440 may be a personal computing device intended for use by a user 422, 432, 442, and have all of the components normally used in connection with a personal computing device, including one or more processors, memory storing data and instructions, a display such as displays 424, 434, 444, and user input devices (e.g., a mouse, keyboard, touch screen or microphone).
Although the client computing devices 420, 430, and 440 may each comprise a full-sized personal computing device, they may alternatively comprise mobile computing devices capable of wirelessly exchanging data with a server over a network such as the Internet. By way of example only, client computing device 420 may be a mobile phone or a device such as a wireless-enabled PDA, a tablet PC, a wearable computing device or system, or a netbook that is capable of obtaining information via the Internet or other networks. In another example, client computing device 430 may be a wearable computing system, shown as a wristwatch in FIG. 4.
In some examples, client computing device 440 may be an operations workstation used by an administrator or operator to review simulation outcomes, handover times, and validation information. Although only a single operations workstation 440 is shown in FIG. 4, any number of such workstations may be included in a typical system.
As with memory 130, storage system 450 can be of any type of computerized storage capable of storing information accessible by the server computing devices 410, such as a hard-drive, memory card, ROM, RAM, DVD, CD-ROM, write-capable, and read-only memories. In addition, storage system 450 may include a distributed storage system where data is stored on a plurality of different storage devices which may be physically located at the same or different geographic locations. Storage system 450 may be connected to the computing devices via the network 460 as shown in FIG. 4.
Storage system 450 may store various types of information as described in more detail below. This information may be retrieved or otherwise accessed by a server computing device, such as one or more server computing devices 410, in order to perform some or all of the features described herein. For instance, storage system 450 may store log data. This log data may include, for instance, sensor data generated by a perception system, such as perception system 172 of vehicle 100 as the vehicle is being driven autonomously or manually. Additionally or alternatively, the log data may be generated from one or more sensors positioned along a roadway or mounted on another type of vehicle, such as an aerial vehicle. As an example, the sensor data may include raw sensor data as well as data identifying defining characteristics of perceived objects such as shape, location, orientation, speed, etc. of objects such as vehicles, pedestrians, bicyclists, vegetation, curbs, lane lines, sidewalks, crosswalks, buildings, etc. The log data may also include “event” data identifying different types of events such as collisions or near collisions with other objects, planned trajectories describing a planned geometry and/or speed for a potential path of the vehicle 100, actual locations of the vehicle at different times, actual orientations/headings of the vehicle at different times, actual speeds, accelerations and decelerations of the vehicle at different times, classifications of and responses to perceived objects, behavior predictions of perceived objects, status of various systems (such as acceleration, deceleration, perception, steering, signaling, routing, power, etc.) of the vehicle at different times including logged errors, inputs to and outputs of the various systems of the vehicle at different times, etc. As such, these events and the sensor data may be used to “recreate” the vehicle's environment, including perceived objects, and behavior of a vehicle in a simulation.
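For illustration only, the log data described above can be pictured as timestamped records along the lines of the following Python sketch; the dataclass names and fields are hypothetical assumptions, not the actual schema of storage system 450.

```python
from dataclasses import dataclass, field

@dataclass
class PerceivedObject:
    # Defining characteristics of a perceived object, as described above.
    object_type: str            # e.g., "vehicle", "pedestrian", "bicyclist"
    location: tuple             # (x, y, z) position
    orientation: float          # heading in radians
    speed: float                # meters per second

@dataclass
class LogRecord:
    # One timestamped entry of log data for a run.
    timestamp: float                              # seconds from start of run
    raw_sensor_data: bytes = b""                  # e.g., LIDAR returns
    perceived_objects: list = field(default_factory=list)  # PerceivedObject
    events: list = field(default_factory=list)    # e.g., "near_collision"
    vehicle_pose: tuple = (0.0, 0.0, 0.0)         # (x, y, heading)
```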
In addition, the storage system 450 may also store autonomous control software which is to be used by vehicles, such as vehicle 100, to operate a vehicle in an autonomous driving mode. This autonomous control software stored in the storage system 450 may be a version which has not yet been validated. Once validated, the autonomous control software may be sent, for instance, to memory 130 of vehicle 100 in order to be used by computing devices 110 to control vehicle 100 in an autonomous driving mode.
In addition to the operations described above and illustrated in the figures, various operations will now be described. It should be understood that the following operations do not have to be performed in the precise order described below. Rather, various steps can be handled in a different order or simultaneously, and steps may also be added or omitted.
To model and evaluate behavior of the perception system 172, the server computing devices 410 may run simulations of various scenarios for an autonomous vehicle. In particular, a simulation may be run to compare sensor data from log data for a given run and simulated sensor data from a simulation of the given run. In some implementations, the simulation may be for a particular sensor or detection device or group of sensors or detection devices, such as LIDAR, radar, or cameras. The sensor data from the log data may be from the aforementioned log data of storage system 450. The comparison may be based on resulting perception objects from perception logic that processes the sensor data and the simulated sensor data. The data and/or the resulting perception objects may be compared and evaluated using one or more metrics.
Modeling sensor behavior includes the server computing devices 410 selecting a given run based on log data collected by a vehicle using a perception system, such as vehicle 100 using perception system 172. The vehicle may or may not be capable of driving autonomously. The given run may be selected from the log data based on certain criteria or based on user selections. The certain criteria may include one or more types of objects detectable by the perception logic, such as pedestrians, cyclists, vehicles, motorcycles, foliage, sidewalks, adults, children, or free space. For example, the server computing devices 410 or the user selections may identify a point at which the one or more types of objects appear along a run in the log data. A time frame of about twenty seconds from the run in the log data may be selected for the given run, such as a time frame including the ten seconds before the vehicle detects an object of the one or more types of objects and the ten seconds after. Different time frames may be used in other runs or implementations.
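A minimal sketch of this selection step, assuming log records shaped like the hypothetical LogRecord above, might center the roughly twenty-second window on the first detection of the requested object type:

```python
def select_given_run(records, object_type, before_s=10.0, after_s=10.0):
    # Find the first record in which the requested object type appears.
    t_detect = next(
        (r.timestamp for r in records
         if any(o.object_type == object_type for o in r.perceived_objects)),
        None,
    )
    if t_detect is None:
        return []  # the object type never appears along this run
    # Keep roughly ten seconds before and after the first detection.
    return [r for r in records
            if t_detect - before_s <= r.timestamp <= t_detect + after_s]
```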
As shown in FIG. 6, an example of a given run 601 selected from the log data is depicted for an area 600.
The given run 601 may comprise the locations logged by the vehicle 100 during ten seconds of driving in the area 600. In the given run 601, the vehicle is approaching an intersection 604 from an initial location in a first direction.
The server computing devices 410 may construct environment data for a simulation using the log data. For example, the server computing devices 410 may use log data to identify static scenery and perception objects in the area encompassing the given run. The log data used to construct environment data may include data collected before or after the given run. For constructing the static scenery or non-static scenery, the data used may include data collected on a different day, data collected by different vehicles or devices, or map data. The constructed environment data may include a scaled mesh representing objects in the environment. The scaled mesh may include points from LIDAR data in the log data. In some implementations, the constructed environment data may include regenerated mesh points based on the LIDAR data in the log data.
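One simple way to picture regenerating mesh points from logged LIDAR points is to average the points falling into each cell of a coarse grid; this voxel-averaging scheme is an illustrative assumption, not the particular mesh construction the disclosure requires.

```python
from collections import defaultdict

def build_scaled_mesh(lidar_points, voxel_size=0.2):
    # Group logged LIDAR points (x, y, z, intensity) into coarse voxels.
    voxels = defaultdict(list)
    for x, y, z, intensity in lidar_points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        voxels[key].append((x, y, z, intensity))
    # Average the points in each voxel to "regenerate" one mesh point,
    # carrying an averaged intensity along with the position.
    mesh_points = []
    for points in voxels.values():
        n = len(points)
        mesh_points.append(tuple(sum(c) / n for c in zip(*points)))
    return mesh_points
```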
In the example shown in FIG. 7, the constructed environment data for the area 600 encompassing the given run 601 is represented as a scaled mesh, including the intersection 604 and the objects detected along the given run.
The server computing devices 410 may perform a simulated run of the given run to compare the logged sensor data with objects represented in the environment data. The perception logic may be used by the server computing devices 410 to determine first details regarding detection of objects during the given run, such as how data is received from objects in the environment data using one or more detection devices in the perception system 172 and how the data is then processed. In addition, the logged sensor data may be run in the constructed environment data to establish the relationship between the logged sensor data and objects represented in the environment. The logged sensor data may include camera image data. In some implementations, the sensor data and the perception logic used at this step may be for a particular sensor or detection device or a group of sensors or detection devices in the perception system 172 that is selected for testing. The particular sensor or detection device may be selected based on user input received by the server computing devices 410. In addition, a particular configuration for the particular sensor or detection device may be used for the simulated run, such as a location, pointing direction, or field of view. The perception logic used at this step may be used in a particular manner to alter or mutate simulated sensor data in a desired way. The first details determined regarding the detection of objects may include a shape of a detected object, a location of the detected object, and/or a point in time when the object is detected. For example, the first details may include or be extracted based on a collection of points, or pointset, from the constructed scaled mesh that are associated with a particular object.
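The pointset idea can be sketched as follows, assuming mesh points produced as in the previous sketch and a hypothetical mapping of object ids to known locations; the radius-based grouping stands in for the perception logic's own segmentation, and detection times are omitted for brevity.

```python
import math

def extract_detection_details(mesh_points, objects, radius=1.0):
    # Associate mesh points with each object to form per-object pointsets,
    # then derive simple details (here, the pointset and its centroid).
    details = {}
    for obj_id, center in objects.items():
        pointset = [(x, y, z) for (x, y, z, *_) in mesh_points
                    if math.dist((x, y, z), center) <= radius]
        if pointset:
            n = len(pointset)
            centroid = tuple(sum(c) / n for c in zip(*pointset))
            details[obj_id] = {"pointset": pointset, "location": centroid}
    return details
```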
To obtain simulated sensor data, the server computing devices 410 may run a simulation using one or more simulated detection devices of the perception system 172 and the constructed environment data. The simulation may include retracing rays transmitted from the one or more simulated detection devices and recomputing intensities of the rays reflected off points in the constructed environment data. The one or more simulated detection devices may be based on configuration characteristics or operational settings of the perception system 172 of the vehicle 100 during the given run. For example, the configuration characteristics may include types of transmitters or receivers, types of lenses, connections between components, or position relative to the vehicle 100, and the operational settings may include frequency of data capture, signal frequencies, or pointing directions. The simulation may build an environment for the given run using the constructed environment data and perform the given run using the one or more simulated detection devices on the vehicle moving through the environment. The given run may be performed in the simulation at a same day and time, along a same path, and in a same manner as the given run in the log data. The same path may be a path corresponding to the time frame for the given run.
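The ray retracing step might be pictured as a simple ray march against the constructed mesh points, with return intensity recomputed from the stored point intensity and an inverse-square range falloff; both the hit test and the falloff model are illustrative assumptions rather than the actual sensor model.

```python
import math

def trace_ray(origin, direction, mesh_points,
              hit_radius=0.3, max_range=75.0, step=0.25):
    # March a ray outward from a simulated detection device and return
    # (range, intensity) for the first mesh point it strikes, or None.
    norm = math.sqrt(sum(d * d for d in direction))
    dx, dy, dz = (d / norm for d in direction)
    ox, oy, oz = origin
    t = step
    while t <= max_range:
        sample = (ox + dx * t, oy + dy * t, oz + dz * t)
        for (mx, my, mz, *rest) in mesh_points:
            if math.dist(sample, (mx, my, mz)) <= hit_radius:
                reflectivity = rest[0] if rest else 1.0
                # Recompute the return intensity with inverse-square falloff.
                return t, reflectivity / (t * t)
        t += step
    return None  # no return within the device's range
```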
The server computing devices 410 may then determine second details regarding detection of objects in the simulated sensor data using the perception logic in a same or similar manner as described above for the first details of the logged sensor data. For example, the second details may include how data is received from objects in the environment data by the one or more simulated detection devices in the perception system 172 and how the data is then processed. In addition, the relationship between the simulated sensor data and objects represented in the environment may be determined for the second details. In some implementations, the sensor data and the perception logic used at this step may be for a particular sensor or detection device in the perception system 172 that is selected for testing. The particular sensor or detection device may be selected based on user input received by the server computing devices 410. The second details determined regarding the detection of objects may include shape of a detected object, a location of the detected object, and/or a point in time when the object is detected. For example, the second details may include or be extracted based on a collection of points, or pointset, from the constructed scaled mesh that are associated with a particular object.
As shown in FIG. 8, the simulated vehicle is depicted traveling along the given run 601 through the environment built from the constructed environment data, while the one or more simulated detection devices obtain the simulated sensor data.
The server computing devices 410 may extract one or more metrics from the first details of the logged sensor data and the second details of the simulated sensor data. The one or more metrics may be measurements of how similar the simulated sensor data is to the logged sensor data. The more similar the simulated sensor data is to the logged sensor data, the more realistic the simulation is. As shown in flow diagram 900 of FIG. 9, the first details and the second details may be used to extract one or more metrics 910, such as a first metric related to a precision of detected object types, a second metric related to an amount of recall of an object type, and a third metric related to an average detection time.
Additionally or alternatively, the one or more metrics may compare the characteristics of a detected object in the simulation or the determined details with labels or other input by human reviewers of the logged sensor data or the constructed environment data. These one or more metrics may be measurements of how similar the simulated or logged sensor data are to what a human driver sees. The more similar the simulated or logged sensor data is to the human reviewer input, the more accurately the data reflects the ground truths in the environment.
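A sketch of the three example metrics named above — precision of detected object types, recall of an object type, and average detection time — assuming each set of details maps an object id to its detected type and detection time:

```python
def extract_metrics(first_details, second_details):
    # first_details: from the logged sensor data; second_details: from the
    # simulated sensor data. Both map object id -> {"object_type": str,
    # "detection_time": float}; this shape is an illustrative assumption.
    matched = set(first_details) & set(second_details)
    type_matches = sum(
        1 for oid in matched
        if first_details[oid]["object_type"] == second_details[oid]["object_type"])
    # Precision: of the objects detected in simulation, how many carry the
    # same type as in the logged run.
    precision = type_matches / len(second_details) if second_details else 0.0
    # Recall: how many logged detections the simulation reproduced at all.
    recall = len(matched) / len(first_details) if first_details else 0.0
    # Average difference in detection time between simulation and log.
    avg_delay = (sum(second_details[oid]["detection_time"] -
                     first_details[oid]["detection_time"] for oid in matched)
                 / len(matched)) if matched else None
    return {"precision": precision, "recall": recall,
            "avg_detection_delay": avg_delay}
```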
Based on the one or more metrics 910, the server computing devices 410 or other processors may perform an evaluation 920 of how the simulation performed. The evaluation may be for the simulated sensor or for the constructed environment. For example, the evaluation may be for realism, or how well the simulation matches what occurred in the perception system 172 of the vehicle 100. Additionally or alternatively, the evaluation may be for how well the constructed environment matches the ground truths in the environment. The one or more metrics may be tracked over multiple simulations of a same scenario or different scenarios to determine whether the simulated sensor data matches or nearly matches the logged sensor data.
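The evaluation step could then reduce to checking the tracked metrics against acceptance thresholds; the threshold values below are arbitrary placeholders for illustration.

```python
def evaluate_simulation(metrics, precision_min=0.9, recall_min=0.9,
                        max_delay_s=0.5):
    # Treat the simulation as realistic enough for future use when every
    # metric clears its (illustrative) threshold.
    delay = metrics["avg_detection_delay"]
    delay_ok = delay is not None and abs(delay) <= max_delay_s
    return (metrics["precision"] >= precision_min
            and metrics["recall"] >= recall_min
            and delay_ok)
```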
When the one or more metrics indicate that the simulated sensor data matches or nearly matches the logged sensor data, the simulation software may be utilized in future simulations for the vehicle. A future simulation may be used to identify bugs in the autonomous vehicle software or find areas of improvement for the autonomous vehicle software. In some implementations, a future simulation may test how the objects detected by a sensor configuration or a perception logic compare to objects in the environment data. In other implementations, the future simulation may test how changes in the sensor configuration (different settings, different setup, new sensors, etc.) or changes in the perception logic alter object detection effectiveness in comparison to a current configuration or perception logic. The one or more metrics may be determined for the future simulations to evaluate whether the object detection effectiveness is improved from the current configuration or perception logic. In further implementations, the simulation software may be used to simulate a partial amount of sensor data in a future simulation, such as sensor data for some of the detection devices on the vehicle and not others, or some types of sensor data (such as sensor field of view or contours) and not others.
In some alternative implementations, the simulation may be configured to simulate at least a portion of a path different from the path of the vehicle in the log data. For example, the one or more processors may determine a different path in the time frame through the constructed environment data, as well as a simulated pose of each simulated detection device along the different path to obtain the simulated sensor data.
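Determining a simulated pose along a different path might look like the following interpolation sketch; the path format and the linear interpolation are assumptions for illustration.

```python
import math

def pose_along_path(path, t):
    # path: [(timestamp, x, y), ...] describing the alternate path.
    # Returns a simulated pose (x, y, heading) at time t by linear
    # interpolation between the surrounding path points.
    t0, x0, y0 = path[0]
    for t1, x1, y1 in path[1:]:
        if t <= t1:
            a = 0.0 if t1 == t0 else max(0.0, (t - t0) / (t1 - t0))
            x = x0 + a * (x1 - x0)
            y = y0 + a * (y1 - y0)
            return x, y, math.atan2(y1 - y0, x1 - x0)
        t0, x0, y0 = t1, x1, y1
    return x0, y0, 0.0  # past the end of the path: hold the last point
```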
The technology described herein allows for evaluation of sensor simulation that can be used to build simulation software for running future simulations. The evaluation techniques increase confidence in the sensor simulation, which results in increased confidence in simulating other aspects of autonomous vehicle navigation or developing improvements to autonomous vehicle navigation on the basis of the simulated sensors. Using the sensor simulation software validated in the manner described herein may result in a more realistic future simulation of autonomous vehicle navigation. More log data may be simulated rather than collected over many runs and many hours of driving on a roadway. More accurate tests of autonomous vehicle software may be performed in simulation, which may be more efficient and safer than running tests on a roadway. The autonomous vehicle software may be continually improved using the simulation technology.
Unless otherwise stated, the foregoing alternative examples are not mutually exclusive, but may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. In addition, the provision of the examples described herein, as well as clauses phrased as “such as,” “including” and the like, should not be interpreted as limiting the subject matter of the claims to the specific examples; rather, the examples are intended to illustrate only one of many possible embodiments. Further, the same reference numbers in different drawings can identify the same or similar elements.