The present invention relates generally to computer test systems, and specifically to an autonomous vehicle simulation system.
Unmanned vehicles are becoming increasingly common in a number of tactical missions, such as surveillance and/or combat missions. As an example, in the case of aircraft, as some flight operations have become increasingly dangerous or tedious, unmanned aerial vehicles (UAVs) have been developed to replace pilots in controlling the aircraft. Furthermore, as computer processing and sensor technology have advanced significantly, unmanned vehicles can be operated in an autonomous manner. For example, rather than being operated by a remote pilot, a given unmanned vehicle can be operated based on sensors configured to monitor external stimuli, and can be programmed to respond to those stimuli and to execute mission objectives that are either programmed or provided as input commands.
One embodiment includes a simulation system for an autonomous vehicle. The simulation system includes a user interface configured to facilitate user inputs comprising spontaneous simulated events in a simulated virtual environment during simulated operation of the autonomous vehicle via an autonomous vehicle control system. The system also includes a simulation controller configured to generate simulated sensor data based on model data and behavioral data associated with each of the simulated virtual environment and the spontaneous simulated events. The simulated sensor data corresponds to simulated sensor inputs provided to the autonomous vehicle control system via sensors of the autonomous vehicle. The simulation controller is further configured to receive simulation feedback data from the autonomous vehicle control system corresponding to simulated interaction of the autonomous vehicle within the simulated virtual environment. The simulated interaction includes reactive behavior of the autonomous vehicle control system in response to the spontaneous simulated events.
Another embodiment includes a method for simulating a mission for an autonomous vehicle. The method includes storing model data and behavioral data associated with a simulated virtual environment and receiving control inputs via a user interface for control of simulated interaction of the autonomous vehicle in the simulated virtual environment. The method also includes providing control commands to an autonomous vehicle control system for control of the simulated interaction of the autonomous vehicle in the simulated virtual environment based on the control inputs. The method also includes receiving an event input via the user interface corresponding to a spontaneous simulated event in the simulated virtual environment during the simulated mission of the autonomous vehicle. The method also includes integrating the spontaneous simulated event into the simulated virtual environment based on the model data and behavioral data associated with each of the simulated virtual environment and the autonomous vehicle, and on model data and behavioral data associated with the spontaneous simulated event. The method also includes providing simulated sensor data to the autonomous vehicle control system based on the model data and the behavioral data associated with each of the simulated virtual environment and the spontaneous simulated event. The method further includes providing simulation feedback data from the autonomous vehicle control system, comprising the simulated interaction of the autonomous vehicle within the simulated virtual environment and reactive behavior of the autonomous vehicle control system in response to the spontaneous simulated event, to the user interface.
Another embodiment includes a simulation system for an autonomous vehicle. The simulation system includes a user interface configured to facilitate user inputs comprising spontaneous simulated events in a simulated virtual environment during simulated operation of the autonomous vehicle via an autonomous vehicle control system, and to record a simulated interaction of the autonomous vehicle in the simulated virtual environment to generate an event log comprising a simulated mission of the autonomous vehicle. The system also includes a simulation controller. The simulation controller includes a memory configured to store model data and behavioral data associated with the simulated virtual environment. The simulation controller also includes a simulation driver configured to generate at least one event entity based on the model data, the behavioral data, the user inputs, and a clock signal; to integrate the at least one event entity into the simulated virtual environment; to provide simulated sensor data based on the model data and behavioral data associated with each of the simulated virtual environment and the at least one event entity; and to receive simulation feedback data from the autonomous vehicle control system corresponding to the simulated interaction of the autonomous vehicle within the simulated virtual environment. The simulated interaction includes reactive behavior of the autonomous vehicle control system in response to the at least one event entity.
The present invention relates generally to computer test systems, and specifically to an autonomous vehicle simulation system. An autonomous vehicle simulation system includes a user interface configured to facilitate user inputs for the simulation system and a simulation controller configured to implement a simulated mission for the autonomous vehicle. The user inputs can include model inputs to generate simulation models associated with a simulated virtual environment, objects (e.g., static objects and dynamic traffic entities), environmental factors (e.g., simulated weather conditions), and sensors associated with the autonomous vehicle. For example, the simulation controller can include a memory configured to store the model data, as well as behavioral data associated with dynamic objects, the autonomous vehicle, and physical interactions of the components of the simulation. The user inputs can also include control commands that can provide simple operational commands (e.g., air traffic control commands in the example of an autonomous aircraft) to an autonomous vehicle control system of the autonomous vehicle (e.g., takeoff, land, target a specific object, etc.). As an example, the control commands can be provided as voice inputs via a voice control interface of the user interface, which can be configured to convert the voice inputs into control commands for interpretation by the autonomous vehicle control system and to convert operational feedback signals into voice acknowledgements to be interpreted by the user.
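The memory arrangement described above, holding model data and behavioral data for the scene, the dynamic objects, and the sensors, can be sketched as a minimal data layout. This is purely an illustration; the class and field names are assumptions and not part of the described system.

```python
from dataclasses import dataclass, field

@dataclass
class ModelData:
    scene_models: dict = field(default_factory=dict)      # static features of the environment
    dynamic_models: dict = field(default_factory=dict)    # vehicles and other traffic entities
    sensor_models: dict = field(default_factory=dict)     # simulated sensor specifications

@dataclass
class BehavioralData:
    dynamic_behavior: dict = field(default_factory=dict)  # how dynamic objects move and react
    vehicle_behavior: dict = field(default_factory=dict)  # autonomous vehicle dynamics
    physics: dict = field(default_factory=dict)           # physical-interaction parameters

class MemorySystem:
    """Hypothetical store for model and behavioral data for a simulated mission."""
    def __init__(self):
        self.models = ModelData()
        self.behaviors = BehavioralData()
        self.mission_logs = []                            # stored logs of past simulated missions

    def register_dynamic_model(self, name, outline, behavior):
        # Keep the object's physical outline and its behavior together under one key.
        self.models.dynamic_models[name] = outline
        self.behaviors.dynamic_behavior[name] = behavior

mem = MemorySystem()
mem.register_dynamic_model("fuel_truck",
                           outline=[(0, 0), (8, 0), (8, 3), (0, 3)],
                           behavior={"max_speed_mps": 10.0})
```

The point of the split is that model data (shapes, specifications) and behavioral data (motion rules, physics) are stored separately but keyed consistently, mirroring the separation described in the text.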
In addition, the user interface can be configured to facilitate event inputs associated with spontaneous simulated events. As an example, the spontaneous simulated events can be spontaneous perturbations of the simulated virtual environment, such as to force reactive behavior of the autonomous vehicle control system in the simulated interaction of the autonomous vehicle in the simulated virtual environment. For example, the spontaneous simulated events can correspond to behavioral changes associated with dynamic objects, such as simulated vehicles in the simulated virtual environment, and/or changes to environmental conditions (e.g., simulated weather conditions) in the simulated virtual environment. The simulation controller can include a simulation driver configured to generate event entities in response to the event inputs and to integrate the event entities into the simulated virtual environment to elicit an improvised behavioral response of the autonomous vehicle to the spontaneous simulated event. The simulation controller can receive feedback signals from the autonomous vehicle control system to monitor the simulated interaction of the autonomous vehicle in the simulated virtual environment, such that the simulated interaction can be monitored via the user interface and/or recorded in an event log corresponding to a simulated mission for the autonomous vehicle. Accordingly, the autonomous vehicle can be tested based on monitoring reactive behavior in a simulated mission in response to the user-provided spontaneous simulated events.
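The event-log recording described above can be sketched as follows: each injected event and each observed vehicle reaction is appended with a mission time stamp so the simulated mission can be reviewed afterward. Names and fields here are illustrative assumptions, not the described implementation.

```python
class EventLog:
    """Minimal time-stamped record of a simulated mission."""
    def __init__(self):
        self.entries = []

    def record(self, sim_time, kind, detail):
        # kind distinguishes injected perturbations ("event") from
        # observed control-system reactions ("reaction").
        self.entries.append({"t": sim_time, "kind": kind, "detail": detail})

    def filter(self, kind):
        return [e for e in self.entries if e["kind"] == kind]

log = EventLog()
log.record(12.5, "event", "crossing vehicle injected onto runway")
log.record(13.1, "reaction", "vehicle aborted takeoff roll")
```

Pairing each spontaneous event with the reaction that follows it is what makes the log useful for judging success or failure of a simulated mission.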
The autonomous vehicle simulation system 10 includes an autonomous vehicle control system 12 that is configured as an operational controller for the associated autonomous vehicle, and is therefore the component of the autonomous vehicle that is to be tested for autonomous operation of the associated autonomous vehicle in a simulated manner, as described herein. As an example, the autonomous vehicle control system 12 can be configured as one or more processors that are configured to receive inputs from sensors associated with the autonomous vehicle and provide outputs to operational components of the autonomous vehicle. The autonomous vehicle control system 12 can thus be tested for autonomous operation of the autonomous vehicle in a simulated mission based on inputs provided to and feedback provided from the autonomous vehicle control system 12. As described herein, the terms “simulated mission” and “simulation of the autonomous vehicle” describe a simulation operation of the autonomous vehicle control system in a simulated virtual environment in which a simulated version of the autonomous vehicle interacts with the simulated virtual environment based on the inputs provided to and the feedback provided from the autonomous vehicle control system 12. Therefore, during a simulated mission, the autonomous vehicle control system 12 may be disconnected from the autonomous vehicle itself, such that the input signals to and feedback signals from the autonomous vehicle control system 12 may be isolated from the respective sensors and operational components of the associated autonomous vehicle.
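The isolation described above, in which the control system's sensor inputs are disconnected from physical sensors and driven by the simulation instead, can be sketched with an abstract sensor source. The class names and the trivial reactive rule are hypothetical stand-ins for the real control logic.

```python
from abc import ABC, abstractmethod

class SensorSource(ABC):
    """Abstract feed of sensor samples; real hardware or simulation can implement it."""
    @abstractmethod
    def read(self) -> dict: ...

class SimulatedSensorSource(SensorSource):
    """Fed by the simulation controller instead of physical sensors."""
    def __init__(self):
        self.latest = {}

    def inject(self, sample: dict):
        self.latest = sample

    def read(self) -> dict:
        return self.latest

class AutonomousVehicleControlSystem:
    """The unit under test; it only ever sees the SensorSource interface."""
    def __init__(self, source: SensorSource):
        self.source = source

    def step(self) -> str:
        sample = self.source.read()
        # Trivial reactive rule standing in for the real control programming.
        return "brake" if sample.get("obstacle_range_m", 1e9) < 50 else "cruise"

sim_source = SimulatedSensorSource()
avcs = AutonomousVehicleControlSystem(sim_source)
sim_source.inject({"obstacle_range_m": 30.0})
```

Because the control system depends only on the abstract source, the identical control code can later be wired to physical sensors without modification, which is the premise of testing it in isolation.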
The autonomous vehicle simulation system 10 also includes a user interface 14 that is configured to facilitate control inputs to provide control commands to the associated autonomous vehicle and to facilitate simulation inputs associated with simulating the operation of the autonomous vehicle. As described herein, the term “control command” describes simple operating commands of the autonomous vehicle that can be provided during the simulated mission, such as for driving, takeoff, landing, turning, targeting, etc., such as in the same manner that an air traffic controller interacts with a piloted vehicle, and does not refer to continuous piloting of the autonomous vehicle. The user interface 14 is also configured to monitor the simulation of the autonomous vehicle, such that a user of the user interface 14 can determine success or failure of a given simulated mission, can provide inputs to the autonomous vehicle control system 12 during an associated simulated mission, and can store the results of a given simulated mission in an event log associated with the simulated mission. In the example of
The simulation controller 16 includes a memory system 18 and a simulation driver 20. The memory system 18 can be arranged as one or more memory structures or devices configured to store data associated with the simulation of the autonomous vehicle. Additionally, the memory system 18 can be configured to store a variety of other data and data files, such as stored logs of simulated missions of the autonomous vehicle. In the example of
The simulation driver 20 is configured to integrate the simulation inputs SIM provided via the user interface 14 with the model data 22 and the simulation behavioral data 24 to provide the simulation commands SIM_CMD to the autonomous vehicle control system 12. Additionally, the simulation driver 20 is configured to receive the feedback signals SIM_CMD from the autonomous vehicle control system 12 to update the conditions and status of the simulated mission, and to provide the feedback signals SIM to the user interface 14 to allow the user to monitor the simulated mission via the user interface 14. As an example, the simulation driver 20 can be configured as one or more processors configured to compile the simulated mission. Therefore, the simulation driver 20 is configured to facilitate the simulation of the autonomous vehicle in a manner that is safer and less expensive than live real-world testing of the autonomous vehicle.
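The driver's update cycle described above can be sketched as a loop: merge user simulation inputs with stored model and behavioral data into a command, send it to the control system, and relay the feedback to the user interface. All function names are illustrative assumptions.

```python
def drive_simulation(sim_inputs, model_data, behavioral_data, control_system, ui_feedback):
    """Minimal sketch of the simulation driver's cycle (hypothetical signature)."""
    ticks = 0
    for user_input in sim_inputs:
        # Integrate the user input with model/behavioral data (the SIM_CMD role).
        command = {**model_data, **behavioral_data, "input": user_input}
        # The control system responds; its feedback updates the mission status.
        feedback = control_system(command)
        # Relay feedback toward the user interface for monitoring.
        ui_feedback.append(feedback)
        ticks += 1
    return ticks

def stub_control_system(cmd):
    # Stand-in for the autonomous vehicle control system under test.
    return {"status": "ok", "echo": cmd["input"]}

ui = []
n = drive_simulation(["takeoff", "turn left"],
                     {"scene": "airport"}, {"physics": "default"},
                     stub_control_system, ui)
```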
The memory system 50 can be arranged as one or more memory structures or devices configured to store data associated with the simulation of the autonomous vehicle. Additionally, the memory system 50 can be configured to store a variety of other data and data files, such as stored logs of simulated missions of the autonomous vehicle. In the example of
As an example, to provide traceability between simulation and real world testing, the target demonstration environment of the scene of interest (e.g., Hawthorne Municipal Airport in the example of
Referring back to the example of
The second view 154 of the dynamic object corresponds to a dynamic object model demonstrating an approximate outline of the physical boundaries of the dynamic object. As an example, the dynamic object model can be generated by a user via the user interface 14 and can be stored in the memory system 50 as one of the dynamic models 56. The dynamic object model demonstrated by the second view 154 thus corresponds to physical features and characteristics of the dynamic object as functionally interpreted by the simulation controller 16, such as via the simulation driver 20. Thus, the simulation controller 16 can implement the physical boundaries and characteristics of the dynamic object model as a means of interaction of the autonomous vehicle and/or the operational and functional aspects of the autonomous vehicle with the dynamic object. For example, the dynamic object model can be used to define collisions of the dynamic object with the simulated version of the autonomous vehicle, simulated ordnance of the autonomous vehicle, and/or other dynamic objects in the simulated virtual environment. As an example, the simulation driver 20 can be configured to generate an event entity associated with the dynamic object, such that the dynamic object can be controlled within the simulated virtual environment based on the dynamic object model information stored in the dynamic models 56, as well as simulation behavioral data and timing data, as described in greater detail herein.
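The use of a dynamic object model's physical boundary to define collisions, as described above, can be sketched with axis-aligned boxes. The boxes are an illustrative simplification of the modeled outlines, not the described representation.

```python
def boxes_collide(a, b):
    """Overlap test for two axis-aligned boxes, each given as (xmin, ymin, xmax, ymax)."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    # Boxes intersect when they overlap on both axes.
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

# Hypothetical outlines: the simulated vehicle and two dynamic objects.
vehicle = (0.0, 0.0, 4.0, 2.0)
truck_near = (3.0, 1.0, 9.0, 4.0)
truck_far = (10.0, 10.0, 16.0, 13.0)
```

The same boundary test can apply between any pair of modeled entities, which is how one mechanism covers vehicle-to-object, ordnance-to-object, and object-to-object interactions.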
The second view 204 corresponds to the modeling of the simulated virtual environment, and thus a combination of the scene models 54 and the dynamic models 56. Thus, the second view 204 demonstrates an approximate outline of the physical boundaries of the static features of the geographic scene of interest and the dynamic objects therein, such as interpreted by the simulation driver 20. It is to be understood that the second view 204 is demonstrated as a conceptual diagram with respect to the model data 52, and is not necessarily a “view” that is provided to users. As an example, the simulated virtual environment can be generated by the simulation driver 20 based on the model inputs MOD_IN provided by a user via the user interface 14 and stored in the memory system 50 as the respective scene models 54 and the dynamic models 56. In the example of
As the autonomous vehicle moves relative to the static features of the geographic scene of interest, the first view 202 and the second view 204 can each be updated in real-time. As an example, the first view 202 can be updated in real-time for display to user(s) via the user interface 14, such as to simulate the view of the geographic scene of interest that is provided via one or more sensors (e.g., video, radar, lidar, or a combination thereof) to assist in providing control commands to the autonomous vehicle during the simulated mission and/or to monitor progress of the simulated mission in real-time. Similarly, the second view 204 can be updated by the simulation driver 20 to provide constant updates of the relative position of the simulated version of the autonomous vehicle with the static features and the dynamic objects of the simulated virtual environment, as well as the dynamic objects with respect to the static features and with respect to each other, as dictated by the scene models 54 and the dynamic models 56 and the associated simulation behavioral data described herein. Accordingly, the simulation driver 20 can be configured to update the location of the simulated version of the autonomous vehicle and the dynamic objects within the simulated virtual environment in approximate real-time.
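The per-tick update described above, in which the simulated vehicle and the dynamic objects each advance relative to the static scene, can be sketched as a simple kinematic update. Entity names and the state layout are assumptions for illustration.

```python
def update_positions(entities, dt):
    """Advance each entity by its velocity over dt seconds.

    entities: {name: {'pos': (x, y), 'vel': (vx, vy)}}
    """
    for state in entities.values():
        x, y = state["pos"]
        vx, vy = state["vel"]
        state["pos"] = (x + vx * dt, y + vy * dt)

world = {
    "autonomous_vehicle": {"pos": (0.0, 0.0), "vel": (20.0, 0.0)},
    "crossing_truck": {"pos": (100.0, -5.0), "vel": (0.0, 2.0)},
}
update_positions(world, dt=0.5)
```

Running this update at the simulation's tick rate is what keeps both the user-facing view and the driver's internal model view current in approximate real time.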
Referring back to the example of
The model data 52 further includes sensor models 60. The sensor models 60 can include data associated with simulated aspects of the sensors of the autonomous vehicle. For example, the sensor models 60 can be implemented to simulate sensor responses of the actual hardware sensors associated with the autonomous vehicle, transforming conditions of the simulated virtual environment into sensor data equivalent to that produced by the actual hardware. As an example, each sensor device associated with the autonomous vehicle can include a variety of detailed specifications, such as frame rate, resolution, field of view, dynamic range, mounting positions, and data formats. Therefore, each of the detailed specifications can be modeled and stored in the sensor models 60 to simulate the responses of the sensors of the autonomous vehicle, and thus can provide associated simulated responses for the simulated version of the autonomous vehicle. For example, the sensor models 60 can include models associated with a navigation sensor (e.g., modeled as a global navigation satellite system (GNSS) and/or inertial navigation sensor(s) (INS)), a radar system, a lidar system, a video system, electro-optical sensors, and/or a variety of other types of sensors. As described in greater detail herein, the simulation driver 20 can introduce event contingencies based on the sensor models 60 corresponding to the interaction of the autonomous vehicle in the simulated virtual environment during a simulated mission, such as defined in a test script. Therefore, by simulating raw sensor data in a simulated mission, the perception system of the actual autonomous vehicle, including all processing and data reduction components, can be tested for performance and accuracy.
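A sensor model carrying the detailed specifications listed above (frame rate, field of view, resolution) can be sketched as follows. The specific values and the two helper methods are illustrative assumptions about how such a model might be queried during a simulated mission.

```python
from dataclasses import dataclass

@dataclass
class SensorModel:
    """Stored specification of one simulated sensor (illustrative fields)."""
    name: str
    frame_rate_hz: float
    field_of_view_deg: float
    resolution: tuple  # (width, height) in pixels

    def frames_for(self, duration_s: float) -> int:
        """How many simulated frames this sensor produces over a mission segment."""
        return int(self.frame_rate_hz * duration_s)

    def in_field_of_view(self, bearing_deg: float) -> bool:
        """Whether a target bearing (relative to boresight) would be visible."""
        return abs(bearing_deg) <= self.field_of_view_deg / 2.0

camera = SensorModel("nose_camera", frame_rate_hz=30.0,
                     field_of_view_deg=90.0, resolution=(1920, 1080))
```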
In the example of
The simulation behavioral data 62 can also include autonomous vehicle behavior (“AV BEHAVIOR”) data 66. The autonomous vehicle behavior data 66 can include data associated with the autonomous vehicle, such as including physical characteristics of the autonomous vehicle, including physical boundaries of the autonomous vehicle with respect to the static features and dynamic objects of the simulated virtual environment. The autonomous vehicle behavior data 66 can also include data associated with the interaction of the actuators of the autonomous vehicle in the simulated virtual environment. Thus, features of the autonomous vehicle, such as guidance, navigation, control capabilities, actuators, and physical dynamics of the autonomous vehicle, can be defined in the autonomous vehicle behavior data 66 to govern the movement and interaction of the autonomous vehicle through the simulated virtual environment.
Furthermore, the simulation behavioral data 62 includes physics data 68. The physics data 68 can be configured to define the physical interaction of the models 54, 56, 58, and 60 with respect to each other and to the behavior defined in the simulation behavioral data 62. The physics data 68 can thus define physical interactions of substantially all components of the simulated virtual environment. As an example, the physics data 68 can be generated via a physics engine, such as in the simulation controller 16, which can be implemented via one or more processors associated with the simulation controller 16. Thus, the physics data 68 can be generated and provided to the simulation driver 20 via the memory system 50 as needed. Additionally or alternatively, the physics data 68 can be defined by a user via the user interface 14 and stored in the memory system 50 to be implemented by the simulation driver 20 during the simulated mission. Accordingly, the physics data 68 can approximate physical interactions between substantially all portions of the simulated virtual environment to provide for an accurate simulation that approximates real-world operation of the autonomous vehicle.
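A minimal sketch of physics data driving entity motion is a forward-Euler integration step: acceleration drawn from the physics parameters updates an entity's velocity and position each tick. A real physics engine would additionally resolve contacts and constraints; this is only an illustration.

```python
def physics_step(state, accel, dt):
    """One forward-Euler update.

    state: {'pos': (x, y), 'vel': (vx, vy)}; accel: (ax, ay); dt in seconds.
    Returns the new state without mutating the old one.
    """
    vx = state["vel"][0] + accel[0] * dt
    vy = state["vel"][1] + accel[1] * dt
    x = state["pos"][0] + vx * dt
    y = state["pos"][1] + vy * dt
    return {"pos": (x, y), "vel": (vx, vy)}

# Illustrative: level flight at 50 m/s under gravity for one second.
start = {"pos": (0.0, 100.0), "vel": (50.0, 0.0)}
after = physics_step(start, accel=(0.0, -9.8), dt=1.0)
```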
The user interface 250 includes a model control interface 252 that is configured to facilitate model inputs MOD_IN to the simulation controller 16. The model inputs MOD_IN can be provided to define the model data 52 and/or the simulation behavioral data 62 in the memory system 50. As an example, the model control interface 252 can be a program or application operating on the user interface 250.
The user interface 250 also includes a voice control interface 254. The voice control interface 254 is configured to receive voice audio inputs provided from a user, such as via a microphone, and to convert the voice audio inputs into control commands VC_CMD that are provided to the autonomous vehicle control system 12 (e.g., via the simulation driver 20). As an example, the control commands VC_CMD can be basic operational inputs that are provided for control of the autonomous vehicle, such that the autonomous vehicle control system 12 can respond via output signals provided to respective actuator components for motion control of the autonomous vehicle in a programmed manner. For example, the control commands VC_CMD can include commands for takeoff, landing, targeting, altitude control, speed control, directional control, or a variety of other simple commands to which the autonomous vehicle control system 12 can respond via outputs to control the autonomous vehicle based on the control programming therein. Therefore, the user of the user interface 250 can implement the simulated mission of the autonomous vehicle via the voice inputs provided to the voice control interface 254. As another example, the voice inputs can be provided to the voice control interface 254 as pre-recorded audio transmissions to allow for scripted voice scenarios of the simulated mission. Additionally, the voice control interface 254 can receive feedback signals VC_ACK from the autonomous vehicle control system 12 and convert the feedback signals to pre-recorded audio signals for interpretation by the associated user. The feedback signals VC_ACK can be status signals and/or acknowledgement signals to provide the user with sufficient information for control and/or mission parameters associated with the simulated mission. 
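The voice control flow described above can be sketched as two lookup steps: recognized utterance text maps to a discrete control command (the VC_CMD role), and a feedback command maps back to a spoken acknowledgement (the VC_ACK role). The speech-recognition front end is out of scope here, and the phrases and command names are illustrative assumptions.

```python
# Hypothetical phrase-to-command table (the VC_CMD mapping).
COMMAND_TABLE = {
    "cleared for takeoff": "CMD_TAKEOFF",
    "cleared to land": "CMD_LAND",
    "climb and maintain": "CMD_SET_ALTITUDE",
}

# Hypothetical command-to-acknowledgement table (the VC_ACK mapping).
ACK_TABLE = {
    "CMD_TAKEOFF": "roger, rolling for takeoff",
    "CMD_LAND": "roger, on final approach",
}

def to_control_command(utterance: str):
    """Map recognized utterance text to a control command, or None if unrecognized."""
    lowered = utterance.lower()
    for phrase, cmd in COMMAND_TABLE.items():
        if phrase in lowered:
            return cmd
    return None

def to_voice_ack(feedback_cmd: str) -> str:
    """Map a feedback signal to the audio acknowledgement played for the user."""
    return ACK_TABLE.get(feedback_cmd, "say again")
```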
Accordingly, based on the voice control interface 254, a simulated mission of the autonomous vehicle can be initiated and completed based on implementing voice commands and audio feedback.
The user interface 250 also includes an event control interface 256 configured to facilitate event inputs SIM_EVT that can be provided to generate predetermined perturbations to the simulated virtual environment to test the reactive behavior of the autonomous vehicle control system 12 during a simulated mission. As an example, the event inputs SIM_EVT can be provided as Extensible Markup Language (XML) scripts. The event control interface 256 can be implemented to provide the event inputs SIM_EVT before a simulated mission or during a simulated mission, such as to control the conditions of the simulated virtual environment, such as with respect to the dynamic objects and/or the environment conditions (e.g., simulated weather conditions). As an example, the event inputs SIM_EVT can correspond to scripted events (e.g., time-based), can correspond to spontaneous events provided by the user, or can initiate random events (e.g., generated randomly via the simulation driver 20). Thus, the autonomous vehicle control system 12, in controlling the simulated version of the autonomous vehicle, can be tested for improvised reactive behavior to the events that are defined via the event inputs SIM_EVT based on the programming therein.
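Since the event inputs SIM_EVT can be provided as XML scripts, a parse step like the following could turn a script into schedulable event records. The element and attribute names below are assumptions about one possible script format, not a defined schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical event script: a dynamic-object perturbation and a weather change.
SCRIPT = """
<events>
  <event time="12.5" type="dynamic_object">
    <object name="fuel_truck" action="cross_runway"/>
  </event>
  <event time="30.0" type="weather">
    <condition name="visibility" value="low"/>
  </event>
</events>
"""

def parse_event_script(xml_text):
    """Parse an event script into time-ordered event dictionaries."""
    events = []
    for ev in ET.fromstring(xml_text).findall("event"):
        child = ev[0]  # the single detail element under each event
        events.append({
            "time": float(ev.get("time")),
            "type": ev.get("type"),
            "detail": dict(child.attrib),
        })
    return sorted(events, key=lambda e: e["time"])

events = parse_event_script(SCRIPT)
```

Time-stamped records like these cover the scripted (time-based) case directly; spontaneous and random events could reuse the same record shape with the current clock time filled in.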
The user interface 250 further includes a simulation feedback interface 258. The simulation feedback interface 258 is configured to receive feedback signals SIM_FBK that can be provided, for example, from the simulation driver 20 to enable user(s) to monitor the simulated operation of the autonomous vehicle, such as in real-time. As an example, the simulation feedback interface 258 can include a monitor or a set of monitors that can display the simulated virtual environment in real-time during the simulated mission, such as to simulate video camera or other imaging sensor feed(s) to monitor the simulated interaction of the autonomous vehicle in the simulated virtual environment. For example, the monitor of the simulation feedback interface 258 can display simulated video images, radar images, lidar images, or a combination thereof. The user(s) can thus view the simulated virtual environment in a variety of different ways, such as overhead (e.g., as demonstrated by the diagram 100 in the example of
The simulation driver 300 includes an event generator 302 that is configured to generate event entities 304 corresponding to dynamic events in the simulated virtual environment during the simulated mission, and stores the event entities 304 in a memory 306. As an example, the memory 306 can correspond to the memory system 50 in the example of
In the example of
The simulation driver 300 also includes a simulation integrator 314 that is configured to integrate the event entities 304 into the simulated virtual environment. The simulation integrator 314 receives the clock signal CLK and the model data MOD_DT from the memory system 50, such as the scene models 54. Thus, at an appropriate time dictated by a comparison of real time (via the clock signal CLK) with the time stamp associated with the event entity 304, or in substantial real-time, the simulation integrator 314 can access the appropriate event entity 304 and provide the necessary integration of the associated event in the simulated virtual environment. The simulation integrator 314 can integrate the event entity 304 into the simulated virtual environment by compiling the model data 308 and behavioral data 310 with the scene models 54 to provide the associated dynamic activity relative to the static features of the simulated virtual environment at the appropriate time. Additionally, the simulation integrator 314 can access the sensor models 60 to translate the event entity 304 into sensor data, such as to simulate raw sensor data of sensors on-board the actual autonomous vehicle, that can be interpreted by the autonomous vehicle control system 12.
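The timing check described above, comparing the clock signal against each event entity's time stamp, can be sketched as a small scheduler: each tick releases every pending entity whose time has arrived so it can be folded into the simulated environment. The class and field names are illustrative assumptions.

```python
class EventScheduler:
    """Releases time-stamped event entities as the simulation clock advances."""
    def __init__(self, event_entities):
        # Each entity: {'t': activation time in seconds, 'name': identifier}.
        self.pending = sorted(event_entities, key=lambda e: e["t"])
        self.integrated = []

    def tick(self, clock_time):
        """Integrate every pending entity whose time stamp has elapsed."""
        released = []
        while self.pending and self.pending[0]["t"] <= clock_time:
            released.append(self.pending.pop(0))
        self.integrated.extend(released)
        return released

sched = EventScheduler([{"t": 5.0, "name": "truck_crossing"},
                        {"t": 2.0, "name": "wind_gust"}])
first = sched.tick(clock_time=3.0)
```

Keeping the pending list sorted means each tick only inspects the head of the list, so late-arriving spontaneous events can be inserted without rescanning everything already integrated.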
In the example of
The simulation integrator 314 can thus provide the simulation feedback signals SIM_FBK to simulate the results of the outputs provided from the autonomous vehicle control system 12, such as based on the autonomous vehicle behavior data 66 and the physics data 68 that can be provided via the simulation behavioral data BHV_DT that can be provided from the memory system 50. As described previously, the simulation feedback signals SIM_FBK can be provided to the user interface 250 (e.g., the simulation feedback interface 258), such that user(s) can monitor the movement, behavior, and/or reactions of the autonomous vehicle, and thus the simulated operation of the autonomous vehicle. Accordingly, based on the operation of the simulation driver 300, user(s) can monitor the simulated interaction of the autonomous vehicle in the simulated virtual environment, including the reactive behavior of the autonomous vehicle to the perturbations of the simulated virtual environment provided by the event entities 304 to provide for accurate testing of the programmed control of the autonomous vehicle via the autonomous vehicle control system 12.
In view of the foregoing structural and functional features described above, a methodology in accordance with various aspects of the present invention will be better appreciated with reference to
At 360, the spontaneous simulated event (e.g., an event entity 304) is integrated into the simulated virtual environment based on the model data and behavioral data associated with each of the simulated virtual environment and the autonomous vehicle, and on model data (e.g., the model data 308) and behavioral data (e.g., the behavioral data 310) associated with the spontaneous simulated event. At 362, simulated sensor data (e.g., the signals SIM_CMD) is provided to the autonomous vehicle control system based on the model data and the behavioral data associated with each of the simulated virtual environment and the spontaneous simulated event. At 364, simulation feedback data (e.g., the signals SIM_CMD from the autonomous vehicle control system 12 and the simulation feedback signals SIM_FBK) is received from the autonomous vehicle control system comprising the simulated interaction of the autonomous vehicle within the simulated virtual environment and reactive behavior of the autonomous vehicle control system in response to the spontaneous simulated event.
What have been described above are examples of the invention. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the invention, but one of ordinary skill in the art will recognize that many further combinations and permutations of the invention are possible. Accordingly, the invention is intended to embrace all such alterations, modifications, and variations that fall within the scope of this application, including the appended claims.