This disclosure relates to autonomous vehicles. More specifically, this disclosure relates to controllers, controller systems, and controller methods for autonomous vehicles.
In a typical autonomous vehicle (AV), a controller is developed and then subsequently tuned. Upon use, the controller encounters a problem, and a new controller must be developed and tuned. This process is repeated indefinitely; it is costly and time consuming, and it requires extensive vehicle track time. Accordingly, it would be desirable to have a self-developing controller, a self-tuning controller, or a combination thereof.
Disclosed herein are implementations of a controller that is configured to self-develop, self-tune, or both, based on a design of experiments (DOE) test matrix. The methods and systems disclosed herein may be used online, offline, or a combination thereof. For the purposes of this disclosure, the term “online” is used with respect to a scenario where a host vehicle is on a response surface (i.e., a test track, a roadway, or the like) under activity conditions. For the purposes of this disclosure, the term “offline” is used with respect to a scenario where a host vehicle is not on a response surface (i.e., a test track, a roadway, or the like), and therefore not under activity conditions.
A method may be used for controlling a vehicle. The vehicle may be an autonomous vehicle (AV) and referred to as a host vehicle. The method may include constructing a plant model. The plant model may be based on a DOE test matrix. The method may include performing a controller simulation based on the constructed plant model. The method may include generating performance data based on the controller simulation.
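For illustration only, the following is a minimal Python sketch of constructing a plant model from DOE data. The quadratic response-surface form, the least-squares fit, and the factor and response names are assumptions made for this example; the disclosure does not prescribe a particular plant-model structure.

```python
# Minimal sketch: fit a response-surface plant model from DOE run data.
# Factor names, the quadratic model form, and the least-squares fit are
# illustrative assumptions, not requirements of the disclosure.
import numpy as np

def build_design(factors: np.ndarray) -> np.ndarray:
    """Expand raw DOE factors into [1, x, x^2] terms for a quadratic surface."""
    return np.hstack([np.ones((factors.shape[0], 1)), factors, factors ** 2])

def fit_plant_model(doe_runs: np.ndarray, responses: np.ndarray) -> np.ndarray:
    """Fit response-surface coefficients with ordinary least squares."""
    design = build_design(doe_runs)
    coeffs, *_ = np.linalg.lstsq(design, responses, rcond=None)
    return coeffs

def predict_response(coeffs: np.ndarray, run: np.ndarray) -> float:
    """Predict the response (e.g., lateral tracking error) for a new run."""
    return float(build_design(run.reshape(1, -1)) @ coeffs)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical DOE factors: [curve_radius_m, entry_speed_mps, curve_speed_mps]
    doe_runs = rng.uniform([20, 5, 5], [200, 25, 20], size=(40, 3))
    # Hypothetical measured responses from replicated test runs.
    responses = doe_runs[:, 2] ** 2 / doe_runs[:, 0] + rng.normal(0, 0.01, 40)
    coeffs = fit_plant_model(doe_runs, responses)
    print(predict_response(coeffs, np.array([100.0, 15.0, 12.0])))
```

Once fitted, such a surrogate model can stand in for the vehicle during the controller simulation described here.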
The method may use one or more optimization algorithms based on machine learning, artificial intelligence, or a combination thereof. The terms “learning method” and “learning algorithm” are used for simplicity, and it is understood that any optimization algorithm may be used, whether it is a learning algorithm or a non-learning algorithm. For example, the method may include performing a first learning method to identify one or more regimes. In some implementations, the method may include performing a second learning method based on the one or more regimes. The method may include generating one or more parameter tunings based on the first learning method, the second learning method, or a combination thereof. The method may then use the one or more parameter tunings to update the AV controller.
In some implementations, the first learning method may be, for example, an unsupervised learning method. In some implementations, the second learning method may be, for example, a reinforcement learning method. The second learning method may be performed to optimize one or more parameters of each of the one or more regimes. In some implementations, the first learning method, the second learning method, or both, may be performed when the host vehicle is offline. In some implementations, the one or more parameter tunings may be generated when the host vehicle is offline.
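For illustration only, the following is a minimal Python sketch of the first learning method as an unsupervised clustering of simulation performance data into regimes. The use of k-means, the three-regime count, and the column layout are assumptions made for this example.

```python
# Minimal sketch: identify operating regimes from performance data by clustering.
# KMeans and the three-regime assumption are illustrative stand-ins for the
# first learning method; they are not prescribed by the disclosure.
import numpy as np
from sklearn.cluster import KMeans

def identify_regimes(performance_data: np.ndarray, n_regimes: int = 3) -> np.ndarray:
    """Cluster rows of [operating condition..., performance score] into regimes."""
    model = KMeans(n_clusters=n_regimes, n_init=10, random_state=0)
    return model.fit_predict(performance_data)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Hypothetical rows: [curve_speed_mps, curve_radius_m, tracking_error_m]
    data = rng.uniform([5, 20, 0.0], [25, 200, 0.5], size=(100, 3))
    labels = identify_regimes(data)
    for regime in np.unique(labels):
        print(f"regime {regime}: {np.sum(labels == regime)} runs")
```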
In some implementations, the DOE test matrix may include one or more testing parameters. The one or more testing parameters may include, for example, an entry radius, a curve radius, an exit radius, an entry length, a curve length, an exit length, an entry speed, a curve speed, an exit speed, and direction. In some implementations, the method may replicate the DOE test matrix to refine the plant model.
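For illustration only, a full-factorial DOE test matrix may be generated by crossing levels of the testing parameters listed above, as in the following Python sketch. The two-level values shown are hypothetical placeholders.

```python
# Minimal sketch: build a full-factorial DOE test matrix over the testing
# parameters named in the disclosure. The specific levels are hypothetical.
from itertools import product

LEVELS = {
    "entry_radius_m": [50, 150],
    "curve_radius_m": [30, 100],
    "exit_radius_m": [50, 150],
    "entry_length_m": [40, 80],
    "curve_length_m": [60, 120],
    "exit_length_m": [40, 80],
    "entry_speed_mps": [8, 16],
    "curve_speed_mps": [6, 12],
    "exit_speed_mps": [8, 16],
    "direction": ["left", "right"],
}

def build_doe_matrix(levels: dict) -> list:
    """Cross all factor levels to produce one row per experimental condition."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

if __name__ == "__main__":
    matrix = build_doe_matrix(LEVELS)
    print(len(matrix), "conditions;", matrix[0])
```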
In some implementations, the AV controller may be a pure pursuit controller, a kinematic front-wheel based feedback controller, a linear model predictive controller, or a non-linear model predictive controller. In some implementations, the AV controller may be updated in real-time.
A vehicle control system may be used for controlling a vehicle. The vehicle may be an AV and referred to as a host vehicle. The vehicle control system may include a controller, a processor, and a control interface. The control interface may be coupled to or in communication with the controller and the processor.
The processor may be configured to construct a plant model. The plant model may be based on a DOE test matrix. The processor may be configured to perform a controller simulation based on the constructed plant model. The processor may be configured to generate performance data. The performance data may be based on the controller simulation.
The processor may be configured to use one or more optimization algorithms based on machine learning, artificial intelligence, or a combination thereof. For example, the processor may be configured to perform a first learning method. The first learning method may be used to identify one or more regimes. In some implementations, the processor may be configured to perform a second learning method based on the one or more regimes. The processor may be configured to generate one or more parameter tunings based on the first learning method, the second learning method, or a combination thereof. The processor may be configured to transmit the one or more parameter tunings to the controller via the control interface to update the controller.
In some implementations, the first learning method, the second learning method, or both, performed by the processor may be an unsupervised learning method. In some implementations, the first learning method, the second learning method, or both, performed by the processor may be a reinforcement learning method. The second learning method may be performed by the processor to optimize one or more parameters of each of the one or more regimes. In some implementations, the first learning method and the second learning method may be performed by the processor on a condition that the vehicle is offline. In some implementations, the one or more parameter tunings may be generated by the processor on a condition that the vehicle is offline.
In some implementations, the DOE test matrix may include one or more test parameters. The test parameters may include, for example, an entry radius, a curve radius, an exit radius, an entry length, a curve length, an exit length, an entry speed, a curve speed, an exit speed, and direction. In some implementations, the processor may be further configured to replicate the DOE test matrix to refine the plant model.
In some implementations, the controller may be a pure pursuit controller, a kinematic front-wheel based feedback controller, a linear model predictive controller, or a non-linear model predictive controller. In some implementations, the controller may be updated in real-time.
The disclosure is best understood from the following detailed description when read in conjunction with the accompanying drawings. It is emphasized that, according to common practice, the various features of the drawings are not to scale. On the contrary, the dimensions of the various features are arbitrarily expanded or reduced for clarity.
Reference will now be made in greater detail to a preferred embodiment of the invention, an example of which is illustrated in the accompanying drawings. Wherever possible, the same reference numerals will be used throughout the drawings and the description to refer to the same or like parts.
As used herein, the terminology “computer” or “computing device” includes any unit, or combination of units, capable of performing any method, or any portion or portions thereof, disclosed herein.
As used herein, the terminology “processor” indicates one or more processors, such as one or more special purpose processors, one or more microprocessors, one or more controllers, one or more microcontrollers, one or more application processors, one or more central processing units (CPU)s, one or more graphics processing units (GPU)s, one or more digital signal processors (DSP)s, one or more application specific integrated circuits (ASIC)s, one or more application specific standard products, one or more field programmable gate arrays, any other type or combination of integrated circuits, one or more state machines, or any combination thereof.
As used herein, the terminology “memory” indicates any computer-usable or computer-readable medium or device that can tangibly contain, store, communicate, or transport any signal or information that may be used by or in connection with any processor. For example, a memory may be one or more read only memories (ROM), one or more random access memories (RAM), one or more registers, low power double data rate (LPDDR) memories, one or more cache memories, one or more semiconductor memory devices, one or more magnetic media, one or more optical media, one or more magneto-optical media, or any combination thereof.
As used herein, the terminology “instructions” may include directions or expressions for performing any method, or any portion or portions thereof, disclosed herein, and may be realized in hardware, software, or any combination thereof. For example, instructions may be implemented as information, such as a computer program, stored in memory that may be executed by a processor to perform any of the respective methods, algorithms, aspects, or combinations thereof, as described herein. Instructions, or a portion thereof, may be implemented as a special purpose processor, or circuitry, that may include specialized hardware for carrying out any of the methods, algorithms, aspects, or combinations thereof, as described herein. In some implementations, portions of the instructions may be distributed across multiple processors on a single device or across multiple devices, which may communicate directly or across a network such as a local area network, a wide area network, the Internet, or a combination thereof.
As used herein, the terminology “determine” and “identify,” or any variations thereof, includes selecting, ascertaining, computing, looking up, receiving, determining, establishing, obtaining, or otherwise identifying or determining in any manner whatsoever using one or more of the devices and methods shown and described herein.
As used herein, the terminology “example,” “embodiment,” “implementation,” “aspect,” “feature,” or “element” indicates serving as an example, instance, or illustration. Unless expressly indicated, any example, embodiment, implementation, aspect, feature, or element is independent of each other example, embodiment, implementation, aspect, feature, or element and may be used in combination with any other example, embodiment, implementation, aspect, feature, or element.
As used herein, the terminology “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to indicate any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then “X includes A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from the context to be directed to a singular form.
Further, for simplicity of explanation, although the figures and descriptions herein may include sequences or series of steps or stages, elements of the methods disclosed herein may occur in various orders or concurrently. Additionally, elements of the methods disclosed herein may occur with other elements not explicitly presented and described herein. Furthermore, not all elements of the methods described herein may be required to implement a method in accordance with this disclosure. Although aspects, features, and elements are described herein in particular combinations, each aspect, feature, or element may be used independently or in various combinations with or without other aspects, features, and elements.
An AV includes an object detection system, a navigation system, and a controller system. The object detection system is configured to determine whether other vehicles or objects such as pedestrians and cyclists will intersect the travel path of the host vehicle. The navigation system is configured to determine a travel path for the host vehicle. The controller system is configured to communicate with the object detection system and the navigation system to operate a steering/acceleration profile for the host vehicle that avoids potential collisions with other vehicles or objects.
One example controller is a pure pursuit controller that does not consider path curvature. Another example controller may be a kinematic front-wheel based feedback controller that considers forward driving. Stability of a linear model predictive controller may depend on horizon length. A non-linear model predictive controller may consume substantial communication and computing power. Choosing the right controller may be difficult given the numerous choices and applications. In addition, it is time consuming to develop and tune a lateral and longitudinal control system. Each time the controller is modified, it requires retuning and excessive testing. Each controller requires unique tuning methods.
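For illustration only, the following Python sketch shows the pure pursuit steering law mentioned above, using the standard bicycle-model geometry. The lookahead point and wheelbase values are hypothetical.

```python
# Minimal sketch: pure pursuit steering from a lookahead point in the vehicle
# frame (x forward, y left). The geometry follows the standard bicycle model.
import math

def pure_pursuit_steering(lookahead_x: float, lookahead_y: float,
                          wheelbase_m: float) -> float:
    """Return a front steering angle (rad) that drives the vehicle toward the
    lookahead point along a circular arc."""
    lookahead_dist = math.hypot(lookahead_x, lookahead_y)
    if lookahead_dist < 1e-6:
        return 0.0
    # Curvature of the arc through the lookahead point: kappa = 2*y / L^2.
    curvature = 2.0 * lookahead_y / (lookahead_dist ** 2)
    return math.atan(wheelbase_m * curvature)

if __name__ == "__main__":
    # Lookahead point 10 m ahead and 1 m to the left, 2.8 m wheelbase.
    print(math.degrees(pure_pursuit_steering(10.0, 1.0, 2.8)))
```

As the text notes, this law ignores path curvature beyond the lookahead point, which is one reason a given controller choice may need regime-specific tuning.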
The steering system 1030 may include a steering actuator 1040 that is an electric power-assisted steering actuator. The brake system may include one or more brakes 1050 coupled to respective wheels 1060 of the vehicle 1000. Additionally, the processor 1020 may be programmed to command the brake system to apply a net asymmetric braking force by each brake 1050 applying a different braking force than the other brakes 1050.
The processor 1020 may be further programmed to command the brake system to apply a braking force, for example a net asymmetric braking force, in response to a failure of the steering system 1030. Additionally or alternatively, the processor 1020 may be programmed to provide a warning to an occupant in response to the failure of the steering system 1030. The steering system 1030 may be a power-steering control module. The control system 1010 may include the steering system 1030. Additionally, the control system 1010 may include the brake system.
The steering system 1030 may include a steering actuator 1040 that is an electric power-assisted steering actuator. The brake system may include two brakes 1050 coupled to respective wheels 1060 on opposite sides of the vehicle 1000. Additionally, the method may include commanding the brake system to apply a net asymmetric braking force by each brake 1050 applying a different braking force.
The control system 1010 allows one of the steering system 1030 and the brake system to take over for the other of the steering system 1030 and the brake system if the other fails while the vehicle 1000 is executing a turn. Whichever of the steering system 1030 and the braking system remains operable is then able to apply sufficient yaw torque to the vehicle 1000 to continue the turn. The vehicle 1000 is therefore less likely to impact an object such as another vehicle or a roadway barrier, and any occupants of the vehicle 1000 are less likely to be injured.
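For illustration only, the following Python sketch translates a desired yaw torque into a net asymmetric braking force, as described above. The half-track lever-arm model and force limit are simplifying assumptions.

```python
# Minimal sketch: split a desired yaw torque into left/right braking forces.
# The half-track lever arm and the force limit are illustrative assumptions.
def asymmetric_brake_forces(desired_yaw_torque_nm: float, track_width_m: float,
                            max_force_n: float = 5000.0) -> tuple:
    """Return (left_force_n, right_force_n). A positive yaw torque (a left turn)
    is produced by braking the left side harder than the right side."""
    # A left/right force difference acting at the half-track lever arm
    # produces the requested yaw torque.
    delta = desired_yaw_torque_nm / (track_width_m / 2.0)
    delta = max(-max_force_n, min(max_force_n, delta))
    if delta >= 0.0:
        return delta, 0.0
    return 0.0, -delta

if __name__ == "__main__":
    print(asymmetric_brake_forces(3000.0, 1.6))  # ~3750 N on the left wheel
```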
The vehicle 1000 may operate in one or more of the levels of autonomous vehicle operation. For purposes of this disclosure, an autonomous mode is defined as one in which each of propulsion (e.g., via a powertrain including an electric motor and/or an internal combustion engine), braking, and steering of the vehicle 1000 are controlled by the processor 1020; in a semi-autonomous mode the processor 1020 controls one or two of the propulsion, braking, and steering of the vehicle 1000. Thus, in one example, non-autonomous modes of operation may refer to SAE levels 0-1, partially autonomous or semi-autonomous modes of operation may refer to SAE levels 2-3, and fully autonomous modes of operation may refer to SAE levels 4-5.
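For illustration only, the mode definitions above can be expressed as a simple classification over which subsystems the processor controls; the names in the following Python sketch are hypothetical.

```python
# Minimal sketch: classify the operating mode from which of propulsion,
# braking, and steering the processor controls, per the definitions above.
from enum import Enum

class Mode(Enum):
    NON_AUTONOMOUS = 0   # roughly SAE levels 0-1
    SEMI_AUTONOMOUS = 1  # roughly SAE levels 2-3
    AUTONOMOUS = 2       # roughly SAE levels 4-5

def classify_mode(controls_propulsion: bool, controls_braking: bool,
                  controls_steering: bool) -> Mode:
    controlled = sum([controls_propulsion, controls_braking, controls_steering])
    if controlled == 3:
        return Mode.AUTONOMOUS
    if controlled >= 1:
        return Mode.SEMI_AUTONOMOUS
    return Mode.NON_AUTONOMOUS

if __name__ == "__main__":
    print(classify_mode(True, True, False))  # Mode.SEMI_AUTONOMOUS
```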
The control system 1010 may transmit signals through the communications network, which may be a controller area network (CAN) bus, Ethernet, Local Interconnect Network (LIN), Bluetooth, and/or by any other wired or wireless communications network. The processor 1020 may be in communication with a propulsion system 2010, the steering system 1030, the brake system 2020, sensors 2030, and/or a user interface 2040, among other components.
The steering column 1080 transfers rotation of the steering wheel 1070 to movement of the steering rack 1090. The steering column 1080 may be, e.g., a shaft connecting the steering wheel 1070 to the steering rack 1090. The steering column 1080 may house a torsion sensor and a clutch (not shown).
The steering wheel 1070 allows an operator to steer the vehicle 1000 by transmitting rotation of the steering wheel 1070 to movement of the steering rack 1090. The steering wheel 1070 may be, e.g., a rigid ring fixedly attached to the steering column 1080 such as is known.
The steering actuator 1040 may provide power assist to the steering system 1030. In other words, the steering actuator 1040 may provide torque in a direction in which the steering wheel 1070 is being rotated by a human driver, allowing the driver to turn the steering wheel 1070 with less effort. The steering actuator 1040 may be an electric power-assisted steering actuator.
The user interface 2040 presents information to and receives information from an occupant of the vehicle 1000. The user interface 2040 may be located, e.g., on an instrument panel in a passenger cabin (not shown) of the vehicle 1000, or wherever it may be readily seen by the occupant. The user interface 2040 may include dials, digital readouts, screens, speakers, and so on for output, i.e., providing information to the occupant, e.g., a human-machine interface (HMI) including elements such as are known. The user interface 2040 may include buttons, knobs, keypads, touchscreens, microphones, and so on for receiving input, i.e., information, instructions, etc., from the occupant.
Wireless transceiver 3072 may include one or more devices configured to exchange transmissions over an air interface to one or more networks (e.g., cellular, the Internet, etc.) by use of a radio frequency, infrared frequency, magnetic field, or an electric field. Wireless transceiver 3072 may use any known standard to transmit and/or receive data (e.g., Wi-Fi, Bluetooth®, Bluetooth Smart, 802.15.4, ZigBee, etc.). Such transmissions may include communications from the host vehicle to one or more remotely located servers. Such transmissions may also include communications (one-way or two-way) between the host vehicle and one or more target vehicles in an environment of the host vehicle (e.g., to facilitate coordination of navigation of the host vehicle in view of or together with target vehicles in the environment of the host vehicle), or even a broadcast transmission to unspecified recipients in a vicinity of the transmitting vehicle.
Both applications processor 3080 and image processor 3090 may include various types of hardware-based processing devices. For example, either or both of applications processor 3080 and image processor 3090 may include a microprocessor, preprocessors (such as an image preprocessor), graphics processors, a central processing unit (CPU), support circuits, digital signal processors, integrated circuits, memory, or any other types of devices suitable for running applications and for image processing and analysis. In some embodiments, applications processor 3080 and/or image processor 3090 may include any type of single or multi-core processor, mobile device microcontroller, central processing unit, or the like.
In some embodiments, applications processor 3080 and/or image processor 3090 may include multiple processing units with local memory and instruction sets. Such processors may include video inputs for receiving image data from multiple image sensors and may also include video out capabilities. In one example, the processor may use 90 nm technology operating at 332 MHz.
Any of the processing devices disclosed herein may be configured to perform certain functions. Configuring a processing device, such as any of the described processors, other controllers or microprocessors, to perform certain functions may include programming of computer executable instructions and making those instructions available to the processing device for execution during operation of the processing device. In some embodiments, configuring a processing device may include programming the processing device directly with architectural instructions. In other embodiments, configuring a processing device may include storing executable instructions on a memory that is accessible to the processing device during operation. For example, the processing device may access the memory to obtain and execute the stored instructions during operation. In either case, the processing device configured to perform the sensing, image analysis, and/or navigational functions disclosed herein represents a specialized hardware-based system in control of multiple hardware based components of a host vehicle.
Processing unit 3010 may comprise various types of devices. For example, processing unit 3010 may include various devices, such as a controller, an image preprocessor, a central processing unit (CPU), support circuits, digital signal processors, integrated circuits, memory, or any other types of devices for image processing and analysis. The image preprocessor may include a video processor for capturing, digitizing and processing the imagery from the image sensors. The CPU may comprise any number of microcontrollers or microprocessors. The support circuits may be any number of circuits generally well known in the art, including cache, power supply, clock and input-output circuits. The memory may store software that, when executed by the processor, controls the operation of the system. The memory may include databases and image processing software. The memory may comprise any number of random access memories, read only memories, flash memories, disk drives, optical storage, tape storage, removable storage and other types of storage. In one instance, the memory may be separate from the processing unit 3010. In another instance, the memory may be integrated into the processing unit 3010.
Each memory 3040, 3050 may include software instructions that when executed by a processor (e.g., applications processor 3080 and/or image processor 3090), may control operation of various aspects of vehicle control system 3000. These memory units may include various databases and image processing software, as well as a trained system, such as a neural network, or a deep neural network, for example. The memory units may include random access memory, read only memory, flash memory, disk drives, optical storage, tape storage, removable storage and/or any other types of storage. In some embodiments, memory units 3040, 3050 may be separate from the applications processor 3080 and/or image processor 3090. In other embodiments, these memory units may be integrated into applications processor 3080 and/or image processor 3090.
Position sensor 3030 may include any type of device suitable for determining a location associated with at least one component of vehicle control system 3000. In some embodiments, position sensor 3030 may include a GPS receiver. Such receivers can determine a user position and velocity by processing signals broadcasted by global positioning system satellites. Position information from position sensor 3030 may be made available to applications processor 3080 and/or image processor 3090.
In some embodiments, vehicle control system 3000 may include components such as a speed sensor (e.g., a speedometer) for measuring a speed of vehicle 1000. Vehicle control system 3000 may also include one or more accelerometers (either single axis or multi-axis) for measuring accelerations of vehicle 1000 along one or more axes.
The memory units 3040, 3050 may include a database, or data organized in any other form, that indicates locations of known landmarks. Sensory information (such as images, radar signals, or depth information from lidar or stereo processing of two or more images) of the environment may be processed together with position information, such as a GPS coordinate or the vehicle's ego motion, to determine a current location of the vehicle relative to the known landmarks and refine the vehicle location.
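For illustration only, the following Python sketch refines a GPS position with one known landmark, in the spirit of the localization described above. The weighted-average fusion rule and weight value are assumptions made for this example.

```python
# Minimal sketch: refine a GPS position using a known landmark observed by the
# vehicle's sensors. The weighted-average fusion rule is an illustrative choice.
import numpy as np

def refine_position(gps_xy: np.ndarray, landmark_xy: np.ndarray,
                    measured_offset_xy: np.ndarray,
                    landmark_weight: float = 0.7) -> np.ndarray:
    """measured_offset_xy is the landmark position relative to the vehicle
    (e.g., from lidar or stereo processing), expressed in map axes."""
    position_from_landmark = landmark_xy - measured_offset_xy
    return (1.0 - landmark_weight) * gps_xy + landmark_weight * position_from_landmark

if __name__ == "__main__":
    gps = np.array([105.0, 52.0])
    landmark = np.array([120.0, 60.0])   # known map position of a traffic signal
    offset = np.array([14.2, 7.5])       # landmark as measured from the vehicle
    print(refine_position(gps, landmark, offset))
```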
User interface 3070 may include any device suitable for providing information to or for receiving inputs from one or more users of vehicle control system 3000. In some embodiments, user interface 3070 may include user input devices, including, for example, a touchscreen, microphone, keyboard, pointer devices, track wheels, cameras, knobs, buttons, or the like. With such input devices, a user may be able to provide information inputs or commands to vehicle control system 3000 by typing instructions or information, providing voice commands, selecting menu options on a screen using buttons, pointers, or eye-tracking capabilities, or through any other suitable techniques for communicating information to vehicle control system 3000.
User interface 3070 may be equipped with one or more processing devices configured to provide and receive information to or from a user and process that information for use by, for example, applications processor 3080. In some embodiments, such processing devices may execute instructions for recognizing and tracking eye movements, receiving and interpreting voice commands, recognizing and interpreting touches and/or gestures made on a touchscreen, responding to keyboard entries or menu selections, etc. In some embodiments, user interface 3070 may include a display, speaker, tactile device, and/or any other devices for providing output information to a user.
Map database 3060 may include any type of database for storing map data useful to vehicle control system 3000. In some embodiments, map database 3060 may include data relating to the position, in a reference coordinate system, of various items, including roads, water features, geographic features, businesses, points of interest, restaurants, gas stations, etc. Map database 3060 may store not only the locations of such items, but also descriptors relating to those items, including, for example, names associated with any of the stored features. In some embodiments, map database 3060 may be physically located with other components of vehicle control system 3000. Alternatively or additionally, map database 3060 or a portion thereof may be located remotely with respect to other components of vehicle control system 3000 (e.g., processing unit 3010). In such embodiments, information from map database 3060 may be downloaded over a wired or wireless data connection to a network (e.g., over a cellular network and/or the Internet, etc.). In some cases, map database 3060 may store a sparse data model including polynomial representations of certain road features (e.g., lane markings) or target trajectories for the host vehicle. Map database 3060 may also include stored representations of various recognized landmarks that may be used to determine or update a known position of the host vehicle with respect to a target trajectory. The landmark representations may include data fields such as landmark type, landmark location, among other potential identifiers.
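For illustration only, the following Python sketch evaluates a polynomial representation of a road feature, such as a lane marking, of the kind a sparse data model as described above might store. The cubic form and coefficient values are hypothetical.

```python
# Minimal sketch: evaluate a stored polynomial lane-marking representation,
# y(x) = c0 + c1*x + c2*x^2 + c3*x^3, at a longitudinal distance x.
import numpy as np

def lateral_offset(coeffs: np.ndarray, longitudinal_m: float) -> float:
    """coeffs are [c0, c1, c2, c3]; np.polyval expects highest degree first."""
    return float(np.polyval(coeffs[::-1], longitudinal_m))

if __name__ == "__main__":
    coeffs = np.array([0.1, 0.02, -0.001, 0.00002])  # hypothetical stored values
    for x in (0.0, 10.0, 20.0):
        print(x, lateral_offset(coeffs, x))
```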
Image capture devices 3022, 3024, and 3026 may each include any type of device suitable for capturing at least one image from an environment. Moreover, any number of image capture devices may be used to acquire images for input to the image processor. Some embodiments may include only a single image capture device, while other embodiments may include two, three, or even four or more image capture devices. Image capture devices 3022, 3024, and 3026 are described in further detail below.
One or more cameras (e.g., image capture devices 3022, 3024, and 3026) may be part of a sensing block included on a vehicle. Various other sensors may be included in the sensing block, and any or all of the sensors may be relied upon to develop a sensed navigational state of the vehicle. In addition to cameras (forward, sideward, rearward, etc.), other sensors such as RADAR, LIDAR, and acoustic sensors may be included in the sensing block. Additionally, the sensing block may include one or more components configured to communicate and transmit/receive information relating to the environment of the vehicle. For example, such components may include wireless transceivers (RF, etc.) that may receive, from a source remotely located with respect to the host vehicle, sensor-based information or any other type of information relating to the environment of the host vehicle. Such information may include sensor output information, or related information, received from vehicle systems other than the host vehicle. In some embodiments, such information may include information received from a remote computing device, a centralized server, etc. Furthermore, the cameras may take on many different configurations: single camera units, multiple cameras, camera clusters, long FOV, short FOV, wide angle, fisheye, or the like.
The image capture devices included on vehicle 1000 as part of the image acquisition unit 3020 may be positioned at any suitable location. In some embodiments, image capture device 3022 may be located in the vicinity of the rearview mirror. This position may provide a line of sight similar to that of the driver of vehicle 1000, which may aid in determining what is and is not visible to the driver. Image capture device 3022 may be positioned at any location near the rearview mirror, but placing image capture device 3022 on the driver side of the mirror may further aid in obtaining images representative of the driver's field of view and/or line of sight.
Other locations for the image capture devices of image acquisition unit 3020 may also be used. For example, image capture device 3024 may be located on or in a bumper of vehicle 1000. Such a location may be especially suitable for image capture devices having a wide field of view. The line of sight of bumper-located image capture devices can be different from that of the driver and, therefore, the bumper image capture device and driver may not always see the same objects. The image capture devices (e.g., image capture devices 3022, 3024, and 3026) may also be located in other locations. For example, the image capture devices may be located on or in one or both of the side mirrors of vehicle 1000, on the roof of vehicle 1000, on the hood of vehicle 1000, on the trunk of vehicle 1000, on the sides of vehicle 1000, mounted on, positioned behind, or positioned in front of any of the windows of vehicle 1000, and mounted in or near light fixtures on the front and/or back of vehicle 1000.
In addition to image capture devices, vehicle 1000 may include various other components of vehicle control system 3000. For example, processing unit 3010 may be included on vehicle 1000 either integrated with or separate from an engine control unit (ECU) of the vehicle. Vehicle 1000 may also be equipped with a position sensor 3030, such as a GPS receiver and may also include a map database 3060 and memory units 3040 and 3050.
As discussed earlier, wireless transceiver 3072 may transmit and/or receive data over one or more networks (e.g., cellular networks, the Internet, etc.). For example, wireless transceiver 3072 may upload data collected by vehicle control system 3000 to one or more servers, and download data from the one or more servers. Via wireless transceiver 3072, vehicle control system 3000 may receive, for example, periodic or on demand updates to data stored in map database 3060, memory 3040, and/or memory 3050. Similarly, wireless transceiver 3072 may upload any data (e.g., images captured by image acquisition unit 3020, data received by position sensor 3030 or other sensors, vehicle control systems, etc.) from vehicle control system 3000 and/or any data processed by processing unit 3010 to the one or more servers.
Vehicle control system 3000 may upload data to a server (e.g., to the cloud) based on a privacy level setting. For example, vehicle control system 3000 may implement privacy level settings to regulate or limit the types of data (including metadata) sent to the server that may uniquely identify a vehicle and/or driver/owner of a vehicle. Such settings may be set by a user via, for example, wireless transceiver 3072, initialized by factory default settings, or set by data received by wireless transceiver 3072.
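For illustration only, the following Python sketch gates an upload by a privacy level setting as described above. The level names and the fields blocked at each level are assumptions made for this example.

```python
# Minimal sketch: strip identifying fields from an upload payload according to
# a configured privacy level. Level names and blocked fields are illustrative.
PRIVACY_RULES = {
    "high": {"vin", "driver_id", "gps_trace", "images"},
    "medium": {"vin", "driver_id"},
    "low": set(),
}

def filter_upload(payload: dict, privacy_level: str) -> dict:
    """Drop fields the configured privacy level does not allow to leave the vehicle."""
    blocked = PRIVACY_RULES.get(privacy_level, PRIVACY_RULES["high"])
    return {key: value for key, value in payload.items() if key not in blocked}

if __name__ == "__main__":
    payload = {"vin": "vin-placeholder", "speed_mps": 14.2,
               "gps_trace": [(45.0, -75.0)], "images": []}
    print(filter_upload(payload, "medium"))
```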
A design of experiments (DOE) method may be implemented in combination with entry radius, curve radius, exit radius, entry length, curve length, exit length, entry speed, curve speed, exit speed, and direction. Multiple replications of the test matrix may be conducted to collect data that is used to construct a plant model. Simulations are then conducted with the plant model to generate performance scores for various controller parameters. The performance data is input into an unsupervised learning algorithm to identify appropriate parameter regimes. An optimization algorithm, for example a reinforcement learning algorithm, is applied to optimize the parameters within each regime.
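For illustration only, the following Python sketch optimizes controller parameters within one identified regime against a plant-model simulation. A random search stands in for the reinforcement learning algorithm, and the gain names and scoring function are assumptions made for this example.

```python
# Minimal sketch: per-regime parameter optimization. A random search stands in
# for the reinforcement learning algorithm; gains and scoring are illustrative.
import numpy as np

def simulate_score(params: np.ndarray, regime_conditions: np.ndarray) -> float:
    """Stand-in for a closed-loop simulation on the plant model (lower is better)."""
    lookahead_gain, speed_gain = params
    error = (np.abs(regime_conditions[:, 0] * 0.05 - lookahead_gain)
             + np.abs(regime_conditions[:, 1] * 0.02 - speed_gain))
    return float(np.mean(error))

def optimize_regime(regime_conditions: np.ndarray, iterations: int = 200,
                    seed: int = 0) -> np.ndarray:
    """Keep the best-scoring candidate tuning for this regime."""
    rng = np.random.default_rng(seed)
    best_params, best_score = None, np.inf
    for _ in range(iterations):
        candidate = rng.uniform([0.1, 0.0], [5.0, 2.0])
        score = simulate_score(candidate, regime_conditions)
        if score < best_score:
            best_params, best_score = candidate, score
    return best_params

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Hypothetical conditions for one regime: [curve_speed_mps, curve_radius_m]
    regime = rng.uniform([6, 30], [12, 100], size=(50, 2))
    print(optimize_regime(regime))
```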
Table 1 below is an example subset of experimental conditions that may be used for a DOE.
In order to determine where the host vehicle 5010 is located on the digital map 5120, the navigation device 5090 may include a location device 5140, such as a GPS receiver. Alternatively, or in combination with the location device 5140, the navigation device 5090 may include an image device 5150. The image device 5150 may include a camera, a radar unit, a LIDAR unit, or any combination thereof, used to detect relatively permanent objects proximate to the host vehicle 5010 that are indicated on the digital map 5120, for example, traffic signals, buildings, etc., and determine a location relative to those objects in order to determine where the host vehicle 5010 is located on the digital map 5120. This process may be referred to as map localization. The functions of the navigation device 5090, the information provided by the navigation device 5090, or both, may be provided all or in part by way of V2I communications, V2V communications, vehicle-to-pedestrian (V2P) communications, or a combination thereof, which may generically be labeled as V2X communications 5160.
The function of the image device 5150 may be provided by, but not limited to, a camera 5170, a radar unit 5180, a LIDAR unit 5190, or any combination thereof, which may also be shared with an object detector 5200. In some implementations, the object detector 5200 may include a sonar unit 5210. The object detector 5200 may be used to detect the relative location of the other entity 5070, and determine an intersection point where the other entity 5070 will intersect the travel path of the host vehicle 5010. In order to determine the intersection point and the relative timing of when the host vehicle 5010 and the other entity 5070 will arrive at the intersection point, the object detector 5200 may be used by the vehicle control system 5000 to determine, for example, a relative speed, a separation distance of the other entity 5070 from the host vehicle 5010, or both. The functions of the object detector 5200, the information provided by the object detector 5200, or both, may be provided all or in part by way of V2I communications, V2V communications, V2P communications, or a combination thereof, which may generically be labeled as V2X communications 5160. Accordingly, the vehicle control system 5000 may include a transceiver to enable such communications.
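For illustration only, the following Python sketch computes an intersection point and the relative arrival times of the host vehicle and another entity under constant-velocity, straight-line assumptions; a real object detector would account for curved paths and measurement uncertainty.

```python
# Minimal sketch: intersection point and arrival times for two straight-line,
# constant-velocity paths. The kinematic assumptions are illustrative.
import numpy as np

def path_intersection(host_pos, host_vel, other_pos, other_vel):
    """Return (intersection_point, host_time_s, other_time_s), or None if the
    paths are parallel. Inputs are 2-D numpy arrays."""
    # Solve host_pos + t_h * host_vel == other_pos + t_o * other_vel.
    A = np.column_stack([host_vel, -other_vel])
    if abs(np.linalg.det(A)) < 1e-9:
        return None
    t_h, t_o = np.linalg.solve(A, other_pos - host_pos)
    return host_pos + t_h * host_vel, float(t_h), float(t_o)

if __name__ == "__main__":
    result = path_intersection(np.array([0.0, 0.0]), np.array([14.0, 0.0]),
                               np.array([60.0, -40.0]), np.array([0.0, 8.0]))
    print(result)  # compare the two arrival times to assess the conflict
```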
The vehicle control system 5000 includes the controller 5130 that is in communication with a control interface 5170, the object detector 5200, and the navigation device 5090. The communication may be by way of, but not limited to, wires, wireless communication, or optical fiber. The controller 5130 may include a processor such as a microprocessor or other control circuitry such as analog circuitry, digital circuitry, or both, including an application specific integrated circuit (ASIC) for processing data. The controller 5130 may include a memory, including non-volatile memory, such as electrically erasable programmable read-only memory (EEPROM) for storing one or more routines, thresholds, captured data, or a combination thereof. The one or more routines may be associated with a DOE test matrix and may be executed by the processor to perform steps to determine vehicle operations, vehicle conditions, or both, under activity conditions.
The controller 5130 may operate in a self-developing mode 5220 such that the controller 5130 is automatically developed based on the DOE test matrix 5110 and the capture of data related to the vehicle operations under activity conditions. The controller 5130 may self-develop offline where the host vehicle 5010 is not under activity conditions, using a controller simulator 5230. The controller simulator 5230 performs simulations based on the DOE test matrix 5110. The controller simulator 5230 collects data resulting from the performed simulations and feeds the collected data to a learning module, for example a deep learning module 5240. The deep learning module 5240 may perform supervised learning, unsupervised learning, reinforcement learning, or a combination thereof. The deep learning module 5240 may identify regimes and generate controller parameters for the controller 5130. The deep learning module 5240 may provide feedback to the DOE test matrix 5110, the controller simulator 5230, or both. The feedback may be used to update the DOE test matrix 5110, the controller simulator 5230, or both. The controller parameters are transmitted via a controller interface 5250 to the controller 5130. In some embodiments, the controller 5130 may self-develop online where the host vehicle 5010 is under activity conditions. The DOE test matrix 5110, the controller simulator 5230, and the deep learning module 5240 may be included on a single processor, individual processors, the controller 5130, or any combination thereof.
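For illustration only, the following Python sketch captures the shape of the offline self-developing loop described above: the simulator runs the DOE test matrix, a learning module proposes controller parameters, and feedback updates the test matrix. The class names, the averaging stand-in for the deep learning module, and the replicate-around-the-worst-run feedback rule are assumptions made for this example.

```python
# Minimal sketch: offline self-development loop. The simulator and learning
# module are simple stand-ins; a real system would use the plant model and a
# deep learning module as described in the disclosure.
import numpy as np

class ControllerSimulator:
    def run(self, test_matrix: np.ndarray, params: np.ndarray) -> np.ndarray:
        """Return one performance score per DOE row (stand-in for simulation)."""
        return np.abs(test_matrix @ np.array([0.01, 0.05]) - params.sum())

class LearningModule:
    def propose_params(self, test_matrix: np.ndarray, scores: np.ndarray) -> np.ndarray:
        """Stand-in for the deep learning module: move toward the best rows."""
        best_rows = test_matrix[np.argsort(scores)[:5]]
        return best_rows.mean(axis=0) * np.array([0.01, 0.05])

def self_develop(test_matrix: np.ndarray, iterations: int = 5) -> np.ndarray:
    simulator, learner = ControllerSimulator(), LearningModule()
    params = np.array([0.5, 0.5])
    for _ in range(iterations):
        scores = simulator.run(test_matrix, params)
        params = learner.propose_params(test_matrix, scores)
        # Feedback: replicate conditions around the worst-scoring row.
        worst = test_matrix[np.argmax(scores)]
        test_matrix = np.vstack([test_matrix, worst * 0.95, worst * 1.05])
    return params  # would be sent to the controller via the controller interface

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    matrix = rng.uniform([30, 6], [100, 12], size=(20, 2))  # [curve_radius, curve_speed]
    print(self_develop(matrix))
```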
The controller 5130 may operate in a self-tuning mode 5260 such that the controller 5130 is automatically tuned based on the DOE test matrix 5110 and the capture of data related to the vehicle operations under activity conditions. In this example, the controller 5130 may self-tune online where the host vehicle 5010 is under activity conditions. The controller simulator 5230 performs simulations based on the DOE test matrix 5110. The controller simulator 5230 collects data resulting from the performed simulations and feeds the collected data to a learning module, for example the deep learning module 5240. The deep learning module 5240 may perform supervised learning, unsupervised learning, reinforcement learning, or a combination thereof. The deep learning module 5240 may identify regimes and generate tuning parameters to tune the controller 5130. The deep learning module 5240 may provide feedback to the DOE test matrix 5110, the controller simulator 5230, or both. The feedback may include data from any actuator of the host vehicle 5010, for example a steering actuator, a brake actuator, a throttle actuator, or a combination thereof. The feedback may be used to update the DOE test matrix 5110, the controller simulator 5230, or both. The tuning parameters are transmitted via a controller interface 5250 to the controller 5130. The DOE test matrix 5110, the controller simulator 5230, and the deep learning module 5240 may be included on a single processor, individual processors, the controller 5130, or any combination thereof.
The vehicle control system 6000 may be implemented as part of a host vehicle 6010, and may use a DOE test matrix to capture vehicle operations under activity conditions. A subset of the DOE test matrix used may be similar to the subset shown in Table 1 above. The host vehicle 6010 may operate in automated mode 6020 where a human operator is not needed to operate the vehicle 6010. Alternatively, the host vehicle may operate in manual mode 6030 where the degree or level of automation may be little more than providing steering advice to a human operator who is generally in control of the steering 6040, accelerator 6050, and brakes 6060 of the host vehicle 6010. For example, in manual mode 6030, the vehicle control system 6000 may assist the human operator as needed to arrive at a selected destination, avoid interference or collision with another entity 6070, or both. The other entity 6070 may be another vehicle, a pedestrian, a building, a tree, an animal, or any other object that the vehicle 6010 may encounter.
In order to determine where the host vehicle 6010 is located on the digital map 6120, the navigation device 6090 may include a location device 6140, such as a GPS receiver. Alternatively, or in combination with the location device 6140, the navigation device 6090 may include an image device 6150. The image device 6150 may include a camera, a radar unit, a LIDAR unit, or any combination thereof, used to detect relatively permanent objects proximate to the host vehicle 6010 that are indicated on the digital map 6120, for example, traffic signals, buildings, etc., and determine a location relative to those objects in order to determine where the host vehicle 6010 is located on the digital map 6120. This process may be referred to as map localization. The functions of the navigation device 6090, the information provided by the navigation device 6090, or both, may be provided all or in part by way of V2I communications, V2V communications, vehicle-to-pedestrian (V2P) communications, or a combination thereof, which may generically be labeled as V2X communications 6160.
The function of the image device 6150 may be provided by, but not limited to, a camera 6170, a radar unit 6180, a LIDAR unit 6190, or any combination thereof, which may also be shared with an object detector 6200. In some implementations, the object detector 6200 may include a sonar unit 6210. The object detector 6200 may be used to detect the relative location of the other entity 6070, and determine an intersection point where the other entity 6070 will intersect the travel path of the host vehicle 6010. In order to determine the intersection point and the relative timing of when the host vehicle 6010 and the other entity 6070 will arrive at the intersection point, the object detector 6200 may be used by the vehicle control system 6000 to determine, for example, a relative speed, a separation distance of the other entity 6070 from the host vehicle 6010, or both. The functions of the object detector 6200, the information provided by the object detector 6200, or both, may be provided all or in part by way of V2I communications, V2V communications, V2P communications, or a combination thereof, which may generically be labeled as V2X communications 6160. Accordingly, the vehicle control system 6000 may include a transceiver to enable such communications.
The vehicle control system 6000 includes the controller 6130 that is in communication with a control interface 6170, the object detector 6200, and the navigation device 6090. The communication may be by way of, but not limited to, wires, wireless communication, or optical fiber. The controller 6130 may include a processor such as a microprocessor or other control circuitry such as analog circuitry, digital circuitry, or both, including an ASIC for processing data. The controller 6130 may include a memory, including non-volatile memory, such as an EEPROM for storing one or more routines, thresholds, captured data, or a combination thereof. The one or more routines may be associated with a DOE test matrix and may be executed by the processor to perform steps to determine vehicle operations, vehicle conditions, or both, under activity conditions.
The controller 6130 may operate in a self-developing mode 6220 such that the controller 6130 is automatically developed based on the DOE test matrix 6110 and the capture of data related to the vehicle operations under activity conditions. The controller 6130 may self-develop offline where the host vehicle 6010 is not under activity conditions, using a controller simulator 6230. The controller simulator 6230 performs simulations based on the DOE test matrix 6110. The controller simulator 6230 collects data resulting from the performed simulations and feeds the collected data to a learning module, for example a generator network such as a deep generator network 6240. The deep generator network 6240 may perform supervised learning, unsupervised learning, reinforcement learning, or a combination thereof. The deep generator network 6240 may identify regimes and generate controller parameters for the controller 6130. The deep generator network 6240 may provide feedback to the DOE test matrix 6110, the controller simulator 6230, or both. The feedback may be used to update the DOE test matrix 6110, the controller simulator 6230, or both. The controller parameters are transmitted via a deep discriminator network 6250 to the controller 6130. In some embodiments, the deep discriminator network 6250 may be a part of the controller 6130. In some embodiments, the deep generator network 6240 may operate offline, and the deep discriminator network 6250 may operate either online or offline to evaluate the candidates that are received from the deep generator network 6240. In some embodiments, the controller 6130 may self-develop online where the host vehicle 6010 is under activity conditions. The DOE test matrix 6110, the controller simulator 6230, and the deep generator network 6240 may be included on a single processor, individual processors, the controller 6130, or any combination thereof.
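For illustration only, the following Python sketch shows the shape of the generator/discriminator arrangement described above: a generator proposes candidate parameter sets and a discriminator scores and accepts or rejects them before they reach the controller. Both networks are replaced by simple stand-ins, and the threshold and scoring rule are assumptions made for this example.

```python
# Minimal sketch: a generator proposes candidate controller tunings and a
# discriminator evaluates them. Both are simple stand-ins for the deep
# generator and deep discriminator networks described in the disclosure.
import numpy as np

def generate_candidates(n_candidates: int, rng: np.random.Generator) -> np.ndarray:
    """Stand-in for the deep generator network: propose candidate tunings."""
    return rng.uniform([0.1, 0.0], [5.0, 2.0], size=(n_candidates, 2))

def discriminate(candidate: np.ndarray) -> float:
    """Stand-in for the deep discriminator network: score a candidate, e.g.,
    against recent vehicle data or simulation results (higher is better)."""
    target = np.array([1.5, 0.4])  # hypothetical well-performing tuning
    return float(-np.linalg.norm(candidate - target))

def select_tuning(n_candidates: int = 50, accept_threshold: float = -0.5):
    rng = np.random.default_rng(4)
    candidates = generate_candidates(n_candidates, rng)
    scores = np.array([discriminate(c) for c in candidates])
    best = int(np.argmax(scores))
    if scores[best] < accept_threshold:
        return None  # rejected: the generator must propose new candidates
    return candidates[best]  # forwarded to the controller

if __name__ == "__main__":
    print(select_tuning())
```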
The controller 6130 may operate in a self-tuning mode 6260 such that the controller 6130 is automatically tuned based on the DOE test matrix 6110 and the capture of data related to the vehicle operations under activity conditions. In this example, the controller 6130 may self-tune online where the host vehicle 6010 is under activity conditions. The controller simulator 6230 performs simulations based on the DOE test matrix 6110. The controller simulator 6230 collects data resulting from the performed simulations and feeds the collected data to a learning module, for example the deep generator network 6240. The deep generator network 6240 may perform supervised learning, unsupervised learning, reinforcement learning, or a combination thereof. The deep generator network 6240 may identify regimes and generate tuning parameters to tune the controller 6130. The deep generator network 6240 may provide feedback to the DOE test matrix 6110, the controller simulator 6230, or both. The feedback may include data from any actuator of the host vehicle 6010, for example a steering actuator, a brake actuator, a throttle actuator, or a combination thereof. The feedback may be used to update the DOE test matrix 6110, the controller simulator 6230, or both. The tuning parameters are transmitted via a deep discriminator network 6250 to the controller 6130. The DOE test matrix 6110, the controller simulator 6230, and the deep generator network 6240 may be included on a single processor, individual processors, the controller 6130, or any combination thereof.
Although some embodiments herein refer to methods, it will be appreciated by one skilled in the art that they may also be embodied as a system or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “processor,” “device,” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable mediums having computer readable program code embodied thereon. Any combination of one or more computer readable mediums may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to CDs, DVDs, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions.
These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures.
While the disclosure has been described in connection with certain embodiments, it is to be understood that the disclosure is not to be limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications, combinations, and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.