The subject matter described herein relates in general to the physical design of vehicles and, more specifically, to systems and methods for estimating vehicle physical-design parameters from image data.
An important aspect of vehicle manufacturing is a vehicle's physical design (shape, structure, etc.). Modern vehicle design includes the use of sophisticated computer-aided-design (CAD) tools, and physics-based simulators permit vehicle designers to predict a vehicle's physical-design parameters such as drag coefficient from a computerized model before the vehicle is manufactured.
More recently, vehicle designers have begun to use artificial-intelligence (AI) tools to estimate vehicle physical-design parameters. For example, the drag coefficient of a vehicle can be estimated using a machine-learning-based model such as a neural network. The current technology for performing drag estimation via neural networks involves training a network end to end: the trained network takes, as input, a computerized representation of an object such as an automobile or airplane and produces, as output, an estimated drag coefficient. Training such a network in a supervised manner requires large amounts of data, making the process both time-consuming and costly.
Embodiments of a system for estimating vehicle physical-design parameters from image data are presented herein. In one embodiment, the system comprises a processor and a memory storing machine-readable instructions that, when executed by the processor, cause the processor to receive one or more images representing a physical design of a vehicle. The memory also stores machine-readable instructions that, when executed by the processor, cause the processor to process the one or more images using a machine-learning-based model that includes a pre-trained feature extractor whose output layer has been replaced with a regression layer. The regression layer, after the regression layer has replaced the output layer, is trained to output an estimate of a physical-design parameter of the vehicle whose physical design is represented by the one or more images. The physical design of the vehicle is modified based, at least in part, on the estimate of the physical-design parameter.
Another embodiment is a non-transitory computer-readable medium for estimating vehicle physical-design parameters from image data and storing instructions that, when executed by a processor, cause the processor to receive one or more images representing a physical design of a vehicle. The instructions also cause the processor to process the one or more images using a machine-learning-based model that includes a pre-trained feature extractor whose output layer has been replaced with a regression layer. The regression layer, after the regression layer has replaced the output layer, is trained to output an estimate of a physical-design parameter of the vehicle whose physical design is represented by the one or more images. The physical design of the vehicle is modified based, at least in part, on the estimate of the physical-design parameter.
Another embodiment is a method of estimating vehicle physical-design parameters from image data, the method comprising receiving one or more images representing a physical design of a vehicle. The method also includes processing the one or more images using a machine-learning-based model that includes a pre-trained feature extractor whose output layer has been replaced with a regression layer. The regression layer, after the regression layer has replaced the output layer, is trained to output an estimate of a physical-design parameter of the vehicle whose physical design is represented by the one or more images. The physical design of the vehicle is modified based, at least in part, on the estimate of the physical-design parameter.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate various systems, methods, and other embodiments of the disclosure. It will be appreciated that the illustrated element boundaries (e.g., boxes, groups of boxes, or other shapes) in the figures represent one embodiment of the boundaries. In some embodiments, one element may be designed as multiple elements or multiple elements may be designed as one element. In some embodiments, an element shown as an internal component of another element may be implemented as an external component and vice versa. Furthermore, elements may not be drawn to scale.
To facilitate understanding, identical reference numerals have been used, wherever possible, to designate identical elements that are common to the figures. Additionally, elements of one or more embodiments may be advantageously adapted for utilization in other embodiments described herein.
Various embodiments of systems and methods for estimating vehicle physical-design parameters from image data described herein overcome the disadvantages of current machine-learning-based approaches. In the various embodiments, a vehicle physical-design parameter estimation system (hereinafter often referred to as simply a “parameter estimation system”) leverages a pre-trained (already trained) feature extractor to estimate vehicle physical-design parameters (hereinafter sometimes referred to as simply “parameters”) rather than training an end-to-end neural network. In some embodiments, the pre-trained feature extractor is a pre-trained object-classification neural network. The pre-trained feature extractor extracts, from one or more input images representing the physical design of a vehicle, a large number of intrinsic features. In the various embodiments, the final (output) layer of the pre-trained feature extractor is deleted and replaced with a regression layer that is trained to estimate a vehicle physical-design parameter such as drag coefficient based on the features extracted by the preceding layers of the pre-trained feature extractor. The regression layer can be trained using a relatively small amount of data due to the small number of trainable parameters (weights, etc.) involved.
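By way of illustration only, the head-replacement step described above might resemble the following sketch. The sketch assumes a PyTorch/torchvision ResNet-50 image classifier as the pre-trained feature extractor; the choice of backbone and the single-output regression head are illustrative assumptions, not requirements of the embodiments.

```python
import torch.nn as nn
from torchvision import models

# Load a pre-trained object-classification network to serve as the
# feature extractor (ResNet-50 is an illustrative choice).
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)

# Replace the final (output) classification layer with a regression layer
# that maps the extracted features to a single physical-design parameter
# (e.g., drag coefficient).
num_features = backbone.fc.in_features  # size of the extracted feature vector
backbone.fc = nn.Linear(num_features, 1)

model = backbone  # pre-trained extractor with a new regression head
```

In this sketch, only the newly added `nn.Linear` layer contains weights that have not already been learned, which is what keeps the subsequent training burden small.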
Another significant advantage of the various embodiments described herein is that the parameter estimation system is robust against distributional shift. That is, the parameter estimation system generalizes well to out-of-distribution input image data because the pre-trained feature extractor has been trained with such a large volume and wide variety of image data.
The techniques disclosed herein can be applied to the estimation of a variety of different kinds of vehicle physical-design parameters. Examples of such parameters include, without limitation, drag coefficient (aerodynamic drag coefficient), structural strength, properties pertaining to an impact or collision with the vehicle, manufacturability (the ease with which the vehicle can be manufactured), assemblability (the ease with which the vehicle can be assembled from its constituent parts), and materials-efficiency (a measure of how much material is wasted during manufacturing).
The physical design of the vehicle can be modified based, at least in part, on the estimate of the physical-design parameter. In some embodiments, the same system that estimates the physical-design parameter modifies the physical design. In other embodiments, a component of a vehicle physical-design process modifies the physical design. In some embodiments, the parameter estimation system is part of a real-time interactive vehicle-design workflow in which a human or artificial-intelligence (AI) designer modifies the physical design of a vehicle and receives immediate feedback from the machine-learning-based model regarding how a physical-design parameter of interest changes as a result of the modifications of the physical design. This supports rapid, iterative refinement of the physical design. Conventional physics-based simulation systems require significantly more time to provide such feedback.
The format of the physical design 110 differs, depending on the embodiment. In some embodiments, the physical design 110 is a three-dimensional (3D) model produced by a computer-aided design system or a generative AI system. In other embodiments, the physical design 110 can be, for example, computerized blueprints, two-dimensional (2D) perspective views, or 2D orthographic views.
As shown in
As discussed above, parameter estimation system 130 includes a machine-learning-based model that estimates a vehicle physical-design parameter by processing the image data 120. More specifically, parameter estimation system 130 includes a pre-trained feature extractor whose last (output) layer has been removed and replaced with a new regression layer. The regression layer is trained to estimate the vehicle physical-design parameter of interest based on the features extracted by the preceding layers of the pre-trained feature extractor. As mentioned above, parameter estimation system 130 leverages the ability of the pre-trained feature extractor to extract a large number of features from the image data 120 due to its having been trained on a large volume and variety of image data. By comparison, the regression layer requires a relatively small amount of training data to learn how to estimate the vehicle physical-design parameter of interest. As those skilled in the art are aware, a pre-trained feature extractor can require millions of dollars and years of compute time to produce. Some open-source pre-trained feature extractors are available to the public. As discussed above, in some embodiments, the pre-trained feature extractor is a pre-trained object-classification neural network.
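As a purely illustrative sketch of the comparatively small training burden described above, the following code freezes the pre-trained layers of the `model` from the earlier sketch and fits only the new regression layer against a small set of labeled images. The loss function, optimizer, epoch count, and `train_loader` (a hypothetical data loader yielding image batches paired with measured parameter values, such as drag coefficients from simulation) are assumptions, not requirements.

```python
import torch
import torch.nn as nn

# Freeze every pre-trained layer so that only the regression layer trains.
for param in model.parameters():
    param.requires_grad = False
for param in model.fc.parameters():
    param.requires_grad = True

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

model.train()
for epoch in range(20):
    for images, targets in train_loader:  # hypothetical labeled data
        optimizer.zero_grad()
        preds = model(images).squeeze(1)  # one estimate per image
        loss = loss_fn(preds, targets)
        loss.backward()
        optimizer.step()
```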
As indicated in
As also shown in
In the drag-coefficient embodiment of
As discussed above, the techniques disclosed herein can be applied to the estimation of a variety of other vehicle physical-design parameters besides drag coefficient, such as, without limitation, structural strength, properties pertaining to an impact or collision with the vehicle, manufacturability, assemblability, and materials-efficiency.
As shown in
As depicted in
Input module 315 generally includes instructions that, when executed by the one or more processors 305, cause the one or more processors 305 to receive one or more images 120 representing a physical design 110 of a vehicle. As explained above, in some embodiments, the one or more images 120 are received from a vehicle physical-design process 335. The format of the one or more images 120 can differ, depending on the embodiment (i.e., depending on the nature of the vehicle physical-design parameter to be estimated). Regardless of the specific format, the one or more images 120 represent the physical design 110 of a vehicle.
Estimation module 320 generally includes instructions that, when executed by the one or more processors 305, cause the one or more processors 305 to process the one or more images 120 using a machine-learning-based model that includes a pre-trained feature extractor 230 whose output layer has been replaced with a regression layer 240. As discussed above, after the output layer of pre-trained feature extractor 230 has been replaced with regression layer 240, the regression layer 240 is trained to output an estimate 140 of a physical-design parameter of the vehicle whose physical design 110 is represented by the one or more images 120. In the embodiment of
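For illustration only, the inference path of estimation module 320 might resemble the following sketch, in which one or more rendered views of the design are preprocessed, passed through the head-replaced network, and averaged into a single parameter estimate. The ImageNet-style preprocessing and the view-averaging strategy are assumptions; the required preprocessing depends on the particular pre-trained extractor.

```python
import torch
from torchvision import transforms
from PIL import Image

# Standard ImageNet-style preprocessing (an assumption for this sketch).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def estimate_parameter(model, image_paths):
    """Return a single parameter estimate from one or more rendered views."""
    model.eval()
    views = torch.stack([preprocess(Image.open(p).convert("RGB"))
                         for p in image_paths])
    with torch.no_grad():
        per_view = model(views).squeeze(1)  # one estimate per view
    return per_view.mean().item()           # aggregate across views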
As discussed above, in some embodiments, the physical design 110 of the vehicle is modified based, at least in part, on the estimate 140 of the physical-design parameter output by the parameter estimation system 130. In some embodiments, update module 323 includes instructions that, when executed by the one or more processors 305, cause the one or more processors 305 to modify the physical design 110 based, at least in part, on the estimate 140 of the physical-design parameter. For example, update module 323 might modify the physical design 110 in a way that improves the physical-design parameter estimated by parameter estimation system 130. This can be confirmed through an updated estimate 140 of the relevant parameter based on the modified physical design 110. In other embodiments, a component of vehicle physical-design process 335 modifies physical design 110 based, at least in part, on the estimate 140 of the physical-design parameter. In some embodiments, parameter estimation system 130 is part of a real-time interactive vehicle physical-design process 335. In such an embodiment, a human or AI designer might modify the physical design 110 of the vehicle and receive immediate feedback from the machine-learning-based model regarding how a physical-design parameter of interest (e.g., drag coefficient) changes as a result of the modifications of the physical design 110. This facilitates an iterative process of refining the physical design 110.
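The iterative refinement described above can be summarized in a short, purely illustrative loop that reuses the `estimate_parameter` sketch shown earlier. The functions `load_initial_design`, `propose_modification`, and `render_views` are hypothetical placeholders for whatever design tool and rendering step a given workflow actually uses.

```python
# Hypothetical iterative refinement loop: modify the design, re-render it,
# and query the surrogate model for an updated parameter estimate.
design = load_initial_design()                 # placeholder for physical design 110
best_estimate = float("inf")

for step in range(50):
    candidate = propose_modification(design)       # human or AI designer
    views = render_views(candidate)                # images 120 of the candidate design
    estimate = estimate_parameter(model, views)    # updated estimate 140 (e.g., drag)
    if estimate < best_estimate:                   # keep the change if the estimate improves
        design, best_estimate = candidate, estimate
```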
At block 410, input module 315 receives one or more images 120 representing a physical design 110 of a vehicle. As explained above, in some embodiments, the one or more images 120 are received from a vehicle physical-design process 335. The format of the one or more images 120 can differ, depending on the embodiment (i.e., depending on the nature of the vehicle physical-design parameter to be estimated). Regardless of the specific format, the one or more images 120 represent the physical design 110 of a vehicle.
At block 420, estimation module 320 processes the one or more images 120 using a machine-learning-based model that includes a pre-trained feature extractor 230 whose output layer has been replaced with a regression layer 240. As discussed above, after the output layer of pre-trained feature extractor 230 has been replaced with regression layer 240, the regression layer 240 is trained to output an estimate 140 of a physical-design parameter of the vehicle whose physical design 110 is represented by the one or more images 120. As discussed above, in some embodiments, the pre-trained feature extractor 230 is a pre-trained object-classification neural network. As discussed in connection with the drag-coefficient-estimation embodiment of
At block 430, the physical design 110 of the vehicle is modified based, at least in part, on the estimate 140 of the physical-design parameter. As discussed above, in some embodiments, update module 323 in parameter estimation system 130 modifies the physical design 110. In other embodiments, a component of vehicle physical-design process 335 modifies the physical design 110. In some embodiments, parameter estimation system 130 is part of a real-time interactive vehicle physical-design process 335 in which a human or AI designer modifies the physical design 110 of the vehicle and receives immediate feedback from the machine-learning-based model regarding how a physical-design parameter of interest (e.g., drag coefficient) changes as a result of the modifications of the physical design 110. This supports rapid, iterative refinement of the physical design 110.
As discussed above, the techniques disclosed herein can be applied to the estimation of a variety of other vehicle physical-design parameters besides drag coefficient, such as, without limitation, structural strength, properties pertaining to an impact or collision with the vehicle, manufacturability, assemblability, and materials-efficiency.
Detailed embodiments are disclosed herein. However, it is to be understood that the disclosed embodiments are intended only as examples. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the aspects herein in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description of possible implementations. Various embodiments are shown in
The components described above can be realized in hardware or a combination of hardware and software and can be realized in a centralized fashion in one processing system or in a distributed fashion where different elements are spread across several interconnected processing systems. A typical combination of hardware and software can be a processing system with computer-usable program code that, when loaded and executed, controls the processing system such that it carries out the methods described herein. The systems, components, and/or processes also can be embedded in a computer-readable storage medium, such as a computer program product or other data-program storage device, readable by a machine and tangibly embodying a program of instructions executable by the machine to perform the methods and processes described herein. These elements also can be embedded in an application product that comprises all the features enabling the implementation of the methods described herein and that, when loaded in a processing system, is able to carry out these methods.
Furthermore, arrangements described herein may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied, e.g., stored, thereon. Any combination of one or more computer-readable media may be utilized. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. The phrase “computer-readable storage medium” means a non-transitory storage medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: a portable computer diskette, a hard disk drive (HDD), a solid-state drive (SSD), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present arrangements may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java™, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Generally, “module,” as used herein, includes routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular data types. In further aspects, a memory generally stores the noted modules. The memory associated with a module may be a buffer or cache embedded within a processor, a RAM, a ROM, a flash memory, or another suitable electronic storage medium. In still further aspects, a module as envisioned by the present disclosure is implemented as an application-specific integrated circuit (ASIC), a hardware component of a system on a chip (SoC), as a programmable logic array (PLA), or as another suitable hardware component that is embedded with a defined configuration set (e.g., instructions) for performing the disclosed functions.
The terms “a” and “an,” as used herein, are defined as one or more than one. The term “plurality,” as used herein, is defined as two or more than two. The term “another,” as used herein, is defined as at least a second or more. The terms “including” and/or “having,” as used herein, are defined as comprising (i.e., open language). The phrase “at least one of . . . and . . . ,” as used herein, refers to and encompasses any and all possible combinations of one or more of the associated listed items. As an example, the phrase “at least one of A, B, and C” includes A only, B only, C only, or any combination thereof (e.g., AB, AC, BC, or ABC).
As used herein, “cause” or “causing” means to make, command, instruct, and/or enable an event or action to occur or at least be in a state where such event or action may occur, either in a direct or indirect manner.
Aspects herein can be embodied in other forms without departing from the spirit or essential attributes thereof. Accordingly, reference should be made to the following claims rather than to the foregoing specification, as indicating the scope hereof.
This application claims the benefit of U.S. Provisional Patent Application No. 63/465,613, “Surrogate Modeling of Car Drag Coefficient with Depth and Normal Renderings,” filed on May 11, 2023, and claims the benefit of U.S. Provisional Patent Application No. 63/471,389, “Systems and Methods for Drag Estimation,” filed on Jun. 6, 2023, both of which are incorporated by reference herein in their entirety.
Number | Date | Country
---|---|---
63465613 | May 2023 | US
63471389 | Jun 2023 | US