Modern computer systems can be used to create, use, and manage data for products and other items. Computer-aided technology (CAx) systems, for instance, may be used to aid in the design, analysis, simulation, or manufacture of products. Examples of CAx systems include computer-aided design (CAD) systems, computer-aided engineering (CAE) systems, visualization and computer-aided manufacturing (CAM) systems, product data management (PDM) systems, product lifecycle management (PLM) systems, and more. These CAx systems may include components (e.g., CAx applications) that facilitate the design and simulated testing of product structures and product manufacturing processes.
Certain examples are described in the following detailed description and in reference to the drawings.
Modern technological advances have given rise to the design, simulation, and manufacture of increasingly complex products of various types across many different industries. CAx systems may support the design and tracking of product series that can include CAD parts numbering in the hundreds of thousands, millions, tens of millions, and possibly more. As used herein, a CAD part may refer to any discrete object that can be represented digitally (e.g., as CAD data), and CAD parts may be combined to form products or components thereof. Along with parts, modern CAx systems can support the design and management of CAD assemblies, which may refer to any representation of how CAD parts combine to form a particular product or product component. As such, CAD assemblies may represent different products (e.g., a given CAD assembly for a hard drive comprised of multiple components represented as CAD parts, such as an enclosure, bolts, optical storage components, etc.). CAD assemblies may also represent different sub-systems or components of a given product (e.g., electrical and cooling subsystems of a server rack), different variants of a given product (e.g., a base vehicle model, a mid-range vehicle model, a top-end vehicle model), and the like.
Assembly creation in modern CAx contexts may involve the design and positioning of anywhere from hundreds of individual CAD parts in relatively small assemblies to hundreds of thousands of components (or more) in relatively large assemblies. Moreover, the various parts that form CAD assemblies may be interconnected in various ways, with a common example being nuts, bolts, shafts, or screws that fit into other CAD parts at holes of specific diameters and geometry. Such CAD parts may be interrelated according to a particular physical position and certain orientation at which CAD parts interconnect in a given CAD assembly. In particular, CAD assemblies may support constraints that limit the movement of the CAD parts that form the CAD assemblies. As used herein, a constraint may refer to any specified limitation on a degree of movement for a CAD part of a CAD assembly. CAD parts in a CAD model may have multiple degrees of freedom in terms of movement, for example translations along the ‘x’, ‘y’, or ‘z’ dimensional axes of the 3D system in which the CAD part is designed and/or rotations about the ‘x’, ‘y’, or ‘z’ axes.
Constraints may be specified for a given CAD part or between multiple CAD parts to limit (e.g., restrict or prevent) movement along at least one of the degrees of movement for or between CAD parts. As such, constraints in CAD assemblies may limit how a CAD part can or cannot move relative to another CAD part. Such constraints in CAD assemblies may reflect physical restrictions of movement upon construction of a physical product represented by such CAD assemblies. As an illustrative example, constraints set for plate and bolt parts of a CAD assembly may restrict the bolt part from drifting sideways through the plate (e.g., along an ‘x’ or ‘y’ direction) or prevent the bolt from falling through the hole in the plate (e.g., along a ‘z’ direction) at which the bolt is affixed to the plate in the CAD assembly. While such an example may seem simple, setting appropriate constraints for CAD assemblies comprised of hundreds of thousands of CAD parts or more can be a challenge.
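For purposes of illustration only, the brief sketch below models a constraint as data: a pair of hypothetical part identifiers together with the set of translational degrees of movement that the constraint restricts, mirroring the plate-and-bolt example above. The class name, field names, and degree-of-freedom labels are assumptions and do not reflect any particular CAD system's data model.

```python
from dataclasses import dataclass, field

# Hypothetical degree-of-freedom labels: translations and rotations about x, y, z.
DOF = ("tx", "ty", "tz", "rx", "ry", "rz")

@dataclass(frozen=True)
class Constraint:
    """Illustrative constraint between two CAD parts (names are assumptions)."""
    part_a: str
    part_b: str
    restricted: frozenset = field(default_factory=frozenset)  # subset of DOF

# Bolt held in a plate hole: sideways drift (tx, ty) and fall-through (tz) are restricted.
bolt_in_plate = Constraint("bolt_01", "plate_01", frozenset({"tx", "ty", "tz"}))
print(sorted(bolt_in_plate.restricted))
```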
Determining and specifying constraints in modern CAD contexts can be time-consuming, resource-consuming, and tedious. For example, mating of CAD parts in CAD assemblies may be performed through manual labelling of complementary features on each CAD part, but such processes require manual design work and can be error-prone. User-specified constraints may be another way to manually specify movement restrictions between interconnected CAD parts. However, a given CAD assembly may be constrained in a near-limitless number of ways, many of which achieve the same degree of movement limitation but with differing constraint settings based on user preference. Such manual processes may be time-consuming, error-prone, inconsistent, and non-optimal.
The disclosure herein may provide systems, methods, devices, and logic for ML-based generation of constraints for CAD assemblies. As described in greater detail herein, various features are presented to support generation of constraints for CAD assemblies through machine-learning. The ML-based constraint generation features of the present disclosure may support the automatic detection, application, and validation of constraints for CAD parts of a CAD assembly, and may support constraint generation without actual placement of CAD parts into a CAD assembly. Moreover, training data for such ML-based solutions need not include explicitly-constrained CAD models, as the ML-based constraint generation features presented herein may support inference of constraints from existing designs. Through the present disclosure, constraint and CAD part determination, specification, and positioning in CAD assemblies can be made simpler and faster through proposal of likely CAD part locations in a CAD assembly, which can reduce the burden on CAD users to navigate the 3D space of a CAD assembly. The presented features may also enable detection of complementary CAD parts in a CAD assembly even when the final arrangement of the CAD assembly is not yet known, thus improving the efficiency of CAD processes.
These and other ML-based constraint generation features and technical benefits are described in greater detail herein.
As an example implementation to support any combination of the ML-based constraint generation features described herein, the computing system 100 shown in
In the example shown in
In operation, the constraint learning engine 110 may access a CAD assembly 130 that includes multiple CAD parts and generate a representation graph of the CAD assembly 130. Nodes in the generated representation graph may represent geometric faces of the multiple CAD parts, and edges in the representation graph may represent geometric edges of the multiple CAD parts. The constraint learning engine 110 may further determine constraints in the CAD assembly 130 (which may limit a degree of movement between geometric faces of different CAD parts in the CAD assembly 130) and insert constraint edges into the representation graph that represent the determined constraints. Then, the constraint learning engine 110 may provide the representation graph as training data to train the ML model 120. In operation, the constraint generation engine 112 may generate constraints for a different CAD assembly by applying the ML model 120 for the different CAD assembly.
These and other ML-based constraint generation features are described in greater detail herein. Training of the ML model 120 to support ML-based constraint generation is described next via the constraint learning engine 110 with reference to
The constraint learning engine 110 may access the CAD assembly 130 in various ways. For instance, a user may input the CAD assembly 130, specifying the CAD assembly 130 as a completed CAD model to train the ML model 120. The constraint learning engine 110 may load the CAD assembly 130 from a CAD model repository or CAD database that stores completed product designs. The CAD assembly 130 may be represented in any 3D form, such as a boundary representation (BREP), and the multiple CAD parts that form the CAD assembly 130 may also be represented as BREPs comprised of geometric faces, edges, or other 3D geometric elements and primitives. Each CAD part may be a separate design instance or object, e.g., as designed or implemented through a CAD application. In the example shown in
To facilitate machine-learning, the constraint learning engine 110 may generate a representation graph from CAD assemblies to provide as training data for the ML model 120. In some sense, the constraint learning engine 110 may transform the CAD assembly 130 from a 3D geometric representation into a data format interpretable by the ML model 120 to learn or predict constraints in CAD assemblies. In particular, representation graphs generated from CAD assemblies may provide a graph-based representation of 3D structures, and the constraint learning engine 110 may decompose a 3D geometry into nodes and edges in generating a representation graph. Nodes in a generated representation graph may represent geometric faces of a CAD assembly and edges in the representation graph may represent the geometric edges (also referred to as topological edges) at which geometric faces of the CAD assembly meet or intersect.
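As a minimal sketch of this decomposition, assuming a generic BREP-like input in which each geometric face knows its owning CAD part and each geometric edge knows the two faces that meet at it, the snippet below builds such a graph with the `networkx` library. The `Face` and `GeomEdge` types and the attribute names are illustrative assumptions rather than the engine's actual data model.

```python
from dataclasses import dataclass
import networkx as nx

@dataclass(frozen=True)
class Face:
    face_id: str     # unique face identifier
    part_id: str     # CAD part the face belongs to

@dataclass(frozen=True)
class GeomEdge:
    face_a: str      # ids of the two faces that meet at this geometric edge
    face_b: str

def build_representation_graph(faces, geom_edges):
    """Nodes represent geometric faces; edges represent geometric (topological) edges."""
    graph = nx.Graph()
    for face in faces:
        graph.add_node(face.face_id, part_id=face.part_id)
    for ge in geom_edges:
        graph.add_edge(ge.face_a, ge.face_b, kind="geometric")
    return graph

# Two faces of a plate meeting at one geometric edge.
faces = [Face("plate.top", "plate_01"), Face("plate.side", "plate_01")]
edges = [GeomEdge("plate.top", "plate.side")]
g = build_representation_graph(faces, edges)
print(g.nodes(data=True), g.edges(data=True))
```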
In
The constraint learning engine 110 may iterate through each CAD part of the CAD assembly 130 until the CAD assembly 130 is entirely represented in the representation graph 210. Different graph portions of the representation graph 210 may represent different CAD parts of the CAD assembly 130. A graph portion may refer to a collection of nodes and edges in the representation graph 210, and an example graph portion is illustrated in the
Note that a representation graph generated by the constraint learning engine 110 may be disjointed in that some graph portions may be disconnected from other graph portions of the representation graph (e.g., no path of nodes and edges exists in the representation graph to connect the disconnected graph portions). This may be the case when a given CAD part or set of CAD parts of the CAD assembly 130 does not share any geometric edges with the rest of the CAD assembly 130. In some implementations, the constraint learning engine 110 may generate a representation graph to represent multiple different CAD assemblies. In such implementations, the generated representation graph may be disjointed since different CAD assemblies may represent separate, unrelated structures that do not share any common geometric edges.
In the example shown in
The constraint learning engine 110 may represent any number of parameters of the CAD assembly 130 in a generated representation graph, including geometric and topological characteristics of the CAD assembly 130 or CAD parts of the CAD assembly 130. In some implementations, the constraint learning engine 110 may track characteristics of geometric faces of a CAD assembly as values encoded in, assigned to, associated with, or appended to nodes of a representation graph. As one way of doing so, the constraint learning engine 110 may insert feature vectors into the nodes in the representation graph 210, and a given feature vector may represent topological and geometric features of a corresponding geometric face of the CAD assembly 130.
A feature vector may refer to any data form by which features (e.g., parameters or attributes) of a geometric face of a CAD assembly are captured, extracted, or represented. Example features that the constraint learning engine 110 may extract and represent in a feature vector of a given geometric face include a primitive type of the given geometric face, a CAD part identifier that the given geometric face is a part of, a number of other geometric faces adjacent to the given geometric face, an area of the given geometric face, a perimeter of the given geometric face, a primitive radius for the given geometric face, a normal vector of the given geometric face, a rotation or scale invariant descriptor of the given geometric face, and more, or any combinations thereof. While some example features are provided herein, any descriptive data of the CAD assembly 130 can be extracted and encoded into the representation graph 210 via feature vectors.
In some examples, the constraint learning engine 110 may extract a feature vector for a given geometric face as an n-dimensional array of parameter values, as a concatenation of parameter values, or in various other structures or formats. Insertion of a feature vector into the representation graph 210 may include any form by which the constraint learning engine 110 associates, links, or otherwise correlates a feature vector extracted for a particular geometric face to a corresponding node in the representation graph 210 that represents the particular geometric face. In the example of
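The snippet below sketches one possible way such a feature vector might be assembled as a fixed-length NumPy array from a handful of the example features listed above; the primitive-type encoding, the chosen subset of features, and the function name are assumptions made purely for illustration.

```python
import numpy as np

PRIMITIVE_TYPES = {"plane": 0, "cylinder": 1, "cone": 2, "sphere": 3, "torus": 4}

def face_feature_vector(primitive_type, num_adjacent_faces, area, perimeter,
                        primitive_radius, normal):
    """Concatenate a few illustrative per-face attributes into one vector."""
    normal = np.asarray(normal, dtype=float)
    normal = normal / (np.linalg.norm(normal) or 1.0)  # store as a unit normal
    return np.concatenate([
        [PRIMITIVE_TYPES.get(primitive_type, -1)],
        [num_adjacent_faces, area, perimeter, primitive_radius],
        normal,
    ])

# Planar face of a plate with four neighbouring faces and a unit normal along +z.
x = face_feature_vector("plane", 4, area=120.0, perimeter=46.0,
                        primitive_radius=0.0, normal=(0, 0, 1))
print(x.shape, x)
```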
In any of the ways described herein, the constraint learning engine 110 may generate a representation graph that represents one or multiple CAD assemblies. The representation graph may facilitate machine-learning in that it can encode geometric and topological features of CAD assemblies on a per-face basis. The constraint learning engine 110 may further generate the representation graph 210 to include, track, or represent constraints in the CAD assembly 130 through constraint edges, as described in greater detail next with reference to
In support of constraint edge insertions into the representation graph 210, the constraint learning engine 110 may determine constraints in the CAD assembly 130 in various ways. As one example, the constraint learning engine 110 may determine constraints in the CAD assembly 130 by identifying user-specified constraints in the CAD assembly 130. As the CAD assembly 130 may be a previously generated or completed CAD model, various constraints may be expressly set in the CAD assembly 130 by engineers or designers during the design process. Put another way, the CAD assembly 130 may be a constrained CAD model. As such, the constraint learning engine 110 may identify any user-specified constraints expressly set in the CAD assembly 130 and insert constraint edges into the representation graph 210 accordingly.
In some instances, the CAD assembly 130 may encode user-specified constraints between specific geometric faces of different CAD parts. In such instances, the constraint learning engine 110 may identify the corresponding nodes of the specific geometric faces, and insert a constraint edge between the corresponding nodes to represent the user-specified constraint. In other examples, the CAD assembly 130 may encode user-specified constraints between different CAD parts of the CAD assembly 130 (e.g., between a bolt part and a plate part that the bolt part is inserted into). In this case, the constraint learning engine 110 may determine a selected geometric face for each of the different CAD parts and insert a constraint edge between the nodes in the representation graph 210 that correspond to the selected geometric faces. Selection of geometric faces by the constraint learning engine 110 may be performed through random selection, least distance between the selected geometric faces, greatest area, normal vector comparisons, default faces specified for given CAD parts, or according to any number of additional or alternative selection processes or criteria.
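For the part-level case, the following hedged sketch selects one representative geometric face per constrained CAD part using the least-distance criterion mentioned above (comparing stored face centroids) and inserts a labelled constraint edge between the corresponding nodes; the centroid attribute and the helper name are assumptions for illustration.

```python
import numpy as np
import networkx as nx

def insert_part_level_constraint(graph, part_a, part_b):
    """Pick the closest pair of faces (by centroid distance) between two
    constrained parts and connect their nodes with a constraint edge."""
    faces_a = [n for n, d in graph.nodes(data=True) if d["part_id"] == part_a]
    faces_b = [n for n, d in graph.nodes(data=True) if d["part_id"] == part_b]
    best = min(
        ((fa, fb) for fa in faces_a for fb in faces_b),
        key=lambda pair: np.linalg.norm(
            np.asarray(graph.nodes[pair[0]]["centroid"])
            - np.asarray(graph.nodes[pair[1]]["centroid"])),
    )
    graph.add_edge(*best, kind="constraint")
    return best

g = nx.Graph()
g.add_node("bolt.shaft", part_id="bolt_01", centroid=(0.0, 0.0, 1.0))
g.add_node("plate.hole", part_id="plate_01", centroid=(0.0, 0.0, 0.9))
g.add_node("plate.top", part_id="plate_01", centroid=(5.0, 5.0, 0.0))
print(insert_part_level_constraint(g, "bolt_01", "plate_01"))
```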
As another example of constraint determinations, the constraint learning engine 110 may infer any number of constraints from a geometry of the CAD assembly 130. In that regard, the constraint learning engine 110 may be capable of processing and determining constraints from the CAD assembly 130 even when the CAD assembly 130 is not expressly constrained or manually labeled with user-specified constraints. That is, the constraint learning engine 110 may support constraint determinations from the CAD assembly 130 even when no specific constraints are set or encoded in the CAD assembly 130, so long as the CAD parts that form the CAD assembly are positioned relative to one another. The constraint learning engine 110 may thus support automatic detection and application of constraints for CAD assemblies, which may increase the efficiency in training and application of ML-based techniques for constraint generations.
To infer constraints in the CAD assembly 130, the constraint learning engine 110 may determine formulaic representations of constraints, including different formulaic representations for constraints of different constraint types. As used herein, a formulaic representation may include any combination of functions, expressions, or formulas by which the constraint learning engine 110 may detect a constraint between geometric faces of a CAD assembly. The constraint learning engine 110 may then compare geometric faces of the CAD assembly 130 by applying the formulaic representations to determine constraints of the different constraint types between the geometric faces.
As illustrative examples, the constraint learning engine 110 may infer constraints of a “touch” constraint type and an “align” constraint type from the CAD assembly 130. For the touch constraint type, the constraint learning engine 110 may determine the formulaic representation as $n_i \cdot n_j = -1$ and $n_i \cdot r_{ij} = n_j \cdot r_{ij} = 0$, in which $n_i$ is the unit normal of geometric face $i$, $n_j$ is the unit normal of geometric face $j$, and $r_{ij} = c_j - c_i$ is the displacement vector between centroids $c_i$ and $c_j$ of geometric faces $i$ and $j$ respectively. For the align constraint type, the constraint learning engine 110 may determine the formulaic representation as $n_i \cdot n_j = 1$ and $n_i \cdot r_{ij} = n_j \cdot r_{ij} = 0$, with $n_i$, $n_j$, and $r_{ij}$ defined as above. In some implementations, the constraint learning engine 110 may include a further criterion in the formulaic representation of the align constraint type that the bounding boxes of geometric faces $i$ and $j$ intersect.
When the formulaic representations are satisfied (e.g., $n_i \cdot n_j = -1$ instead of another value, and $n_i \cdot r_{ij} = n_j \cdot r_{ij} = 0$ instead of non-zero values), then the constraint learning engine 110 may determine that the corresponding constraint exists between the geometric faces being compared via the formulaic representation. Determination of formulaic representations may be customized, user-configured, or pre-loaded as specific formulas, expressions, or logic to evaluate in order to determine, detect, or infer constraints between geometric faces.
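A numerical sketch of the touch and align checks is shown below, assuming per-face unit normals and centroids are available and using a small tolerance in place of exact equality (exact floating-point comparison would rarely be satisfied in practice). The tolerance value and helper names are assumptions, and the align test follows the coplanarity condition given above.

```python
import numpy as np

TOL = 1e-6  # assumed numerical tolerance for the equality tests

def is_touch(n_i, c_i, n_j, c_j, tol=TOL):
    """Touch: n_i . n_j = -1 and n_i . r_ij = n_j . r_ij = 0, with r_ij = c_j - c_i."""
    n_i, n_j = np.asarray(n_i, float), np.asarray(n_j, float)
    r_ij = np.asarray(c_j, float) - np.asarray(c_i, float)
    return (abs(np.dot(n_i, n_j) + 1.0) < tol
            and abs(np.dot(n_i, r_ij)) < tol
            and abs(np.dot(n_j, r_ij)) < tol)

def is_align(n_i, c_i, n_j, c_j, tol=TOL):
    """Align: n_i . n_j = +1 with the same coplanarity test on r_ij."""
    n_i, n_j = np.asarray(n_i, float), np.asarray(n_j, float)
    r_ij = np.asarray(c_j, float) - np.asarray(c_i, float)
    return (abs(np.dot(n_i, n_j) - 1.0) < tol
            and abs(np.dot(n_i, r_ij)) < tol
            and abs(np.dot(n_j, r_ij)) < tol)

# Coplanar faces with opposite normals touch; same-direction normals align.
print(is_touch((0, 0, 1), (0, 0, 0), (0, 0, -1), (3, 0, 0)))  # True
print(is_align((0, 0, 1), (0, 0, 0), (0, 0, 1), (3, 0, 0)))   # True
```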
The constraint learning engine 110 may infer constraints in the CAD assembly 130 by applying any number of formulaic representations to geometric faces of the CAD assembly 130. While a brute force comparison for every pair of geometric faces in the CAD assembly 130 using formulaic representations is possible, such an approach would be inefficient and require increased computational resources and time. In some implementations, the constraint learning engine 110 may selectively compare geometric faces of the CAD assembly 130 by selecting particular geometric faces to apply the formulaic representations to. As one example of selective comparison, the constraint learning engine 110 may determine not to compare geometric faces of the same CAD part. This may be the case as geometric faces of the same CAD part may be designed as components of the same rigid object instance. In that regard, geometric faces of the CAD part may be designed to have no degrees of freedom with respect to one another (and thus comparisons between geometric faces of the same CAD part would not make sense). The geometric faces of a bolt part, for example, are part of the same physical structure and are limited as such. Because no degrees of freedom exist between geometric faces of the same CAD part, modern CAD systems may be incapable of even setting constraints between such faces, and the constraint learning engine 110 need not compare geometric faces of the same CAD part.
As such, the constraint learning engine 110 may apply formulaic representations (of various constraint types) to geometric faces of different CAD parts, but not to geometric faces of the same CAD part. The constraint learning engine 110 may also selectively compare geometric faces based on distance. Distance-based selection of geometric faces for comparison and application of formulaic representations may leverage the fact that constraints typically apply between proximate CAD parts. As such, the constraint learning engine 110 may determine not to compare geometric faces of a CAD part to another CAD part that is relatively far away in the CAD assembly 130. To illustrate, for a CAD assembly representing an aircraft, it would not make sense to apply constraint inference computations between CAD parts in the aircraft wing and the engine, since such parts are disparate, physically distant, and unrelated. Distance may thus be used as a relationship measure for CAD assemblies.
To apply distance-based selective comparisons, the constraint learning engine 110 may iterate through geometric faces of a given CAD part and compare the geometric faces of the given CAD part to any geometric faces of other CAD parts that are within a threshold distance (e.g., Euclidean distance in a coordinate system in which the geometric faces are positioned). Threshold distances applied by the constraint learning engine 110 may be predetermined or user-configured. Thus, for a formulaic representation of a given constraint type (e.g., of the touch constraint type), the constraint learning engine 110 may compare selected geometric faces of the CAD assembly 130 by identifying a geometric face of a given CAD part of the CAD assembly 130, determining geometric faces of other CAD parts that are within a threshold distance from the identified geometric face of the given CAD part, and applying the formulaic representation of the given constraint type between the identified geometric face of the given CAD part and each of the geometric faces of the other CAD parts to determine the constraints of the given constraint type.
In such a manner, the constraint learning engine 110 may iterate through each geometric face of the given CAD part, and then eventually iterate through each geometric face in the CAD assembly 130. Application of the formulaic representations of any number of constraint types may indicate whether the compared geometric faces are constrained according to any of the constraint types. If so (e.g., $n_i \cdot n_j = -1$ and $n_i \cdot r_{ij} = n_j \cdot r_{ij} = 0$ for two selected geometric faces $i$ and $j$), the constraint learning engine 110 may infer a constraint for the selected geometric faces and insert a constraint edge into the representation graph 210 between the corresponding nodes of the selected geometric faces. If not (e.g., $n_i \cdot n_j \neq -1$ and/or $n_i \cdot r_{ij} \neq 0$ or $n_j \cdot r_{ij} \neq 0$ for two selected geometric faces $i$ and $j$), the constraint learning engine 110 may determine not to insert a constraint edge between the corresponding nodes.
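The sketch below ties the distance filter and the formulaic checks together: for each pair of faces belonging to different CAD parts whose centroids lie within an assumed threshold distance, the supplied checks (e.g., the touch and align helpers from the previous sketch) are applied, and a constraint edge is inserted when a check succeeds. The threshold value, the centroid-based distance measure, and the attribute names are assumptions.

```python
import numpy as np

def infer_constraint_edges(graph, checks, threshold=5.0):
    """Selectively compare faces of different CAD parts that lie within a
    threshold centroid distance and insert a constraint edge whenever one of
    the supplied formulaic checks succeeds for the pair."""
    nodes = list(graph.nodes(data=True))
    for idx, (face_i, data_i) in enumerate(nodes):
        for face_j, data_j in nodes[idx + 1:]:
            if data_i["part_id"] == data_j["part_id"]:
                continue  # never compare faces of the same rigid CAD part
            c_i = np.asarray(data_i["centroid"], float)
            c_j = np.asarray(data_j["centroid"], float)
            if np.linalg.norm(c_j - c_i) > threshold:
                continue  # skip faces of physically distant parts
            for ctype, check in checks:
                if check(data_i["normal"], c_i, data_j["normal"], c_j):
                    graph.add_edge(face_i, face_j, kind="constraint", ctype=ctype)
    return graph

# Example usage with the is_touch / is_align helpers from the previous sketch:
# infer_constraint_edges(g, checks=[("touch", is_touch), ("align", is_align)])
```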
In any combination of the ways described herein, the constraint learning engine 110 may determine constraints in the CAD assembly 130 and insert constraint edges into the representation graph 210 that represent the determined constraints.
Note that the constraint learning engine 110 may differentiate between different types of edges in the representation graph 210. In
Through generation of the representation graph 210 (with inserted constraint edges), the constraint learning engine 110 may generate training data for the ML model 120 to learn and predict constraints in CAD assemblies. Accordingly, the constraint learning engine 110 may provide the representation graph 210 (or portions thereof) as training data to train the ML model 120. The constraint learning engine 110 may train the ML model 120 with any number of representation graphs that represent any number of different CAD assemblies.
The ML model 120 may be trained using data encoded in the representation graph 210 of the CAD assembly 130, as well as any number of additional or alternative representation graphs or any number of other CAD assemblies. In doing so, the ML model 120 may process the data encoded in the representation graph(s) to learn constraint rules to apply for constraint determinations for CAD parts of different CAD assemblies.
In some implementations, the ML model 120 implements a random forest classifier which can learn constraint rules and predict whether a pair of geometric faces are “constrained” or “not constrained”, doing so based on feature vectors extracted for the geometric faces. In effect, the ML model 120 may implement a classifier function $f(x_i, x_j) \rightarrow y$, in which $x_i$ and $x_j$ may represent feature vectors of geometric faces $i$ and $j$ and $y$ may be a binary variable indicating whether geometric faces $i$ and $j$ of an input CAD assembly or of two input CAD parts are constrained or not. To ensure symmetry of the results, the constraint learning engine 110 may configure the ML model 120 such that an absolute value of the difference is used, e.g., $f(|x_i - x_j|) \rightarrow y$.
To implement such a classifier function, the ML model 120 may train using the assembly and constraint data encoded in representation graphs. In particular, the ML model 120 (or the constraint learning engine 110) may parse the representation graph(s) based on node pairs to extract training data in the form of pairs or tuples comprising $|x_i - x_j|$ (which may be computed from the feature vectors encoded for the corresponding nodes of geometric faces $i$ and $j$) and $y$ (which may be extracted from the representation graph as whether a constraint edge exists between the nodes corresponding to geometric faces $i$ and $j$). The ML model 120 may then learn a suitable transformation which may reduce or minimize the error in predicting $y$. In some instances, the ML model 120 may generate different classification functions for different constraint types, or the implemented classification function may generate separate outputs $y$ for different constraint types. As such, the ML model 120 may process training data and generate constraints for input CAD assemblies and CAD parts thereof.
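A compact sketch of this training step is shown below, assuming scikit-learn's `RandomForestClassifier` and representation-graph nodes annotated with per-face feature vectors and part identifiers; treating every cross-part node pair without a constraint edge as a negative sample is an illustrative assumption rather than the only possible sampling strategy.

```python
import numpy as np
from itertools import combinations
from sklearn.ensemble import RandomForestClassifier

def pairs_to_training_data(graph):
    """Extract (|x_i - x_j|, y) samples from a representation graph, where y
    indicates whether a constraint edge links the two face nodes."""
    X, y = [], []
    for face_i, face_j in combinations(graph.nodes, 2):
        data_i, data_j = graph.nodes[face_i], graph.nodes[face_j]
        if data_i["part_id"] == data_j["part_id"]:
            continue  # faces of the same part cannot be constrained to each other
        diff = np.abs(np.asarray(data_i["features"]) - np.asarray(data_j["features"]))
        edge = graph.get_edge_data(face_i, face_j) or {}
        X.append(diff)
        y.append(1 if edge.get("kind") == "constraint" else 0)
    return np.vstack(X), np.asarray(y)

def train_constraint_model(graph):
    """Fit a random forest on the pairwise samples extracted from the graph."""
    X, y = pairs_to_training_data(graph)  # assumes both classes are present
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X, y)
    return model
```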
The ML model 120 may be capable of learning constraint rules even with relatively little training data. Even when trained on a single, relatively simple CAD assembly, the ML model 120 may process the generated representation graph for the single CAD assembly and be able to generate constraints for different CAD assemblies of increased and varied complexity. As one experimental example, the constraint learning engine 110 may train the ML model 120 using a representation graph generated for a simple CAD assembly of a plate with eight (8) identical holes and eight (8) identical screws that fit into the plate holes. Trained on such a representation graph, the ML model 120 may be capable of generating constraints for input CAD assemblies of substantially greater variety in geometry, constraint types, and complexity. As such, the ML model 120 may learn broadly applicable constraint rules even from relatively little training data, which may increase the efficiency and applicability of the ML-based constraint generation features of the present application.
In
In some implementations, the constraint generation engine 112 may generate constraints for the CAD assembly 410 through identification or selection of two (or more) CAD parts of the CAD assembly 410. Such CAD part selection may be determined from user inputs or selections, whether as an express selection of CAD parts in the CAD assembly 410 for which to predict constraints or via selection, through an application UI, of CAD parts to include in the CAD assembly 410. For unconstrained models or other forms of the CAD assembly 410, the constraint generation engine 112 may determine the multiple CAD parts that comprise the CAD assembly 410 for constraint generations.
To apply the ML model 120, the constraint generation engine 112 may construct a representation graph for the CAD assembly 410 or the selected CAD parts thereof. The constraint generation engine 112 may do so in any of the ways described herein, though the constraint generation engine 112 need not determine or infer constraints from the CAD assembly 410. Moreover, constraint inference may not even be possible, as the CAD assembly 410 or CAD parts thereof may not yet be properly positioned in the CAD assembly 410 (the very positioning for which the ML model 120 can predict constraints and alignments). In generating the representation graph for the CAD assembly 410, the constraint generation engine 112 may extract feature vectors for the geometric faces of the CAD assembly 410 and encode the extracted feature vectors as part of the generated representation graph for the CAD assembly 410. In the example of
The ML model 120 may process the representation graph 420 to determine pairs of geometric faces of the CAD assembly 410 (which can be extracted and interpreted as pairs of nodes in the representation graph 420 for different CAD parts). Then, the ML model 120 may generate constraints 430 for the CAD assembly 410. In some examples, the ML model 120 may generate the constraints 430 in the form of constraint predictions for pairs of geometric faces, for example according to a learned classification function $f(x_i, x_j) \rightarrow y$. Note that the output $y$ provided by the ML model 120 may be a normalized probability value (e.g., between 0 and 1), indicative of a degree to which the ML model 120 predicts the corresponding pair of geometric faces of the CAD assembly 410 should be constrained.
In some implementations, the constraint generation engine 112 may generate a ranked list from the constraints 430, which a CAD application may provide for user feedback. Such user feedback may be used to set express constraints in the CAD assembly 410, position CAD parts based on the generated constraints 430 or ranked list, or otherwise modify, process, or design the CAD assembly 410 based on the generated constraints 430. The constraints 430 generated by the constraint generation engine 112 (via the ML model 120) may further be delineated based on constraint type, and constraints of different constraint types may be applied to the CAD assembly 410 accordingly. The constraint generation engine 112 may generate and apply the constraints 430 to the CAD assembly 410 in sequence for two CAD parts, in batches (e.g., for multiple CAD parts or for entire sections of the CAD assembly 410), or all at once for the entire CAD assembly 410. Also, multiple different types of constraints may be applied between CAD parts, which may limit multiple degrees of freedom between the CAD parts.
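Continuing the same assumed scikit-learn setup, the sketch below scores the cross-part face pairs of two selected CAD parts with `predict_proba` and returns them as a ranked list of constraint proposals; the helper name, the two-class probability indexing, and the `top_k` cutoff are assumptions for illustration.

```python
import numpy as np
from itertools import product

def rank_constraint_proposals(model, graph, part_a, part_b, top_k=10):
    """Score every cross-part face pair with the trained classifier and return
    the pairs ranked by predicted probability of being constrained."""
    faces_a = [n for n, d in graph.nodes(data=True) if d["part_id"] == part_a]
    faces_b = [n for n, d in graph.nodes(data=True) if d["part_id"] == part_b]
    pairs = list(product(faces_a, faces_b))
    X = np.vstack([
        np.abs(np.asarray(graph.nodes[i]["features"])
               - np.asarray(graph.nodes[j]["features"]))
        for i, j in pairs
    ])
    probs = model.predict_proba(X)[:, 1]  # probability of the "constrained" class
    ranked = sorted(zip(pairs, probs), key=lambda item: item[1], reverse=True)
    return ranked[:top_k]
```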
In some implementations, the constraint generation engine 112 may determine to apply a selected combination of the constraints 430 that together represent a viable positioning of the CAD parts of the CAD assembly 410. Such a selected combination may be determined based on any number of selection criteria. In some implementations, the constraint generation engine 112 may present different viable combinations of the constraints 430 for positioning the CAD parts. The constraint generation engine 112 may do so without applying any of the constraints 430 (or subsets thereof) to the CAD assembly 410, e.g., to support user selection and subsequent application of a selected combination of the constraints 430.
In any of the ways described herein, ML-based constraint generations may be implemented. While many ML-based constraint generation features have been described herein through illustrative examples presented through various figures, the constraint learning engine 110 and the constraint generation engine 112 may implement any combination of the ML-based constraint generation features described herein.
In implementing the logic 500, the constraint learning engine 110 may access a CAD assembly comprising multiple CAD parts (502) and generate a representation graph of the CAD assembly (504), doing so in any of the ways described herein. Nodes in the representation graph may represent geometric faces of the multiple CAD parts and edges in the representation graph may represent geometric edges of the multiple CAD parts. In implementing the logic 500, the constraint learning engine 110 may also determine constraints in the CAD assembly (506).
For example, the constraint learning engine 110 may determine constraints in the CAD assembly by identifying user-specified constraints in the CAD assembly. Additionally or alternatively, the constraint learning engine 110 may determine constraints in the CAD assembly by inferring the constraints from geometry of the CAD assembly, including by determining a formulaic representation of a given constraint type and comparing selected geometric faces of the CAD assembly using the formulaic representation to determine constraints of the given constraint type between the selected geometric faces. Comparing the selected geometric faces of the CAD assembly by the constraint learning engine 110 may include identifying a geometric face of a given CAD part of the CAD assembly, determining geometric faces of other CAD parts that are within a threshold distance from the identified geometric face of the given CAD part, and applying the formulaic representation of the given constraint type between the identified geometric face of the given CAD part and each of the geometric faces of the other CAD parts to determine the constraints of the given constraint type.
In implementing the logic 500, the constraint learning engine 110 may further insert constraint edges into the representation graph that represent the determined constraints (508), and provide the representation graph as training data to train a machine-learning model (510). In implementing the logic 500, the constraint generation engine 112 may generate constraints for a different CAD assembly by applying the machine-learning model for the different CAD assembly (512).
The logic 500 shown in
The computing system 600 may execute instructions stored on the machine-readable medium 620 through the processor 610. Executing the instructions (e.g., the constraint learning instructions 622 and/or the constraint generation instructions 624) may cause the computing system 600 to perform any of the ML-based constraint generation features described herein, including according to any of the features of the constraint learning engine 110, the constraint generation engine 112, the ML model 120, or any combinations thereof.
For example, execution of the constraint learning instructions 622 by the processor 610 may cause the computing system 600 to access a CAD assembly comprising multiple CAD parts and generate a representation graph of the CAD assembly. Nodes in the representation graph may represent geometric faces of the multiple CAD parts and edges in the representation graph may represent geometric edges of the multiple CAD parts. Execution of the constraint learning instructions 622 by the processor 610 may also cause the computing system 600 to determine constraints in the CAD assembly, wherein the constraints may limit a degree of movement between geometric faces of different CAD parts in the CAD assembly. Execution of the constraint learning instructions 622 by the processor 610 may further cause the computing system 600 to insert constraint edges into the representation graph that represent the determined constraints and provide the representation graph as training data to train a machine-learning model. Execution of the constraint generation instructions 624 by the processor 610 may cause the computing system 600 to generate constraints for a different CAD assembly by applying the machine-learning model for the different CAD assembly.
Any additional or alternative ML-based constraint generation features as described herein may be implemented via the constraint learning instructions 622, constraint generation instructions 624, or a combination of both.
The systems, methods, devices, and logic described above, including the constraint learning engine 110 and the constraint generation engine 112, may be implemented in many different ways in many different combinations of hardware, logic, circuitry, and executable instructions stored on a machine-readable medium. For example, the constraint learning engine 110, the constraint generation engine 112, or combinations thereof, may include circuitry in a controller, a microprocessor, or an application specific integrated circuit (ASIC), or may be implemented with discrete logic or components, or a combination of other types of analog or digital circuitry, combined on a single integrated circuit or distributed among multiple integrated circuits. A product, such as a computer program product, may include a storage medium and machine-readable instructions stored on the medium, which when executed in an endpoint, computer system, or other device, cause the device to perform operations according to any of the description above, including according to any features of the constraint learning engine 110, the constraint generation engine 112, or combinations thereof.
The processing capability of the systems, devices, and engines described herein, including the constraint learning engine 110 and the constraint generation engine 112, may be distributed among multiple system components, such as among multiple processors and memories, optionally including multiple distributed processing systems or cloud/network elements. Parameters, databases, and other data structures may be separately stored and managed, may be incorporated into a single memory or database, may be logically and physically organized in many different ways, and may be implemented in many ways, including data structures such as linked lists, hash tables, or implicit storage mechanisms. Programs may be parts (e.g., subroutines) of a single program, separate programs, distributed across several memories and processors, or implemented in many different ways, such as in a library (e.g., a shared library).
While various examples have been described above, many more implementations are possible.