The present application claims priority to Chinese Patent Application No. 202310337386.4, filed on Mar. 31, 2023, the content of which is incorporated herein by reference in its entirety.
The present disclosure relates to the field of difference analysis of 3D CAD models, in particular to a difference analysis method for 3D CAD models based on key-point matching.
CAD technology is widely applied in digital design and intelligent manufacturing. In the iterative update of a product from its initial version to its final version, its 3D CAD models must be constantly modified and optimized. It is therefore very important to identify the changes between different versions of a CAD model accurately and quickly to ensure production efficiency and product quality. Additionally, difference analysis methods for 3D CAD models are also widely used in the retrieval and reuse of models, the management of product information and the exchange of product data. However, the existing difference analysis methods for CAD models are not universal, because they require the models to be created in the same way or cannot identify the differences between two models in different coordinate systems.
In view of the problem that existing difference analysis methods for 3D CAD models are not universal or robust, the present disclosure provides a difference analysis method for 3D CAD models based on key-point matching. According to the method, the 3D CAD models are firstly transformed into a key-point representation; then a graph representation and a graph matching method are introduced to pre-match the vertices of two 3D CAD models, and parameters of rigid transformation between the two 3D CAD models are obtained through a mismatch elimination method; and subsequently, a matching relationship of key-points is established by checking their distances after the rigid transformation to obtain a key-point matching sequence for hierarchical comparison, so that the differences between the 3D CAD models can be identified quickly and accurately by analyzing the differences between their vertices, edges, faces and solid models.
The objective of the present disclosure is achieved through the following technical solution: a difference analysis method for 3D CAD models based on key-point matching, including the following steps:
Extracting intersections of edges from the 3D CAD models as the vertices, and recording the 3D coordinates of the vertices.
Transforming the edges of the 3D CAD models into a form of Non-Uniform Rational B-Spline (NURBS) curves, and taking NURBS control points except the vertices extracted from each edge as edge control points of the 3D CAD models, and recording 3D coordinates and weights of the edge control points.
Transforming the faces of the 3D CAD models into a NURBS form, and taking the NURBS control points except the vertices extracted from each face as the face control points of the 3D CAD models, and recording the 3D coordinates and weights of the face control points.
where {lni} is a set of sampling points of an ith node Gi, n=1, . . . , N1×N1, N1 is a sampling number, Si(u,v) is a sampling point on the model face Si corresponding to the node Gi obtained by the NURBS representation.
Uniformly sampling on a model edge corresponding to a link by a NURBS representation:
where {lmij} is a set of sampling points of a link Gij between the ith node and a jth node, m=1, . . . , N2, N2 is a sampling number, Cij(u) is a sampling point obtained from the NURBS representation of a model edge Cij corresponding to the link Gij.
where K is a number of points in the point sets, R* and T* are an optimal rotation matrix and an optimal translation matrix respectively, pk and qk are the 3D coordinates of the kth point in the two point sets {pk} and {qk}, respectively, and a solution is:
where R*=VUT and T*=q̄−R*p̄, in which p̄ and q̄ are the centroids of the two point sets {pk} and {qk}; U and V are obtained from the singular value decomposition of a covariance matrix Sc, Sc=XYT=UΣVT, and X and Y comprise the column vectors xk=pk−p̄ and yk=qk−q̄, respectively.
After obtaining the optimal rotation matrix and the optimal translation matrix, calculating the average matching distance error as follows:
The similarity between two nodes Gi and G′i′ in two graphs G and G′ is:
where {lni} is a set of sampling points of the node Gi in the graph G, and {l′ni′} is a set of sampling points of the node G′i′ in the graph G′.
The similarity between two links Gij and G′i′j′ in two graphs G and G′ is:
where {lmij} is a set of sampling points of the link Gij in the graph G, and {l′mi′j′} is a set of sampling points of a link G′i′j′ in the graph G′.
Solving M* by an integer projected fixed point method to obtain a node matching result of the two graphs, where the faces of the 3D CAD models corresponding to two matched nodes in the graphs are pre-matched faces; when all the faces adjacent to two edges are correspondingly pre-matched, the two edges are pre-matched edges; and when all the edges adjacent to two vertices are correspondingly pre-matched, the two vertices are pre-matched vertices.
where Ek({pk}, {qk}) is the matching distance error of the kth pair of pre-matched vertices in the point sets {pk} and {qk}, pk and qk are the 3D coordinates of the kth pair of pre-matched vertices in the point sets {pk} and {qk}.
After eliminating the vertex pair with the maximum matching distance error from the sets of pre-matched vertices, recalculating the average matching distance error and comparing it with the maximum allowable modeling error ε; repeating this elimination until the average matching distance error is smaller than ε, at which point the corresponding optimal rotation matrix and optimal translation matrix are taken as the rigid transformation parameters RM and TM between the 3D CAD models.
where piv and qi′v are the 3D coordinates of the ith vertex and the i′th vertex in vertex sets {piv} and {qi′v} of the two 3D CAD models; and when the obtained eii′v is smaller than the maximum allowable modeling error ε, the two corresponding vertices are matched vertices.
Substituting the rigid transformation parameters RM and TM into every pair of the edge control points, and calculating the matching distance error:
where pie and qi′e are 3D coordinates of the ith edge control point and the i′th edge control point in edge control point sets {pie} and {qi′e} of the two 3D CAD models, w(pie) and w(qi′e) are weights of the corresponding edge control points; and when the obtained eii′e is smaller than the maximum allowable modeling error ε, the two corresponding edge control points are matched edge control points.
Substituting the rigid transformation parameters RM and TM into each pair of face control points, and calculating the matching distance error:
where pif and qi′f are 3D coordinates of the ith face control point and the i′th face control point in face control point sets {pif} and {qi′f} of the two 3D CAD models, w(pif) and w(qi′f) are weights of the corresponding face control points; and when the obtained eii′f is smaller than the maximum allowable modeling error ε, the two corresponding face control points are matched face control points.
Further, in S2, the method for acquiring the edge control points specifically includes:
Transforming the edges of the 3D CAD models into a NURBS form, and a curve {right arrow over (C)}(u) underlying the edge can be represented as:
where u is a normalized parameter, n is a number of control points of a NURBS curve, k is an order of the NURBS curve, Ni,k(u) is an ith k-order B-spline basis function, {right arrow over (P)}i is a 3D coordinate of an ith control point, and wi is a weight corresponding to the ith control point; then taking all the NURBS control points except the vertices extracted from each edge as the edge control points of the 3D CAD models, and recording the 3D coordinates and the weights of the edge control points.
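Since the displayed formula is not reproduced in this text, the standard rational form matching the symbols just defined is (the index convention i = 1, …, n is assumed):

```latex
\vec{C}(u) \;=\; \frac{\sum_{i=1}^{n} N_{i,k}(u)\, w_i\, \vec{P}_i}{\sum_{i=1}^{n} N_{i,k}(u)\, w_i}, \qquad u \in [0,1]
```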
Further, in S2, the method for acquiring the face control points specifically includes:
Transforming the faces of the 3D CAD models into a NURBS form, and a surface {right arrow over (S)}(u,v) underlying the faces can be represented as:
where u,v are normalized parameters, n,m are numbers of the control points in two dimensions of a NURBS surface, k,l are orders of the two dimensions of the NURBS surface, Ni,k(u) is the ith k-order B-spline basis function, Nj,l(v) is a jth l-order B-spline basis function, {right arrow over (P)}i,j is a 3D coordinate of the control point at a position (i, j), wi,j is a weight corresponding to the control point at the position (i, j); then taking the NURBS control points except the vertices extracted from each face as the face control points of the 3D CAD models, and recording the 3D coordinates and the weights of the face control points.
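Since the displayed formula is not reproduced in this text, the standard rational surface form matching the symbols just defined is (the index conventions i = 1, …, n and j = 1, …, m are assumed):

```latex
\vec{S}(u,v) \;=\; \frac{\sum_{i=1}^{n}\sum_{j=1}^{m} N_{i,k}(u)\, N_{j,l}(v)\, w_{i,j}\, \vec{P}_{i,j}}{\sum_{i=1}^{n}\sum_{j=1}^{m} N_{i,k}(u)\, N_{j,l}(v)\, w_{i,j}}, \qquad u,v \in [0,1]
```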
Further, after obtaining a difference analysis result of the 3D CAD models, a visualization interface for the difference analysis of the 3D CAD models is developed based on an Open Cascade platform to display the two 3D CAD models in comparison and highlight all different faces of the 3D CAD models.
The method has the beneficial effects that the difference between 3D CAD models in different poses can be quickly and accurately identified with the consideration of modeling errors, and an intuitive difference analysis result can be provided.
The object and effect of the present disclosure will become more apparent by describing the present disclosure in detail according to the attached drawings and preferred embodiments. It should be understood that the specific embodiments described here are only for explaining the present disclosure and are not used to limit the present disclosure.
In the difference analysis method for 3D CAD models based on key-point matching provided by the present disclosure, the 3D CAD model is firstly transformed into a key-point representation, then a graph representation and a graph matching method are introduced to establish a pre-matching of vertices between the two 3D CAD models, and rigid transformation parameters between the two 3D CAD models are obtained through a mismatch elimination method; subsequently, a matching relationship of key-points is established by checking their distance after rigid transformation to obtain a key-point matching sequence for hierarchical comparison, so as to identify the differences between vertices, edges, faces and solid models. The flow chart of the method is shown in
Intersections of edges are extracted from the 3D CAD models as the vertices, and the 3D coordinates of the vertices are recorded.
The edges of the 3D CAD models are transformed into the NURBS form, and the curve {right arrow over (C)}(u) underlying the edge can be represented as:
where u is a normalized parameter, n is a number of control points of the NURBS curve, k is an order of the NURBS curve, Ni,k(u) is an ith k-order B-spline basis function, {right arrow over (P)}i is the 3D coordinates of the ith control point, and wi is a weight corresponding to the ith control point; the NURBS control points except the vertices extracted from each edge are taken as the edge control points of the 3D CAD models, and the 3D coordinates and the weights of the edge control points are recorded.
The faces of the 3D CAD models are transformed into the NURBS form, and the surface {right arrow over (S)}(u,v) underlying the faces can be represented as:
where u,v are normalized parameters, n,m are numbers of the control points in two dimensions of the NURBS surface, k,l are orders of the two dimensions of the NURBS surface, Ni,k(u) is the ith k-order B-spline basis function, Nj,l(v) is a jth l-order B-spline basis function, {right arrow over (P)}i,j is the 3D coordinate of the control point at a position (i, j), and wi,j is a weight corresponding to the control point at the position (i, j); and the NURBS control points except the vertices extracted from each face are taken as the face control points of the 3D CAD models, and the 3D coordinates and the weights of the face control points are recorded.
where {lni} is a set of sampling points of an ith node Gi, n=1, . . . , N1×N1, N1 is a sampling number, Si(u,v) is a sampling point on the model face Si corresponding to the node Gi obtained by the NURBS representation.
Uniform sampling is performed on a model edge corresponding to a link by a NURBS representation:
where {lmij} is a set of sampling points of a link Gij between the ith node and a jth node, m=1, . . . , N2, N2 is a sampling number, Cij(u) is a sampling point obtained from the NURBS representation of a model edge Cij corresponding to the link Gij.
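As an illustrative sketch (the function name and the clamped-knot example are assumptions, not taken from the disclosure), uniform sampling on a model edge can be implemented by evaluating the rational NURBS curve as an ordinary B-spline in homogeneous coordinates and projecting back:

```python
import numpy as np
from scipy.interpolate import BSpline

def sample_nurbs_edge(knots, ctrl_pts, weights, n_samples):
    """Uniformly sample n_samples points on a NURBS curve C(u).

    The rational curve is evaluated as an ordinary B-spline on the
    homogeneous control points (w_i * P_i, w_i), then divided by the
    weight coordinate.
    """
    ctrl_pts = np.asarray(ctrl_pts, dtype=float)        # (n, 3)
    weights = np.asarray(weights, dtype=float)          # (n,)
    degree = len(knots) - len(ctrl_pts) - 1
    homo = np.hstack([ctrl_pts * weights[:, None], weights[:, None]])
    spline = BSpline(np.asarray(knots, dtype=float), homo, degree)
    u = np.linspace(knots[degree], knots[-degree - 1], n_samples)
    pts = spline(u)                                     # (n_samples, 4)
    return pts[:, :3] / pts[:, 3:4]

# Example: quadratic NURBS quarter circle from (1,0,0) to (0,1,0)
knots = [0, 0, 0, 1, 1, 1]
ctrl = [[1, 0, 0], [1, 1, 0], [0, 1, 0]]
w = [1.0, np.sqrt(2) / 2, 1.0]
samples = sample_nurbs_edge(knots, ctrl, w, 50)
```

The same evaluation on an N1×N1 parameter grid of a NURBS surface yields the face sampling points {lni} described earlier.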
where K is a number of points in the point sets, R* and T* are an optimal rotation matrix and an optimal translation matrix respectively, pk and qk are the 3D coordinates of the kth point in the two point sets {pk} and {qk} respectively, and a solution is:
where R*=VUT and T*=q̄−R*p̄, in which p̄ and q̄ are the centroids of the two point sets {pk} and {qk}; U and V are obtained from the singular value decomposition of a covariance matrix Sc, Sc=XYT=UΣVT, and X and Y comprise the column vectors xk=pk−p̄ and yk=qk−q̄, respectively.
After obtaining the optimal rotation matrix and the optimal translation matrix, the average matching distance error is calculated as follows:
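The displayed formula is not reproduced here; the sketch below uses the mean of ||R*pk + T* − qk|| over the K pairs as the average matching distance error. The closed-form pose is the Kabsch procedure implied by the covariance/SVD notation above; the reflection guard D is a common safeguard added here, not stated in the text:

```python
import numpy as np

def rigid_fit(p, q):
    """Optimal rotation R* and translation T* minimizing the mean
    squared distance ||R* p_k + T* - q_k||^2 (Kabsch procedure)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    p_bar, q_bar = p.mean(axis=0), q.mean(axis=0)
    Sc = (p - p_bar).T @ (q - q_bar)        # covariance matrix X Y^T
    U, _, Vt = np.linalg.svd(Sc)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                      # guard against reflections
    T = q_bar - R @ p_bar
    return R, T

def avg_match_error(p, q, R, T):
    """Average matching distance error after rigid transformation."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.linalg.norm((R @ p.T).T + T - q, axis=1).mean()
```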
The similarity between two nodes Gi and G′i′ in two graphs G and G′ is:
where {lni} is a set of sampling points of the node Gi in a graph G, and {l′ni′} is a set of sampling points of a node G′i′ in a graph G′.
The similarity between two links Gij and G′i′j′ in the two graphs G and G′ is:
where {lmij} is a set of sampling points of the link Gij in the graph G, and {l′mi′j′} is a set of sampling points of a link G′i′j′ in the graph G′.
M* is solved by an integer projected fixed point method, and a node matching result of the two graphs is obtained; the faces on the 3D CAD models corresponding to the two nodes matched in the graphs are pre-matched faces; when all the faces adjacent to two edges are correspondingly pre-matched, the two edges are pre-matched edges; and when all the edges adjacent to two vertices are correspondingly pre-matched, the two vertices are pre-matched vertices.
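This solution step can be sketched with the standard integer projected fixed point (IPFP) iteration; the affinity matrix M (node similarities on its diagonal, link similarities off it) is assumed to be given, and the Hungarian method is used here as the discrete projection, which is an implementation choice, not prescribed by the text:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def ipfp(M, n1, n2, iters=100):
    """Integer projected fixed point iteration for graph matching.

    M is the (n1*n2) x (n1*n2) affinity matrix; returns an n1 x n2
    boolean assignment matrix encoding the node matching result."""
    def project(score):
        # Discrete projection: one-to-one assignment maximizing the
        # linear score, found with the Hungarian method.
        rows, cols = linear_sum_assignment(-score.reshape(n1, n2))
        b = np.zeros(n1 * n2)
        b[rows * n2 + cols] = 1.0
        return b

    x = np.full(n1 * n2, 1.0 / max(n1, n2))   # uniform starting point
    for _ in range(iters):
        b = project(M @ x)                    # best discrete direction
        d = b - x
        C, D = x @ M @ d, d @ M @ d           # quadratic line search
        x_new = b if D >= 0 else x + min(1.0, -C / D) * d
        if np.allclose(x_new, x):
            break
        x = x_new
    return project(M @ x).reshape(n1, n2).astype(bool)
```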
where Ek({pk},{qk}) is the matching distance error of the kth pair of the pre-matched vertices in the point sets {pk} and {qk}, pk and qk are the 3D coordinates of the kth pair of the pre-matched vertices in the point sets {pk} and {qk}.
After the vertex pair with the maximum matching distance error is eliminated from the point sets of the pre-matched vertices, the average matching distance error is recalculated and compared with the maximum allowable modeling error ε; this elimination is repeated until the average matching distance error is smaller than ε, and the corresponding optimal rotation matrix and optimal translation matrix are taken as the rigid transformation parameters RM and TM between the 3D CAD models.
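A self-contained sketch of this mismatch elimination loop follows (rigid_fit restates the Kabsch computation so the snippet runs on its own; eps corresponds to the maximum allowable modeling error ε, and the minimum of three pairs is an assumption needed to fix a 3D pose):

```python
import numpy as np

def rigid_fit(p, q):
    # Kabsch: optimal rotation and translation for matched point sets.
    p_bar, q_bar = p.mean(axis=0), q.mean(axis=0)
    U, _, Vt = np.linalg.svd((p - p_bar).T @ (q - q_bar))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, q_bar - R @ p_bar

def eliminate_mismatches(p, q, eps):
    """Drop the pre-matched vertex pair with the largest matching
    distance error and refit, until the average error is below eps."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    while len(p) >= 3:                       # need >= 3 pairs for a pose
        R, T = rigid_fit(p, q)
        errors = np.linalg.norm((R @ p.T).T + T - q, axis=1)
        if errors.mean() < eps:
            return R, T, p, q                # R_M, T_M and the inliers
        worst = int(np.argmax(errors))       # eliminate the worst pair
        p = np.delete(p, worst, axis=0)
        q = np.delete(q, worst, axis=0)
    raise ValueError("too few consistent vertex pairs")
```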
where piv and qi′v are the 3D coordinates of the ith vertex and the i′th vertex in vertex sets {piv} and {qi′v} of the two 3D CAD models; when the matching distance error eii′v is smaller than the maximum allowable modeling error ε, the two corresponding vertices are matched vertices.
The rigid transformation parameters RM and TM are substituted into every pair of the edge control points, and the matching distance error is calculated:
where pie and qi′e are the 3D coordinates of the ith edge control point and the i′th edge control point in edge control point sets {pie} and {qi′e} of the two 3D CAD models, w(pie) and w(qi′e) are the weights of the corresponding edge control points; and when the matching error eii′e is smaller than the maximum allowable modeling error ε, the two corresponding edge control points are matched edge control points.
The rigid transformation parameters RM and TM are substituted into each pair of the face control points, and the matching error is calculated:
where pif and qi′f are the 3D coordinates of the ith face control point and the i′th face control point in the face control point sets {pif} and {qi′f} of the two 3D CAD models, w(pif) and w(qi′f) are the weights of the corresponding face control points; and when the matching error eii′f is smaller than the maximum allowable modeling error ε, the two corresponding face control points are matched face control points.
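Each of the three matching steps above (vertices, edge control points, face control points) reduces to the same distance check once RM and TM are fixed; a generic sketch follows (the additional weight comparison for control points, whose exact form is not reproduced in this text, is omitted here):

```python
import numpy as np

def match_keypoints(p_set, q_set, R, T, eps):
    """Match key-points of two models: a pair (i, i') is matched when
    the distance ||R p_i + T - q_i'|| is smaller than eps."""
    p_set = np.asarray(p_set, dtype=float)
    q_set = np.asarray(q_set, dtype=float)
    moved = (R @ p_set.T).T + T
    # all pairwise distances between transformed model-1 key-points
    # and model-2 key-points
    dist = np.linalg.norm(moved[:, None, :] - q_set[None, :, :], axis=2)
    return [(i, int(dist[i].argmin()))
            for i in range(len(p_set)) if dist[i].min() < eps]
```

Key-points of the first model left unmatched by this check are candidates for the difference analysis result.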
The method of the present disclosure will be described with reference to specific embodiments.
Two specific 3D CAD models are selected for difference analysis. As shown in
It can be understood by those skilled in the art that the above is only a preferred example of the present disclosure, and it is not used to limit the present disclosure. Although the present disclosure has been described in detail with reference to the above examples, it is still possible for those skilled in the art to modify the technical solution described in the above examples or replace some technical features equally. Any modification and equivalent substitution within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.
Number | Date | Country | Kind |
---|---|---|---|
202310337386.4 | Mar 2023 | CN | national |