This application claims priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2023-0142734, filed on Oct. 24, 2023 in the Korean Intellectual Property Office (KIPO), the contents of which are herein incorporated by reference in their entireties.
Embodiments of the present disclosure relate to the field of computer graphics, and more particularly, to a method for visualizing a cutting path in a simulation apparatus and a simulation apparatus for performing the same.
With the development of technologies such as computer graphics and digital twins, the technical barrier to simulating real-world objects in a virtual environment is gradually being lowered. In particular, since a process of cutting an object in the real world has to be performed safely and efficiently, such processes are often learned through virtual simulation. There is an increasing demand for realistic cutting simulation in environments that require a high level of safety and precision, such as surgical procedures, nuclear power plant decommissioning, and disaster environments. In addition, due to the interactive nature of virtual applications, it is important to show a cutting process in real time in response to the user's interaction. Therefore, much research on effectively and efficiently visualizing a cutting process in virtual cutting simulation is being conducted.
Embodiments of the present disclosure provide a method for visualizing a cutting path in a simulation apparatus, capable of realistically visualizing the cutting path with a small amount of calculation.
Embodiments of the present disclosure provide a simulation apparatus for performing a method for visualizing a cutting path efficiently and effectively.
In an embodiment of a method for visualizing a cutting path in a simulation apparatus, which is for cutting a target object by using a cutting tool in a virtual environment according to the present disclosure, the method includes setting a first point representing a start portion of the cutting tool and a second point representing an end portion of the cutting tool, obtaining a third point representing an intersection of the target object and the cutting tool based on the first point and the second point, calculating an incident depth and an incident angle of the cutting tool based on the first point, the second point, the third point, and a normal vector that is orthogonal to a first plane of the target object including the third point, generating a normal map based on the incident angle, the incident depth, a first depth representing a maximum depth at which the target object is able to be cut by the cutting tool, and the third point, and visualizing the cutting path, which represents lines obtained by connecting a plurality of third points to each other, based on normal maps generated as a result of repeatedly performing the obtaining of the third point, the calculating of the incident depth and the incident angle, and the generating of the normal map.
In an embodiment of a simulation apparatus for cutting a target object by using a cutting tool in a virtual environment according to the present disclosure, the simulation apparatus includes an input module configured to receive an input of a user and transmit an input signal to a controller, the controller configured to transmit a cutting image signal to a display module based on the input signal, and the display module configured to provide image information to the user based on the cutting image signal. The controller includes a first module configured to periodically set a first point representing a start portion of the cutting tool and a second point representing an end portion of the cutting tool based on the input signal, and obtain a third point representing an intersection of the target object and the cutting tool based on the first point and the second point, a second module configured to receive the first point, the second point, and the third point from the first module, and calculate an incident depth and an incident angle of the cutting tool based on the first point, the second point, the third point, and a normal vector that is orthogonal to a first plane of the target object including the third point, a third module configured to receive the third point from the first module, receive the incident depth and the incident angle from the second module, and generate a normal map based on the incident angle, the incident depth, a first depth representing a maximum depth at which the target object is able to be cut by the cutting tool, and the third point, and a fourth module configured to receive, from the third module, normal maps generated as a result of repeatedly performing an operation of obtaining the third point by the first module, an operation of calculating the incident depth and the incident angle by the second module, and an operation of generating the normal map by the third module, and to generate the cutting image signal for visualizing a cutting path representing lines obtained by connecting a plurality of third points to each other based on the normal maps.
According to the method for visualizing the cutting path in the simulation apparatus of embodiments of the present disclosure described above, the cutting path may be visualized by using a normal map without modifying a mesh of a target object, so that the cutting path can be expressed realistically with a small amount of calculation.
The above and other features and advantages of the present disclosure will become more apparent by describing in detail embodiments thereof with reference to the accompanying drawings, in which:
The present disclosure now will be described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the present disclosure are shown. The present disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein.
Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the present disclosure to those skilled in the art. Like reference numerals refer to like elements throughout.
It will be understood that, although the terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present disclosure. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
All methods described herein can be performed in a suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”), is intended merely to better illustrate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the inventive concept as used herein.
Hereinafter, the present disclosure will be explained in detail with reference to the accompanying drawings.
Referring to
According to the embodiments of the present disclosure, a first point representing a start portion of the cutting tool and a second point representing an end portion of the cutting tool may be set (step S100). The cutting tool may be replaced with two points to obtain an intersection of the target object and the cutting tool with a small amount of calculation. For example, the first point and the second point may be used to generate a ray in graphics pipeline-based ray casting. Specific examples of the first point and the second point will be described below with reference to
A third point representing an intersection of the target object and the cutting tool may be obtained based on the first point and the second point (step S200). For example, the third point may be obtained by using the graphics pipeline-based ray casting. For example, a ray that connects the first point to the second point may be generated, and coordinates of an intersection of the ray and a mesh triangle of the target object that has collided with the ray may be obtained. A specific example of the third point will be described below with reference to
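As an illustration, the graphics pipeline-based ray casting described above may be sketched as follows. This is a minimal example; the function name and the Möller–Trumbore formulation are assumptions, since the disclosure does not fix a particular intersection algorithm:

```python
import numpy as np

def ray_triangle_intersect(p1, p2, v0, v1, v2, eps=1e-9):
    """Intersect the ray generated from the first point p1 toward the second
    point p2 with a mesh triangle (v0, v1, v2) of the target object, using the
    Moller-Trumbore method. Returns the intersection (the "third point"),
    or None when the ray misses the triangle."""
    direction = p2 - p1
    edge1, edge2 = v1 - v0, v2 - v0
    h = np.cross(direction, edge2)
    a = np.dot(edge1, h)
    if abs(a) < eps:                 # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = p1 - v0
    u = f * np.dot(s, h)
    if u < 0.0 or u > 1.0:           # outside the triangle (barycentric u)
        return None
    q = np.cross(s, edge1)
    v = f * np.dot(direction, q)
    if v < 0.0 or u + v > 1.0:       # outside the triangle (barycentric v)
        return None
    t = f * np.dot(edge2, q)
    if t < 0.0:                      # intersection behind the first point
        return None
    return p1 + t * direction
```

A ray connecting the first point to the second point thus yields the coordinates of the third point on whichever mesh triangle it collides with.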
An incident depth and an incident angle of the cutting tool may be calculated based on a normal vector that is orthogonal to a first plane of the target object including the third point, the first point, the second point, and the third point (step S300). For example, as will be described below with reference to
A normal map in the form of an RGB image may be generated based on the incident angle, the incident depth, a first depth, and the third point (step S400). For example, the first depth may represent a maximum depth at which the target object may be cut by the cutting tool, and may be preset at an initial stage of a simulation operation. The calculation of the first depth will be described below with reference to
The cutting path may be visualized based on normal maps (step S500). The normal maps may be generated as a result of repeatedly performing the steps S200, S300, and S400. For example, each of the steps S200, S300, and S400 may be performed periodically. For example, the steps S200, S300, and S400 may be performed in an environment of 30 frames per second (fps). In this case, 30 normal maps may be generated per second. For example, the cutting path may represent lines obtained by connecting a plurality of third points to each other. For example, when the steps S200, S300, and S400 are performed in the environment of 30 fps, 30 third points may be obtained per second, and 29 lines may be obtained by connecting each pair of temporally neighboring points among the 30 third points. In this case, the 29 lines may be substantially identical to the cutting path. According to one embodiment, the normal maps may be aligned to match directions of the lines.
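The per-period accumulation of third points into path segments may be sketched as follows (the class name is illustrative; it is not named in the disclosure):

```python
import numpy as np

class CuttingPath:
    """Accumulates the per-frame intersection points ("third points") and
    exposes the line segments connecting temporally neighboring points."""

    def __init__(self):
        self.points = []

    def add_point(self, p3):
        """Record the third point obtained in the current period."""
        self.points.append(np.asarray(p3, dtype=float))

    def segments(self):
        # n points sampled at the frame rate yield n - 1 path segments;
        # at 30 fps, 30 third points give the 29 lines of the cutting path
        return [(self.points[i], self.points[i + 1])
                for i in range(len(self.points) - 1)]
```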
According to the method for visualizing the cutting path in the simulation apparatus of the embodiments of the present disclosure described above, the cutting path may be visualized by using a normal map without modifying a mesh of a target object, so that the cutting path may be expressed realistically with a small amount of calculation.
Referring to
Referring to
Referring to
Referring to
For example, the incident depth ID may be calculated as a distance between the second point P2 and the third point P3. For example, the incident depth ID may represent a depth at which the cutting tool has passed through the first plane S1. As will be described below with reference to
For example, the incident angle IA may be calculated as an angle formed between the normal vector NV and a vector V21 whose start point is the second point P2 and whose end point is the first point P1. For example, the incident angle IA may be calculated based on an inner product of the vector V21 and the normal vector NV. As will be described below with reference to
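The calculation of the incident depth ID and the incident angle IA may be sketched as follows (the function name is illustrative; the angle is recovered from the inner product as described above):

```python
import numpy as np

def incident_depth_and_angle(p1, p2, p3, normal):
    """Incident depth ID: distance between the second point P2 and the third
    point P3. Incident angle IA (degrees): angle between the vector V21 from
    P2 to P1 and the normal vector NV, via the inner product."""
    depth = np.linalg.norm(p3 - p2)
    v21 = p1 - p2
    cos_a = np.dot(v21, normal) / (np.linalg.norm(v21) * np.linalg.norm(normal))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return depth, angle
```

For a tool moving straight down along the surface normal, the incident angle is 0°.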
Referring to
A fifth point may be obtained based on the first point and the second point (step S210). For example, the fifth point may be included in the third points, and may represent an intersection of the target object and the cutting tool.
After a first time interval from a time at which the fifth point is obtained, a sixth point may be obtained based on the first point and the second point (step S220). For example, the sixth point may represent an intersection obtained after the fifth point among the third points repeatedly obtained at the first time interval. For example, the first time interval may be 1/30 second in an environment of 30 fps.
The fifth point and the sixth point may be connected in the form of a line (step S230). The line obtained by connecting the fifth point to the sixth point may be one of the lines obtained by connecting the third points to each other, which are described above with reference to
Referring to
Referring to
Referring to
A cutting speed of the cutting tool may be calculated based on the first time interval, a displacement of the fifth point, and a displacement of the sixth point (step S240). For example, the first time interval may represent a period at which the step S200 of
Referring to
The cutting speed and the critical speed may be compared with each other (step S401). For example, the critical speed may represent the maximum speed at which the cutting tool can engrave the surface of the target object by the thickness of the cutting tool. For example, when the cutting tool passes rapidly over the surface of the target object, the cutting tool may move away from the surface before the surface is engraved by the full thickness of the cutting tool. In this case, the surface of the target object may be engraved by a thickness that is less than the thickness of the cutting tool.
When the cutting speed is less than or equal to the critical speed (step S401: Yes), the normal map may be generated based on the incident angle, the incident depth, the first depth, the third point, and a preset thickness of the cutting tool (step S402). For example, a case in which the cutting speed is less than or equal to the critical speed may represent a case in which the surface of the target object is engraved by the thickness of the cutting tool. For example, the thickness of the cutting tool may be used in a polynomial function-based interpolation operation. A specific example of the visualization based on the thickness of the cutting tool will be described below with reference to
When the cutting speed is greater than the critical speed (step S401: No), the normal map may be generated based on the incident angle, the incident depth, the first depth, the third point, and a first thickness that is less than the thickness of the cutting tool (step S403). For example, a case in which the cutting speed is greater than the critical speed may represent a case in which the surface of the target object is not engraved by the thickness of the cutting tool. For example, the first thickness may be set to have a value that is gradually decreased as the cutting speed increases. For example, the first thickness may be set to be inversely proportional to a difference between the speed of the cutting tool and the critical speed. For example, the first thickness may be set by a user. For example, the first thickness may be used in the polynomial function-based interpolation operation. A specific example of the visualization based on the first thickness will be described below with reference to
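The cutting speed calculation of step S240 and the speed-dependent selection of the engraving thickness in steps S401 to S403 may be sketched as follows. The inverse-proportional form, the constant k, and the clamp to the tool thickness are illustrative assumptions:

```python
import numpy as np

def cutting_speed(p5, p6, dt):
    """Speed of the cutting tool from two successive intersection points
    (the fifth and sixth points) sampled at the first time interval dt,
    e.g. dt = 1/30 s in a 30 fps environment."""
    return np.linalg.norm(np.asarray(p6, dtype=float) - np.asarray(p5, dtype=float)) / dt

def effective_thickness(speed, critical_speed, tool_thickness, k=1.0):
    """Thickness used for normal-map generation (steps S401-S403).

    At or below the critical speed the full tool thickness is engraved
    (S401: Yes -> S402). Above it, a first thickness that shrinks as the
    speed excess grows is used (S401: No -> S403); the inverse-proportional
    form with constant k is an assumption, clamped so the first thickness
    never exceeds the tool thickness."""
    if speed <= critical_speed:
        return tool_thickness
    first = k / (speed - critical_speed)
    return min(first, tool_thickness)
```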
Referring to
Referring to
Referring to
A distance between the second point and the third point may be obtained as the incident depth (step S310). For example, the incident depth may represent a depth at which the cutting tool enters the surface of the target object in an opposite direction of the normal vector. An angle formed between a vector generated based on the first point and the second point and the normal vector may be obtained as the incident angle (step S320). For example, the incident angle may be obtained through the inner product between the vectors as described above with reference to
Referring to
The first depth may be obtained based on a material of the target object and a type of cutting tool (step S311). The first depth may represent a maximum depth at which the target object may be cut by the cutting tool. For example, when the material of the target object is stainless steel, and the cutting tool is a water jet cutter, the first depth may be set to 0.5 cm. In this case, the incident depth representing the depth at which the cutting tool enters the surface of the target object in the opposite direction of the normal vector may be adjusted to have a value that is less than the first depth.
The incident depth and the first depth may be compared with each other (step S312). When the incident depth is greater than or equal to the first depth (step S312: Yes), the incident depth may be adjusted to the first depth. For example, when the first depth is 0.5 cm, and the incident depth is 1 cm, the incident depth may be adjusted to 0.5 cm. When the incident depth is less than the first depth (step S312: No), the step S300 may end. For example, when the first depth is 0.5 cm, and the incident depth is 0.3 cm, the incident depth may not be adjusted.
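The comparison and adjustment of step S312 may be sketched as follows (the function name is illustrative):

```python
def clamp_incident_depth(incident_depth, first_depth):
    """Limit the incident depth to the first depth, the maximum depth at
    which the target object is able to be cut by the cutting tool.

    E.g. with a first depth of 0.5 cm, an incident depth of 1 cm is adjusted
    to 0.5 cm, while an incident depth of 0.3 cm is left unchanged."""
    return min(incident_depth, first_depth)
```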
Referring to
Coordinates at which the normal map is to be generated may be set based on the third point (step S410). For example, the normal map may be generated by using coordinates of the third point as a center point. For example, when the simulation apparatus uses a three-dimensional Cartesian coordinate system, and the coordinates of the third point are (0,0,0), a normal map in the form of an image having a size of 1024×1024 and using (0,0,0) as a center point may be generated. For example, the third point may correspond to (512,512) pixel coordinates in the normal map. For example, a direction of the normal map in a three-dimensional coordinate system may match a direction of the first plane (S1 of
A fourth point representing a point at which the target object is cut most deeply may be calculated based on the incident angle (step S420). For example, the fourth point may represent a portion in which the surface of the target object is indented most deeply. An RGB value of the normal map may be determined based on the incident depth and the fourth point (step S430). For example, a B value of a line segment including the fourth point on the normal map may be set to be the largest. For example, the B value of the line segment including the fourth point may be set to 255, and a B value at a periphery of the line segment may be set by using polynomial function-based interpolation. A specific operation of calculating the fourth point will be described below with reference to
Referring to
Referring to
[Equation 1] may be expressed as S4 = sin(IA) × (NMS/2). In [Equation 1], S4 may represent a degree by which a fourth point is spaced apart from a third point, and S4 may be calculated in a unit of a pixel. IA may represent an incident angle, and NMS may represent a width or length size of a normal map.
For example, when a size of a normal map NM_x is 1024×1024, and the incident angle IA_x is 30°, the fourth point P4_x may be spaced apart from the third point P3_x by sin(30°) × 512 pixels. In this case, a line segment LA_x including the fourth point on the normal map NM_x may represent a set of (256, n) pixels (where n is a natural number up to 1024). For example, a B value of the line segment LA_x including the fourth point may be set to the largest value within the normal map NM_x, and a B value at a periphery of the line segment LA_x including the fourth point may be calculated through the polynomial function-based interpolation.
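The offset of the fourth point described for [Equation 1], together with the 1024×1024 worked example above, may be sketched as follows (the function name is illustrative; the relation sin(IA) × NMS/2 is taken from the worked example):

```python
import math

def fourth_point_offset(incident_angle_deg, nms):
    """Pixel offset of the fourth (most deeply cut) point from the third
    point on a normal map of width/length size nms, following
    S4 = sin(IA) x (NMS / 2)."""
    return math.sin(math.radians(incident_angle_deg)) * nms / 2
```

With NMS = 1024 and IA = 30°, the offset is sin(30°) × 512 = 256 pixels, matching the line segment of (256, n) pixels described above.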
When comparing
Referring to
In [Equation 2], B may represent a B value of a normal map, ID may represent an incident depth, and D1 may represent a first depth. D1 may be preset based on a material of the target object and a type of cutting tool as described above with reference to
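The body of [Equation 2] is not reproduced in this text. A linear form in which the B value scales with the ratio of the incident depth ID to the first depth D1 is one illustrative assumption consistent with the surrounding description (deeper cuts give larger B values, bounded by the 8-bit channel maximum):

```python
def b_value(incident_depth, first_depth, max_b=255):
    """Illustrative B-channel value of the normal map: B = max_b x ID / D1,
    clamped to max_b. The exact form of [Equation 2] is an assumption here."""
    return min(max_b, round(max_b * incident_depth / first_depth))
```

Since the incident depth is adjusted to be at most the first depth (step S312), the ratio ID/D1 stays within [0, 1].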
Referring to
When comparing
Referring to
Referring to
Referring to
The simulation apparatus 1000 may be an apparatus for performing cutting simulation of cutting a target object by using a cutting tool in a virtual environment.
The input module 100 may include a three-dimensional input apparatus such as a haptic apparatus and a mouse. For example, the input module 100 may detect an input (e.g., a movement) of a user to convert the input into an input signal IS, and transmit the input signal IS to the controller 200.
The controller 200 may operate based on the input signal IS, and may transmit a cutting image signal VS for visualizing a cutting path based on a normal map to the display module 300.
The display module 300 may receive the cutting image signal VS from the controller 200, and provide image information VI to the user based on the cutting image signal VS.
The first module 210 may perform the steps S100 and S200 of
The second module 220 may perform the step S300 of
The third module 230 may perform the step S400 of
The fourth module 240 may perform the step S500 of
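The interaction of the first to fourth modules of the controller 200 may be sketched as the following per-period pipeline. All names are illustrative, and the module operations are injected as callables rather than implemented here:

```python
class Controller:
    """Illustrative sketch of the controller's per-frame pipeline."""

    def __init__(self, intersect, measure, make_normal_map):
        self.intersect = intersect              # first module: ray-cast the third point
        self.measure = measure                  # second module: incident depth and angle
        self.make_normal_map = make_normal_map  # third module: normal map generation
        self.normal_maps = []                   # accumulated for the fourth module

    def on_frame(self, p1, p2):
        """One period: obtain the third point from the first and second
        points, measure depth/angle, and generate one normal map."""
        p3 = self.intersect(p1, p2)
        if p3 is None:                          # cutting tool not touching the target
            return None
        depth, angle = self.measure(p1, p2, p3)
        nm = self.make_normal_map(p3, depth, angle)
        self.normal_maps.append(nm)             # fourth module builds the image signal from these
        return nm
```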
Referring to
The second module 220a may perform the step S240 of
The third module 230a may perform the steps S401, S402, and S403 of
The foregoing is illustrative of the present inventive concept and is not to be construed as limiting thereof. Although a few embodiments of the present inventive concept have been described, those skilled in the art will readily appreciate that many modifications are possible in the embodiments without materially departing from the novel teachings and advantages of the present inventive concept. Accordingly, all such modifications are intended to be included within the scope of the present inventive concept as defined in the claims. In the claims, means-plus-function clauses are intended to cover the structures described herein as performing the recited function and not only structural equivalents but also equivalent structures. Therefore, it is to be understood that the foregoing is illustrative of the present inventive concept and is not to be construed as limited to the specific embodiments disclosed, and that modifications to the disclosed embodiments, as well as other embodiments, are intended to be included within the scope of the appended claims. The present inventive concept is defined by the following claims, with equivalents of the claims to be included therein.
Number | Date | Country | Kind |
---|---|---|---|
10-2023-0142734 | Oct 2023 | KR | national |