This patent application is based on and claims priority pursuant to 35 U.S.C. § 119(a) to Japanese Patent Application No. 2018-002980, filed on Jan. 11, 2018, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
Aspects of the present disclosure relate to an information processing apparatus, an information processing method, and a computer program product.
Apparatuses are known that perform three-dimensional fabrication by sequential lamination to produce a three-dimensional object (3D object) from a three-dimensional model of the object. A lamination fabrication method employed by these apparatuses slices three-dimensional data of the 3D object into a plurality of thin parallel planes, superimposes the thin planes (thin plates) to generate original data used in the fabrication of the 3D object, and laminates material such as powder, resin, steel plate, paper, or the like to produce the 3D object. As such lamination fabrication methods, an inkjet method, a powder method, a stereolithography method, a sheet lamination method, an extrusion method, and the like are known.
Moreover, a lamination fabrication method is known in which vaporized solvent is made to penetrate the surface of a 3D object to soften the modeling material at the surface and thereby flatten and smooth the surface of the 3D object.
In an aspect of this disclosure, a novel information processing apparatus for generating a toolpath of a three-dimensional fabrication apparatus is provided. The information processing apparatus includes circuitry to slice three-dimensional data of the three-dimensional object into a plurality of parallel planes to generate a first image slice and a second image slice, each of the first image slice and the second image slice being one of the plurality of parallel planes, extract a first contour in the first image slice and a second contour in the second image slice, the first contour dividing the first image slice into an area inside the three-dimensional object and an area outside the three-dimensional object, the second contour dividing the second image slice into the area inside the three-dimensional object and the area outside the three-dimensional object, correct a third contour of a third image slice disposed between the first image slice and the second image slice based on the first contour of the first image slice and the second contour of the second image slice, and create a toolpath from the first contour, the second contour, and the third contour.
In another aspect of this disclosure, a novel information processing method for generating a toolpath of a three-dimensional fabrication apparatus is provided. The information processing method includes slicing three-dimensional data of a three-dimensional object into a plurality of parallel planes to generate a first image slice and a second image slice, each of the first image slice and the second image slice being a cross-section of the plurality of parallel planes, extracting a first contour in the first image slice and a second contour in the second image slice, the first contour and the second contour respectively dividing the first image slice and the second image slice into an area inside the three-dimensional object and an area outside the three-dimensional object, correcting a third contour of a third image slice disposed between the first image slice and the second image slice based on the first contour of the first image slice and the second contour of the second image slice, and creating a toolpath from the first contour, the second contour, and the third contour.
In still another aspect of this disclosure, a novel computer program product including a non-transitory computer-readable medium containing an information processing program is provided. The computer program product causes a device to perform: slicing three-dimensional data of a three-dimensional object into a plurality of parallel planes to generate a first image slice and a second image slice, each of the first image slice and the second image slice being one of the plurality of parallel planes, extracting a first contour in the first image slice and a second contour in the second image slice, the first contour dividing the first image slice into an area inside the three-dimensional object and an area outside the three-dimensional object, the second contour dividing the second image slice into the area inside the three-dimensional object and the area outside the three-dimensional object, correcting a third contour of a third image slice disposed between the first image slice and the second image slice based on the first contour of the first image slice and the second contour of the second image slice, and creating a toolpath from the first contour, the second contour, and the third contour.
The aforementioned and other aspects, features, and advantages of the present disclosure will be better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
The accompanying drawings are intended to depict embodiments of the present disclosure and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have the same function, operate in an analogous manner, and achieve similar results.
Although the embodiments are described with technical limitations with reference to the attached drawings, such description is not intended to limit the scope of the disclosure and all the components or elements described in the embodiments of this disclosure are not necessarily indispensable. As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Embodiments of the present disclosure are described below with reference to the attached drawings.
The information processing apparatus 10 is a general-purpose computer such as a personal computer (PC) and is a device that implements certain functions by executing one or more programs (described below). As another example, the information processing apparatus 10 may be a tablet terminal, a smartphone, a personal digital assistant (PDA), a mobile phone, a wearable PC, a game machine, a car navigation terminal, an electronic whiteboard, or a projector.
The 3D printer 20 is, for example, an apparatus for fabricating a three-dimensional object (3D object) by extrusion (the fused deposition modeling (FDM) method). The 3D printer 20 receives data transmitted from the information processing apparatus 10 and laminates fabrication layers based on the data to form the 3D object. The information processing apparatus 10 and the 3D printer 20 may be connected via a local area network (LAN), a wide area network (WAN), the Internet, or the like. Further, the information processing apparatus 10 and the 3D printer 20 may be connected via a universal serial bus (USB) cable or the like. The connection may be wholly wired, or part or all of the connection may be wireless.
The information processing apparatus 10 analyzes three-dimensional data (3D data) to construct a three-dimensional model (3D model) and slices the 3D model at equal intervals of a predetermined lamination thickness (lamination pitch) to generate slice data. The information processing apparatus 10 generates a toolpath based on the slice data and transmits the toolpath to the 3D printer 20. The toolpath may also be provided to the 3D printer 20 stored in a recording medium such as a USB memory or a secure digital (SD) memory card. The 3D printer 20 may read the toolpath from the recording medium attached to a recording medium interface (recording medium I/F).
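As a rough, non-limiting illustration of the relation between the lamination pitch and the slice data, the following minimal sketch (the function name, variable names, and numeric values are assumptions for illustration only) computes the z-heights at which a 3D model would be sliced:

```python
# Minimal sketch (hypothetical): derive the z-heights of the slice planes
# from the model height and the lamination pitch.
def slice_heights(z_min, z_max, pitch):
    heights = []
    z = z_min + pitch / 2.0  # assume each slice samples the middle of its layer
    while z < z_max:
        heights.append(z)
        z += pitch
    return heights

# Example: a 10 mm tall model sliced at a 0.2 mm lamination pitch gives 50 planes.
print(len(slice_heights(0.0, 10.0, 0.2)))  # 50
```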
The lamination fabrication method slices the three-dimensional data of the 3D object into a plurality of parallel planes (plates) to generate a plurality of image slices, superimposes these thin planes (plates) to generate original data used in fabrication of the 3D object, and laminates material such as powder, resin, steel plate, paper, or the like to produce the 3D object.
Note that the information processing apparatus 10 and the 3D printer 20 may be configured as a single unit. That is, the 3D printer 20 may have a function of the information processing apparatus 10 and may perform a process such as generating the toolpath from the 3D data.
The 3D printer 20 fabricates the 3D object based on the toolpath. A material extrusion deposition method (FDM method), a material jetting method, a binder jetting method, a selective laser sintering (SLS) method, a stereolithography (SLA) method, and the like are available as fabrication methods used in the 3D printer 20. The material extrusion deposition method (FDM method) extrudes thermally melted resin through a nozzle and laminates layers of the thermally melted resin to fabricate the 3D object. In addition to resin, a fluid material such as metal may be used as a material of the 3D printer 20. The material jetting method discharges resin from an inkjet head, cures the resin with ultraviolet rays, and laminates layers of the cured resin. The binder jetting method discharges a liquid binder from an inkjet head to cure gypsum or resin powder layer by layer. The SLS method irradiates a powdery material with a laser to sinter the powdery material. The SLA method cures liquid photocurable resin with an ultraviolet laser and laminates layers of the cured resin one by one to fabricate the 3D object.
In the present embodiment, for convenience of explanation, the 3D printer 20 using the material extrusion deposition method (FDM method) is described below as an example. The method of changing the toolpath according to the present embodiment may also be applied to change the toolpath for each of the fabrication methods described above.
The CPU 101 controls the entire operation of the information processing apparatus 10. The CPU 101 serves as circuitry to extract the corner pixels and create the toolpath. The ROM 102 stores programs and data for operating the CPU 101. The RAM 103 is used as a work area for the CPU 101. The HD 104 stores programs, an operating system (OS), and various data. The HDD 105 controls reading or writing of various types of data from and to the HD 104 according to the control of the CPU 101. The network I/F 109 is an interface for performing data communication using a network.
The keyboard 111 is a device including a plurality of keys for the user to input characters, numerical values, various instructions, and the like. The mouse 112 is a device for the user to select various instructions, execute selected instructions, select a process target, move a cursor, and the like. The media drive 107 controls reading or writing (storing) of data from or to the recording medium 106 such as a flash memory. The optical drive 114 controls reading or writing of various data from or to the optical disk 113, such as a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD, trademark), or a Blu-ray Disc (trademark), as an example of a detachable recording medium. The display 108 displays various information such as a cursor, a menu, a window, characters, images, and the like. The display 108 may be a projector or the like. The USB I/F 115 is an interface for connecting a USB cable, a USB memory, or the like.
The communication unit 1001 is implemented by the network I/F 109 or the USB I/F 115.
The slice contour extraction unit 1003 extracts the contour of the 3D object to be fabricated in each image slice generated by the slice generation unit 1002. A contour is defined by a line connecting a plurality of corner pixels. A detailed description of contour extraction is given below.
The contour correction unit 1004 corrects the contour by comparing the contour extracted by the slice contour extraction unit 1003 with the image slice of the upper layer or the lower layer of the extracted contour. A detailed description of contour correction is given below.
The 3D printer 20 includes a discharge nozzle from which material such as resin is discharged. The toolpath generation unit 1005 generates a toolpath of the 3D printer 20, including a trajectory of the discharge nozzle, a moving speed of the discharge nozzle, an amount of material discharged from the discharge nozzle, and the like, based on the contour corrected by the contour correction unit 1004. One operation of the discharge nozzle is defined by three parameters: the trajectory of the discharge nozzle, the moving speed of the discharge nozzle, and the amount of material supplied to the discharge nozzle (the amount of material discharged from the discharge nozzle). The trajectory of the discharge nozzle, the moving speed of the discharge nozzle, and the amount of material supplied to the discharge nozzle may be collectively referred to as the “toolpath”.
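As a non-limiting sketch of how one such operation could be held in data (the structure and field names below are hypothetical, not a format prescribed by the 3D printer 20):

```python
# Minimal sketch (hypothetical): one toolpath operation defined by the three
# parameters described above.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ToolpathSegment:
    trajectory: List[Tuple[float, float, float]]  # nozzle positions (x, y, z) in mm
    speed: float                                  # moving speed of the nozzle in mm/s
    material_amount: float                        # material supplied to the nozzle in mm^3

segment = ToolpathSegment(
    trajectory=[(0.0, 0.0, 0.2), (10.0, 0.0, 0.2)],
    speed=40.0,
    material_amount=0.8,
)
```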
The storage unit 1006 includes a program 1007 for implementing each functional unit, a 3D data storage unit 1008 for holding 3D data of the 3D object to be fabricated, and a toolpath storage unit 1009 for holding a toolpath to be output to the 3D printer 20. The storage unit 1006 supplies necessary data to each functional unit according to the control of the CPU 101.
In step S01, the slice generation unit 1002 acquires 3D data from the 3D data storage unit 1008 and slices the 3D data of the 3D object to be fabricated into a plurality of parallel planes at a predetermined lamination thickness to generate image slices. Each of the image slices is a cross-section of the plurality of parallel planes. As many image slices as the number of layers to be laminated are generated and output to the slice contour extraction unit 1003.
Next, in step S02, the slice contour extraction unit 1003 extracts a contour of each image slice. In step S03, the contour correction unit 1004 calculates the contour to be corrected and the amount of correction based on the contour extracted for each image slice in step S02. In step S04, the toolpath generation unit 1005 creates a toolpath from the contours corrected in each image slice based on the correction target and the amount of correction calculated in step S03.
Thus, the pixels of the contour can be obtained by a known method of extracting the contour of an image. Each of the corner pixels is adjacent to pixels outside the 3D object among its 8-connected pixels, that is, the eight pixels in the vertical, horizontal, and diagonal directions. The direction of the toolpath along the successive contour changes at the corner pixels. If two pixels (a first pixel and a second pixel) of the contour are adjacent to each other in a diagonal direction, and a third pixel of the contour portion is adjacent to both of these two pixels, the third pixel may be excluded from the contour. A detailed description of this exclusion process is given below.
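A minimal sketch of this contour-pixel test, assuming the image slice is held as a binary grid in which 1 marks a pixel inside the 3D object and 0 a pixel outside it (the function name and representation are assumptions):

```python
# Minimal sketch (hypothetical representation): extract the contour pixels of
# a binary image slice. A pixel inside the 3D object (value 1) belongs to the
# contour when at least one of its 8-connected pixels lies outside the object.
def contour_pixels(slice_img):
    h, w = len(slice_img), len(slice_img[0])
    neighbors = [(-1, -1), (-1, 0), (-1, 1),
                 (0, -1),           (0, 1),
                 (1, -1),  (1, 0),  (1, 1)]
    contour = set()
    for y in range(h):
        for x in range(w):
            if slice_img[y][x] != 1:
                continue  # only pixels inside the 3D object can be contour pixels
            for dy, dx in neighbors:
                ny, nx = y + dy, x + dx
                # Pixels beyond the image border are treated as outside the object.
                if not (0 <= ny < h and 0 <= nx < w) or slice_img[ny][nx] == 0:
                    contour.add((x, y))
                    break
    return contour

# A 4x4 slice whose inner 2x2 block is inside the object: every inside pixel
# touches an outside pixel, so all four belong to the contour.
example = [[0, 0, 0, 0],
           [0, 1, 1, 0],
           [0, 1, 1, 0],
           [0, 0, 0, 0]]
print(sorted(contour_pixels(example)))  # [(1, 1), (1, 2), (2, 1), (2, 2)]
```

Corner pixels could then be selected from these contour pixels as the pixels at which the direction of the contour changes, as described above.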
In this way, the present embodiment can extract the corner pixels and create the toolpath.
When generating the image slices, it is determined for each pixel in the image slices whether the pixel belongs to the pixels inside the 3D object (white pixels) or the pixels outside the 3D object (black (hatched) pixels). The method of determination depends on the method used for generating the image slices.
Whether a pixel belongs to the pixels inside the 3D object or the pixels outside the 3D object is determined according to the actual occupancy rate of the area inside the 3D object and the area outside the 3D object in that pixel. Thus, the image slice of a lower layer and the image slice of an upper layer may coincide in a portion in which the surface of the 3D object changes continuously, that is, a portion in which the change of the shape of the contour of the actual 3D object on the XY plane between the lamination layers is smaller than the resolution of the image slices. Here, the resolution of an image slice is the size of the actual area corresponding to one pixel of the image slice.
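A minimal sketch of this determination, assuming a simple majority threshold on the occupancy rate (the threshold value is an assumption; no specific value is prescribed above):

```python
# Minimal sketch (hypothetical): determine whether a pixel of an image slice
# belongs to the inside or the outside of the 3D object from the occupancy
# rate of the area inside the object within that pixel.
def is_inside_pixel(occupancy_rate, threshold=0.5):
    # True  -> pixel inside the 3D object (white pixel)
    # False -> pixel outside the 3D object (black (hatched) pixel)
    return occupancy_rate >= threshold

print(is_inside_pixel(0.7))  # True: the object covers most of the pixel
print(is_inside_pixel(0.2))  # False: the object covers little of the pixel
```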
When the toolpath is created from the image slices obtained as described above by the method of creating the toolpath described earlier, the same toolpath is created for the successive layers whose image slices coincide, so the surface of the fabricated 3D object becomes step-like rather than following the continuous change of the surface of the actual 3D object.
Each of the image slices SL1 to SL3 includes a corner pixel extracted at the same coordinates; CSL3 denotes the corner pixel of the uppermost image slice SL3. The image slice SL0, disposed three layers below the image slice SL3, includes a corner pixel CSL0 at one of the 8-connected pixels of the coordinates of CSL3.
In step S11, the contour correction unit 1004 selects one of the image slices from which the contours are extracted by the slice contour extraction unit 1003, selects one corner pixel in the selected image slice, and sets a counter Count to zero.
In the following step S12, the contour correction unit 1004 sets the image slice one layer below the image slice selected in step S11 as a search target.
In the following step S13, the contour correction unit 1004 determines whether the corner pixels of the image slice set as the search target in step S12 include a corner pixel at the same coordinates as the corner pixel selected in step S11. That is, the contour correction unit 1004 determines whether the plurality of corner pixels of a third contour in a third image slice (the image slice set as the search target in step S12) include a corner pixel at the same coordinates as the corner pixel of a first contour in a first image slice (the corner pixel of the image slice selected in step S11). If the determination is YES, Count is incremented by one in step S16, and the process returns to step S12. If the determination is NO, the process proceeds to step S14.
In the following step S14, the value of Count is determined. When Count is smaller than one, the contour correction unit 1004 determines that no corner pixel exists at the same coordinates in the image slice set as the search target in step S12. Then, in step S17, the contour correction unit 1004 changes the corner pixel selected in step S11 or, when all the corner pixels in the image slice selected in step S11 have already been selected, selects the image slice one layer below the image slice set in step S12 (changes the image slice). The process then returns to step S11. However, a corner pixel whose coordinates have been corrected in step S15 is not selected in step S11 when the image slice one layer below the image slice set in step S12 is selected. If Count is one or more, the process proceeds to step S15.
In step S15, the contour correction unit 1004 corrects the coordinates of the corner pixels of the image slices in which a corner pixel exists at the same coordinates as the corner pixel selected in step S11, based on the position of the corner pixel selected in step S11 and the position of the corner pixel located at one of the 8-connected pixels of that corner pixel in the image slice of the search target. Thus, the contour correction unit 1004 corrects the coordinates of the corner pixels of the contour to be corrected (the third contour) so that the contour approaches the contour of the original data of the three-dimensional object.
After image slices in which the corner pixel is extracted at the same coordinates have been created successively over a plurality of layers, the next image slice in such a portion includes the corner pixel at the coordinates of one of the 8-connected pixels of the original corner pixel. This is because a pixel of the contour of the 3D object changes by only one pixel in a portion in which the change in the contour of the 3D object between the lamination layers on the XY plane is smaller than the resolution of the image slice, that is, the size of the area of the actual 3D object represented by one pixel of the image slice.
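The following minimal sketch illustrates this behavior with hypothetical numbers (the pixel size and the per-layer movement of the surface are assumptions chosen only for the example):

```python
# Minimal sketch with hypothetical numbers: where the actual surface moves by
# less than one pixel per layer, the rasterized contour keeps the same pixel
# coordinate for several successive layers and then jumps by exactly one pixel.
pixel_size = 0.1        # actual size represented by one pixel (mm), assumption
shift_per_layer = 0.03  # movement of the actual surface per layer (mm), assumption

for layer in range(8):
    surface_x = layer * shift_per_layer
    contour_pixel = int(surface_x / pixel_size)  # pixel column holding the contour
    print(layer, contour_pixel)
# Layers 0-3 share pixel 0, layers 4-6 share pixel 1, and layer 7 moves to
# pixel 2, so the contour changes by only one pixel between runs of layers.
```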
Here, “L” is the distance between the center of CSL3 and the center of CSL0 on the XY plane. The difference in the number of layers between the image slices including CSL3 and CSL0 is three. Then, the contour correction unit 1004 corrects (shifts) the position of the corner pixel of the image slice SL2, which is positioned at the same coordinates as CSL3, to a position shifted from the original coordinates by L/3 in the direction of CSL0 (in the lower right direction). Further, the contour correction unit 1004 corrects (shifts) the position of the corner pixel of the image slice SL1, which is positioned at the same coordinates as CSL3, to a position shifted from the original coordinates by 2×L/3 in the direction of CSL0 (in the lower right direction).
Generally, “L” is the distance on the XY plane between a first corner pixel and a second corner pixel positioned at one of the 8-connected pixels of the first corner pixel, where the image slice of the second corner pixel is disposed N layers below the image slice of the first corner pixel. The amount of correction of a corner pixel at the same coordinates as the first corner pixel is then calculated as n×L/N, where “n” is the order (number) of the layer of the image slice whose contour is to be corrected, counted from the image slice of the first corner pixel, and “n” is an integer taking values from 1 to N−1. The direction of correction is the direction toward the second corner pixel, which is positioned at one of the 8-connected pixels of the first corner pixel in the image slice N layers below the image slice of the first corner pixel.
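A minimal sketch of this correction, assuming corner pixels are handled as (x, y) coordinates (the function name and coordinate handling are assumptions):

```python
# Minimal sketch (hypothetical function name) of the correction described
# above: a corner pixel that repeats at the same (x, y) coordinates over N
# layers is shifted toward the corner pixel found at one of the 8-connected
# pixels in the image slice N layers below.
def corrected_corner(first_corner, neighbor_corner, n, N):
    """first_corner: (x, y) of the first corner pixel (start of the run).
    neighbor_corner: (x, y) of the corner pixel N layers below.
    n: order of the layer to be corrected, counted from the first corner (1..N-1).
    Shifting by the fraction n / N along the vector between the two corners
    corresponds to a displacement of n * L / N, where L is their distance."""
    if not 1 <= n <= N - 1:
        return first_corner
    dx = neighbor_corner[0] - first_corner[0]
    dy = neighbor_corner[1] - first_corner[1]
    return (first_corner[0] + dx * n / N, first_corner[1] + dy * n / N)

# Example corresponding to the description above (N = 3, CSL0 diagonally
# adjacent to CSL3): the corner of SL2 shifts by L/3 and that of SL1 by 2*L/3.
print(corrected_corner((5, 5), (6, 6), 1, 3))  # SL2 -> (5.333..., 5.333...)
print(corrected_corner((5, 5), (6, 6), 2, 3))  # SL1 -> (5.666..., 5.666...)
```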
As described above, according to the present embodiment, the information processing apparatus 10 extracts the contour of each image slice from the 3D data and, when image slices each including a corner pixel at the same coordinates exist in succession, corrects the contour so that it approaches the contour of the actual 3D object.
Thus, the present embodiment can smoothly fabricate a 3D object including a portion in which the surface of the 3D object changes continuously.
In the embodiment of the present disclosure, the 3D printer 20 is an example of a three-dimensional fabricating apparatus. The corner pixels are examples of pixels in which a direction of the contour changes. The slice contour extraction unit 1003 is an example of a contour extraction unit.
Although the information processing apparatus and the information processing method according to some embodiments have been described above, embodiments of the present disclosure are not limited to the above-described embodiments, and various modifications and improvements are possible within the scope of the present disclosure.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA), and conventional circuit components arranged to perform the recited functions.