The present application claims priority under 35 U.S.C. §119 to Japanese Patent Application No. 2015-181201, filed Sep. 14, 2015, the contents of which are incorporated herein by reference in their entirety.
1. Field of the Invention
The present invention relates to a three-dimensional shaping apparatus, a three-dimensional shaping method, and a computer program product.
2. Description of the Related Art
In recent years, a technology called three-dimensional shaping has come into use in the field of rapid prototyping and the like. Three-dimensional objects obtained by three-dimensional shaping are used in many cases as prototypes for evaluating the appearance and performance of a final product in the product development stage, or as exhibits and the like.
As one three-dimensional shaping technique, a laminating method is known in which shapes obtained by slicing a target three-dimensional object are formed and laminated to build up the three-dimensional object. One three-dimensional shaping apparatus using the laminating method is a powder laminating shaping printer that feeds a molding material such as powder to a position corresponding to a part to be molded and then supplies a liquid for binding the molding material, thereby forming a layer.
In the powder laminating shaping printer, the three-dimensional object being shaped is formed in a state of poor visibility, because the three-dimensional object is buried in uncured powder material.
According to one aspect of the present invention, a three-dimensional shaping apparatus is configured to laminate layers of a molding material based on input information to shape a three-dimensional object. The three-dimensional shaping apparatus includes a powder material feeder, a layer information acquiring unit, a binding agent discharging unit, and an image projecting unit. The powder material feeder is configured to feed a powder material flat so as to be vertically deposited. The layer information acquiring unit is configured to acquire layer information generated in such a manner that information indicating a shape of the three-dimensional object is divided so as to correspond to the layers of the molding material. The binding agent discharging unit is configured to discharge a binding agent for binding the powder material selectively to the flatly fed powder material at a position determined based on the layer information, to bind the powder material and form the layers of the molding material. The image projecting unit is configured to project an image onto a flat surface of the powder material based on projection information generated according to the layer information.
The accompanying drawings are intended to depict exemplary embodiments of the present invention and should not be interpreted to limit the scope thereof. Identical or similar reference numerals designate identical or similar components throughout the various drawings.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In describing preferred embodiments illustrated in the drawings, specific terminology may be employed for the sake of clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected, and it is to be understood that each specific element includes all technical equivalents that have the same function, operate in a similar manner, and achieve a similar result.
An embodiment has an object to enable localization of a three-dimensional object in a shaping apparatus that laminates layers in which a powder material is selectively bound to form the three-dimensional object.
Exemplary embodiments of the present invention will be explained below with reference to the accompanying drawings. The present embodiments describe, as an example, a system including a 3D printer that receives 3D data indicating the shape of a three-dimensional object, such as computer-aided design (CAD) data, and deposits layers of a molding material based on the data to form the three-dimensional object, and a personal computer (PC) that transmits the 3D data to the 3D printer.
As illustrated in
The CPU 10 is a computing unit, and controls the overall operation of the PC 1. The RAM 20 is a volatile storage medium capable of high-speed reading and writing of information, and is used as a work area when the CPU 10 processes information. The ROM 30 is a read-only nonvolatile storage medium, and stores programs such as firmware. The HDD 40 is a nonvolatile storage medium capable of reading and writing information, and stores an operating system (OS), various types of control programs, application programs, and the like.
The I/F 50 connects the bus 80 to various types of hardware, networks, and the like, and controls them. The LCD 60 is a visual user interface through which a user checks the status of the PC 1. The operation part 70 is a user interface, such as a keyboard and a mouse, with which the user inputs information to the PC 1.
In the hardware configuration, the CPU 10 performs computation according to the program stored in the ROM 30 and the program loaded into the RAM 20 from the storage medium such as the HDD 40 or an optical disk (not illustrated), to thereby configure a software control unit. A functional block for implementing the functions of the PC 1 according to the present embodiments is implemented by a combination of the software control unit configured in this manner and the hardware.
The configuration of the 3D printer 2 according to the present embodiments will be explained next with reference to
As explained above, the 3D printer 2 discharges the binder liquid P from the IJ head 201 according to a slice image generated by horizontally dividing the three-dimensional shaped object, whose shape is expressed by the input 3D data, into round slices. The discharged binder liquid P binds the powder material fed to the shaping stage 211, molding for one layer is thereby performed, and such layers are laminated to carry out three-dimensional shaping. Moreover, the 3D printer 2 according to the present embodiments includes the projector 203, and projects a slice image onto the shaping stage 211. A molding operation for one layer according to the present embodiments will be explained below with reference to
As illustrated in
When the powder material is fed to the shaping stage 211 as illustrated in
When the molding for one layer is complete as illustrated in
The 3D printer 2 also includes an information processing function equivalent to the configuration explained in
The configuration for the control of the 3D printer 2 according to the present embodiments will be explained next with reference to
The controller 220 includes a main control unit 221, a network control unit 222, a powder feeder driver 223, an IJ head driver 224, and a projector driver 225. The main control unit 221 is a control unit that controls the whole of the controller 220, and is implemented by the CPU 10 performing operations according to the OS and the application programs. The network control unit 222 is an interface through which the 3D printer 2 exchanges information with other devices such as the PC 1, for which Ethernet (registered trademark) or a Universal Serial Bus (USB) interface is used. The network control unit 222 and the main control unit 221 thus function as a layer information acquiring unit that acquires slice data from the PC 1.
The powder feeder driver 223 and the IJ head driver 224 are pieces of driver software for controlling the drive of the powder feeder 210 and the IJ head 201, respectively, and control the drive of the powder feeder 210 and the IJ head 201 according to the control of the main control unit 221. The projector driver 225 is driver software for causing the projector 203 to project the image data transmitted from the PC 1 to the 3D printer 2. The operations explained in
The functional configuration of the PC 1 according to the present embodiments will be explained next with reference to
The controller 100 is implemented by a combination of the software and the hardware, and functions as a control unit for controlling the entire PC 1. As illustrated in
The 3D data app 110 is a software application such as CAD software for processing data used to express a three-dimensional shape of a shaped object.
The 3D data conversion processor 120 is a 3D information processor for acquiring the input 3D data and performing conversion processing. That is, the program for implementing the 3D data conversion processor 120 is used as a 3D information processing program. The 3D data may be input to the 3D data conversion processor 120 in several ways; for example, the 3D data conversion processor 120 may acquire data input to the PC 1 through the network, or may acquire data at a file path specified by a user operation on the operation part 70.
Based on the 3D data acquired in this manner, the 3D data conversion processor 120 generates layer information (hereinafter, “slice data”) for each layer obtained by slicing the three-dimensional object expressed by the 3D data. As processing according to the gist of the present embodiments, the 3D data conversion processor 120 also generates, based on the slice data, projection data, which is information to be projected onto the shaping stage 211. This processing will be explained in detail later.
The 3D printer driver 130 is a software module for operating the 3D printer 2 through the PC 1, and generates a job for operating the 3D printer 2 based on the slice data and the projection data generated by the 3D data conversion processor 120 and transmits the job to the 3D printer 2. Therefore, the slice data corresponds to shaping information for shaping a divided three-dimensional object.
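Although the embodiments do not define a particular job format, a minimal sketch of the kind of payload the 3D printer driver 130 might assemble is shown below, in Python for illustration; the class name, field names, and types are assumptions and not part of the described implementation.

from dataclasses import dataclass
from typing import List

@dataclass
class ShapingJob:
    # One entry per layer: the shaping information (where the binder
    # liquid P is to be discharged) and the image to project onto the
    # shaping stage 211 for that layer.
    slice_data: List[bytes]
    projection_data: List[bytes]
    layer_thickness_mm: float  # thickness of one powder feed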
The functions included in the 3D data conversion processor 120 according to the present embodiments will be explained next with reference to
The 3D data acquiring unit 121 acquires the 3D data input to the 3D data conversion processor 120. As explained above, the 3D data is target object three-dimensional shape information indicating a three-dimensional shape of a target object to be shaped. The slice processor 122 generates slice data based on the 3D data acquired by the 3D data acquiring unit 121. At this time, each piece of slice data is generated by dividing the 3D data at a thickness corresponding to one feed of the powder material.
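As a concrete illustration of this division, the following minimal Python sketch computes the slice planes for a model from its height and the feed thickness; the function name and the choice to sample each slice at the middle of its layer are hypothetical assumptions, not taken from the embodiments.

def generate_slice_heights(model_z_min, model_z_max, layer_thickness):
    # Return the Z height of every slice plane, one per powder feed.
    heights = []
    z = model_z_min + layer_thickness / 2.0  # sample at mid-layer (assumption)
    while z < model_z_max:
        heights.append(z)
        z += layer_thickness
    return heights

# Example: a 10 mm tall model divided at a 0.1 mm feed thickness
# yields 100 slice planes, that is, 100 pieces of slice data.
print(len(generate_slice_heights(0.0, 10.0, 0.1)))  # -> 100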
The projection distance calculating unit 123 calculates, as illustrated in
The projection information generating unit 124 generates projection data based on the slice data generated by the slice processor 122 and the projection distance calculated by the projection distance calculating unit 123. How to generate the projection data will be explained herein with reference to
In the present embodiments, position information (x1, y1, z1) of the projector 203 is stored in the PC 1 in advance. The projection distance calculating unit 123 refers to the position information (x1, y1, z1), the change in position of the shaping stage 211 in the Z direction in association with the feed of the powder material, and the thickness of the laminated powder material, to calculate a height h from the shaping surface on the shaping stage 211 to the optical lens of the projector 203 as illustrated in
Moreover, the projection distance calculating unit 123 refers to the position information (x1, y1, z1) to calculate a distance a between a center O and a point (x1, y1) on the shaping stage 211 as illustrated in
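A minimal Python sketch of how these quantities might be computed is given below; since the embodiments name only the inputs (the stored position information, the change of the stage position in the Z direction, and the lamination thickness) and the outputs (the height h and the distance a), the bookkeeping of the shaping surface height and the final combination of h and a into a single slant projection distance are assumptions for illustration.

import math

def projection_geometry(projector_pos, stage_z, layer_count, layer_thickness):
    # projector_pos is the stored (x1, y1, z1); stage_z is the current
    # Z position of the shaping stage 211, which changes as powder is fed.
    x1, y1, z1 = projector_pos

    # Height h from the current shaping surface to the optical lens of
    # the projector 203, taking the laminated powder into account.
    shaping_surface_z = stage_z + layer_count * layer_thickness
    h = z1 - shaping_surface_z

    # Horizontal distance a between the stage center O (taken as the
    # origin) and the point (x1, y1) directly below the projector.
    a = math.hypot(x1, y1)

    # Assumed combination into a slant projection distance; the exact
    # formula is not stated in the embodiments.
    projection_distance = math.hypot(h, a)
    return h, a, projection_distance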
How to generate the projection data will be explained next with reference to
If the diagonal of the projection area is of length D, a resolution of an image to be projected onto the shaping stage 211 (hereinafter, “stage resolution S”) can be obtained from the length D and the projection resolution of the projector 203. Using the property of an isosceles right triangle, the stage resolution S is calculated as S = (projection resolution of the projector 203)/(D/√2). The projection resolution of the projector 203 in this case corresponds to the resolution of its display device. Moreover, if the resolution of the slice data, which is the information of pixels to which the binder liquid P is discharged at the time of shaping, is a “slice resolution R”, a ratio N between the slice resolution R and the stage resolution S can be obtained as N = S/R. When the slice data is geometrically transformed so as to be enlarged N times in the vertical and horizontal directions using the ratio N obtained in this manner, and the resulting image is projected onto the shaping stage 211, an image of a size corresponding to one layer of the three-dimensional object shaped from the slice data can be projected onto the shaping stage. Therefore, the projection information generating unit 124 geometrically transforms the slice data so as to enlarge it N times in the vertical and horizontal directions, thereby generating the projection data.
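The calculation above can be summarized in a short Python sketch; the assumptions that the projection area is square and that the projection resolution refers to the pixel count along one side of the display device are made only for this illustration, and the resize call at the end is an example of a geometric transformation rather than the actual implementation.

import math

def enlargement_ratio(diagonal_d, projector_resolution, slice_resolution_r):
    # Side length of the (assumed square) projection area whose diagonal is D.
    side_length = diagonal_d / math.sqrt(2)
    # Stage resolution S = (projection resolution of projector 203) / (D / sqrt(2)).
    stage_resolution_s = projector_resolution / side_length
    # Ratio N = S / R between the stage resolution and the slice resolution.
    return stage_resolution_s / slice_resolution_r

# The projection information generating unit 124 would then enlarge the
# slice image N times in the vertical and horizontal directions, e.g.:
#   n = enlargement_ratio(d, projector_resolution, r)
#   projection_image = slice_image.resize((int(w * n), int(h * n)))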
The conversion data output unit 125 outputs the slice data generated by the slice processor 122 and the projection data generated by the projection information generating unit 124 to the 3D printer driver 130. Thereby, the 3D printer driver 130 generates a job for operating the 3D printer 2 based on the slice data and the projection data and transmits the job to the 3D printer 2.
As illustrated in
The operation of the 3D printer 2 having received the job will be explained next with reference to
After the IJ head 201 is moved, the main control unit 221 refers to the slice data and the projection data. The main control unit 221 transmits the referred projection data to the projector driver 225 so that the projection data is projected onto the powder material fed to the shaping stage 211. Moreover, when the slice data indicates that the position of the IJ head 201 is part of the three-dimensional object to be shaped, the main control unit 221 performs control to discharge the binder liquid P (S1004). When the position of the IJ head 201 is not part of the three-dimensional object to be shaped, the main control unit 221 performs control not to discharge the binder liquid P. The main control unit 221 repeats the processing at S1004 until the processing for one layer is complete.
When the processing for one layer is complete, the main control unit 221 repeats the processing from the feed of the powder material for a new layer until the processing for all the layers is complete (No at S1005), and ends the processing when the processing for all the layers is complete (Yes at S1005). With the processing, the operation of the 3D printer 2 having received the job is complete.
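The per-layer flow described above (the feed of the powder material, the projection, and the selective discharge at S1004, repeated for every layer until S1005 is satisfied) can be sketched as follows in Python; the callables passed in stand in for the powder feeder 210, the projector 203, and the IJ head 201, and are placeholders rather than an actual firmware interface.

def binder_positions_for_layer(slice_mask):
    # S1004 in miniature: collect every (row, col) position where the
    # slice data marks part of the three-dimensional object, i.e. where
    # the binder liquid P should be discharged.
    positions = []
    for r, row in enumerate(slice_mask):
        for c, solid in enumerate(row):
            if solid:
                positions.append((r, c))
    return positions

def shape_object(layers, feed_powder, project_image, discharge):
    # layers: list of (slice_mask, projection_image) pairs, one per layer.
    for slice_mask, projection_image in layers:
        feed_powder()                     # deposit one flat powder layer
        project_image(projection_image)   # project the layer onto the powder
        for position in binder_positions_for_layer(slice_mask):
            discharge(position)           # bind the powder only inside the object
    # The loop ends when all layers have been processed (Yes at S1005).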
As explained above, the 3D printer 2 according to the present embodiments projects the projection data onto the powder material in the area where the shaping is performed, so that the position of the three-dimensional object on the shaping stage 211 can be confirmed. Thus, it is possible to visually confirm the position of the three-dimensional object in the laminated powder material and to reduce the damage that may occur when the shaped three-dimensional object is taken out.
Depending on the size of the three-dimensional objects, a plurality of three-dimensional objects may be shaped concurrently. In this case, the 3D printer 2 projects slice data of the three-dimensional objects, generated by a function implemented in the slice processor 122, onto the shaping stage 211.
Various functions included in the slice processor 122 will be explained herein with reference to
When a plurality of three-dimensional objects are to be shaped concurrently, the data synthesizing unit 126 synthesizes the slice data generated from the pieces of 3D data input to the 3D data conversion processor 120. The data selecting unit 127 receives information of an operation performed by the user on the PC 1 and selects the projection data corresponding to the information of the operation. The data storage unit 128 stores the projected projection data in the RAM 20, the HDD 40, and the like. The progress rate calculating unit 129 compares the slice data with the 3D data, and adds information of the progress rate in the shaping process to each piece of slice data. The details of the processing executable by the functions included in the slice processor 122 will be explained below.
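As one way to picture the synthesis performed by the data synthesizing unit 126, the following Python sketch combines the slice masks of several objects, already placed at their arranged positions on a common stage grid, into a single mask for one layer; the element-wise OR is an assumption, since the embodiments state only that the slice data are synthesized.

def synthesize_slice_masks(masks):
    # masks: list of 2D boolean grids of identical size, one per object.
    if not masks:
        return []
    rows, cols = len(masks[0]), len(masks[0][0])
    combined = [[False] * cols for _ in range(rows)]
    for mask in masks:
        for r in range(rows):
            for c in range(cols):
                combined[r][c] = combined[r][c] or mask[r][c]
    return combined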
The 3D data conversion processor 120 generates projection data based on the slice data synthesized by the data synthesizing unit 126, and transmits the projection data to the 3D printer driver 130 (S1404). The 3D printer driver 130 generates a job for operating the 3D printer 2 based on the synthesized slice data and the projection data and transmits the job to the 3D printer 2. With the processing of the 3D data in the PC 1, it is possible to simultaneously project a plurality of projection data onto the powder material on the shaping stage 211 as illustrated in
An operation performed by the user on the PC 1 is received by the 3D data app 110, so that the arrangement of the 3D data of the three-dimensional objects is determined. Therefore, when a cylinder and a triangular pyramid are concurrently shaped as illustrated in
In the present embodiments, slice data arbitrarily specified by the user can be projected on the shaping stage 211. For example, when the shaping is carried out while changing color of the powder material, because the projection data for a shaping layer arbitrarily specified by the user is projected on the shaping stage 211, it is possible to confirm the details of the position of the shaping layer specified by the user in the shaping process.
Subsequently, the slice processor 122 determines whether the shaping processing based on the slice data has been completely performed, that is, whether shaping of the three-dimensional object is complete (S1803). When the shaping of the three-dimensional object is not complete (No at S1803), the slice processor 122 performs slice processing on any 3D data not yet shaped, and performs the processing again from S1801. When the shaping of the three-dimensional object is complete (Yes at S1803), the slice processor 122 refers to the data storage unit 128 and performs the processing of projecting the largest slice data (S1804). In this processing, the largest slice data is projected onto the powder material on the shaping stage 211. Therefore, the projection distance calculating unit 123 calculates a projection distance at the time of shaping completion. The calculated projection distance is transmitted to the projection information generating unit 124, and is used at the time of geometric transformation from the slice data to the projection data. The projection data generated through the geometric transformation is projected from the projector 203 onto the shaping stage 211 by the 3D printer driver 130.
By projecting the largest slice data onto the shaping stage 211 in this manner, it is possible to visually recognize the size of the three-dimensional object even if the three-dimensional object is buried in the powder material. Therefore, it is possible to reduce any damage that may occur when the three-dimensional object is taken out after the completion of the shaping. In the present embodiments, sizes of pixel areas representing the positions of shaped objects are compared with each other to determine the sizes of the projection data.
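The comparison of pixel areas can be illustrated with a short Python sketch that counts, for each layer, the pixels marked as part of the shaped object and keeps the layer with the largest count; this is a hedged illustration of the comparison described above, not the actual implementation.

def largest_slice(slice_masks):
    # slice_masks: list of 2D boolean grids, one per layer.
    def object_area(mask):
        return sum(1 for row in mask for solid in row if solid)
    return max(slice_masks, key=object_area)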
The slice processor 122 sequentially allocates a number to the slice data generated at the time of the slice processing performed on the input 3D data (S2001). The allocation of the number at this time is used as information for forming a layer of the molding material in the shaping process.
The progress rate calculating unit 129 calculates a progress rate for each piece of slice data based on the number allocated to the slice data and the maximum value of the number, and adds the calculation result to the slice data (S2002). The slice data to which the progress rate has been added in this manner is transmitted to the projection distance calculating unit 123 (S2003), and is used as the slice data and the projection data in the processing at S1004. Information indicating the progress rate based on the number allocated to the slice data may also be added.
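A minimal Python sketch of this progress rate calculation is shown below; the percentage form and the dictionary representation are assumptions for illustration.

def add_progress_rates(slice_count):
    # Each slice carries its sequential number; the progress rate is that
    # number relative to the maximum value, i.e. the total layer count.
    return [
        {"layer_number": n, "progress_rate": 100.0 * n / slice_count}
        for n in range(1, slice_count + 1)
    ]

# Example: with 200 layers, layer 50 is tagged with a progress rate of 25.0%.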
After the progress rate is added, the slice data may take a form in which the progress rate is displayed as character information in the projection data, or the progress rate may be displayed on the 3D printer 2 based on the slice data.
As explained above, in the processing performed when the slice processing of the 3D data is executed, the slice processor 122 according to the present embodiments projects the detailed position of the shaped object or generates projection data that reflects the progress rate. When shaping based on a plurality of pieces of 3D data is performed concurrently, respective slice data are generated and synthesized, and the synthesized data is projected onto the shaping stage 211. These processes implemented by the functions included in the slice processor 122 may each be performed independently, or a combination of some of them may be executed. By causing the slice processor 122 to execute these processes, it is possible to perform localization of a three-dimensional object on the shaping stage 211 not only at the time of shaping each shaping layer of the three-dimensional object, but also before the shaping or after the completion of the shaping.
When a three-dimensional object having a complicated structure is to be shaped, it is desirable to perform the shaping after checking the positions on the shaping stage 211 where the shaping is to be performed. In this case, after the projection data is projected onto the shaping stage 211, it is possible to receive a user input to the PC 1 and to determine whether to execute the shaping. The operation of determining whether the shaping is to be executed after the projection will be explained below with reference to
When the powder material is fed to the shaping stage 211, the main control unit 221 refers to the projection data and the slice data and transmits the referred projection data to the projector driver 225. The projector 203 projects the projection data onto the powder material fed to the shaping stage 211 (S2101). When the projection is performed by the projector 203, the main control unit 221 transmits, to the PC 1 through the network control unit 222, a request to determine whether the shaping based on the slice data is to be performed. The user operates the PC 1 to input information as to whether to perform the shaping of the area corresponding to the slice data projected on the shaping stage 211.
When the operation performed by the user on the PC 1 is accepted and a signal indicating that execution of the shaping is permitted is received (Yes at S2102), the 3D printer driver 130 transmits, to the 3D printer 2, a job for causing the 3D printer 2 to execute the shaping based on the slice data corresponding to the projection data. The 3D printer 2 performs the shaping of the area corresponding to the projection data based on the job (S2103). The 3D printer 2 repeatedly executes the processing at S1001 to S2103 until all the slice data corresponding to the 3D data have been shaped (No at S2104).
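The confirm-before-shaping flow of S2101 to S2104, including the stop when the user does not permit the shaping, which is described in the following paragraph, can be sketched as follows in Python; the callback-based structure and the function names are assumptions made for illustration.

def shape_with_confirmation(layers, project_image, shape_layer, ask_user):
    # layers: list of (slice_data, projection_image) pairs, one per layer.
    for slice_data, projection_image in layers:
        project_image(projection_image)   # S2101: project the layer onto the powder
        if not ask_user():                # No at S2102: shaping is not permitted
            return False                  # stop and terminate all processing
        shape_layer(slice_data)           # S2103: shape the projected area
    return True                           # all slice data shaped (end of S2104 loop)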
When the operation performed by the user on the PC 1 is accepted and a signal indicating that execution of the shaping is not permitted is received (No at S2102), the 3D printer driver 130 stops the shaping and transmits a job for terminating all the processing to the 3D printer 2. The processing for determining whether execution of the shaping is permitted after the projection processing as illustrated in
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, at least one element of different illustrative and exemplary embodiments herein may be combined with each other or substituted for each other within the scope of this disclosure and appended claims. Further, features of components of the embodiments, such as the number, the position, and the shape, are not limited to the embodiments and thus may be set as appropriate. It is therefore to be understood that within the scope of the appended claims, the disclosure of the present invention may be practiced otherwise than as specifically described herein.
The method steps, processes, or operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance or clearly identified through the context. It is also to be understood that additional or alternative steps may be employed.
Further, any of the above-described apparatus, devices or units can be implemented as a hardware apparatus, such as a special-purpose circuit or device, or as a hardware/software combination, such as a processor executing a software program.
Further, as described above, any one of the above-described and other methods of the present invention may be embodied in the form of a computer program stored in any kind of storage medium. Examples of storage media include, but are not limited to, flexible disks, hard disks, optical discs, magneto-optical discs, magnetic tapes, nonvolatile memory, semiconductor memory, read-only memory (ROM), etc.
Alternatively, any one of the above-described and other methods of the present invention may be implemented by an application specific integrated circuit (ASIC), a digital signal processor (DSP) or a field programmable gate array (FPGA), prepared by interconnecting an appropriate network of conventional component circuits or by a combination thereof with one or more conventional general purpose microprocessors or signal processors programmed accordingly.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA) and conventional circuit components arranged to perform the recited functions.