1. Field of the Invention
The present invention relates to a three-dimensional shaping apparatus.
2. Description of the Related Art
Three-dimensional shaping apparatuses which form three-dimensional shaped objects by laminating many layers are attracting attention. This type of lamination shaping technique is referred to as “additive manufacturing (AM)”, “three-dimensional printing”, “rapid prototyping” and the like. For the lamination shaping technique, various shaping methods have been proposed. In Japanese Patent Application Laid-Open No. H10-224581 and Japanese Patent Application Laid-Open No. 2003-053846, for example, a shaping method applying an electrophotographic process is disclosed, and in US Patent Application Publication No. 2009/0060386 (Description), a laser sintering method is disclosed.
In a three-dimensional shaping apparatus, the shape accuracy of a cross-sectional image of each layer (image forming accuracy) and the positional accuracy when each layer is laminated (lamination accuracy) have a great influence on the quality of the final shaped object. These issues are especially critical in the case of a lamination method of independently forming an image of each layer and sequentially laminating these images, such as the apparatuses disclosed in Japanese Patent Application Laid-Open No. H10-224581 and Japanese Patent Application Laid-Open No. 2003-053846. In these apparatuses, however, distortion of the images and dispersion of image positions are not considered; therefore the image forming accuracy and the lamination accuracy cannot be guaranteed.
US Patent Application Publication No. 2009/0060386 (Description) discloses a positional calibration method for a laser sintering type apparatus, where the center reference of an image is determined by scanning a calibration plate before starting the shaping. This method, however, simply aligns the drawing position of the image to the center of the stage, and does not correct distortion of the image itself. Furthermore, this method cannot be applied to a lamination method of independently forming an image of each layer, and sequentially laminating these images, such as the lamination methods disclosed in Japanese Patent Application Laid-Open No. H10-224581 and Japanese Patent Application Laid-Open No. 2003-053846.
With the foregoing in view, it is an object of the present invention to provide a technique for improving the quality and accuracy of a shaped object in a three-dimensional shaping apparatus, which independently forms an image of each layer and sequentially laminates these images to acquire a three-dimensional shaped object.
The present invention in its first aspect provides a three-dimensional shaping apparatus, comprising: an image forming unit configured to form a material image formed from a shaping material based on input image data; a transfer member to which the material image formed by the image forming unit is transferred and which is configured to convey the material image; and a stage on which the material image conveyed by the transfer member is laminated, the three-dimensional shaping apparatus comprising: a marker generation unit configured to generate image data of a calibration marker; a control unit configured to input the generated image data of the calibration marker to the image forming unit; a first detection unit configured to detect a position of the calibration marker, which is formed by the image forming unit based on the image data of the calibration marker and laminated on the stage; and an image distortion measurement unit configured to measure image distortion of the calibration marker laminated on the stage, based on the detection result from the first detection unit.
The present invention in its second aspect provides a three-dimensional shaping apparatus, comprising: an image forming unit configured to form a material image formed from a shaping material based on input image data; a transfer member to which the material image formed by the image forming unit is transferred and which is configured to convey the material image; and a stage on which the material image conveyed by the transfer member is laminated, the three-dimensional shaping apparatus comprising: a control unit configured to generate slice image data including a slice image of a shaping target object and a registration marker, and to input the slice image data to the image forming unit; a second detection unit configured to detect the registration marker, which is formed by the image forming unit based on the slice image data and is included in the material image transferred to the transfer member; a position measurement unit configured to measure a positional shift of the material image on the transfer member based on the detection result from the second detection unit; and an adjustment unit configured to adjust a position of the stage based on the positional shift measured by the position measurement unit.
The present invention in its third aspect provides a lamination shaping method for shaping a three-dimensional object by forming a material image formed from a shaping material based on image data and laminating the material image on a stage, the lamination shaping method comprising the steps of: forming a calibration marker formed from the shaping material; laminating the calibration marker on the stage; detecting a position of the calibration marker laminated on the stage; and acquiring image distortion information on image distortion generated in the material image based on the position of the calibration marker.
The present invention in its fourth aspect provides a lamination shaping method for forming a three-dimensional object by forming a material image formed from a shaping material and laminating the material image on a stage, the lamination shaping method comprising the steps of: generating slice image data including a slice image of a shaping target object and a registration marker; forming the material image based on the slice image data; and detecting a position of the registration marker included in the material image and adjusting a relative position between the material image and the stage.
According to the present invention, the quality and accuracy of a shaped object can be improved in a three-dimensional shaping apparatus, which independently forms an image of each layer and sequentially laminates these images to acquire a three-dimensional shaped object.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Embodiments of the present invention will now be described with reference to the drawings. Dimensions, materials, shapes and relative positions of components, the procedures of various controls, control parameters, target values and the like, described in the following embodiments, are not intended to limit the scope of the present invention unless otherwise specified.
A configuration of a three-dimensional shaping apparatus according to Embodiment 1 of the present invention will be described with reference to
The three-dimensional shaping apparatus is an apparatus that forms a three-dimensional shaped object by laminating a shaping material according to the cross-sectional information of a shaping target object. This apparatus is also called an “additive manufacturing (AM) system”, a “3-D printing system” or a “rapid prototyping (RP) system”.
The three-dimensional shaping apparatus of this embodiment has an image forming unit 100, a shaping unit 200, and a control unit 60. The image forming unit 100 is a component to form one layer of an image formed from a shaping material, based on the slice image data of each layer. The image forming unit 100 includes an image generation controller 10, a laser scanner (exposure device) 20, a process cartridge 30 and a transfer roller 41. The shaping unit 200 is a component to form a three-dimensional shaped object having a three-dimensional structure by sequentially laminating and fixing a plurality of layers of images formed by the image forming unit 100. The shaping unit 200 includes a shaping controller 70, a transfer member 42, a counter member (heater roller) 43, a stage 52, a stage guide 53, a plurality of motors 111 to 114, and a plurality of sensors 44, 45, 54 and 55. The control unit 60 is a component to perform processing to generate a plurality of layers of slice image data (cross-sectional data) from the three-dimensional shape data of the shaping target object, and controls each component of the three-dimensional shaping apparatus and the like.
The control unit 60 has a function of generating the slice image data for lamination shaping from the three-dimensional shape data of the shaping target object, a function of outputting the slice image data of each layer to the image generation controller 10, and a function of managing the lamination shaping step among other functions. The control unit 60 can be constructed by installing programs that have these functions on a personal computer or on an embedded computer, for example. For the three-dimensional shape data, data can be created by a three-dimensional CAD, a three-dimensional modeler, a three-dimensional scanner or the like. The format of the three-dimensional shape data is not especially limited, but polygonal data, such as stereolithography (STL), for example, can be used. For the format of the slice image data, multi-value image data (each value represents a type of material) or multi-plane image data (each plane corresponds to a type of material) can be used, for example.
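The two slice-data formats mentioned above can be sketched as follows. This is a hypothetical encoding (not taken from the specification): a multi-value image stores one material ID per pixel, while the equivalent multi-plane form stores one binary plane per material.

```python
# Illustrative sketch only: a multi-value slice image uses one integer per
# pixel (0 = empty, other values = material types); the multi-plane form
# holds one binary image per material.

def to_planes(multi_value, material_ids):
    """Split a multi-value slice image into one binary plane per material."""
    return {m: [[1 if px == m else 0 for px in row] for row in multi_value]
            for m in material_ids}

# A 2x2 slice mixing materials 1 and 2.
planes = to_planes([[1, 0], [2, 1]], material_ids=(1, 2))
```

Either representation carries the same per-pixel material assignment; the multi-plane form simply makes each material's image directly usable by a per-material developing station.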
The image generation controller 10 has a function of controlling the image forming process in the image forming unit 100, based on the slice image data input from the control unit 60, and control signals input from the shaping controller 70 and the like. In concrete terms, the image generation controller 10 performs resolution conversion and decoding processing of the slice image data, and controls the image writing start position and the timing by the laser scanner 20. The image generation controller 10 may have a function similar to a printer controller, which is embedded in a standard laser printer (2-D printer).
The image forming unit 100 is a unit that generates one layer of an image formed from a shaping material, using an electrophotographic process. The electrophotographic process is a method of forming a desired image through a series of processes of charging a photoreceptor, forming a latent image by exposure, and forming an image by adhering developer particles to a latent image. The three-dimensional shaping apparatus uses particles formed from a shaping material instead of toner, as the developer, but the basic principle of the electrophotographic process is essentially the same as that of a 2-D printer.
The photosensitive drum 34 is an image bearing member having a photoreceptor layer, such as an organic photoreceptor layer or an amorphous silicon photoreceptor layer. A primary charging roller 33 is a charging device to uniformly charge the photoreceptor layer of the photosensitive drum 34. The laser scanner 20 is an exposure device that scans the photosensitive drum 34 with laser light according to the image signals provided from the image generation controller 10, and draws a latent image. A shaping material supply unit 31 is a device that stores and supplies the shaping material that is used as the developer. A development roller 32 is a developing assembly that supplies the shaping material to the electrostatic latent image on the photosensitive drum 34. A transfer roller 41 is a transfer device that transfers the image of the shaping material formed on the photosensitive drum 34 onto a transfer member (transfer belt) 42. A cleaning device (not illustrated) for cleaning the surface of the photosensitive drum 34 may be disposed downstream of a transfer nip between the photosensitive drum 34 and the transfer roller 41. In this embodiment, the photosensitive drum 34, the primary charging roller 33, the shaping material supply unit 31 and the development roller 32 are integrated as a process cartridge 30, so that replacement is easier.
For the shaping material, various materials can be selected according to the intended use, function and purpose of the shaped object to be created. In this description, a material constituting the shaped object (structure) is called a “structure material”, and a material constituting a support member (e.g. a post that supports an overhanging portion from the bottom) for supporting a shaped object during the lamination process, is called a “support material”. If it is unnecessary to make this distinction, then the simple term “shaping material” is used. For the structure material, a thermoplastic resin, such as polyethylene (PE), polypropylene (PP), ABS and polystyrene (PS) can be used. For the support material, a material having thermoplasticity and water solubility is preferable, since it is easy to remove from a structure. For the support material, glucide, polylactic acid (PLA), polyvinyl alcohol (PVA) or polyethylene glycol (PEG), for example, can be used.
The shaping controller 70 has a function of performing mechatronic control of the three-dimensional shaping apparatus. A driving system includes a transfer roller motor 111 that rotates the transfer roller 41, and a stage driving X motor 112, a stage driving Y motor 113 and a stage driving Z motor 114 which move the stage 52 in three axis directions. A sensing system includes a material end detection sensor 44 which is used for online registration, a material end detection sensor 45 which is used for offline calibration, and a material left end sensor 54 and a material right end sensor 55. The roles of these sensors, online registration and offline calibration will be described in detail later.
The transfer member 42 is a conveyance member that bears an image of the shaping material formed by the image forming unit 100, and conveys (transports or carries) the image to the stage 52 (lamination nip). The transfer member 42 is constituted by an endless belt made of a resin, such as polyimide. The counter member 43 is a heating lamination device which includes a heater that melts the shaping material image on the transfer member 42 and laminates the image on the shaped object on the stage 52. Here a roller (heater roller 43) for conveying the transfer member 42 is used as the counter member 43, but the present invention is not limited to this configuration. It is sufficient if the counter member 43 has a function to press the melted shaping material image against the shaped object on the stage 52, and the heating unit that melts the shaping material image may be disposed separately from the counter member 43. The stage 52 is a member that holds the shaped object during lamination, and can be moved in the three axis directions (X, Y and Z) by the stage guide 53.
The basic operation of the three-dimensional shaping apparatus to form a shaped object will now be described.
The control unit 60 generates the slice image data of each layer. For example, the control unit 60 generates a slice image of each layer by slicing the shaping target object at a predetermined pitch (e.g. several μm to a little over ten μm in thickness) based on the three-dimensional shape data of the shaping target object. Then the control unit 60 attaches a registration marker (described in detail later) to the slice image of each layer, whereby the slice image data of each layer is generated. The slice image of each layer need not always be generated by the control unit 60; the slice image data may also be generated by acquiring a slice image generated outside the control unit 60 and attaching the registration marker to it. The slice image data is sequentially input to the image generation controller 10, starting with the slice image data of the lowest layer. The image generation controller 10 controls the laser emission and scanning of the laser scanner 20 according to the input slice image data.
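The slicing and marker-attachment steps above can be sketched as follows. This is a minimal illustration with assumed names and numbers (none of the function names come from the specification): the layer count follows from the object height and the slicing pitch, and a registration marker is stamped into the margin of each slice image.

```python
# Hypothetical sketch of the slice-data pipeline. Each slice image is a 2-D
# grid of material IDs (0 = empty, 1 = shaping material).

def layer_count(object_height_um, pitch_um):
    """Number of slices needed to cover the object height at the given pitch."""
    # Round up so the topmost partial layer is still formed.
    return -(-object_height_um // pitch_um)

def attach_registration_marker(slice_image, marker_rows, marker_cols):
    """Stamp a small marker patch into the margin of one slice image (in place)."""
    for r in marker_rows:
        for c in marker_cols:
            slice_image[r][c] = 1
    return slice_image

layers = layer_count(object_height_um=5000, pitch_um=12)  # a 5 mm object at 12 um pitch
image = [[0] * 8 for _ in range(8)]
attach_registration_marker(image, marker_rows=(0, 1), marker_cols=(0, 1))
```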
In the image forming unit 100, the surface of the photosensitive drum 34 is uniformly charged by the primary charging roller 33. When the surface of the photosensitive drum 34 is exposed to laser light from the laser scanner 20, the exposed portion is destaticized. The shaping material charged with developing bias is supplied to the destaticized portion by the development roller 32, and one layer of an image formed from the shaping material (hereafter called “material image”) is formed on the surface of the photosensitive drum 34. This material image is transferred onto the transfer member 42 by the transfer roller 41.
The transfer member 42 rotates while bearing the material image, and conveys the material image to the lamination position. On the other hand, the shaping controller 70 controls the stage 52 so that the stage 52 (or a semi-shaped product on the stage 52) enters the lamination position at the same timing and the same speed as the material image. Then the stage 52 and the transfer member 42 are heated by the heater roller 43 while moving in sync, whereby the material image is heat-welded onto the stage 52 (or onto the upper surface of the semi-shaped product on the stage 52). Each time a material image is laminated, the shaping controller 70 lowers the stage 52 in the Z direction by the thickness of one layer, and waits for the lamination of the next layer.
This operation of image formation and lamination is repeated for the number of images of the slice image data, whereby the target three-dimensional shaped object is formed on the stage 52.
In this description, an object to be formed by the three-dimensional shaping apparatus (that is, an object represented by the three-dimensional shape data output to the three-dimensional shaping apparatus) is called a “shaping target object”, and an object formed (output) by the three-dimensional shaping apparatus is called a “shaped object”. In the case when the shaped object includes a support member, the portion of the shaped object, excluding the support member, is called a “structure”. Digital data that includes one slice of data, which is acquired by slicing the three-dimensional shape data of the shaping target object and the data of a registration marker, is called “slice image data”. One layer of an image formed from a shaping material, which is formed by the image forming unit based on the slice image data, is called a “material image”.
In the case of a three-dimensional shaping apparatus which forms a shaped object by laminating many images (lamination type), as in this embodiment, the shape accuracy of the material image and the positional accuracy upon lamination determine the quality of the final shaped object. For example, distortion may be generated in the material image because of problems in the scanning accuracy of exposure and the dimensional accuracy of the photosensitive drum and transfer roller. If such image distortion accumulates, the dimensions and the shape of the final shaped object are significantly affected. Further, if the position at which the material image of each layer is laminated on the shaped object on the stage 52 disperses, the side face of the final shaped object becomes uneven, and a smooth surface cannot be created. This is a problem unique to the lamination type three-dimensional shaping apparatus, which forms one final shaped object by laminating several hundred to several tens of thousands of images.
Therefore according to this embodiment, in order to guarantee the shape accuracy of the material image of each layer, the image distortion generated in the image forming unit 100 is measured before forming the shaped object (this is called “offline calibration”), and the image distortion correction is performed for the slice image data of each layer when the image is formed. Further, in order to guarantee the positional accuracy upon lamination, the position of the material image of each layer on the transfer member is measured, and the positions of the material image and the shaped object on the stage 52 are aligned upon lamination (this is called “online registration”). The offline calibration, the image distortion correction, and the online registration will now be described in detail.
Now the offline calibration performed before generating the shaped object will be described. In the offline calibration, the calibration markers are formed on the stage 52 in the same procedure as the image formation and lamination described above, and the image distortion is measured based on the positional shift of the markers. The offline calibration may be performed not only before generating the shaped object, but also during the intervals of the lamination of the material images.
In the following description, the image data for a calibration marker input to the image generation controller 10 is called “calibration marker data”. The calibration marker data is stored in the memory of the control unit 60, and is read when the offline calibration is performed. An image formed from a shaping material, which is formed based on the calibration marker data, is called a “calibration marker” or a “marker”. Further, a marker transferred from the photosensitive drum 34 to the transfer member 42 (that is, a marker on the transfer member 42) is called a “calibration transfer marker”, and a marker transferred onto the stage 52 is called a “calibration lamination marker”. The name of a marker differs depending on its location, since the image distortion of a marker can change while the marker is being transferred, and a marker is detected by a different sensor depending on its location; it is therefore convenient to identify a marker by its location. In a context where the location of the marker need not be identified, the term “calibration marker” or “marker” is used.
In an upper part of the stage 52, a material left end sensor 54 is disposed at a Y position corresponding to the origin O1, and a material right end sensor 55 is disposed at a Y position corresponding to the origin O2. The material left end sensor 54 is a sensor for detecting the positions of the front left end calibration marker AFL and the rear left end calibration marker ARL. The material right end sensor 55 is a sensor for detecting the positions of the front right end calibration marker AFR and the rear right end calibration marker ARR. The vectors VFL, VFR, VRL and VRR shown in
Normally the directions of the deformation vectors of the markers at the four corners are different. This is because the direction and degree of displacement differ depending on the position in the shaping area, due to the influence of distortion of the transfer member 42, the alignment deviation of each roller shaft, or the like. Therefore calibration markers are needed to acquire deformation vectors at a plurality of points in the shaping area. For example, at least two markers are disposed at distant positions in the shaping area on the stage, and a deformation vector is detected (measured) at each position; it is even better if the markers are disposed at the four corners of the rectangular shaping area, as in this embodiment. The calibration marker is not limited to a plurality of separate markers: a frame-shaped material image connecting AFL, AFR, ARL and ARR may be formed as a calibration marker, and the deformation vectors may be measured at the corners of the frame. Thereby the displacement generated at each marker position in the shaping area can be detected. If a hard belt material is used as the transfer member 42, the displacement generated at each position in the shaping area becomes relatively linear, hence the deformation vectors at positions other than the four corner markers can be determined by linear interpolation of the deformation vectors acquired at the four corners. If the deformation of the transfer member 42 is partially cyclic or discontinuous because of its flexibility, the number of calibration markers may be increased; for example, it is preferable that a plurality of markers are arrayed along the four sides of the shaping area.
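The interpolation described above can be sketched as follows. This is an illustrative assumption of one possible implementation (a bilinear scheme; the function name and coordinates are not from the specification): given the deformation vectors measured at the four corner markers, the displacement at any point inside the rectangular shaping area is estimated.

```python
# Bilinear interpolation of the four corner deformation vectors.
# (x, y) is a position in the shaping area; width/height span the rectangle
# whose corners carry the measured vectors VFL, VFR, VRL and VRR.

def interpolate_deformation(x, y, width, height, v_fl, v_fr, v_rl, v_rr):
    """Estimate the deformation vector (vx, vy) at position (x, y)."""
    u = x / width   # 0 at the front edge, 1 at the rear edge
    v = y / height  # 0 at the left edge, 1 at the right edge
    front = tuple(a + (b - a) * v for a, b in zip(v_fl, v_fr))
    rear = tuple(a + (b - a) * v for a, b in zip(v_rl, v_rr))
    return tuple(f + (r - f) * u for f, r in zip(front, rear))

# Example with made-up corner vectors: at the center of a 100 x 100 area,
# the interpolated displacement is the average of the corner displacements.
v_center = interpolate_deformation(50, 50, 100, 100,
                                   v_fl=(0.0, 0.0), v_fr=(2.0, 0.0),
                                   v_rl=(0.0, 2.0), v_rr=(2.0, 2.0))
```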
The offline calibration will be described in detail with reference to
As shown in
The flow of the offline calibration by the shaping controller 70 will be described with reference to the flow chart in
In step 301, the shaping controller 70 monitors the output of the material left end sensor 54 while changing the XY positions of the stage 52 by controlling the stage driving X motor 112 and the stage driving Y motor 113. When the origin O1 is detected, the shaping controller 70 stores the XY positions of the stage 52 at this point as X=0 and Y=0. In the same manner, in step 302, the shaping controller 70 monitors the output of the material right end sensor 55 while changing the XY positions of the stage 52. In step 303, the shaping controller 70 stores the difference between the XY positions of the stage 52 when the origin O2 was detected and the XY positions of the stage 52 when the origin O1 was detected, as X=dx and Y=dy. This (dx, dy) is an error offset value representing the mounting errors of the material left end sensor 54 and the material right end sensor 55. If the mounting errors of the two sensors 54 and 55 are negligible (that is, if dx=dy=0 can be assumed), the processing in steps 302 and 303 may be omitted.
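The offset stored in step 303 can be sketched as follows, with assumed names and illustrative numbers: the stage positions recorded at the two origin detections are differenced to obtain the inter-sensor error offset (dx, dy).

```python
# Hypothetical sketch of step 303: the stage position at the O1 detection
# serves as the reference, and the difference to the stage position at the
# O2 detection is stored as the error offset (dx, dy) of the two sensors.

def sensor_error_offset(stage_xy_at_o1, stage_xy_at_o2):
    """Difference of the two detection positions, stored as (dx, dy)."""
    return (stage_xy_at_o2[0] - stage_xy_at_o1[0],
            stage_xy_at_o2[1] - stage_xy_at_o1[1])

# Illustrative stage coordinates (mm) at the two origin detections.
dx, dy = sensor_error_offset((12.5, 3.0), (12.7, 3.1))
```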
In step 304, the calibration marker generation unit 65 of the control unit 60 outputs the calibration marker data to the image generation controller 10, whereby the image forming unit 100 and the shaping unit 200 generate the calibration lamination markers. In concrete terms, the image forming unit 100 forms the calibration markers formed from the shaping material on the photosensitive drum 34 based on the calibration marker data, using the same process as forming the material image of the shaped object. The markers are transferred from the photosensitive drum 34 onto the transfer member 42, and are then conveyed to the shaping unit 200 as calibration transfer markers. When the material end detection sensor 45 detects the front end of the calibration transfer markers, the shaping controller 70 controls the stage 52 so that the stage 52 advances to the lamination position at the same timing as the calibration transfer markers. Then the calibration transfer markers are transferred onto the stage 52 by the heater roller 43, whereby the calibration lamination markers are acquired. The calibration lamination markers AFL, AFR, ARL and ARR include information on the image distortion which is generated in a series of processes, such as exposure, development, transfer and lamination, as shown in
In step 305, the shaping controller 70 monitors the output of the material left end sensor 54 and the material right end sensor 55, while changing the XY positions of the stage 52 by controlling the stage driving X motor 112 and the stage driving Y motor 113. The XY positions of the marker AFL detected by the material left end sensor 54 and the XY positions of the marker AFR detected by the material right end sensor 55 are stored in a calibration lamination marker position detection unit 201. In step 306, the shaping controller 70 controls the stage driving X motor 112, and moves the stage 52 to the positions of the calibration lamination markers ARL and ARR at the rear end. In step 307, the shaping controller 70 monitors the output of the material left end sensor 54 and the material right end sensor 55 while changing the XY positions of the stage 52 by controlling the stage driving X motor 112 and the stage driving Y motor 113. The XY positions of the marker ARL detected by the material left end sensor 54 and the XY positions of the marker ARR detected by the material right end sensor 55 are stored in the calibration lamination marker position detection unit 201.
In step 308, the image distortion measurement unit 202 calculates the XY positions of each marker AFL, AFR, ARL and ARR in the state without the image distortion (these XY positions are called “normal positions”), based on the XY positions of the origin O1. Then, based on the difference between the normal position of each marker and the position of each marker detected in steps 305 and 307, the image distortion measurement unit 202 calculates the deformation vectors VFL, VFR, VRL and VRR representing the displacement amount and direction of the displacement of each marker. If there is an error offset amount (dx, dy) between the two sensors 54 and 55, the error offset amount (dx, dy) is taken into account when calculating the deformation vectors VFR and VRR.
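The step-308 calculation can be sketched as follows; the function, variable names and coordinate values are assumptions for illustration. A deformation vector is the detected marker position minus its normal (distortion-free) position, with the sensor error offset (dx, dy) subtracted for the markers read by the material right end sensor 55.

```python
# Hypothetical sketch of the deformation-vector calculation of step 308.

def deformation_vector(detected, normal, sensor_offset=(0.0, 0.0)):
    """Return (vx, vy): displacement of one marker, compensating the sensor offset."""
    return (detected[0] - normal[0] - sensor_offset[0],
            detected[1] - normal[1] - sensor_offset[1])

# Left-side marker AFL: sensor 54 defines the origin, so no offset is applied.
v_fl = deformation_vector(detected=(0.12, -0.05), normal=(0.0, 0.0))

# Right-side marker AFR: the measured offset (dx, dy) of sensor 55 is subtracted.
v_fr = deformation_vector(detected=(0.30, 100.10), normal=(0.0, 100.0),
                          sensor_offset=(0.02, 0.04))
```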
In step 309, the shaping controller 70 sends the deformation vector of each marker to the control unit 60 as image distortion information on the image distortion generated in the material image in a period from the generation of the material image in the image forming unit to lamination on the stage.
Image distortion correction, which is executed upon forming the material image, based on the image distortion information acquired in advance by the offline calibration, will now be described with reference to
As shown in
In step 311, image distortion information is acquired from the shaping controller 70. In step 312, the image distortion correction unit 63 calculates inverse vectors from the deformation vectors of the markers at the four corners, which are acquired as image distortion information, and linearly interpolates these inverse vectors, whereby a correction parameter for each pixel is calculated. The correction parameter is, for example, information to indicate the correspondence between the pixel coordinates in the image before correction and the pixel coordinates in the image after correction.
In step 313, the three-dimensional shape data of the shaping target object is read. In step 314, the 3-D data slicer 61 slices the three-dimensional shape of this shaping target object at a predetermined pitch (e.g. several μm to a little over ten μm in thickness) based on the three-dimensional shape data, so as to generate a slice image of each layer. In step 315, the registration marker attaching unit 62 attaches the registration marker to the slice image of each layer, so as to generate slice image data (the registration marker will be described in detail later). Instead of steps 314 and 315, the slice image data of each layer may be generated by attaching a laminated (columnar) registration marker to the three-dimensional shape data read by the registration marker attaching unit 62, and then slicing the three-dimensional shape using the 3-D data slicer 61.
In step 316, the image distortion correction unit 63 corrects the distortion of the slice image data using the correction parameters determined in step 312. The distortion correction here is processing to apply distortion in the reverse direction to the slice image, so that the image distortion generated in the process from image formation to lamination is reduced or eliminated. The distortion of the slice image data may also be corrected by correcting the distortion of the three-dimensional shape data before slicing, instead of correcting the data after it has been sliced by the 3-D data slicer 61. In step 317, the printer driver 64 sends the corrected slice image data to the image generation controller 10.
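The reverse-direction correction of step 316 can be sketched as follows. This is a deliberately simplified assumption, not the specification's exact algorithm: each output pixel is sampled from the input slice image at a position shifted by the local deformation, so that the distortion added downstream cancels out. A single constant shift is used here; in practice the shift would be interpolated per pixel from the four corner vectors.

```python
# Hypothetical sketch of pre-distorting a 2-D slice image (inverse mapping
# with a constant integer shift, nearest-neighbor sampling).

def correct_slice(image, shift):
    """Pre-distort a slice image by reading each pixel from a shifted source."""
    rows, cols = len(image), len(image[0])
    sx, sy = shift
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Read from the source position that the downstream distortion
            # will displace onto (r, c); out-of-range pixels stay empty.
            src_r, src_c = r + sy, c + sx
            if 0 <= src_r < rows and 0 <= src_c < cols:
                out[r][c] = image[src_r][src_c]
    return out

# A single filled pixel at the center moves one pixel left under shift (1, 0).
corrected = correct_slice([[0, 0, 0], [0, 1, 0], [0, 0, 0]], shift=(1, 0))
```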
If the slice image data is corrected based on the image distortion information acquired by the offline calibration, as described above, a material image with very little or no image distortion when laminated on the stage 52 can be formed, and the dimensional accuracy of the shaped object can be improved.
A concept of the image distortion correction will be described with reference to
The broken line in
Online registration, which is performed when the material image is laminated, will now be described. In the online registration, a registration marker is inserted into the material image, and the material image is aligned at lamination based on the detected positions of these markers.
In the following description, the image data for a registration marker input to the image generation controller 10 is called “registration marker data”. An image formed from a shaping material based on the registration marker data is called a “registration marker” or a “marker”. Further, a marker transferred from the photosensitive drum 34 to the transfer member 42 (that is, a marker on the transfer member 42) is called a “registration transfer marker”, and a marker transferred onto the stage 52 is called a “registration lamination marker”. The name of a marker differs depending on its location because the image distortion of a marker can change during transfer and a marker is detected by a different sensor depending on its location; it is therefore convenient to identify a marker by its location. In a context where the location of the marker need not be identified, the term “registration marker” or “marker” is used.
As shown in
As described in step 315 in
The registration transfer marker position detection unit 211 detects the registration transfer markers AF on the transfer member 42 using the material end detection sensor 44. Then the position measurement unit 212 acquires the position in the X direction (front end position) and the positional shift amount in the Y direction of the material image from the detection result of the registration transfer markers AF. Here the X direction is the advancing direction of the transfer member 42, and the Y direction is the width direction (direction orthogonal to the advancing direction) of the transfer member 42. Based on the position of the material image in the X direction, the lamination position adjustment unit 213 controls the driving start timing of the stage driving X motor 112, and aligns the front ends of the shaped object on the stage 52 and the material image on the transfer member 42. The lamination position adjustment unit 213 also controls the stage driving Y motor 113, and aligns the left ends of the shaped object on the stage 52 and the material image on the transfer member 42 based on the positional shift amount of the material image in the Y direction. Thereby the lamination misalignment between the semi-shaped object and the material image on the XY plane is eliminated online, and high quality shaping becomes possible.
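The two adjustments in this paragraph can be sketched as follows; the belt speed, the sensor-to-stage distance, and all function and variable names are assumptions for illustration, not values from this description.

```python
# Illustrative sketch (not this apparatus's implementation) of the online
# alignment: the X-direction front-end position sets the stage drive start
# timing, and the Y-direction shift sets a lateral offset. All names and
# numeric values are assumptions.

def stage_commands(front_edge_time_s: float, belt_speed_mm_s: float,
                   sensor_to_stage_mm: float, shift_y_mm: float):
    """Return (X-motor start time, Y-motor offset) for one lamination pass."""
    # Start the stage so its front end meets the image front end at the
    # lamination position.
    start_time_s = front_edge_time_s + sensor_to_stage_mm / belt_speed_mm_s
    # Drive the stage in Y by the opposite of the detected shift.
    y_offset_mm = -shift_y_mm
    return start_time_s, y_offset_mm

start, dy = stage_commands(front_edge_time_s=2.0, belt_speed_mm_s=100.0,
                           sensor_to_stage_mm=50.0, shift_y_mm=0.3)
print(start, dy)  # 2.5 -0.3
```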
The position of the hypotenuse of the right-angled triangle is given by
Y=1−aX
Here a is the slope of the hypotenuse, and the length of one side of the triangle is 1.
When the left side of the registration transfer marker AF is at Y=0 and the normal position is at Y=0.5, the shift amount ΔY is given by
ΔY=Y−0.5=(1−aX)−0.5=0.5−aX
If the angle of the hypotenuse is 45°, then a=1, and ΔY=0 when X=0.5, ΔY=0.5 when X=0, and ΔY=−0.5 when X=1. The range of X is 0<X<1.
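The values above can be checked numerically with a small helper; the function name is an illustrative assumption.

```python
# ΔY = Y - 0.5 = 0.5 - a*X for a unit-side right-triangle marker.

def delta_y(x: float, a: float = 1.0) -> float:
    """Shift amount for a hypotenuse crossing measured at X (0 < X < 1)."""
    return 0.5 - a * x

# For a 45-degree hypotenuse (a = 1):
print(delta_y(0.5), delta_y(0.0), delta_y(1.0))  # 0.0 0.5 -0.5
```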
The material end detection sensor 44 detects a first edge and a second edge of the registration transfer marker AF. L1 indicates the detection line of the material end detection sensor 44 when the registration transfer marker AF on the transfer member 42 passes the normal position. In other words, the state in which the material end detection sensor 44 scans along the line L1 is the reference (shift amount ΔY=0). S1 indicates the output signal of the material end detection sensor 44 when the material end detection sensor 44 scans along the line L1. When the first edge of the registration transfer marker AF is detected, the signal changes from low level to high level. When the second edge L2 is detected, the signal changes from high level to low level.
If the transfer member 42 shifts from the reference position to the left by ΔY, the material end detection sensor 44 scans along the line L3. S3 indicates the output signal of the material end detection sensor 44 when the material end detection sensor 44 scans along the line L3. When the first edge of the registration transfer marker AF is detected, the signal changes from low level to high level. When the second edge L4 is detected, the signal changes from high level to low level.
Therefore the position in the X direction (front end position) is recognized from the rise timing of the output signal S3 of the material end detection sensor 44. Further, from the high level period of the output signal S3 (the difference between the first edge detection timing and the second edge detection timing) and the expression Y=1−aX, the positional shift amount ΔY of the registration transfer marker in the Y direction with respect to the normal position can be determined. In this way, the positions in two directions (the X and Y directions) can be detected with one registration transfer marker and one material end detection sensor 44. This is advantageous in terms of cost, since the configuration and processing are simplified, and in terms of increasing the speed of the position alignment in the two directions. The configuration of this embodiment is effective for lamination, since the transfer member 42 and the stage 52 can move at high speed. However, if the positional shift in the Y direction is negligibly small, it is sufficient if only the position in the X direction can be detected, and a square marker having two sides parallel in the Y direction and two sides parallel in the X direction can be used, for example.
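Assuming a unit-side marker with a 45° hypotenuse and a belt speed expressed in marker side lengths per second (both illustrative assumptions), the conversion from one sensor pulse to the two positions can be sketched as:

```python
# Sketch: front-end X position from the rise timing, Y shift from the
# high-level period and Y = 1 - a*X. All names and values are assumptions.

def marker_positions(rise_s: float, fall_s: float, belt_speed: float,
                     a: float = 1.0):
    """Return (front-end rise time, Y shift) from one output pulse."""
    # High-level period -> length crossed inside the triangle, in units
    # of the marker side length (belt_speed is side lengths per second).
    x = (fall_s - rise_s) * belt_speed
    y = 1.0 - a * x        # lateral position on the marker
    return rise_s, y - 0.5  # shift from the normal position Y = 0.5

# A 0.5 s pulse at 1 side length/s crosses mid-height: no Y shift.
front, dy = marker_positions(rise_s=1.0, fall_s=1.5, belt_speed=1.0)
print(front, dy)  # 1.0 0.0
```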
According to the configuration of the three-dimensional shaping apparatus of this embodiment described above, the distortion of an image in the XY plane, generated in the process from image formation to lamination, can be minimized by performing offline calibration and image distortion correction. Furthermore, the positional shift during lamination can be minimized by performing online registration. As a result, high quality shaped objects can be formed with high shape accuracy and dimensional accuracy.
An image distortion during image formation is generated by, for example, the distortion of the photosensitive drum 34, the distortion of the development roller 32, the alignment shift of each roller, the abrasion of each member and the like. Therefore, it is preferable that the execution timing of the offline calibration is at the beginning of each shaping job. If the image distortion is negligible, the offline calibration can be omitted.
In the case of the three-dimensional shaping apparatus of this embodiment as well, shaping with high shape accuracy can be implemented by performing offline calibration and image distortion correction for each image forming unit. If image distortion generated in the image forming units 100A and 100B is small, in terms of scan accuracy at exposure, dimensional accuracy of the photosensitive drum and the like, then the image distortion information acquired by the offline calibration using either one of the image forming units 100A and 100B may be used for the image distortion correction upon forming a material image in both image forming units.
In the case of this configuration, image distortion in the transfer member 42A and in the transfer member 42B are different, and the lamination timings thereof are also different, hence independent image distortion correction and registration are required. Therefore, the offline calibration is performed and image distortion is corrected for the image forming unit 100A and for the image forming unit 100B, respectively. When lamination is performed, the markers on the transfer member 42A and the markers on the transfer member 42B are read respectively using different sensors 44A and 44B, whereby the respective material images can be aligned independently.
The three-dimensional shaping apparatus includes photosensitive drums 34A, 34B, 34C and 34D, transfer rollers 41A, 41B, 41C and 41D, primary transfer member 47 and transfer roller pair 46 and 48. A material image formed by each of the image forming units 100A to 100D is sequentially transferred from the photosensitive drums 34A to 34D onto the primary transfer member 47. The material images on the primary transfer member 47 are transferred to the secondary transfer member 42 by the transfer roller pair 46 and 48. The material images on the secondary transfer member 42 are conveyed to the shaping unit, and are laminated on the stage 52 or on the semi-shaped object on the stage 52. In this configuration, it is preferable to execute the offline calibration for each of the image forming units 100A to 100D respectively, so that the image distortion in each of the image forming units 100A to 100D is corrected.
As in Embodiment 1, a deformation vector of each marker is calculated, whereby image distortion information of the image formed by each of the image forming units 100A to 100D can be acquired. This allows independent correction of the image distortion generated in an image formed by each of the image forming units 100A to 100D, hence a shaped object having excellent dimensional accuracy and shape accuracy can be acquired.
As an operation method of Embodiment 3, it is assumed that the image forming unit 100B stands by as a spare for the image forming unit 100A, and operation switches from the image forming unit 100A to the image forming unit 100B when the image forming unit 100A runs out of material. In this case, it is preferable that the correction parameters for the image forming units 100A and 100B are created and stored in advance respectively. Then, when the image forming unit to be used is switched from 100A to 100B, the correction parameters for the image forming unit 100A can be switched immediately to those for the image forming unit 100B. This makes any procedure, such as exchanging cartridges and recalibration, unnecessary, and lamination shaping can continue automatically at high precision.
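The switching described here can be sketched as a simple parameter lookup; the unit names, parameter keys and values are all illustrative assumptions, not details from this description.

```python
# Illustrative sketch of holding pre-stored correction parameters for the
# units 100A and 100B and switching instantly when 100A runs out of
# material. All names, keys and values are assumptions.

correction_params = {
    "100A": {"scale_x": 1.003, "scale_y": 0.998, "shear": 0.001},
    "100B": {"scale_x": 0.999, "scale_y": 1.002, "shear": -0.002},
}

active_unit = "100A"

def on_material_empty(empty_unit: str, spare_unit: str) -> dict:
    """Switch the active unit to the spare and return its stored parameters."""
    global active_unit
    if empty_unit == active_unit:
        active_unit = spare_unit
    return correction_params[active_unit]

# 100A empties: correction parameters switch with no recalibration step.
params = on_material_empty("100A", "100B")
print(active_unit, params["scale_x"])  # 100B 0.999
```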
In this embodiment as well, as in Embodiment 2, if image distortion generated in the image forming units 100A to 100D is small, in terms of scanning accuracy in exposure and dimensional accuracy of the photosensitive drum and the like, then the image distortion information acquired by the offline calibration using one of the image forming units 100A to 100D may be used for the image distortion correction upon forming a material image in the image forming units 100A to 100D.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2015-104681, filed on May 22, 2015, which is hereby incorporated by reference herein in its entirety.