The present disclosure relates to a printed image data generation method, a printing method, and a printed image data generation apparatus.
In recent years, there has been an increasing demand for printing two-dimensional images on a three-dimensional medium. In order to meet such a demand, a three-dimensional inkjet printer as described in PTL 1 has been proposed.
The three-dimensional inkjet printer associates three-dimensional image data of a medium shaped as a sphere, a column, or a truncated cone with two-dimensional coordinates by performing coordinate conversion using a mathematical expression corresponding to each shape, thereby generating two-dimensional pseudo three-dimensional image data that retains the coordinate information of the three-dimensional image.
Then, the three-dimensional inkjet printer performs halftone processing on the pseudo three-dimensional image data to generate two-dimensional halftone image data indicating gradations of CMYK, the colors of the ink.
Thereafter, the three-dimensional inkjet printer three-dimensionalizes the two-dimensional halftone image data to generate three-dimensional halftone image data corresponding to the shape of the three-dimensional medium, and prints an image on the medium based on the three-dimensional halftone image data.
A printed image data generation method according to one aspect of the present disclosure includes: a data acquisition step of acquiring data indicating a three-dimensional shape model imitating a printing target object and data indicating a two-dimensional image for decorating a surface of the printing target object; and a printed image data generation step of calculating a position of a surface of the three-dimensional shape model on which ink ejected from a nozzle is to land, determining a pixel of the two-dimensional image corresponding to the position of the surface, and generating data of a printed image to be printed on the printing target object based on information of the pixel.
A method for printing a two-dimensional image on a three-dimensional object by an inkjet head according to one aspect of the present disclosure includes: preparing three-dimensional model data indicating a three-dimensional model generated from the three-dimensional object; preparing binarized image data indicating a binarized image generated from the two-dimensional image, the binarized image including a plurality of binarized image pixels, each of the plurality of binarized image pixels having a pixel value indicating the presence or absence of a color; preparing nozzle trajectory data indicating a plurality of trajectories along which a plurality of nozzles included in the inkjet head move; and generating printed image data based on the binarized image data and the nozzle trajectory data, the printed image data including a plurality of printed image pixels corresponding one-to-one to the plurality of binarized image pixels, each of the plurality of printed image pixels having a pixel value indicating a nozzle assignment. Here, the generation of the printed image data includes: determining that a first nozzle of the plurality of nozzles passes over a first binarized image pixel having a pixel value indicating the presence of a color among the plurality of binarized image pixels; assigning the first nozzle to a first printed image pixel corresponding to the first binarized image pixel among the plurality of printed image pixels; determining that a second nozzle different from the first nozzle among the plurality of nozzles passes over the first binarized image pixel; determining a second binarized image pixel, located within a predetermined distance from the first binarized image pixel, having a pixel value indicating the presence of a color; and assigning the second nozzle, in addition to the first nozzle, to the first printed image pixel in response to determining the second binarized image pixel.
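The assignment logic above can be sketched as follows. The data layout (a 0/1 grid for the binarized image, a per-nozzle list of traversed pixel coordinates for the trajectories, and a Chebyshev-distance neighborhood for "within a predetermined distance") and the function names are illustrative assumptions, not part of the disclosed method.

```python
def has_colored_neighbor(binary, x, y, radius):
    """True if another pixel with value 1 lies within `radius` (Chebyshev distance)."""
    for ny in range(max(0, y - radius), min(len(binary), y + radius + 1)):
        for nx in range(max(0, x - radius), min(len(binary[0]), x + radius + 1)):
            if (nx, ny) != (x, y) and binary[ny][nx] == 1:
                return True
    return False

def assign_nozzles(binary, trajectories, radius=1):
    """Map each colored pixel to the list of nozzles assigned to print it.

    binary: 2D list of 0/1 pixel values (the binarized image).
    trajectories: {nozzle_id: [(x, y), ...]} pixels each nozzle passes over.
    """
    assignments = {}
    for nozzle, path in trajectories.items():
        for x, y in path:
            if binary[y][x] != 1:
                continue  # pixel value indicates absence of a color
            assigned = assignments.setdefault((x, y), [])
            if not assigned:
                assigned.append(nozzle)  # first nozzle passing over: always assigned
            elif nozzle not in assigned and has_colored_neighbor(binary, x, y, radius):
                assigned.append(nozzle)  # further nozzle: only if another colored pixel is nearby
    return assignments
```

In use, a second nozzle passing over the same colored pixel is added only when a second colored pixel exists within the radius; otherwise the pixel keeps its single assignment.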
A printed image data generation apparatus according to one aspect of the present disclosure includes: a data acquisition unit that acquires data indicating a three-dimensional shape model imitating a printing target object and data indicating a two-dimensional image for decorating a surface of the printing target object; and a printed image data generator that calculates a position of a surface of the three-dimensional shape model on which ink ejected from a nozzle is to land, determines a pixel of the two-dimensional image corresponding to the position of the surface, and generates data of a printed image to be printed on the printing target object based on information of the pixel.
The technique disclosed in PTL 1 uses mathematical expressions derived from the shape of a sphere, a column, or a truncated cone to perform the association with the two-dimensional coordinates, and is thus applicable to printing on a medium having a simple shape from which such mathematical expressions can be derived. However, it is difficult to apply the technique disclosed in PTL 1 to printing on a medium having a free curved surface, such as a sporting good like a road bike helmet or a tennis racket, a wearable terminal having a housing formed to match a curved surface of a human body, or digital signage including a plurality of curved surfaces.
An object of the present disclosure is to provide a printed image data generation method, a printing method, and a printed image data generation apparatus capable of easily printing an image even when the surface of a printing target object is a free curved surface.
Hereinafter, exemplary embodiments of the printed image data generation method, the printing method, and the printed image data generation apparatus will be described with reference to the drawings. Note that the following disclosure is merely an example and does not limit the scope of the claims. The technique described in the claims includes various variations and changes of the specific examples exemplified below.
As shown in
On the other hand, in the case of a printing target object whose surface is a free curved surface as in
However, it is difficult to appropriately perform such processing on a printing target object having a complicated surface shape. Furthermore, as described above, it is also difficult to apply the technique of PTL 1 to a printing target object whose surface is a free curved surface.
In the example of
Furthermore,
When printing is performed by moving such inkjet head 101 along the arc, the nozzles have different rotation radii, as in trajectories B to E. For example, in order to land the ink ejected from the nozzles belonging to second nozzle row 101b on the same region as the ink ejected from the nozzles belonging to first nozzle row 101a, complicated correction according to the difference in rotation radius needs to be performed on the pixels of the printed image.
Note that, in
In view of the circumstances above, the first exemplary embodiment discloses a printed image data generation method, a printing method, and a printed image data generation apparatus that enable easy printing of an image even when the surface of the printing target object is a free curved surface.
Next, an example of the printing apparatus according to the first exemplary embodiment will be described.
The printing apparatus includes printed image data generation apparatus 10A and print processing apparatus 10B. Printed image data generation apparatus 10A generates printed image data to be printed on workpiece 102 that is the printing target object. Print processing apparatus 10B performs printing on workpiece 102 using the printed image data generated by printed image data generation apparatus 10A.
Hereinafter, a printing apparatus in which printed image data generation apparatus 10A and print processing apparatus 10B are integrated will be described, but printed image data generation apparatus 10A and print processing apparatus 10B may be separate apparatuses.
In this case, a transfer of the printed image data from printed image data generation apparatus 10A to print processing apparatus 10B may be performed by communication processing via a network, or may be performed by storing the printed image data generated by printed image data generation apparatus 10A in a storage medium such as a universal serial bus (USB) flash memory and causing print processing apparatus 10B to read the printed image data.
Printed image data generation apparatus 10A includes model generator 103, shape model storage 104, trajectory data generator 105, trajectory data storage 106, surface decoration unit 107, surface decoration data storage 108, RIP processing unit 109, two-dimensional image data storage 110, nozzle position data storage 111, data acquisition unit 112, printed image data generator 113, and printed image data storage 114.
Model generator 103 is a processing unit that generates three-dimensional shape model 104a imitating workpiece 102, and is realized by three-dimensional computer aided design (CAD) software or the like. Shape model storage 104 stores three-dimensional shape model 104a generated by model generator 103.
Trajectory data generator 105 is a processing unit that generates trajectory data 106a including information on the trajectory of inkjet head 101 in a case where inkjet head 101 scans the surface of workpiece 102, and is realized by computer aided manufacturing (CAM) software or the like. Trajectory data storage 106 stores trajectory data 106a generated by trajectory data generator 105.
Note that the format of trajectory data 106a varies depending on the CAM software to be used, but may be any format as long as it includes information corresponding to the position and attitude angle (roll, pitch, yaw) of inkjet head 101 that change from moment to moment.
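For illustration only, one sample of trajectory data 106a might be represented as a position plus an attitude angle; the field names and units below are assumptions, since the actual format depends on the CAM software.

```python
from dataclasses import dataclass

@dataclass
class TrajectorySample:
    """One moment of inkjet head 101's motion: position plus attitude angle."""
    x: float
    y: float
    z: float      # position of the head (assumed unit: millimeters)
    roll: float
    pitch: float
    yaw: float    # attitude angle (assumed unit: degrees)

# A trajectory is then simply an ordered list of samples.
trajectory = [
    TrajectorySample(0.0, 0.0, 10.0, 0.0, 0.0, 0.0),
    TrajectorySample(1.0, 0.0, 10.0, 0.0, 5.0, 0.0),
]
```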
Surface decoration unit 107 performs surface decoration on three-dimensional shape model 104a of workpiece 102 generated by model generator 103 to generate surface decoration shape model 108a. Surface decoration shape model 108a is, for example, a three-dimensional shape model in which the surface decoration is performed using a general texture mapping method in a field of computer graphics.
Surface decoration unit 107 attaches surface decoration image 108b corresponding to a texture image in the texture mapping to three-dimensional shape model 104a to generate surface decoration shape model 108a.
Furthermore, when generating surface decoration shape model 108a, surface decoration unit 107 also generates data indicating a correspondence relationship between the position of a point in three-dimensional shape model 104a and the position of a point in surface decoration image 108b. The UV map used in the texture mapping is an example of data indicating such a correspondence relationship.
Surface decoration data storage 108 stores surface decoration shape model 108a generated by surface decoration unit 107. Surface decoration shape model 108a also includes data indicating the correspondence relationship between the positions of points in three-dimensional shape model 104a and the positions of points in surface decoration image 108b.
RIP processing unit 109 is a processing unit that converts surface decoration image 108b into a raster image necessary for actual printing, and is realized by raster image processor (RIP) software or the like.
As shown in enlarged view 108c, surface decoration image 108b can express gradation in a stepless manner, whereas the number of gradations is limited in the binary images as shown in enlarged view 110e. RIP processing unit 109 generates binary images 110a to 110d so as to have the number of gradations that can be printed by the printing apparatus.
Here, in binary images 110a to 110d, color information is expressed by binary values of 0 and 1, where 0 corresponds to not ejecting ink, and 1 corresponds to ejecting ink. Note that, depending on the color to be expressed, the color may be further separated into several colors in addition to each color of CMYK.
Two-dimensional image data storage 110 stores the data of the four binary images 110a to 110d obtained as a result of the conversion processing by RIP processing unit 109 as raster image data which is data of a two-dimensional image.
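As one concrete halftoning method that software of this kind could use, ordered dithering reduces a continuous-tone channel to the 0/1 values described above. This specific algorithm is an assumption for illustration; the disclosure does not specify which algorithm RIP processing unit 109 uses.

```python
# 2x2 Bayer threshold matrix; larger matrices give smoother apparent gradation.
BAYER_2X2 = [[0, 2],
             [3, 1]]

def dither_channel(channel):
    """Convert one 0-255 continuous-tone channel into a 0/1 ink map,
    where 0 corresponds to not ejecting ink and 1 to ejecting ink."""
    out = []
    for y, row in enumerate(channel):
        out.append([
            1 if value > (BAYER_2X2[y % 2][x % 2] + 0.5) / 4 * 255 else 0
            for x, value in enumerate(row)
        ])
    return out
```

A mid-gray area becomes a checkerboard of 0s and 1s, which is how a limited number of gradations can still approximate stepless tone.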
Nozzle position data storage 111 stores nozzle position data indicating the nozzle position of the inkjet head used for actual printing.
As shown in
Next, a case where inkjet head 101 has the plurality of nozzle rows will be described.
As shown in
As shown in
Note that the direction in which the nozzles are arranged in inkjet head 101 may be inclined in a range of greater than 0 degrees and smaller than 90 degrees with respect to the scanning direction of inkjet head 101. This enables high-resolution printing.
Data acquisition unit 112 acquires trajectory data 106a stored in trajectory data storage 106, the data of surface decoration shape model 108a stored in surface decoration data storage 108, the data of binary images 110a to 110d stored in two-dimensional image data storage 110, the data of the nozzle position stored in nozzle position data storage 111, and the like, and outputs the data to printed image data generator 113.
Printed image data generator 113 calculates the position of the surface of three-dimensional shape model 104a on which the ink ejected from the nozzle lands, determines the pixels of binary images 110a to 110d, which are two-dimensional images corresponding to the position of the surface, and generates the data of the printed image to be printed on the printing target object based on the information of the pixels.
For example, printed image data generator 113 calculates the attitude angle and the nozzle position of inkjet head 101 according to the printing resolution based on trajectory data 106a, in a case of causing inkjet head 101 having a nozzle arrangement shown in the nozzle position data to scan the surface of surface decoration shape model 108a.
Subsequently, for each combination of the attitude angle and the nozzle position of inkjet head 101, printed image data generator 113 determines whether or not a perpendicular line extending from the nozzle surface of inkjet head 101, in which a plurality of nozzle holes is provided, intersects the surface of surface decoration shape model 108a.
Then, in a case where the perpendicular line intersects the surface of surface decoration shape model 108a, printed image data generator 113 reads the data of the pixels of CMYK binary images 110a to 110d corresponding to the intersection portion from the raster image data, thereby generating, as the printed image data, data indicating whether or not to eject each CMYK ink for each pixel.
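Since the model surface is treated as triangular polygons, the intersection test can be sketched with the Möller–Trumbore ray-triangle algorithm. The choice of this particular algorithm is an assumption; the disclosure does not name one.

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_triangle(origin, direction, p1, p2, p3, eps=1e-9):
    """Return the distance t to the intersection point, or None if the
    perpendicular line from the nozzle surface misses this polygon."""
    e1, e2 = sub(p2, p1), sub(p3, p1)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:
        return None          # line is parallel to the polygon plane
    f = 1.0 / a
    s = sub(origin, p1)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None          # intersection lies outside the triangle
    t = f * dot(e2, q)
    return t if t > eps else None
```

The intersection point is then origin + t * direction; running this test over every polygon for each nozzle position yields the intersection portions described above.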
Hereinafter, generation processing of the printed image data performed by printed image data generator 113 will be described in more detail.
Surface decoration shape model 108a treats the complicated curved surface of workpiece 102 as a set of triangular surfaces called polygons.
Furthermore, in surface decoration image 108b shown in an example in
Furthermore, since binary images 110a to 110d are images generated by converting surface decoration image 108b into the raster image, the coordinates of each pixel do not change. Therefore, if the coordinates of the pixel on surface decoration image 108b corresponding to the vertex of each polygon of surface decoration shape model 108a are uniquely obtained, the coordinates of the pixels on binary images 110a to 110d corresponding to the vertex of each polygon of surface decoration shape model 108a are also uniquely obtained.
Here, the UV map will be described.
The UV map shown in
With this UV map, the coordinates of the vertices constituting the polygon surface and the coordinates of surface decoration image 108b can be made to correspond to each other on a one-to-one basis. For example, in the examples of
Here, it is assumed that point O in
Printed image data generator 113 determines whether or not a straight line passing through point O and extending in the same direction as vector r intersects with a polygon constituted by points P1, P2, and P3, and calculates coordinates of intersection point P in a case where the straight line intersects with the polygon.
Next, printed image data generator 113 calculates the position of point UV on surface decoration image 108b corresponding to intersection point P, and generates the printed image data at the position of intersection point P.
For example, printed image data generator 113 detects three points UV1, UV2, and UV3 of surface decoration image 108b corresponding to three points P1, P2, and P3 constituting the polygon with reference to the UV map.
Then, printed image data generator 113 calculates coordinates indicating the position of point UV so that the ratio (S1′:S2′:S3′) of the areas of the three triangles formed by the three points UV1, UV2, and UV3 and point UV is the same as the ratio (S1:S2:S3) of the areas of the three triangles formed by the three points P1, P2, and P3 constituting the polygon of surface decoration shape model 108a and intersection point P.
In this manner, printed image data generator 113 obtains the three points UV1, UV2, and UV3 of surface decoration image 108b corresponding to three points P1, P2, and P3 of the polygon of three-dimensional shape model 104a, and determines point UV of surface decoration image 108b corresponding to intersection point P based on a positional relationship between three points P1, P2, and P3 and intersection point P.
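The area-ratio condition above is what barycentric interpolation computes, so the determination of point UV can be sketched as follows. This is a minimal sketch assuming vertices given as 3D tuples and UV coordinates as 2D tuples; the actual implementation in printed image data generator 113 is not specified here.

```python
import math

def tri_area(a, b, c):
    """Area of triangle a-b-c in 3D (pass z = 0 for points in the UV plane)."""
    u = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    v = (c[0] - a[0], c[1] - a[1], c[2] - a[2])
    cx = (u[1] * v[2] - u[2] * v[1],
          u[2] * v[0] - u[0] * v[2],
          u[0] * v[1] - u[1] * v[0])
    return 0.5 * math.sqrt(cx[0] ** 2 + cx[1] ** 2 + cx[2] ** 2)

def point_to_uv(p, p1, p2, p3, uv1, uv2, uv3):
    """Map intersection point P inside polygon P1-P2-P3 to point UV so that
    the area ratio S1:S2:S3 is preserved as S1':S2':S3'."""
    s1 = tri_area(p, p2, p3)   # weight of P1 (triangle opposite P1)
    s2 = tri_area(p, p3, p1)   # weight of P2
    s3 = tri_area(p, p1, p2)   # weight of P3
    total = s1 + s2 + s3
    w1, w2, w3 = s1 / total, s2 / total, s3 / total
    return (w1 * uv1[0] + w2 * uv2[0] + w3 * uv3[0],
            w1 * uv1[1] + w2 * uv2[1] + w3 * uv3[1])
```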
Thereafter, printed image data generator 113 acquires binary (0 or 1) information in each of CMYK binary images 110a to 110d of the pixel corresponding to the position of point UV.
Printed image data generator 113 executes the above processing on each polygon, and generates data indicating whether or not each ink of CMYK is to be ejected for each pixel as printed image data using the binary information acquired for each polygon.
Note that, here, printed image data generator 113 obtains the position of point UV using the ratio of the areas of the triangles, but the method of obtaining point UV is not limited thereto, and the position of point UV may be obtained based on a positional relationship, defined as follows, between the three points P1, P2, and P3 and intersection point P.
For example, in
Furthermore, in
Printed image data storage 114 stores the printed image data generated by printed image data generator 113.
Print processing apparatus 10B includes inkjet head 101, multi-axis controller 115, multi-axis robot 116, and print controller 117.
Inkjet head 101 has a plurality of nozzles, and ejects ink from each nozzle to perform printing on the printing target object.
Multi-axis controller 115 reads trajectory data 106a stored in trajectory data storage 106, converts the trajectory data into an operation amount given to each axis of multi-axis robot 116, and outputs the operation amount information to multi-axis robot 116.
Multi-axis robot 116 operates each axis using the information of the operation amount acquired from multi-axis controller 115, and moves or rotates workpiece 102 fixed to the arm.
Using the printed image data stored in printed image data storage 114 and the information of the operation amount acquired from multi-axis controller 115, print controller 117 controls inkjet head 101 in accordance with movement of multi-axis robot 116 to eject ink, thereby printing an image on the surface of workpiece 102.
Here, print controller 117 may execute a process of controlling inkjet head 101 so as not to print the same pixel redundantly in a case where inkjet head 101 performs printing following a certain trajectory and then performs printing following an adjacent trajectory. As a result, the occurrence of moire can be suppressed. Hereinafter, a method for preventing redundant printing of pixels will be described.
As shown in
Then, print controller 117 performs printing on these three columns according to the binary information, and thereafter sets the values of the pixels of these three columns in printed image data 114a to “0”, marking them as pixels that do not eject ink. The pixels whose pixel values are set to “0” are represented by white squares in
As shown in
Here, for the columns of pixels B1 to B5 and the columns of pixels C1 to C5, the value “0” has been set by the first-stage processing, marking them as pixels that do not eject ink. Therefore, print controller 117 does not perform printing for the columns of pixels B1 to B5 and the columns of pixels C1 to C5, and performs printing only for the columns of pixels D1 to D5 according to the binary information of those pixels.
As described above, once the binary information of a pixel has been acquired, print controller 117 sets the value of that pixel to “0”, making it a pixel that does not eject ink. This makes it possible to prevent redundant printing even in a case where the trajectories of inkjet head 101 overlap during printing.
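The two-stage processing above can be sketched as follows; representing each trajectory as a list of column indices into the printed image is an illustrative assumption.

```python
def print_pass(printed_image, columns):
    """Eject ink for every pixel with value 1 in the given columns, then set
    those pixels to 0 so an overlapping later trajectory does not reprint them."""
    ejected = []
    for col in columns:
        for row in range(len(printed_image)):
            if printed_image[row][col] == 1:
                ejected.append((row, col))
                printed_image[row][col] = 0  # mark as already printed
    return ejected

image = [[1, 1, 1, 1],
         [1, 1, 1, 1]]
first = print_pass(image, [0, 1, 2])   # first trajectory covers columns 0-2
second = print_pass(image, [1, 2, 3])  # adjacent trajectory overlaps columns 1-2
```

Because the first pass zeroed columns 1 and 2, the second pass ejects ink only in column 3, so the overlapping region is not printed twice.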
Note that it is desirable that the printing resolution during printing on workpiece 102, which actually has a three-dimensional shape, coincide with the resolution of surface decoration image 108b, which is a two-dimensional image; however, it is difficult to match these completely.
In a case where the printing resolution is higher than the resolution of surface decoration image 108b, white spots where a part of the image is not printed occur. Therefore, print controller 117 may set the printing resolution to be the same as the resolution of surface decoration image 108b or lower than the resolution of surface decoration image 108b to prevent white spots.
Furthermore, model generator 103, trajectory data generator 105, surface decoration unit 107, RIP processing unit 109, data acquisition unit 112, printed image data generator 113, multi-axis controller 115, and print controller 117 shown in
Furthermore, shape model storage 104, trajectory data storage 106, surface decoration data storage 108, two-dimensional image data storage 110, nozzle position data storage 111, and printed image data storage 114 are realized by a storage device such as a memory and a hard disk drive. These may be realized by one storage device or may be realized by a plurality of storage devices.
As described above, according to the printed image data generation method of the first exemplary embodiment, in the data acquisition step, the data indicating three-dimensional shape model 104a imitating the printing target object and the data indicating surface decoration image 108b that is the two-dimensional image for decorating the surface of the printing target object are acquired, and in the printed image data generation step, the position of the surface of three-dimensional shape model 104a on which the ink ejected from the nozzle is to land is calculated, the pixel of surface decoration image 108b corresponding to the position of the surface is determined, and the printed image data to be printed on the printing target object is generated based on the information of the pixel.
In this printed image data generation method, by causing print processing apparatus 10B to perform printing using the printed image data generated in this manner, even when the surface of the printing target object is a free curved surface, the designer can easily print an image without performing work such as manually expanding and contracting the printed image.
Furthermore, according to the printed image data generation method of the first exemplary embodiment, the data acquisition step further includes acquiring data indicating the nozzle position of inkjet head 101 including the plurality of nozzles, and the printed image data generation step includes calculating the position of the surface described above corresponding to one of nozzles, determining the pixel of surface decoration image 108b corresponding to the position, and generating the printed image data to be printed on the printing target object based on the information of the pixel of surface decoration image 108b.
In this printed image data generation method, by causing print processing apparatus 10B to perform printing using the printed image data generated in this manner, even in a case where inkjet head 101 moves along a curve, the influence of the position of each nozzle is considered in advance, so that the image can be easily printed without performing complicated correction on the pixels of the printed image.
Furthermore, according to the printed image data generation method of the first exemplary embodiment, in the printed image data generation step, points on surface decoration image 108b corresponding to the vertices of a polygon of three-dimensional shape model 104a are obtained, the positional relationship between the vertices of the polygon and the position of the surface described above is obtained, and the pixel of surface decoration image 108b corresponding to the position of the surface is determined based on the obtained points on surface decoration image 108b and the obtained positional relationship.
In this printed image data generation method, by determining the pixels in this manner, it is possible to appropriately determine the pixels of surface decoration image 108b corresponding to the position of the surface.
Furthermore, according to the printed image data generation method of the first exemplary embodiment, the positional relationship is specified by an area ratio of a plurality of triangles. Each of the plurality of triangles has three vertices that include two vertices of the polygon and the position of the surface.
In this printed image data generation method, by using such a positional relationship, it is possible to effectively determine the pixel of surface decoration image 108b corresponding to the position of the surface.
Furthermore, the printing method of the first exemplary embodiment includes the printed image data generation method.
In this printing method, by performing printing using the printed image data generated by the printed image data generation method, even when the surface of the printing target object is a free curved surface, the designer can easily print an image without performing work such as manually expanding and contracting the printed image.
Furthermore, according to the printing method of the first exemplary embodiment, in the print control step, a pixel whose information indicating whether or not to eject ink has already been read from the printed image data is set as a pixel that does not eject ink.
In this printing method, by performing the setting in this way, even when there is a portion where the trajectory of inkjet head 101 during printing overlaps, the redundant printing can be prevented.
Furthermore, according to the printing method of the first exemplary embodiment, in the print control step, the printing resolution during printing on the printing target object is set to be the same as the resolution of surface decoration image 108b or lower than the resolution of surface decoration image 108b.
In this printing method, by performing the setting in this manner, it is possible to prevent the occurrence of white spots where a part of the image is not printed.
Furthermore, according to printed image data generation apparatus 10A of the first exemplary embodiment, data acquisition unit 112 acquires data indicating three-dimensional shape model 104a imitating the printing target object and data indicating surface decoration image 108b, which is a two-dimensional image for decorating the surface of the printing target object. Printed image data generator 113 calculates the position of the surface of three-dimensional shape model 104a on which the ink ejected from the nozzle lands, determines the pixel of surface decoration image 108b corresponding to the position of the surface, and generates the data of the printed image to be printed on the printing target object based on the information of the pixel.
Printed image data generation apparatus 10A causes print processing apparatus 10B to perform printing using the printed image data generated in this manner, so that even when the surface of the printing target object is a free curved surface, the designer can easily print an image without performing work such as manually expanding and contracting the printed image.
In an actual printing apparatus, the landing position of the ink ejected from inkjet head 101 may deviate depending on the distance between inkjet head 101 and workpiece 102. In the second exemplary embodiment, a method of correcting this deviation will be described in detail.
As shown in an example in
When performing printing on workpiece 102, ink 205 is ejected while inkjet head 101 is moved relative to workpiece 102. Therefore, when distances h1, h2, . . . , hn differ for each nozzle, the landing position of ink 205 may deviate for each nozzle, and in this case, a desired printing result cannot be obtained.
In view of the circumstances above, the present second exemplary embodiment discloses a printed image data generation apparatus, a printing method, and a printed image data generation method capable of easily correcting the deviation of the landing position of ink 205.
The printing apparatus includes printed image data generation apparatus 10A′ and print processing apparatus 10B′. Printed image data generation apparatus 10A′ generates printed image data to be printed on workpiece 102 as the printing target object. Print processing apparatus 10B′ performs printing on workpiece 102 using the printed image data generated by printed image data generation apparatus 10A′.
Hereinafter, a printing apparatus in which printed image data generation apparatus 10A′ and print processing apparatus 10B′ are integrated will be described, but, as in the first exemplary embodiment, printed image data generation apparatus 10A′ and print processing apparatus 10B′ may be separate apparatuses.
Printed image data generation apparatus 10A′ further includes distance data storage 121, landing deviation data storage 122, landing deviation correction unit 123, and corrected image data storage 124 in addition to the components of printed image data generation apparatus 10A shown in
Furthermore, in print processing apparatus 10B′, print controller 117 is replaced with print controller 125.
Other configurations in
Printed image data generator 120 calculates the position of the surface of three-dimensional shape model 104a on which the ink ejected from the nozzle lands, determines the pixels of binary images 110a to 110d which are two-dimensional images corresponding to the position of the surface, and generates the data of the printed image to be printed on the printing target object based on the information of the pixels by the same method as printed image data generator 113 in the first exemplary embodiment.
Furthermore, printed image data generator 120 according to the second exemplary embodiment calculates the distance between point O and intersection point P as shown in
Distance data storage 121 stores the distance data generated by printed image data generator 120.
Landing deviation data storage 122 stores the landing deviation data. The landing deviation data is data indicating how much the landing position of the ink ejected from the nozzle hole deviates from a target position due to the separation between the nozzle hole and the surface of workpiece 102 to be printed.
A circle in
The landing deviation data is data including information indicating the relationship between the deviation amount of the landing position and the distance as a table or an approximate expression for each nozzle. As described below, the deviation of the landing position is corrected using the landing deviation data.
Based on the distance data stored in distance data storage 121 and the landing deviation data stored in landing deviation data storage 122, landing deviation correction unit 123 generates a deviation correction image in which the positions of the pixels included in the printed image data stored in printed image data storage 114 are corrected.
For example, landing deviation correction unit 123 acquires, from the distance data, information of the distance between the nozzle hole and the surface of surface decoration shape model 108a at the time of printing each pixel included in the printed image data.
Subsequently, landing deviation correction unit 123 acquires, from the landing deviation data, information of the deviation amount of the landing position corresponding to the distance, corrects the printed image data, and generates the corrected image data in which the positions of the pixels are shifted in advance so that the ink lands at the target positions in actual printing. By using such corrected image data during printing, the printing apparatus can correct the deviation of the landing position of the ink due to the distance between inkjet head 101 and workpiece 102.
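By way of illustration, the per-pixel correction described above can be sketched as follows. This is a minimal sketch, not the disclosed implementation: the linear interpolation of the table, the pixel layout, and all function names are assumptions introduced here.

```python
def deviation_amount(distance, table):
    """Look up the landing-position deviation (in pixels) for a given
    nozzle-to-surface distance, interpolating linearly between the
    measured entries of a per-nozzle {distance: deviation} table.
    (Hypothetical helper; the disclosure only says the data may be a
    table or an approximate expression.)"""
    points = sorted(table.items())
    if distance <= points[0][0]:
        return points[0][1]
    for (d0, dev0), (d1, dev1) in zip(points, points[1:]):
        if distance <= d1:
            t = (distance - d0) / (d1 - d0)
            return dev0 + t * (dev1 - dev0)
    return points[-1][1]

def correct_pixel_positions(pixels, distance_data, table):
    """Shift each printed-image pixel opposite to its expected deviation,
    so that the ink lands on the target position during actual printing."""
    corrected = {}
    for (row, col), value in pixels.items():
        dev = round(deviation_amount(distance_data[(row, col)], table))
        # Pre-shift the pixel by -deviation along the scan direction.
        corrected[(row - dev, col)] = value
    return corrected
```

A pixel printed at a distance whose expected deviation is "2" would thus be stored two positions earlier in the corrected image data.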
Here, a difference between general deviation correction and deviation correction according to the second exemplary embodiment will be described.
The deviation amounts “1”, “2”, and “0” shown in
When inkjet head 101 prints a printed image having pixels “A0”, “A2”, “B1”, “C0”, and “C2” to which ink is to be ejected, landing deviation correction unit 123 generates a deviation correction image in which the positions of the pixels are corrected by “−1” for the nozzle that ejects ink in column A and “−2” for the nozzle that ejects ink in column B.
By using such a deviation correction image, the deviation due to the machining error of the nozzle is corrected, but in this method, it is difficult to correct the deviation of the landing position of the ink due to the distance between inkjet head 101 and workpiece 102 as described in
In this example, (a) of
Furthermore, (b) of
(c) of
Therefore, when inkjet head 101 prints a printed image having the pixels "A0", "A2", "B1", "C0", and "C2" to which ink is to be ejected, landing deviation correction unit 123 generates a deviation correction image in which the positions of the pixels are corrected by "−1" for pixel "A0", "−2" for pixel "A2", and "−1" for pixel "B1". Since the deviation amount of pixels "C0" and "C2" is "0", no correction is performed for these pixels.
By using such a deviation correction image, the printing apparatus can correct not only the deviation due to the machining error of the nozzle but also the deviation of the landing position of the ink due to the distance between inkjet head 101 and workpiece 102.
Returning to the description of
Using the corrected image data stored in corrected image data storage 124 and the information of the operation amount given to each axis of multi-axis robot 116 acquired from multi-axis controller 115, print controller 125 controls inkjet head 101 in accordance with the movement of multi-axis robot 116 to eject ink, thereby printing an image on the surface of workpiece 102.
Note that, in
Hereinafter, advantages of using different landing deviation data in the forward path and the backward path will be described.
As shown in
On the other hand, in a case of performing the reciprocating printing, the deviation of the landing position due to the ejection angle becomes a constant value ΔXa regardless of the printing direction. However, when ejection speed Vi of ink 205 differs for each nozzle, as shown in
For example, deviation amount ΔXv is proportional to distance H between inkjet head 101 and workpiece 102 and inversely proportional to ejection speed Vi.
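This proportionality follows from a simple ballistic model: while a droplet crosses gap H at ejection speed Vi, the head continues to move at its scan speed, so the droplet drifts by (scan speed) × H / Vi. The model and parameter names below are illustrative assumptions, not values from the disclosure.

```python
def drift_due_to_ejection_speed(head_speed, gap, ejection_speed):
    """Horizontal drift of a droplet under a simple ballistic model:
    the droplet takes gap / ejection_speed seconds to reach the surface,
    during which the head advances by head_speed * that time.
    Drift is proportional to the gap and inversely proportional to the
    ejection speed, as stated in the text."""
    return head_speed * gap / ejection_speed
```

Doubling the gap H doubles the drift; doubling the ejection speed Vi halves it.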
Furthermore,
Furthermore, the deviation of the landing position is also affected by an air flow, and a magnitude of the influence of the air flow on the deviation of the landing position may be different between the forward path and the backward path.
For this reason, landing deviation correction unit 123 may use different landing deviation data in the forward path and the backward path for one nozzle and generate the corrected image data for each nozzle.
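One way to organize such direction-dependent data is a lookup keyed by printing direction and nozzle; the data structure and values below are hypothetical, shown only to illustrate holding separate forward-path and backward-path tables for one nozzle.

```python
# Hypothetical landing deviation data: one table per printing direction
# and per nozzle, mapping a measured distance to a deviation amount.
landing_deviation_data = {
    "forward":  {"nozzle_1": {1.0: 0, 2.0: 1},  "nozzle_2": {1.0: 1,  2.0: 2}},
    "backward": {"nozzle_1": {1.0: 0, 2.0: -1}, "nozzle_2": {1.0: -1, 2.0: -2}},
}

def deviation_for(direction, nozzle, distance, data=landing_deviation_data):
    """Select the deviation table for the given pass direction and nozzle,
    then read the deviation at the given distance."""
    return data[direction][nozzle][distance]
```

The same nozzle at the same distance can then receive different corrections in the forward path and the backward path.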
Furthermore, model generator 103, trajectory data generator 105, surface decoration unit 107, RIP processing unit 109, data acquisition unit 112, multi-axis controller 115, printed image data generator 120, landing deviation correction unit 123, and print controller 125 shown in
Furthermore, shape model storage 104, trajectory data storage 106, surface decoration data storage 108, two-dimensional image data storage 110, nozzle position data storage 111, printed image data storage 114, distance data storage 121, landing deviation data storage 122, and corrected image data storage 124 are realized by a storage device such as a memory and a hard disk drive. These may be realized by one storage device or by a plurality of storage devices.
As described above, in the printed image data generation method of the second exemplary embodiment, in the landing deviation correction step, the printed image data is corrected based on the landing deviation data, which indicates the relationship between the deviation of the landing position of the ink and the distance between the nozzle and the position of the surface of three-dimensional shape model 104a on which the ink ejected from the nozzle lands, and the distance data, which indicates the distance between the nozzle and the position of the surface of the three-dimensional shape model.
In this printed image data generation method, by performing such processing, the deviation of the landing position of the ink can be easily corrected.
When inkjet head 101 moves on the free curved surface, pixels over which two or more nozzles pass may be generated. For example, in
However, the simple measure described above may cause a problem that the two-dimensional image on the three-dimensional object becomes thin. This is because, when inkjet head 101 moves on the free curved surface, a pixel over which no nozzle passes may occur. For example, in
The present exemplary embodiment provides a printed image data generation apparatus that copes with such a problem.
Similarly to the first exemplary embodiment, the printed image data generation apparatus prepares three-dimensional model data indicating a three-dimensional model generated from the three-dimensional object.
As in the first exemplary embodiment, the printed image data generation apparatus prepares binarized image data representing binarized image 130 generated from the two-dimensional image. Binarized image 130 is, for example, the same as binary image 110d of the first exemplary embodiment. Binarized image 130 includes the plurality of binarized image pixels (54 pixels in
The printed image data generation apparatus prepares the nozzle trajectory data indicating the plurality of trajectories in which the plurality of nozzles (#1 to #8 in
The printed image data generation apparatus generates printed image data 140 based on the binarized image data and the nozzle trajectory data. As shown in
Hereinafter, the algorithm for generating printed image data 140 will be described with reference to
The printed image data generation apparatus determines that nozzle #1 passes over binarized image pixel "C3" based on the trajectory of nozzle #1 and binarized image 130a. As shown in printed image data 140a, the printed image data generation apparatus assigns nozzle #1 to printed image pixel "C3" corresponding to binarized image pixel "C3". As shown in binarized image 130b, the printed image data generation apparatus changes the pixel value of binarized image pixel "C3" to "−1". A pixel value "−1" indicates that the pixel has a color and that a nozzle has already been assigned to it.
Similarly, the printed image data generation apparatus determines that nozzle #2 passes over binarized image pixel “D4” based on the trajectory of nozzle #2 and binarized image 130b. As shown in printed image 140b, the printed image data generation apparatus assigns nozzle #2 to printed image pixel “D4” corresponding to binarized image pixel “D4”. As shown in binarized image 130c, the printed image data generation apparatus changes the pixel value of binarized image pixel “D4” to “−1”.
Similarly, as shown in the printed images 140c and 140d, the printed image data generation apparatus assigns the nozzles #3 and #4 to the printed image pixels “E5” and “F6”, respectively.
The printed image data generation apparatus determines that nozzle #5 passes over binarized image pixel "D4" based on the trajectory of nozzle #5 and binarized image 130e. Binarized image pixel "D4" has the pixel value "−1". When a nozzle passes over a binarized image pixel having the pixel value "−1", the printed image data generation apparatus searches for a binarized image pixel having the pixel value "1" located within the predetermined distance from that binarized image pixel. In the example of
In
Similarly, the printed image data generation apparatus determines that nozzle #6 passes over binarized image pixel “E5” based on the trajectory of nozzle #6 and binarized image 130f. Binarized image pixel “E5” has the pixel value “−1”. The printed image data generation apparatus searches for a binarized image pixel having the pixel value “1” located within the predetermined distance from binarized image pixel “E5”, and determines binarized image pixel “E7”. As shown in printed image 140f, in response to the determination of binarized image pixel “E7”, the printed image data generation apparatus assigns nozzle #6 in addition to nozzle #3 to printed image pixel “E5” corresponding to binarized image pixel “E5”.
Similarly, as shown in printed images 140g and 140h, the printed image data generation apparatus assigns nozzles #7 and #8 to printed image pixels “F6” and “E7”, respectively.
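The assignment procedure walked through above can be sketched as follows. This is an illustrative reading of the algorithm, not the disclosed implementation: the representation of trajectories as (nozzle, pixel) pairs, the Chebyshev-distance neighborhood standing in for the "predetermined distance", and all names are assumptions.

```python
def assign_nozzles(binarized, trajectories, max_dist=2):
    """Assign nozzles to printed-image pixels.

    binarized: dict pixel -> 1 (has color) or 0 (no color); a "1" entry is
    changed to -1 once a nozzle has been assigned to that pixel.
    trajectories: (nozzle, pixel) pairs in processing order, where pixel is
    the binarized-image pixel that nozzle passes over.
    Returns a dict pixel -> list of nozzles assigned to that printed pixel.
    """
    printed = {}
    for nozzle, (r, c) in trajectories:
        value = binarized.get((r, c), 0)
        if value == 1:
            # First nozzle over a colored pixel: assign it, mark the pixel.
            printed.setdefault((r, c), []).append(nozzle)
            binarized[(r, c)] = -1
        elif value == -1:
            # Pixel already served: if a still-unassigned colored pixel lies
            # within the predetermined distance, eject here as well so the
            # image does not become thin on average.
            near = [(rr, cc) for (rr, cc), v in binarized.items()
                    if v == 1 and max(abs(rr - r), abs(cc - c)) <= max_dist]
            if near:
                printed.setdefault((r, c), []).append(nozzle)
    return printed
```

In the second branch the nearby colored pixel is only detected, not marked, which matches the walkthrough: pixel "E7" still has the value "1" when nozzle #8 later passes directly over it.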
In the third exemplary embodiment, the printed image data generation apparatus can prevent the two-dimensional image on the three-dimensional object from becoming thin on average. Specifically, in the example of
As described above, in the present exemplary embodiment, the printed image data generation apparatus determines that the first nozzle (for example, nozzle #2) passes over the first binarized image pixel (for example, D4) having the pixel value "1" indicating that the pixel has a color among the plurality of binarized image pixels. The printed image data generation apparatus assigns the first nozzle (for example, nozzle #2) to the first printed image pixel (for example, D4) corresponding to the first binarized image pixel (for example, D4). The printed image data generation apparatus determines that the second nozzle (for example, #5) different from the first nozzle (for example, nozzle #2) passes over the first binarized image pixel (for example, D4). The printed image data generation apparatus determines the second binarized image pixel (for example, B2) having the pixel value "1" indicating that it has a color and located within the predetermined distance from the first binarized image pixel (for example, D4). In response to determining the second binarized image pixel (for example, B2), the printed image data generation apparatus assigns the second nozzle (for example, #5) in addition to the first nozzle (for example, nozzle #2) to the first printed image pixel (for example, D4). As described above, according to the printed image data generation method, the printing method, and the printed image data generation apparatus of the present disclosure, it is possible to improve print quality when surface decoration is performed on the printing target object having a complicated shape.
For example, it is possible to perform the surface decoration even on a printing target object having a complicated surface shape for which, since satisfactory printing quality cannot be obtained by the conventional method, there has been no choice but to cope by sticking a seal or the like.
Note that, in the first and second exemplary embodiments, RIP processing unit 109 converts surface decoration image 108b into binary images 110a to 110d corresponding to the four colors of CMYK, respectively, and printed image data generators 113, 120 generate printed image data 113a including the binary data indicating whether or not to eject each ink of CMYK for each pixel based on binary images 110a to 110d.
However, the printed image data generation method, the printing method, and the printed image data generation apparatus of the present disclosure are not limited thereto.
For example, RIP processing unit 109 may convert surface decoration image 108b into gradation image data corresponding to each of the four colors of CMYK, in which a gradation corresponding to the ejection amount of ink is set for each pixel.
Then, printed image data generators 113, 120 may calculate the position of the surface of three-dimensional shape model 104a on which the ink ejected from the nozzle lands, determine the pixel of the gradation image, which is the two-dimensional image corresponding to the position of the surface, and generate the printed image data including the data of the ejection amount of each ink of CMYK based on the information of the pixel.
For example, 4-bit ("0" to "15") TIFF data may be used as the gradation image data and the printed image data. In this case, "0" represents an ink ejection amount of zero, and "1" to "15" represent ink ejection amounts that increase as these numerical values increase.
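The mapping from a 4-bit gradation value to an ejection amount could be as simple as the sketch below; the linear scaling and the maximum-amount parameter are assumptions for illustration, since the disclosure only states that the amount increases with the value.

```python
def ejection_amount(gradation_value, max_amount=15.0):
    """Map a 4-bit gradation value (0-15) to an ink ejection amount:
    "0" means no ejection, and "1" to "15" scale up to max_amount.
    Linear scaling is an illustrative assumption."""
    if not 0 <= gradation_value <= 15:
        raise ValueError("4-bit gradation value must be in 0..15")
    return max_amount * gradation_value / 15.0
```

With max_amount left at its default, the gradation value itself is the ejection amount, and any monotonically increasing mapping would equally satisfy the description above.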
According to the printed image data generation method, the printing method, and the printed image data generation apparatus of the present disclosure, an image can be easily printed even when the surface of the printing target object is the free curved surface.
The printed image data generation method, the printing method, and the printed image data generation apparatus of the present disclosure can be applied to a printing apparatus that directly prints an image on various printing target objects having the three-dimensional shape.
Number | Date | Country | Kind |
---|---|---|---|
2022-162393 | Oct 2022 | JP | national |
2023-145201 | Sep 2023 | JP | national |