The present application claims priority under 35 U.S.C. § 119 to Japanese Patent Application No. 2020-165830, filed on Sep. 30, 2020, and Japanese Patent Application No. 2021-094784, filed on Jun. 4, 2021, the contents of which are incorporated herein by reference in their entirety.
The present invention relates to an image forming device, an information processing device, a computer-readable medium, and an image forming method.
Conventionally, an inspection device is known in which an inspection object is illuminated and is then inspected using an imaging camera. For example, in Japanese Translation of PCT International Application Publication No. JP-T-2002-501265, a technique is disclosed in which images taken with an imaging camera are used, and the targets captured in the images are identified.
In recent years, a service is known that enables a user to upload an image via the Internet and print the uploaded image on a specific print target. Then, an inspection device treats the specific print target, on which the uploaded image is printed, as the inspection object and inspects it.
With the conventional service, the user can upload a favorite image (design); however, if that image violates predetermined print guidelines that represent the required conditions for image printing, printing failure is likely to occur.
The print guidelines represent the conditions that should be satisfied or that cannot be violated by the source data for printing, and are defined separately for each print target or each service. For example, if the print object is circular in shape and if a circular design having a similar size to the print object is present in the image for printing, then even a slight misalignment in the print position leads to printing failure. In that case, it is possible to think of setting a print guideline indicating that a circular design having a similar size to the print object cannot be included. Moreover, considering the performance of a printing device, it is possible to think of setting print guidelines regarding fine patterns that are difficult to express or designs having specific spatial frequencies.
Moreover, when printing is to be performed on some part of the print object, sometimes it is necessary to have a larger image (design) than the print range so as to ensure that there is no printing failure due to misalignment in the print position. In that case, it is possible to think of setting a print guideline indicating that the image (design) needs to be larger than the print area.
An inspection device compares the uploaded image with the image printed on the print target. If the uploaded image is different from the printed image or if an edge of the image is off the print range due to misalignment in the print position, then the inspection device treats that inspection object as a defective article.
According to an aspect of the present invention, an image forming device includes a receiving unit and a determining unit. The receiving unit is configured to receive an image to be printed. The determining unit is configured to determine whether the image received by the receiving unit complies with a condition predetermined to be satisfied for printing.
The accompanying drawings are intended to depict exemplary embodiments of the present invention and should not be interpreted to limit the scope thereof. Identical or similar reference numerals designate identical or similar components throughout the various drawings.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In describing preferred embodiments illustrated in the drawings, specific terminology may be employed for the sake of clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected, and it is to be understood that each specific element includes all technical equivalents that have the same function, operate in a similar manner, and achieve a similar result.
An embodiment of the present invention will be described in detail below with reference to the drawings.
An embodiment has an object to reduce the occurrence of printing failure by reducing the instances of printing images that violate the print guidelines representing the required conditions for image printing.
An exemplary embodiment of an image forming device, an information processing device, a computer program product, and an image forming method is described below in detail with reference to the accompanying drawings.
As illustrated in
In the inspection device 100, an inspection object 300 representing the print quality inspection target is illuminated and is then inspected using an imaging device 102.
On the surface of the inspection object 300, various designs matching the user's preference are printed using a printing device such as an inkjet printer. The inspection device 100 inspects the print quality of the design images printed on the inspection object 300.
The inspection object 300 is not limited to having a disc shape, and can alternatively be rectangular in shape. Possible examples of the inspection object 300 include a tin badge, a smartphone cover, and an anti-drop grip attached to a smartphone.
The main body frame 101 includes the following components in the given order from top to bottom in the vertical direction: the imaging device 102; a first ring illumination device 103 representing a first illumination device; a second ring illumination device 104 representing a second illumination device; and a backlight illumination device 105 representing a third illumination device.
In addition, the main body frame 101 includes a slide stage 106 in between the second ring illumination device 104 and the backlight illumination device 105.
The imaging device 102 is a camera in which, for example, a complementary metal oxide semiconductor (CMOS) sensor is used, and which outputs signals according to the incident light. Alternatively, the imaging device 102 can be a camera in which a charge coupled device (CCD) is used.
The first ring illumination device 103 includes a ring-shaped main body 103a representing a first main body. Because of the ring-shaped structure of the main body 103a, the imaging device 102 becomes able to take images of the inspection object 300. In the first ring illumination device 103, on the undersurface of the ring-shaped main body 103a, a number of LEDs 103b representing first-type light sources are installed. The LEDs 103b emit light from above (in the vertical direction) onto the inspection object 300 that is positioned at the imaging position for the imaging device 102.
As illustrated in
Meanwhile, although the first ring illumination device 103 includes the ring-shaped main body 103a, that is not the only possible case. Alternatively, for example, the first ring illumination device 103 can have a plurality of fragmented illumination lamps arranged in a ring, or can have a plurality of (for example, four) compact illumination lamps arranged on the left, right, top, and bottom of a circle in the planar view.
The second ring illumination device 104 includes a ring-shaped main body 104a representing a second main body. Because of the ring-shaped structure of the main body 104a, the imaging device 102 becomes able to take images of the inspection object 300. In the second ring illumination device 104, on the inner wall of the ring-shaped main body 104a, a number of LEDs 104b representing light sources are installed. The inner wall of the ring-shaped main body 104a has a tapered shape, with the opening thereof becoming wider from top toward bottom. The LEDs 104b emit light onto the inspection object 300, which is positioned at the imaging position for the imaging device 102, from an oblique direction (for example, at an angle of 30°) with respect to the vertical direction.
As illustrated in
Meanwhile, although the second ring illumination device 104 includes the ring-shaped main body 104a, that is not the only possible case. Alternatively, for example, the second ring illumination device 104 can have a plurality of fragmented illumination lamps arranged in a ring, or can have a plurality of (for example, four) compact illumination lamps arranged on the left, right, top, and bottom of a circle in the planar view.
On the flat surface 301 of the inspection object 300, the light emitted from the backlight illumination device 105 turns into diffused reflection light with respect to the imaging device 102. Moreover, on the edge region surface 302 of the inspection object 300, the light emitted from the backlight illumination device 105 turns into specular reflection light and diffused reflection light with respect to the imaging device 102.
In the present embodiment, the first ring illumination device 103 is made to function as an illumination device for enabling imaging of the edge region surface 302 of the inspection object 300. The second ring illumination device 104 is made to function as an illumination device for enabling imaging of the flat surface 301 of the inspection object 300 (i.e., imaging of the inside of the edge region surface 302). The backlight illumination device 105 is made to function as an illumination device for enabling recognition of the area covered by the inspection object 300.
In the present embodiment, a taken image that is taken by the imaging device 102 when the illumination is provided by the first ring illumination device 103 is synthesized with a taken image that is taken by the imaging device 102 when the illumination is provided by the second ring illumination device 104, and a whole-area image of the inspection object 300 is obtained.
Given below is the explanation about the slide stage 106. The slide stage 106 is a guiding member for guiding the inspection object 300 to the imaging position for the imaging device 102.
As illustrated in
As illustrated in
The movements of the slide stage 106 are mitigated by absorbers 106d that are disposed on the right and left.
In the present embodiment, the slide stage 106 of the sliding type is used as the guiding member for guiding the inspection object 300 to the imaging position for the imaging device 102. However, that is not the only possible case. Alternatively, for example, a revolver-type guiding member can be used that revolves in order to guide the inspection object 300 to the imaging position for the imaging device 102.
Given below is the explanation about the electrical connections established in the inspection system 1.
The first ring illumination device 103, the second ring illumination device 104, and the backlight illumination device 105 are connected to the information processing device 200 via the I/O power box 107 and the light source power box 108. The first ring illumination device 103, the second ring illumination device 104, and the backlight illumination device 105 are subjected to LED illumination lighting control and LED illumination lighting power control by the information processing device 200.
The imaging device 102 is directly connected to the information processing device 200, and is controlled by the information processing device 200.
The sensors 106b and 106c in the slide stage 106 are connected to the information processing device 200 via the I/O power box 107. The signal detection regarding the sensors 106b and 106c of the slide stage 106 is performed by the information processing device 200.
Given below is the explanation of a hardware configuration of the information processing device 200.
As illustrated in
Of those constituent elements, the CPU 501 controls the operations of the entire information processing device 200. The ROM 502 is used to store computer programs, such as the IPL, used in the driving of the CPU 501. The RAM 503 is used as the work area of the CPU 501. The HD 504 is used to store a variety of data such as computer programs. The HDD controller 505 controls reading of a variety of data from and writing of a variety of data into the HD 504 under the control of the CPU 501. The display 506 displays a variety of information such as a cursor, menus, windows, characters, and images. The external device connection I/F 508 is an interface for establishing connection with various external devices. Examples of the external devices include a universal serial bus (USB) memory and a printer. The network I/F 509 is an interface for performing data communication using a communication network. The bus line 510 is an address bus or a data bus meant for electrically connecting the constituent elements, such as the CPU 501, illustrated in
The keyboard 511 is a type of input unit that includes a plurality of keys for inputting characters, numerical values, and various instructions. The pointing device 512 is a type of input unit for selecting or executing various instructions, selecting the processing target, and moving the cursor. The DVD-RW drive 514 controls the reading of a variety of data from and writing of a variety of data into a DVD-RW 513 representing an example of a detachably-attachable recording medium. Meanwhile, instead of using a DVD-RW, a DVD-R can also be used. The media I/F 516 controls the reading of a variety of data from and writing (storing) of a variety of data into a recording medium 515 such as a flash memory.
The computer programs executed in the information processing device 200 according to the present embodiment are recorded as installable files or executable files in a computer-readable recording medium such as a compact disk read only memory (CD-ROM), a flexible disk (FD), a compact disk recordable (CD-R), or a digital versatile disk (DVD).
Alternatively, the computer programs executed in the information processing device 200 according to the present embodiment can be stored in a downloadable manner in a computer connected to a network such as the Internet. Still alternatively, the computer programs executed in the information processing device 200 according to the present embodiment can be distributed via a network such as the Internet.
From among a variety of arithmetic processing performed when the CPU 501 of the information processing device 200 executes computer programs, the following explanation is given about the characteristic operations according to the embodiment.
The first image obtaining unit 201 obtains, from a taken image that is taken by the imaging device 102 when the light is emitted by the first ring illumination device 103 (i.e., from a diffused-reflection illumination image), an image of the edge region surface 302 of the inspection object 300 as a first target image for inspection of the inspection object.
The second image obtaining unit 202 obtains, from a taken image that is taken by the imaging device 102 when the light is emitted by the second ring illumination device 104 (i.e., from a specular-reflection illumination image), an image of the flat surface 301 excluding the edge region surface 302 of the inspection object 300 as a second target image for inspection of the inspection object.
The third image obtaining unit 203 obtains, from a taken image that is taken by the imaging device 102 when the light is emitted by the backlight illumination device 105 (i.e., from a silhouette image), an image indicating the area covered by the inspection object 300.
The image synthesizing unit 204 synthesizes the following: the image of the edge region surface 302 of the inspection object 300, the image of the flat surface 301 of the inspection object 300, and the image of the area covered by the inspection object 300; and treats that synthetic image as the target image for inspection. Alternatively, the image synthesizing unit 204 can synthesize the image of the edge region surface 302 of the inspection object 300 and the image of the flat surface 301 of the inspection object 300; and treat that synthetic image as the target image for inspection.
The luminance setting unit 205 performs luminance setting of the first ring illumination device 103 and the second ring illumination device 104.
The identifying unit 206 analyzes a master image (a reference image) regarding the inspection object 300, and identifies the presence or absence of a predetermined pattern. The predetermined pattern is, for example, a pattern similar to the shape of the inspection object 300. Alternatively, the predetermined pattern is, for example, a pattern having point symmetry or line symmetry.
According to the identification result by the identifying unit 206 regarding the identification of the master image, the threshold value switching unit 207 changes the threshold value to be applied when the inspecting unit 208 compares the master image (the reference image) regarding the inspection object 300 with the target image for inspection. The threshold value represents an example of the broader concept such as the determination criterion or the determination condition.
The inspecting unit 208 performs inspection by comparing the master image (the reference image) regarding the inspection object 300 with the target image for inspection. More specifically, in the case of inspecting the print quality of the target image for inspection, the inspecting unit 208 uses a threshold value for determining the off-center of the target image for inspection. If that predetermined threshold value is not satisfied, then the inspecting unit 208 determines that the target image for inspection is defective on account of off-center.
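By way of a non-limiting illustration, the following Python sketch shows one way such an off-center check could be expressed; the function name, the pixel-based threshold, and the example coordinates are assumptions introduced here for explanation and are not the actual implementation of the inspecting unit 208.

```python
import numpy as np

def is_off_center_defective(master_center, target_center, threshold_px):
    """Return True when the center of the target image for inspection deviates
    from the center of the master image by more than the allowed threshold."""
    dx = target_center[0] - master_center[0]
    dy = target_center[1] - master_center[1]
    offset = np.hypot(dx, dy)  # Euclidean distance between the two centers
    return offset > threshold_px

# Example: a shift of about 2.8 pixels against a 5-pixel allowance is accepted.
print(is_off_center_defective((512, 512), (514, 510), threshold_px=5))  # False
```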
Given below is the explanation of the flow of the characteristic operations from among the operations performed by the information processing device 200 according to the present embodiment.
Firstly, the explanation is given about a luminance setting operation performed by the CPU 501 of the information processing device 200 regarding two illumination devices, namely, the first ring illumination device 103 and the second ring illumination device 104.
In the luminance setting operation, the inspection object 300 in which the flat surface 301 and the edge region surface 302 have the same color (for example, white) is treated as the target.
As illustrated in
Subsequently, the luminance setting unit 205 controls the second ring illumination device 104 and the imaging device 102 (Step S3), and obtains the signal level (level B) of a taken image of the flat surface 301 that is taken when the inspection object 300 is captured under the illumination by the second ring illumination device 104 (Step S4).
Then, the luminance setting unit 205 sets the luminance of the first ring illumination device 103 and the second ring illumination device 104 in such a way that the level A obtained at Step S2 becomes equal to the level B obtained at Step S4 (Step S5). For example, the luminance setting unit 205 calculates a correction coefficient (level A/level B) of the taken-image signals in such a way that the level A obtained at Step S2 becomes equal to the level B obtained at Step S4. At the time of the actual inspection, the correction coefficient is used in correcting the taken image obtained under the illumination of the second ring illumination device 104.
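A minimal Python sketch of this correction is given below, assuming the signal levels are measured as mean pixel values of the respective taken images; all names are illustrative assumptions rather than the actual implementation of Steps S1 to S5.

```python
import numpy as np

def luminance_correction_coefficient(level_a, level_b):
    """Correction coefficient (level A / level B) so that images taken under
    the second ring illumination match the brightness of the first one."""
    return level_a / level_b

def correct_ring2_image(image_ring2, coefficient):
    """Scale a specular-reflection illumination image by the coefficient."""
    corrected = image_ring2.astype(np.float32) * coefficient
    return np.clip(corrected, 0, 255).astype(np.uint8)

# Example: level A = 180 and level B = 150 give a correction coefficient of 1.2.
coeff = luminance_correction_coefficient(180.0, 150.0)
```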
In the present embodiment, the explanation is given about the luminance setting operation regarding two illumination devices, namely, the first ring illumination device 103 and the second ring illumination device 104. However, that is not the only possible case. That is, it is also possible to adjust the focus of the imaging device 102.
Given below is the explanation of an image synthesis operation performed by the CPU 501 of the information processing device 200.
As illustrated in
More specifically, at Step S12, the third image obtaining unit 203 measures the setting position (x, y) of the inspection object 300; calculates the circular size of the inspection object 300; finalizes the background image region of the inspection object 300; and determines the placement defect of the inspection object 300.
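A rough sketch of such a silhouette-based measurement is given below, assuming an OpenCV pipeline in Python in which the backlit inspection object appears dark on a bright field; the thresholding method, the placement-defect criterion, and all names are illustrative assumptions, not the actual implementation of Step S12.

```python
import cv2
import numpy as np

def measure_object(silhouette_gray):
    """Return the (x, y) center, the radius, and a placement-defect flag
    estimated from a grayscale silhouette image of the inspection object."""
    # The object is dark against the backlight, so invert while thresholding.
    _, mask = cv2.threshold(silhouette_gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, None, True          # nothing detected: placement defect
    largest = max(contours, key=cv2.contourArea)
    (x, y), radius = cv2.minEnclosingCircle(largest)
    # Assumed placement check: the whole circle must lie inside the frame.
    h, w = silhouette_gray.shape
    defect = (x - radius < 0 or y - radius < 0 or
              x + radius > w or y + radius > h)
    return (x, y), radius, defect
```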
Subsequently, the first image obtaining unit 201 controls the first ring illumination device 103 and the imaging device 102 (Step S13) and, from the taken images of the inspection object 300 that are taken under the illumination of the first ring illumination device 103 (i.e., from the diffused-reflection illumination images), obtains the image of the edge region surface 302 of the inspection object 300 (Step S14).
More specifically, at Step S13, the first image obtaining unit 201 obtains four taken images (diffused-reflection illumination images) of the inspection object 300. Then, at Step S14, the first image obtaining unit 201 performs image averaging based on the four taken images (diffused-reflection illumination images), and then records the image of the edge region surface 302 of the inspection object 300.
Then, the second image obtaining unit 202 controls the second ring illumination device 104 and the imaging device 102 (Step S15) and, from the taken images of the inspection object 300 that are taken under the illumination of the second ring illumination device 104 (i.e., from the specular-reflection illumination images), obtains the image of the flat surface 301 of the inspection object 300 (Step S16).
More specifically, at Step S15, the second image obtaining unit 202 obtains four taken images (specular-reflection illumination images) of the inspection object 300. Then, at Step S16, the second image obtaining unit 202 performs image averaging based on the four taken images (specular-reflection illumination images) and records the image of the flat surface 301 of the inspection object 300.
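The image averaging at Steps S14 and S16 could, for example, be expressed as in the following sketch, assuming the four captures are same-sized uint8 arrays; the function name is illustrative.

```python
import numpy as np

def average_captures(captures):
    """Average the four captured frames to suppress sensor noise before
    recording the resulting image."""
    stack = np.stack([c.astype(np.float32) for c in captures], axis=0)
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)
```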
Meanwhile, the operations from Step S13 to Step S16 are not limited to being performed in the order explained above in the flowchart. Alternatively, for example, the second image obtaining unit 202 can control the second ring illumination device 104 and the imaging device 102 (Step S15) and obtain the image of the flat surface 301 of the inspection object 300 (Step S16); and then the first image obtaining unit 201 can control the first ring illumination device 103 and the imaging device 102 (Step S13) and obtain the image of the edge region surface 302 of the inspection object 300 (Step S14).
Lastly, the image synthesizing unit 204 synthesizes the following: the area covered by the inspection object 300 as recognized at Step S12, the image of the edge region surface 302 of the inspection object 300 as obtained at Step S14, and the image of the flat surface 301 of the inspection object 300 as obtained at Step S16 (Step S17).
In the image synthesis operation performed at Step S17, the image synthesizing unit 204 performs the following operations: performs pattern matching of the axis centers of the x and y axes; obtains the rotation angle; performs affine conversion of the x-axis center and the y-axis center; and performs resampling. Moreover, in the image synthesis operation performed at Step S17, the image synthesizing unit 204 also performs a gradation correction operation for adding gradation correction to the synthetic image.
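The following Python sketch illustrates, under stated assumptions, how such an alignment-and-synthesis step could look once the rotation angle and the center have been estimated by pattern matching; the blending mask, the gamma value, and the use of three-channel color images are assumptions introduced for explanation and are not the actual implementation of Step S17.

```python
import cv2
import numpy as np

def synthesize(edge_img, flat_img, flat_center, angle_deg, edge_mask, gamma=1.0):
    """Rotate/resample the flat-surface image onto the edge-region image and
    apply a simple gradation (gamma) correction to the composite.
    edge_img and flat_img are assumed to be same-sized uint8 color images."""
    m = cv2.getRotationMatrix2D(flat_center, angle_deg, 1.0)  # affine matrix
    h, w = edge_img.shape[:2]
    aligned_flat = cv2.warpAffine(flat_img, m, (w, h),
                                  flags=cv2.INTER_LINEAR)     # resampling
    # Take edge-region pixels where the mask is set, flat-surface pixels elsewhere.
    composite = np.where(edge_mask[..., None] > 0, edge_img, aligned_flat)
    # Gradation correction via a gamma lookup table.
    lut = ((np.arange(256) / 255.0) ** gamma * 255).astype(np.uint8)
    return cv2.LUT(composite, lut)
```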
The synthetic image that is generated in the manner explained above is treated as the target image for inspection of the inspection object 300 and is compared with the master image for the purpose of, for example, print quality inspection.
Given below is the explanation of an image type identification operation performed by the CPU 501 of the information processing device 200.
As illustrated in
Then, the identifying unit 206 performs the averaging operation as the preprocessing for printer emulation (Step S23).
Then, the identifying unit 206 performs the image type identification operation for identifying the image type of the image of the inspection object 300 (Step S24), and generates a master image (Step S25). The reason for which the image type identification operation at Step S24 is required is explained below.
When the print quality of the target image for inspection regarding the inspection object 300 is visually determined by an examiner, the determination threshold value regarding the off-center of the image differs according to the outer periphery/inner periphery of the pictorial pattern. The off-center of an image indicates that the image has shifted from the center. Particularly, regarding a pictorial pattern having circles of the same shape as the disc-shaped inspection object 300, the examiner happens to strictly judge the off-center of the image. For example,
The type Type 0 represents the “normal” type in which a picture is printed over the entire face.
The types from Type 1 to Type 3 differ in the way that, according to the state of the outer periphery/inner periphery of the pictorial pattern, a different determination threshold value is used for inspecting the print quality of the target image for inspection.
The type Type 1 is of the “circle” type in which a circle is present close to the outer periphery and a solid area is present. As illustrated in
The type Type 2 is of the “circle and picture” type in which a circle is present close to the outer periphery but a solid area is absent. As illustrated in
The type Type 3 is of the “circle in picture” type in which a circle is present in a picture printed over the entire face. As illustrated in
In the case of visually determining the off-center as the print quality of the target image for inspection, the examiner compares the widths of the outer-periphery color of the target image for inspection at line-symmetric or point-symmetric positions (the horizontal positions, the vertical positions, or the diagonal positions), and determines whether those widths are identical.
In that regard, in the present embodiment, the pictorial pattern of the inspection object 300 is identified, and the off-center inspection is performed according to the method suitable to the pattern type (normal/circle/circle and picture/circle in picture).
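As one possible illustration of how a circle close to the outer periphery might be detected when classifying the pictorial pattern, the following Python sketch uses a Hough circle transform; the radius ratios and detector parameters are assumptions, and the embodiment does not prescribe this particular method.

```python
import cv2

def has_peripheral_circle(master_gray, object_radius_px):
    """Return True if a circle of roughly the inspection object's size is
    found in the grayscale master image."""
    blurred = cv2.medianBlur(master_gray, 5)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=object_radius_px,
                               param1=100, param2=50,
                               minRadius=int(object_radius_px * 0.8),
                               maxRadius=int(object_radius_px * 1.1))
    return circles is not None
```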
Given below is the explanation about the inspection operation based on image comparison as performed by the CPU 501 of the information processing device 200.
As illustrated in
Then, the inspecting unit 208 compares the master image with the target image for inspection, and performs inspection for detecting off-center (Step S33). Herein, according to the identification result by the identifying unit 206 regarding the image type of the master image, the threshold value switching unit 207 changes the threshold value to be applied at the time of comparing the master image regarding the inspection object 300 and the target image for inspection.
More specifically, according to the equation given below, the threshold value switching unit 207 obtains a threshold value (Th) for each image type from a threshold value (ThNormal) for misalignment allowance as specified in advance by the user.
Th = CType × ThNormal
The threshold value switching unit 207 calculates a coefficient CType according to the image type of the master image as identified by the identifying unit 206.
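A minimal sketch of this threshold switching is shown below; the specific coefficient values assigned to each image type are illustrative assumptions, since the embodiment does not specify them.

```python
# Assumed coefficients CType per image type (illustrative values only).
TYPE_COEFFICIENTS = {
    "normal": 1.0,              # Type 0: picture printed over the entire face
    "circle": 0.5,              # Type 1: circle near outer periphery + solid area
    "circle_and_picture": 0.6,  # Type 2: circle near outer periphery, no solid area
    "circle_in_picture": 0.8,   # Type 3: circle inside a full-face picture
}

def switch_threshold(th_normal, image_type):
    """Th = CType x ThNormal, with CType chosen according to the image type."""
    return TYPE_COEFFICIENTS[image_type] * th_normal

print(switch_threshold(5.0, "circle"))  # 2.5
```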
Subsequently, the inspecting unit 208 draws the inspection result indicating the detected off-center (Step S34).
In the present embodiment, the outer shape (edge) of the inspection object 300, which is information detected from the master image, is displayed so as to overlap the outer periphery (edge) of the target image for inspection shifted by the off-center, which is information obtained from the result of comparison between the master image and the target image for inspection. However, that is not the only possible case. Alternatively, the center of the outer shape of the inspection object 300 detected from the master image can be displayed so as to overlap the center of the outer periphery of the target image for inspection shifted by the off-center, which is obtained from the result of comparison between the master image and the target image for inspection.
Displaying both pieces of information in an overlapping manner illustrates the off-center result clearly, thereby making it possible to present the misalignment during printing in an easy-to-understand manner.
Lastly, based on the inspection result indicating the detected off-center, the inspecting unit 208 displays, in the display 506, whether or not the misalignment of the target image for inspection is allowed (Step S35). That marks the end of the operations.
In this way, according to the present embodiment, as a result of analyzing the master image (reference image) to be printed, the presence or absence of a predetermined pattern (a pattern (for example, a circular pattern) similar to the shape of the inspection object 300 (for example, a circular medium)) is identified and, according to the identification result, a change is made in the threshold value to be used by the inspecting unit 208 for comparing the master image (reference image) and the target image for inspection. As a result, the sensitivity of the misalignment inspection at the time of printing is appropriately changed by taking into account the impact of the print pattern of the inspection object 300. Hence, even when there are a number of design variations of the inspection object, the inspection can be performed in an efficient manner while reducing the burden on the examiner.
According to the present embodiment, the explanation is given about an example of printing a pattern similar to the circular inspection object 300 (for example, printing a pictorial pattern having a circular pattern). However, that is not the only possible case. Alternatively, it is possible to print a pattern similar to a polygonal inspection object 300 (for example, a pictorial pattern having a polygonal pattern). Moreover, the pictorial pattern need not be similar to the shape of the inspection object 300. For example, it is also possible to have a case in which a pictorial pattern having a polygonal pattern is printed on a circular inspection object 300. Furthermore, even in the case in which the pictorial pattern is bilaterally symmetrical/vertically symmetrical, the same operations can be performed.
In addition, the inspecting unit 208 analyzes whether or not the submitted design (RIP) complies with print guidelines that are set in advance and represent the required conditions for image printing, and makes notification of the analysis result. The print guidelines represent the conditions that should be satisfied or that cannot be violated by the source data for printing, and are defined separately for each print target or each service. For example, as the print guidelines, it is possible to think of the following guidelines: the design should be larger than the outer periphery of the print area, and the color and the design cannot be changed around the outer periphery of the print area.
Thus, regarding the submitted design (RIP) that is submitted by the user, the inspecting unit 208 determines whether or not the print guidelines are followed. If the print guidelines are not followed, then the inspecting unit 208 either notifies the user of that fact or proposes a corrected design that follows the print guidelines. As a result, it becomes possible to reduce the occurrence of printing failure by reducing the instances of printing images that violate the print guidelines.
Meanwhile, when the submitted design (RIP) is either the same size as or smaller than the print area, off-center is likely to occur. Hence, it becomes necessary to notify the user to create a pictorial pattern that is larger than the print area.
The image receiving unit 2081 functions as a receiving unit. The image receiving unit 2081 receives an image representing the submitted design (RIP) that is submitted by the user, and receives the selection result of metainformation (explained later). Then, the image receiving unit 2081 outputs the received image to the compliance determining unit 2082, the image correcting unit 2083, and the notifying unit 2084.
The compliance determining unit 2082 functions as a determining unit. The compliance determining unit 2082 receives input of the image that is output from the image receiving unit 2081. Then, the compliance determining unit 2082 determines whether the image received by the image receiving unit 2081 complies with the print guidelines. Subsequently, the compliance determining unit 2082 outputs the determination result to the image correcting unit 2083 and the notifying unit 2084.
For example, the compliance determining unit 2082 compares the size of the submitted design (image) with the size specified in the print guidelines, and determines whether or not the submitted design (image) complies with the print guidelines (a first compliance determination method).
For example, depending on whether or not the color is constant at the edge of the print range of the inspection object 300, the compliance determining unit 2082 determines whether or not the submitted design (image) complies with the print guidelines (a second compliance determination method).
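The following Python sketch illustrates how the first and second compliance determination methods could be expressed; the guideline size, the border width, and the color-constancy tolerance are assumptions introduced for explanation, not values defined by the embodiment.

```python
import numpy as np

def complies_with_size(image, min_width_px, min_height_px):
    """First method: the design must be at least as large as the guideline size."""
    h, w = image.shape[:2]
    return w >= min_width_px and h >= min_height_px

def complies_with_edge_color(image, border_px=10, tolerance=12.0):
    """Second method: the color along the edge of the print range must be
    (approximately) constant. image is assumed to be an (H, W, 3) array."""
    border = np.concatenate([
        image[:border_px].reshape(-1, image.shape[-1]),
        image[-border_px:].reshape(-1, image.shape[-1]),
        image[:, :border_px].reshape(-1, image.shape[-1]),
        image[:, -border_px:].reshape(-1, image.shape[-1]),
    ])
    # The per-channel standard deviation over the border must stay small.
    return float(border.std(axis=0).max()) <= tolerance
```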
Meanwhile, when the outer shape of the print target (the inspection target) is similar to the design of the printed image, as illustrated in the types from Type 1 to Type 3 in
Moreover, for example, the compliance determining unit 2082 performs determination using machine learning (a convolutional neural network (CNN)) that is trained to distinguish between an image complying with the print guidelines and an image not complying with the print guidelines (a third compliance determination method).
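As a non-authoritative illustration of the third method, the sketch below defines a small convolutional classifier in PyTorch; the architecture, input size, and class labels are assumptions, and the embodiment does not specify the network or its training data.

```python
import torch
import torch.nn as nn

class GuidelineClassifier(nn.Module):
    """Toy CNN that labels an image as compliant (0) or noncompliant (1)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = GuidelineClassifier().eval()
with torch.no_grad():
    logits = model(torch.rand(1, 3, 224, 224))      # dummy RGB input
    is_compliant = logits.argmax(dim=1).item() == 0
```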
The image correcting unit 2083 functions as a correcting unit. The image correcting unit 2083 receives input of the image output from the image receiving unit 2081 and receives input of the determination result output from the compliance determining unit 2082. If the compliance determining unit 2082 determines noncompliance, then the image correcting unit 2083 corrects the image received by the image receiving unit 2081. Then, the image correcting unit 2083 outputs the correction result to the notifying unit 2084.
For example, the image correcting unit 2083 performs variable magnification of the size of the submitted design (image), which is submitted by the user, so that the design (image) complies with the print guidelines (a first image correction method).
Alternatively, for example, the image correcting unit 2083 corrects the color of some part of the submitted design (image), which is input from the user, so that the design (image) complies with the print guidelines (a second image correction method).
Still alternatively, for example, the image correcting unit 2083 performs correction using machine learning (a convolutional neural network (CNN)) trained to convert an image not complying with the print guidelines to comply with the print guidelines, or performs correction using a generative adversarial network (a third image correction method).
Still alternatively, for example, the image correcting unit 2083 translates, partially or entirely, the submitted design (image), which is submitted by the user, to comply with the print guidelines (a fourth image correction method).
For example, the criteria for detecting a printing mismatch sometimes change depending on whether or not the design (image) is in the middle of the image. In that case, it becomes difficult to maintain a uniform inspection quality, or the image inspection operation becomes complex. In that regard, for example, placing the design (image) at the center of the image can be set as a condition in the print guidelines. In that case, the image correcting unit 2083 performs correction by translating the image, partially or entirely, so that the design is placed in the middle of the image in compliance with the print guidelines.
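A rough sketch of the first and fourth correction methods under these assumptions is given below; the guideline size, the design mask, and the centroid-based shift are illustrative and not the actual implementation of the image correcting unit 2083.

```python
import cv2
import numpy as np

def scale_to_guideline(image, min_width_px, min_height_px):
    """First method: enlarge the design so it meets the assumed guideline size."""
    h, w = image.shape[:2]
    factor = max(min_width_px / w, min_height_px / h, 1.0)
    return cv2.resize(image, (int(w * factor), int(h * factor)),
                      interpolation=cv2.INTER_CUBIC)

def translate_design_to_center(image, design_mask):
    """Fourth method: shift the design so its centroid coincides with the
    image center. design_mask is assumed to be a non-empty binary mask."""
    ys, xs = np.nonzero(design_mask)
    cy, cx = ys.mean(), xs.mean()
    h, w = image.shape[:2]
    m = np.float32([[1, 0, w / 2 - cx], [0, 1, h / 2 - cy]])
    return cv2.warpAffine(image, m, (w, h), borderMode=cv2.BORDER_REPLICATE)
```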
The notifying unit 2084 functions as a notifying unit. The notifying unit 2084 receives input of the following: the image output from the image receiving unit 2081, the determination result output from the compliance determining unit 2082, and the correction result output from the image correcting unit 2083. Then, the notifying unit 2084 notifies the user about the determination result by the compliance determining unit 2082 or about the correction result by the image correcting unit 2083. Moreover, the notifying unit 2084 notifies the user about the metainformation that enables the user to make selection.
Given below is an example of the notification made by the notifying unit 2084.
Moreover, when an image does not comply with the print guidelines, the notifying unit 2084 displays “not complying with guidelines” in the display 506.
Alternatively, when an image does not comply with the print guidelines, the notifying unit 2084 can output the fact that the submitted design (image), which is submitted by the user, is “not complying with the guidelines” in a result file (a log file).
The print guidelines (the center of the print area) indicate the center of the print area. Thus, when the center of the pictorial pattern of the inspection object 300 as detected from the master image is not in alignment with the print guidelines (the center of the print area), the examiner understands that the pictorial pattern of the inspection object 300 is misaligned and not appropriate. Moreover, even if a design is originally not in alignment with the center, when the center of the print area is indicated, the examiner becomes able to correctly determine, without being affected by the misalignment of the design, that either the pictorial pattern of the inspection object 300 is misaligned and not appropriate or the pictorial pattern of the inspection object 300 is not appropriate.
According to the present embodiment, as a result of illustrating the print guidelines in an overlapping manner on the master image (reference data), the problems in the master image (reference data) (for example, the print area being small and not appropriate, or the center of the print area being misaligned) can be illustrated in an easy-to-understand manner. Thus, the operator can look at the display to confirm the design, and determine whether or not reprinting is necessary.
Meanwhile, according to the present embodiment, the inspecting unit 208 of the information processing device 200 is equipped with the function of determining compliance with the print guidelines. However, that is not the only possible case. Alternatively, for example, the function of determining compliance with the print guidelines can be implemented in a web application/order software that is a computer program used in an information processing terminal such as a personal computer or a handheld terminal of the user (designer). In that case, the information processing terminal such as a personal computer or a handheld terminal of the user (designer) functions as the image inspection device.
When the function of determining compliance with the print guidelines is implemented in a web application/order software used in an information processing terminal of the user, as soon as the user uploads a design (image), the compliance determining unit 2082 determines whether or not the uploaded design (image) complies with the print guidelines. If it is determined that the design does not comply with the print guidelines, then the notifying unit 2084 notifies the user about that fact. Moreover, the image correcting unit 2083 corrects the uploaded design (image), and presents a correction proposal via the notifying unit 2084.
That is, in order to deal with the case in which printing needs to be performed even if the print guidelines are violated, the images violating the print guidelines can also be printed if specified by the user or the examiner. The radio button B1 illustrated in
Meanwhile, by providing a slide bar, the number of print copies of the received image and of the corrected image can be specified to the image receiving unit 2081.
Moreover, the image receiving unit 2081 includes a cancel button B2 for receiving print cancellation. Thus, when the cancel button B2 is pressed, the printing can be cancelled.
In this way, according to the present embodiment, after a design (image) is uploaded, it is determined whether or not that design (image) complies with the print guidelines, and the determination result is notified to the user or the examiner. Moreover, a correction image obtained by correcting the uploaded design (image) to comply with the print guidelines is presented to the user while prompting the user to make changes. Then, the inspection system either receives the design (image) that has been changed by the user to comply with the print guidelines, or receives an instruction to print the corrected image that was presented. Thus, the designs (images) not complying with the print guidelines can be detected/corrected, and the occurrence of printing failure can be reduced by reducing the instances of printing images that violate the print guidelines representing the required conditions for image printing.
The information processing device 200 that functions as the image inspection device can be installed in an image forming device that includes an image forming unit. That is, it is possible to think of an image forming device that uses, partially or entirely, the functional configuration of the inspecting unit 208 related to the compliance determination regarding the print guidelines as illustrated in
Firstly, at Step S3001, the image receiving unit 2081 receives an image. In a web application/order software, the uploading of an image designed by the user is received. In an image forming device, an image for printing is received.
Then, at Step S3002, the compliance determining unit 2082 performs the compliance determination operation for determining whether the image received by the image receiving unit 2081 complies with the print guidelines.
Subsequently, at Step S3003, the processing branches according to the determination result by the compliance determining unit 2082.
If it is determined that the image complies with the print guidelines (Yes at Step S3003), then the system control proceeds to Step S3004 at which the inspecting unit 208 makes the image forming unit of the image forming device print the image determined to comply with the print guidelines. On the other hand, if it is determined that the image does not comply with the print guidelines (No at Step S3003), then the system control proceeds to Step S3005 at which the inspecting unit 208 cancels the printing to be performed by the image forming unit of the image forming device.
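As an illustration only, the flow of Steps S3001 to S3005 could be condensed as in the following Python sketch; the printer and notification interfaces are hypothetical placeholders and not part of the embodiment.

```python
def handle_print_request(image, complies, printer, notify):
    """Print only when the received image complies with the print guidelines."""
    if complies(image):                  # Steps S3002/S3003: compliance check
        printer.print(image)             # Step S3004: print the compliant image
        return True
    notify("Image does not comply with the print guidelines; printing canceled.")
    return False                         # Step S3005: printing is canceled
```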
Firstly, at Step S3101, the image receiving unit 2081 receives an image. In a web application/order software, the uploading of an image designed by the user is received. In an image forming device, an image for printing is received.
Then, at Step S3102, the compliance determining unit 2082 performs the compliance determination operation for determining whether the image received by the image receiving unit 2081 complies with the print guidelines.
Subsequently, at Step S3103, the processing branches according to the determination result by the compliance determining unit 2082.
If it is determined that the image complies with the print guidelines (Yes at Step S3103), then the system control proceeds to Step S3105 at which the inspecting unit 208 makes the image forming unit of the image forming device print the image determined to comply with the print guidelines.
On the other hand, if it is determined that the image does not comply with the print guidelines (No at Step S3103), then the system control proceeds to Step S3104 at which the correcting unit 2083 performs the correction operation for correcting the image received at the image receiving unit 2081. Then, the system control proceeds to Step S3105 at which the inspecting unit 208 makes the image forming unit of the image forming device print the image corrected by the correcting unit 2083.
Firstly, at Step S3201, the image receiving unit 2081 receives an image. In a web application/order software, the uploading of an image designed by the user is received. In an image forming device, an image for printing is received.
Then, at Step S3202, the compliance determining unit 2082 performs the compliance determination operation for determining whether the image received by the image receiving unit 2081 complies with the print guidelines.
Subsequently, at Step S3203, the notifying unit 2084 notifies the user about the determination result by the compliance determining unit 2082. Meanwhile, if the compliance determining unit 2082 determines that the image received by the image receiving unit 2081 does not comply with the print guidelines, then the operation at Step S3204 can be omitted.
If it is determined that the image complies with the print guidelines, then the system control proceeds to Step S3204 at which the inspecting unit 208 makes the image forming unit of the image forming device print the image determined to comply with the print guidelines.
Firstly, at Step S3301, the image receiving unit 2081 receives an image. In a web application/order software, the uploading of an image designed by the user is received. In an image forming device, an image for printing is received.
Then, at Step S3302, the compliance determining unit 2082 performs the compliance determination operation for determining whether the image received by the image receiving unit 2081 complies with the print guidelines.
Subsequently, at Step S3303, the processing branches according to the determination result by the compliance determining unit 2082.
If it is determined that the image complies with the print guidelines (Yes at Step S3303), then the system control proceeds to Step S3304 at which the inspecting unit 208 makes the image forming unit of the image forming device print the image determined to comply with the print guidelines.
On the other hand, if it is determined that the image does not comply with the print guidelines (No at Step S3303), then the system control proceeds to Step S3305 at which the correcting unit 2083 performs the correction operation for correcting the image received at the image receiving unit 2081.
Then, at Step S3306, the notifying unit 2084 presents the determination result by the compliance determining unit 2082 or presents the corrected image obtained by the correcting unit 2083.
Subsequently, the system control proceeds to Step S3304 at which the inspecting unit 208 makes the image forming unit of the image forming device print the image corrected by the correcting unit 2083.
Firstly, at Step S3401, the image receiving unit 2081 receives an image. In a web application/order software, the uploading of an image designed by the user is received. In an image forming device, an image for printing is received.
Then, at Step S3402, the compliance determining unit 2082 performs the compliance determination operation for determining whether the image received by the image receiving unit 2081 complies with the print guidelines.
Subsequently, at Step S3403, the processing branches according to the determination result by the compliance determining unit 2082.
If it is determined that the image complies with the print guidelines (Yes at Step S3403), then the inspecting unit 208 receives print settings/a print instruction at Step S3404. Herein, the number of print copies can be specified to the image receiving unit 2081.
Then, at Step S3405, the inspecting unit 208 inspects the printed target. At Step S3406, the inspecting unit 208 presents the inspection result.
Meanwhile, the operations performed at Steps S3405 and S3406 are performed as the operations of the inspecting unit 208. Hence, when the device functions only as an image forming device, those operations need not be performed.
Meanwhile, if it is determined that the image does not comply with the print guidelines (No at Step S3403), then the correcting unit 2083 performs the correction operation at Step S3407 for correcting the image received by the image receiving unit 2081.
Then, at Step S3408, the notifying unit 2084 presents the metainformation for enabling the user to either select the determination result by the compliance determining unit 2082 or select the corrected image obtained by the correcting unit 2083.
Subsequently, at Step S3409, the image receiving unit 2081 receives a selection of whether or not the printing is to be continued for printing the corrected image. Then, at Step S3410, the processing branches based on the received result.
When the printing is to be continued with the corrected image (Yes at Step S3410), then the system control returns to Step S3404. In that case, it is possible to specify the number of print copies of the received image and the number of print copies of the corrected image.
On the other hand, when the printing is not to be continued with the corrected image (No at Step S3410), the system control proceeds to Step S3411 and it is determined whether manual editing is to be performed. If it is determined that manual editing is to be performed (Yes at Step S3411), then the system control proceeds to Step S3412 at which the image receiving unit 2081 receives manual editing. Then, the system control returns to Step S3402.
On the other hand, if it is determined that manual editing is not to be performed (No at Step S3411), then the system control proceeds to Step S3413 at which it is determined whether or not a different image is to be uploaded. If it is determined that a different image is to be uploaded (Yes at Step S3413), then the system control returns to Step S3401.
On the other hand, if it is determined that a different image is not to be uploaded (No at Step S3413), then the system control proceeds to Step S3414 at which it is determined whether or not to print the image that violates the print guidelines. If it is determined to print the image that violates the print guidelines (Yes at Step S3414), then the system control returns to Step S3404.
On the other hand, if it is determined not to print the image that violates the print guidelines (No at Step S3414), then it marks the end of the operations.
Meanwhile, the functions explained above according to the embodiment can be implemented using one or more processing circuits. Herein, the term “processing circuit” either implies a processor that is implemented using an electronic circuit programmed to execute the functions using software; or implies a device, such as an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), or a conventional circuit module, that is designed to execute the functions explained above.
The group of devices mentioned in the embodiment represents nothing more than one of a plurality of computing environments for implementing the embodiment disclosed in the present written description. In a particular embodiment, the information processing device 200 includes a plurality of computing devices called a server cluster. The computing devices are configured to communicate with each other via a communication link of an arbitrary type such as a network or a shared memory, and perform the operations disclosed in the present written description. In an identical manner, the information processing device 200 can be configured to include a plurality of computing devices configured to communicate with each other.
An embodiment provides the advantageous effect that it is possible to reduce the occurrence of printing failure by reducing the instances of printing images that violate the print guidelines representing the required conditions for image printing.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, at least one element of different illustrative and exemplary embodiments herein may be combined with each other or substituted for each other within the scope of this disclosure and appended claims. Further, features of components of the embodiments, such as the number, the position, and the shape, are not limited to the embodiments and thus may be preferably set. It is therefore to be understood that within the scope of the appended claims, the disclosure of the present invention may be practiced otherwise than as specifically described herein.
The method steps, processes, or operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance or clearly identified through the context. It is also to be understood that additional or alternative steps may be employed.
Further, any of the above-described apparatus, devices or units can be implemented as a hardware apparatus, such as a special-purpose circuit or device, or as a hardware/software combination, such as a processor executing a software program.
Further, as described above, any one of the above-described and other methods of the present invention may be embodied in the form of a computer program stored in any kind of storage medium. Examples of storage mediums include, but are not limited to, flexible disk, hard disk, optical discs, magneto-optical discs, magnetic tapes, nonvolatile memory, semiconductor memory, read-only-memory (ROM), etc.
Alternatively, any one of the above-described and other methods of the present invention may be implemented by an application specific integrated circuit (ASIC), a digital signal processor (DSP) or a field programmable gate array (FPGA), prepared by interconnecting an appropriate network of conventional component circuits or by a combination thereof with one or more conventional general purpose microprocessors or signal processors programmed accordingly.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA) and conventional circuit components arranged to perform the recited functions.
Number | Date | Country | Kind
2020-165830 | Sep. 2020 | JP | national
2021-094784 | Jun. 2021 | JP | national