The present invention relates to an image reading apparatus, and an image forming system including the image reading apparatus.
There is known an image reading apparatus that optically reads an image on a document by emitting light to the document and receiving reflected light from the document using an image sensor or the like. In a case in which color is measured using an image sensor, red (R), green (G), and blue (B) luminance values (hereinafter “RGB value”) output by the image sensor are converted into a color value in a CIELab space (hereinafter “Lab value”). A look-up table (LUT) that is a color conversion table is used to convert an RGB value into a Lab value. In order to suppress the size of the color conversion table, the color conversion table is prepared so as to indicate the relationship between an RGB value and a Lab value for only some RGB values, rather than for all RGB values. Japanese Patent Laid-Open No. 2002-64719 discloses a technique in which an RGB value that is not included in a color conversion table is obtained by performing an interpolation calculation based on relationships between RGB values and Lab values included in the color conversion table.
In a case in which RGB values output by an image sensor are converted into Lab values by performing an interpolation calculation based on information indicated by a color conversion table, the color measurement accuracy is dependent on the size of the color conversion table, i.e., the number of correspondence relationships between RGB values and Lab values indicated by the color conversion table. However, for colors in the low-brightness color area, the image sensor reads different colors as having the same RGB value, and thus color measurement accuracy cannot be increased even if the size of the color conversion table is increased.
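For illustration only (not part of the disclosed embodiments), the interpolation calculation described above can be sketched in Python as follows. The grid spacing of the sparse table and the `synthetic_lab` placeholder mapping are assumptions made for the sake of a runnable example; a real color conversion table would hold measured colorimetric data.

```python
import numpy as np

# Hypothetical sparse LUT: grid nodes on each RGB axis (an assumption;
# real tables choose their own node spacing).
nodes = np.array([0, 64, 128, 192, 255], dtype=float)

def synthetic_lab(r, g, b):
    # Placeholder mapping used only to populate the example table.
    L = 100.0 * (0.2126 * r + 0.7152 * g + 0.0722 * b) / 255.0
    a = (r - g) / 255.0 * 128.0
    b2 = (g - b) / 255.0 * 128.0
    return np.array([L, a, b2])

# Precompute the table: shape (5, 5, 5, 3).
lut = np.array([[[synthetic_lab(r, g, b) for b in nodes]
                 for g in nodes] for r in nodes])

def interpolate_lab(rgb):
    """Trilinear interpolation of a Lab value from the sparse LUT."""
    out = np.zeros(3)
    idx, frac = [], []
    for v in rgb:
        # Locate the lower grid node on this axis and the fractional
        # position of v between it and the next node.
        i = min(int(np.searchsorted(nodes, v, side="right")) - 1,
                len(nodes) - 2)
        idx.append(i)
        frac.append((v - nodes[i]) / (nodes[i + 1] - nodes[i]))
    # Weighted sum over the 8 corners of the enclosing grid cell.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((frac[0] if dr else 1 - frac[0])
                     * (frac[1] if dg else 1 - frac[1])
                     * (frac[2] if db else 1 - frac[2]))
                out += w * lut[idx[0] + dr, idx[1] + dg, idx[2] + db]
    return out
```

An RGB value that coincides with a grid node reproduces the table entry exactly; intermediate values are blended from the eight surrounding nodes.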
According to an aspect of the present disclosure, an image reading apparatus that reads a document includes: a reading unit comprising a light source configured to emit light to the document, a sensor configured to receive the light reflected by the document and output an analog signal based on a light reception result of the reflected light, and a converter configured to convert the analog signal output by the sensor into a digital signal and output the digital signal obtained as a result of the conversion; and a controller. The controller is configured to: control the reading unit to perform a plurality of readings of the document; and change a reading condition of the reading unit so that a value of a first digital signal to be acquired in a first reading among the plurality of readings and a value of a second digital signal to be acquired in a second reading among the plurality of readings differ from one another.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinafter, embodiments will be described in detail with reference to the attached drawings. Note, the following embodiments are not intended to limit the scope of the claimed invention. Multiple features are described in the embodiments, but limitation is not made to an invention that requires all such features, and multiple such features may be combined as appropriate. Furthermore, in the attached drawings, the same reference numerals are given to the same or similar configurations, and redundant description thereof is omitted.
The image signal output by the image sensor 109 collectively refers to a first image signal that is an analog signal and indicates a luminance value (pixel value) of the color red, a second image signal that is an analog signal and indicates a luminance value (pixel value) of the color green, and a third image signal that is an analog signal and indicates a luminance value (pixel value) of the color blue. The first image signal is output from a light-receiving element that receives light from the document 100 through a red filter provided in the image sensor 109. The second image signal is output from a light-receiving element that receives light from the document 100 through a green filter provided in the image sensor 109. The third image signal is output from a light-receiving element that receives light from the document 100 through a blue filter provided in the image sensor 109. In such a manner, the image sensor 109 according to the present embodiment is a three-channel sensor. The white LEDs 103, the mirrors 104, 105, and 106, the IR-cut filter 107 and the lens 108, the image sensor 109, and an A/D converter 202 (see
A motor 102 moves, in the left-right direction in the drawing, a first mirror unit 137 including the white LEDs 103 and the mirror 104, and a second mirror unit 138 including the mirrors 105 and 106. The left-right direction in the drawing is also referred to as a sub scanning direction. By repeatedly performing scanning of the document 100 by the reading unit while moving the first mirror unit 137 and the second mirror unit 138 in the sub scanning direction, the image sensor 109 outputs, to the image processing unit 110, an image signal that is a result of reading the entire document 100. The image signal that is the result of reading the entire document 100 indicates, for each pixel of the document 100, a red pixel value (luminance value), a green pixel value (luminance value), and a blue pixel value (luminance value). The image processing unit 110 processes the image signal from the image sensor 109 to generate a read image signal, and outputs the generated read image signal to an image processing unit 111 of the image forming apparatus 2000. The read image signal also indicates a red pixel value (luminance value), a green pixel value (luminance value), and a blue pixel value (luminance value). The processing performed by the image processing unit 110 includes analog-to-digital conversion processing for converting the red, green, and blue analog pixel values indicated by the image signal into digital pixel values (digital values). A white reference plate 139 is a reference member that is used for calibration of the white LEDs 103 and the image sensor 109.
Next, a configuration of the image forming apparatus 2000 will be described. During image forming, photoreceptors 124 to 127 are each charged to a predetermined potential by an unillustrated charger and driven to rotate in the counterclockwise direction in the drawing. Semiconductor lasers 112 to 115 each emit laser light based on an output signal from the image processing unit 111. Polygon mirrors 116 to 119 scan the photoreceptors 124 to 127 with laser light emitted from the semiconductor lasers 112 to 115. Thus, electrostatic latent images are formed on the photoreceptors 124 to 127. Developer units 120 to 123 respectively include black (K) toner, cyan (C) toner, magenta (M) toner, and yellow (Y) toner. The developer units 120 to 123 form toner images on the photoreceptors 124 to 127 by developing the electrostatic latent images on the photoreceptors 124 to 127 using the toner. Primary transfer rollers 151 to 154 transfer the toner images on the photoreceptors 124 to 127 onto an intermediate transfer belt 128. By transferring the toner images on the photoreceptors 124 to 127 onto the intermediate transfer belt 128 so as to be overlaid on one another, colors other than black, cyan, magenta, and yellow can be reproduced. During image forming, the intermediate transfer belt 128 is driven to rotate in the clockwise direction in the drawing. Thus, the toner images on the intermediate transfer belt 128 are conveyed to a position facing a secondary transfer roller 155.
Meanwhile, a sheet fed onto a conveyance path from a cassette 132, a cassette 133, a cassette 134, or a manual feed tray 131 is conveyed to the position facing the secondary transfer roller 155 by rollers 130. The secondary transfer roller 155 transfers the toner images on the intermediate transfer belt 128 onto the sheet. A fixing device 136 fixes the toner images onto the sheet by applying heat and pressure to the sheet. After the toner images have been fixed, the sheet is discharged onto a discharge tray 135. A control unit 170 controls the entire image forming system.
Levels (signal values), such as voltage values, of analog image signals output by the image sensor 109 correspond to the red, green, and blue luminance values of each pixel of the document 100, and change depending on a set value of the light-emission luminance (light-emission intensity) of the white LEDs 103, and set values of the accumulation time and sensitivity of the light-receiving elements of the image sensor 109. The CPU 401 determines a reading condition, i.e., the set value of the light-emission luminance of the white LEDs 103 and the set values of the accumulation time and the sensitivity of the light-receiving elements of the image sensor 109, based on levels of analog image signals output by the image sensor 109 when the white reference plate 139 is read.
Analog image signals output by the image sensor 109 are converted into digital image signals by the A/D converter 202. In other words, the A/D converter 202 converts the levels of the analog image signals into digital values. For each pixel of the document 100, the analog image signals and the digital image signals indicate red, green, and blue luminance values. Note that luminance values are indicated by the levels of the analog image signals. That is, the luminance values indicated by the analog image signals are analog values. On the other hand, the luminance values indicated by the digital image signals are digital values. In the following description, a combination of the three luminance values of the colors red, green, and blue is also referred to as a color value. Furthermore, the analog image signals and the digital image signals are collectively referred to as image signals.
Returning to
In the present embodiment, one document 100 is read repeatedly multiple times under different reading conditions. The image processing unit 110 generates one read image signal based on a plurality of image signals obtained as a result of multiple iterations of reading. An image synthesis processing unit 407 performs synthesis processing on the plurality of image signals obtained as a result of multiple iterations of reading the document 100. A RAM 408 is used to temporarily store image signals in the synthesis of image signals. The synthesis processing will be described in detail later. The number of times reading is performed can be set to two or more. In the following, reading processing in a case in which the number of times reading is performed is twice will be described.
In step S13, the CPU 401 sets reading condition #2 for the second iteration. Reading condition #2 is a condition that increases the levels of analog image signals output by the image sensor 109 so as to be higher than the levels thereof under reading condition #1, and is, for example, a condition in which at least one of the set value of the light-emission luminance of the white LEDs 103, the set value of the accumulation time of the light-receiving elements, and the set value of the sensitivity of the light-receiving elements of the image sensor 109 is increased.
Note that reading condition #2 may be a predetermined condition.
Furthermore, the reading conditions may be configured such that a conversion characteristic that the A/D converter 202 uses to convert analog signals into digital signals is switched to a different conversion characteristic. For example, a conversion characteristic in which the sensitivity (resolution) for low luminance is higher than the sensitivity (resolution) for high luminance is used in reading condition #1 for the first iteration, and a conversion characteristic in which the sensitivity (resolution) for high luminance is higher than the sensitivity (resolution) for low luminance is used in reading condition #2 for the second iteration.
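A minimal sketch of such switchable conversion characteristics follows. The 10-bit code count and the specific square-root and square-law curves are assumptions chosen for illustration; the embodiment does not specify the curve shapes.

```python
import math

FULL = 2 ** 10 - 1  # 10-bit full scale (assumption)

def quantize_low_emphasis(v):
    """Conversion characteristic with finer resolution at low luminance.
    v is the normalized analog level in [0, 1]; a square-root
    (companding) curve spends more output codes on dark levels."""
    return round(math.sqrt(v) * FULL)

def quantize_high_emphasis(v):
    """Conversion characteristic with finer resolution at high luminance:
    a square-law curve spends more output codes on bright levels."""
    return round(v * v * FULL)

# Near black, the low-emphasis curve separates levels that the
# high-emphasis curve maps onto the same code.
dark_step_low = quantize_low_emphasis(0.02) - quantize_low_emphasis(0.01)
dark_step_high = quantize_high_emphasis(0.02) - quantize_high_emphasis(0.01)
```

Switching between the two characteristics between the first and second iterations gives the dark color area more effective resolution in one of the two readings.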
In step S14, the CPU 401 reads the document 100 under reading condition #2 and acquires digital image signals #2. Reading condition #2 is a condition that increases the levels of analog image signals output by the image sensor 109 so as to be higher than the levels thereof under reading condition #1. Thus, under reading condition #2, the level of an analog image signal indicating a luminance value of a color in an area of the document 100 where brightness is not low may be saturated. However, the quantization error in the A/D converter 202, etc., decreases for a color in an area where brightness is low. Thus, the signal-to-noise ratio is improved and color separation performance improves for a color in an area where brightness is low. Note that, in the second iteration of reading, the same shading coefficient as that in the first iteration of reading is used. In step S15, the image synthesis processing unit 407 performs synthesis processing for generating a read image signal based on digital image signals #1 and digital image signals #2. In step S16, the image synthesis processing unit 407 outputs the read image signal to the image forming apparatus 2000 as a result of reading the document 100.
Next, the synthesis processing in step S15 in
For example, suppose that reading condition #2 is set as a condition in which the accumulation time of the light-receiving elements is increased to twice that in reading condition #1, and the sensitivity of the light-receiving elements is increased to four times that in reading condition #1. That is, reading condition #2 is set so that levels that are eight times those of analog image signal #1 under reading condition #1 are output.
In the second iteration of reading, the levels of analog image signals #2 from the image sensor 109 are eight times those in the first iteration. Thus, the level of analog image signal #2 indicating a luminance value of a color of the pixel for which the digital value in the first iteration of reading was higher than or equal to 128 is saturated. On the other hand, the level of analog image signal #2 indicating a luminance value of a color of the pixel for which the digital value in the first iteration of reading was equal to or lower than 127 is not saturated, and the reading accuracy of the luminance value improves as a result of the increase in level. The image synthesis processing unit 407 generates a read image signal by using the digital value in the first iteration for a color for which the digital value in the result of the first iteration of reading was higher than or equal to 128 and using the digital value in the second iteration for a color for which the digital value in the result of the first iteration of reading was equal to or lower than 127. Note that digital values corresponding to the first iteration are multiplied by eight to balance the levels thereof with those of digital values corresponding to the second iteration. Thus, the digital values of the read image signal are (R, G, B)=(6480, 266, 2952) as illustrated in
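The per-pixel selection just described can be sketched as follows, purely for illustration. The 10-bit full scale and the hypothetical first-reading green value of 33 (any value below 128 whose second, unsaturated reading yields 266) are assumptions not stated in the text.

```python
GAIN = 8           # reading condition #2 yields 8x analog levels
THRESHOLD = 128    # first-reading digital values >= 128 saturate at 8x
FULL_SCALE = 1023  # 10-bit A/D converter full scale (assumption)

def synthesize_pixel(d1, d2):
    """Combine one color's digital values from the two readings:
    d1 from the first (unsaturated) reading, d2 from the second (8x)."""
    if d1 >= THRESHOLD:
        # Reading #2 is saturated; scale reading #1 by 8 to balance
        # its level with values taken from reading #2.
        return d1 * GAIN
    # Reading #2 is unsaturated and has the better signal-to-noise ratio.
    return d2

# Hypothetical pixel: first-reading values (810, 33, 369), second-reading
# values (saturated, 266, saturated).
rgb = tuple(synthesize_pixel(d1, d2)
            for d1, d2 in [(810, FULL_SCALE), (33, 266), (369, FULL_SCALE)])
```

With these assumed inputs, the synthesized digital values are (R, G, B) = (6480, 266, 2952), matching the example in the text.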
Note that, in a case in which the number of times reading is performed is two or more, a reading condition to be used in the first iteration of reading and reading conditions to be respectively used in the second and subsequent iterations of reading are set. The reading conditions to be used in the second and subsequent iterations of reading differ from one another. The image processing unit 110 determines a luminance value higher than or equal to a first luminance value in the read image signal based on a digital value obtained from the level of analog image signal #1 obtained in the first iteration of reading. On the other hand, the image processing unit 110 determines a luminance value lower than the first luminance value in the read image signal based on a digital value obtained from the levels of analog image signals #2 obtained in the second and subsequent iterations of reading. The first luminance value can be determined based on the reading conditions.
For example, suppose that the number of times reading is performed is three times, and reading conditions for the second and third iterations are respectively set so that the levels of analog image signals are 4 times and 16 times those with the first reading condition. Furthermore, in the following description, “luminance values” of the read image signal are indicated based on luminance values obtained in the first iteration of reading. In this case, a color of the pixel for which the luminance value was higher than or equal to 256 in the first iteration of reading cannot be read in the second iteration of reading because the level of the analog image signal is saturated. Furthermore, a color of the pixel for which the luminance value was higher than or equal to 64 in the first iteration of reading cannot be read in the third iteration of reading because the level of the analog image signal is saturated. That is, a color of the pixel for which the luminance value is higher than or equal to 256 in the first iteration of reading can only be read in the first iteration of reading. Thus, in this case, the first luminance value is set to 256. That is, the image processing unit 110 determines a luminance value higher than or equal to 256 in the read image signal based on a digital value obtained from the level of an analog image signal obtained in the first iteration of reading. On the other hand, the image processing unit 110 determines a luminance value lower than 256 in the read image signal based on a digital value obtained from the levels of analog image signals obtained in the second and subsequent iterations of reading.
Note that the range of luminance values that can be read in the second iteration of reading is 0-255 based on the luminance values in the first iteration of reading. Furthermore, the range of luminance values that can be read in the third iteration of reading is 0-63 based on the luminance values in the first iteration of reading. That is, a color of the pixel for which the luminance value in the first iteration of reading is within the range of 64-255 cannot be read in the third iteration of reading because the level of the analog image signal is saturated. Thus, the image processing unit 110 determines a luminance value within the range of 64-255 in the read image signal based on a digital value obtained from the level of the analog image signal obtained in the second iteration of reading. On the other hand, in regard to a color of the pixel for which the luminance value in the first iteration of reading is within the range of 0-63, the analog image signal level is saturated neither in the second iteration nor in the third iteration. In this case, for a color of the pixel for which the luminance value in the first iteration of reading is within the range of 0-63, the image processing unit 110 performs the determination based on the maximum value among the levels of the analog image signals in the second and third iterations, or specifically, the level of the analog image signal in the third iteration.
Note that, in the present embodiment, reading condition #1 in the first iteration of reading was set so that levels of analog image signals are not saturated when the document 100 is read. However, the iteration among the multiple iterations of reading to which reading condition #1 is to be applied can be determined as appropriate, and the present embodiment is not limited to the case in which reading condition #1 is applied to the first iteration of reading.
More generally, suppose that the image reading apparatus 1000 performs N iterations (where N is an integer of 2 or more) of reading. Reading conditions #1 to #N for the 1st to Nth iterations differ from one another. That is, analog image signals that the image sensor 109 outputs upon reading the document 100 under reading conditions #1 to #N have different levels. Note that one of the N reading conditions is set so that the levels of analog image signals that the image sensor 109 outputs upon reading the document 100 are not saturated. In the following, it is supposed that reading condition #1 is the condition under which none of the levels output from the image sensor 109 are saturated when the document 100 is read.
Based on reading conditions #1 to #N, the image synthesis processing unit 407 groups luminance values of digital image signals #1 acquired under reading condition #1 into ranges #1 to #N. Here, luminance values within range #n (where n is an integer of 1 to N) are continuous and associated with reading condition #n. Note that range #n associated with reading condition #n includes only luminance values with which image signals are not saturated even if the document 100 is read under reading condition #n. For example, in the above-described example in which reading is performed three times, range #1, range #2, and range #3 associated with reading condition #1, reading condition #2, and reading condition #3 for the first iteration, the second iteration, and the third iteration are a range of luminance values of 256 to 1023, a range of luminance values of 64 to 255, and a range of luminance values of 0 to 63, respectively. If the luminance value of one color of one pixel indicated by digital image signal #1 acquired under reading condition #1 is included in range #n, the image synthesis processing unit 407 uses, as the luminance value of the color of the pixel indicated by the read image signal, the luminance value of the color of the pixel indicated by digital image signal #n acquired under reading condition #n. The image synthesis processing unit 407 generates the read image signal in such a manner. Note that, upon generating the read image signal, the image synthesis processing unit 407 equalizes the resolutions (the number of bits indicating digital values) of digital image signals #1 to digital image signals #N.
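A sketch of how such ranges could be derived from the level gain of each reading condition follows. The 10-bit code count and the expression of each reading condition as an integer gain relative to condition #1 are assumptions for illustration.

```python
FULL = 1024  # number of codes of a 10-bit A/D converter (assumption)

def build_ranges(gains):
    """gains[n] is the level multiplier of reading condition #(n+1)
    relative to condition #1 (gains[0] == 1). Returns, for each
    condition index, the inclusive range of first-reading luminance
    values that condition should supply to the read image signal."""
    # A first-reading value v saturates under gain g when v * g >= FULL,
    # so each condition covers the values just below its saturation
    # point that no higher-gain condition can read.
    order = sorted(range(len(gains)), key=lambda n: gains[n], reverse=True)
    ranges = {}
    lower = 0
    for n in order:
        upper = FULL // gains[n]  # exclusive bound before saturation
        ranges[n] = (lower, upper - 1)
        lower = upper
    return ranges

# Three readings with gains 1x, 4x, 16x, as in the example above.
ranges = build_ranges([1, 4, 16])
```

For these gains the result assigns 0-63 to the 16x reading, 64-255 to the 4x reading, and 256-1023 to the unsaturated 1x reading.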
According to the synthesis processing described in the present embodiment, color information of the document 100 can be read accurately because the color separation performance of the image reading apparatus 1000 improves. Because color information of the document 100 can be accurately obtained from a read image signal output by the image reading apparatus 1000, the resemblance of the color tone of printed object output by the image forming apparatus 2000 with that of the document 100 can be increased.
The more accurately the image reading apparatus 1000 can read color information of the document 100, the higher the resemblance of the color tone of printed object output by the image forming apparatus 2000 with that of the document 100. For example, in a case in which there is image data (document data) of a color sample (reference printed object), the resemblance of the color tone of printed object output by the image forming apparatus 2000 with that of the reference printed object can be increased with higher accuracy by using a technique called color sample matching. Specifically, first, the image forming apparatus 2000 is caused to form an image on a sheet based on the document data, and test printed object is thereby output. Both this test printed object and the reference printed object, which is a color sample, are read using the image reading apparatus 1000. Print data to be used to produce printed object corresponding to the reference printed object using the image forming apparatus 2000 is then created based on the difference between the two read image signals, and printed object is output based on the print data. According to this configuration, the resemblance of the color tone of the printed object with that of the reference printed object can be increased.
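The text does not specify how the difference between the two read image signals is turned into print data; the following is one crude, hypothetical first-order correction, shown only to make the data flow concrete.

```python
import numpy as np

def correction_from_readings(ref_read, test_read, print_data):
    """Shift print data by the reference-minus-test difference, clipped
    to the 8-bit range. A real color sample matching system would work
    in a colorimetric space and may iterate; this is only a first-order
    illustration under that assumption."""
    diff = np.asarray(ref_read, float) - np.asarray(test_read, float)
    return np.clip(np.asarray(print_data, float) + diff, 0, 255)
```

If the test printed object reads 10 levels darker than the reference in some region, the corresponding print data is raised by 10 levels for the next print.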
In such a manner, the image forming system according to the present embodiment can be used as a color sample matching system. Note that the image processing of creating print data to be used in printing by the image forming apparatus 2000 based on the difference between two read image signals may be performed by the control unit 170 or the image processing unit 111 of the image forming apparatus 2000, for example. Furthermore, the image processing may be performed by the image processing unit 110 of the image reading apparatus 1000. Furthermore, the image processing may be performed by an image processing apparatus that is communicably connected to the image forming apparatus 2000 and the image reading apparatus 1000. In this case, the image forming system may include the image forming apparatus 2000, the image reading apparatus 1000, and the image processing apparatus.
Furthermore, the image forming system including the image reading apparatus 1000 according to the present embodiment is applicable to color adjustment technology. The color tone of printed object printed by the image forming apparatus 2000 is prone to change due to individual differences, and temporal and environmental changes. Color adjustment technology is technology for suppressing such change in color tone. Before using the image forming apparatus 2000 to output printed object, the image forming apparatus 2000 is caused to output a sheet on which a color adjustment chart is formed by causing the image forming apparatus 2000 to form an image based on image data of the color adjustment chart. By reading this sheet on which the color adjustment chart has been formed using the image reading apparatus 1000, the image reading apparatus 1000 outputs a read image signal (read data) that is a result of reading the color adjustment chart. Based on the read data, a printer profile of the image forming apparatus 2000, i.e., a characterization of how the image forming apparatus 2000 outputs image data as printed object, can be ascertained. Then, by adjusting image forming conditions of the image forming apparatus 2000 based on the ascertained printer profile, or by converting, based on the ascertained printer profile, image data that the image reading apparatus 1000 outputs to the image forming apparatus 2000, the resemblance of the color tone of printed object with that of a document 100 can be increased.
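One hypothetical way to turn such chart read data into a per-channel correction curve is sketched below; the representation of the profile as a 1D lookup table and the monotonic printer response are assumptions, since the embodiment does not specify the profile format.

```python
import numpy as np

def build_correction(patch_levels, measured, levels=256):
    """Return a 1D LUT: for each desired output level, the input level
    to send to the printer, obtained by inverting the measured response.

    patch_levels: input levels of the chart patches sent to the printer
    measured:     levels read back from the printed chart (assumed to
                  be monotonically increasing)
    """
    desired = np.arange(levels, dtype=float)
    # Invert the response: for each desired output, interpolate the
    # input level that produced it on the chart.
    return np.interp(desired, np.asarray(measured, float),
                     np.asarray(patch_levels, float))
```

For an ideal printer whose measured output equals the patch levels, the correction curve is the identity; a printer that prints darker than requested yields a curve that boosts the inputs accordingly.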
Note that the adjustment of image forming conditions of the image forming apparatus 2000 based on read data can be performed by the image forming apparatus 2000. Furthermore, the conversion, based on read data, of image data to be output to the image forming apparatus 2000 can be performed by the image reading apparatus 1000.
Next, a second embodiment will be described focusing on differences from the first embodiment. In the present embodiment, the necessity of the second and subsequent iterations of reading is determined based on the result of the first iteration of reading, and the second and subsequent iterations of reading and the synthesis processing are performed if it is determined that the second and subsequent iterations of reading are necessary. On the other hand, the result of the first iteration of reading is output as a read image signal if it is determined that the second and subsequent iterations of reading are unnecessary.
Furthermore, a configuration may be adopted in which the maximum value among luminance values in the result of the first iteration of reading is determined, the second and third iterations of reading are performed with reading conditions set based on the maximum value, and the synthesis processing is performed based on the results of the second and third iterations of reading. Note that the reading conditions for the second and third iterations differ from one another, and one of the reading conditions is a condition that increases the maximum luminance value as much as possible within a range in which analog image signals are not saturated. By setting reading conditions based on the maximum value among luminance values in the result of the first iteration of reading, a read image signal that has higher color separation performance can be generated. Furthermore, a further increase in color separation performance is possible by determining a reading condition separately for each of multiple areas of the document 100 based on the result of the first iteration of reading. Specifically, a maximum luminance value and a minimum luminance value are obtained for each of the multiple areas based on the result of the first iteration of reading, and the optimal reading condition for the area is determined in accordance with the maximum and minimum values. Then, reading of an area is performed using the reading condition determined with respect to the area. Furthermore, synthesis processing is performed for each area, and a read image signal is generated based on the synthesis results of the areas. Furthermore, a configuration may be adopted such that, if there is image data of the document 100, the setting of a reading condition under which analog image signals are not saturated and the setting of areas are performed using the image data of the document 100 instead of using the result of the first iteration of reading.
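The per-area condition determination could be sketched as follows, assuming 10-bit first-reading values and a reading condition expressed as an integer level gain (both assumptions made for illustration).

```python
FULL = 1024  # number of codes of a 10-bit A/D converter (assumption)

def per_area_gain(first_reading_area):
    """Choose, for one area, the largest integer gain that keeps the
    brightest color channel of the area below saturation when the
    area is re-read.

    first_reading_area: iterable of (R, G, B) digital values from the
    first iteration of reading for the pixels of the area.
    """
    peak = max(max(px) for px in first_reading_area)  # max over R, G, B
    if peak == 0:
        return FULL  # fully dark area: any practical gain is safe
    return (FULL - 1) // peak  # largest g with peak * g <= FULL - 1

# Example: an area whose brightest channel value was 63 in the first
# reading can be re-read at 16x gain without saturating (63 * 16 = 1008).
gain = per_area_gain([(10, 63, 40), (5, 20, 8)])
```

Each area is then re-read under its own gain, and the synthesis described above is performed area by area before assembling the read image signal.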
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2023-086430, filed May 25, 2023, which is hereby incorporated by reference herein in its entirety.