The present invention relates to an output information generating method of an image reading device that acquires a scanned image containing pixels with a pixel value greater than a white reference value, and an image reading device.
This application is based upon Japanese Patent Application 2018-088446 filed on May 1, 2018, the entire contents of which are incorporated by reference herein.
JP-A-2008-271386 describes an image reading device of the related art. More specifically, JP-A-2008-271386 describes an image reading device that, when a document is scanned and a fluorescent color is detected, reduces the amount of light per pixel and then scans again in order to improve the results of scanning documents containing fluorescent colors.
One problem with the related art as described above is that reading a document containing fluorescent colors is time-consuming.
An output information generating method of an image reading device according to the invention includes an acquisition step of reading a document and acquiring a scanned image; an identification step of identifying a first area containing specific pixels, which are pixels with a pixel value greater than a white reference value, in the scanned image; and a generating step of generating output information including an output image based on the scanned image, and information identifying the first area.
An image reading device according to another aspect of the invention includes an acquirer configured to read a document and acquire a scanned image; an identifier configured to identify a first area containing specific pixels, which are pixels with a pixel value greater than a white reference value, in the scanned image; and a generator configured to generate output information including an output image based on the scanned image, and information identifying the first area.
Other objects and attainments together with a fuller understanding of the invention will become apparent and appreciated by referring to the following description and claims taken in conjunction with the accompanying drawings.
An output information generating method of an image reading device, and an image reading device, according to a preferred embodiment of the invention are described below. This embodiment describes a sheet feed scanner configured to read while conveying a document as an example of an image reading device according to the invention.
A paper feed roller 11, a pair of upstream conveyance rollers 12, a pair of downstream conveyance rollers 13, a paper detector 20, and an image scanner 30 are disposed along the conveyance path 5. A document presser 71 is also disposed to the conveyance path 5 opposite the image scanner 30.
The paper feed roller 11 is disposed near the document feed port 7. The paper feed roller 11 feeds a document S introduced from the document feed port 7 to the conveyance path 5.
The upstream conveyance rollers 12 are located upstream from the image scanner 30 in the conveyance direction of the document S, that is, on the −Y side of the image scanner 30 as seen in the accompanying figures.
The paper detector 20 includes a document sensor 21 and a paper feed sensor 22. The paper feed sensor 22 is disposed downstream from the document sensor 21 in the conveyance direction of the document S, that is, on the +Y side.
In the configuration in this example the document sensor 21 is disposed upstream from the upstream conveyance rollers 12, but may be disposed on the downstream side of the upstream conveyance rollers 12. Likewise, the paper feed sensor 22 is disposed on the downstream side of the upstream conveyance rollers 12, but may be disposed upstream from the upstream conveyance rollers 12.
The document sensor 21 is a sensor for detecting that a document S was supplied to the conveyance path 5. The paper feed sensor 22 is a sensor for detecting that the document S was conveyed to an indexing position. The detection position of the paper feed sensor 22 is therefore the indexing position.
The image scanner 30 includes a white light source 31, a lens array 32, a color sensor 33, and a cover glass 35. A color sensor module comprising these components and an A/D converter 34 is used as the image scanner 30 in this embodiment. Note that the color sensor 33 is an example of an image reading sensor.
Light emitted from the white light source 31 passes through the optically transparent cover glass 35, and is incident to the document presser 71 or the document S.
The document presser 71 prevents the document S from floating, and prevents uneven illumination of the document. The document presser 71 also functions as a white reference plate for setting the white reference value.
Light reflected from the document presser 71 or the document S passes through the lens array 32 and is incident to the color sensor 33. The color sensor 33 is configured with three rows of color sensors for detecting red, green, and blue (RGB) colors. The color sensor rows comprise multiple sensor chips arrayed in a direction intersecting the conveyance direction of the document S, that is, along the X-axis.
The conveyor 10 includes a paper feed motor 14, an upstream conveyance motor 15, and a downstream conveyance motor 16.
The paper feed motor 14 drives the paper feed roller 11. The upstream conveyance motor 15 and downstream conveyance motor 16 respectively drive the drive rollers of the upstream conveyance rollers 12 and downstream conveyance rollers 13. Note that the upstream conveyance rollers 12 and downstream conveyance rollers 13 may also be driven by a common conveyance motor.
The paper detector 20 includes the document sensor 21 and paper feed sensor 22. The document sensor 21 and paper feed sensor 22 are, for example, optical sensors comprising an emitter and a photodetector; they illuminate the conveyance path 5 with light from the emitter, and detect the presence of a document S when the photodetector detects the reflection of the emitted light. Based on the detection result from the paper detector 20, the controller 60 determines the position of the document S on the conveyance path 5. For example, when the detection result from the document sensor 21 changes from No Document to Document Detected, the controller 60 determines that a document S was introduced to the conveyance path 5. When the detection result from the paper feed sensor 22 changes from No Document to Document Detected, the controller 60 determines that a document S was conveyed to the indexing position.
The image scanner 30 includes a white light source 31, lens array 32, color sensor 33, and A/D converter 34. The image scanner 30 reads the document S according to the color/monochrome color setting, which is one of the scanner settings, and acquires a scanned image in color or a scanned image in gray scale.
The white light source 31 is, for example, a white LED, and illuminates the area to be scanned from the −Y side and the +Y side.
The lens array 32 in this example is a SELFOC lens array, and forms a continuous image by overlaying same-size erect images from multiple lenses arrayed in a direction intersecting the conveyance direction of the document S, that is, along the X-axis.
The color sensor 33 is, for example, a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor forming RGB sensor lines. A color filter for limiting the wavelengths of the incident reflected light to red, green, and blue is disposed to the respective sensor lines. Each sensor line comprises multiple sensor chips arrayed along the X-axis, and each sensor chip includes a photoelectric conversion element. The amount of light picked up by the photoelectric conversion element is accumulated as an electric charge, and the charge is output as an analog signal. The A/D converter 34 converts the analog signal output from the color sensor 33 to a digital signal.
The image processor 40 applies various image processes to the scanning data acquired by the image scanner 30, and generates an output image. The image processor 40 in this example is a dedicated ASIC (Application Specific Integrated Circuit) for image processing. While described in detail below, the image processor 40 applies different image processes based on whether or not specific pixels, which are pixels with a pixel value exceeding the white reference value, are contained in the scanned image. The specific pixels are pixels with a color having reflectance greater than white, such as fluorescent color pixels. To improve the reproducibility of the color tones of fluorescent colors, the image processor 40 in this embodiment applies different normalization processes to fluorescent color areas E1 containing specific pixels and to the normal color areas outside the fluorescent color areas E1.
The output section 50 outputs the output image after image processing by the image processor 40 to an external device 100. The external device 100 may be a storage device or a printer. The storage device may be a network storage device. A printer may be any printer connected by a cable or network. The external device 100 to which the output image is output may also be a PC (Personal Computer) or a display device.
The output image is not limited to being output to an external device 100; the output image may also be output to an internal storage device of the image reading device 1. If the image reading device 1 is configured with an integrated printer, the output image may also be output to and printed by the integrated printer.
The controller 60 is connected to and controls operation of the devices described above. The controller 60 in this example comprises a processor (such as a CPU (Central Processing Unit), ASIC, or combination thereof), ROM (Read Only Memory), and RAM (Random Access Memory).
By the configuration described above, the controller 60 determines whether or not specific pixels are in the scanned image acquired by reading the document S, and if specific pixels are included, identifies the fluorescent color areas E1 containing the specific pixels. The image processor 40 applies a first normalization process, which is a normalization process using a value exceeding the white reference value as the maximum value, to the fluorescent color areas E1; and applies a second normalization process, which is a normalization process using the white reference value as the maximum value, to normal color areas outside the fluorescent color areas E1. This process can be expected to improve the reproducibility of the color tones of fluorescent colors inside the fluorescent color areas E1.
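Purely as an illustration, and not as the patented implementation, the two normalization processes might be sketched as follows in Python; the function names, the use of the NumPy library, and the handling of 8-bit pixel data are assumptions for this example, with the white reference value of 160 taken from the embodiment described below.

```python
# A minimal sketch of the first and second normalization processes.
import numpy as np

WHITE_REF = 160    # white reference value from the first setting (example value in this embodiment)
SENSOR_MAX = 255   # maximum output value of the color sensor

def first_normalization(area_pixels: np.ndarray) -> np.ndarray:
    """Normalize a fluorescent color area E1 using a reference value greater than
    the white reference value (here, the highest pixel value in the area)."""
    first_ref = int(area_pixels.max())   # above WHITE_REF, since the area contains specific pixels
    scaled = area_pixels.astype(np.float32) * SENSOR_MAX / first_ref
    return np.clip(scaled, 0, SENSOR_MAX).astype(np.uint8)

def second_normalization(area_pixels: np.ndarray) -> np.ndarray:
    """Normalize a normal color area using the white reference value as the maximum value."""
    scaled = area_pixels.astype(np.float32) * SENSOR_MAX / WHITE_REF
    return np.clip(scaled, 0, SENSOR_MAX).astype(np.uint8)
```

Because the second normalization scales by 255/160 while the first scales by the smaller factor 255/first_ref, the pixel values in normal color areas are raised more than those in fluorescent color areas, which is the behavior described above.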
Problems with conventional methods of applying a normalization process using the white reference value as the maximum value to fluorescent color areas E1 are described next.
Phosphors such as those used in fluorescent markers are characterized by converting a portion of the incident light to a different color. For example, a yellow phosphor absorbs part of the incident blue light and re-emits it as green and red light, so more green and red light is reflected than from white, and a particularly bright, vivid yellow is produced.
When yellow phosphors are read with an image scanner 30 configured with color sensors as described above, the green and red output values are higher than the white reference value and can exceed the dynamic range of the color sensor 33, while part of the blue light is absorbed as described above; as a result, the hue that is read differs from the hue in the document S. Note that the upper limit of the dynamic range of the color sensor 33 is referred to below as the maximum output value of the color sensor 33.
Before reading a document S, the image reading device 1 according to this embodiment of the invention therefore sets the scanning settings to a first setting in which the white reference value is a value lower than 255, the maximum output value of the color sensor 33.
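As a rough numerical illustration of why the first setting leaves headroom for fluorescent colors, consider the following sketch; the 1.5× reflectance figure is an assumed example and not a value taken from this description.

```python
# Illustrative arithmetic only; the reflectance gain is an assumption.
SENSOR_MAX = 255               # maximum output value of the color sensor
white_ref = 160                # the white reference plate is read as 160 under the first setting
fluorescent_green_gain = 1.5   # assumed: a fluorescent yellow returns ~1.5x as much green light as white
green_output = white_ref * fluorescent_green_gain   # 240.0
print(green_output <= SENSOR_MAX)   # True: the value stays within the dynamic range instead of saturating
```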
The image reading device 1 according to this embodiment also applies to the fluorescent color areas E1 in the scanned image a first normalization process using a value greater than the white reference value as the maximum value. This suppresses fluorescent colors being read as a different hue.
The image reading device 1 also applies a second normalization process using the white reference value as the maximum value to normal color areas. This suppresses normal colors from becoming too dark.
These processes are described in detail below.
If the image reading device 1 determines the document S was not conveyed to the indexing position, the image reading device 1 determines based on the detection result from the document sensor 21 whether or not a document S was introduced to the conveyance path 5.
The image reading device 1 reports an error if it determines a document S was not introduced to the conveyance path 5, and conveys the document S to the indexing position if it determines a document S was introduced to the conveyance path 5.
The image reading device 1 also sets the white reference value as part of the initialization process. The image reading device 1 reads the document presser 71, which is configured to function as a white reference plate, and sets the exposure time of the color sensor 33, the light output level of the white light source 31, and the gain so that the output value is 160 in this example. When the above initialization process ends, the image reading device 1 conveys and reads the document S, and acquires a scanned image (S12).
The image reading device 1 then determines whether or not there are specific pixels in the acquired scanned image (S13). In this embodiment, if there is a specific number of consecutive pixels with a pixel value greater than 160, which is the white reference value, detected along the X-axis or the Y-axis, and a group of these pixels occupies a specific area or greater, those pixels are determined to be specific pixels.
Specific pixels being consecutive means that there is a continuous train of specific pixels with no gaps equal to or greater than a specific number of pixels between them. For example, if that specific number is two and every other pixel is a specific pixel, the specific pixels are still considered consecutive, because each gap is only one pixel wide. This prevents isolated pixels whose value exceeds 160, the white reference value, because of noise from being identified as specific pixels. If the scanned image is determined to contain specific pixels (S13: Yes), the image reading device 1 then identifies the fluorescent color areas E1 containing the specific pixels (S14).
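As an illustration of this consecutive-pixel rule, the following Python sketch finds runs of specific pixels along one scan line; the thresholds (the white reference of 160 from the embodiment, and the illustrative run-length and gap values), the function name, and the omission of the area test across multiple lines are assumptions for this example.

```python
def find_specific_runs(row, white_ref=160, min_run=16, max_gap=2):
    """Return (start, end) index pairs of runs of specific pixels in one scan
    line, where gaps narrower than max_gap pixels do not break the run."""
    runs = []
    start = None
    gap = 0
    for x, value in enumerate(row):
        if value > white_ref:            # candidate specific pixel
            if start is None:
                start = x
            gap = 0
        elif start is not None:
            gap += 1
            if gap >= max_gap:           # gap wide enough to end the run
                end = x - gap            # last specific pixel in the run
                if end - start + 1 >= min_run:
                    runs.append((start, end))
                start, gap = None, 0
    if start is not None:                # run continues to the edge of the line
        end = len(row) - 1 - gap
        if end - start + 1 >= min_run:
            runs.append((start, end))
    return runs
```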
A method of identifying the fluorescent color areas E1 is described next.
Note also that if the set of pixels enclosed within an area of consecutive specific pixels occupies an area of a specific size or greater, those enclosed pixels may be treated as pixels in a normal color area.
When a fluorescent color area E1 is identified, the image reading device 1 sets a first reference value, which is the reference value for a first normalization process applied to the identified fluorescent color area E1 (S15). More specifically, the image reading device 1 sets the highest pixel value of the pixels in the identified fluorescent color area E1 as the first reference value.
A reference value as used herein denotes the pixel value that is mapped to the maximum output value of the color sensor 33 by the normalization process. When there are multiple fluorescent color areas E1 in the scanned image, the image reading device 1 sets a first reference value for each fluorescent color area E1.
The image reading device 1 then applies image processes including the first normalization process to the fluorescent color area E1 (S16). The image reading device 1 then applies image processes including the second normalization process using the white reference value set in the initialization process as the reference value to the normal color areas (S17). The image reading device 1 then generates an output image resulting from the image processes applied in S16 and S17 to the scanned image (S18), and outputs the generated output image to the external device 100 (S19).
However, if the image reading device 1 determines there are no specific pixels in the scanned image (S13: No), S14-S16 are skipped.
Specific examples of the processes executed in S16 and S17 are described next. This example describes operation when scanning the document S with the color mode set to color and generating a color output image.
The pixel set indicated by the reference numeral 82 shows the pixel values after applying the first normalization process and color conversion to pixel set 81. When generating a color output image, the image reading device 1 therefore applies the same linear conversion to RGB pixels.
Because this process calculates the pixel values after monochrome conversion based only on the green pixel values, linear conversion is applied only to the green pixels. Note that this process is not limited to using only one of the RGB pixel values, and the pixel values after monochrome conversion may be calculated using other known techniques, including a weighted average method that averages specifically weighted RGB pixel values.
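As an illustration of the monochrome conversion alternatives mentioned above, the following is a minimal Python sketch; the function names are hypothetical, and the BT.601 luma weights shown are one common choice of specifically weighted values rather than weights specified in this description.

```python
import numpy as np

def to_grayscale_green_only(rgb: np.ndarray) -> np.ndarray:
    """Monochrome conversion using only the green channel, as in the example above."""
    return rgb[..., 1].copy()

def to_grayscale_weighted(rgb: np.ndarray) -> np.ndarray:
    """Weighted-average alternative; the BT.601 weights are an illustrative choice."""
    weights = np.array([0.299, 0.587, 0.114], dtype=np.float32)
    return np.clip(rgb.astype(np.float32) @ weights, 0, 255).astype(np.uint8)
```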
A comparison of the results of these processes is shown in the accompanying figures.
As described above, the image reading device 1 according to this embodiment applies different normalization processes to fluorescent color areas E1 containing specific pixels in the scanned image, and to normal color areas other than the fluorescent color areas E1. More specifically, the image reading device 1 applies a first normalization process using a value greater than the white reference value as the maximum value to the fluorescent color areas E1, and applies a second normalization process using the white reference value as the maximum value to the normal color areas. As a result, the pixel values are corrected so that the pixel values of pixels in the scanned image are increased more in normal color areas than in fluorescent color areas E1. The image reading device 1 according to this embodiment of the invention also reads the document S with the white reference value of the scanning settings set to a value lower than the maximum output value of the color sensor 33.
By setting the white reference value to a value lower than the maximum output value of the color sensor 33, this configuration can prevent saturation of the output values of the color sensor 33. On the other hand, because the white reference value is set lower, normal colors in the scanned results are darker than in the actual document S. However, because the second normalization process applied to the normal color areas corrects the pixel values so that they are increased more than in the fluorescent color areas E1, normal colors do not become too dark and can be reproduced with natural brightness.
Furthermore, because the pixel values in the fluorescent color areas E1 are corrected by a first normalization process using the highest pixel value in the fluorescent color area E1 as the reference value, the contrast between the brightness of fluorescent colors in the fluorescent color area E1 and normal colors can be more accurately reproduced.
In addition, because the image reading device 1 according to this embodiment does not require a pre-scanning step to detect the presence of specific pixels, documents S containing fluorescent colors can be read in the same amount of time as documents S not containing fluorescent colors.
A second embodiment of the invention is described next. The first embodiment described above applies different processes to the fluorescent color area E1 and normal color areas, but this embodiment enables good reproduction of documents S containing fluorescent colors on the external device 100 side by adding information identifying fluorescent color areas E1 to the output image based on the scanned image.
This embodiment is described focusing on differences with the first embodiment. Like parts in this embodiment and the first embodiment described above are identified by like reference numerals, and further description thereof is omitted. Variations applicable to like elements in this embodiment and the first embodiment are also applicable to this embodiment.
The image reading device 1 then determines if specific pixels are contained in the acquired scanned image (S23), and if there are no specific pixels (S23: No), executes normal image processing steps such as shading correction, gamma correction, line correction, and skew correction (S24). The image reading device 1 then generates an output image by applying these normal image processes to the acquired scanned image (S25), and outputs the generated output image to the external device 100 (S26).
If the scanned image is determined to contain specific pixels (S23: Yes), the image reading device 1 identifies the fluorescent color areas E1 containing the specific pixels (S27). Step S27 is an example of an identification step in the accompanying claims.
The image reading device 1 may also apply a shape correction process to correct the shape of the area so that the fluorescent color area E1 is rectangular.
The image reading device 1 then applies image processes including a color conversion process to the identified fluorescent color area E1 (S28). This color conversion process is described next.
The image reading device 1 converts the color of specific pixels in the fluorescent color area E1 to a color based on the surrounding pixels outside the fluorescent color area E1. When the specific pixels have a fluorescent color, this color conversion process can remove the fluorescent color. The image processes including the color conversion process also include normal image processes such as shading correction, gamma correction, line correction, and skew correction.
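A simplified sketch of such a color conversion process follows; it replaces every pixel in the fluorescent color area with the average color of the pixels immediately surrounding the area. The function name, the border width, and the use of a boolean mask and NumPy are assumptions for this example.

```python
import numpy as np

def convert_fluorescent_area(image: np.ndarray, area_mask: np.ndarray, border: int = 4) -> np.ndarray:
    """Replace pixels inside the fluorescent color area with the average color of
    the pixels immediately surrounding the area. `area_mask` is True inside the area."""
    if not area_mask.any():
        return image.copy()
    ys, xs = np.where(area_mask)
    y0, y1, x0, x1 = ys.min(), ys.max(), xs.min(), xs.max()
    # Bounding box of the area, expanded by `border` pixels and clipped to the image.
    ey0, ey1 = max(y0 - border, 0), min(y1 + border, image.shape[0] - 1)
    ex0, ex1 = max(x0 - border, 0), min(x1 + border, image.shape[1] - 1)
    surround = np.zeros(image.shape[:2], dtype=bool)
    surround[ey0:ey1 + 1, ex0:ex1 + 1] = True
    surround &= ~area_mask                       # keep only pixels outside the area
    replacement = image[surround].mean(axis=0)   # average color of the surrounding pixels
    out = image.copy()
    out[area_mask] = replacement.astype(image.dtype)
    return out
```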
Returning to the flow of the process, the image reading device 1 then adds area information to the pixels corresponding to the fluorescent color areas E1 in the scanned image, and generates output information including the output image and the area information.
Note that “pixels corresponding to a fluorescent color area E1 in the scanned image” means “pixels contained in an area in the output image that correspond to a fluorescent color area E1 in the scanned image.” An area in the output image that corresponds to a fluorescent color area E1 in the scanned image is referred to below as a ‘fluorescent color correspondence area of the output image.’
Steps S28 to S30 are an example of a generating step in the accompanying claims. In addition, area information is an example of information identifying a first area.
The color information comprises channels providing the RGB color information. The color information is expressed as an 8-bit pixel value for each color.
The α channel is a channel indicating the area information. The area information is a 1-bit value indicating whether or not the pixel is in a fluorescent color correspondence area of the output image, that is, whether or not the pixel is in a fluorescent color area E1 in the scanned image. If the pixel is in a fluorescent color correspondence area of the output image, this bit is set to 1; if the pixel is not in a fluorescent color correspondence area of the output image, the bit is set to 0. In other words, an area information value of 1 indicating the pixel is in a fluorescent color correspondence area is added to all pixels in a fluorescent color correspondence area of the output image.
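A minimal sketch of attaching the area information as an α channel is shown below; the function name is hypothetical, and storing the 1-bit flag as a full byte per pixel is an implementation convenience for this example rather than a requirement of this description.

```python
import numpy as np

def build_output_with_area_info(output_rgb: np.ndarray, area_mask: np.ndarray) -> np.ndarray:
    """Attach the area information as an alpha channel: 1 for pixels in a
    fluorescent color correspondence area, 0 for all other pixels."""
    alpha = area_mask.astype(np.uint8)      # 1 inside the area, 0 outside
    return np.dstack([output_rgb, alpha])   # shape (H, W, 4): R, G, B, area flag
```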
Returning to the flow of the process, the image reading device 1 then outputs the generated output information to the external device 100.
Information identifying a fluorescent color area E1 does not necessarily need to be added to individual pixels as in the area information above, and may be added as coordinate data identifying the location of the fluorescent color area E1, for example, to the file header or a separate file. When added as a separate file, the file set of the image file of the output image and the file identifying the locations of fluorescent color areas E1 is equivalent to the output information.
The information identifying the fluorescent color areas E1 may be expressed by three or more coordinate data, and the area defined by the coordinates based on the coordinate data may be identified as a fluorescent color area E2. When the document S is a text document, the page count, number of lines, number of columns, number of characters from a specific character, and other information may also be included as information identifying a fluorescent color area E1.
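If the information identifying the fluorescent color areas E1 is written to a separate file, it might, for example, take a form like the following sketch; the JSON layout, key names, and bounding-box representation are illustrative assumptions only.

```python
import json

def write_area_coordinates(path: str, areas) -> None:
    """Write coordinate data identifying each fluorescent color area to a separate
    file; each area is described here by the four corners of its bounding box."""
    records = [
        {"area_id": i,
         "corners": [[x0, y0], [x1, y0], [x1, y1], [x0, y1]]}
        for i, (x0, y0, x1, y1) in enumerate(areas)
    ]
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"fluorescent_areas": records}, f, indent=2)
```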
Information identifying a fluorescent color area E1 is also not limited to the location of the fluorescent color, and may include information identifying what color the fluorescent color is. For example, a pixel that is not a fluorescent color may be identified by a 0 bit in the α channel, and fluorescent colors may be identified by a specific value identifying a specific fluorescent color.
RGB pixel values are used as an example of color information above, but if the image processor 40 is configured to convert colors for printing, CMYK color information may be generated. Color information based on YCbCr, Lab, and other color spaces may also be generated.
As described above, the image reading device 1 according to the second embodiment of the invention adds area information identifying pixels in a fluorescent color area E1 to the pixels in the fluorescent color correspondence area of the output image based on the scanned image, and generates output information that is output to an external device 100. Based on the area information, an external device 100 in this configuration can appropriately process fluorescent color correspondence areas in the output image.
Because fluorescent colors are generally used in places a user wants to highlight, outputting the output information to an external device 100 enables the fluorescent color correspondence areas to be converted to a desired color and emphasized by a highlighting process on the external device 100 side.
If the external device 100 is a printer, or if the image reading device 1 is part of a multifunction device including a printer unit, a printed image resulting from the desired color conversion and highlighting process can be output by the printer or the printer unit. In addition, if the printer or the printer unit can print using fluorescent materials, a printout accurately reproducing the document S can be produced.
When the document S is scanned with the color setting set to the monochrome mode, the output image will be produced with the fluorescent color correspondence area reproduced as a solid gray or black. However, because the printer or printer unit in this configuration can highlight the fluorescent color areas, the fluorescent color area E1 can be highlighted regardless of the color setting.
Yet further, because the image reading device 1 adds the area information to the pixels in the fluorescent color correspondence area, the external device 100 that acquires the output information can determine the fluorescent color correspondence area on a pixel-by-pixel basis.
Furthermore, because there is no need to add coordinate data as information identifying a fluorescent color area E1, problems such as an increase in the amount of data in the output information when the shape of the fluorescent color correspondence area is complex can be avoided.
In addition, by adding the area information to the α channel, the external device 100 that acquires the output information can easily determine whether or not a particular pixel is in a fluorescent color correspondence area.
By applying a color conversion process that converts the color of specific pixels in the fluorescent color areas E1 to a color based on the pixels surrounding the fluorescent color area E1, the image reading device 1 also broadens the range of processes that can be executed on the printer or other device that acquires the output information. In other words, when the image reading device 1 does not execute the color conversion process, the output image is output with the fluorescent colors remaining in the fluorescent color correspondence area; if a highlighting process is then applied on the external device 100 side, it is applied on top of the remaining fluorescent colors, and the appearance of the image may be impaired. The processes that can be executed on the external device 100 side are therefore limited when color conversion is not applied on the image reading device 1, but because the image reading device 1 according to this embodiment executes the color conversion process, this problem is prevented.
A third embodiment of the invention is described next. The second embodiment described above generates output information including the output image and information identifying fluorescent color areas E1. This embodiment generates an output image by merging an image identifying the fluorescent color areas E1 with the scanned image.
This embodiment is described focusing on differences with the second embodiment. Like parts in this embodiment and the first and second embodiments described above are identified by like reference numerals, and further description thereof is omitted. Variations applicable to like elements in this embodiment and the first and second embodiments are also applicable to this embodiment.
When the document S reading process starts, the image reading device 1 first executes the initialization process (S41), and then acquires a scanned image (S42). The image reading device 1 then determines if specific pixels are contained in the acquired scanned image (S43), and if it determines there are no specific pixels (S43: No), executes normal image processing steps such as shading correction, gamma correction, line correction, and skew correction (S44).
If the scanned image is determined to contain specific pixels (S43: Yes), the image reading device 1 identifies the fluorescent color areas E1 containing the specific pixels (S45). The image reading device 1 also generates an area identification image I identifying the fluorescent color area E1, as described below.
The image reading device 1 then executes image processes including a color conversion process and a merging process that merges the area identification image I with the fluorescent color areas E1 in the scanned image (S47). The color conversion process is the same as in the second embodiment.
Generating the area identification image I and the merging process are described below.
Returning to the flow of the process, the image reading device 1 then generates an output image from the scanned image to which the above image processes were applied, and outputs the generated output image to the external device 100.
Note that the area identification image I is not limited to a contour line surrounding the fluorescent color area E1 as shown in the figures, and may be, for example, an image overlaid on the fluorescent color area E1, or a masking image that masks the area outside the fluorescent color area E1.
If a masking image is merged with the scanned image, the area outside the fluorescent color area E1 is masked, and the fluorescent color area E1 can be emphasized as a result. The overlaid image and the masking image may also be images with transparency, or images generated on a different layer than the scanned image.
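As an illustration of the merging process, the following sketch draws a contour-line area identification image I onto the scanned image and also shows a simple masking-image variant; the contour color, line thickness, masking gray level, and function names are assumptions for this example.

```python
import numpy as np

def merge_contour(image: np.ndarray, area_mask: np.ndarray,
                  color=(255, 0, 0), thickness: int = 2) -> np.ndarray:
    """Merge a contour-line area identification image with the scanned image.
    The contour is the band of area pixels whose neighborhood reaches outside the area."""
    out = image.copy()
    eroded = area_mask.copy()
    for _ in range(thickness):              # crude 4-neighborhood erosion without external deps
        shrunk = eroded.copy()
        shrunk[1:, :] &= eroded[:-1, :]
        shrunk[:-1, :] &= eroded[1:, :]
        shrunk[:, 1:] &= eroded[:, :-1]
        shrunk[:, :-1] &= eroded[:, 1:]
        eroded = shrunk
    contour = area_mask & ~eroded
    out[contour] = color
    return out

def merge_mask(image: np.ndarray, area_mask: np.ndarray, gray: int = 192) -> np.ndarray:
    """Masking-image variant: pixels outside the fluorescent color area are
    replaced with a flat gray so the area itself stands out."""
    out = image.copy()
    out[~area_mask] = gray
    return out
```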
The image reading device 1 may also change the type of the area identification image I and the color according to the color information of the specific pixels in the fluorescent color area E1 and the color setting. The image reading device 1 may also provide multiple area identification images I of different types and colors as candidates for the area identification image I, and merge the area identification image I selected by the user.
As described above, the image reading device 1 according to the third embodiment of the invention generates an output image by applying a merging process that merges an area identification image I identifying a fluorescent color area E1 with the scanned image. This configuration enables highlighting the fluorescent color area E1.
If the external device 100 is a printer, or if the image reading device 1 and a printer unit are an integrated multifunction device, the printer or the printer unit may not be able to satisfactorily reproduce the fluorescent colors, but this embodiment of the invention can reliably highlight the fluorescent color areas E1 in any situation.
More particularly, if the document S is scanned with the color mode set to monochrome, the output image will be generated with the fluorescent color area E1 reproduced as a solid gray or black area, but this embodiment of the invention can highlight fluorescent color areas E1 regardless of the color setting.
Three preferred embodiments of the invention are described above, but the invention is not so limited and the foregoing embodiments may be combined as desired with the variations described below.
In the foregoing embodiments the controller 60 sets the white reference value so the output value of the color sensor 33 is 160 when the maximum output value of the color sensor 33 is 255 and white is read, but these values are only examples. For example, the maximum output value of the color sensor 33 may be 1023, and the white reference value may be set to 255.
Note that 1023 is simply one example of the maximum output value of the color sensor 33, and a 9 bit, 12 bit, or other value may be used. In this configuration, the second normalization process is not limited to a linear conversion, and a simple bit masking process taking the lowest eight bits, for example, may be used.
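Under the assumption that the white reference value in this variation is 255, so that normal-color pixel values already fit in the lowest eight bits, the bit masking mentioned above might be sketched as follows; the function name is hypothetical.

```python
def second_normalization_10bit(value: int) -> int:
    """Second normalization as a simple bit mask for a 10-bit sensor: for
    normal-color pixels at or below the white reference value of 255 this is
    the identity; pixels above the white reference belong to fluorescent color
    areas and are handled by the first normalization instead."""
    return value & 0xFF
```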
In the foregoing embodiments the controller 60 defines the inside of an area where consecutive specific pixels are present as part of the fluorescent color area E1; more specifically, when a text character is inside the fluorescent color area E1, the lines of the character are also defined as part of the fluorescent color area E1. However, there are various ways of determining the fluorescent color area E1.
For example, the fluorescent color area may be defined to exclude the lines of characters.
The image processor 40 in the first embodiment applies a first normalization process to fluorescent color areas E1 in the scanned image, and applies a second normalization process to normal color areas, but may instead apply the first normalization process to the entire scanned image when there are specific pixels in the scanned image, and apply the second normalization process to the entire scanned image when there are no specific pixels in the scanned image.
In other words, when there are specific pixels somewhere in the scanned image, the entire scanned image may be treated as a fluorescent color area E1. When there are no specific pixels in the scanned image, the pixel values may be corrected so that the pixel values of all pixels in the scanned image are greater than when there are specific pixels in the scanned image. In addition, when there are no specific pixels in the scanned image, the white reference value may be set as the maximum output value of the color sensor 33 instead of setting the scanning settings to a first setting.
Another example is a configuration that does not determine whether there are specific pixels in the scanned image. In this configuration, the white reference value is set to 75% to 95% of the maximum output value of the color sensor 33, and the output values of the color sensor 33 are used as is, without normalization, for example, to generate the output image. This configuration produces scanning results in which normal color areas are slightly dark, but can improve the reproducibility of fluorescent colors without special image processing. In an operating mode that assumes specific pixels are present in the scanned image, this configuration also shortens processing by the amount of time that would otherwise be spent determining whether specific pixels are present.
In the first embodiment described above, the controller 60 uses the white reference value as the reference value in the second normalization process, but instead of using the white reference value, may use a value based on the white reference value, such as the result of a specific value added to, subtracted from, or multiplied by the white reference value, as the reference value.
In the first normalization process the highest pixel value in the fluorescent color area E1 is set as the first reference value, but the first reference value is not so limited, and may be set to any value greater than the white reference value. For example, a value greater than the highest pixel value in the fluorescent color area E1 may be set as the first reference value.
In yet another example, the first reference value may be set in the first normalization process so that the highest pixel value in the fluorescent color area E1 is 2^n − 1. This configuration enables using a simple bit shifting process as the first normalization process.
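A sketch of such a bit shifting process follows, assuming the first reference value is 2^n − 1 with n greater than 8 (for example, n = 10 for a 10-bit maximum output value); the function name and the parameterization are illustrative only.

```python
def first_normalization_shift(value: int, n: int = 10) -> int:
    """First normalization as a bit shift, assuming the first reference value for
    the fluorescent color area is 2**n - 1; shifting right by (n - 8) bits maps
    that reference value onto the 8-bit maximum of 255."""
    return value >> (n - 8)
```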
In the first embodiment described above the image processor 40 executes the normalization process using a linear conversion, but the normalization process may use methods other than a linear conversion. More specifically, the normalization process may correct pixel values based on a specific rule using the first reference value or white reference value as the maximum value.
The controller 60 in the first embodiment described above sets a first reference value for each fluorescent color area E1 when there are multiple fluorescent color areas E1 in the scanned image, but may set a common first reference value for all fluorescent color areas E1 contained in the scanned image. In addition, when the document S is a cut-sheet document, a common first reference value may be set for all fluorescent color areas E1 contained in the scanned image of a single page.
In the second and third embodiments described above, the controller 60 executes the same process regardless of the color setting, but may be configured to add information identifying the fluorescent color areas E1 to the output image, or merge an image identifying the fluorescent color area E1 with the scanned image, when the color setting is set to monochrome, or configured to not execute these processes when the color setting is set to color.
In the embodiments described above the image scanner 30 has color filters disposed to the color sensor 33, and uses a color sensor method that scans images with a white light source 31, but other scanning methods may be used, such as a light source color switching method that changes the color of the light emitted from the light source while scanning.
In the embodiments described above the image reading device 1 has a document presser 71 that also functions as a white reference plate, but may be configured with a white reference plate that is separate from the document presser, or configured without a white reference plate. If configured without a white reference plate, a white reference plate may be temporarily attached in the factory during production, the white reference value measured and stored in memory, and the white reference plate then removed prior to shipping. In this case, during actual use, the white reference value to be used for scanning is calculated by correcting the white reference value stored before shipping, and the calculated white reference value may be used for scanning.
The foregoing embodiments describe an image reading device 1 having only a scanner function for outputting images, but the image reading device according to the invention may also be used in a multifunction device having one or more of a printer function for producing printouts, a facsimile function for sending and receiving faxes, an external memory function for outputting files to an external memory device, and a display for displaying output.
The scanner may also be a sheet-feed scanner that scans while conveying documents S, or a flatbed scanner that scans documents S placed on a platen.
The image reading device of the invention can also be applied to imaging devices other than scanners, and to handheld devices having a built-in camera.
Methods of executing processes of the image reading device 1 described in the foregoing embodiments and variations, a program for executing the processes of the image reading device 1, and a computer-readable storage medium storing such a program, are also included in the scope of the invention. Configurations combining the foregoing embodiments and variations in various ways are also conceivable without departing from the scope of the invention as described in the accompanying claims.