Scanning is the process of generating a digital representation of a hardcopy image, such as an image printed on media like paper. Scanning involves outputting light onto the hardcopy image, and detecting the light as reflected by the image. This process is performed for each of a number of locations on the image. Full color scanning generally involves sensing values for all color components on the locations of a hardcopy image. For example, a red value, a green value, and a blue value may together describe the color at a given location of an image.
A difficulty with full color scanning, however, is that not all the color components are equally detectable. For instance, some optical sensors are less responsive to blue light than they are to red and green light. To overcome this problem, scanning speed may be decreased so that blue light is adequately detected, but this decreases overall scanning performance. The size of the blue sensor may be increased, but this adds cost to the scanning device. Other proposed solutions have similar drawbacks.
The media sheet 102, inclusive of the image 104, is logically divisible into a number of locations 106A, 106B, . . . , 106N, collectively referred to as the locations 106. The number of locations 106 is generally dependent on the resolution of the scanning device used to scan the media sheet 102. For example, a scanning device having a resolution of 300 lines per inch (LPI) may be able to detect locations on the media sheet 102 that are as small as 1/300 of an inch along a given dimension. Thus, the locations 106 are not inherent to the media sheet 102 or the image 104, but rather are a function of the scanning process employed to generate a digital representation thereof.
It is noted that the locations 106 as a whole are actually on the media sheet 102, such that just a portion of the locations 106 represents the image 104. However, the terminology “locations on the image” is nevertheless used for descriptive convenience. Such locations can include all the locations on the media sheet 102 encompassing the image 104, and not just the locations where the image 104 is located, as can be appreciated by those of ordinary skill within the art.
Each of the locations 106 when scanned has a number of color components, where the color components together are completely descriptive of the color of the location in question. For example, typically a full color digital representation of an image includes a value for a red color component, a value for a green color component, and a value for a blue color component of each location on the image. These red, green, and blue values for each location fully describe the color of that location. The red, green, and blue values for all the locations together are the full color digital representation of the image.
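For illustrative purposes only, such a representation may be sketched as follows in Python, where the particular values and the 0-to-255 component range are hypothetical examples rather than requirements:

# A full color digital representation: one (red, green, blue) triple per
# location, with each value ranging from 0 (none) to 255 (full intensity).
# The values shown are hypothetical.
full_color_image = [
    [(255, 0, 0), (0, 255, 0)],      # first row of locations: red, then green
    [(0, 0, 255), (128, 128, 128)],  # second row: blue, then a medium gray
]

red, green, blue = full_color_image[0][0]  # color components of one location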
Within the prior art, typically each of these color components is individually detected using an optical sensor, as has been alluded to in the background. Thus, for each location of an image, a red value is optically detected, a green value is optically detected, and a blue value is optically detected. However, as has been noted in the background, some colors of light, such as blue light, are more difficult to detect than other colors of light with certain types of optical sensors, such as charge-coupled devices (CCD's). Embodiments of the invention overcome such problems, as is now described.
Optical sensors 204R, 204G, and 204W, collectively referred to as the optical sensors 204, sense or detect the white light 208′ reflected by the location 106A in different ways. The optical sensors 204 may be charge-coupled devices (CCD's), or other types of optical sensors. The optical sensor 204W detects a grayscale response of the white light 208′, which may be referred to as the gray component, or gray or grayscale value, of the location 106A. For instance, if just the optical sensor 204W were used to scan all the locations 106, the resulting digital representation would be a grayscale representation of the image, as opposed to a full color representation of the image.
The grayscale response may be non-restrictively defined as follows. First, it may be defined as a series of achromatic tones having varying proportions of white and black, to give a full range of grays between white and black. Second, it may be considered as a series of shades from white to black.
Optical filters 206R and 206G particularly filter the white light 208′ before the white light 208′ is detected or sensed by the optical sensors 204R and 204G. The optical filter 206R substantially permits just red light to reach the optical sensor 204R, whereas the optical filter 206G substantially permits just green light to reach the optical sensor 204G. Stated another way, the optical filter 206R substantially permits just red frequencies of the white light 208′ to pass, whereas the optical filter 206G substantially permits just green frequencies of the white light 208′ to pass.
Therefore, the optical sensor 204R detects a red response of the white light 208′, which may be referred to as the red component, or a red value, of the location 106A. Likewise, the optical sensor 204G detects a green response of the white light 208′, which may be referred to as the green component, or a green value, of the location 106A. Together with the grayscale response detected by the optical sensor 204W, then, three different components are scanned for the location 106A: a red component, a green component, and a gray component.
By themselves, the red, green, and gray components are insufficient to provide a full color digital representation at the location 106A. In particular, a blue component is missing. Rather than directly optically sensing the blue component at the location 106A, as in the prior art, one embodiment of the invention instead generates, or calculates, the blue component from the red, green, and gray components that have been directly optically scanned.
For instance, in one embodiment, the following equation may be used to generate the blue component at the location 106A without actually directly optically sensing the blue component using an optical sensor:
BLUE=c*GRAY−RED−GREEN
In this equation, GRAY is the value of the gray component that has been optically scanned by the sensor 204W, RED is the value of the red component that has been optically scanned by the sensor 204R, and GREEN is the value of the green component that has been optically scanned by the sensor 204G. The value c is a constant, which in one embodiment can be three, for instance. Therefore, the value of the blue component, BLUE, is calculated without itself having to be optically scanned. In other embodiments, c may be empirically determined to produce the most accurate blue values.
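For illustrative purposes, this calculation may be sketched in Python as follows, where the function name and the assumption of 8-bit component values ranging from 0 to 255 are merely examples:

def generate_blue(gray, red, green, c=3.0):
    # BLUE = c * GRAY - RED - GREEN, where GRAY, RED, and GREEN are the
    # optically scanned component values for a single location.
    blue = c * gray - red - green
    # Clamp to the valid component range, assuming 8-bit values from 0 to 255.
    return max(0, min(255, round(blue)))

# Example: a location scanned as gray = 100, red = 90, green = 110 yields a
# generated blue value of 3 * 100 - 90 - 110 = 100.
print(generate_blue(100, 90, 110))  # prints 100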
Other types of equations and transformations may be employed that are more sophisticated than the equation presented above, as can be appreciated by those of ordinary skill within the art. The above equation represents the ideal scenario in which there is no noise or crosstalk among the sensors 204, no ambient light effects, and so on. Where noise, crosstalk, ambient light, and so on, are problematic, empirically tested transformations may be employed to generate the blue component from the gray, red, and green color components while reducing these effects.
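As one non-limiting illustration of such an empirically determined transformation, the coefficients of an affine model may be fit by least squares from calibration scans for which reference blue values are known. The following Python sketch uses hypothetical calibration values; neither the model nor the values are requirements:

import numpy as np

# Calibration data: optically scanned gray, red, and green values for a set of
# locations, together with known reference blue values for the same locations.
# The numbers below are hypothetical placeholders.
gray  = np.array([100.0, 120.0, 80.0, 200.0])
red   = np.array([ 90.0, 130.0, 60.0, 190.0])
green = np.array([110.0, 100.0, 70.0, 210.0])
blue_reference = np.array([100.0, 130.0, 110.0, 200.0])

# Fit blue ~ a*gray + b*red + c*green + d by least squares, so that noise,
# crosstalk, and ambient light effects present in the calibration scans are
# absorbed into the fitted coefficients.
A = np.column_stack([gray, red, green, np.ones_like(gray)])
coefficients, *_ = np.linalg.lstsq(A, blue_reference, rcond=None)

# The fitted coefficients can then be applied to newly scanned locations.
def generate_blue_calibrated(g, r, gn):
    a, b, c, d = coefficients
    return a * g + b * r + c * gn + d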
Furthermore, the embodiment of
The exposure times of the optical sensors 204 may also vary. In the example of
It is noted that the embodiment of
Such optical scanning may be achieved in one embodiment as follows. White light is output onto the location in question (308). Thereafter, for each color component, colored light corresponding to the color component, as reflected at the location, is detected (308). For example, an optical sensor with a green optical filter thereover may be turned on for a predetermined length of time to generate the green value for the location, and an optical sensor with a red optical filter thereover may be turned on for the same or different length of time to generate the red value for the location. Thus, the colored light in each case results from the white light being reflected at the location, and then passing through a correspondingly colored filter before reaching a given optical sensor.
For the gray component at the location, the white light as reflected by the location is optically detected (310), to yield the gray value for the location. An optical sensor with no optical filter thereover may be turned on for the same or different length of time to generate this gray value for the location. The end result is that there are color component values and a gray component value for the location in question. However, these values are insufficient to fully describe the color at the location of the image.
Therefore, an additional color component is generated for the location from the optically scanned color components and from the optically scanned gray component (312). For instance, where red, green, and gray components have been optically scanned, a blue component may be generated as has been described in relation to
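For illustrative purposes, the processing of parts 308, 310, and 312 for a single location may be sketched in Python as follows, where the sensor readings are represented by hypothetical placeholder callables rather than by any actual hardware interface:

def scan_location(read_red, read_green, read_gray, c=3.0):
    # Part 308: white light is output onto the location, and the sensors
    # behind the red and green filters detect the filtered, reflected light.
    red = read_red()
    green = read_green()
    # Part 310: the unfiltered sensor detects the reflected white light,
    # yielding the gray component of the location.
    gray = read_gray()
    # Part 312: the blue component is generated rather than optically scanned.
    blue = c * gray - red - green
    return red, green, blue

# Hypothetical readings standing in for the hardware of parts 308 and 310.
print(scan_location(lambda: 90, lambda: 110, lambda: 100))  # (90, 110, 100.0)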
The optical sensors 402 optically sense a gray component of an image, and color components of the image, but do not optically sense all the color components needed to describe the image in full color in one embodiment of the invention. The optical sensors 402 may thus include a sensor for detecting grayscale values, a sensor for detecting red values, and a sensor for detecting green values, but not a sensor for detecting blue values, for instance. The optical sensors 402 may be or include the sensors 204 of
The white light sources 404 output white light inclusive of substantially all the visible light wavelengths. The white light is output incident to locations on an image, as has been described in relation to
The generation mechanism 406 may be implemented in hardware, software, or a combination of hardware and software. The generation mechanism 406 generates color component values for the image from the optically scanned gray and color component values of the image so that a full description of the image can be provided within a resulting digital representation. For example, as has been described, where gray, red, and green values are directly optically detected for each location of an image, the blue value for each location may be generated from these optically detected values.
The advancement mechanism 408 may be or include one or more motors. The advancement mechanism 408 moves the media sheet 102 in relation to the optical sensors 402 and/or the white light sources 404, so that each location on the media sheet 102 may be optically scanned. For example, the optical sensors 402 may be arranged in a linear array corresponding in length to the short side of a typical letter-sized sheet of media. A given line, or swath, of the sheet may be optically scanned by the optical sensors 402, and then the advancement mechanism 408 may advance the sheet so that the next line or swath is optically scanned. This process can be repeated until the entire sheet has been optically scanned.
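For illustrative purposes, the overall swath-by-swath operation may be sketched in Python as follows, where the per-swath sensor readings are assumed to have already been gathered as lists of values, the advancement of the media sheet 102 between swaths is represented only implicitly, and the values shown are hypothetical:

def scan_sheet(gray_rows, red_rows, green_rows, c=3.0):
    # Each entry of the input lists holds the values read by the sensor array
    # for one line, or swath, of the media sheet; the advancement mechanism is
    # assumed to have positioned the sheet for each swath in turn.
    digital_representation = []
    for gray_row, red_row, green_row in zip(gray_rows, red_rows, green_rows):
        # The generation mechanism supplies the blue value for each location
        # from the scanned gray, red, and green values.
        blue_row = [c * g - r - gn for g, r, gn in zip(gray_row, red_row, green_row)]
        digital_representation.append(list(zip(red_row, green_row, blue_row)))
    return digital_representation

# Two hypothetical swaths, each with two locations.
print(scan_sheet(
    gray_rows=[[100, 120], [80, 200]],
    red_rows=[[90, 130], [60, 190]],
    green_rows=[[110, 100], [70, 210]],
))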