The present application claims the priority based on Japanese Patent Applications No. 2006-329112 filed on Dec. 6, 2006 and No. 2007-264691 filed on Oct. 10, 2007, the disclosures of which are hereby incorporated by reference in their entirety.
1. Technical Field
The invention relates to a technique for detecting blurring in images.
2. Related Art
Digital still cameras have recently become popular, and the capacity of the memory cards used in them has expanded. As a result, more and more general users are storing greater numbers of images. Digital still cameras require no film and allow photographs to be taken more casually, often resulting in photographs marred by unintentional camera shake or object motion. Images are thus relatively often blurred due to camera shake or object motion, and attempts to print such images on a printing apparatus require normal images to be selected beforehand.
It is extremely cumbersome to have to select normal images from among an abundance of images. A desirable technique would therefore automatically exclude blurred images from the images to be printed before the user prints them. In relation to such a technique for detecting blurring, JP-A-2006-19874 discloses a technique for detecting whether or not there is any blurring in images based on bit map data in the digital still camera used to photograph the images.
However, since recent digital still cameras can photograph images with a high resolution of several million to ten million pixels, the bit map data volume can be quite extensive. As a result, detecting blurring based on bit map data in compact devices such as digital still cameras and printers requires CPUs with a high processing capacity and a greater memory volume, resulting in greater manufacturing costs.
In view of the various problems noted above, an object which the present invention is intended to address is to detect blurring in images while minimizing the processing burden or the memory volume that is used.
In view of the foregoing object, the blurring determination device in an aspect of the invention comprises: an image data reference module configured to reference image data in which have been recorded coefficients that are obtained when pixel values forming the image in the spatial domain are converted to the frequency domain; an edge detection module configured to detect edges oriented in two or more directions, from among the image data, by comparing a series of the coefficients in each of the directions with various types of basic edge patterns whereby typical gradient patterns of the changes in pixel values are represented by values corresponding to the coefficients; and a blurring determination module configured to determine the representative values of the width of the detected edges in each of the directions and determine that the image data is not blurred when the representative values meet the condition of being at or below a certain threshold.
According to the blurring determination device in the above aspect, the coefficients recorded in the image data are used as such, without being converted to pixel values, to determine whether images are blurred. Blurring can thus be rapidly determined, with less of a process load for determining blurring. In addition, according to the blurring determination device in the above aspect, there is no need to ensure memory area for the conversion of the coefficients to pixel values during the blurring determination process. The memory volume that is used can therefore be decreased. Furthermore, according to the blurring determination device in the above aspect, edges are detected in two or more directions among the image data, and images are determined not to be blurred when the representative values of the width of the edges in each direction are at or below a certain threshold. Blurring can thus be accurately determined without depending on the direction of blurring. “Edge” refers to a border where there is a precipitous change in pixel values (such as luminance, hue, or RGB values) in an image. “Edge width” refers to the width of the border. When the “edge width” expands, the border component becomes blurred. The “edge direction” refers to the normal direction of the border noted above.
Aspects of the invention other than the above blurring determination device can also comprise a printing apparatus, blurring determination method, and computer program. The computer program may be recorded on computer-readable recording media. Examples of recording media include a variety of media such as floppy disks, CD-ROM, DVD-ROM, magneto-optical disks, memory cards, and hard disks.
These and other objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments with the accompanying drawings.
Modes for implementing the invention will be elaborated in the following order based on the embodiments below.
1st Embodiment
A. Printer Structure
B. Printing Process
C. Blurring Determination Process
D. Effects
2nd Embodiment
3rd Embodiment
4. Modifications
5. Other Aspects
A. Printer Structure
The printer 100 is equipped with an operating panel 140 for a variety of printing-related operations. A liquid crystal display 145 is provided in the center of the operating panel 140. Displayed on the liquid crystal display 145 are images read from the memory card MC or a digital camera, as well as a GUI (graphical user interface) used when the various functions of the printer 100 are employed.
The printer 100 has the function of eliminating blurred images (“blurred images”) from among the plurality of image data input from the memory card MC or digital camera, and extracting images that are focused in even one location (“focused images”) for display on the liquid crystal display 145. The user can select desired images suitable for printing from among the images displayed on the liquid crystal display 145. The printer 100 structure and process for executing the function of eliminating blurred images are described in detail below.
The carriage 210 is equipped with a total of 6 ink heads 211 corresponding to the inks representing the colors of cyan, magenta, yellow, black, light cyan, and light magenta. The ink cartridges 212 housing these inks are mounted on the carriage 210. The inks supplied from the ink cartridges 212 to the ink heads 211 are ejected onto the printing paper P when piezo elements (not shown) are actuated.
The carriage 210 is movably supported by a sliding shaft 280 located parallel to the axial direction of the platen 270. The carriage motor 220 rotates a drive belt 260 according to commands from a control unit 150, so that the carriage 210 travels reciprocally parallel to the axial direction of the platen 270, that is, in the main scanning direction. The paper feed motor 230 rotates the platen 270 to feed printing paper P perpendicular to the axial direction of the platen 270. That is, the paper feed motor 230 can relatively move the carriage 210 in the sub-scanning direction.
The printer 100 is equipped with the control unit 150 to control the operation of the above ink heads 211, carriage motor 220, and paper feed motor 230. Connected to the control unit 150 are the scanner 110, the memory card slot 120, the USB interface 130, the operating panel 140, and the liquid crystal display 145 which are illustrated in
The control unit 150 comprises a CPU 160, a RAM 170, and a ROM 180. Stored in the ROM 180 are a control program for controlling the operation of the printer 100 and an edge pattern table 181 used in the blurring determination process described below. The CPU 160 runs the control program stored in the ROM 180 by loading it to the RAM 170 to execute the illustrated functional modules (161 to 163).
The control unit 150 is equipped with an image data reference module 161, blurring determination module 162, and JPEG decoder 163 as the functional modules run by the CPU 160. The operations of these functional modules are briefly described below (see the contents of the various processes described below for more detailed operations).
The image data reference module 161 is a module by which the JPEG format image data (“JPEG data” below) recorded on the memory card MC or digital camera is referenced through the memory card slot 120 or USB interface 130. Images are recorded in 8 pixel×8 pixel block units in JPEG data. The image data in these blocks is compressed in the following order: 1) conversion of pixel values from RGB color space to YCbCr color space; 2) discrete cosine transform (DCT) from the spatial domain to the frequency domain; 3) quantization in which data volume is reduced; and 4) Huffman coding, which is a form of entropy encoding.
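As an illustrative sketch only (not code from the embodiment), the DCT step in 2) above can be expressed in pure Python using the standard 8×8 DCT-II definition from the JPEG standard:

```python
import math

def dct_8x8(block):
    """2-D DCT-II of an 8x8 block of pixel values (e.g. luminance),
    as used in JPEG encoding. Returns an 8x8 list of coefficients F[u][v],
    where F[0][0] is the DC component and the rest are AC components."""
    def c(k):
        # Normalization factor of the DCT-II basis functions
        return math.sqrt(0.5) if k == 0 else 1.0
    F = [[0.0] * 8 for _ in range(8)]
    for u in range(8):
        for v in range(8):
            s = 0.0
            for x in range(8):
                for y in range(8):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            F[u][v] = 0.25 * c(u) * c(v) * s
    return F

# A uniformly gray block has only a DC component: all AC coefficients are zero.
coeffs = dct_8x8([[128] * 8 for _ in range(8)])
```

A block with no change in luminance thus yields only a DC value, which is why the absence of significant AC coefficients can later be used to classify a block as a “flat pattern.”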
The JPEG decoder 163 (see
The Huffman decoder 191 has the function of decoding the JPEG data bit stream which has undergone lossless compression by means of Huffman coding.
The inverse quantization processor 192 is a functional module that uses a certain quantization table for inverse quantization of the data decoded by the Huffman decoder 191 to determine the 8×8 DCT coefficients per block.
The inverse DCT module 193 is a functional module for the inverse DCT of the DCT coefficients determined by the inverse quantization processor 192 to determine image data in the YCbCr format.
The color space converter 194 is a functional module by which the data in YCbCr format obtained by the inverse DCT module 193 is converted to bit map data in the RGB format.
Here, the description returns to
The blurring determination module 162 has the function of determining blurring by extracting the coefficients F01, F02, F03, F04, F05, F06, and F07, which are AC components only in the horizontal direction (the “horizontal coefficient group” in the figure); the coefficients F10, F20, F30, F40, F50, F60, and F70, which are AC components only in the vertical direction (the “vertical coefficient group”); and the coefficients F11, F22, F33, F44, F55, F66, and F77, which are AC components in the inclined direction (the “inclined coefficient group”). Details on the blurring determination process using these coefficient groups are given below.
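Picking out the three coefficient groups from an 8×8 block of DCT coefficients might be sketched as follows; the function name and the list-of-lists representation of the block are illustrative assumptions:

```python
def coefficient_groups(F):
    """Extract the three AC coefficient groups used for blur analysis
    from an 8x8 DCT coefficient block F, where F[0][0] is the DC component."""
    horizontal = [F[0][i] for i in range(1, 8)]   # F01 .. F07
    vertical   = [F[i][0] for i in range(1, 8)]   # F10 .. F70
    inclined   = [F[i][i] for i in range(1, 8)]   # F11 .. F77
    return horizontal, vertical, inclined
```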
B. Printing Process
When the printing process is started in response to certain operations by a user manipulating the operating panel 140, first the CPU 160 references a JPEG data set recorded on a memory card MC by means of the image data reference module 161 (Step S10). Here, the JPEG data on a memory card was referenced, but the JPEG data on a computer or digital camera connected by the USB interface 130 can also be referenced.
When JPEG data is referenced, the CPU 160 uses the blurring determination module 162 to carry out the blurring determination process on the referenced JPEG data (Step S20). Details on the blurring determination process are given below.
When the blurring determination process for one set of JPEG data is finished, the CPU 160 determines whether all the JPEG data on the memory card MC has been referenced (Step S30). When it is determined by this process that not all of the JPEG data has been referenced (Step S30: No), the process returns to Step S10, and the next JPEG data set is referenced to carry out the blurring determination process on that JPEG data.
When it is determined in Step S30 that all of the JPEG data has been referenced (Step S30: Yes), an overview of the JPEG data determined to be focused images by the blurring determination process in Step S20 is displayed by the CPU 160 on the liquid crystal display 145 (Step S40).
When the overview of the focused images is displayed on the liquid crystal display 145, the CPU 160 receives the user's selection of the images for printing via the operating panel 140 (Step S50). The selected image data format is converted with the use of the JPEG decoder 163 from JPEG format to bit map format, and is furthermore converted to data to control the amount of ink injection, and the ink heads 211, paper feed motor 230, and carriage motor 220 are controlled for printing (Step S60).
In the printing process noted above, all the JPEG data on the memory card MC was referenced, but when a plurality of folders have been created on a memory card, it is also possible to reference just the JPEG data included in folders indicated by the user. It is also possible to reference only JPEG data taken in a certain year or month or on a certain day.
C. Blurring Determination Process
When the blurring determination process is carried out, the CPU 160 first reads block data per certain band region from among the currently referenced JPEG data, and the data undergoes Huffman decoding and inverse quantization using the JPEG decoder 163 to obtain DCT coefficients (Step S100). The DCT coefficients that are obtained are temporarily stored in RAM 170.
Returning to
When the block blur determination process is carried out, the CPU 160 first obtains a horizontal coefficient group F0i (i=1 to 7), which is an AC component in the horizontal direction illustrated in
When the coefficient group is obtained, the CPU 160 determines the sum S of the absolute values of each coefficient that has been obtained based on the following Equation (1), and determines whether the value is over a certain flat threshold (Step S310).
S = Σ|F0i| (i = 1 to 7)  (1)
In Step S310 above, when the sum S is determined to be at or below the certain flat threshold (Step S310: No), the change in luminance represented by the coefficient of the block subject to analysis is regarded as being flat, and this block is determined to be a “flat pattern” (Step S320).
Meanwhile, when the sum S is determined to be over the certain flat threshold in Step S310 (Step S310: Yes), it can be determined that there has been some change in luminance in the block subject to analysis. The CPU 160 then first normalizes the obtained coefficients by the following Equation (2) to permit easier comparison with the basic edge patterns described below (Step S330). The coefficient values Fr01 through Fr07 normalized by means of the normalization process are values obtained by dividing the coefficient values F01 through F07 by the sum S of the absolute values of the coefficient group.
Fr0i = F0i / S (i = 1 to 7)  (2)
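Equations (1) and (2) together amount to the following check; this is a sketch only, and the FLAT_THRESHOLD value is a hypothetical placeholder, since the embodiment does not give a numeric value for the flat threshold:

```python
FLAT_THRESHOLD = 8.0  # hypothetical value; not specified in the embodiment

def classify_and_normalize(group, threshold=FLAT_THRESHOLD):
    """Apply Equations (1) and (2) to one coefficient group: compute the sum S
    of absolute coefficient values, classify the block as a flat pattern when
    S is at or below the flat threshold, and otherwise return the coefficients
    normalized by S for comparison with the basic edge patterns."""
    s = sum(abs(f) for f in group)      # Equation (1)
    if s <= threshold:
        return None                     # flat pattern: no edge in this block
    return [f / s for f in group]       # Equation (2): Fr0i = F0i / S
```

After normalization, the absolute values of the coefficients sum to 1, so coefficient groups of different magnitudes can be compared against the same basic edge patterns.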
The CPU 160 then references the edge pattern table 181 stored in ROM 180 (Step S340) to determine whether the gradient pattern represented by the normalized coefficient values Fr01 through Fr07 resemble any of the basic edge patterns (Step S350).
Each basic edge pattern is produced based on the luminance pattern shown in the 2nd column of
Also aligned with the basic edge patterns in the edge pattern table 181 are the three parameters of left edge width LW, middle edge width MW, and right edge width RW. The left edge width LW represents the width of the flat portion on the left side of the luminance pattern, and the right edge width RW represents the width of the flat portion on the right side of the luminance pattern. The middle edge width MW represents the width of the gradient flanked by the left edge width LW and right edge width RW.
As noted above, there are 16 patterns in all in the edge pattern table 181, but the edge pattern table referenced in Step S340 above can be selected based on the signs of the two coefficients F01 and F02 (F10 and F20 in the vertical direction).
Returning to the flow chart in
SD = Σ|Fr0i − Fb0i| (i = 1 to 7)  (3)
When a similar basic edge pattern has been retrieved in Step S350 above, or a flat pattern is determined in Step S320, the CPU 160 associates the edge pattern with the block (Step S360). It is determined whether edge patterns have been associated with all of the blocks in the band region (Step S370), and if not, the process returns to Step S300 to associate an edge pattern with the next block.
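The search for a similar basic edge pattern using the distance SD of Equation (3) might be sketched as follows; the dictionary representation of the edge pattern table is an illustrative assumption (in the embodiment the patterns are stored in the edge pattern table 181 in ROM):

```python
def nearest_edge_pattern(fr, patterns):
    """Find the basic edge pattern most similar to the normalized coefficient
    group `fr`, using the L1 distance of Equation (3): SD = sum |Fr0i - Fb0i|.
    `patterns` maps a pattern name to its seven reference coefficients Fb0i."""
    best_name, best_sd = None, float("inf")
    for name, fb in patterns.items():
        sd = sum(abs(a - b) for a, b in zip(fr, fb))
        if sd < best_sd:
            best_name, best_sd = name, sd
    return best_name, best_sd
```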
When it is determined in the above Step S370 that edge patterns have been associated with all blocks (Step S370: Yes), the CPU 160 runs a process for joining the horizontal and vertical edge patterns for the blocks in the band region (Step S380 in
Returning to
When each of the blocks has thus been determined to be a “focused block” or “blurred block,” the CPU 160 associates the results with each block (Step S420). It is determined whether all of the blocks have been finished (Step S430), and if not, the process returns to Step S390 to determine whether the remaining blocks are focused. If, on the other hand, it is determined that all the blocks are finished, the CPU 160 finishes the process for determining blurring in blocks, and the process returns to the blurring determination process in
Returning to
The size of the window areas on L-size printing paper (8.9 cm×12.7 cm), for example, can be 1 cm×1 cm, which will allow focusing to be determined for practical purposes (this size corresponds to about 30×30 blocks, assuming an image of 6,000,000 pixels is printed on L-size printing paper).
When the window areas are extracted in Step S120 above (Step S130: Yes), the window areas can be assumed to be focused areas because of the abundance of focused blocks that are included. The CPU 160 then runs the process of Steps S140 through S170 below to strictly determine blurring in the window areas.
That is, the CPU 160 first references edge pattern tables by the four directions shown in
When the process is started, the CPU 160 first obtains horizontal, vertical, and inclined coefficient groups for the current block (Step S500) (see
When the coefficients are normalized per coefficient group, the CPU 160 carries out a process for selecting the table used in the following process from among the 16 edge pattern tables TB1 through TB16 (Step S540).
If, on the other hand, the sum HS is at or below the threshold (Step S710: Yes), the sum VS of the absolute values of the coefficients constituting the vertical coefficient group is determined (Step S730), and the sum VS is compared to a certain threshold (Step S740). If the results reveal the sum VS to be greater than the threshold (Step S740: No), the direction of the edge is determined to be in the vertical direction because the change in luminance in the vertical direction is considered substantial (Step S750).
In the above Step S740, when the sum VS is determined to be at or below the threshold (Step S740: Yes), the change in luminance is not considered to be very substantial in either the horizontal or vertical directions, in which case the CPU 160 therefore determines the direction of the edge to be inclined (Step S760).
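The direction-selection logic of Steps S710 through S760 can be sketched as follows; the function name and threshold parameter are illustrative assumptions:

```python
def edge_direction(horizontal, vertical, threshold):
    """Decide the edge direction for a block from the absolute sums of its
    horizontal (HS) and vertical (VS) AC coefficient groups:
    a substantial horizontal change wins first, then a vertical one,
    and otherwise the edge is treated as inclined."""
    hs = sum(abs(f) for f in horizontal)
    if hs > threshold:
        return "horizontal"             # Step S710: No -> horizontal edge
    vs = sum(abs(f) for f in vertical)
    if vs > threshold:
        return "vertical"               # Step S740: No -> vertical edge
    return "inclined"                   # Step S760: neither is substantial
```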
When the direction of the edge is determined by the above process, the CPU 160 carries out a process for selecting the edge pattern table that will be used (Step S770). Specifically, as illustrated in
When an edge pattern table is selected by the above process, the CPU 160 returns the process to the process for matching edge patterns by direction in
When the selected edge pattern table is referenced, the CPU 160 compares the coefficient groups of the basic edge pattern and the coefficient groups corresponding to the direction of the edge determined by the above process for selecting the table that is used in order to search for similar basic edge patterns (Step S560). This search process is the same as the process in Step S350 in
When a similar basic edge pattern is retrieved in Step S560, or when the pattern is determined to be flat in Step S520, the CPU 160 associates the edge pattern and the direction of the edge with the block (Step S570). It is determined whether edge patterns have been associated with all of the blocks in the current window area (Step S580), and if not, the process returns to Step S500 to associate an edge pattern with the next block.
When it is determined by the above process that edge patterns have been associated with all blocks, the CPU 160 joins the edge patterns in the four directions shown in
When the process for matching edge patterns by direction is completed in the above process, the CPU 160 returns the process to the blurring determination process in
In Step S150 above, when the number of edges is collated in each direction, the CPU 160 determines whether the number of edges in all directions is less than a certain threshold (Step S160).
In Step S160 above, when it is determined that all directions include a number of edges at or over the certain threshold (Step S160: No), it may be concluded that a sufficient number of edges are present in that window area. The CPU 160 therefore calculates the mean value as the representative width of the edges included in each direction. It is then determined whether all the calculated mean edge width values per direction are at or under a certain threshold (Step S170). If all the mean values in each direction are at or below the certain threshold, the image is determined to be a “focused image” (Step S180), and the blurring determination process is complete. This is because all four directions include a sufficient number of edges, the width of the edges is small enough, and the window area can therefore be determined to be focused. If even one window area in the entire image is focused, the image can be determined to be a normal image that is focused at some point, so a “focused image” determination is made in Step S180 without having to determine blurring in other window areas or band regions.
In Step S160 above, if the number of edges included in any direction is determined to be under the threshold (Step S160: Yes), that window area does not include a sufficient number of edges and is therefore determined to be blurred. The CPU 160 therefore finishes the blurring determination process for the current window area and determines whether another window area has been extracted by the process in Step S120 above (Step S190). When another window area has been extracted (Step S190: Yes), the process returns to Step S140, and the above process (Steps S140 to S170) is repeated for that window area.
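The focus decision of Steps S160 through S180 can be sketched as follows; the data representation and both threshold values are illustrative assumptions, as the embodiment does not specify them:

```python
def window_is_focused(edge_widths_by_direction, min_edges, max_mean_width):
    """Steps S160-S180: a window area counts as focused only if every
    direction has at least `min_edges` detected edges AND a mean edge width
    at or below `max_mean_width`. `edge_widths_by_direction` maps each
    direction name to the list of edge widths detected in that direction."""
    for widths in edge_widths_by_direction.values():
        if len(widths) < min_edges:
            return False                # too few edges: window treated as blurred
        if sum(widths) / len(widths) > max_mean_width:
            return False                # edges too wide: window treated as blurred
    return True
```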
When the above process has been completed on all window areas extracted by the process in Step S120 (Step S190: No), or when not even one window area including at least the certain number of focused blocks has been extracted in Step S120 (Step S130: No), the CPU 160 determines whether the current band region is located at the end of the image (Step S200). If the current band region is the end of the image (Step S200: Yes), it follows that there are no focused window areas, and the current image is therefore determined to be a blurred image (Step S210). By contrast, if the current band area is not the end of the image (Step S200: No), the CPU 160 returns the process to Step S100, the next band region undergoes Huffman decoding and inverse quantization, and the above series of processes is repeated.
When the above blurring determination process is complete, the CPU 160 returns the process to the printing process shown in
D. Effects
According to the printer 100 in the embodiment described above, the storage volume in RAM 170 can be reduced because the JPEG data recorded on a memory card MC or the like can be divided into band regions and stored in RAM 170.
In the present embodiment, only window areas including an abundance of focused blocks are subject to detailed determination of blurring. This can thus alleviate the processing burden imposed on the CPU 160.
In the present embodiment, blurring can be rapidly determined because the entire image is determined to be focused when a window area is focused in any location.
In the present embodiment, blurring is determined based entirely on DCT coefficients, without converting the JPEG data to bit map format. The processing load imposed on the CPU 160 can therefore be alleviated, and blurring can be determined more rapidly.
In the present embodiment, edges are detected not only in the horizontal or vertical directions, but also in inclined directions, thus allowing it to be determined more accurately whether images are blurred, without depending on the direction of camera movement.
The photo viewer 300 has the function of allowing the image data recorded in a storage device to be displayed on the monitor 310. Images from a digital camera or personal computer are transferred via the USB interface 320 to the internal storage device. The photo viewer 300 reads images from memory cards that are inserted into the memory card slot 330 and transfers the images to the storage device. A printer can be connected to the USB interface 320. The photo viewer 300 controls the printer using an internally installed printer driver so as to print the image data stored in the storage device.
The CPU in the photo viewer 300 runs a control program stored in ROM by loading it in RAM so as to carry out the image transfer function or image display function described above. The CPU also runs the control program to carry out the same processes as the various processes described in the first embodiment (printing process and blurring determination process). The photo viewer 300 can thus automatically extract focused images from out of the image data stored in the storage device and display the images on the monitor 310. The photo viewer 300 can also control the printer connected to the USB interface 320 to print the focused images that have been extracted.
The kiosk terminal 400 in this embodiment is equipped with a monitor 410, memory card reader 420, and printer 430. It is also internally equipped with a CPU, RAM, and ROM. The CPU executes a control program stored in ROM by loading the program in RAM, so as to carry out the above ticket-issuing function, ATM function, or various guided service functions. The CPU also runs the control program to carry out the same processes as the various processes described in the first embodiment (printing process and blurring determination process). The kiosk terminal 400 can thus read image data from memory cards inserted into the memory card reader 420 and automatically extract focused images to display them on the monitor 410. The focused images that have thus been extracted can also be printed by the kiosk terminal 400 using the printer 430.
In this embodiment, the kiosk terminal 400 was equipped with a printer 430, but a structure without the printer 430 can also be devised. In that case, the kiosk terminal 400 can print to a remote printer connected through certain communications lines such as a network or the Internet.
Various embodiments of the invention have been described above, but it need hardly be pointed out that the invention is not limited to those embodiments and can assume a variety of structures within the spirit and scope of the invention.
For example, in addition to the photo viewer 300 and kiosk terminal 400, blurring can be determined by a computer, digital camera, cell phone, or the like in structures wherein the various processes noted above are carried out by such devices. After such devices determine blurring, the results may be recorded in the EXIF data of JPEG data. JPEG data in which the results of the blurring determination process have been recorded in this way can be used by printers or computers to select printing images or to carry out various image processes according to the data on blurring recorded in the EXIF data.
In the above embodiments, image data in the JPEG format was used as an example of image data in which were recorded two or more coefficients obtained when pixel values that are values in the spatial domain of pixels forming the image are converted to the frequency domain. However, the present invention can be applied to image data of other formats represented by coefficients in addition to image data in JPEG format. For example, DCT can be done in 4×4 pixel block units with the image format referred to as “HD Photo.” An edge pattern table can thus be prepared in advance according to the block size to allow blurring to be determined in the same manner as the above embodiments.
The coefficients recorded in the image data are not limited to coefficients obtained as a result of the DCT. For example, coefficients obtained by the DWT (discrete wavelet transform) or the Fourier transform may also be recorded in image data. Blurring can be determined in the same manner as in the above embodiments by producing an edge pattern table in advance based on coefficients obtained as a result of these transforms.
In the blurring determination device of the above aspect of the invention, the blurring determination module may determine the mean width of the detected edges in each of the directions as the representative value of the width of the edges. The median of the width of the edges, or a value around it, may also be used instead of the mean.
In the blurring determination device of the above aspect, the image data for which blurring is determined can be produced based on the JPEG standard, for example. In this case, coefficients refer to so-called DCT coefficients, which are obtained by the discrete cosine transform of pixel values per block. The spatial domain can also be converted to the frequency domain using, for example, the Fourier transform or wavelet transform instead of the discrete cosine transform.
In the blurring determination device of the above aspect, the blurring determination module may determine that the image data is not blurred when the number of the detected edges is at least a certain number in each of the directions and the representative values meet the condition of being at or below a certain threshold.
According to such an aspect, when the number of edges is at or over a certain number, it is possible to determine blurring only for image data with a high likelihood of not being blurred.
In the blurring determination device of the above aspect, the image data reference module may divide the image data into band regions having a certain width and input the coefficients to memory by band region, and the edge detection module may reference the memory to detect the edges.
According to such an aspect, the amount of memory that is used can be reduced because there is no need to input all of the coefficients in the image data to memory.
In the blurring determination device of the above aspect, a plurality of the coefficients may be recorded in the image data using, as units, blocks comprising a plurality of pixels, and the edge detection module may comprise a block blurring determination module configured to divide the band regions into a plurality of window areas that are smaller than the band region, and use the basic edge patterns to determine whether each of the blocks included in the window areas is blurred, and an in-window edge detection module configured to detect the edges oriented in each of the directions in the window areas that include at least a certain number of blocks not determined to be blurred by the block blurring determination module, and the blurring determination module may determine that the image data is not blurred when the above condition is met by any window area out of the window areas including at least a certain number of blocks not determined to be blurred.
According to such an aspect, a detailed determination of blurring is made in window areas that include at least a certain number of blocks not determined to be blurred. The process can therefore be efficiently carried out. In addition, the whole image is determined to not be blurred when certain conditions are met by any window area out of such window areas. The process can therefore be rapidly carried out without the need for making a determination on blurring in the whole image.
In the blurring determination device of the above aspect, the block blurring determination module may determine whether the blocks are blurred based on the series of the coefficients in the vertical and horizontal directions, and the in-window edge detection module may detect the edges based on a series of the coefficients in the inclined direction in addition to the vertical and horizontal directions.
According to such an aspect, when determining whether or not there is blurring in windows, edges are detected in more directions than when determining blurring by block, thus permitting more reliable determination of blurring.
In the blurring determination device of the above aspect, when the directions of the gradients of basic edge patterns corresponding to the series of the plurality of coefficients are aligned between adjacent blocks, the block blurring determination module may cumulatively add the width of the gradients and determine whether blurring straddles adjacent blocks based on the cumulatively added gradient width.
According to such an aspect, the presence or absence of blurring can be accurately determined, even when the blurred portions straddle more than one block.
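The cumulative joining of gradient widths across adjacent blocks might be sketched as follows; the (sign, width) representation of each block's gradient is an illustrative assumption (sign +1/-1 for rising/falling gradients, 0 for flat blocks):

```python
def join_edge_widths(block_edges):
    """Sketch of the joining step: when adjacent blocks in a row carry
    gradients in the same direction, their widths are cumulatively added, so
    an edge that straddles block boundaries is measured as one wide edge.
    `block_edges` is a sequence of (sign, width) tuples, one per block."""
    widths, current_sign, current_width = [], 0, 0
    for sign, width in block_edges:
        if sign != 0 and sign == current_sign:
            current_width += width      # same gradient direction: accumulate
        else:
            if current_sign != 0:
                widths.append(current_width)  # previous edge run ends here
            current_sign, current_width = sign, width
    if current_sign != 0:
        widths.append(current_width)
    return widths
```

A wide, blurred edge split across two 8-pixel blocks is thus counted with its full width rather than as two separate narrow edges.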
In the blurring determination device of the above aspect, the basic edge patterns may be classified into and stored in a plurality of tables according to the directions of the gradient patterns represented by the basic edge patterns, and the edge detection module may select a table from among the plurality of tables according to a sign of a certain coefficient in the series of the plurality of coefficients.
According to such an aspect, blurring can be determined more rapidly because the number of comparisons between the basic edge patterns and the series of coefficients in the image data can be reduced.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.