Cameras are commonly used to capture an image of a scene that includes one or more objects. Many cameras include a capturing system, an optical assembly, and an auto-focusing (“AF”) feature that automatically adjusts the focus of the optical assembly until one or more of the objects from the scene are optimally focused on the capturing system. In most AF techniques, a focus measure is defined to quantify the degree of focus (in other words, the degree of sharpness). Ideally, the focus measure should be a unimodal function that reaches its maximum for the best-focused image and generally decreases as the focus decreases. Thus, the AF problem is essentially a search for the maximum of the focus measure over the lens position space. Accordingly, the definition of the focus measure is crucial to the success of the AF technique.
Unfortunately, because the focus measures used in existing AF methods are too sensitive to noise and aliasing, existing AF methods do not necessarily lead to optimal results. Stated in another fashion, although various different focus measuring methods have been utilized successfully in cameras, AF remains an active research area with room for further improvement.
The present invention is directed to a method for estimating if an optical assembly is properly focused on a scene. In one embodiment, the method includes the steps of: capturing information for an image of the scene; determining an image gradient histogram distribution of at least a portion of the image; determining a Gaussian model gradient histogram distribution for the image; and comparing at least a portion of the image gradient histogram distribution to the Gaussian model gradient histogram distribution of the image to estimate if the image is in focus.
One idea behind this method is that sharp images have a heavy-tailed image gradient histogram distribution. Stated in another fashion, sharp images assign significantly more probability to large gradients than the Gaussian model gradient histogram distribution does. In one embodiment, the present invention defines the focus measure as the difference in large-gradient distribution probability between a given image and the Gaussian model. With the present invention, the focus measure will show a larger positive value for a properly focused sharp image and a smaller value for a defocused, blurred image. Further, because the focus measure is a difference in large-gradient distribution, individual pixel values have little influence on it, so a single pixel with a large gradient will have only a small influence on the focus measure provided herein. Moreover, the present method measures the gradient distribution rather than the exact gradient values of individual pixels, so a noisy pixel will have less influence on the focus measure.
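The focus measure described above can be illustrated with a minimal Python/NumPy sketch. The function name, the histogram inputs, and the cutoff index below are illustrative assumptions rather than part of the disclosed method; the sketch assumes both histograms are binned over the same gradient-magnitude range and normalized so that their entries sum to one.

```python
import numpy as np

def focus_measure(image_hist, model_hist, cutoff):
    """Hypothetical focus measure: the difference in probability mass that the
    image gradient histogram and the Gaussian model assign to large gradients.

    image_hist, model_hist -- 1-D arrays over the same gradient-magnitude bins,
                              each normalized to sum to 1.
    cutoff -- bin index above which a gradient is considered "large".
    """
    p_image = image_hist[cutoff:].sum()   # large-gradient probability of the image
    p_model = model_hist[cutoff:].sum()   # large-gradient probability of the model
    return p_image - p_model              # large and positive for a sharp image
```

Under these assumptions, a sharp image yields a clearly positive value, while a blurred image yields a value near or below zero.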
As provided herein, the step of capturing information can include the step of capturing information for a plurality of alternative images of the scene. In this embodiment, the step of determining an image gradient histogram distribution can include the step of determining an image gradient histogram distribution for at least a portion of each of the plurality of alternative images. Further, the step of comparing includes the step of comparing at least a portion of the image gradient histogram distribution for each of the alternative images to the Gaussian model gradient histogram distribution to estimate which of the alternative images is in focus. Additionally, the step of comparing can include the step of selecting the image having the greatest number of large gradients as being in focus.
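A sketch of this selection among the alternative images follows; the per-image histograms, the per-image Gaussian models, and the cutoff index are assumed inputs, and the scoring expression is the hypothetical focus measure sketched above.

```python
import numpy as np

def select_best_focused(image_hists, model_hists, cutoff):
    """Return the index of the alternative image whose gradient histogram shows
    the greatest excess of large gradients over its Gaussian model."""
    scores = [img[cutoff:].sum() - model[cutoff:].sum()   # the focus measure above
              for img, model in zip(image_hists, model_hists)]
    return int(np.argmax(scores))
```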
In another embodiment, the step of comparing can include the step of comparing an image tail section of the image gradient histogram distribution with a Gaussian tail section of the Gaussian model gradient histogram distribution. In this embodiment, the image is in focus if the image tail section is much greater than the Gaussian tail section. In yet another embodiment, the method includes the steps of: capturing a plurality of images of the scene, each image being captured with the optical assembly at a different adjustment; and determining an image gradient histogram distribution for each of the images. This embodiment can include the step of comparing at least a portion of the image gradient histogram distribution for each of the images to each other to estimate which image is best focused.
Further, the step of comparing can include the step of selecting the image having the greatest number of large gradients as being in focus. Further, the step of comparing can include the step of comparing an image tail section of each image gradient histogram distribution. Moreover, the image which includes the largest tail section can be selected as being in focus.
The present invention is also directed to an image apparatus for capturing an image of a scene. As provided herein, the image apparatus can utilize one or more of the methods disclosed herein to focus an optical assembly of the image apparatus.
The novel features of this invention, as well as the invention itself, both as to its structure and its operation, will be best understood from the accompanying drawings, taken in conjunction with the accompanying description, in which similar reference characters refer to similar parts, and in which:
In one embodiment, the image apparatus 10 includes a control system 24 (illustrated in phantom) that uses a unique method for estimating when the image apparatus 10 is properly focused on the scene 12. Stated in another fashion, the control system 24 performs an auto focusing technique that utilizes a new focus measuring method. More specifically, the new focus measuring method is based on an image gradient histogram distribution for the particular image being evaluated. This focus measuring method can be less sensitive to noise. As a result thereof, the control system 24 can be used to accurately focus the image apparatus 10 so that the desired captured image is sharp.
As provided herein, one or more of the captured images 14, 16, 18 can be thru images that are captured during focusing of the image apparatus 10, prior to capturing the desired image. As an overview, in one embodiment, the control system 24 calculates an image gradient histogram distribution for one or more of the captured images 14, 16, 18 during the auto-focusing procedure. These image gradient histogram distributions can be compared to each other and/or to a Gaussian model gradient histogram distribution during the auto-focusing procedure to determine when the image apparatus 10 is properly focused.
The type of scene 12 captured by the image apparatus 10 can vary. For example, the scene 12 can include one or more objects 22, e.g. animals, plants, mammals, and/or environments. For simplicity, in
The apparatus frame 236 can be rigid and support at least some of the other components of the image apparatus 10. In one embodiment, the apparatus frame 236 includes a generally rectangular shaped hollow body that forms a cavity that receives and retains at least some of the other components of the camera.
The apparatus frame 236 can include an aperture 242 and a shutter mechanism 244 that work together to control the amount of light that reaches the capturing system 240. The shutter mechanism 244 can be activated by a shutter button 246. The shutter mechanism 244 can include a pair of blinds (sometimes referred to as “blades”) that work in conjunction with each other to allow the light to be focused on the capturing system 240 for a certain amount of time. Alternatively, for example, the shutter mechanism 244 can be all electronic and contain no moving parts. For example, an electronic capturing system 240 can have a capture time controlled electronically to emulate the functionality of the blinds.
The optical assembly 238 can include a single lens or a combination of lenses that work in conjunction with each other to focus light onto the capturing system 240. In one embodiment, the image apparatus 10 includes an autofocus assembly 248 (illustrated as a block in phantom) including one or more lens movers that adjust one or more lenses of the optical assembly 238 until the sharpest possible image of the subject is received by the capturing system 240. The autofocus assembly 248 is described in more detail below.
It should be noted that each of the images 14, 16, 18 (illustrated in
The capturing system 240 captures information for the images 14, 16 (illustrated in
The image sensor 250 receives the light that passes through the aperture 242 and converts the light into electricity. One non-exclusive example of an image sensor 250 for digital cameras is known as a charge coupled device (“CCD”). An alternative image sensor 250 that may be employed in digital cameras uses complementary metal oxide semiconductor (“CMOS”) technology.
The image sensor 250, by itself, produces a grayscale image as it only keeps track of the total quantity of the light that strikes the surface of the image sensor 250. Accordingly, in order to produce a full color image, the filter assembly 252 is generally used to capture the colors of the image.
The storage system 254 stores one or more of the finally captured images before these images are ultimately printed out, deleted, transferred or downloaded to an auxiliary storage system or a printer. The storage system 254 can be fixedly or removably coupled to the apparatus frame 236. Non-exclusive examples of suitable storage systems 254 include flash memory, a floppy disk, a hard disk, or a writeable CD or DVD.
The control system 24 is electrically connected to and controls the operation of the electrical components of the image apparatus 10. The control system 24 can include one or more processors and circuits, and the control system 24 can be programmed to perform one or more of the functions described herein. In
In certain embodiments, the control system 24 includes auto-focusing software that evaluates whether the optical assembly 238 is optimally focused prior to capturing the final image and controls the focusing of the optical assembly 238.
Referring back to
Further, the image display 56 can display other information that can be used to control the functions of the image apparatus 10.
Moreover, the image apparatus 10 can include one or more control switches 58 electrically connected to the control system 24 that allow the user to control the functions of the image apparatus 10. For example, one or more of the control switches 58 can be used to selectively switch the image apparatus 10 to the auto-focusing processes disclosed herein.
In
It should be noted that the typical gradient histogram distribution would have the shape of a bell curve. However, because only the absolute value is illustrated in
For each image 14, 16, 18, the respective image gradient histogram distribution 360, 362, 364 represents how much difference exists between each pixel and its neighboring pixels in the respective image. The difference can be measured along the X axis, along the Y axis, diagonally, or along some other axis. Further, the difference can be the intensity difference, the contrast difference, or some other difference.
For example, for the first image 14, the absolute value for the first image gradient histogram distribution 360 can represent how much difference in intensity exists between adjacent pixels along the X axis. This example can be represented as following equation:
Gx=|G(i,j)−G(i+1,j)|
where G(i,j) represents the intensity of a pixel located at i, j; and G(i+1, j) represents the intensity of a pixel located at i+1, j.
Alternatively, for the first image 14, the absolute value for the first image gradient histogram distribution 360 can represent how much difference in intensity exists between adjacent pixels along the Y axis. This example can be represented as following equation:
Gy=|G(i,j)−G(i,j+1)|
where G(i,j) represents the intensity of a pixel located at i, j; and G(i,j+1) represents the intensity of a pixel located at i, j+1.
The second image gradient histogram distribution 362, and the third image gradient histogram distribution 364 can be calculated in a similar fashion.
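The gradient computations above can be written compactly with array slicing. The following is a sketch only, assuming the image is supplied as a 2-D array of intensities in which the first axis corresponds to Y (rows) and the second to X (columns); the function name is illustrative.

```python
import numpy as np

def gradient_magnitudes(gray):
    """Absolute intensity differences between adjacent pixels, per the equations
    above.  gray[j, i] holds the intensity G(i, j)."""
    gray = gray.astype(np.float64)               # avoid unsigned-integer wraparound
    gx = np.abs(gray[:, :-1] - gray[:, 1:])      # Gx = |G(i, j) - G(i+1, j)|
    gy = np.abs(gray[:-1, :] - gray[1:, :])      # Gy = |G(i, j) - G(i, j+1)|
    return gx, gy
```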
It should be noted that the respective image gradient histogram distribution 360, 362, 364 can be calculated for the entire respective image 14, 16, 18. Alternatively, the respective image gradient histogram distribution 360, 362, 364 can be calculated for just a selected region of the respective image 14, 16, 18. For example, a respective image gradient histogram distribution 360, 362, 364 can be calculated for just a centrally located square region of the respective image 14, 16, 18.
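As a sketch of this region-based variant, the histogram below is computed over a centrally located square region; the region fraction, the bin count, and the use of the X-axis gradient alone are illustrative assumptions.

```python
import numpy as np

def gradient_histogram(gray, bins=256, region_fraction=0.5):
    """Gradient histogram over a central square region of the image."""
    gray = gray.astype(np.float64)
    h, w = gray.shape
    side = int(min(h, w) * region_fraction)          # side of the central square
    top, left = (h - side) // 2, (w - side) // 2
    crop = gray[top:top + side, left:left + side]
    gx = np.abs(crop[:, :-1] - crop[:, 1:])          # X-axis gradients, as above
    hist, _ = np.histogram(gx, bins=bins, range=(0, bins))
    return hist
```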
The Gaussian model is an adaptive reference model that is based on the image gradient histogram curve. In one embodiment, the Gaussian model is computed from a standard Gaussian function with a variance of 2.5 and a window size of 150. The scale of the Gaussian model is computed as the ratio of the sum of the image gradient histogram to the sum of the standard Gaussian function. In certain embodiments, the Gaussian model window width is within approximately 120-180 gradients. Typically, the higher the peak distribution value, the smaller the Gaussian window width. Further, the higher the number of large image gradients, the larger the Gaussian window width.
Stated in another fashion, the reference Gaussian model can be adjusted based on the image gradient characteristics. In general, the Gaussian model window size is approximately 150, with the large-gradient cutoff at approximately 100-150. The Gaussian model scale is the ratio of the area under the image gradient curve to the area under the Gaussian model. In certain embodiments, the model is adaptively adjusted based on the image gradient histogram characteristics. In one embodiment, the basic adjusting rules include (i) increasing or decreasing the window size based on the amount of high gradients present in the image, (ii) adjusting the cut-off window size based on the adjusted Gaussian window, and (iii) constraining the Gaussian model scale within a certain range (neither too low nor too high).
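A sketch of the non-adaptive reference model follows; the stated value of 2.5 is taken literally as the variance of the standard Gaussian, the window size defaults to 150, and the adaptive window and scale adjustments described above are omitted. The function name and inputs are assumptions.

```python
import numpy as np

def gaussian_model(image_hist, variance=2.5, window=150):
    """Scaled Gaussian reference over the first `window` gradient bins."""
    x = np.arange(window)
    g = np.exp(-(x ** 2) / (2.0 * variance))       # standard Gaussian, centered at gradient 0
    scale = image_hist[:window].sum() / g.sum()    # ratio of histogram sum to Gaussian sum
    return scale * g                               # model scaled to the image histogram
```

The adaptive rules (i)-(iii) above would then widen or narrow the window and clamp the scale based on the histogram's peak value and its count of large gradients.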
Comparing each image gradient histogram distribution 360, 362, 364 with its respective Gaussian model gradient histogram distribution 361, 363, 365 illustrates that the first image gradient histogram distribution 360 for the sharp image 14 assigns significantly more probability to large gradients than the Gaussian model gradient histogram distribution 361, while the second and third gradient histogram distributions 362, 364 for the blurry images 16, 18 assign significantly less probability to large gradients than their respective Gaussian model gradient histogram distributions 363, 365. Thus, in certain embodiments, the present invention relies on the determination that a sharp image 14 will assign significantly more probability to large gradients than the Gaussian model gradient histogram distribution 361.
Stated in another fashion, sharp images 14 have a heavy-tailed image gradient histogram distribution 360. Further, the sharp image gradient histogram distribution 360 assigns significantly more probability to large gradients than the Gaussian model gradient histogram distribution 361. In this embodiment, the present invention defines the focus measure as the difference in large-gradient distribution probability between a given image and the Gaussian model. With the present invention, the focus measure will show a larger positive value for a focused thru image and a smaller value for a defocused thru image.
As provided herein, the present invention can focus on a tail section 370 of the gradient distribution to determine if an image is in focus. As used herein the term “tail section” 370 shall refer to the last portion of the gradient histogram, i.e. the last 10-20 percent of the respective Gaussian model gradient histogram distribution 361, 363, 365. Because the Gaussian model varies according to the scene, the exact value for the tail section 370 will vary according to the scene that is being captured. In the examples illustrated in
In this embodiment, reviewing the tail section 370 area of graphs in
Referring to
In this embodiment, during the auto-focusing procedure, the control system 24 (not shown in
Somewhat similar to the method described above, the present invention defines the focus measure as the difference in large-gradient distribution probability between the thru images. Again, in this embodiment, the focus measure will show a larger positive value for a focused thru image and a smaller value for a defocused thru image.
In the embodiment illustrated in
In this embodiment, the image tail sections 480T, 482T, 484T, 486T, 488T are compared, and the image whose tail section contains the most large gradients is selected as the thru image that is in focus.
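A sketch of this tail-section comparison follows, with the tail taken as the last 15 percent of each histogram (within the 10-20 percent range defined earlier); the tail fraction and function name are assumptions, and no Gaussian reference is needed in this variant.

```python
import numpy as np

def best_focused_by_tail(image_hists, tail_fraction=0.15):
    """Return the index of the thru image whose histogram tail section holds
    the most large gradients."""
    tail_mass = []
    for hist in image_hists:
        start = int(len(hist) * (1.0 - tail_fraction))   # start of the tail section
        tail_mass.append(hist[start:].sum())
    return int(np.argmax(tail_mass))
```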
It should be noted that the methods disclosed herein can be used separately or in conjunction with other auto-focusing techniques to improve the accuracy of those techniques. If used in conjunction, the proposed focus measure can be used as a coarse blur-degree estimate of the image at the selected lens position.
Subsequently, after determining which thru image is in focus, the control system can adjust the optical assembly to the adjustment used for capturing that image, and the information for the final image can be captured 522.
Subsequently, after determining which thru image is in focus, the control system can adjust the optical assembly to the adjustment used for capturing that image, and the information for the final image can be captured 620.
While the current invention is disclosed in detail herein, it is to be understood that it is merely illustrative of the presently preferred embodiments of the invention and that no limitations are intended to the details of construction or design herein shown other than as described in the appended claims.