This disclosure relates to digital picture processing and, more particularly, to a hand jitter reduction system for cameras.
The demand for multimedia applications in mobile communications has been growing at an astounding rate. Today, a user can send and receive still images, as well as download images and video from the Internet, for viewing on a mobile unit or handset. The integration of the digital camera into the mobile unit has further contributed to the growing trend in mobile communications for multimedia functionality.
Given the limited resources of a mobile unit, such as battery capacity, processing power, and transmission speed, effective digital image processing techniques are needed to support multimedia functions. This requires the development of more sophisticated hardware and software that reduces computational complexity for multimedia applications while maintaining image quality. The development of such hardware and software leads to lower power consumption and longer standby time for the mobile unit.
One facet of the digital imaging process involves removing blurriness from a picture. Blurriness may be caused by hand jitter. Hand jitter is caused by the movement of the user's hand when taking a digital picture with a camera. Even if the user is unaware of the movement, the hand may be continually moving. The movements are relatively small, but if they are large relative to the exposure time, the digital picture may be blurry. An object or person in the picture may appear to be moving. Blurriness may also be caused by an object or person moving while a picture is being taken, or by limitations of the optical system used to capture the pictures.
Under low lighting conditions, a digital camera, for example, one found in a mobile unit, takes a longer time to register a picture. The longer exposure time increases the probability that the slight movements produced by the hand may lead to blurriness. Similarly, the longer exposure time increases the chance that the movement by the object/person may be large relative to the exposure time.
Current techniques for compensating for camera movements involve the use of small gyroscopes or other mechanical devices. None of the techniques seem to have an acceptable way to digitally compensate for the camera movements, especially under low lighting conditions. It would be desirable to reduce the amount of blurriness in a digital picture with efficient processing resources suitable for mobile applications under all conditions.
The details of one or more configurations are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description, drawings and claims.
A camera system comprising a hand jitter reduction (hjr) mode and a normal mode may operate by multiplying an exposure time and a gain to generate an exposure time-gain product. There may be a normal mode exposure time-gain product and an hjr mode exposure time-gain product. The exposure time-gain product in normal mode, formed from the normal exposure time and normal gain, may be stored in a table or computed as needed. The exposure time-gain product in hjr mode may also be stored in a table. The exposure time-gain product in hjr mode may be produced by modifying (adding, subtracting, multiplying, or dividing) entries in the normal mode exposure time-gain product table (or a single entry). As such, a separate table may not necessarily be needed, but an equivalent exposure time-gain product in hjr mode may be needed to compare to the exposure time-gain product in normal mode. When operating in hjr mode, parameters are changed to reduce the difference between the exposure time-gain product in hjr mode and the exposure time-gain product in normal mode. As long as an image sensor in a camera system is able to stay above a minimum average light level, the difference may be reduced. In any region in hjr mode where an image sensor may not be meeting the minimum average light level, it may not be possible to maintain the same exposure time-gain product in hjr mode as in normal mode. Operation of a camera in normal mode may be in response to a sensed light level being above a threshold. Operation of a camera in the hjr mode may be selected by the user. The hjr mode may be used in response to a sensed light level being lower than the threshold.
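For illustration only, the following is a minimal Python sketch of the exposure time-gain product matching described above. The function and variable names (exposure_time_gain_product, hjr_gain_for_shorter_exposure, max_gain) and all numeric values are assumptions introduced for this sketch and do not come from the disclosure.

```python
# Minimal sketch: keep the hjr-mode exposure time-gain product close to the
# normal-mode product when the hjr exposure time is shortened.

def exposure_time_gain_product(exposure_ms: float, gain: float) -> float:
    """Multiply an exposure time by a gain to form the exposure time-gain product."""
    return exposure_ms * gain

def hjr_gain_for_shorter_exposure(normal_exposure_ms: float, normal_gain: float,
                                  hjr_exposure_ms: float, max_gain: float) -> float:
    """Choose an hjr-mode gain that keeps the hjr product as close as possible
    to the normal-mode product for a shortened hjr exposure time."""
    target_product = exposure_time_gain_product(normal_exposure_ms, normal_gain)
    ideal_gain = target_product / hjr_exposure_ms
    # If the sensor cannot supply enough gain (for example, when it cannot meet
    # the minimum average light level), the hjr product falls short of the target.
    return min(ideal_gain, max_gain)

if __name__ == "__main__":
    # Normal mode: 66 ms at gain 2.0; hjr mode halves the exposure time.
    print(hjr_gain_for_shorter_exposure(66.0, 2.0, 33.0, max_gain=8.0))  # 4.0
```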
Various configurations are illustrated by way of example, and not by way of limitation, in the accompanying drawings.
A flowchart illustrating how to operate a camera system in normal mode and hand jitter reduction mode is in
The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any configuration, scheme, design or calibration described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other configurations, schemes, designs, or calibrations. In general, described herein are techniques which reduce blurriness in digital pictures as a result of hand jitter and/or lower lighting conditions than normal. Also described are techniques to calibrate cameras which are operable in normal mode and hand jitter reduction mode.
In conventional camera devices, when a user takes a snapshot (currently done by pressing a button), mostly only one frame is used to generate a picture. Methods that use more than one frame to generate a picture are often unsuccessful because they yield poor results. With conventional camera devices, the picture may be blurry due to movements produced by the user's own hand; these hand movements are known as hand jitter. Conventional camera devices are also challenged by the amount of time required to expose a picture. Under low lighting conditions, the exposure time is typically increased. Increasing the exposure time increases the amount of noise that a user may see due to the low lighting conditions, and also increases the probability that hand jitter will produce a blurry picture. Currently, camera devices may contain small gyroscopes to compensate for the hand jitter produced by the user. However, there are many challenges faced when placing gyroscopes on mobile units. Even when these challenges are overcome, the digital hand jitter reduction techniques may be used in combination with devices that have gyroscopes. Current camera devices may also scale the gain under low lighting conditions. Unfortunately, simply increasing the gain amplifies the noise present as a result of the low light level. The result is often a picture of poor quality. Similarly, digital compensation for hand jitter does not always provide adequate results. However, with the techniques disclosed throughout this disclosure, it has been possible to reduce hand jitter, as well as reduce noise under lower light conditions.
Before a user presses the button to take a snapshot and produce a digital picture, a preview mode may capture a series of frames produced by the image sensor 102. The whole frame, or a sub-part of the frame, is referred to as an image or, interchangeably, a picture. For illustrative purposes, it is convenient to discuss the images being processed as a series of frames, although it should be recognized that the entire frame need not be processed when using a front-end image processing module 106. In addition, the sequence of frames is also known as a stream. The stream may be provided to a front-end image processing module 106 where the frames are de-mosaiced in order to obtain full RGB resolution as an input to the still image and video compressor 108. As the stream passes through the front-end image processing module 106 in the preview mode, statistics may be collected on frames that aid with the production of the digital picture. These statistics may be, but are not limited to, exposure metrics, white balance metrics, and focus metrics.
The front-end image processing module 106 may feed various signals, which help control the image sensor 102, back into the image sensor module 104. The still image and video compressor 108 may use JPEG compression, or any other suitable compression algorithm. An auto-exposure control module 110 may receive a value proportional to the light level being processed by the front-end image processing module 106 and compare it to a stored light target, in order to aid at least one of the functions of the front-end image processing module 106. Images that are processed through the modules in the front-end image processing module 106 are part of digital frames. The stream may also be sent to a viewfinder, which may be located in display module 112. In the preview mode, a preview decision from the display module 112 may be used in the control of the auto-exposure.
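As an illustration of the comparison performed by an auto-exposure control module such as module 110, the following Python sketch compares a value proportional to the measured light level against a stored light target and nudges the exposure accordingly. The names (measured_light, light_target, adjust_step) and the proportional update rule are assumptions made for this sketch, not the disclosed control law.

```python
# Minimal sketch: compare a measured light value against a stored light target
# and adjust the exposure time toward the target. The update rule is assumed.

def auto_exposure_update(measured_light: float, light_target: float,
                         exposure_ms: float, adjust_step: float = 0.1) -> float:
    """Nudge the exposure time toward the stored light target."""
    if measured_light < light_target:
        return exposure_ms * (1.0 + adjust_step)  # darker than target: expose longer
    if measured_light > light_target:
        return exposure_ms * (1.0 - adjust_step)  # brighter than target: expose less
    return exposure_ms                            # on target: no change

# Example: a frame measured darker than the target lengthens the exposure.
print(auto_exposure_update(measured_light=40.0, light_target=50.0, exposure_ms=33.0))
```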
The preview mode in a mobile unit having a digital camera may be used in either a normal mode or a hand jitter reduction (hjr) mode. The user may select the hjr mode (shown as hjr select in
After the color conversion module processes a frame, three color image-components (Y, Cb, and Cr) may be sent to hand jitter control module 212. The various parameters from the auto-exposure control module may be fed into hand jitter control module 212. Hand jitter control module 212 may serve multiple purposes. Hand jitter control module 212 may determine the image processing that takes place after the snapshot. Hand jitter control module 212 may detect the value of hjr select and determine if hand jitter reduction (hjr) needs to be performed. Even though the user has selected hjr mode, hand jitter control module 212 may determine that image processing as is done in normal mode may take place. Hand jitter control module 212 may also determine that image processing in hjr mode take place. Generating a digital picture with image processing in hjr mode may include capturing a single frame or multiple frames. If hand jitter control module 212 determines that multiple frames will be captured, the frames, after passing through hand jitter control module 212, may be sent to noise reduction/frame registration module 214, along with a parameter which indicates how many frames may be processed by noise reduction/frame registration module 214. If a single frame is to be processed, noise reduction may take place on the single frame through the use of a noise reduction module 215. Noise reduction module 215 may be a Bayer filter, or other similar filter. If multiple frames are to be processed, noise reduction/frame registration module 214 may buffer the number of frames, numf, specified by hand jitter control module 212, and perform frame registration on them. Depending on how many frames are used and on the light level, multiple frame registration may serve the purpose of noise reduction and/or blur reduction. Multiple frame registration may be done by a frame registration module 216.
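The routing decision made by hand jitter control module 212 can be pictured with the following Python sketch: a single frame goes through noise reduction only, while multiple frames are buffered up to numf and passed to frame registration. The function names and the toy frame type are assumptions for illustration and are not the module's actual interfaces.

```python
# Minimal sketch of the single-frame vs. multi-frame routing decision.
from typing import List, Sequence

Frame = List[List[int]]  # toy luma frame; a real pipeline would use sensor buffers

def reduce_noise(frame: Frame) -> Frame:
    """Placeholder single-frame noise reduction (e.g., a Bayer-domain filter)."""
    return frame

def register_frames(frames: Sequence[Frame]) -> Frame:
    """Placeholder multi-frame registration: align and combine buffered frames."""
    return frames[0]

def hjr_process(frames: Sequence[Frame], hjr_needed: bool, numf: int) -> Frame:
    if not hjr_needed or numf <= 1:
        # Single-frame path: noise reduction only.
        return reduce_noise(frames[0])
    # Multi-frame path: buffer numf frames and register them; depending on the
    # frame count and light level this may serve noise and/or blur reduction.
    return register_frames(list(frames[:numf]))
```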
If hand jitter control module 212 determines that image processing takes place as in normal mode, noise reduction/frame registration module 214 may not be used, and the output from color correction module 210, for example, may be used, even though the user selected hjr mode. Depending on which image processing (the one in normal mode or the one in hjr mode) is determined by hand jitter control module 212, a signal (sel) may be used to select which output multiplexer 217 sends to post-process module 218. The output of post-process module 218 may be sent to still image and video compressor 108 and/or display module 112.
In addition to outputting a select signal (sel) and the number of frames to use for noise reduction and/or frame registration, hand jitter control module 212 may also output other parameters: a new auto-exposure frame rate (ae_fr_new), a new auto-exposure gain (ae_gain_new), a new auto-exposure time (ae_time_new), and the number of frames to be processed (numf). These parameters may be sent to image sensor module 104 to control image sensor 102. A digital gain may also be output by hand jitter control module 212 and may be applied at any module after the image sensor module 104. As an example, the digital gain may be applied in the white-balance/color correction module 206.
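For illustration, the parameters listed above can be grouped into a single output bundle, as in the following Python sketch. The dataclass, its field names, and the example values are assumptions for this sketch, chosen only to mirror the parameter names in the text.

```python
# Minimal sketch of the parameter bundle a hand jitter control module might
# output toward the image sensor module. Structure and values are assumed.
from dataclasses import dataclass

@dataclass
class HjrControlOutputs:
    ae_fr_new: float     # new auto-exposure frame rate (frames per second)
    ae_gain_new: float   # new auto-exposure (analog) gain
    ae_time_new: float   # new auto-exposure time (e.g., milliseconds)
    numf: int            # number of frames to be processed
    sel: int             # selects which multiplexer output is forwarded
    digital_gain: float  # may be applied at any module after the sensor module

example = HjrControlOutputs(ae_fr_new=30.0, ae_gain_new=2.0, ae_time_new=33.0,
                            numf=2, sel=1, digital_gain=1.5)
```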
Those ordinarily skilled in the art will recognize that while pixels are normally described, sub-pixels or multiple pixels may also be used as inputs into front-end image processing module 106a. Furthermore, a sub-set of these image-components, or other forms such as RGB or spatial-frequency transformed pixels, may also be sent to a hand jitter control module, such as hand jitter control module 212. As such, another configuration which captures the functionality of a front end image processing module in a digital image processing system is illustrated in
The normal mode and hand jitter reduction mode may be calibrated by creating at least one auto-exposure time-gain table. An auto-exposure time-gain table may have an exposure time column and a gain column. An entry in the exposure time column and an entry in the gain column may be multiplied to produce an exposure time-gain product. Each row entry, or index, into an auto-exposure time-gain table may represent a light level value; that is, each light level may be mapped to an auto-exposure index into the auto-exposure time-gain table. The auto-exposure time-gain table(s) may have various regions of operation, such as those designated in
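The following Python sketch illustrates such a table: each index, mapped from a light level, holds an exposure time entry and a gain entry whose product is the exposure time-gain product. The table values and the light-level-to-index mapping are assumptions for illustration, not calibration data from the disclosure.

```python
# Minimal sketch of an auto-exposure time-gain table lookup. Values are assumed.
AE_TABLE = [
    # (exposure time in ms, gain); index 0 corresponds to the brightest light level
    (8.0, 1.0),
    (16.0, 1.0),
    (33.0, 1.5),
    (33.0, 2.0),
    (66.0, 2.0),
]

def index_for_light_level(light_level: float, max_level: float = 100.0) -> int:
    """Map a measured light level to a table index (brighter -> lower index)."""
    frac = max(0.0, min(1.0, 1.0 - light_level / max_level))
    return min(len(AE_TABLE) - 1, int(frac * len(AE_TABLE)))

def exposure_time_gain_product(index: int) -> float:
    exposure_ms, gain = AE_TABLE[index]
    return exposure_ms * gain

print(exposure_time_gain_product(index_for_light_level(20.0)))  # 132.0 for this table
```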
In regions R1 and R2, the camera may operate at one frame rate, e.g., fr1. As shown in
It is desirable to make the transition from one region to another as continuous as possible. Thus, the exposure time-gain product at the far right in region R2 may be near or the same as the exposure time-gain product at the far left in region R3. Since the exposure time is at a maximum, as shown at the far right in region R2, the frame rate may be lowered at boundary_b. The lowering of the frame rate, i.e., changing the frame rate from fr1 to fr2, allows an increase in the exposure time. The exposure time may reach the maximum (as shown in
Hand jitter reduction (hjr) mode may be calibrated by creating another auto-exposure time-gain table. Subtraction(s), addition(s), division(s), or multiplication(s) may be performed on the normal mode auto-exposure time-gain table entries to generate an “equivalent” auto-exposure time-gain table desired in the hand jitter reduction mode. The characteristics in the columns (exposure time and gain) of the other auto-exposure time-gain table, or “equivalent” auto-exposure time-gain table, are illustrated by figures
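As a concrete, assumed example of such arithmetic, the following Python sketch derives an “equivalent” hjr-mode table from a normal-mode table by dividing each exposure time and multiplying each gain so that each exposure time-gain product is preserved. The factor of two is an assumption for illustration; the disclosure only states that entries may be modified by adding, subtracting, multiplying, or dividing.

```python
# Minimal sketch: derive an "equivalent" hjr-mode auto-exposure time-gain table
# from the normal-mode table. The halving/doubling factors are assumptions.
NORMAL_TABLE = [(8.0, 1.0), (16.0, 1.0), (33.0, 1.5), (33.0, 2.0), (66.0, 2.0)]

def derive_hjr_table(normal_table, exposure_divisor=2.0, gain_multiplier=2.0):
    """Shorten each exposure time and raise each gain so that each row's
    exposure time-gain product stays the same as in normal mode."""
    return [(t / exposure_divisor, g * gain_multiplier) for (t, g) in normal_table]

HJR_TABLE = derive_hjr_table(NORMAL_TABLE)
# Each hjr row keeps the normal-mode row's exposure time-gain product.
assert all(abs(tn * gn - th * gh) < 1e-9
           for (tn, gn), (th, gh) in zip(NORMAL_TABLE, HJR_TABLE))
```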
Longer exposure time increases the probability that slight movements produced by the hand may lead to blurriness. Thus, once the eye can detect a blur and/or noise reduction, in region R2, the exposure time should be decreased. Hence, the exposure time in region R2 is reduced (as shown in
As mentioned previously, in either normal mode or hjr mode, a series of frames may be previewed. Preview mode uses the characteristics and (unmodified) auto-exposure parameters of normal mode. In hjr mode, the equivalent auto-exposure time-gain table is defined relative to the normal mode used while in preview mode. Thus, when increasing or decreasing the exposure time, the gain, or the frame rate, the change is relative to the value in preview mode. In the hjr mode, boundary_b may be determined by checking when the exposure time-gain product reaches the same value as the exposure time-gain product in the normal mode. In hjr mode, if the light level is in region R3, the frame rate may be increased beyond frame rate fr2. This may happen because in preview mode, in region R3, the frame rate is fr2 (a rate lower than fr1). For example, if frame rate fr1 is 30 frames per second (fps), and frame rate fr2 is 15 fps, then in hjr mode the frame rate of 15 fps may be increased by some amount L, up to frame rate fr1. In this example, L is 15 fps. Hence, in hjr mode, increasing the frame rate from 15 fps to 30 fps, for example, allows the exposure time to be maximized in region R3 to the value that it was in region R2 for the normal mode. The increase in exposure time in region R3 may lead to a decrease in gain in region R3. If an analog gain is used throughout regions R1, R2, and R3, then because of the increased gain offset (at the far left of region R2) in the hjr mode, the analog gain in region R3 may saturate, i.e., reach a maximum (as seen at the far right of region R3) prior to boundary_c. To operate beyond the maximum analog gain, a digital gain may be added to the analog gain. The digital gain is illustrated at the bottom of
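The region R3 adjustment described above can be sketched in Python as follows: raise the preview frame rate fr2 by L (up to fr1), bound the exposure time by the new frame period, and carry any gain demand beyond the analog maximum with a digital gain. The frame-period bound, the function name, and all numeric values are assumptions for illustration.

```python
# Minimal sketch of hjr-mode region R3: raise the frame rate from fr2 toward fr1,
# maximize the exposure time under the new frame period, and split the remaining
# gain demand into analog and digital parts.

def hjr_region_r3(fr1: float, fr2: float, target_product: float,
                  max_analog_gain: float):
    L = fr1 - fr2                                # e.g., 30 fps - 15 fps = 15 fps
    new_frame_rate = fr2 + L                     # raised up to fr1
    max_exposure_ms = 1000.0 / new_frame_rate    # exposure bounded by the frame period
    required_gain = target_product / max_exposure_ms
    analog_gain = min(required_gain, max_analog_gain)
    digital_gain = required_gain / analog_gain   # 1.0 unless the analog gain saturates
    return new_frame_rate, max_exposure_ms, analog_gain, digital_gain

print(hjr_region_r3(fr1=30.0, fr2=15.0, target_product=132.0, max_analog_gain=3.0))
```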
The light level in normal mode at boundary_c may be stored as a predetermined light target to be checked in hjr mode. In hjr mode, if the light level is below that of the stored light target, it may be that there is not enough light for an image sensor 102 to produce its minimum average light level. A minimum average light level may be determined by adding the light value (luma and/or chrominance) at each pixel and dividing by the total number of pixels. Another way to compute the minimum average light level for an image sensor 102 is to discard from the computation all pixels which are below a certain threshold. For example, pixels below the value 10 are not used, and the remaining pixels (above the certain threshold) are used to compute the minimum average light level. Typically, luma light values are used, although chrominance values may also be used. For illustrative purposes, the luma value is discussed. In the hjr mode, if the luma value is below the predetermined luma (light) target, the light (luma) level will be in region R4.
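The average-luma computation described above, with the optional thresholding (the example value of 10 comes from the text), can be sketched in Python as follows. The function name and the list-based frame representation are assumptions for illustration.

```python
# Minimal sketch: average luma over a frame, optionally discarding pixels below
# a threshold (the value 10 is the example given in the text).
from typing import Iterable, Optional

def average_luma(luma_values: Iterable[int], threshold: Optional[int] = None) -> float:
    values = list(luma_values)
    if threshold is not None:
        values = [v for v in values if v >= threshold]  # discard dark pixels
    if not values:
        return 0.0
    return sum(values) / len(values)

frame = [2, 5, 12, 40, 200, 7, 90]
print(average_luma(frame))                # average over all pixels
print(average_luma(frame, threshold=10))  # average over pixels at or above 10 only
```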
If it has been determined that the light level is below the luma target in hjr mode, the frame rate, which in preview mode was fr2, does not need to be changed. The exposure time, however, may be adjusted because of the reduced amount of light, and may be increased. The exposure time may be increased to the maximum allowable for frame rate fr2. To counteract the increase in exposure time, the digital gain may be decreased at boundary_c, and may then continue to increase through region R4.
Throughout regions R1, R2, and R3, the exposure time-gain product in hjr mode is aimed to be the same as in normal mode. That is, the difference between the two should be reduced to be as close as possible to zero. In region R4, an image sensor 102 may not be meeting the minimum average light level. Hence, a digital gain may be applied to increase the gain level in region R4.
A flowchart illustrating how to operate a camera system in normal mode and hand jitter reduction mode is in
Selection of the minimum between the gain increased by K and the maximum (analog) gain 810 may be used if the increased gain exceeds the maximum analog gain of the image sensor. It may also be used in a configuration where a digital gain is used in a region like R2. The ratio of the gain increased by K and the new analog gain 812 (ae_gain_new) may be compared with a constant C1. C1 is related to the maximum analog gain of the sensor, and a typical value may be one. The minimum between C1 and the ratio of the gain increased by K and the new analog gain is selected 814, and the selected minimum value may be applied as a digital gain.
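The gain steps just described (blocks 810, 812, and 814) can be sketched in Python as follows: clip the gain increased by K to the sensor's maximum analog gain to obtain ae_gain_new, then limit the resulting ratio by C1 to obtain a digital gain. K, C1, and the example values are assumptions for illustration; the text gives one as a typical C1, and a larger C1 is used here only so that the digital gain is visible in the output.

```python
# Minimal sketch of splitting an increased gain into analog and digital parts.
def split_gain(gain: float, K: float, max_analog_gain: float, C1: float):
    increased_gain = gain * K
    ae_gain_new = min(increased_gain, max_analog_gain)  # block 810: clip to analog max
    ratio = increased_gain / ae_gain_new                # block 812: leftover demand
    digital_gain = min(C1, ratio)                       # block 814: limited by C1
    return ae_gain_new, digital_gain

# Example (assumed values): a 2x gain increased by K=3 exceeds a 4x analog ceiling,
# so the remainder is applied as a digital gain of 1.5.
print(split_gain(gain=2.0, K=3.0, max_analog_gain=4.0, C1=2.0))
```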
A decision block 816 checks whether the light level is greater than the light level at boundary_b. If the light level is greater than the light level at boundary_b, then modification of auto-exposure parameters if needed 706 ends. If the light level is less than the light level at boundary_b, a decision block 818 checks whether the light level is greater than the light level at boundary_c. If the light level is greater than the light level at boundary_c, then the frame rate may increase by some amount L 820. As mentioned above, in region R3 in hjr mode the frame rate fr2 may be increased up to and including frame rate fr1. Thus, L may be an amount that adjusts the frame rate closer to, and up to, frame rate fr1. The number of frames to process a digital picture, numf, may be set to two 822. If the light level is less than the light level at boundary_c, the new exposure time 824 and new gain 826 in hjr mode may be the same as those used in normal mode. The ratio of the luma_target to the measured luma (i.e., the average luma for the image sensor) 828 may be compared with a constant C2. The minimum selection 830 between the constant C2 and the ratio of the luma_target to the measured luma may produce a digital gain. A typical value of two may be used for C2, and may correspond to using the constant instead of the ratio when the measured luma in region R4 has dropped below half of the luma_target. When the image sensor is not meeting the luma target, more frames may be processed to generate a digital picture (numf) in order to reduce noise and increase intensity after frame registration. A minimum value of three frames for numf 832 has been found to suffice in this case.
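The flow just described (decision blocks 816 and 818, and blocks 820 through 832) can be sketched in Python as follows. The function name, the dictionary of parameters, and the numeric example are assumptions for illustration.

```python
# Minimal sketch of the boundary_b / boundary_c decision flow in hjr mode.
def modify_ae_parameters(light_level, boundary_b, boundary_c,
                         frame_rate, L, luma_target, measured_luma, C2=2.0):
    params = {"frame_rate": frame_rate, "numf": 1, "digital_gain": 1.0}
    if light_level > boundary_b:       # block 816: above boundary_b, nothing to modify
        return params
    if light_level > boundary_c:       # block 818: region R3
        params["frame_rate"] = frame_rate + L  # block 820: raise frame rate toward fr1
        params["numf"] = 2                     # block 822: process two frames
        return params
    # Region R4: exposure time and gain stay as in normal mode (blocks 824/826);
    # the digital gain is min(C2, luma_target / measured luma) (blocks 828/830),
    # and at least three frames are processed (block 832).
    params["digital_gain"] = min(C2, luma_target / measured_luma)
    params["numf"] = 3
    return params

print(modify_ae_parameters(light_level=5, boundary_b=60, boundary_c=20,
                           frame_rate=15.0, L=15.0,
                           luma_target=50.0, measured_luma=20.0))
```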
A number of different configurations and techniques have been described. The techniques may improve the removal of blurriness from images captured with longer exposure times. The techniques and configurations may also aid in the reduction of hand jitter for practically any digital device that takes pictures. The techniques and configurations may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the techniques and configurations may be directed to a computer-readable medium comprising computer-readable program code (which may also be called computer code) that, when executed in a device that takes pictures, performs one or more of the methods mentioned above.
The computer-readable program code may be stored on memory in the form of computer-readable instructions. In that case, a processor such as a DSP may execute instructions stored in memory in order to carry out one or more of the techniques described herein. In some cases, the techniques may be executed by a DSP that invokes various hardware components, such as multiplication of an exposure time and gain, to generate an exposure time-gain product. The exposure time-gain product(s) disclosed may be implemented in one or more microprocessors, one or more application specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), or some other hardware-software combination. These techniques and configurations are within the scope of the following claims.
This application claims the benefit of provisional U.S. Application Ser. No. 60/760,768, entitled “HAND JITTER REDUCTION SYSTEM DESIGN,” filed Jan. 19, 2006. This disclosure is related to co-pending patent application Ser. No. 11/534,935 entitled “HAND JITTER REDUCTION COMPENSATING FOR ROTATIONAL MOTION,” and co-pending patent application Ser. No. 11/534,808 entitled “HAND JITTER REDUCTION FOR COMPENSATING FOR LINEAR DISPLACEMENT,” both co-filed with this application on Sep. 25, 2006.
Number | Date | Country
--- | --- | ---
20070166020 A1 | Jul 2007 | US
Number | Date | Country
--- | --- | ---
60760768 | Jan 2006 | US