The present invention relates generally to photography.
Image blur caused by camera shake is a common problem in photography. The problem is especially acute when a lens of relatively long focal length is used, because the effects of camera motion are magnified in proportion to the lens focal length. Many cameras, including models designed for casual “point and shoot” photographers, are available with zoom lenses that provide quite long focal lengths. Especially at the longer focal length settings, camera shake may become a limiting factor in a photographer's ability to take an unblurred photograph, unless corrective measures are taken.
Some simple approaches to reducing blur resulting from camera shake include placing the camera on a tripod, and using a “fast” lens (one with a relatively large aperture) that enables relatively short exposure times. However, a tripod may not be readily available or convenient in a particular photographic situation. Large-aperture lenses are often bulky and expensive, and are not always available. In addition, the photographer may wish to use a smaller lens aperture to achieve other photographic effects such as large depth of field.
Various devices and techniques have been proposed to help address the problem of image blur due to camera shake. Some cameras or lenses are equipped with image stabilization mechanisms that sense the motion of the camera and move one or more optical elements in such a way as to compensate for the camera shake. Such motion-compensation systems often add complexity and cost to a camera.
Image data signals 104 are passed to logic 110. Logic 110 interprets the image data signals 104, converting them to a numerical representation, called a “digital image”, a “digital photograph”, or simply an “image” or “photograph”. A digital image is an ordered array of numerical values that represent the brightness or color or both of corresponding locations in a scene or picture. Logic 110 may perform other functions as well, such as analyzing digital images taken by the camera for proper exposure, adjusting camera settings, performing digital manipulations on digital images, managing the storage, retrieval, and display of digital images, accepting inputs from a user of the camera, and other functions. Logic 110 also controls electronic array light sensor 103 through control signals 105. Logic 110 may comprise a microprocessor, a digital signal processor, dedicated logic, or a combination of these.
Storage 111 comprises memory for storing digital images taken by the camera, as well as camera setting information, program instructions for logic 110, and other items. User controls 112 enable a user of the camera to configure and operate the camera, and may comprise buttons, dials, switches, or other control devices. A display 109 may be provided for displaying digital images taken by the camera, as well as for use in conjunction with user controls 112 in the camera's user interface. A flash or strobe light 106 may provide supplemental light 107 to the scene, under control of strobe electronics 108, which are in turn controlled by logic 110. Logic 110 may also provide control signals 113 to control lens 101. For example, logic 110 may adjust the focus of the lens 101, and, if lens 101 is a zoom lens, may control the zoom position of lens 101.
Motion sensing element 114 senses motion of camera 100, and supplies information about the motion to logic 110.
The amount of light collected from a particular scene location by a particular pixel is generally in proportion to the time duration for which rays from that scene location impinge on the pixel. This duration is generally inversely proportional to the speed of camera rotation during the impingement (and, of course, limited by the exposure time for the photograph).
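The scaling of blur with rotation speed and focal length can be illustrated with a small calculation. The following sketch uses the standard small-angle approximation (image displacement ≈ focal length × angle swept); the function name and the sample numbers are illustrative assumptions, not part of the disclosed camera.

```python
# Illustrative sketch (not part of the disclosed camera): blur extent on the
# sensor, in pixels, for a small camera rotation during an exposure.

def blur_extent_pixels(angular_rate_rad_s, exposure_s, focal_length_mm, pixel_pitch_mm):
    """Approximate blur length using the small-angle approximation:
    image displacement = focal length * angle swept during the exposure."""
    displacement_mm = focal_length_mm * angular_rate_rad_s * exposure_s
    return displacement_mm / pixel_pitch_mm

# Example: 0.01 rad/s shake, 1/30 s exposure, 50 mm lens, 0.005 mm pixels.
extent = blur_extent_pixels(0.01, 1 / 30, 50.0, 0.005)
```

Doubling either the angular rate or the focal length doubles the blur extent, which is why long focal lengths magnify the effect of camera shake.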
Integrator 403 may be an analog circuit, or the function of integrator 403 may be performed digitally. Similarly, scaling block 405 may be an analog amplifier, but preferably its function is performed digitally, so that changes in lens focal length may be easily accommodated. Some of the functions of motion sensing element 114 may be performed by logic 110. For example, the functions of integrator 403 and scaling block 405 may be performed by a microprocessor or other circuitry comprised in logic 110.
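A minimal sketch of performing the integrator and scaling functions digitally follows. The function names, the rectangular integration scheme, and the scale factor (focal length divided by pixel pitch) are illustrative assumptions; the disclosure does not specify these details.

```python
# Illustrative sketch: digital equivalents of integrator 403 and scaling
# block 405. Digital scaling makes it easy to accommodate changes in lens
# focal length, e.g. as a zoom lens is adjusted.

def integrate_rate_to_position(rate_samples, dt):
    """Numerically integrate angular-rate samples (rad/s) into angle (rad)
    using simple rectangular (Euler) integration."""
    angle = 0.0
    positions = []
    for r in rate_samples:
        angle += r * dt
        positions.append(angle)
    return positions

def scale_to_pixels(angles_rad, focal_length_mm, pixel_pitch_mm):
    """Scale integrated angle into image displacement in pixels. The scale
    factor depends on focal length, so it is updated when the lens zooms."""
    k = focal_length_mm / pixel_pitch_mm
    return [a * k for a in angles_rad]

rates = [0.01, 0.01, -0.02]  # example gyroscope samples, rad/s
pos = scale_to_pixels(integrate_rate_to_position(rates, dt=0.001), 50.0, 0.005)
```

Because the scale factor is a single multiplication applied in software, re-scaling for a new zoom position costs nothing, which is the advantage the text attributes to the digital implementation.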
In accordance with an example embodiment of the invention, logic 110 monitors the camera motion by monitoring position signal 406, and uses the information to control camera 100 so that extreme motion blur is avoided, and also to perform image processing to substantially compensate for remaining motion blur that does occur in photographs taken by camera 100.
In a preferred embodiment, camera 100 avoids extreme motion blur by controlling, in response to the measured motion of camera 100, a starting time, an ending time, or both for a photographic exposure. For the purposes of this disclosure, the starting and ending times for a photographic exposure are called exposure boundaries. Techniques for selecting exposure boundaries for the purpose of avoiding extreme motion blur are known in the art.
Pending U.S. patent application Ser. No. 10/339,132, entitled “Apparatus and method for reducing image blur in a digital camera” and having a common inventor and a common assignee with the present application, describes a digital camera that delays the capture of a digital image after image capture has been requested until the motion of the digital camera satisfies a motion criterion. That application is hereby incorporated in its entirety as if it were reproduced here. In one example embodiment, the camera of application Ser. No. 10/339,132 delays capture of a digital image until the output of a motion tracking subsystem reaches an approximate local minimum, indicating that the camera is relatively still. Such a camera avoids extreme motion blur by selecting, in response to measured camera motion, an exposure boundary that is the starting time of an exposure interval.
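The start-boundary strategy described for the incorporated application can be sketched as follows. This is an illustrative sketch only: the sample format (a stream of motion magnitudes) and the local-minimum test are assumptions, not the incorporated application's actual implementation.

```python
# Illustrative sketch: delay the start of an exposure until the tracked
# motion magnitude reaches an approximate local minimum, indicating that
# the camera is relatively still.

def find_start_index(motion_magnitudes):
    """Return the index of the first approximate local minimum, i.e. the
    first sample whose neighbors are both at least as large."""
    for i in range(1, len(motion_magnitudes) - 1):
        if (motion_magnitudes[i] <= motion_magnitudes[i - 1]
                and motion_magnitudes[i] <= motion_magnitudes[i + 1]):
            return i
    return len(motion_magnitudes) - 1  # no minimum found: start at the end

samples = [0.9, 0.7, 0.4, 0.5, 0.8]  # the camera is stillest at index 2
start = find_start_index(samples)
```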
Pending U.S. patent application Ser. No. 10/842,222, entitled “Image-exposure system and methods” and also having a common inventor and a common assignee with the present application, describes detecting motion and determining when to terminate an image exposure based on the detected motion of a camera. That application is hereby incorporated in its entirety as if it were reproduced here. In one example embodiment described in application Ser. No. 10/842,222, an image-exposure system comprises logic configured to terminate an exposure when the motion exceeds a threshold amount. Such a method avoids extreme motion blur by selecting, in response to measured camera motion, an exposure boundary that is the ending time of an exposure interval.
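The complementary end-boundary strategy can be sketched in the same style. Again, this is an assumption-laden illustration: the threshold value and the displacement-track format are invented for the example.

```python
# Illustrative sketch: terminate the exposure as soon as the accumulated
# image displacement exceeds a threshold amount.

def exposure_end_index(displacement_pixels, threshold_pixels=3.0):
    """Return the sample index at which to end the exposure: the first
    sample whose displacement from the start exceeds the threshold."""
    for i in range(len(displacement_pixels)):
        if abs(displacement_pixels[i] - displacement_pixels[0]) > threshold_pixels:
            return i
    return len(displacement_pixels) - 1  # never exceeded: full exposure

track = [0.0, 0.5, 1.2, 2.1, 3.6, 5.0]  # displacement crosses 3.0 at index 4
end = exposure_end_index(track)
```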
Other methods and devices may be envisioned that select, based on measured camera motion, a starting time, an ending time, or both for a photographic exposure. In example camera 100, logic 110 monitors the camera motion, as detected by motion sensing element 114, and selects one or more exposure boundaries for a photograph.
Devices and methods also exist in the art for performing image processing to substantially compensate for motion blur. Pending U.S. patent application Ser. No. 11/148,985, entitled “A method and system for deblurring an image based on motion tracking” and having a common assignee with the present application, describes deblurring an image based on motion tracking. That application is hereby incorporated in its entirety as if it were reproduced here. In one example embodiment described in application Ser. No. 11/148,985, motion of an imaging device is sensed during a photographic exposure, a blur kernel is generated based on the motion, and the resulting photograph is deblurred based on the blur kernel.
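The blur-kernel generation step described for the incorporated application can be sketched as accumulating dwell time along the sensed trajectory into a small kernel. The grid size, rounding scheme, and sample trajectory are illustrative assumptions.

```python
# Illustrative sketch: build a normalized blur kernel from tracked camera
# motion. Each sensed (x, y) position, in pixel coordinates relative to the
# start of the exposure, deposits weight into the kernel cell it occupies.

def blur_kernel_from_trajectory(positions, size=7):
    """Accumulate a normalized blur kernel from pixel-space positions.
    Positions are assumed to fit within the size x size kernel grid."""
    kernel = [[0.0] * size for _ in range(size)]
    c = size // 2  # center the trajectory origin in the kernel
    for x, y in positions:
        kernel[c + round(y)][c + round(x)] += 1.0  # dwell time per cell
    total = sum(sum(row) for row in kernel)
    return [[v / total for v in row] for row in kernel]

# A short horizontal drift that lingers at its endpoint: the kernel is a
# small horizontal streak, heavier where the camera dwelt longer.
k = blur_kernel_from_trajectory([(0, 0), (1, 0), (2, 0), (2, 0)])
```

The kernel sums to one, so convolving a sharp image with it conserves total brightness, matching the physical picture that blur redistributes, rather than creates, collected light.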
Preferably, image processing performed to compensate for motion blur, in accordance with an example embodiment of the present invention, is performed using frequency-domain methods. Such methods are known in the art. See for example The Image Processing Handbook, 2nd ed. by John C. Russ, CRC Press, 1995. An example of frequency domain processing is given below.
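One well-known frequency-domain method is Wiener deconvolution, sketched below with NumPy's FFT routines. This is an illustrative sketch rather than the camera's actual implementation; the noise-to-signal constant `nsr` and the toy image are assumptions.

```python
# Illustrative sketch: frequency-domain deblurring via Wiener deconvolution.
import numpy as np

def wiener_deblur(blurred, kernel, nsr=0.01):
    """Deblur an image given its blur kernel using Wiener filtering."""
    H = np.fft.fft2(kernel, s=blurred.shape)  # kernel spectrum
    G = np.fft.fft2(blurred)                  # blurred-image spectrum
    # Wiener filter: conj(H) / (|H|^2 + NSR), applied per frequency.
    F_hat = G * np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(F_hat))

# Blur a toy impulse image with a 1x3 horizontal kernel, then recover it.
img = np.zeros((16, 16)); img[8, 8] = 1.0
kern = np.zeros((16, 16)); kern[0, 0:3] = 1 / 3
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kern)))
recovered = wiener_deblur(blurred, kern)
```

The `nsr` term prevents division by near-zero frequency components of the kernel, which is where naive inverse filtering would amplify noise into the artifacts discussed below.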
The combination, in camera 100, of avoiding extreme motion blur by selecting one or more exposure boundaries for a photograph with image processing to compensate for residual blur in the resulting photograph provides a synergistic improvement in image quality. Each capability enhances the performance of the other.
While image processing performed to remove motion blur can improve an image considerably, it can introduce noise artifacts into the image, some of which are visible in recovered photograph 1201. These artifacts tend to be worse when the motion blur vector has a complex trajectory or extends over a relatively large number of pixels. That is, the larger and more complex the camera motion, the less reliable image processing is for recovering an unblurred image. Furthermore, when the exposure time for a photograph is long enough that large or complex camera motions can occur during the exposure, other uncompensated camera motion is also more likely to occur. For example, camera rotation about the Z axis or camera translations, which are not detected by motion sensing element 114, may occur. These motions may cause the image processing to fail to remove image blur. Because camera 100 avoids extreme motion blur by selecting the starting time, ending time, or both of a photographic exposure, image processing performed to remove the residual blur from the resulting photograph is likely to result in a pleasing image. The blur vector is kept to a manageable size, and the exposure time may be kept short enough that the other, uncompensated motions remain insignificant.
Similarly, the ability to perform image processing to compensate for image blur allows more flexible use of blur minimization by selection of exposure boundaries based on camera motion. Without the capability to perform the image processing, camera 100 would constrain exposure times based on camera motion so that the resulting photographs were acceptably sharp. With the capability of performing the image processing, camera 100 can extend exposure times, relying on the image processing to correct the motion blur that occurs. These extended exposure times are very desirable because they allow the photographer increased flexibility, and can enable convenient handheld camera operation in situations where it would otherwise be infeasible.
Furthermore, these performance improvements are accomplished without the need for actuating an optical element in the camera. Much of the required control and processing occurs in logic 110, which would likely be present in a camera without an embodiment of the invention. If the image processing to compensate for the residual motion blur is performed by using a blur kernel, the relatively small blur vector allowed by the extreme blur avoidance may also reduce the time required to perform the image processing, as compared with a camera that relies on image processing alone to compensate for motion blur.
In another example embodiment of the invention, logic 110 is configured to choose exposure boundaries encompassing camera motion that is especially amenable to compensation by digital image processing. For example, logic 110 may favor linear motion in its selection of exposure boundaries for a photograph, choosing exposure boundaries between which the camera motion is substantially linear. For the purposes of this disclosure, linear motion is camera motion that causes the camera's optical axis to trace a straight line on an imaginary distant surface. Linear motion also results in a blur vector that is a straight line. Note that during linear motion, the camera may actually be rotating about one or more axes.
If sufficient exposure has occurred before time T2, logic 110 may terminate the exposure before time T2. Likewise, if the delay between S2 and T1 is too long for crisp camera operation, that is, if after S2 logic 110 must wait an excessively long time before finding a trajectory portion with little curvature, logic 110 may start the exposure despite the curvature, in the interest of being quickly responsive to the photographer's command.
Criteria for determining times T1 and T2 will depend on the camera geometry, lens focal length, the processing capability of logic 110, and other factors. For example, logic 110 may select T1 to be a time when trajectory 1301 has not deviated from a straight line by more than a first predetermined number of pixels in a second predetermined number of previous pixels most recently traversed. For example, logic 110 may select T1 to be a time when trajectory 1301 has not deviated from a straight line by more than three pixels in image space in the previous 10 pixels traversed. Similarly, logic 110 may select T2 to be a time when trajectory 1301 has again deviated from a straight line by more than a first predetermined number of pixels in a second predetermined number of pixels most recently traversed. For example, logic 110 may select T2 to be a time when trajectory 1301 has again deviated from a straight line by more than three pixels in image space in the previous 10 pixels traversed. The first and second predetermined numbers of pixels used for selecting T1 need not be the same as the first and second predetermined numbers used for selecting T2.
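The straightness test described above can be sketched as follows. The thresholds (three pixels of deviation within the last 10 pixels traversed) follow the example in the text; the chord-based distance measure is an assumption, since the disclosure does not specify how deviation from a straight line is computed.

```python
# Illustrative sketch: over the most recent window of traversed pixels,
# measure how far the trajectory strays from the straight line (chord)
# joining the window's endpoints.

def max_deviation_from_line(points):
    """Greatest perpendicular distance of the points from the line through
    the first and last point of the window."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5 or 1.0  # guard: coincident endpoints
    return max(abs(dy * (x - x0) - dx * (y - y0)) / length for x, y in points)

def trajectory_is_linear(points, max_dev=3.0, window=10):
    """True if the last `window` points stay within `max_dev` pixels of a
    straight line, per the three-pixels-in-ten example in the text."""
    return max_deviation_from_line(points[-window:]) <= max_dev

straight = [(i, 0.1 * i) for i in range(10)]                     # linear drift
curved = [(i, 0) for i in range(5)] + [(4, j) for j in range(1, 6)]  # sharp turn
```

Logic 110 would run such a test continuously, taking T1 as the first time the test passes and T2 as the first subsequent time it fails.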
This application is related to the following application, which is filed on the same date as this application, and which is assigned to the assignee of this application: “Motion blur reduction and compensation” (U.S. application Ser. No. ______, not yet assigned).