With the proliferation of low-cost microprocessors, memory, and image capture electronics, digital cameras are gaining popularity and becoming widely available to consumers. One of the advantages of a digital camera over a conventional film camera is that when a digital camera captures an image, the image is stored electronically in a memory element associated with the camera and is available for immediate viewing. For example, it is common to capture an image using a digital camera and then immediately display the captured image on a display screen associated with the digital camera. This ability to immediately view the image is commonly referred to as “instant review.” The ability to immediately review the captured image allows the user to decide at once whether the image is satisfactory and worth keeping. The image may then be printed at a later time.
Many characteristics for determining whether the image is satisfactory may not be readily noticeable on the small display associated with many digital cameras. The displays used on the cameras typically are not able to display an image with the clarity of a printed image. Therefore, the user may not be able to determine whether image quality was optimized simply by viewing the image displayed on the display. For example, while the image may appear to be in focus and properly exposed when viewed on the camera display, the image may appear out of focus and improperly exposed when it is printed. Unfortunately, printing the image is a time-consuming and costly way to determine whether an image is satisfactory.
Devices and methods for analyzing images are described herein. The devices and methods described herein analyze image data that is representative of images. The devices and methods for analyzing images may be implemented in hardware, software, firmware, or a combination thereof. In one embodiment, the system and method for analyzing images are implemented using a combination of hardware, software or firmware that is stored in a memory and that is executable by a suitable instruction execution system. In the embodiments described herein, the device is a digital camera wherein software stored on hardware in the camera analyzes image data or otherwise instructs the digital camera to analyze image data.
The hardware portion of the system and method for analyzing a captured image can be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc. The software portion of the system and method for analyzing a captured image can be stored in one or more memory elements and executed by a suitable general purpose or application specific processor.
The software for analyzing images, which comprises an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “computer-readable medium” can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The camera 100 includes an image sensor 104. The image sensor 104 may comprise a charge coupled device (CCD) or an array of complementary metal oxide semiconductors (CMOS), which are both arrays of light sensors. Both the CCD and the CMOS sensor include a two-dimensional array of photosensors, which are sometimes referred to as pixels. The pixels convert light of specific wavelengths, or colors, into voltages that are representative of the light intensities. In one embodiment, higher pixel values or voltages are representative of higher intensities of light and lower pixel values are representative of lower intensities of light.
In one embodiment of the camera 100, the image sensor 104 captures an image of a subject by converting incident light into an analog signal. The analog signal is transmitted via a connection 109 to an analog front end (AFE) processor 111. The analog front end processor 111 typically includes an analog-to-digital converter for converting the analog signal received from the image sensor 104 into a digital signal. The analog front end processor 111 provides this digital signal as image data via a connection 112 to the ASIC 102 for image processing.
The ASIC 102 is coupled to one or more motor drivers 119 via a connection 118. The motor drivers 119 control the operation of various parameters of the lens 122 via a connection 121. For example, lens controls, such as zoom, focus, aperture and shutter operations can be controlled by the motor drivers 119. A connection 123 between the lens 122 and the image sensor 104 is shown as a dotted line to illustrate the operation of the lens 122 focusing on a subject and communicating light to the image sensor 104, which captures the image provided by the lens 122.
The ASIC 102 also sends display data via a connection 124 to a display controller 126. The display controller may be, for example, a national television system committee (NTSC)/phase alternate line (PAL) encoder, although, depending on the application, other standards for presenting display data may be used. The display controller 126 converts the display data from the ASIC 102 into a signal that can be forwarded via a connection 127 to an image display 128. The image display 128, which, as an example, may be a liquid crystal display (LCD) or other display, displays the captured image to the user of the digital camera 100. The image display 128 is typically a color display located on the digital camera 100.
Depending on the configuration of the digital camera 100, the image shown to a user on the image display 128 may be shown before the image is captured and processed, in what is referred to as “live view” mode, or after the image is captured and processed, in what is referred to as “instant review” mode. In some embodiments, a previously captured image may be displayed in what is referred to as “review” or “playback” mode. The instant review mode is typically used to display the captured image to the user immediately after the image is captured and the playback mode is typically used to display the captured image to the user sometime after the image has been captured and stored in memory.
The instant review mode allows the user of the camera 100 to immediately view the captured image on the display 128. Unfortunately, because the image display 128 is typically small, only gross features, or characteristics, of the image can be visually observed. Furthermore, the image display 128 may not accurately reproduce color, tint, brightness, etc., which may further make it difficult for a user to determine the quality of the captured image. The difficulty in visually determining the quality of the captured image leads to the possibility of saving an image that may include deficiencies that, if visually detected, would likely cause the user to discard the image and attempt to capture another image having better quality. In order to determine whether the image includes deficiencies that may not be apparent to the user when viewing the captured image on the image display 128 in the instant review mode, the image analysis logic 150 dynamically analyzes one or more characteristics of the captured image. The analysis logic 150 then presents the user, via the image display 128 and a user interface, an analysis of the captured image. An exemplary dynamic analysis of the data for each pixel in a captured image is described below with reference to
It is noted that the terms bright and dark may not necessarily refer to pixels that are saturated (clipped) due to extremely bright light or pixels that imaged no light. The term bright may refer to clipped pixels and pixels that are within a range of being clipped. Likewise, dark pixels may refer to pixels that are within a range of the dark current.
Similar dynamic analyses can be performed to determine whether an image is in focus or to determine whether the white balance is correct. In one embodiment of determining whether an image is in focus, pixels in an image are examined to determine whether sharp transitions exist between the pixels. For example, a dark pixel adjoining or in close proximity to a bright pixel may indicate that the image is in focus, while a dark pixel separated from a bright pixel by a number of gray pixels may indicate that the image is out of focus.
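The adjacency test described above can be sketched in code. The following is a minimal illustration, not part of the camera firmware; the function name `max_transition` and the sample rows are hypothetical, and a real implementation would operate on the sensor's pixel array rather than nested lists:

```python
def max_transition(gray_rows):
    """Return the largest brightness difference between horizontally
    adjacent pixels; a large value suggests a sharp edge (in focus),
    while only small values suggest gradual gray ramps (out of focus)."""
    best = 0
    for row in gray_rows:
        for a, b in zip(row, row[1:]):
            best = max(best, abs(a - b))
    return best

# A hard black-to-white edge yields a large transition; a ramp through
# intermediate grays between the dark and bright pixels yields small ones.
sharp = [[0, 0, 255, 255]]          # dark pixel adjoining a bright pixel
soft = [[0, 64, 128, 192, 255]]     # dark and bright separated by grays
assert max_transition(sharp) > max_transition(soft)
```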
White balance is a characteristic of the image that generally refers to the color balance in the image to ensure that color reproductions are accurate. An image in which each pixel is a different shade of the same color may indicate an image in which the white balance is improperly adjusted.
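One simple way to detect the color cast described above is to compare per-channel averages: when every pixel is a shade of the same color, one channel's mean departs strongly from the others. The sketch below is illustrative only; the function names, the `tolerance` parameter, and the sample pixel lists are assumptions, not part of the described camera:

```python
def channel_means(pixels):
    """Average R, G and B over an iterable of (r, g, b) pixels."""
    n = 0
    totals = [0, 0, 0]
    for r, g, b in pixels:
        totals[0] += r
        totals[1] += g
        totals[2] += b
        n += 1
    return [t / n for t in totals]

def likely_color_cast(pixels, tolerance=30):
    """Flag a possible white-balance problem when one channel's mean
    departs from the overall mean by more than `tolerance`."""
    means = channel_means(pixels)
    overall = sum(means) / 3
    return any(abs(m - overall) > tolerance for m in means)

# A uniformly reddish scene suggests a cast; a neutral scene does not.
reddish = [(200, 80, 80), (220, 90, 85), (210, 70, 75)]
neutral = [(120, 118, 122), (90, 92, 88), (200, 199, 201)]
```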
In addition to the foregoing, an image improvement logic 160 may be provided to present the user with a recommendation, in the form of instructions presented on the image display 128, on ways in which to possibly improve a subsequent image. In other embodiments, the information may be provided via other mechanisms, such as audio or speech information. For example, the image improvement logic may suggest adjusting a condition under which the image was captured or adjusting a setting or parameter used to capture the image. As will be described below, in one embodiment the image analysis logic 150 analyzes the captured image and, optionally, the camera settings used to capture the image, and determines a value of one or more characteristics of the captured image. For example, to determine whether the exposure of the image is satisfactory, if a predefined number of white pixels in the image is exceeded, then the image analysis logic 150 may indicate that the image is overexposed. Further, if the image analysis logic 150 determines that one or more characteristics of the captured image is not satisfactory to yield a high quality image, the image improvement logic 160 may determine whether a condition used to capture the image should be adjusted, or whether a camera setting should be adjusted, to improve a subsequent image. For example, if the image analysis logic 150 determines that the image is underexposed, the image improvement logic 160 may determine that a subsequent image may be improved by activating the camera flash for a subsequent image.
When the image analysis logic 150 analyzes the data representing the captured image and the setting used to capture the image, the analysis can be used by the image improvement logic 160 to suggest adjustments to the settings to improve a subsequent image. These suggested adjustments to the camera settings or parameters can be presented to the user on a help screen via the image display 128, or, in an alternative configuration, can be automatically changed for a subsequent image.
It is noted that the image analysis logic 150 and the image improvement logic 160 may be a single unit. For example, they may exist in the same firmware or be a single computer program. They have been split into separate functions herein solely for illustration purposes.
The ASIC 102 is coupled to a microcontroller 161 via a connection 154. The microcontroller 161 can be a specific or general purpose microprocessor that controls the various operating aspects and parameters of the digital camera 100. For example, the microcontroller 161 may be coupled to a user interface 164 via a connection 162. The user interface 164 may include, for example but not limited to, a keypad, one or more buttons, a mouse or pointing device, a shutter release, and any other buttons or switches that allow the user of the digital camera 100 to input commands.
The ASIC 102 is also coupled to various memory modules, which are collectively referred to as memory 136. The memory 136 may include memory internal to the digital camera 100 and/or memory external to the digital camera 100. The internal memory may, for example, comprise flash memory and the external memory may comprise, for example, a removable compact flash memory card. The various memory elements may comprise volatile, and/or non-volatile memory, such as, for example but not limited to, synchronous dynamic random access memory (SDRAM) 141, illustrated as a portion of the memory 136 and flash memory. Furthermore, the memory elements may comprise memory distributed over various elements within the digital camera 100.
The memory 136 may also store the image analysis logic 150, the image improvement logic 160, the settings file 155 and the various software and firmware elements and components (not shown) that allow the digital camera 100 to perform its various functions. The memory also stores an image file 135, which represents a captured image. When the system and method for analyzing an image is implemented in software, the software code (i.e., the image analysis logic 150) is typically executed from the SDRAM 141 in order to enable the efficient execution of the software in the ASIC 102. The settings file 155 comprises the various settings used when capturing an image. For example, the exposure time, aperture setting (f-stop), shutter speed, white balance, flash on or off, focus, contrast, saturation, sharpness, ISO speed, exposure compensation, color, resolution and compression, and other camera settings may be stored in the settings file 155. As will be described below, the settings file 155 may be accessed by the image analysis logic 150 to analyze a captured image by, in one example, determining the camera settings used to capture the image that is under analysis.
The ASIC 102 executes the image analysis logic 150 so that after an image is captured by the image sensor 104, the image analysis logic 150 analyzes various characteristics of the captured image. These characteristics may include characteristics of the captured image, or alternatively, may include the settings used to capture the image. Further, if the image improvement logic 160 determines that the image could be improved by changing one or more of the conditions under which the image was captured, or by changing one or more camera settings, then the image improvement logic 160 can either suggest these changes via the user interface 164 and the image display 128, or can automatically change the settings and prepare the camera for a subsequent image. Embodiments of the analysis are described in greater detail below.
The data for each pixel in the image file 135 can be analyzed by the image analysis logic 150 to determine characteristics of the image. For example, characteristics including, but not limited to, the exposure, focus or the white balance of the captured image can be analyzed. A predominance of white pixels may be indicative of overexposure and a predominance of black pixels may be indicative of underexposure. To determine whether an image is in focus, pixels in an image are analyzed to determine whether sharp transitions exist between pixels. For example, a black pixel adjoining a white pixel may indicate that the image is in focus, while a black pixel separated from a white pixel by a number of gray pixels may indicate that the image is out of focus. An image in which each pixel is a different shade of the same color may indicate a problem with the white balance of the image. An example of determining the exposure will be described below with respect to
In block 302 the image sensor 104 of
In decision block 306, the user determines whether he or she wants to view the settings with which the image was captured. If the user wants to view the settings, the settings are displayed to the user on the image display 128 as indicated in block 308. If the user does not want to view the settings, then, in decision block 312, it is determined whether the user wants the image analysis logic 150 to analyze the image. If the user does not want the image to be analyzed, then, in block 314 the image can be saved or discarded. Alternatively, the image analysis logic 150 can be invoked automatically without user intervention.
In block 316, the image analysis logic 150 analyzes the data within the image file 135. The data is analyzed to determine various characteristics of the captured image. The following example will use exposure as the characteristic that is analyzed by the image analysis logic 150. However, other characteristics, such as focus and white balance, can be analyzed. Analysis of several of these other characteristics will be described in greater detail below.
When analyzing exposure, the image analysis logic 150 performs a pixel-by-pixel analysis to determine whether the image includes a predominance of either black or white pixels. It should be noted that rather than analyzing all the pixels constituting the image, a sample of the pixels may be analyzed. In this example, the data associated with each pixel in the image file 135 is analyzed to determine whether a pixel is a black pixel or a white pixel. Each pixel is analyzed to determine its corresponding R, G and B values. For example, if the R, G and B values for the pixel 204 are all zeros, the pixel is considered a black pixel. Each pixel in the pixel array 208 is analyzed in this manner to determine the number of black or white pixels in the pixel array 208 for this image file. A determination in block 316 that a substantial portion of the pixels in the array 208 are black indicates that the image is likely underexposed. Conversely, a determination that many of the pixels in the array 208 are white indicates that the image is likely overexposed. Of course, the image may be of an all-white or an all-black subject, in which case the user may choose to disregard the analysis.
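The pixel-counting exposure check above can be sketched as follows. This is an illustrative example only; the function name `exposure_verdict`, the `frac` threshold, and the verdict strings are assumptions, and a real implementation would read pixel data from the image file 135:

```python
def exposure_verdict(pixels, frac=0.5):
    """Classify exposure from the share of black and white pixels.
    Here a pixel is black when R, G and B are all 0 and white when
    all are 255 (assuming 8-bit channels)."""
    black = sum(1 for r, g, b in pixels if (r, g, b) == (0, 0, 0))
    white = sum(1 for r, g, b in pixels if (r, g, b) == (255, 255, 255))
    n = len(pixels)
    if black / n > frac:
        return "likely underexposed"   # predominance of black pixels
    if white / n > frac:
        return "likely overexposed"    # predominance of white pixels
    return "exposure acceptable"
```

For instance, a sample in which three of four pixels are black would be reported as likely underexposed; as the text notes, the user may disregard the verdict for a genuinely all-black or all-white subject.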
In an alternative embodiment, the data in the image file 135 can be analyzed in combination with other data available either in the image file 135 or from the settings file 155 in the camera 100. For example, additional data, sometimes referred to as metadata, saved in the header 202 of the image file 135 can be analyzed in conjunction with the information from each pixel in the array 208. This information might include, for example, the ISO setting and the aperture setting (f-stop) used to capture the image. These data items can be used in conjunction with the pixel data above to develop additional information regarding the characteristic of the analyzed image. Analysis of the settings will be described in greater detail below.
Furthermore, the image analysis logic 150 can also analyze the camera settings used to capture the image and use those settings when analyzing the data in the image file 135 to develop additional data regarding the image file 135. For example, the image analysis logic 150 can access the settings file 155 in the memory 136 of
In decision block 318, it is determined whether the image data analyzed in block 316 represents an acceptable image. This can be an objective determination based on criteria that the user enters into the camera 100 via a user interface 164,
If, however, in decision block 318 the image analysis logic 150 determines that certain conditions under which the image was captured or settings used to capture the image can be changed to improve the image, then, in block 322, the image improvement logic 160 evaluates the settings used to capture the data in the image file 135 to determine whether a condition or setting can be changed to improve the image. In addition, the image improvement logic 160 can also develop recommendations to present to the user of the camera to improve a subsequent image. For example, if the analysis in block 316 suggests that the image was underexposed, the image improvement logic 160 may develop “advice” to be presented to the user. In this example, as will be described below, the image improvement logic 160 may suggest that the user activate the flash to improve a subsequent image. This suggestion may be provided to the user via the image display 128 in conjunction with the user interface 164.
In block 324, an instant review of settings and a help screen is displayed to the user. The instant review and help screen may include, for example, a thumbnail size display of the image, a display of the setting used to capture the image, an evaluation of the image and, if the user desires, suggestions on ways to improve the image. The evaluation of the image may include, for example, a notification that characteristics, such as exposure, focus and color balance are satisfactory. Suggestions on ways in which to improve the image may be communicated to the user via the image display 128 and may include, for example, changing a condition under which the image was captured, changing a setting with which the image was captured, or a combination of both changing a condition and a setting.
In decision block 326, the user determines whether another image is to be captured. If the user does not want to capture another image, the process ends. If, however, in decision block 326, the user wants to capture another image, then, in decision block 332, it is determined whether the user wants to manually change a parameter, such as a condition or setting, for the subsequent image or, if the parameter is to be set automatically by the digital camera 100,
If, in decision block 332, the user decides to manually change the setting, then, in block 334, the user changes the setting and the process returns to block 302 where another image is captured and the process repeats. If, however, in decision block 332, the user wants the digital camera 100 to automatically change the setting, then, in block 336, the settings used to capture the previous image are changed according to the new settings determined in block 324. The process then returns to block 302 to capture a subsequent image.
Having described some embodiments of analyzing characteristics of an image and camera settings, other embodiments will now be described.
In the following embodiments, the data in the header 202,
It should be noted that the following analysis provides determinations of some of the possible anomalies that may be detected by the image analysis logic 150. Thus, fewer or more possible anomalies may be detected. The following analysis also provides information related to correcting the anomalies. It is noted that the analysis may occur during a live view, when a live image of a scene is displayed on the camera. The analysis may also occur after an image is captured.
Focus Problems Due to Low Contrast in the Scene
The processing program may analyze several items in the metadata to determine that the image may be blurry due to shaking of the camera at the time the image was captured. An embodiment for determining whether the image may be blurry is shown in the flowchart 400 of
In step 410 of
If the image was out of focus, processing proceeds to block 414, which simply indicates that the analysis of flowchart 400 has no bearing on the problem. This information is not necessarily displayed for the user of the camera.
At decision block 415, a decision is made as to whether the image was captured using a theater mode or a similar mode. A theater mode is a mode wherein the camera is used to capture an image without distracting subjects in the scene. For example, the theater mode may be used to capture images in a theater without distracting the performers in the theater. In theater mode, the flash or strobe is off. In addition, any light emitted by the camera to assist in focusing is also off. Because theater mode is typically used indoors with the flash turned off, the camera uses a high ISO, which causes high gain and low shutter speed. If the image was captured using theater mode, processing proceeds to block 414 as described above. It is noted that the advice provided per the flowchart 400 may, in some embodiments, be given regardless of whether the camera is in a theater or similar mode.
If the decision of decision block 415 is negative, processing proceeds to decision block 416 where a determination is made as to whether focus lock was achieved during image capture. During image capture, the camera attempts to focus the scene. If the scene is able to be focused at the time of image capture, focus lock is achieved. If focus lock was not achieved at the time of image capture, processing proceeds to block 414 as described above. It is noted that the focus detection of decision block 410 may analyze the image while the focus lock of decision block 416 may analyze the metadata to determine if focus lock was achieved at the time of image capture. It is noted that in some embodiments, decision blocks 410 and 416 may be combined into a single decision block.
If focus lock was achieved per decision block 416, processing proceeds to decision block 418 where a determination is made as to whether a “handheld” limit was exceeded during image capture. The handheld limit is a function of zoom and exposure time. The basis for the handheld limit is that a user who holds the camera is going to shake it during image capture, which is going to blur the image. The camera may be programmed with a handheld number or limit, which may be based on the amount of shaking of a typical user holding the camera. It is noted that a longer exposure time or greater zoom pushes the handheld calculation closer to, or beyond, the handheld limit because both increase the possibility of a blurred image. Functions associated with the handheld limit may be assigned values so that a value may be calculated for the handheld limit. Accordingly, the calculated value may be compared to a predetermined value to determine whether the handheld limit is exceeded.
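One plausible formulation of such a limit, offered only as a sketch, is the common rule of thumb that a handheld exposure should be shorter than roughly the reciprocal of the focal length; the text does not specify the camera's actual formula, so the function below and its `shake_factor` parameter are assumptions:

```python
def exceeds_handheld_limit(exposure_s, focal_length_mm, shake_factor=1.0):
    """Rule-of-thumb handheld check: shake is likely to blur the image
    when the exposure is longer than about 1/focal-length seconds.
    Longer exposure or greater zoom (longer focal length) pushes the
    exposure past the limit, consistent with the text above."""
    limit_s = shake_factor / focal_length_mm
    return exposure_s > limit_s

# At 100 mm zoom, a 1/30 s exposure exceeds the ~1/100 s limit;
# at 25 mm, the same exposure is within the ~1/25 s limit.
```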
If the camera was not below the handheld limit at the time of image capture, processing proceeds to block 414 as described above. If the camera was below the handheld limit at the time of image capture, processing proceeds to decision block 422. The decision at decision block 422 determines if light conditions were low or below a predetermined value at the time of image capture. If the light conditions were not low at the time of image capture, processing proceeds to block 414 as described above. If the light conditions were low during image capture, processing proceeds to decision block 424. As set forth above, the camera may be used in the theater mode in low light conditions.
Decision block 424 determines whether the strobe was off during image capture. If the strobe was on during image capture, processing proceeds to block 414 as described above. If the strobe was off during image capture, the analysis is complete and processing proceeds to block 428 where advice may be provided to the camera user. The advice may be in any form, such as text or audio. The information provided to the camera user may suggest focusing on a high contrast portion of the scene during image capture. Digital cameras typically use the center of the scene for focusing, so the user may want to make sure that the center portion of the scene contains high contrast areas. The advice may also include informing the camera user to capture images of still scenes rather than scenes in which objects may be moving.
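The decision chain of flowchart 400 (blocks 410 through 428) can be summarized in code. This is a sketch of the sequence as described above, not the camera's firmware; the dictionary keys standing in for metadata fields, and the advice wording, are hypothetical:

```python
def low_contrast_focus_advice(meta):
    """Walk the decision chain of flowchart 400. `meta` is a dict of
    hypothetical metadata fields; advice is returned only when every
    condition in the chain is satisfied, otherwise None (block 414:
    the analysis has no bearing on the problem)."""
    if meta["out_of_focus"]:              # decision block 410
        return None
    if meta["theater_mode"]:              # decision block 415
        return None
    if not meta["focus_lock"]:            # decision block 416
        return None
    if meta["handheld_limit_exceeded"]:   # decision block 418
        return None
    if not meta["low_light"]:             # decision block 422
        return None
    if meta["strobe_on"]:                 # decision block 424
        return None
    # Block 428: all conditions met, so advise the user.
    return ("Try focusing on a high-contrast area at the center of the "
            "scene, and prefer still subjects over moving ones.")
```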
Another embodiment of the flowchart 400 includes situations wherein the camera was in a burst mode during image capture. The burst mode causes the camera to capture several images in rapid succession, usually with a single activation of a capture button. When images are captured using burst mode, the flash is typically forced off because the time required to charge the power source for the flash would delay image capture. In such a situation, the advice provided to the user may include not using burst mode in low light conditions.
Blurry Image Using Night Mode
When images are captured at night, various camera modes can be used to enhance the images, which would otherwise be dark. The night modes use long exposure times and may or may not use a flash depending on the scene being captured.
One analysis of images captured in a night mode is described with reference to the flow chart 450 of
If the image was captured using a night mode, processing proceeds to decision block 456 where a determination is made as to whether the exposure time was long. More specifically, decision block 456 may determine whether the exposure time was greater than a preselected value. If the exposure time was not long, processing proceeds to block 454 as described above.
If the exposure time per decision block 456 was long, processing proceeds to decision block 458, where a decision is made as to whether light conditions were low during image capture. In low light conditions, the ambient light is below a predetermined value. If the light conditions were not low during image capture, processing proceeds to block 454 as described above.
If the light conditions were low at the time the image was captured, processing proceeds to decision block 460 where a decision is made as to whether the flash or strobe activated during image capture. If the strobe did activate, processing proceeds to block 454 as described above. Other embodiments of capturing images in a night mode using the strobe are described below.
If the strobe did not activate during image capture, processing proceeds to decision block 462 where a decision is made as to whether the image was analyzed for focus. In the embodiments wherein advice is provided as the image is being captured or before the image is captured, the image likely is not analyzed for focus. More specifically, given the conditions up to this point, advice as indicated in block 464 may be provided to the user. The information states that the image may be out of focus and that the image may be improved by stabilizing the camera during image capture.
If the image was analyzed for focus, processing proceeds to block 466. More specifically, metadata or other data associated with the image may be analyzed to determine whether the camera obtained focus prior to the image being captured. Thus, the suggestions may include obtaining focus lock prior to capturing the image. The suggestions may also include stabilizing the camera or subjects within the scene during image capture.
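The night-mode chain of flowchart 450 (blocks 452 through 466) can likewise be sketched as follows. The dictionary keys and advice strings are illustrative assumptions, not the camera's actual metadata fields:

```python
def night_mode_advice(meta):
    """Sketch of flowchart 450: advise stabilizing the camera when a
    night-mode image used a long exposure in low light without the
    strobe. Returns None (block 454: no bearing) when any precondition
    fails; `meta` is a dict of hypothetical metadata fields."""
    if not meta["night_mode"]:       # decision block 452
        return None
    if not meta["long_exposure"]:    # decision block 456
        return None
    if not meta["low_light"]:        # decision block 458
        return None
    if meta["strobe_fired"]:         # decision block 460
        return None
    # Block 464: basic advice for an image that may be blurry.
    advice = ["Stabilize the camera during image capture."]
    if meta["focus_analyzed"]:       # decision block 462 leading to 466
        advice.append("Obtain focus lock before capturing the image.")
    return advice
```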
Some embodiments of the camera have a mode for capturing images of objects at night, wherein the images are located in close proximity to the camera. One such embodiment is referred to as night portrait mode. In these modes, the camera uses a long exposure time in addition to a strobe. A procedure as described with regard to the flowchart 450 may be used to determine if images captured using a night portrait mode are blurry. Rather than determining whether an image was captured using a night mode in decision block 452, the analysis may determine whether the image was captured using a night portrait mode.
In addition, decision block 460 would determine whether the strobe activated during image capture and processing would proceed to block 454 if the strobe did not activate. The advice provided in blocks 464 and 466 may include additional suggestions advising persons in the scene to remain still for a longer period. More specifically, the camera may have an exposure time that is longer than the strobe duration, which requires persons to remain still after the strobe has fired.
Camera strobes have a limited range. When strobes are used to illuminate objects beyond the range of the strobes, the objects will not be illuminated properly for image capture. The resulting image will be dark or portions of the image intended to be illuminated will be dark. An additional factor that darkens the images is that the exposure time is typically reduced when using a strobe. Therefore, a dark scene is captured using a short exposure time and a strobe that cannot illuminate the scene.
Embodiments for determining whether an image was captured using a strobe when subjects in the image were out of range of the strobe are described in the flowchart 500 of
If the camera was not in aperture priority mode, the analysis continues to decision block 506 where a determination is made as to whether the camera was in time value mode during image capture. Time value mode is sometimes referred to as Tv mode. The time value mode enables a user to select the shutter speed of the camera, which determines the exposure time during image capture. More specifically, the shutter speed determines the amount of time that the photosensors charge during image capture. If the shutter speed is set too slow, the image may be overexposed. Likewise, if the shutter speed is set too fast, the image may be underexposed. If the camera was in time value mode during image capture, processing proceeds to block 504 as described above.
If the camera was not in time value mode during image capture, processing proceeds to decision block 510 where a determination is made as to whether the ISO was set manually for the image capture. If the ISO was not set manually, processing proceeds to block 504 as described above. In some embodiments, the processing may determine whether the ISO is equal to 400, or thereabout, and may continue processing if so.
If the conditions of decision block 510 are met, processing proceeds to decision block 514 where a determination is made based on the strobe power during image capture. In some embodiments, the decision determines whether the strobe activated at full power or at a power greater than a preselected value during image capture. In some embodiments, the decision determines the period in which the strobe was active during image capture. For example, the decision may determine whether the strobe time was greater than one half second. If the conditions of decision block 514 are not met, processing proceeds to block 504 as described above.
If the conditions of decision block 514 are met, processing proceeds to decision block 516 where a determination is made as to whether the image is dark. Determining whether the image is dark may be achieved using a plurality of different methods. One embodiment includes determining the average value of the pixels or the average value of some of the pixels. The average value is compared to a preselected value wherein the image is deemed dark if the average value is less than the preselected value. If the image is not deemed to be dark, processing proceeds to block 504 as described above.
If the image is deemed to be dark at decision block 516, processing proceeds to block 520 where an indication is provided that the subject of the image may have been out of range of the strobe. Suggestions for correcting the image may also be provided and may include moving closer to the subject or deactivating the strobe and using a longer exposure time.
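The decision chain of flowchart 500 can be summarized in a short sketch. The metadata field names below are hypothetical assumptions, and the darkness threshold is an illustrative value; the half-second strobe time follows the example given in decision block 514.

```python
# Sketch of the flowchart-500 decision chain described above. Field names
# (aperture_priority, time_value_mode, iso_manual, strobe_time) are
# assumed for illustration; DARK_THRESHOLD stands in for the text's
# "preselected value".

DARK_THRESHOLD = 64  # preselected average-pixel darkness level (assumed)

def is_dark(pixels, threshold=DARK_THRESHOLD):
    """Deem the image dark if the average pixel value is below threshold
    (decision block 516)."""
    return sum(pixels) / len(pixels) < threshold

def strobe_out_of_range(meta, pixels):
    """Return True when the subject was likely beyond the strobe's
    range, corresponding to the indication of block 520."""
    if meta.get("aperture_priority"):        # Av mode: exit to block 504
        return False
    if meta.get("time_value_mode"):          # decision block 506 (Tv mode)
        return False
    if not meta.get("iso_manual"):           # decision block 510
        return False
    if meta.get("strobe_time", 0.0) <= 0.5:  # decision block 514
        return False
    return is_dark(pixels)                   # decision block 516
```

Each early `return False` corresponds to a branch into block 504, where processing continues without the out-of-range indication.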
The metadata and other data may be used to provide the user with ways to improve the image quality. The program may analyze the settings or different camera parameters at the time of image capture and may provide suggestions for improving the image during subsequent image capture.
Adaptive Lighting
Some cameras include processing that lightens dark regions of an image and/or masks bright portions of an image to prevent further brightening. This process balances extreme contrasts in the image. For example, a subject may be dark when it is captured against a bright background. In some embodiments, the camera analyzes the ISO used to capture the image. If the ISO was set below a preselected value and the adaptive lighting was set, the camera may provide information indicating that the image may appear unrealistic or grainy. The camera may also provide information for improving the image, including using a lower ISO or setting the camera to an automatic ISO.
Too High Contrast in the Scene
The image data and the metadata may be analyzed to determine if the contrast in the scene is high or greater than a predetermined value. The high contrast may result in the subject of the image being in a shadow. In one embodiment, the following analysis is not performed in panoramic or portrait modes. Images captured using the panoramic mode may have high contrasts due to the nature of capturing panoramic images. Likewise, images captured using the portrait mode may be subject to high contrast due to the nature of capturing portrait images.
The number of dark and clipped pixels in various portions of the image may be analyzed to determine the contrast. For example, pixel values in the center of the image may be analyzed to determine if they are generally greater than a predetermined value. Pixel values in other regions of the image may be analyzed to determine if they are generally less than a predetermined value. If a large number of pixel values are clipped in some regions and dark in others, the contrast may be too high. The camera may display information suggesting that the camera be set to assume lower ambient lighting as a basis for image processing, which may lower the contrast. In some embodiments, the program may suggest setting an adaptive lighting setting lower so as to capture subjects that may be located in shadows in the scene.
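The regional contrast test above can be sketched as counting clipped pixels in the center region and dark pixels elsewhere. The thresholds and the majority fraction below are assumed values for illustration; the text specifies only that predetermined values are used.

```python
# Illustrative sketch of the regional contrast analysis described above.
# CLIP_LEVEL, DARK_LEVEL, and the majority fraction are assumptions
# standing in for the text's "predetermined values".

CLIP_LEVEL = 250   # pixel values at or above this count as clipped
DARK_LEVEL = 20    # pixel values at or below this count as dark

def contrast_too_high(center_pixels, edge_pixels, fraction=0.5):
    """Flag high contrast when most center pixels are clipped (bright)
    and most pixels in the other regions are dark."""
    clipped = sum(1 for p in center_pixels if p >= CLIP_LEVEL)
    dark = sum(1 for p in edge_pixels if p <= DARK_LEVEL)
    return (clipped / len(center_pixels) > fraction
            and dark / len(edge_pixels) > fraction)
```

When this test fires, the camera would display the ambient-lighting or adaptive-lighting suggestions described above.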
The camera may provide advice related to setting an adaptive lighting setting to low. The adaptive lighting setting reduces the effects of low light in the scene and, thus, may reduce the high contrast.
In some embodiments, all or some of the following criteria may be applied before the advice to use low adaptive lighting is given. The EV compensation may be analyzed, wherein the information is provided if the EV compensation is greater than 2.0. In addition, the information may be provided if the exposure time is less than one second. The information may be provided if the ISO is between 64 and 100. In some embodiments, the information is provided if the strobe activated during image capture and a return flash was detected.
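The criteria above can be combined into a single predicate. This sketch applies all four criteria together (the text permits applying only some of them), and the metadata field names are hypothetical.

```python
# Sketch applying all four criteria above before offering the
# low-adaptive-lighting advice. Field names are assumptions; the
# numeric thresholds (EV > 2.0, exposure < 1 s, ISO 64-100) follow
# the text.

def suggest_low_adaptive_lighting(meta):
    """Return True when the low-adaptive-lighting advice should be given."""
    return (meta.get("ev_compensation", 0.0) > 2.0
            and meta.get("exposure_time", float("inf")) < 1.0
            and 64 <= meta.get("iso", 0) <= 100
            and meta.get("strobe_fired", False)
            and meta.get("return_flash", False))
```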
Panoramic Image Improvements
Some cameras include a panoramic mode that enables a user to capture a plurality of images and connect or stitch the images together. The camera bases the stitching on high contrast portions of the image. Images captured using a wide angle and other settings may have too many distortions to perform stitching.
In some embodiments, the camera determines if the camera is in a macro mode or if the camera was used to capture close images. The macro mode may be defined by a setting on the camera. Determining whether the images are close up may be accomplished by determining the range of the focus. If any of these conditions are met, the camera may provide information indicating that the image may be distorted. Advice may include moving away from the subject. General advice may also be provided suggesting using a tripod with a rotatable head so that the camera is able to be carefully swept across the scene.
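The macro and close-up checks above can be sketched as follows. The 0.5-meter cutoff is an assumed value, and the field names are hypothetical; the text specifies only that the macro setting and the focus range are examined.

```python
# Hypothetical sketch of the panoramic distortion warning described
# above. CLOSE_FOCUS_M is an assumed cutoff for "close" focus; the
# metadata field names are illustrative.

CLOSE_FOCUS_M = 0.5  # assumed threshold for a close-up focus distance

def panoramic_distortion_warning(meta):
    """Return a warning string when a panoramic frame may be too
    distorted to stitch, otherwise None."""
    close_up = (meta.get("macro_mode")
                or meta.get("focus_distance_m", float("inf")) < CLOSE_FOCUS_M)
    if close_up:
        return "image may be distorted; try moving away from the subject"
    return None
```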
This application is a continuation in part of Ser. No. 11/054,291, filed on Feb. 8, 2005, which is a continuation in part of Ser. No. 10/461,600, filed Jun. 12, 2003, for SYSTEM AND METHOD FOR ANALYZING A DIGITAL IMAGE, which are hereby incorporated by reference for all that is disclosed therein.
Relationship | Number | Date | Country
---|---|---|---
Parent | 11054291 | Feb 2005 | US
Child | 11412155 | Apr 2006 | US
Parent | 10461600 | Jun 2003 | US
Child | 11054291 | Feb 2005 | US