This application relates generally to imaging systems, and more specifically to multiple camera systems and methods for controlling same.
To take pictures or video that are in focus, spectrally balanced, and properly exposed, cameras may include automatic focus (AF), automatic white balance (AWB), and automatic exposure control (AEC) functions. These three functions (sometimes referred to herein as “3A”) enable an imaging system to produce focused, balanced, and properly exposed still or video images. When a camera is first turned on or actuated from a non-imaging state, it may take some time for the camera to determine where to position one or more lenses to properly focus an image on an image sensor, and to determine white balance and/or exposure information. When the camera is turned on, or when ambient lighting conditions change, the time needed to determine the parameters for proper focus, optimum exposure (for example, an exposure time period and an aperture size used for the exposure), and white balance of captured images may be too long, resulting in a delay before the camera allows an image to be captured. This delay is perceptible and is common in digital photography.
Scenes with high dynamic range include dark and light regions that require long and short exposure periods, respectively, for detail to be visible. High dynamic range imagery may be produced by combining images taken with different exposure periods, so a single scene with high dynamic range requires multiple exposure periods to make detail in both dark and light regions visible. When the camera is turned on, or when ambient lighting conditions change, the 3A convergence time, as well as the time to converge to the exposures required to capture and combine high dynamic range imagery, may be too long, resulting in a delay before focused, balanced, and well exposed high dynamic range imagery can be taken. Therefore, there is a need to reduce the 3A and high dynamic range exposure convergence times for cameras.
A summary of sample aspects of the disclosure follows. For convenience, one or more aspects of the disclosure may be referred to herein simply as “some aspects.”
Methods and apparatuses or devices being disclosed herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, for example, as expressed by the claims which follow, its more prominent features will now be discussed briefly. After considering this discussion, and particularly after reading the section entitled “Detailed Description,” one will understand how the features being described provide advantages that include reduced convergence times for automatic focus, automatic white balance, and automatic exposure control in multiple camera systems.
One innovation is an apparatus. The apparatus includes a main camera including a main sensor. The main camera is configured to receive main image capture information, and capture an image using the main sensor and the main image capture information. The apparatus also includes a main image processing module in communication with the main camera. The main image processing module is configured to receive an image from the main camera, receive main image processing information, and process the image received from the main camera using the main image processing information. The apparatus also includes an auxiliary camera including an auxiliary sensor. The auxiliary camera is configured to capture an image using the auxiliary sensor. The apparatus also includes an auxiliary image processing module in communication with the auxiliary camera. The auxiliary image processing module is configured to receive at least one image from the auxiliary camera and determine auxiliary control information based on the at least one image received from the auxiliary camera. The apparatus also includes a camera controller in communication with the auxiliary image processing module. The camera controller is configured to receive the auxiliary control information from the auxiliary image processing module. The camera controller is further configured to determine main image capture information and main image processing information from the auxiliary control information. The camera controller is further configured to communicate the main image capture information to the main camera, and communicate the main image processing information to the main image processing module.
For some implementations, the auxiliary control information includes information for controlling the auxiliary camera and processing the auxiliary image. For some implementations, the main image capture information includes information for operating the main camera to perform autofocus operations. For some implementations, the auxiliary control information comprises exposure information. For some implementations, the main image capture information includes information for controlling an exposure of the main sensor while capturing an image. For some implementations, the main image processing information comprises information for performing a white balance adjustment of an image received from the main camera.
For some implementations, the main image processing module is further configured to determine main control information. For some implementations, the camera controller receives main control information from the main image processing module. For some implementations, the camera controller determines additional main image capture information based at least in part on the auxiliary control information and the received main control information. For some implementations, the camera controller communicates the additional main image capture information for autofocus and exposure control to the main camera.
For some implementations, the main camera is configured to receive the main image capture information from the camera controller and perform autofocus operations based on the received main image capture information.
For some implementations, the auxiliary control information includes autofocus data. For some implementations, the auxiliary camera comprises an auxiliary lens. For some implementations, the auxiliary image processing module and the auxiliary camera are collectively configured to determine the autofocus data by moving the auxiliary lens to a plurality of positions, capturing an image at each of the positions, and determining at which position an image includes the most high frequency information.
For some implementations, auxiliary control information comprises white balance information. For some implementations, the auxiliary image processing module is configured to determine the white balance by comparing intensity values for a plurality of spectral regions of an image captured by the auxiliary sensor.
For some implementations, the camera controller is configured to switch to an auxiliary capture mode in response to powering on the apparatus, or when the apparatus switches from a recording mode to a non-recording mode. For some implementations, the camera controller is configured to determine the main image capture information and the main image processing information while in the auxiliary capture mode based on the at least one image received from the auxiliary camera.
Another innovation is a method. In various embodiments the method may include capturing at least one auxiliary image by an auxiliary camera. The method may further include determining, by an auxiliary image processing module, auxiliary control information based on the at least one auxiliary image. The method may further include determining, by a camera controller, main image capture information and main image processing information from the auxiliary control information. The method may include capturing at least one main image by a main camera using the main image capture information. The method may further include receiving the at least one main image and main image processing information at a main image processing module. The method may further include processing, by the main image processing module, the at least one main image using the main image processing information.
For some implementations, the auxiliary control information includes autofocus information, exposure information and white balance information for controlling the auxiliary camera and processing the at least one auxiliary image. For some implementations, the main image capture information includes autofocus information and exposure information for use by the main camera. For some implementations, the main image processing information includes white balancing information for use by the main image processing module. For some implementations, the method further includes switching to an auxiliary capture mode when the apparatus is powered on. For some implementations, the method further includes switching to an auxiliary capture mode when the apparatus switches from a recording mode to a non-recording mode. For some implementations, the at least one auxiliary image is captured when the apparatus is in the auxiliary capture mode.
For some implementations, the method further includes determining additional main control information based on the at least one main image. For some implementations, the method further includes communicating the additional main control information to the camera controller. For some implementations, the method further includes determining additional main image capture information by the camera controller, the additional main image capture information based at least in part on the auxiliary control information and the additional main control information.
For some implementations, the method further includes communicating the additional main image capture information to the main camera. For some implementations, the method further includes using the additional main image capture information by the main camera to perform autofocus and control exposure while capturing at least one additional main image. For some implementations, the additional main control information is determined by the main image processing module.
Another innovation is an apparatus. In some embodiments, the apparatus may include means for capturing at least one auxiliary image. In some embodiments, the apparatus may include means for determining auxiliary control information based on the at least one auxiliary image. In some embodiments, the apparatus may include means for determining main image capture information and main image processing information from the auxiliary control information. In some embodiments, the apparatus may include means for capturing at least one main image using the main image capture information. In some embodiments, the apparatus may include means for receiving the at least one main image and main image processing information at a means for processing the at least one main image. In some embodiments, the apparatus may include means for processing the at least one main image using the main image processing information.
In some embodiments, the auxiliary control information includes autofocus information, exposure information, and white balance information for controlling the means for capturing at least one auxiliary image and processing the at least one auxiliary image. In some embodiments, the main image capture information comprises autofocus information and exposure information for use by the means for capturing at least one main image. In some embodiments, the main image processing information comprises white balancing information for use by the means for processing the at least one main image.
In some embodiments, the apparatus may include means for switching to an auxiliary capture mode when the apparatus is powered on. In some embodiments, the apparatus may include means for switching to an auxiliary capture mode when the apparatus switches from a recording mode to a non-recording mode. In some embodiments, the at least one auxiliary image may be captured when the apparatus is in the auxiliary capture mode.
In some embodiments, the apparatus may include means for determining additional main control information based on the at least one main image. In some embodiments, the apparatus may include means for determining additional main image capture information by the camera controller, the additional main image capture information based at least in part on the auxiliary control information and the additional main control information.
In some embodiments, the apparatus may include means for communicating the additional main image capture information to the means for capturing at least one main image. In some embodiments, the means for capturing at least one main image is configured to use the additional main image capture information to perform autofocus and control exposure while capturing at least one additional main image.
Another innovation is a computer program product comprising a non-transitory computer readable medium encoded thereon with instructions that when executed cause an apparatus to perform a method of capturing an image. The method may include capturing at least one auxiliary image by an auxiliary camera. The method may further include determining auxiliary control information based on the at least one auxiliary image. The method may further include determining main image capture information and main image processing information from the auxiliary control information. The method may include capturing at least one main image by a main camera using the main image capture information. The method may further include receiving the at least one main image and main image processing information at a main image processing module. The method may further include processing the at least one main image using the main image processing information.
For some embodiments, the auxiliary control information includes autofocus information, exposure information and white balance information for controlling the auxiliary camera and processing the at least one auxiliary image. For some embodiments, the main image capture information includes autofocus information and exposure information for use by the main camera. For some embodiments, the main image processing information comprises white balancing information for use by the main image processing module.
Another innovation is an apparatus that includes a main camera having a main sensor. In some embodiments, the main camera is configured to receive control information to perform autofocus operations and control exposure of the main sensor. The apparatus may further include a main image processing module, coupled to the main camera, configured to receive main control information to perform white balance adjustment of an image received from the main camera; an auxiliary camera having an auxiliary sensor; and an auxiliary image processing module, coupled to the auxiliary camera, configured to determine auxiliary control information for performing autofocus operations and controlling exposure of the auxiliary sensor based on at least one image received from the auxiliary camera. The apparatus may include a camera controller coupled to the auxiliary image processing module. The camera controller may be configured to receive the auxiliary control information from the auxiliary image processing module. The camera controller may be configured to determine, using a processor, main control information from the auxiliary control information, and configured to communicate main control information for autofocus and exposure control to the main camera. The camera controller may be configured to communicate main control information for white balance to the main image processing module.
For some implementations, the main image processing module is further configured to determine main control information. For some implementations, the camera controller receives main control information from the main image processing module. For some implementations, the camera controller determines additional main control information based in part on the auxiliary control information and the received main control information. For some implementations, the camera controller communicates the additional main control information for autofocus and exposure control to the main camera. The camera controller may communicate the additional main control information for white balance to the main image processing module. For some implementations, the main camera is configured to receive the main control information for autofocus operations from the camera controller and perform autofocus operations using the received main control information. For some implementations, the auxiliary control information includes autofocus data. For some implementations, the auxiliary camera comprises an auxiliary lens. For some implementations, the auxiliary image processing module is further configured to determine the autofocus data by moving the auxiliary lens to a plurality of positions, capturing an image at each of the positions, and determining at which position an image includes the most high frequency information. For some implementations, determining the first main exposure period comprises analyzing an intensity histogram. For some implementations, determining white balancing for the main image processing module comprises comparing intensity values for a plurality of spectral regions. For some implementations, the processor is configured to switch to an auxiliary capture mode in response to powering on the dual camera, or in response to a user command to stop capturing video. For some implementations, the processor is configured to determine the main focus distance, the first main exposure period, and the white balance for the main image processing module while in the auxiliary capture mode based on the at least one image received from the auxiliary camera. For some implementations, the processor is configured to switch to a main capture mode in response to a user command to capture video. For some implementations, the processor is configured to determine the main focus distance, the first main exposure period, and the white balance for the main image processing module while in the main capture mode based on the at least one image received from the auxiliary camera. For some implementations, the processor may be further configured to determine a second main exposure period and a third main exposure period for the main camera. For some implementations, the second and third main exposure periods are based on the at least one image received from the auxiliary camera, the second main exposure period shorter than the first main exposure period, and the third main exposure period longer than the first main exposure period. The second and third main exposure periods may instead be based on the at least one image received from the auxiliary camera and the at least one image received from the main camera, the second main exposure period shorter than the first main exposure period, and the third main exposure period longer than the first main exposure period.
For some implementations, the main image processing module is further configured to generate a composite image by combining images captured by the main camera at the first main exposure period, the second main exposure period, and the third main exposure period.
Another innovation is a method for automatic exposure control, automatic white balance, and automatic focus for a dual camera. In various embodiments the method may include capturing, by an auxiliary camera, a first plurality of images focused on a first image sensor at a first resolution and a first frame rate. The method may further include measuring a first plurality of image statistics from the first plurality of images, and determining a main focus distance between a main lens and a main image sensor based on the first plurality of image statistics. The method may further include determining a first exposure period for the main sensor based on the first plurality of image statistics, and determining white balancing for the main image processing module based on the first plurality of image statistics. The method may further include capturing, by the main camera, a second plurality of images focused on a second image sensor at a second resolution and a second frame rate, the second resolution higher than the first resolution, and the second frame rate higher than the first frame rate.
The following detailed description is directed to certain specific embodiments of the invention. However, the invention can be embodied in a multitude of different ways. It should be apparent that the aspects herein may be embodied in a wide variety of forms and that any specific structure, function, or both being disclosed herein is merely representative. Based on the teachings herein one skilled in the art should appreciate that an aspect disclosed herein may be implemented independently of any other aspects and that two or more of these aspects may be combined in various ways. For example, an apparatus may be implemented or a method may be practiced using any number of the aspects set forth herein. In addition, such an apparatus may be implemented or such a method may be practiced using other structure, functionality, or structure and functionality in addition to or other than one or more of the aspects set forth herein.
Further, the systems and methods described herein may be implemented on a variety of different computing devices that host a camera. These include mobile phones, tablets, dedicated cameras, wearable computers, personal computers, photo booths or kiosks, personal digital assistants, ultra-mobile personal computers, and mobile internet devices. They may use general purpose or special purpose computing system environments or configurations. Examples of computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
Images may be captured at a spatial resolution and a frame rate by the main and auxiliary sensors 116, 126. The main and auxiliary sensors 116, 126 may comprise rows and columns of picture elements (pixels) that may use semiconductor technology, such as charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) technology, to determine an intensity of incident light at each pixel during an exposure period for each image frame. In some embodiments, the main and auxiliary sensors 116, 126 may be the same or similar sensors. In some embodiments, the auxiliary sensor 126 may be of lower quality or have lower imaging capabilities, making it less expensive. For example, in some embodiments the auxiliary sensor 126 may produce data representative of a black and white image. In some embodiments, incident light may be filtered to one or more spectral ranges to take color images. For example, a Bayer filter mosaic on the auxiliary sensor 126 may filter light using red, green, and blue filters to capture full-color, three-band images.
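As an illustration of the Bayer mosaic capture just described, the following Python sketch separates an RGGB mosaic into its three spectral bands. The RGGB layout and an even-dimensioned frame are assumptions for this example; actual sensor layouts vary.

```python
import numpy as np

def split_rggb(mosaic):
    """Split an RGGB Bayer mosaic into quarter-resolution R, G, B planes."""
    r = mosaic[0::2, 0::2]                      # red sites
    g = (mosaic[0::2, 1::2].astype(np.float64)
         + mosaic[1::2, 0::2]) / 2.0            # average the two green sites
    b = mosaic[1::2, 1::2]                      # blue sites
    return r, g, b
```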
Automatic Focus
In some embodiments of focusing operations, the auxiliary image processing module 140 receives captured raw imagery 201 from the auxiliary camera 120 and determines control information for automatic focus, automatic white balance, and automatic exposure control. By adjusting the focal plane relationship between an element of the auxiliary lens 122 and the auxiliary image sensor 126, objects may be focused on the auxiliary image sensor 126. As a scene comes into focus, the high frequency content of the captured image increases because objects in focus have sharp edges. Accordingly, focus may be automated by varying the focal plane relationship between an element of the auxiliary lens 122 and the auxiliary image sensor 126, calculating the relative amount of high frequency content at each setting, and setting the focal plane relationship to the position that maximizes high frequency content. For some implementations, the high frequency content for a portion of the scene selected by the user is used to focus the image, as objects at different distances from the lens will come into and out of focus. Once a focus setting is determined, a processor may estimate the distance of in-focus objects based on the selected focus distance. This distance may be applied by the camera controller 210 to a camera model of the main camera 110 to estimate a focal plane relationship between an element of the main lens 112 and the main image sensor 116 using image statistics for images captured by the auxiliary camera 120.
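A minimal Python sketch of this contrast-maximization autofocus loop follows. The `set_lens_position` and `capture_frame` callables are hypothetical placeholders for the lens actuator and sensor interfaces, and the Laplacian-variance sharpness measure is one common choice of high frequency metric, not necessarily the one used here.

```python
import numpy as np

def high_frequency_content(image):
    """Estimate sharpness as the variance of a discrete Laplacian response."""
    lap = (-4.0 * image[1:-1, 1:-1]
           + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    return lap.var()

def autofocus(set_lens_position, capture_frame, positions):
    """Sweep the lens, score each frame, and return the sharpest position."""
    scores = []
    for pos in positions:
        set_lens_position(pos)
        frame = capture_frame().astype(np.float64)
        scores.append(high_frequency_content(frame))
    return positions[int(np.argmax(scores))]
```

The returned position corresponds to the focal plane relationship that maximizes high frequency content, as described above.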
Automatic White Balance
Outside ambient lighting conditions may vary with time of day and cloud cover. Indoor ambient lighting conditions can vary greatly based on the amount of light present and the type of light source, for example, incandescent, fluorescent, halogen, LED or candle light. In some circumstances ambient lighting may include both sunlight and indoor lights. Different ambient lighting conditions lead to differences in illumination. For example, an object that appears white at noon on a sunny day may appear off-white under an incandescent bulb, slightly yellow in candlelight, or appear bluer when illuminated by an LED.
Different lighting conditions can be characterized by differences in relative spectral power distributions. The Commission Internationale de l'Eclairage (CIE) standards body maintains illumination models that provide different spectral weights for different spectral regions in the visible range. For example, CIE illuminant models A, C, D50, D65, F2, F7, and F11 model incandescent light, average daylight, daylight with a correlated color temperature of 5000 kelvin, daylight at 6500 kelvin, cool white fluorescent light, broadband daylight fluorescent light, and narrow-band fluorescent light, respectively. Different spectral ranges can be equalized to correct for variations in ambient lighting conditions. In an implementation, the red and blue balance may be adjusted to reduce differences in color as ambient lighting conditions change.
Automatic white balance correction factors are calculated by the auxiliary image processing module 140 by estimating the relative spectral power distribution for images captured by the auxiliary camera 120, determining the average intensity in each spectral band, applying a model (for example, assuming that the average scene color follows an expected distribution), and then determining spectral weighting factors that equalize or adjust the spectral components so that the different spectral bands approximate the assumed distribution. These spectral weighting factors may be applied by the camera controller 210 to a camera model of the spectral characteristics of the main camera 110 to map the spectral weightings of the auxiliary camera 120 for automatic white balance to the spectral weightings of the main camera 110 for automatic white balance. For some implementations, white balancing may also be used to correct known image sensor sensitivity variations in different spectral regions.
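For illustration, a minimal gray-world version of this calculation is sketched below in Python, assuming the average scene color is neutral; production automatic white balance typically uses richer illuminant models such as the CIE models noted above.

```python
import numpy as np

def gray_world_gains(image_rgb):
    """Return per-channel gains that equalize the average channel intensities."""
    means = image_rgb.reshape(-1, 3).mean(axis=0)  # average R, G, B intensity
    return means[1] / means                        # normalize to the green channel

def apply_white_balance(image_rgb, gains):
    """Scale each spectral band by its gain, clipping to the 8-bit range."""
    balanced = image_rgb.astype(np.float64) * gains
    return np.clip(balanced, 0, 255).astype(np.uint8)
```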
Automatic Exposure Control
Exposure may be described as the amount of light per unit area incident on an image sensor. Exposure depends on the scene luminance, the aperture of the auxiliary lens 122, and the shutter speed. Automatic exposure control may adjust the shutter speed, or the time for each exposure, to an optimum exposure period, which corresponds to the amount of time the auxiliary image sensor 126 receives incident light to determine intensity at each pixel for an image frame. If the exposure period is too short, the image may be underexposed and detail in dark regions will not be visible. If the exposure period is too long, the image may be saturated and detail in light regions will not be visible. For scenes with relatively uniform lighting, the optimum exposure period is relatively constant throughout the scene.
An “optimal” exposure period may be estimated using a light meter (not shown), and/or by capturing one or more images with the auxiliary image sensor 126, calculating image statistics of the captured image(s) with the auxiliary image processing module 140, and setting the exposure period based on the image statistics and/or light meter reading. An intensity histogram may be used by the auxiliary image processing module 140 to determine whether the image is underexposed or saturated, as underexposed pixels will have intensity values close to zero, and saturated pixels will have intensity values close to the maximum (for example, 255 for eight bit intensity values). Intensity histogram statistics may thus be used by the auxiliary image processing module 140 to characterize whether the image may be underexposed or saturated. The auxiliary image processing module 140 determines parameters that adjust the auxiliary aperture and the exposure period until the image or histogram statistics are within desired limits, reaching an “optimal” exposure. The auxiliary image processing module 140 outputs automatic exposure control information and parameters to the auxiliary camera 120 for image capture by the auxiliary camera, and to the camera controller 210. The camera controller 210 maps the aperture and exposure period for the auxiliary camera 120 to an aperture and exposure period for the main camera 110 based on camera models of the main camera and auxiliary camera.
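The following Python sketch illustrates this histogram-driven adjustment loop; the 2% tail thresholds and the 1.5x step size are illustrative assumptions rather than values from this disclosure.

```python
import numpy as np

def exposure_adjustment(image, low=8, high=247, tail_fraction=0.02):
    """Suggest a multiplicative exposure change from 8-bit intensity statistics."""
    pixels = image.ravel()
    under = np.mean(pixels <= low)    # fraction of near-black pixels
    over = np.mean(pixels >= high)    # fraction of near-saturated pixels
    if over > tail_fraction:
        return 1.0 / 1.5              # too many saturated pixels: shorten exposure
    if under > tail_fraction:
        return 1.5                    # too many dark pixels: lengthen exposure
    return 1.0                        # statistics within desired limits

# Iterated until converged, for example:
#   exposure *= exposure_adjustment(capture_frame(exposure))
```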
As noted above, the auxiliary image processing module 140 provides control information for autofocus, automatic white balance, and automatic exposure control 213 to the camera controller 210. The camera controller 210 uses this information, as described above, to determine autofocus, automatic white balance, and automatic exposure control parameters 223 for the main camera and main image processing module.
The main camera 110 receives focus and exposure control information 227 from the camera controller 210. The main controller 118 controls the focus of the main lens 112 by adjusting a focal plane relationship between an element of the main lens 112 and the main sensor 116. The main controller may also control a main aperture 114 opening and the exposure period of incident light through the main lens 112 onto the main sensor 116 when capturing images.
Images may be captured at a spatial resolution and a frame rate by the main sensor 116 based on user input received via a touchscreen 150 or another input device (not shown), or under program control. The spatial resolution for images captured by the main sensor 116 may be higher than the spatial resolution of images captured by the auxiliary sensor 126. The frame rate of imagery captured by the main sensor 116 may be higher than the frame rate of the images captured by the auxiliary sensor 126. The main sensor may comprise rows and columns of picture elements (pixels) that may use semiconductor technology, such as charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) technology, to determine an intensity of incident light at each pixel during an exposure period for each image frame. The main sensor 116 may take a black and white image, or incident light may be filtered to one or more spectral ranges to take color images. For example, a Bayer filter mosaic on the main sensor 116 may filter light using red, green, and blue filters to capture full-color, three-band images. The main image sensor 116 may capture an image in visible or non-visible spectral ranges. Multispectral cameras capture multiple spectral bands of data (for example, 4-20 bands of data). Hyperspectral cameras capture a multiplicity of bands of data, often as a spectral response at each picture element, to capture an image cube. Exemplary embodiments herein may use three-band cameras with Bayer filters for clarity of discussion, but the disclosed technology is not limited to these three-band cameras.
The main image processing module 130 receives captured raw imagery from the main camera 110 and white balance control information from the camera controller 210. The white balance control information may contain weighting factors for different spectral bands. The main image processing module may apply the weighting factors to the different spectral bands to equalize or white balance the imagery, thereby producing balanced processed imagery that is output by the main image processing module 130 for viewing, storage in memory 250, or further processing. The main image processing module may also compute image statistics from the raw input imagery to determine control information for autofocus, automatic white balance, or automatic exposure control.
The main image processing module 130, the auxiliary image processing module 140, and the camera controller 210 are three separate modules in the depicted exemplary embodiment.
The imagery captured by the main sensor 116 or auxiliary sensor 126 may be still images or video. The resolution of still images and video, and the frame rate of video, may vary based on user selection. Frames may be combined in different ways, for example by stitching them together to form a panorama. Either sensor 116, 126 may take a black and white image, or incident light may be filtered to one or more spectral ranges to take color images.
Scenes with highly variable lighting may include both bright, well-lit objects and dark, shadowed objects.
This process may be automated, with variable settings for the number of combined images and the relative exposure periods. In some embodiments, for example, once the “optimal” exposure period is measured for the overall scene, images may be captured at half the optimal exposure period, at the optimal exposure period, and at twice the optimal exposure period. Detail in bright regions of the image will be apparent in the short exposure image. Detail in dark regions of the image will be apparent in the long exposure image. By combining the three images, it may be possible to capture detail in dark, normal, and light regions of a scene. This example of combining three images, at half optimal, optimal, and twice optimal exposures, is just one example. Other exposure combinations may use four or more exposures, for example nine or sixteen exposures, each at a different exposure period, to capture high dynamic range still images and high dynamic range videos.
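A minimal sketch of combining such bracketed frames is shown below in Python; the Gaussian well-exposedness weighting centered at mid-gray is one simple illustrative combining rule, not necessarily the combination method used by the main image processing module 130.

```python
import numpy as np

def fuse_exposures(frames):
    """Weighted per-pixel average of aligned 8-bit frames of equal shape."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    # Weight each pixel by how close it is to mid-gray, so each region is
    # dominated by the frame in which it is well exposed.
    weights = np.exp(-((stack - 128.0) ** 2) / (2 * 50.0 ** 2))
    weights /= weights.sum(axis=0) + 1e-12
    return np.clip((weights * stack).sum(axis=0), 0, 255).astype(np.uint8)

# For example: fused = fuse_exposures([short_frame, optimal_frame, long_frame])
```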
Just as the auxiliary image processing module 140 may determine an “optimal” automatic exposure by capturing one or more images with the auxiliary sensor 126, calculating image statistics of the captured image(s), and setting the exposure period based on the image statistics or light meter reading, the auxiliary image processing module 140 may conduct a similar search to determine short and long exposures. The auxiliary image processing module 140 may select a short exposure period for which detail of bright objects is apparent, and a long exposure period for which detail of dark objects is apparent. In each case, the auxiliary image processing module 140 applies a high dynamic range exposure metering algorithm that statistically analyzes intensity histograms to determine one or more short and long exposure periods.
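The short and long exposure search might proceed as in the following Python sketch, where `capture_at` is a hypothetical callable that returns an 8-bit frame for a given exposure period, and the 1% highlight and shadow thresholds are illustrative assumptions.

```python
import numpy as np

def find_bracket(capture_at, optimal, max_steps=6):
    """Halve/double the exposure until highlight and shadow detail appear."""
    short = long = optimal
    for _ in range(max_steps):
        if np.mean(capture_at(short) >= 247) < 0.01:  # highlights preserved
            break
        short /= 2.0                                  # too saturated: shorten
    for _ in range(max_steps):
        if np.mean(capture_at(long) <= 8) < 0.01:     # shadows preserved
            break
        long *= 2.0                                   # too dark: lengthen
    return short, long
```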
According to some embodiments, once powered on, a camera system with a main camera 110 waits for a computer or user input command to capture imagery. The imagery may be a still photo, a video, a high definition still photo, or a high definition video. Once a capture imagery command is invoked, the camera system captures images, collects image statistics, and then focuses the object image on an image sensor in an autofocus (AF) operation. The camera system can automatically determine spectral weightings for white balance (AWB) and automatically determine an exposure period (AEC), or a set of exposure periods for high dynamic range imagery. It takes a finite amount of time to perform autofocus, automatic white balance, and automatic exposure control after power is turned on, when lighting conditions change, or when the camera is pointed at an object that is not in focus. This introduces a delay before it is possible to capture focused, balanced, well exposed imagery, including high dynamic range imagery. There is a need to reduce these convergence times.
By having an auxiliary camera 120 and auxiliary image processing module 140, it is possible to reduce or eliminate this convergence time delay. The auxiliary camera captures imagery at a lower resolution and/or a lower frame rate than the main camera. Therefore, the volume of data processed by the auxiliary image processing module 140 is less than the volume of data the main image processing module 130 would process when calculating control information for automatic focus, automatic white balance, and automatic exposure control. With lower data rates, the computational load required to calculate the image statistics, high frequency content, spectral weightings, or histogram intensity values used for autofocus, automatic white balance, and automatic exposure control is reduced. With a reduced computational load, the convergence time for autofocus, automatic white balance, and automatic exposure control is reduced compared to making these same calculations using data captured by the higher resolution, higher frame rate main camera.
Furthermore, a dual camera system may turn on the auxiliary camera 120 and auxiliary image processing module 140 as soon as the dual camera is powered on. By not waiting for a capture imagery command, the dual camera system starts to converge to (determine) the autofocus, automatic white balance, and automatic exposure control parameters on power up. Therefore, the dual camera both starts earlier and takes less time to estimate the autofocus, automatic white balance, and automatic exposure control parameters. This reduces or eliminates the time between invoking imagery capture and being able to capture imagery that is focused, balanced, and correctly exposed.
The autofocus parameters computed based on images captured by the auxiliary camera 120 estimate the distance to the object, based on a camera model for the auxiliary camera 120. This distance is used with a camera model for the main camera 110 to determine the focal plane relationship between the main lens 112 and the main sensor 116. The spectral weightings derived for the auxiliary camera 120 are used to determine spectral weightings for the main camera 110, either directly, or with correction for spectral response characteristic differences between the auxiliary camera 120 and the main camera 110. The ambient lighting characteristics determined by the auxiliary camera 120 and auxiliary image processing module 140 are used to determine the exposure period and aperture setting for the main camera 110. For some implementations, image statistics from both the main image processing module 130 and the auxiliary image processing module 140 are combined for faster convergence.
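The following Python sketch shows one first-order way such camera-model mappings could work, holding scene exposure constant across the two cameras and using a thin-lens focus estimate; the f-number and ISO scaling shown are illustrative assumptions, not the disclosure's exact models.

```python
def map_exposure(aux_exposure, aux_f_number, main_f_number,
                 aux_iso=100.0, main_iso=100.0):
    """Hold scene exposure constant: scale by relative aperture area and gain."""
    aperture_ratio = (main_f_number / aux_f_number) ** 2  # smaller aperture -> longer exposure
    gain_ratio = aux_iso / main_iso                       # lower sensitivity -> longer exposure
    return aux_exposure * aperture_ratio * gain_ratio

def map_focus(object_distance_m, main_focal_length_m):
    """Thin-lens estimate of the main lens-to-sensor distance for that object."""
    return 1.0 / (1.0 / main_focal_length_m - 1.0 / object_distance_m)
```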
After the main image processing module 130, the auxiliary image processing module 140, or the camera controller 210 determines the “optimum” exposure period, the exposure may be locked until a change in the scene is detected through variations in image statistics. Once the change is detected, the auxiliary camera 120 and auxiliary image processing module 140 may refine the exposure period in response to the scene change. After determining the new “optimum” exposure, the auxiliary camera 120 and auxiliary image processing module 140 may search for short and long exposure periods. The auxiliary image processing module 140 may then output this information to the camera controller 210, which generates equivalent exposure periods for the main camera 110 via exposure synchronization control between the main camera 110 and the auxiliary camera 120. When the user requests high dynamic range imagery via the touchscreen 150 or other input device (not shown), the main camera 110 and main image processing module 130 capture images at short, “optimum,” and long exposure periods. The main image processing module 130 then combines the imagery captured at short, “optimum,” and long exposure periods to form high dynamic range imagery. The high dynamic range imagery may be output to memory 250 and viewed on the touchscreen 150.
The dual camera system transitions from auxiliary capture state 430 to main capture state 470 when a start imagery capture command is invoked by a user or software 450, and transitions back from main capture state 470 to auxiliary capture state 430 when a stop imagery capture command is invoked by a user or software 490. While in main capture state 470, the camera controller 210 and the main controller 118 control the main camera 110, the main camera 110 captures imagery, the main image processing module 130 processes the captured imagery, and the main image processing module 130 refines the automatic focus, automatic white balance, and automatic exposure control parameters during state transition 480.
For some implementations, the auxiliary camera 120 will keep capturing imagery while in the main capture state 470, and the image statistics from these images may be used, in addition to image statistics from the main image processing module, to refine the automatic focus, automatic white balance, and automatic exposure control parameters during state transition 480. If power is turned off 495 while in main capture state 470, the dual camera system transitions to power off state 410.
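For illustration, the state transitions described above can be rendered as a small state machine. The Python sketch below uses the reference numbers from the text for the states and transitions, and is an illustrative structure rather than the disclosure's implementation.

```python
from enum import Enum

class State(Enum):
    POWER_OFF = 410
    AUXILIARY_CAPTURE = 430
    MAIN_CAPTURE = 470

class DualCameraStateMachine:
    def __init__(self):
        self.state = State.POWER_OFF

    def power_on(self):          # 3A convergence begins immediately on power up
        self.state = State.AUXILIARY_CAPTURE

    def start_capture(self):     # start imagery capture command 450
        if self.state is State.AUXILIARY_CAPTURE:
            self.state = State.MAIN_CAPTURE

    def stop_capture(self):      # stop imagery capture command 490
        if self.state is State.MAIN_CAPTURE:
            self.state = State.AUXILIARY_CAPTURE

    def power_off(self):         # power off 495 from any state
        self.state = State.POWER_OFF
```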
A user may preview images on the touchscreen 150 during operation, and issue commands via the touchscreen 150. For example, while in main capture state 470, the user may view a changing scene. The main image sensor continues to capture images.
At block 530, the process 500 determines main image capture information and main image processing information from the auxiliary control information. In some implementations, the functionality of block 530 may be performed by the camera controller 210.
At block 560, the process 500 processes the at least one main image using the main image processing information. In some implementations, the functionality of block 560 may be performed by the main image processing module 130.
The apparatus may include means 630 to determine main image capture information and main image processing information from the auxiliary control information. In some implementations, the determining main image capture and main image processing information means may be a camera controller 210. The apparatus may include means 640 to capture at least one main image using the main image capture information. In some implementations, the capturing main image means may be a main camera 110. The apparatus may include means 650 to receive the at least one main image and main image processing information. In some implementations, the receiving main image and main image processing information means may be a main image processing module 130.
The apparatus may include means 660 to process the at least one main image using the main image processing information. In some implementations, the processing main image means may be a main image processing module 130.
It should be understood that any reference to an element herein using a designation such as “first,” “second,” and so forth does not generally limit the quantity or order of those elements. Rather, these designations may be used herein as a convenient method of distinguishing between two or more elements or instances of an element. Thus, a reference to first and second elements does not mean that only two elements may be employed there or that the first element must precede the second element in some manner. Also, unless stated otherwise a set of elements may comprise one or more elements. In addition, terminology of the form “at least one of: A, B, or C” used in the description or the claims means “A or B or C or any combination of these elements.”
As used herein, the term “determining” encompasses a wide variety of actions. For example, “determining” may include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” may include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like. Also, “determining” may include resolving, selecting, choosing, establishing and the like.
As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover: a, b, c, a-b, a-c, b-c, and a-b-c.
The various operations of methods described above may be performed by any suitable means capable of performing the operations, such as various hardware and/or software component(s), circuits, and/or module(s). Generally, any operations illustrated in the Figures may be performed by corresponding functional means capable of performing the operations.
The various illustrative logical blocks, modules and circuits described in connection with the present disclosure may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device (PLD), discrete gate or transistor logic, discrete hardware components or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any commercially available processor, controller, microcontroller or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
In one or more aspects, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Thus, in some aspects computer readable medium may comprise non-transitory computer readable medium (e.g., tangible media). In addition, in some aspects computer readable medium may comprise transitory computer readable medium (e.g., a signal). Combinations of the above should also be included within the scope of computer-readable media.
The methods disclosed herein comprise one or more steps or actions for achieving the described method. The method steps and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of steps or actions is specified, the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.
The functions described may be implemented in hardware, software, firmware or any combination thereof. If implemented in software, the functions may be stored as one or more instructions on a computer-readable medium. A storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray® disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Thus, certain aspects may comprise a computer program product for performing the operations presented herein. For example, such a computer program product may comprise a computer readable medium having instructions stored (and/or encoded) thereon, the instructions being executable by one or more processors to perform the operations described herein. For certain aspects, the computer program product may include packaging material.
Software or instructions may also be transmitted over a transmission medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of transmission medium.
Further, it should be appreciated that modules and/or other appropriate means for performing the methods and techniques described herein can be downloaded and/or otherwise obtained by a user terminal and/or base station as applicable. For example, such a device can be coupled to a server to facilitate the transfer of means for performing the methods described herein. Alternatively, various methods described herein can be provided via storage means (e.g., RAM, ROM, a physical storage medium such as a compact disc (CD) or floppy disk, etc.), such that a user terminal and/or base station can obtain the various methods upon coupling or providing the storage means to the device. Moreover, any other suitable technique for providing the methods and techniques described herein to a device can be utilized.
It is to be understood that the claims are not limited to the precise configuration and components illustrated above. Various modifications, changes and variations may be made in the arrangement, operation and details of the methods and apparatus described above without departing from the scope of the claims.