The systems and methods disclosed herein are directed to phase detection autofocus, and, more particularly, to mask-less phase detection autofocus sensors and processing techniques.
Some image capture devices use phase difference detection sensors (which may also be referred to as “pixels”) to perform autofocus. On-sensor phase difference detection works by interspersing phase difference detection pixels between imaging pixels, typically arranged in repeating sparse patterns of left and right pixels. The system detects phase differences between signals generated by different phase difference detection pixels, for example between a left pixel and a nearby right pixel. The detected phase differences can be used to perform autofocus; for example, they can be used to calculate depth in a scene to assist autofocus.
Phase detection autofocus operates faster than contrast-based autofocus; however, some implementations place a metal mask or other structures over the image sensor to create left and right phase detection pixels, resulting in less light reaching the masked pixels. Typical imaging sensors have a microlens formed over each individual pixel to focus light onto that pixel, and a phase detection autofocus mask placed over the microlenses reduces the light entering the microlens of a phase detection pixel by about 50%. Because the output of phase detection pixels has lower brightness than the output of normal image capturing pixels, the phase difference detection pixels create noticeable artifacts in captured images that require correction. By placing the phase detection pixels individually amidst imaging pixels, the system can interpolate values for the phase detection pixels.
Phase detection pixels are used in pairs. When the scene is out of focus, the light arriving at the two pixels of a pair is slightly phase shifted, so each pixel of the pair sees a slightly displaced version of the image. The distance between the phase detection pixels, combined with the relative shift between their signals, can be used to determine how far an optical assembly of an imaging device needs to move a lens to bring the scene into focus.
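Without limiting the disclosure, the shift estimation described above can be illustrated with a short sketch. The function name, search window, and sum-of-absolute-differences cost below are illustrative assumptions and are not part of this disclosure:

```python
import numpy as np

def estimate_disparity(left, right, max_shift=8):
    # Illustrative sketch: find the integer shift that best aligns the
    # left- and right-pixel signals using a sum-of-absolute-differences
    # cost. The shift magnitude relates to the defocus amount and its
    # sign to the defocus direction.
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    best_shift, best_cost = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        # Compare only the region where the shifted signals overlap.
        if shift >= 0:
            a, b = left[shift:], right[:len(right) - shift]
        else:
            a, b = left[:shift], right[-shift:]
        cost = float(np.abs(a - b).mean())
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    return best_shift
```

In practice the estimated shift would be mapped, through a sensor-specific calibration, to a lens movement command.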
A summary of sample aspects of the disclosure follows. For convenience, one or more aspects of the disclosure may be referred to herein simply as “some aspects.”
Methods and apparatuses or devices being disclosed herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, for example, as expressed by the claims which follow, its more prominent features will now be discussed briefly.
One aspect of the present disclosure provides an image capture device. The image capture device may include an image sensor, multiple diodes, a color filter array, multiple single-diode microlenses, multiple multi-pixel-microlenses, and an image signal processor. The multiple diodes may be configured to sense light from a target. The color filter array may be arranged in a pattern, where each color filter may be positioned within proximity of one of the multiple diodes and configured to pass one or more wavelengths of light to one of the multiple diodes. For some embodiments, the plurality of color filters are arranged in a Bayer pattern. Each of the multiple single-diode microlenses may be positioned within proximity of one of the color filters of the color filter array. Each multi-pixel-microlens may be positioned within proximity of at least three adjacent color filters of the color filter array. Two of the at least three adjacent color filters may be configured to pass the same wavelengths of light to a first and second diode. One of the at least three adjacent color filters may be disposed between the two of the at least three adjacent color filters and configured to pass different wavelengths of light to a third diode. Light incident on each multi-pixel-microlens in a first direction may be collected in the first diode and light incident in a second direction may be collected in the second diode. The image signal processor may be configured to perform phase detection autofocus based on values received from the first and second diodes.
Another aspect of the present disclosure provides an image sensor. The image sensor may include multiple diodes, multiple single-diode microlenses, multiple multi-pixel-microlenses, and an image signal processor. The multiple diodes may be configured to sense light from a target scene. Each of the multiple single-diode microlenses may be positioned adjacent to one of the multiple diodes. Each multi-pixel-microlens may be positioned adjacent to at least three linearly adjacent diodes of the multiple diodes. The at least three diodes may include a first and second diode disposed at the respective ends of the multi-pixel-microlens and a third diode positioned between the first and second diode. The light incident in a first direction may be collected in the first diode and light incident in a second direction may be collected in the second diode. The image signal processor may be configured to receive values representing the light incident on the first and second diodes and perform phase detection autofocus using the received values.
Another aspect of the present disclosure provides a method for constructing a final image. The method includes receiving image data from multiple diodes associated with multiple color filters arranged in a pattern. The image data includes multiple imaging pixel values from a first subset of the multiple diodes associated with a first subset of the multiple color filters and multiple multi-pixel-microlenses, and a second subset of the multiple diodes associated with a second subset of the multiple color filters and multiple single-diode microlenses. The image data may also include multiple phase detection pixel values from a third subset of the multiple diodes associated with a third subset of the multiple color filters and the multiple multi-pixel-microlenses. The third subset of the multiple diodes may be arranged in multiple groups of adjacent diodes including at least one diode of the first subset of the multiple diodes and at least two diodes of the third subset of the multiple diodes. Each group of the multiple groups may receive light from a corresponding multi-pixel-microlens formed such that light incident in a first direction is collected in a first diode of the third subset of the multiple diodes and light incident in a second direction is collected in a second diode of the third subset of the multiple diodes. The method also includes calculating a disparity based on the light collected in the first direction and light collected in the second direction to generate focus instructions, and constructing an image based at least partly on the multiple imaging pixel values and focus instructions.
Another aspect of the present disclosure provides an image signal processor configured by instructions to execute a process for constructing a final image. The process includes receiving image data from multiple diodes associated with multiple color filters arranged in a pattern. The image data includes multiple imaging pixel values from a first subset of the multiple diodes associated with a first subset of the multiple color filters and multiple multi-pixel-microlenses, and a second subset of the multiple diodes associated with a second subset of the multiple color filters and multiple single-diode microlenses. The image data may also include multiple phase detection pixel values from a third subset of the multiple diodes associated with a third subset of the multiple color filters and the multiple multi-pixel-microlenses. The third subset of the multiple diodes may be arranged in multiple groups of adjacent diodes including at least one diode of the first subset of the multiple diodes and at least two diodes of the third subset of the multiple diodes. Each group of the multiple groups may receive light from a corresponding multi-pixel-microlens formed such that light incident in a first direction is collected in a first diode of the third subset of the multiple diodes and light incident in a second direction is collected in a second diode of the third subset of the multiple diodes. The process also includes calculating a disparity based on the light collected in the first direction and light collected in the second direction to generate focus instructions, and constructing an image based at least partly on the multiple imaging pixel values and focus instructions.
The disclosed aspects will hereinafter be described in conjunction with the appended drawings and appendices, provided to illustrate and not to limit the disclosed aspects, wherein like designations denote like elements.
Embodiments of this disclosure relate to systems and techniques for mask-less phase detection pixels by using microlenses that extend over and within proximity to adjacent diodes of an image sensor (referred to herein as multi-pixel-microlenses). The phase difference detection pixels below the multi-pixel-microlenses are provided to obtain a phase difference signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of an image focus.
It should be noted that the term “diodes” or other variations of the word as used herein may be, for example, photodiodes formed in a semiconductor substrate. An example semiconductor substrate may be, for example, a complementary metal-oxide semiconductor (CMOS) image sensor. As used herein, diode refers to a single unit of any material, semiconductor, sensor element or other device that converts incident light into current. The term “pixel” as used herein can refer to a single diode in the context of its sensing functionality due to optical elements such as color filters or microlenses. Accordingly, although “pixel” generally may refer to a display picture element, a “pixel” as used herein may refer to a sensor (for example, a photodiode) that receives or senses light from a target and generates a signal which, if rendered on a display, may be displayed as a point in an image captured by the sensor (and a plurality of other sensors). The individual units or sensing elements of an array of sensors, for example in a CMOS or charge-coupled device (CCD), can also be referred to as sensels.
It should be noted that the term “color filter” or other variations of the word as used herein may, for example, act as wavelength-selective pass filters that may “filter” or “split” incoming light in the visible range into component sub-ranges of the visible spectrum. For example, the color filters may split incoming light into red, green, and/or blue ranges (as indicated by the R, G, and B notation used throughout this application). The light is split or filtered by allowing only certain selected wavelengths to pass through each of the color filters. The filtered light may be received by dedicated red, green, or blue diodes on an image sensor. Although red, blue, and green color filters are commonly used, it should be understood that the color filters used in the embodiments described herein and throughout this application can vary according to the color channel requirements of the captured image data, for example including ultraviolet, infrared, or near-infrared pass filters.
As used herein, “over” and “above” refer to the position of a structure (for example, a color filter or lens) such that light incident from a target scene propagates through the structure before it reaches (or is incident on) another structure. To illustrate, a microlens array may be positioned above a color filter array, which is positioned above a diode array. Accordingly, light from the target scene first passes through the microlenses, then the color filter array, and finally is incident on the diodes.
Using multi-pixel-microlenses allows for substantially full brightness of the phase detection pixels. For example, the phase detection pixels have a similar brightness relative to adjacent imaging pixels, in contrast to masked phase detection pixels, which exhibit reduced brightness. Accordingly, embodiments described herein can produce a final image with fewer and/or less noticeable artifacts as compared to an image produced using a sensor with masked phase detection pixels. Embodiments described herein can also provide better phase detection autofocus performance in low-light settings. Such multi-pixel-microlenses also provide for left and right phase detection pixels that are close to one another, for example, separated by an image pixel. Without subscribing to a particular scientific theory, such proximity may provide more accurate phase detection information than traditional phase detection pixels that are spaced apart to reduce artifacts in the final image. The accuracy of the phase detection information may be further improved by providing a strong and distinct separation of the left and right phase detection pixels, for example, by providing an image pixel between the left and right phase detection pixels.
Various embodiments will be described below in conjunction with the drawings for purposes of illustration. It should be appreciated that many other implementations of the disclosed concepts are possible, and various advantages can be achieved with the disclosed implementations. Headings are included herein for reference and to aid in locating various sections. These headings are not intended to limit the scope of the concepts described with respect thereto. Such concepts may have applicability throughout the entire specification.
For example,
As noted above, the masked phase detection approach using sensor portion 100 including a phase detection pixels 130R and/or 130L may produce artifacts as compared to an image produced using a sensor comprising multi-pixel-microlenses as described herein. For example,
However, the split pixel approach of
According to the embodiments described herein, color filters placed between a multi-pixel-microlens and the corresponding diodes used for phase detection can be selected to pass the same wavelengths of light. The multi-pixel-microlens may correspond to a plurality of adjacent diodes, where two diodes are associated with a color filter selected to pass the same wavelength or wavelengths of light. The plurality of adjacent diodes may also include a third diode associated with a color filter selected to pass a different wavelength or wavelengths of light than the aforementioned color filters. By using multiple color filters under a multi-pixel-microlens, the color filters placed between single-diode and multi-pixel-microlenses and corresponding diodes can follow the standard Bayer pattern, without a need for complex and expensive image signal processing after capturing the final image. For example, in current implementations, a non-standard Bayer pattern is compensated for through image processing techniques. These techniques may not be optimal when used with image processing hardware designed for operation using a standard Bayer pattern. Furthermore, by using multiple diodes that receive a single color “under” each multi-pixel-microlens used for phase detection, a pixel color value can be more accurately calculated as compared to a sensor having different color filter colors under a microlens. In one embodiment, at least two of the color filters disposed between the multi-pixel-microlens and the corresponding diodes can be selected to pass blue light. Accordingly, blue correction is unnecessary or trivial, and the resulting image data may not require correction of blue pixel information due to defective or missing blue pixel values, because blue pixels are the least important for human vision.
In some embodiments, by having at least one diode disposed between the two diodes used for phase detection, the left and right phase detection pixels may be separated to provide accurate phase detection values for each phase detection pixel. In one embodiment, at least one color filter positioned between the color filters of the phase detection pixels (e.g., the color filters configured to pass the same wavelength of light) and disposed between the multi-pixel-microlens and the corresponding diodes can be selected to pass green light. Accordingly, green correction is trivial, and the resulting image data does not lose green pixel information to defective or missing values, because the green pixel is used for obtaining imaging information and green pixels are particularly important for human vision.
Red, green, and blue, as used herein to describe pixels or color filters, may refer to wavelength ranges roughly following the color receptors in the human eye. Exact beginning and ending wavelengths (or portions of the electromagnetic spectrum) that define colors of light (for example, red, green, and blue light) are not typically defined as a single wavelength. Each color filter can have a spectral response function within the visible spectrum, and each color channel resulting from placement of a color filter over a portion of the image sensor can have a typical human response function. Image sensor filter responses are broadly similar; however, they may vary from sensor to sensor.
The image sensors used to capture phase detection autofocus information as described herein may be used in conjunction with a color filter array (CFA) or color filter mosaic (CFM). Such color filters split all incoming light in the visible range into red, green, and blue categories to direct the split light to dedicated red, green, or blue photodiode receptors on the image sensor. As such, the wavelength ranges of the color filter can determine the wavelength ranges represented by each color channel in the captured image. Accordingly, a red channel of an image may correspond to the red wavelength region of the color filter and can include some yellow and orange light, ranging from approximately 570 nm to approximately 760 nm in various embodiments. A green channel of an image may correspond to a green wavelength region of a color filter and can include some yellow light, ranging from approximately 480 nm to approximately 570 nm in various embodiments. A blue channel of an image may correspond to a blue wavelength region of a color filter and can include some violet light, ranging from approximately 400 nm to approximately 490 nm in various embodiments.
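As an illustration only, the approximate channel ranges given above can be modeled as step-function pass bands. The dictionary and function below are hypothetical and ignore the smooth spectral response curves of real filters; note that the bands deliberately overlap near their boundaries:

```python
# Approximate channel pass bands in nanometers, taken from the ranges
# described above. Real filter responses are smooth curves, so this
# step-function model is only illustrative.
CHANNEL_RANGES_NM = {
    "blue": (400, 490),
    "green": (480, 570),
    "red": (570, 760),
}

def channels_passing(wavelength_nm):
    # Return the channel(s) whose approximate pass band contains the
    # given wavelength; overlap regions return more than one channel.
    return [name for name, (lo, hi) in CHANNEL_RANGES_NM.items()
            if lo <= wavelength_nm <= hi]
```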
Although discussed herein primarily in the context of phase detection autofocus, the phase detection image sensors and techniques described herein can be used in other contexts, for example generation of stereoscopic image pairs or sets.
Color filters 315A-315D act as wavelength-selective pass filters and, as described above, may filter incoming light in the visible range into red, green, and blue ranges (as indicated by the R, G, and B). The light is filtered by allowing only certain selected wavelengths to pass through the color filters 315A-315D. The split light is received by dedicated red, green, or blue diodes 320A-320D on the image sensor. Although red, blue, and green color filters are commonly used, in other embodiments the color filters can vary according to the color channel requirements of the captured image data, for example including ultraviolet, infrared, or near-infrared pass filters.
Each single-diode microlens 305A and 305D is positioned over a single color filter 315A and 315D and a single diode 320A and 320D, respectively. Diodes 320A and 320D accordingly provide imaging pixel information. Dual-diode microlens 310 is positioned over and within proximity to two adjacent color filters 315B and 315C and two corresponding adjacent diodes 320B and 320C, respectively. Diodes 320B and 320C accordingly provide phase detection pixel information by diode 320B receiving light entering dual-diode microlens 310 in a first direction (L(X)) and diode 320C receiving light entering dual-diode microlens 310 in a second direction (R(X)). In some embodiments, the dual-diode microlens 310 can be a planoconvex lens having a circular perimeter, where the at least one dual-diode microlens may be sized to pass light to a 2×2 cluster of diodes of the plurality of diodes. In other embodiments, the dual-diode microlens 310 can be a planoconvex lens having an oval perimeter, where the at least one dual-diode microlens may be sized to pass light to a 2×1 cluster of diodes of the plurality of diodes, as described in connection with
The microlens array comprising single-diode microlenses 305A and 305D and dual-diode microlens 310 can be positioned above the color filter array 315A-315D, which is positioned above the diodes 320A-320D. Accordingly, light from the target scene first passes through the microlenses 305A, 310, and 305D, then the color filter array 315A-315D, and finally is incident on the diodes 320A-320D.
Placement of the microlenses above each photodiode 320A-320D redirects and focuses the light onto the active detector regions. Each microlens may be formed by dropping the lens material in liquid form onto the color filters 315A-315D, on which the lens material solidifies. In other embodiments, wafer-level optics can be used to create a one- or two-dimensional array of microlenses using semiconductor-like techniques, where a first subset of the microlenses in the array includes single-diode microlenses and a second subset of the microlenses in the array includes dual-diode microlenses. As illustrated by single-diode microlenses 305A and 305D and dual-diode microlens 310, each microlens may be a single element with one planar surface and one spherical convex surface to refract the light. Other embodiments of the microlenses may use aspherical surfaces, and some embodiments may use several layers of optical material to achieve their design performance.
Color filters 315A and 315D under single-diode microlenses 305A and 305D can be positioned according to the Bayer pattern in some embodiments. Accordingly, color filter 315A is either a red color filter or a blue color filter, while color filter 315D is a green color filter. Preserving the Bayer pattern for diodes 320A and 320D and other diodes under single-diode microlenses can provide computational benefits, for example enabling use of widespread demosaicking techniques on captured image data. The Bayer pattern is a specific pattern for arranging RGB color filters on a rectangular grid of photosensors. The particular arrangement of color filters of the Bayer pattern is used in most single-chip digital image sensors used in digital cameras, camcorders, and scanners to create a color image. The Bayer pattern is 50% green, 25% red and 25% blue with rows of repeating red and green color filters alternating with rows of repeating blue and green color filters.
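For illustration, the Bayer arrangement described above can be generated programmatically. This is a sketch only; the orientation of the pattern (which corner holds a red filter) varies between sensors:

```python
def bayer_pattern(rows, cols):
    # Build a Bayer color filter layout: rows of alternating red/green
    # filters interleaved with rows of alternating green/blue filters,
    # giving 50% green, 25% red, and 25% blue overall.
    pattern = []
    for r in range(rows):
        row = []
        for c in range(cols):
            if r % 2 == 0:
                row.append("R" if c % 2 == 0 else "G")
            else:
                row.append("G" if c % 2 == 0 else "B")
        pattern.append(row)
    return pattern
```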
Although the color filters over which the single-diode microlenses 305A and 305D are positioned are described herein in the context of the Bayer pattern arrangement, such color filters can be arranged in other patterns that are 50% green color filters, 25% blue color filters, and 25% red color filters. Other patterns that include more green color filters than blue or red color filters are possible or other patterns that have generally twice as many green color filters as blue or red color filters. The color filters can also be positioned according to other color filter patterns in some embodiments, for example color filter patterns designed for use with panchromatic diodes (sensitive to all visible wavelengths) and/or color filters for passing light outside of the visible spectrum.
As depicted in
In some embodiments, the “missing” color filter that is replaced by the green color filter 315C under the dual-diode microlens 310 can be a blue color filter, as the blue channel of image data is the least important for quality in terms of human vision. In other embodiments, green color filter 315C can be in the location where a red color filter would be if not for interruption of the color filter pattern due to the dual-diode microlens 310.
As illustrated, a number of green color filters 312, red color filters 314, and blue color filters 316 are arranged in the Bayer pattern under a number of single-diode microlenses 305. Each color filter is called out once using reference numbers 312, 314, or 316 and shown throughout the remainder of the figures using G, R, or B for simplicity of the illustration. However, at the location of the dual-diode microlens 310 the Bayer pattern is interrupted and an additional green color filter is inserted at the location of the right phase detection pixel. As such, there is a “missing” red filter at the location of the right phase detection pixel. In the illustrated embodiment, the right phase detection pixel green color filter replaces what would otherwise, according to the Bayer pattern, be a red color filter. In other embodiments the phase detection pixel green color filter can replace a blue color filter.
As illustrated, the left phase detection pixel (L) value can be determined by summing the green values of the left and right phase detection pixels under the dual-diode microlens 310. The summed green value is assigned to the left phase detection pixel, as the Bayer pattern used to arrange the color filters under the single-diode microlenses specifies a green pixel in the location of the left phase detection pixel. The small phase shift of the summed green value may improve green aliasing performance. In some embodiments, the summed value may be divided by the number of diodes under the microlens (here, two) to obtain the green value.
As illustrated, the right phase detection (R) pixel value can be determined by interpolation using two nearby red pixel values (values received from the diodes under red color filters). Two horizontally located red pixels are illustrated for the interpolation; however, two vertically located red pixels can alternatively or additionally be used. The interpolated value is assigned to the right phase detection pixel, as the Bayer pattern used to arrange the color filters under the single-diode microlenses specifies a red pixel in the location of the right phase detection pixel.
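The two reconstruction steps described above, summing the green phase detection values and interpolating the missing red value, can be sketched as follows. The function and argument names are illustrative assumptions:

```python
def reconstruct_pd_pair(green_left, green_right, red_left, red_right,
                        n_diodes=2):
    # Left phase detection pixel sits at a green Bayer location: assign
    # the summed green signal of both diodes under the dual-diode
    # microlens, normalized by the number of diodes.
    left_value = (green_left + green_right) / n_diodes
    # Right phase detection pixel sits at a red Bayer location: assign
    # the average of the two nearest red imaging pixel values.
    right_value = (red_left + red_right) / 2.0
    return left_value, right_value
```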
In some embodiments, interpolation as depicted for the green value and missing pixel value of
The decision regarding which neighboring pixels to use for calculating the “missing” pixel value can be predetermined or can be adaptively selected from a range of pre-identified alternatives, for example based on calculated edge data. In some embodiments, the missing pixel under the dual-diode microlens 310 may be recorded as a defective pixel. The image signal processor (e.g., image signal processor 1020 of
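One hypothetical form of the adaptive selection mentioned above interpolates along the direction of the weaker gradient, so that the estimated value does not average across a calculated edge:

```python
def interpolate_missing_pixel(north, south, west, east):
    # Edge-adaptive sketch: pick the neighbor pair with the smaller
    # absolute difference, treating a large difference as evidence of
    # an edge that should not be averaged across.
    if abs(north - south) <= abs(west - east):
        return (north + south) / 2.0
    return (west + east) / 2.0
```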
According to the embodiments described herein, two or more color filters placed between a multi-pixel-microlens and the corresponding diodes can be selected to pass the same wavelengths of light. In some embodiments, the color filters and microlenses are positioned within proximity of the diodes. In various embodiments, the color filters may be adjacent to the diodes. The color filters placed between the single-diode and multi-pixel-microlenses and corresponding diodes can follow the standard Bayer pattern. Each multi-pixel-microlens may be positioned within proximity of, for example, at least three adjacent color filters, where two of the color filters are configured to pass the same wavelengths of light. The three or more adjacent color filters, and corresponding diodes, can be arranged in a row or column. The two color filters that pass the same wavelength of light are disposed on opposite sides of at least one other color filter positioned within proximity of the multi-pixel-microlens. The at least one other adjacent color filter is configured to pass different wavelengths of light to a corresponding diode. In some embodiments, the two color filters that pass the same wavelength of light, and corresponding diodes, are disposed at opposite ends of the multi-pixel-microlens. In other embodiments, the two color filters that pass the same wavelength of light, and corresponding diodes, need not be positioned at the ends of the multi-pixel-microlens and may be, for example, disposed on opposite sides of a central axis of the multi-pixel-microlens.
By having two diodes that receive the same color “under” each multi-pixel-microlens, a pixel color value can be more accurately calculated compared to a sensor having multiple different color filter colors under a multi-pixel-microlens. Furthermore, by having at least one diode between the two diodes that receive the same wavelengths of light, the color filters may be arranged in a standard Bayer pattern, which may reduce or minimize the need for complicated reconstruction and interpolation described above in connection to
In some embodiments, a single-diode microlens may focus most or all of the received light onto the diode, thereby focusing the received light onto the active detector region. However, in some embodiments using microlenses within proximity of two or more diodes (e.g., dual-diode or multi-pixel-microlenses), the microlens may have a shape that differs from that of a single-diode microlens. Thus, in some embodiments, less than all of the received light may be focused onto the corresponding diode. For example, focusing of light may be affected by the shape of the microlens; thus, some of the light may not be received by the diode and, instead, may be incident on non-active detector regions of the pixel (e.g., transistors, wires, etc., used for sending and receiving information to and from the pixel). Accordingly, in some embodiments, the need for reconstruction and interpolation may be based on a pixel fill factor, for example, the ratio between the area of the active detector region (e.g., diode area) and the entire pixel (e.g., active and non-active detector regions). The pixel fill factor may also be indicative of the amount or percentage of the light received by the microlens that is focused onto the active detector region for use as image data and/or in phase detection.
Without subscribing to a particular scientific theory, the human eye is particularly sensitive to green light; therefore, green pixels may be particularly important for human vision. In one embodiment, at least two color filters disposed between the multi-pixel-microlens and the corresponding diodes can be selected to pass red or blue light. The at least two color filters may be used for phase detection, while another (or third) color filter is positioned between the at least two color filters and can be selected to pass green light for use in obtaining image information. In this embodiment, based on optical properties (e.g., shape, optical power, focusing properties, etc.) of the portion of the multi-pixel-microlens within proximity of or associated with the green color filter, the amount of light focused onto the diode may be similar to the amount of light focused by a single-diode microlens. For example, by using a central portion of the microlens, the shape may be configured to focus an amount of light that is similar to a single-diode microlens. The amount of light focused onto the active detector region may also be based on the pixel fill factor. For example, if the light is not focused in the same manner as by the single-diode microlens, a larger diode will be capable of receiving more of the light focused by the multi-pixel-microlens.
Accordingly, green correction may be trivial or rendered unnecessary, and the resulting image data does not lose green pixel information to defective or missing values, because almost all of the green pixel's output is used for obtaining imaging information. In some embodiments, green correction may be trivial if the difference between the light focused onto the diode by the multi-pixel-microlens and the amount of light focused by a single-diode microlens is less than a threshold, because a difference of less than the threshold may be unnoticeable in an image. For example, green correction may be unnecessary where the difference is less than 20% or 10%, because the effect on the pixel information may be unnoticeable below the threshold. If the difference is over the threshold, reconstruction or interpolation, as described above in connection to
In some embodiments, in the alternative or in combination, the at least two color filters disposed between a multi-pixel-microlens and the corresponding diodes can be selected to pass blue light. By using blue color filters, correction may be unnecessary or trivial and the resulting image data may not require correction of defective or missing blue pixel information, as blue pixels are the least important for human vision. As described above, the need to correct the pixel information may be based on the optical properties of the multi-pixel-microlens and/or the pixel fill factor. In some embodiments, blue correction may be trivial if the difference between the light focused onto the diode by the multi-pixel-microlens and the amount of light focused by a single-diode microlens is less than a threshold value. For example, blue correction may be unnecessary where the difference is less than 20%, because the effect on the pixel information may be unnoticeable or compensated for by less destructive noise reduction algorithms. However, the threshold may be less than or more than 20% based on the image signal processing hardware implemented in the imaging device. As described above, the reconstruction or interpolation may be carried out in a manner similar to that described in connection to
In some embodiments, a multi-pixel-microlens may have optical properties and optical powers that are different from those of the single-diode microlens. Light entering a single-diode microlens may be focused onto a corresponding diode differently than light entering a multi-pixel-microlens and focused onto one of the corresponding diodes, for example, a diode positioned away from the edges of the multi-pixel-microlens. In some embodiments, the multi-pixel-microlens may be formed such that light focused onto diodes corresponding to imaging pixels may be substantially similar to light focused onto imaging pixels corresponding to a single-diode microlens. For example, the multi-pixel-microlens may be formed such that light focused onto diodes corresponding to imaging pixels may be approximately 10% less than light focused onto imaging pixels corresponding to a single-diode microlens. However, other configurations are possible based on the desired performance and characteristics of the imaging devices. Accordingly, correction may be unnecessary or trivial, and the resulting image data may not require correction due to defective or missing imaging pixel information, because the image pixel information obtained therefrom will be substantially similar to that obtained by a single-diode microlens. As described above, the need to correct for defective or missing imaging pixel information may be based on the pixel fill factor and/or the amount of light focused onto the active detector region versus the amount of light focused onto the entire pixel.
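The threshold comparison described above can be sketched as follows. This is a minimal illustrative sketch, not part of the original disclosure; the function name, the relative value scale, and the default 10% threshold are assumptions (the description notes the threshold may vary with the image signal processing hardware):

```python
def correction_needed(multi_lens_value, single_lens_value, threshold=0.10):
    """Return True if a pixel under a multi-pixel-microlens needs
    reconstruction/interpolation, i.e. if the light it receives differs
    from the single-diode-microlens reference by more than the threshold.
    """
    if single_lens_value == 0:
        return True  # no reference light at all; treat as needing correction
    diff = abs(single_lens_value - multi_lens_value) / single_lens_value
    return diff > threshold
```

For example, a pixel receiving 95 units of light where a single-diode microlens would focus 100 units (a 5% difference) would need no correction under a 10% threshold, while a pixel receiving 80 units (a 20% difference) would.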
As described above in connection to
Each single-diode microlens 405A-C, 405G, and 405H is positioned over a single color filter 415A-C, 415G, and 415H and a single diode 420A-C, 420G, and 420H, respectively. Diodes 420A-C, 420G, and 420H accordingly provide imaging pixel information. Multi-pixel-microlens 410 is positioned over and within proximity of three adjacent color filters 415D, 415E, and 415F and three corresponding adjacent diodes 420D, 420E, and 420F, respectively. In the embodiments described herein, diodes 420D and 420F may be configured to provide phase detection pixel information based on diode 420D receiving light entering a corresponding portion of the multi-pixel-microlens 410 in a first direction (L(X)) and diode 420F receiving light entering a corresponding portion of the multi-pixel-microlens 410 in a second direction (R(X)). Diode 420E, disposed between the diodes 420D and 420F, may provide imaging pixel information by receiving light entering the multi-pixel-microlens 410. In some embodiments, the multi-pixel-microlens 410 can be a planoconvex lens having an oval perimeter, where the at least one multi-pixel-microlens may be sized to pass light to a 3×1 cluster of diodes of the plurality of diodes. While a specific example multi-pixel-microlens 410 is described herein, it should be understood that other configurations are possible. For example, any number of diode clusters may be disposed under the multi-pixel-microlens so long as at least two diodes correspond to color filters that pass the same wavelength of light.
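The 3×1 cluster under multi-pixel-microlens 410 can be modeled, purely for illustration, as follows. The class and function names are hypothetical, and the color assignment follows the embodiment described above in which the two outer filters pass the same (red or blue) wavelength for phase detection and the center filter passes green for imaging:

```python
from dataclasses import dataclass

@dataclass
class Diode:
    role: str   # "phase_left", "imaging", or "phase_right"
    color: str  # wavelength passed by the color filter, e.g. "R", "G", "B"

def make_3x1_cluster(outer_color="R", center_color="G"):
    """Model the three diodes under one multi-pixel-microlens: the outer
    diodes (like 420D, 420F) receive light from directions L(X) and R(X)
    for phase detection, and the center diode (like 420E) images."""
    return [
        Diode("phase_left", outer_color),   # e.g. 420D, direction L(X)
        Diode("imaging", center_color),     # e.g. 420E
        Diode("phase_right", outer_color),  # e.g. 420F, direction R(X)
    ]
```

The constraint noted above — at least two diodes under the microlens correspond to color filters passing the same wavelength — holds here because the two outer diodes share a color.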
As used herein, “over” and “above” refer to the position of a structure (for example, a color filter or lens) such that light incident from a target scene propagates through the structure before it reaches (or is incident on) another structure. To illustrate, the microlens array 405A-C, 410, 405G, and 405H is positioned above the color filter array 415A-415H, which is positioned above the diodes 420A-420H. Accordingly, light from the target scene first passes through the microlens array 405A-C, 410, 405G, and 405H, then the color filter array 415A-415H, and finally is incident on the diodes 420A-420H.
Placement of the microlenses above each photodiode 420A-420H redirects and focuses the light onto the active detector regions. Each microlens may be formed by dropping the lens material in liquid form onto the color filters 415A-415H, on which the lens material solidifies. In other embodiments, wafer-level optics can be used to create a one- or two-dimensional array of microlenses using semiconductor-like techniques, where a first subset of the microlenses in the array include single-diode microlenses and a second subset of the microlenses in the array include multi-pixel-microlenses. As illustrated by single-diode microlenses 405A-C, 405G, and 405H and multi-pixel-microlens 410, each microlens may be a single element with one planar surface and one spherical convex surface to refract the light. Other embodiments of the microlenses may use aspherical surfaces, and some embodiments may use several layers of optical material to achieve their design performance. In some embodiments, the multi-pixel-microlenses may be shaped and designed to optimize light focused onto variously sized pixels. In some implementations, the pixels described herein may be 1 micron by 1 micron. Accordingly, the multi-pixel-microlens may be 1 micron by 3 microns where it is associated with three diodes. However, other dimensions are possible based on the dimensions of the corresponding pixels and diodes under the microlenses.
Color filters 415A-H under the microlens array 405A-C, 410, 405G, and 405H can be positioned according to the Bayer pattern in some embodiments. As described above in connection to
Although the colors are described herein in the context of the Bayer pattern arrangement, such color filters can be arranged in other patterns that are 50% green color filters, 25% blue color filters, and 25% red color filters, or in other patterns that include more green color filters than blue or red color filters (for example, generally twice as many green color filters as blue or red color filters). The color filters can also be positioned according to other color filter patterns in some embodiments, for example color filter patterns designed for use with panchromatic diodes (sensitive to all visible wavelengths) and/or color filters for passing light outside of the visible spectrum.
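The 50/25/25 proportion of the Bayer pattern can be illustrated with a short sketch. This is not from the original disclosure; it uses the common RGGB tiling, which is one of several equivalent Bayer phases:

```python
def bayer_color(row, col):
    """Return the color filter ('R', 'G', or 'B') at a sensor position
    for an RGGB Bayer tiling: even rows alternate R/G, odd rows G/B,
    so half the filters are green, a quarter red, and a quarter blue."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"
```

Counting the filters over any even-sized tile (for example 4×4) confirms the 2:1:1 green-to-red-to-blue ratio described above.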
As depicted in
Specifically,
When the image is in focus, the left rays L(X) and right rays R(X) converge at the plane of the phase detection diodes 520D and 520F. As illustrated in
Incoming light is represented by arrows, and is understood to be incident from a target scene. As used herein, “target scene” refers to any scene or area having objects reflecting or emitting light that is sensed by the image sensor or any other phenomena viewable by the image sensor. Light from the target scene propagates toward diodes 420C-420G and 420I-420M, and is incident on the diodes after first passing through the microlenses and then the color filter array.
To perform phase detection, the imaging system can save two images containing only values received from the phase detection diodes 420D, 420F, 420J, and 420L. For example, left side data may be used to save an image based on light received from direction L(X) and right side data may be used to save an image based on light received from direction R(X). Diode 420D receives light entering multi-pixel-microlens 410 from the left side direction and diode 420F receives light entering multi-pixel-microlens 410 from the right side direction. Similarly, diode 420J receives light entering multi-pixel-microlens 425 from the left side direction (L(X)) and diode 420L receives light entering multi-pixel-microlens 425 from the right side direction (R(X)). Any number of multi-pixel-microlenses, ranging from one to all of the microlenses of the sensor, can be disposed over an image sensor. The number balances the consideration that more multi-pixel-microlenses provide more reliable phase detection autofocus data against the greater amount of computation required for pixel value calculations and the increased likelihood of artifacts in a final image.
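The two saved images can be represented, for illustration, as sparse collections of values keyed by diode position. The helper below is a hypothetical sketch, not part of the described system; the frame layout and coordinate lists are assumptions:

```python
def split_phase_images(frame, left_coords, right_coords):
    """Collect the sparse left/right phase detection samples from a raw
    frame into two partial 'images'. `frame` is a 2-D list of pixel
    values; the coordinate lists identify which diodes receive light
    from direction L(X) and which from direction R(X)."""
    left_image = {(r, c): frame[r][c] for (r, c) in left_coords}
    right_image = {(r, c): frame[r][c] for (r, c) in right_coords}
    return left_image, right_image
```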
Focus can be calculated by applying a cross-correlation function to the data representing the left and right images. If the distance between the two images is narrower than the corresponding distance in an in-focus condition, the autofocus system determines that the focal point is in front of the subject. If the distance is wider than the in-focus reference distance, the system determines that the focal point is behind the subject. For example, when in focus, the two images correlate with no offset or minimal offset (e.g., an offset of less than 1 pixel), and when not in focus the offset is noticeable (e.g., several pixels in the negative or positive direction, depending on whether the focal point is behind or in front of the subject). The autofocus system can compute how much the lens position (or sensor position, in embodiments having a movable sensor) should be moved and in which direction, and provide this information to the lens actuator to move the lens accordingly, providing for fast focusing. The above-described process can be performed by the image signal processor 1020 of
As illustrated, a number of green color filters 705, red color filters 710, and blue color filters 715 are arranged in a Bayer pattern. Each color filter is called out once using reference numbers 705, 710, or 715 and shown using G, R, or B for simplicity of the illustration.
In some embodiments, the determination to dispose the multi-pixel-microlens over pairs of red, blue, or even green color filters may be based on the desired ability of the image signal processing hardware (e.g., as described in connection to
Light representing the target scene 905 is passed through the lens assembly 910 and received by the image sensor, where half-image samples 915 are produced using the multi-pixel-microlenses described above. Because the dimensions of the lens assembly 910 and sensor are larger than the length of the light-wave, the lens assembly 910 can be modeled as a linear low-pass filter with a symmetric impulse response. The impulse response (also referred to as the point spread function) of the lens assembly 910 may be of a rectangular shape with a width parameter proportional to the distance between the sensor and the image plane. The scene is “in focus” when the sensor is in the image plane, that is, in the plane where all rays from a single point at the scene converge into a single point. As shown in
A focus function calculator 920 applies a cross-correlation function to the partial images to determine disparity. The cross-correlation function of the left and right impulse responses of the lens assembly 910 can be approximately symmetric and unimodal. However, due to the nature of the target scene 905, the cross-correlation function as applied to the left and right captured images may have one or more false local maxima. Various approaches can be used to identify the true maximum of the cross-correlation function. The result of the cross-correlation function is provided as feedback to the autofocus control 925, which can be used to drive a lens actuator to move the primary focusing lens assembly 910 to a desired focus position. Other embodiments may use a stationary primary focusing lens assembly and move the image sensor to the desired focus position. Accordingly, in the phase detection autofocus process 900, focusing may be equivalent to searching for the cross-correlation function maximum. This process is fast enough to provide a focus adjustment for each frame at typical frame rates, for example at 30 frames per second, and thus can be used to provide smooth autofocusing for video capture. Some implementations combine phase detection autofocus with contrast-based autofocus techniques, for example, to increase accuracy.
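One simple heuristic for rejecting false local maxima — offered only as an illustrative sketch, not as the approach required by the process 900 — is to prefer, among candidate shifts whose correlation scores are within a margin of the global peak, the shift of smallest magnitude, since spurious peaks caused by repeated scene structure tend to sit far from the zero-shift (in-focus) condition. The function name and the 95% margin are assumptions:

```python
def true_maximum(scores):
    """Given cross-correlation scores keyed by candidate shift, return
    the shift judged to be the true maximum: among shifts scoring within
    95% of the global peak, pick the one of smallest magnitude."""
    peak = max(scores.values())
    candidates = [s for s, v in scores.items() if v >= 0.95 * peak]
    return min(candidates, key=abs)
```

For instance, if a far-off false peak barely outscores a near-zero peak, the near-zero shift is selected; when the global peak stands alone, it is returned unchanged.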
When the primary focusing lens assembly 910 and/or image sensor are in the desired focus position, the image sensor can capture in-focus imaging pixel information and phase detection pixel information. The imaging pixel values and determined phase detection pixel values can be output for performing autofocusing operations or capturing an image. Optionally, the imaging pixel values and determined phase detection pixel values can also be output for performing demosaicking, calculation and interpolation of color values for the phase detection pixels, and other image processing techniques to generate a final image of the target scene.
Image capture device 1000 may be a portable personal computing device such as a mobile phone, digital camera, tablet computer, personal digital assistant, or the like. There are many portable computing devices in which using the phase detection autofocus techniques as described herein would provide advantages. Image capture device 1000 may also be a stationary computing device or any device in which the phase detection autofocus techniques would be advantageous. A plurality of applications may be available to the user on image capture device 1000. These applications may include traditional photographic and video applications as well as data storage applications and network applications.
The image capture device 1000 includes phase detection autofocus camera 1015 for capturing external images. The phase detection autofocus camera 1015 can include an image sensor having multi-pixel-microlenses and color filters arranged according to the embodiments described above, for example, in connection to
The sensor of the phase detection autofocus camera 1015 can have different processing functionalities in different implementations. In one implementation, the sensor may not process any data, and the image signal processor 1020 may perform all needed data processing. In another implementation, the sensor may be capable of extracting phase detection pixels, for example into a separate Mobile Industry Processor Interface (MIPI) channel. An imaging apparatus as described herein may include an image sensor capable of performing all phase detection calculations, or an image sensor capable of performing some or no processing together with an image signal processor 1020 and/or device processor 1050. While not necessary to achieve accurate phase detection and image values, in some embodiments the sensor may optionally be capable of interpolating missing pixel values, for example in a RAW channel or in a normal channel, and may be able to process the whole phase detection calculation internally (on-sensor). For example, the sensor may include analog circuitry for performing sums, subtractions, and/or comparisons of values received from diodes.
The image signal processor 1020 may be configured to perform various processing operations on received image data in order to execute phase detection autofocus and image processing techniques. Image signal processor 1020 may be a general purpose processing unit or a processor specially designed for imaging applications. Examples of image processing operations include demosaicking, white balance, cross talk reduction, cropping, scaling (e.g., to a different resolution), image stitching, image format conversion, color interpolation, color processing, image filtering (e.g., spatial image filtering), lens artifact or defect correction, etc. The image signal processor 1020 can also control image capture parameters such as autofocus and auto-exposure. Image signal processor 1020 may, in some embodiments, comprise a plurality of processors. Image signal processor 1020 may be one or more dedicated image signal processors (ISPs) or a software implementation of a processor. In some embodiments, the image signal processor 1020 may be optional for phase detection operations, as some or all of the phase detection operations can be performed on the image sensor.
As shown, the image signal processor 1020 is connected to a memory 1030 and a working memory 1005. In the illustrated embodiment, the memory 1030 stores capture control module 1035, phase detection autofocus module 1040, and operating system module 1045. The modules of the memory 1030 include instructions that configure the image signal processor 1020 and/or device processor 1050 to perform various image processing and device management tasks. Working memory 1005 may be used by image signal processor 1020 to store a working set of processor instructions contained in the modules of memory 1030. Working memory 1005 may also be used by image signal processor 1020 to store dynamic data created during the operation of image capture device 1000.
As mentioned above, the image signal processor 1020 is configured by several modules stored in the memories. The capture control module 1035 may include instructions that configure the image signal processor 1020 to adjust the focus position of phase detection autofocus camera 1015, for example, in response to instructions generated during a phase detection autofocus technique. Capture control module 1035 may further include instructions that control the overall image capture functions of the image capture device 1000. For example, capture control module 1035 may include instructions that call subroutines to configure the image signal processor 1020 to capture multispectral image data including one or more frames of a target scene using the phase detection autofocus camera 1015. In one embodiment, capture control module 1035 may call the phase detection autofocus module 1040 to calculate lens or sensor movement needed to achieve a desired autofocus position and output the needed movement to the image signal processor 1020. Optionally, in some embodiments, the capture control module 1035 may call the phase detection autofocus module 1040 to interpolate color values for pixels positioned beneath multi-pixel-microlenses.
Accordingly, phase detection autofocus module 1040 can store instructions for executing phase detection autofocus. In some embodiments, the phase detection autofocus module 1040 can also store instructions for calculating color values for phase detection pixels and for image generation based on phase detection pixel values and imaging pixel values.
Operating system module 1045 configures the image signal processor 1020 to manage the working memory 1005 and the processing resources of image capture device 1000. For example, operating system module 1045 may include device drivers to manage hardware resources such as the phase detection autofocus camera 1015. Therefore, in some embodiments, instructions contained in the image processing modules discussed above may not interact with these hardware resources directly, but instead interact through standard subroutines or APIs located in operating system module 1045. Instructions within operating system module 1045 may then interact directly with these hardware components. Operating system module 1045 may further configure the image signal processor 1020 to share information with device processor 1050.
Device processor 1050 may be configured to control the display 1025 to display the captured image, or a preview of the captured image, to a user. The display 1025 may be external to the imaging device 1000 or may be part of the imaging device 1000. The display 1025 may also be configured to provide a view finder displaying a preview image for a user prior to capturing an image, for example to assist the user in framing the target scene within the image sensor field of view, or may be configured to display a captured image stored in memory or recently captured by the user. The display 1025 may comprise an LCD, LED, or OLED screen, and may implement touch sensitive technologies.
Device processor 1050 may write data to storage module 1010, for example data representing captured images and data generated during phase detection and/or pixel value calculation. While storage module 1010 is represented schematically as a traditional disk device, storage module 1010 may be configured as any storage media device. For example, the storage module 1010 may include a disk drive, such as an optical disk drive or magneto-optical disk drive, or a solid state memory such as a FLASH memory, RAM, ROM, and/or EEPROM. The storage module 1010 can also include multiple memory units, and any one of the memory units may be configured to be within the image capture device 1000, or may be external to the image capture device 1000. For example, the storage module 1010 may include a ROM memory containing system program instructions stored within the image capture device 1000. The storage module 1010 may also include memory cards or high speed memories configured to store captured images which may be removable from the camera. The storage module 1010 can also be external to image capture device 1000, and in one example image capture device 1000 may wirelessly transmit data to the storage module 1010, for example over a network connection. In such embodiments, storage module 1010 may be a server or other remote computing device.
Although
Additionally, although
At block 1110, the image capture device receives image data from a plurality of diodes associated with a plurality of color filters arranged in a pattern as described in the various embodiments throughout this application. In some embodiments, the image data may comprise imaging pixel values and phase detection pixel values. For example, the imaging pixel values may be received from a first subset of the plurality of diodes (e.g., diodes 420E and 420K of
At block 1120, the image capture device 1000 may calculate a disparity based on the light collected in the first direction (L(X)) and second direction (R(X)) in block 1110, and may generate focus instructions based on the received image data. In some embodiments, the focus instructions may be based on the calculated disparity between the light collected in the first direction (L(X)) and second direction (R(X)) in block 1110. In some embodiments, the focus instructions may comprise a distance and direction for moving the movable lens assembly to a desired focus position, as described in connection to
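A minimal sketch of converting a measured disparity into such distance-and-direction focus instructions is shown below. This is illustrative only: the function name, the direction labels, and the linear pixels-to-microns calibration factor are hypothetical stand-ins for a device-specific calibration, not values from the original description:

```python
def focus_instructions(disparity, pixels_to_microns=2.5):
    """Map a disparity (in pixels, signed) between the L(X) and R(X)
    samples to a (direction, distance) instruction for the movable
    lens assembly. Zero disparity means the desired focus position
    has been reached."""
    if disparity == 0:
        return ("in_focus", 0.0)
    direction = "toward_sensor" if disparity < 0 else "away_from_sensor"
    return (direction, abs(disparity) * pixels_to_microns)
```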
At block 1130, the image capture device 1000 may construct an image based on the received image data. For example, the image capture device 1000 may construct an image based at least on the plurality of pixel values of block 1110 and the focus instructions of block 1120.
Implementations disclosed herein provide systems, methods and apparatus for mask-less phase detection autofocus. It is noted that these embodiments may be implemented in hardware, software, firmware, or any combination thereof.
In some embodiments, the circuits, processes, and systems discussed above may be utilized in a wireless communication device. The wireless communication device may be a kind of electronic device used to wirelessly communicate with other electronic devices. Examples of wireless communication devices include cellular telephones, smart phones, Personal Digital Assistants (PDAs), e-readers, gaming systems, music players, netbooks, wireless modems, laptop computers, tablet devices, etc.
The wireless communication device may include one or more image sensors, one or more image signal processors, and a memory including instructions or modules for carrying out the processes discussed above. The device may also include a processor that loads instructions and/or data from memory, one or more communication interfaces, one or more input devices, one or more output devices such as a display device, and a power source/interface. The wireless communication device may additionally include a transmitter and a receiver, which may be jointly referred to as a transceiver. The transceiver may be coupled to one or more antennas for transmitting and/or receiving wireless signals.
The wireless communication device may wirelessly connect to another electronic device (e.g., base station). A wireless communication device may alternatively be referred to as a mobile device, a mobile station, a subscriber station, a user equipment (UE), a remote station, an access terminal, a mobile terminal, a terminal, a user terminal, a subscriber unit, etc. Examples of wireless communication devices include laptop or desktop computers, cellular phones, smart phones, wireless modems, e-readers, tablet devices, gaming systems, etc. Wireless communication devices may operate in accordance with one or more industry standards such as the 3rd Generation Partnership Project (3GPP). Thus, the general term “wireless communication device” may include wireless communication devices described with varying nomenclatures according to industry standards (e.g., access terminal, user equipment (UE), remote terminal, etc.).
The functions described herein may be stored as one or more instructions on a processor-readable or computer-readable medium. The term “computer-readable medium” refers to any available medium that can be accessed by a computer or processor. By way of example, and not limitation, such a medium may comprise RAM, ROM, EEPROM, flash memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray® disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. It should be noted that a computer-readable medium may be tangible and non-transitory. The term “computer-program product” refers to a computing device or processor in combination with code or instructions (e.g., a “program”) that may be executed, processed, or computed by the computing device or processor. As used herein, the term “code” may refer to software, instructions, code, or data that is/are executable by a computing device or processor.
The methods disclosed herein comprise one or more steps or actions for achieving the described method. The method steps and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of steps or actions is required for proper operation of the method that is being described, the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.
It should be noted that the terms “couple,” “coupling,” “coupled” or other variations of the word couple as used herein may indicate either an indirect connection or a direct connection. For example, if a first component is “coupled” to a second component, the first component may be either indirectly connected to the second component or directly connected to the second component. As used herein, the term “plurality” denotes two or more. For example, a plurality of components indicates two or more components.
The term “determining” encompasses a wide variety of actions and, therefore, “determining” can include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” can include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like. Also, “determining” can include resolving, selecting, choosing, establishing, and the like.
The phrase “based on” does not mean “based only on,” unless expressly specified otherwise. In other words, the phrase “based on” describes both “based only on” and “based at least on.”
In the foregoing description, specific details are given to provide a thorough understanding of the examples. However, it is noted that the examples may be practiced without these specific details. For example, electrical components/devices may be shown in block diagrams in order not to obscure the examples in unnecessary detail. In other instances, such components, other structures, and techniques may be shown in detail to further explain the examples. Thus, the present invention is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.