The described embodiments relate generally to a device having a camera or other image capture device. More particularly, the described embodiments relate to detection of image flare using asymmetric pixels.
Cameras and other image capture devices often include a set of lenses that focus light or other electromagnetic radiation on an image sensor. The image sensor, in some examples, may be a complementary metal-oxide-semiconductor (CMOS) image sensor or a charge-coupled device (CCD) image sensor. The set of lenses may be moved, and the relationship between one or more lenses and the image sensor changed, to focus an image on the image sensor.
Some image sensors include a set of asymmetric pixels (ASPs), which are sometimes referred to as phase detection auto-focus (PDAF) pixels. Asymmetric pixels include elements, such as shields or lenses, that restrict the directions or incident angles of light that impinge on a photodetector of the pixel. Asymmetric pixels having out-of-phase directional asymmetries can therefore provide focus information to an image processor, enabling the image processor to quickly determine whether an image is in-focus or out-of-focus, and enabling the image processor to provide feedback to an auto-focus mechanism that adjusts a relationship between the image sensor and one or more lenses that focus an image on the image sensor. When an image is in-focus, asymmetric pixels have gains (ratios of asymmetric pixel values to symmetric pixel values) that are almost independent of scene content. When an image is out-of-focus, asymmetric pixels having different directional asymmetries have gains that differ in magnitude and polarity, thereby indicating an out-of-focus condition.
Image flare results from stray light reflecting within a camera module and arriving at an image sensor at angles that are far from intended incident angles (e.g., outside the intended ranges of incident angles that pixels are expected to receive light). Due to angular mismatches, image flare causes asymmetric pixel gains to deviate far outside the gains expected under both in-focus and out-of-focus conditions. Asymmetric pixels can therefore be used to detect image flare.
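By way of a non-limiting illustration, the following Python sketch shows how asymmetric pixel gains might be computed and screened for the in-focus, out-of-focus, and flare conditions just described; the function names and numeric thresholds are assumptions made for the sketch, not values prescribed by the described embodiments.

    import numpy as np

    def asp_gain(asp_value, neighbor_symmetric_values):
        # Gain of an asymmetric pixel: the ratio of its value to the mean
        # of nearby symmetric pixel values. When an image is in focus,
        # this ratio is almost independent of scene content.
        return asp_value / max(float(np.mean(neighbor_symmetric_values)), 1e-6)

    def classify_gains(left_gain, right_gain, in_focus_tol=0.05, flare_limit=0.5):
        # Illustrative thresholds only. In focus: gains of out-of-phase
        # asymmetric pixels agree. Out of focus: gains differ moderately.
        # Flare: gains deviate far outside both expected ranges.
        difference = left_gain - right_gain
        if abs(difference) <= in_focus_tol:
            return "in-focus"
        if abs(difference) <= flare_limit:
            return "out-of-focus"
        return "possible image flare"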
Embodiments of the systems, devices, methods, and apparatus described in the present disclosure are directed to identifying regions of a pixel array affected by image flare, detecting a directionality (or directionalities) of image flare, and processing images to remove, enhance, or reshape image flare using directional processing techniques. Also described are systems, devices, methods, and apparatus directed to distinguishing image flare from noise.
In a first aspect, the present disclosure describes a method for processing an image. The method may include obtaining a set of pixel values captured from a pixel array during an image capture frame. The set of pixel values may include pixel values for a set of asymmetric pixels having different directional asymmetries (where asymmetric pixels having different directional asymmetries are asymmetric pixels that are configured or oriented within a pixel array to restrict light arriving at different subsets of incident angles). The method may also include detecting, using the pixel values for at least the asymmetric pixels and the different directional asymmetries of the asymmetric pixels, a directionality of image flare; and processing an image defined by the set of pixel values in accordance with the detected directionality of image flare.
In another aspect, the present disclosure describes an electronic device. The electronic device may include a pixel array and a processor. The pixel array may include asymmetric pixels having different directional asymmetries. The processor may be configured to obtain a set of pixel values captured from the pixel array during an image capture frame. The set of pixel values may include pixel values for the asymmetric pixels. The processor may also be configured to evaluate the pixel values for at least the asymmetric pixels to identify a region of the pixel array affected by image flare; and to detect, using the pixel values for at least the asymmetric pixels and the different directional asymmetries of the asymmetric pixels, a directionality of image flare.
In still another aspect of the disclosure, an imaging system is described. The imaging system may include a pixel array and a processor. The pixel array may include a first set of asymmetric pixels distributed across the pixel array and having a first directional asymmetry, and a second set of asymmetric pixels distributed across the pixel array and having a second directional asymmetry. The second directional asymmetry may be different from the first directional asymmetry. The processor may be configured to determine that a first set of pixel values obtained from the first set of asymmetric pixels, during an image capture frame, satisfies an image flare threshold. The processor may also be configured to evaluate a second set of pixel values obtained from the second set of asymmetric pixels, during the image capture frame, to determine whether the first set of pixel values is indicative of image flare or noise in an image captured by the pixel array during the image capture frame.
In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the drawings and by study of the following description.
The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
The use of cross-hatching or shading in the accompanying figures is generally provided to clarify the boundaries between adjacent elements and also to facilitate legibility of the figures. Accordingly, neither the presence nor the absence of cross-hatching or shading conveys or indicates any preference or requirement for particular materials, material properties, element proportions, element dimensions, commonalities of similarly illustrated elements, or any other characteristic, attribute, or property for any element illustrated in the accompanying figures.
Additionally, it should be understood that the proportions and dimensions (either relative or absolute) of the various features and elements (and collections and groupings thereof) and the boundaries, separations, and positional relationships presented therebetween, are provided in the accompanying figures merely to facilitate an understanding of the various embodiments described herein and, accordingly, may not necessarily be presented or illustrated to scale, and are not intended to indicate any preference or requirement for an illustrated embodiment to the exclusion of embodiments described with reference thereto.
Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following description is not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
When light rays coming from a bright source of light (such as the sun or artificial light) directly reach the front element of a camera lens, they can reflect and bounce off different lens elements within the camera, the camera's diaphragm, and even the camera's image sensor. These light rays can potentially degrade image quality and create unwanted objects in images. Better known as “image flare,” “lens flare,” or just “flare,” the effects can impact images in a number of ways. For example, image flare can drastically reduce image contrast by introducing haze in different colors. Image flare can also introduce circular or semi-circular halos or “ghosts” in an image, and even introduce odd-shaped semi-transparent objects of various color intensities. It is useful to detect the region(s) of an image affected by image flare, such that further image processing can be performed, and image quality can be improved or intentionally enhanced.
Described herein are techniques that use pixel values (or other information) obtained from asymmetric pixels having different directional asymmetries to identify regions of a pixel array affected by image flare, and/or to detect a directionality or directionalities of image flare. In some cases, image flare may affect more than one region of an image, or have more than one cause that affects a single region, or have more than one cause that affects different regions in the same or different ways. As examples, a directionality of image flare may be an indicator of the direction to a cause of image flare, or an indicator of an average incident angle (or other metric) of the photons that are causing image flare within a region of a pixel array (including the pixel array as a whole). In some cases, a directionality (or directionalities) of image flare may be representative of, or used to determine (e.g., calculate), the direction(s) to one or more causes of image flare.
In some cases, an image may be processed to remove, enhance, or reshape image flare using directional processing techniques (e.g., techniques that are tuned to process image flare using one or more directionalities of image flare and/or directions to causes of image flare, and in some cases, one or more indicators of the region(s) within a pixel array that are affected by image flare). Also described are techniques for distinguishing image flare from noise using pixel values for asymmetric pixels. These and other techniques are described with reference to the accompanying figures.
Directional terminology, such as “top”, “bottom”, “upper”, “lower”, “front”, “back”, “over”, “under”, “left”, “right”, etc. is used with reference to the orientation of some of the components in some of the figures described below. Because components in various embodiments can be positioned in a number of different orientations, directional terminology is used for purposes of illustration only and is in no way limiting. The directional terminology is intended to be construed broadly, and therefore should not be interpreted to preclude components being oriented in different ways.
Referring now to the accompanying figures, an example electronic device 100 is shown.
In the illustrated embodiment, the electronic device 100 is implemented as a smartphone. Other embodiments, however, are not limited to this construction. Other types of computing or electronic devices which may include a camera or image sensor include, but are not limited to, a netbook or laptop computer, a tablet computer, a digital camera, a scanner, a video recorder, a watch (or other wearable electronic device), a drone, a vehicle navigation system, and so on.
As shown in the accompanying figures, the electronic device 100 may include a first camera 102, a second camera 104, an input/output (I/O) device 108, and a display 110.
The I/O device 108 may be implemented with any type of input or output device. By way of example only, the I/O device 108 may be a switch, a button, a capacitive sensor, or other input mechanism. The I/O device 108 allows a user to interact with the electronic device 100. For example, the I/O device 108 may be a button or switch to alter the volume, return to a home screen, and the like. The electronic device may include one or more input devices or output devices, and each device may have a single input, output, or I/O function, or some devices may have multiple I/O functions. In some embodiments, the I/O device 108 may be incorporated into the display 110, the first camera 102, or an audio I/O device.
The display 110 may be operably or communicatively connected to the electronic device 100. The display 110 may be implemented as any suitable type of display, such as a high resolution display or an active matrix color liquid crystal display (LCD). The display 110 may provide a visual output for the electronic device 100 or function to receive user inputs to the electronic device 100 (or provide user outputs from the electronic device 100). For example, the display 110 may display images captured by the first camera 102 or the second camera 104. In some embodiments, the display 110 may include a multi-touch capacitive sensing touchscreen that is operable to detect one or more user inputs.
The first and second cameras 102, 104 may be any type of cameras, and may be sensitive to visible light, infrared light, or other types of electromagnetic radiation. The cameras 102, 104 may be the same type of camera or different types of cameras. In some examples, one or both of the first or second cameras 102, 104 may include a lens or lens stack (i.e., a stack of multiple lenses) that is positioned to direct light onto an image sensor of the camera. The camera and/or a processor, controller, circuit and/or mechanical actuator associated with the camera, may be configured to actuate movement between the lens and image sensor to provide one or more of an autofocus, image stabilization, or zoom operation. Movement between the lens and image sensor may be achieved by moving a lens, the image sensor, both a lens and the image sensor, and/or a combination of lenses in a lens stack.
The pixel array 206 may be in communication with the column select circuit 208 through one or more column select lines 214, and with the row select circuit 210 through one or more row select lines 216. The row select circuit 210 may selectively activate a particular pixel or group of pixels, such as all of the pixels 212 in a particular row. The column select circuit 208 may selectively receive the data output from a selected pixel or group of pixels (e.g., all of the pixels 212 in a particular row).
The row select circuit 210 and/or column select circuit 208 may be in communication with the image processor 204. The image processor 204 may process data received from the pixels 212 and provide the data to another processor (e.g., a system processor) and/or other components of a device (e.g., other components of the electronic device 100). In some embodiments, the image processor 204 may be incorporated into the system processor. Alternatively, the image processor 204 and/or some or all of the image processing operations may be integrated directly into the image sensor 202. The image processor 204 may receive pixel values from some or all of the pixels 212, and may process some or all of the pixel values to generate an image.
The pixel 300 may include a photodetector 304 or other photosensitive element formed on or in a substrate layer. The photodetector 304 may be laterally surrounded by an electromagnetic radiation isolator 306 (e.g., implant isolation or physical trench isolation). The electromagnetic radiation isolator 306 reduces the probability that electromagnetic radiation received by the photodetector 304 is also received by adjacent (or neighbor) photodetectors in a pixel array. An optional optical element 308 (e.g., a coating) may be disposed on an outward-facing surface of the photodetector 304, and a symmetric shield 310 (e.g., a metal shield) may be disposed on the optical element 308 over an outward-facing edge of the electromagnetic radiation isolator 306. The symmetric shield may be positioned over or around a perimeter of the photodetector 304, and is symmetric about the longitudinal axis 302. The shield 310 may reduce the probability that electromagnetic radiation received through the microlens 314 is received by one or more adjacent pixels.
An optional color filter 312 may be disposed on or over the photodetector 304 (or on the optical element 308), or on or over at least a central portion of the photodetector 304. In some examples, the color filter 312 may be a red, blue, or green filter forming part of a Bayer pattern color filter, and other pixels within a pixel array may be associated with the same or other color filters. In some examples, the color filter 312 may form part of another type of color filter for a pixel array (e.g., a CMYK color filter), or the color filter 312 may form part of a panchromatic or other type of filter for a pixel array. A lens (e.g., a microlens 314) may be positioned or disposed on a side of the color filter 312 opposite the photodetector 304.
During an image capture frame of an image sensor including the pixel 300, the pixel 300 may be activated and deactivated one or more times to start and stop one or more integration periods in which charge is collected by the photodetector 304. Electromagnetic radiation 316 (e.g., light) received by the photodetector 304 may initially impinge on the microlens 314. The microlens 314 may tend to focus or collimate the electromagnetic radiation 316. Electromagnetic radiation 316 that passes through the microlens 314 may be filtered by the color filter 312 such that one or a range of wavelengths of electromagnetic radiation pass through the color filter 312, and other wavelengths of electromagnetic radiation are reflected, redirected, or absorbed by the color filter 312. Photons of electromagnetic radiation 316 received by the photodetector 304 may be converted into electron-hole pairs, and the electrons (for electron-collection devices) or holes (for hole-collection devices) may be converted to an analog voltage using an in-pixel amplifier. The analog voltage may then be converted to a digital signal by an analog-to-digital converter (ADC).
Charge integrated by the pixel 300 may be read out of a pixel array using row and column select circuitry, as described above with reference to the image sensor 202.
Image flare may affect the charge integrated by the photodetector 304. For example, strong rays of light received from a source of image flare (e.g., the sun or a bright lamp) may not only reflect at unintended angles within a camera module as a whole, but may reflect in a manner that causes unintended light to be received by the photodetector 304 during integration, for example, by reflecting between the microlens 314 and the optical element 308, within the microlens 314 or optical element 308, or at various interfaces between the microlens 314, color filter 312, shield 310, optical element 308, and photodetector 304.
The pixel 400 may include a photodetector 404 or other photosensitive element formed on or in a substrate layer. The photodetector 404 may be laterally surrounded by an electromagnetic radiation isolator 406 (e.g., implant isolation or physical trench isolation). The electromagnetic radiation isolator 406 reduces the probability that electromagnetic radiation received by the photodetector 404 is also received by adjacent (or neighbor) photodetectors in a pixel array. An optional optical element 408 (e.g., a coating) may be disposed on an outward-facing surface of the photodetector 404, and an asymmetric shield 410 (e.g., a metal shield) may be disposed on the optical element 408 over an outward-facing edge of the electromagnetic radiation isolator 406. The asymmetric shield 410 may be positioned over or around a perimeter of the photodetector 404, and over more of one side of the photodetector 404 than the other. For example, in the illustrated cross-section, the asymmetric shield 410 may extend over one lateral side of the photodetector 404 (e.g., its left or right side) while leaving the opposite side unblocked.
An optional color filter 412 may be disposed on or over the photodetector 404 (or on the optical element 408), or on or over at least the portion of the photodetector 404 that is not blocked by the shield 410. In some examples, the color filter 412 may be a red, blue, or green filter forming part of a Bayer pattern color filter, and other pixels within a pixel array may be associated with the same or other color filters. In some examples, the color filter 412 may form part of another type of color filter for a pixel array (e.g., a CMYK color filter), or the color filter 412 may form part of a panchromatic or other type of filter for a pixel array. A lens (e.g., a microlens 414) may be positioned or disposed on a side of the color filter 412 opposite the photodetector 404.
In some embodiments, the pixel 400 may be configured similarly to the pixel 300 described above, apart from its asymmetric shield 410.
Image flare may affect the charge integrated by the photodetector 404. For example, strong rays of light received from a source of image flare (e.g., the sun or a bright lamp) may not only reflect at unintended angles within a camera module as a whole, but may reflect in a manner that causes unintended light to be received by the photodetector 404 during integration, for example, by reflecting between the microlens 414 and the optical element 408, within the microlens 414 or optical element 408, or at various interfaces between the microlens 414, color filter 412, shield 410, optical element 408, and photodetector 404.
Asymmetric pixels 510 having different directional asymmetries may be surrounded by symmetric neighbor pixels 508, as shown. Alternatively, asymmetric pixels 510 having different directional asymmetries may be positioned adjacent one another within the pixel array 500 (with a pair of asymmetric pixels 510a, 510b having different directional asymmetries surrounded by symmetric neighbor pixels 508). By way of example, the pixel array 500 includes two asymmetric pixels 510, having different directional asymmetries, per block of eight rows and eight columns. Asymmetric pixels 510 may also be interspersed with symmetric pixels 508 in other ways, or at varying densities. In alternative embodiments, all of the asymmetric pixels 510 may have the same directional asymmetry.
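As a concrete illustration of the interspersal just described (two asymmetric pixels per block of eight rows and eight columns), the following sketch enumerates asymmetric pixel coordinates; the in-block offsets and shield orientations are arbitrary assumptions, not a layout required by the described embodiments.

    def asp_locations(rows, cols, block=8, offsets=((3, 3), (3, 4))):
        # Yield (row, column, orientation) for two asymmetric pixels in
        # every block x block tile of the pixel array. These offsets place
        # the pair adjacent to one another; other placements are possible.
        for r0 in range(0, rows, block):
            for c0 in range(0, cols, block):
                (r1, c1), (r2, c2) = offsets
                yield (r0 + r1, c0 + c1, "left-shielded")
                yield (r0 + r2, c0 + c2, "right-shielded")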
Given their different angular responses, asymmetric pixels having different directional asymmetries respond differently to light arriving at a given incident angle.
When left-shielded asymmetric pixels and right-shielded asymmetric pixels are interspersed in a pixel array, differences in their signals can be used to identify an out-of-phase or out-of-focus condition. That is, when an image is in-focus, the signals or pixel values obtained from left-shielded asymmetric pixels and right-shielded asymmetric pixels should be substantially the same (both in magnitude and polarity). When they differ, the camera including the asymmetric pixels is out-of-focus, and a camera lens position may be adjusted.
As previously discussed, the directional asymmetries of asymmetric pixels, such as left and right-shielded asymmetric pixels, may also be used to determine when an image is affected by image flare. In addition to detecting the presence of image flare, different directional asymmetries of different asymmetric pixels may be used to detect a directionality (or directionalities) of image flare, and in some cases determine the direction(s) to one or more causes of image flare. This is because image flare often has a directional component. The directionality of image flare is due to the position of its cause (e.g., the sun, a lamp, or a reflective surface) within or outside an imaged scene. This directionality can be determined when a pixel array includes asymmetric pixels having different directional asymmetries, because asymmetric pixels having different directional asymmetries are affected differently by image flare. Different asymmetric pixels having the same directional asymmetry may also be affected differently.
In some embodiments, the components of the imaging system 700 may be included in a camera (e.g., the components may be solely or primarily used to provide a camera function for a device). In other embodiments, some of the components of the imaging system 700 may be used to support a camera function and other functions of a device. For example, a pixel array 706 of the imaging system 700 may be part of a camera, and part or all of an image processor 704 may be provided by a multipurpose processing system.
As shown, the imaging system 700 may include an image sensor 702 and an image processor 704. Although an example distribution of components between the image sensor 702 and image processor 704 is shown, the components may be distributed differently in other embodiments.
The pixel array 706 may include asymmetric pixels having different directional asymmetries. For example, the pixel array 706 may include a first set of asymmetric pixels distributed across the pixel array 706 and having a first directional asymmetry, and a second set of asymmetric pixels distributed across the pixel array 706 and having a second directional asymmetry different from the first directional asymmetry. In some embodiments, the second directional asymmetry may be out-of-phase with the first directional asymmetry (e.g., as described with reference to
In some examples, the pixel array 706 may include symmetric pixels in addition to asymmetric pixels. However, in other embodiments, all of the pixels in the pixel array 706 may be asymmetric pixels, as described below.
In some embodiments, the image processor 704 may include an asymmetric pixel processor 708 and an image post-processor 710. In some embodiments, these processors may be incorporated into a single processor, or part or all of the image processor 704 may be incorporated into a multipurpose processor. In some embodiments, one or more of the processors 704, 708, 710 may be subdivided into multiple components.
The image processor 704 may provide one or more signals 712 to the image sensor 702 to control the image sensor 702. For example, the image processor 704 may provide signals 712 to the image sensor 702 to turn the image sensor 702 on or off, or to determine when or how long the pixels of the image sensor 702 integrate charge during an image capture frame. In some embodiments, the image processor 704 may receive feedback signals 714 from the image sensor 702.
During an image capture frame of the image sensor 702, a set of pixel values 716 may be obtained for the pixel array 706. Each pixel value may represent the amount of charge integrated by a pixel during the image capture frame. The pixel values 716 may be transferred from the image sensor 702 to the image processor 704 as a set of digital values. Alternatively, the pixel values 716 may be analog values that are converted to digital values by the image processor 704. The pixel values 716 captured during an image capture frame may include pixel values for all of the pixels in the pixel array 706, or pixel values for only some of the pixels in the pixel array 706.
The asymmetric pixel processor 708 may receive, from the image sensor 702 or a memory that stores pixel values obtained by the image sensor 702, the pixel values captured for asymmetric pixels. Alternatively, the asymmetric pixel processor 708 may receive pixel values captured for just some of the asymmetric pixels, or pixel values captured for both symmetric and asymmetric pixels. The asymmetric pixel processor 708 may evaluate the pixel values for at least the asymmetric pixels to determine whether the pixel values of an image captured during an image capture frame are affected by image flare. In some cases, the asymmetric pixel processor 708 may also determine whether a set of pixel values tends to indicate the presence of image flare or noise. If an image is affected by image flare, the asymmetric pixel processor 708 may identify a region (or regions) of the pixel array 706 (or image) affected by image flare, a directionality (or directionalities) of image flare, the direction(s) to one or more causes of image flare, or an amount of image flare. The asymmetric pixel processor 708 may also classify an image in response to pixel values obtained for asymmetric or other pixels. In some cases, the asymmetric pixel processor 708 may identify more than one region of the pixel array 706 affected by image flare, and in some cases, separate characteristics of the image flare affecting each region. The asymmetric pixel processor 708 may also evaluate the pixel values for at least the asymmetric pixels to determine a focus of an image captured by the pixel array 706, or may use the asymmetric pixel values for other purposes. In some embodiments, the asymmetric pixel processor 708 may evaluate the asymmetric pixel values by comparing the asymmetric pixel values to other asymmetric pixel values, or by comparing the asymmetric pixel values to symmetric pixel values.
The asymmetric pixel processor 708 may provide, to the image post-processor, image flare determinations (Flare? 718), noise determinations (Noise? 720), indications of regions affected by image flare (Region(s) 722), indications of directionalities of image flare or directions to causes of image flare (Direction(s) 724), indications of amounts of flare (Amount(s) 726), an image classification (Classification 728), or image focus information (Focused? 730). The asymmetric pixel processor 708 may also adjust or combine the pixel values of asymmetric pixels and provide adjusted or combined pixel values (Adj. Pixel Values 732) to the image post-processor 710.
The image post-processor 710 may receive, from the image sensor 702 or a memory that stores pixel values obtained by the image sensor 702, symmetric pixel values (if any) or asymmetric pixel values. The image post-processor 710 may also receive adjusted or combined pixel values from the asymmetric pixel processor 708. In some embodiments, all of the pixels in the pixel array may be asymmetric pixels, and all of the pixel values received by the image post-processor 710 may be pixel values for asymmetric pixels, or adjusted or combined pixel values received from the asymmetric pixel processor 708. A scenario in which all of the pixel values received by the image post-processor 710 may be pixel values for asymmetric pixels, or adjusted or combined pixel values, is a scenario in which the pixel array 706 is configured such that every pixel is an asymmetric pixel (e.g., a pixel having multiple photodetectors under a shared microlens, as described below).
The image post-processor 710 may use the information provided by the asymmetric pixel processor 708 to process a set of pixel values (e.g., pixel values representing an image) received from the image sensor 702 or the asymmetric pixel processor 708. The processing may variously include: adjusting pixel values to remove image flare, adjusting pixel values to enhance image flare (e.g., increasing the size of an area affected by image flare, or extending the effects of flare along a ray of light), or adjusting pixel values to reshape an area affected by image flare. In some cases, the image may be processed in accordance with a detected directionality (or directionalities) of image flare, and/or in accordance with a determined direction(s) to one or more causes of image flare. In some cases, the processing in accordance with the detected directionality (or directionalities) of image flare and/or determined direction(s) to one or more causes of image flare may be limited to or concentrated on an identified region of the image affected by image flare. For purposes of this description, “concentrating” processing on an identified region of an image means that a higher percentage of pixel values are adjusted within the identified region than in a surrounding region, and/or individual pixel values are adjusted more (e.g., increased or decreased more) within the identified region than in a surrounding region, and/or an average pixel value is adjusted more (e.g., increased or decreased more) within the identified region than in a surrounding region.
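A minimal sketch of "concentrating" a directional adjustment on an identified region follows, assuming a single-channel image, a boolean region mask, and a flare directionality expressed as an image-plane angle; the ramp-shaped flare model and the strength parameters are illustrative assumptions, not the described embodiments' method.

    import numpy as np

    def remove_flare_concentrated(image, region_mask, flare_angle,
                                  strength=0.3, falloff=0.05):
        # Subtract an illustrative directional flare estimate, adjusting
        # pixel values more inside the identified region than outside it.
        rows, cols = image.shape
        yy, xx = np.mgrid[0:rows, 0:cols]
        # A simple intensity ramp aligned with the detected directionality
        # stands in for a real flare model.
        ramp = xx * np.cos(flare_angle) + yy * np.sin(flare_angle)
        ramp = (ramp - ramp.min()) / max(ramp.max() - ramp.min(), 1e-6)
        weight = np.where(region_mask, strength, falloff)
        return np.clip(image - weight * ramp * image, 0.0, None)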
The image flare identifier 802 may be used to determine whether pixel values in a pixel array indicate the presence of image flare in an image. In some embodiments, the image flare identifier 802 may determine whether a pixel value obtained from an asymmetric pixel satisfies an image flare threshold. The image flare threshold may be an absolute threshold or a relative threshold. When the image flare threshold is a relative threshold, the image flare threshold may be: a threshold based on one or more pixel values obtained for one or more symmetric pixels; a threshold based on one or more pixel values obtained for one or more asymmetric pixels; a threshold determined as an amount or percentage increase over a dark current or other baseline value for the asymmetric pixel; or a threshold determined as an amount or percentage increase over a dark current average or other average baseline value for a set of symmetric or asymmetric pixels.
An image flare threshold based on one or more pixel values obtained for one or more symmetric pixels may be, in some examples, a threshold determined as an amount or percentage increase over an average pixel value of one or more symmetric pixels that are neighbor pixels to the asymmetric pixel for which the image flare threshold is determined. Alternatively, an image flare threshold based on one or more pixel values obtained for one or more symmetric pixels may be a threshold determined as an amount or percentage increase over an average of all pixel values for symmetric pixels.
An image flare threshold based on one or more pixel values obtained for one or more asymmetric pixels may be, in some examples, a threshold determined as an amount or percentage increase over an average pixel value of one or more asymmetric pixels that are neighbor pixels to the asymmetric pixel for which the image flare threshold is determined. The neighbor pixels may in some cases include only neighbor pixels having a same directional asymmetry as the asymmetric pixel for which the image flare threshold is determined. Alternatively, an image flare threshold based on one or more pixel values obtained for one or more asymmetric pixels may be a threshold determined as an amount or percentage increase over an average of all pixel values for all asymmetric pixels (or an average of all pixel values for asymmetric pixels having a same directional asymmetry as the asymmetric pixel for which the image flare threshold is determined).
A pixel value for each asymmetric pixel may be compared to a single image flare threshold, or different pixel values may be compared to different image flare thresholds. Image flare may be identified when a single pixel value satisfies an image flare threshold, or when a predetermined number or percentage of pixel values satisfies an image flare threshold (which image flare threshold may be the same threshold for each pixel value, or include different thresholds for different pixel values). Image flare may also or alternatively be identified using other metrics.
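One of the relative-threshold tests described above might be sketched as follows; the 40% increase and the qualifying fraction are assumptions chosen for the sketch.

    import numpy as np

    def indicates_image_flare(asp_values, neighbor_means,
                              rel_increase=0.4, min_fraction=0.02):
        # An asymmetric pixel value satisfies the image flare threshold
        # when it exceeds the mean of its symmetric neighbor pixel values
        # by rel_increase. Flare is identified when at least min_fraction
        # of the asymmetric pixel values satisfy their thresholds.
        values = np.asarray(asp_values, dtype=float)
        thresholds = (1.0 + rel_increase) * np.asarray(neighbor_means, dtype=float)
        return float(np.mean(values > thresholds)) >= min_fraction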
The image flare locator 804 may be used to evaluate pixel values, for at least asymmetric pixels, to identify a region of a pixel array affected by image flare. In some cases, the image flare locator may include the image flare identifier 802. In some embodiments, the image flare locator 804 may identify a plurality of regions (e.g., predetermined regions) of a pixel array (or image), evaluate the groups of pixel values obtained from asymmetric pixels within each region, and identify a region as affected by image flare when one or a predetermined number or percentage of the pixel values in the region satisfy an image flare threshold.
In some embodiments, the image flare locator 804 may dynamically identify a region of a pixel array (or image) as a region affected by image flare. A region may be dynamically identified, for example, by moving a fixed size window across a pixel array and identifying a region defined by the moving window as a region affected by image flare when one or a predetermined number or percentage of the pixel values within the region satisfy an image flare threshold. Alternatively, a size of a region affected by image flare may be dynamically determined by generating a boundary, for the region, such that the region includes a predetermined number or percentage of pixel values satisfying an image flare threshold. The image flare locator may perform its operations for each of a plurality of overlapping or non-overlapping regions of a pixel array.
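One way to implement the moving-window region identification is sketched below, assuming a boolean map marking the asymmetric pixel sites whose values satisfied an image flare threshold and a mask of all asymmetric pixel sites; the window size, stride, and fraction are assumptions.

    import numpy as np

    def flare_regions(exceeds, asp_mask, window=32, stride=16, min_fraction=0.5):
        # Slide a fixed-size window across the pixel array and report
        # each window position in which at least min_fraction of the
        # asymmetric pixel sites satisfied an image flare threshold.
        regions = []
        rows, cols = exceeds.shape
        for r in range(0, rows - window + 1, stride):
            for c in range(0, cols - window + 1, stride):
                hits = exceeds[r:r + window, c:c + window]
                sites = asp_mask[r:r + window, c:c + window]
                n = int(sites.sum())
                if n and float(hits[sites].sum()) / n >= min_fraction:
                    regions.append((r, c, window, window))
        return regions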
The image flare quantifier 806 may be used to quantify image flare. In some embodiments, image flare may be quantified by quantifying an amount or severity of image flare. Image flare may be quantified per asymmetric pixel, per region, or for an image. Severity may be used to rank the presence of image flare in a less granular way than by amount. For example, severity may be ranked as good or bad, or on a scale of 1-10 or 1-100. In some cases, image flare may be quantified with respect to a baseline or average pixel value.
The flare directionality detector 808 may be used to detect a directionality (or directionalities) of image flare, and in some cases to determine a direction (or directions) to one or more causes of image flare. For example, when pixel values of asymmetric pixels having a particular directional asymmetry satisfy an image flare threshold and indicate the presence of image flare, the directional asymmetry of those pixels may indicate a directionality (or directionalities) of image flare and/or a direction (or directions) to one or more causes of image flare. A right-shielded asymmetric pixel, for example, may block high incident angle light arriving from the right of a scene, and may therefore indicate that a cause of image flare is generally in front of or to the left of the asymmetric pixel. An amount or severity of image flare indicated by a pixel value obtained from a right-shielded asymmetric pixel may indicate, to some degree, whether a cause of image flare is more in front of or to the left of the pixel, and how much the cause of image flare is to the left of the pixel. In some embodiments, a directionality of image flare or direction to a cause of image flare may be determined as an angle with respect to an image plane, or an angle with respect to an outer surface of a photodetector or other pixel element.
In some embodiments, the flare directionality detector 808 may use the pixel values and directional asymmetries of asymmetric pixels, and in some cases the pixel values of symmetric pixels, to determine the directionality (or directionalities) of image flare and/or direction(s) to cause(s) of image flare. For example, when a plurality of asymmetric pixels having a particular directional asymmetry indicate the presence of image flare, but their associated pixel values vary, the variance in pixel values may be used to determine a directionality (or directionalities) of image flare and/or direction(s) to cause(s) of image flare. In some embodiments, a variance in pixel values of asymmetric pixels having a first asymmetric direction (e.g., right-shielded asymmetric pixels) and a variance in pixel values of asymmetric pixels having a second asymmetric direction (e.g., top-shielded asymmetric pixels) may be used to determine a directionality (or directionalities) and/or direction(s) to cause(s) of image flare. In some embodiments, variances in pixel values of four types of asymmetric pixels (e.g., right-shielded, left-shielded, top-shielded, and bottom-shielded) may be used to determine a direction to a cause of image flare.
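Using four shield orientations, a directionality could be estimated as an image-plane angle, for example as sketched below; the sign conventions (which orientation maps to which side of the scene) are assumptions that would be fixed by calibration in practice.

    import math

    def flare_direction(left_sh, right_sh, top_sh, bottom_sh):
        # Inputs are mean pixel values for left-, right-, top-, and
        # bottom-shielded asymmetric pixels. Opposing orientations are
        # differenced to form horizontal and vertical imbalances, whose
        # angle approximates the flare directionality.
        horizontal = right_sh - left_sh
        vertical = bottom_sh - top_sh
        angle = math.atan2(vertical, horizontal)    # radians in the image plane
        magnitude = math.hypot(horizontal, vertical)
        return angle, magnitude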
The flare directionality detector 808 may, in some cases, pattern match 1) a distribution of pixel values and directional asymmetries of asymmetric pixels (and in some cases, pixel values of symmetric pixels) to 2) a set of patterns representing exposures of a pixel array to causes of image flare positioned at different directions with respect to the pixel array. The flare directionality detector 808 may identify a directionality (or directionalities) of image flare and/or the direction(s) to cause(s) of image flare by identifying a pattern that best matches the distribution of pixel values and directional asymmetries.
In some embodiments, the flare directionality detector 808 may also or alternatively determine whether a cause of image flare is within or outside a scene imaged by a pixel array during an image capture frame. This determination may be made using the pixel values and directional asymmetries of asymmetric pixels, and in some cases may be made by additionally using the pixel values of symmetric pixels. In some embodiments, a determination of whether a cause of image flare is within or outside an imaged scene may be made as a byproduct of the pattern matching described in the preceding paragraph. For example, some of the patterns to which a distribution of pixel values and directional asymmetries is matched may be patterns in which a cause of image flare is within the imaged scene, and other patterns may be patterns in which a cause of image flare is outside the imaged scene.
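The pattern matching described in the two preceding paragraphs might be implemented with a normalized correlation over a library of characterization patterns, as in the sketch below; the pattern library (templates annotated with a direction and an in-scene/out-of-scene label, e.g., from characterization of the camera module) is an assumed input.

    import numpy as np

    def best_matching_pattern(observation, patterns):
        # observation: array of asymmetric pixel values (and optionally
        # symmetric pixel values) arranged consistently with the stored
        # templates. patterns: iterable of dicts with keys 'template',
        # 'direction', and 'in_scene'.
        obs = (observation - observation.mean()) / (observation.std() + 1e-6)
        best, best_score = None, -np.inf
        for pattern in patterns:
            tpl = pattern["template"]
            tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-6)
            score = float(np.mean(obs * tpl))   # normalized correlation
            if score > best_score:
                best, best_score = pattern, score
        return best, best_score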
The image classifier 810 may be used to classify an image captured during an image capture frame. The image classifier 810 may classify an image based on a detected directionality (or directionalities) or determined direction(s) to cause(s) of image flare, region(s) of a pixel array that are affected by image flare, or quantified amount(s) of image flare per asymmetric pixel. In some embodiments, an image may be classified by pattern matching one or more of the preceding items to a set of patterns representing the effects of different types of image flare (e.g., image flare caused by direct exposure to sunlight, image flare from sunlight filtered through a set of tree leaves, image flare caused by a flash, image flare caused by one or more types of artificial light, and so on). Image classifications may include, for example, classifications based on the type of image flare to which an image has been exposed, or classifications based on the severity of image flare to which an image has been exposed.
The asymmetric pixel value adjuster 812 may be used to adjust or combine asymmetric pixel values. For example, asymmetric pixels may generally have lower gains than symmetric pixels, because asymmetric pixels only receive light from unblocked directions (see, e.g., the shielded pixel 400 described above). The asymmetric pixel value adjuster 812 may therefore scale an asymmetric pixel value, or combine the pixel values of asymmetric pixels having complementary directional asymmetries, so that adjusted or combined pixel values are comparable to the pixel values of symmetric pixels.
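A sketch of such adjustments follows, assuming a flat-field-calibrated gain per asymmetric pixel position and, for adjacent pairs, a simple sum of complementary values; both calibrations are assumptions of the sketch.

    def adjusted_asp_value(asp_value, calibrated_gain):
        # Scale a lone asymmetric pixel value up to the level a symmetric
        # pixel would have produced under the same illumination.
        return asp_value / max(calibrated_gain, 1e-6)

    def combined_pair_value(left_value, right_value, pair_gain=1.0):
        # Combine a left-/right-shielded pair into one symmetric-equivalent
        # value; pair_gain corrects any residual shortfall.
        return (left_value + right_value) * pair_gain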
The outputs of the asymmetric pixel processor 800 may be provided to an image post-processor, such as the image post-processor 710 described above.
In some cases, the image post-processor 710 may directionally mitigate or directionally accentuate an effect of image flare. For example, the image post-processor 710 may apply image processing techniques that are directionally tuned to process image flare.
In some embodiments, the image post-processor 710 may determine whether to keep or discard an image in response to a classification of the image. For example, an image that is too severely affected by image flare may be discarded.
In some embodiments, the image post-processor 710 may determine which of a set of images to keep or discard, or whether to combine an image affected by image flare with other images that are not affected by image flare (or not as severely affected). For example, the asymmetric pixel processor 800 may obtain pixel values, evaluate the pixel values, and classify an image (and in some cases determine a directionality (or directionalities) of image flare and/or the direction(s) to cause(s) of image flare, a region (or regions) of a pixel array affected by image flare, or a quantity of image flare) for each image in a set of images captured during a set of image capture frames. The image post-processor 710 may rank the images using their classifications, and select from the ranked images one or more images to keep, one or more images to discard, or one or more images to flag. For example, in a set of images associated with a “live photo” (i.e., a short video clip that plays before ending on a still image selected from the frames of the video clip, thereby associating the still frame with subtle movement or a “live” feeling), a frame or image that is less affected (or not affected) by image flare may be flagged as a thumbnail image for representing a saved file including the images. As another example, when a high-speed video shot through the rustling leaves of a tree is down-sampled, video frames that are more affected by image flare caused by the sun peeking through the leaves may be dropped. As another example, a final image may be formed from the fusion of a plurality of individual frames, and differences in image flare between frames may be used to adjust the selection of frames for fusion and/or the fusion weight of a given image.
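The ranking and fusion-weight adjustments described above might resemble the following sketch; the exponential weighting is an illustrative choice, and the per-frame severity scores are assumed to come from the image flare quantifier.

    import math

    def select_frames(frames, flare_scores, keep=1):
        # Rank frames by flare severity (lower is better) and keep the
        # least-affected frames, e.g., to choose a thumbnail for a set of
        # "live photo" frames or to drop frames when down-sampling video.
        ranked = sorted(range(len(frames)), key=lambda i: flare_scores[i])
        return [frames[i] for i in ranked[:keep]]

    def fusion_weights(flare_scores, sharpness=4.0):
        # Convert severities into normalized fusion weights that
        # de-emphasize flare-affected frames during frame fusion.
        raw = [math.exp(-sharpness * s) for s in flare_scores]
        total = sum(raw)
        return [w / total for w in raw]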
The asymmetric pixel processor 820 assumes the availability of pixel values obtained from asymmetric pixels having a first directional asymmetry, and pixel values obtained from asymmetric pixels having a second directional asymmetry. The second directional asymmetry is different from the first directional asymmetry, and in some cases is out-of-phase with or orthogonal to the first directional asymmetry. In some embodiments, the components of the asymmetric pixel processor 820 may separately operate on pixel values of the asymmetric pixels disposed in each region of a pixel array.
The image flare probability determiner 822 may be used to determine whether pixel values obtained from asymmetric pixels indicate the presence of image flare in an image. In some embodiments, the image flare probability determiner 822 may determine whether a pixel value obtained from an asymmetric pixel satisfies an image flare threshold. The image flare threshold may be an absolute threshold or a relative threshold. When the image flare threshold is a relative threshold, the image flare threshold may be: a threshold based on one or more pixel values obtained for one or more symmetric pixels; a threshold based on one or more pixel values obtained for one or more asymmetric pixels; a threshold determined as an amount or percentage increase over a dark current or other baseline value for the asymmetric pixel; or a threshold determined as an amount or percentage increase over a dark current average or other average baseline value for a set of symmetric or asymmetric pixels.
An image flare threshold based on one or more pixel values obtained for one or more symmetric pixels may be, in some examples, a threshold determined as an amount or percentage increase over an average pixel value of one or more symmetric pixels that are neighbor pixels to the asymmetric pixel for which the image flare threshold is determined. Alternatively, an image flare threshold based on one or more pixel values obtained for one or more symmetric pixels may be a threshold determined as an amount or percentage increase over an average of all pixel values for symmetric pixels.
An image flare threshold based on one or more pixel values obtained for one or more asymmetric pixels may be, in some examples, a threshold determined as an amount or percentage increase over an average pixel value of one or more asymmetric pixels that are neighbor pixels to the asymmetric pixel for which the image flare threshold is determined. The neighbor pixels may in some cases include only neighbor pixels having a same directional asymmetry as the asymmetric pixel for which the image flare threshold is determined. Alternatively, an image flare threshold based on one or more pixel values obtained for one or more asymmetric pixels may be a threshold determined as an amount or percentage increase over an average of all pixel values for all asymmetric pixels (or an average of all pixel values for asymmetric pixels having a same directional asymmetry as the asymmetric pixel for which the image flare threshold is determined).
A pixel value for each asymmetric pixel may be compared to a single image flare threshold, or different pixel values may be compared to different image flare thresholds. The image flare probability determiner 822 may determine that pixel values obtained from the asymmetric pixels having the first directional asymmetry indicate the presence of image flare when a single pixel value satisfies an image flare threshold, or when a predetermined number or percentage of pixel values satisfies an image flare threshold (which image flare threshold may be the same threshold for each pixel value, or include different thresholds for different pixel values). Image flare may also or alternatively be identified using other metrics.
The image flare locator 824 may be used to evaluate pixel values, for at least asymmetric pixels, to identify a region of a pixel array affected by image flare. In some cases, the image flare locator may include the image flare probability determiner 822. In some embodiments, the image flare locator 824 may identify a plurality of regions (e.g., predetermined regions) of a pixel array (or image), evaluate the groups of pixel values obtained from asymmetric pixels within each region, and identify a region as probably affected by image flare when one or a predetermined number or percentage of the pixel values in the region satisfy an image flare threshold.
In some embodiments, the image flare locator 824 may dynamically identify a region of a pixel array (or image) as a region affected by image flare. A region may be dynamically identified, for example, by moving a fixed size window across a pixel array and identifying a region defined by the moving window as a region affected by image flare when one or a predetermined number or percentage of the pixel values within the region satisfy an image flare threshold. Alternatively, a size of a region affected by image flare may be dynamically determined by generating a boundary, for the region, such that the region includes a predetermined number or percentage of pixel values satisfying an image flare threshold. The image flare locator 824 may perform its operations for each of a plurality of overlapping or non-overlapping regions of a pixel array.
The image flare quantifier 826 may be used to quantify image flare. In some embodiments, image flare may be quantified by quantifying an amount or severity of image flare. Image flare may be quantified per asymmetric pixel, per region, or for an image. Severity may be used to rank the presence of image flare in a less granular way than by amount. For example, severity may be ranked as good or bad, or on a scale of 1-10 or 1-100. In some cases, image flare may be quantified with respect to a baseline or average pixel value.
The image flare-noise discriminator 828 may be used to distinguish image flare from noise. For example, when pixel values obtained from asymmetric pixels having different directional asymmetries tend to indicate the presence of image flare (i.e., when it appears that image flare is associated with multiple directionalities), it is possible that the asymmetric pixels are affected by noise rather than image flare, given that image flare tends to be directional. In some embodiments, the image flare-noise discriminator 828 may distinguish image flare from noise by evaluating a set of pixel values obtained from asymmetric pixels having a directional asymmetry that differs from the directional asymmetry of the pixels that indicate image flare is probable. For example, the image flare probability determiner 822 may determine, for an image capture frame, that a first set of pixel values obtained from a first set of asymmetric pixels having a first directional asymmetry indicates image flare is probable. In response (or in parallel), the image flare-noise discriminator 828 may evaluate a second set of pixel values obtained from a second set of asymmetric pixels having a second directional asymmetry. The second set of pixel values may be evaluated to determine whether the first set of pixel values is really indicative of image flare, or indicative of noise. In some cases, the second set of pixel values may be evaluated by determining whether the pixel values satisfy a same image flare threshold satisfied by the pixel values of the first set of pixel values. When they do, or when a predetermined number or percentage of the pixel values in the second set satisfy the image flare threshold, noise rather than image flare may be affecting a pixel array from which the pixel values are obtained.
In other cases, the second set of pixel values may be evaluated by determining whether the pixel values satisfy a different image flare threshold (or set of thresholds) than an image flare threshold (or set of thresholds) satisfied by the pixel values of the first set of pixel values. The different image flare thresholds may be referred to as image flare confirmation thresholds. When the pixel values in the second set satisfy the image flare confirmation threshold(s), or when a predetermined number or percentage of the pixel values in the second set satisfy the image flare confirmation threshold(s), noise rather than image flare may be affecting a pixel array from which the pixel values are obtained. In some cases, the image flare confirmation threshold(s) may be lower than the image flare threshold(s) satisfied by the pixels in the first set.
The second set of pixel values may also or alternatively be evaluated by comparing the pixel values in the second set to pixel values in the first set (or to an average of the pixel values in the first set). When both sets of pixel values are within a predetermined range, an image may be affected by noise rather than image flare.
In some embodiments, the first and second sets of pixel values considered by the image flare-noise discriminator 828 may be pixel values of asymmetric pixels disposed in a particular region (of multiple regions) of a pixel array. In other embodiments, the first and second sets of pixel values considered by the image flare-noise discriminator 828 may be pixel values of asymmetric pixels disposed across the entirety of a pixel array.
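The flare-versus-noise evaluation described above might be sketched as follows; the confirmation threshold and qualifying fraction are assumptions of the sketch.

    import numpy as np

    def flare_or_noise(second_set_values, confirmation_threshold, min_fraction=0.3):
        # Called after a first set of asymmetric pixel values (one
        # directional asymmetry) has satisfied an image flare threshold.
        # If pixel values for a different directional asymmetry also
        # satisfy the (possibly lower) confirmation threshold, the
        # stimulus is non-directional and is treated as noise; otherwise
        # the first set is taken to indicate image flare.
        hits = float(np.mean(np.asarray(second_set_values) > confirmation_threshold))
        return "noise" if hits >= min_fraction else "image flare"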
The asymmetric pixel value adjuster 830 may be used to adjust or combine asymmetric pixel values. For example, asymmetric pixels may generally have lower gains than symmetric pixels, because asymmetric pixels only receive light from unblocked directions (see, e.g., the shielded pixel 400 described above). The asymmetric pixel value adjuster 830 may therefore scale an asymmetric pixel value, or combine the pixel values of asymmetric pixels having complementary directional asymmetries, so that adjusted or combined pixel values are comparable to the pixel values of symmetric pixels.
The outputs of the asymmetric pixel processor 820 may be provided to an image post-processor, such as the image post-processor 710 described above.
The operation(s) at block 902 may include obtaining a set of pixel values captured from a pixel array during an image capture frame. The set of pixel values may include pixel values for a set of asymmetric pixels having different directional asymmetries. The operation(s) at block 902 may be performed, for example, by the image processor or asymmetric pixel processor described above.
The operation(s) at block 904 may optionally include evaluating the pixel values for at least the set of asymmetric pixels to identify a region (or regions) of the pixel array affected by image flare. The operation(s) at block 904 are optional and may be performed, for example, by the image processor or asymmetric pixel processor described above.
The operation(s) at block 906 may include detecting, using the pixel values for at least the asymmetric pixels and the different directional asymmetries of the asymmetric pixels, a directionality (or directionalities) of image flare. The operation(s) at block 906 may also or alternatively include determining a direction (or directions) to one or more causes of image flare. The operation(s) at block 906 may be performed, for example, by the image processor or asymmetric pixel processor described above.
The operation(s) at block 908 may include processing an image defined by the set of pixel values in accordance with the detected directionality (or directionalities) of image flare and/or the determined direction(s) to cause(s) of image flare. In some embodiments, the processing of the image in accordance with the detected directionality (or directionalities) of image flare and/or determined direction(s) to cause(s) of image flare may be concentrated on the region(s) identified at block 904. The operation(s) at block 908 may be performed, for example, by the image processor or image post-processor described above.
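Tying blocks 902-908 together, a pipeline might resemble the sketch below, which reuses the flare_direction and remove_flare_concentrated sketches shown earlier; the bookkeeping that gathers asymmetric pixel values by shield orientation is assumed.

    import numpy as np

    def process_frame(image, asp_values_by_orientation, region_mask):
        # Block 902: asp_values_by_orientation maps 'left', 'right',
        # 'top', and 'bottom' to arrays of asymmetric pixel values
        # captured during the image capture frame.
        means = {k: float(np.mean(v)) for k, v in asp_values_by_orientation.items()}
        # Block 906: detect a directionality of image flare.
        angle, _ = flare_direction(means["left"], means["right"],
                                   means["top"], means["bottom"])
        # Block 908: process the image in accordance with the detected
        # directionality, concentrated on the region identified at
        # block 904 (region_mask).
        return remove_flare_concentrated(image, region_mask, angle)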
The operation(s) at block 912 may include determining that a first set of pixel values obtained from a first set of asymmetric pixels, during an image capture frame, indicates image flare is probable. During the image capture frame, the first set of pixel values and a second set of pixel values may be captured. Additional pixel values may also be captured during the image capture frame. The second set of pixel values may be obtained from a second set of asymmetric pixels. The second set of asymmetric pixels may have a different directional asymmetry than the first set of asymmetric pixels (e.g., an out-of-phase or orthogonal directional asymmetry). The operation(s) at block 912 may be performed, for example, by the image processor or asymmetric pixel processor described above.
The operation(s) at block 914 may include evaluating the second set of pixel values to confirm whether the first set of pixel values is indicative of image flare, or to determine that an image including the pixel values is affected by noise. The operation(s) at block 914 may be performed, for example, by the image processor or asymmetric pixel processor described above.
In some embodiments, the method 900 or 910 may include providing a user with a cue (e.g., a graphical cue) that indicates how the user may reposition a camera to mitigate or increase the effects of image flare while capturing an image.
In some embodiments, an asymmetric pixel may include more than one photodetector disposed under a shared microlens, as described below for the pixel 1100.
The pixel 1100 may include first and second photodetectors 1102, 1104 or other photosensitive elements formed on or in a substrate layer. Each photodetector 1102, 1104 may be laterally surrounded by an electromagnetic radiation isolator 1110 (e.g., implant isolation or physical trench isolation).
An optional optical element 1112 (e.g., a coating) may be disposed on outward-facing surfaces of the photodetectors 1102, 1104, and a symmetric shield 1114 (e.g., a metal shield) may be disposed on the optical element 1112 over outward-facing edges of the electromagnetic radiation isolators 1110. The symmetric shield 1114 may be positioned over or around a perimeter of the pair of photodetectors 1102, 1104. The shield 1114 may reduce the probability that electromagnetic radiation received through the microlens 1106 is received by one or more adjacent pixels.
An optional color filter 1116 may be disposed on or over the photodetectors 1102, 1104, or different color filters may be disposed on the different photodetectors 1102, 1104. In some examples, the color filter 1116 may be a red, blue, or green filter forming part of a Bayer pattern color filter, and other pixels within a pixel array may be associated with the same or other color filters. In some examples, the color filter 1116 may form part of another type of color filter for a pixel array (e.g., a CMYK color filter), or the color filter 1116 may form part of a panchromatic or other type of filter for a pixel array. A lens (e.g., a microlens 1106) may be positioned or disposed on a side of the color filter 1116 opposite the photodetectors 1102, 1104.
The pixel 1100 is asymmetric because the curvature of the microlens 1106 bends light 1118 in different directions depending on the portion of the microlens 1106 on which the light is incident and the angle at which it arrives, such that the two photodetectors 1102, 1104 receive different light based on their different positions under the microlens 1106. Charge integrated by each of the photodetectors 1102, 1104 may be separately read out of the pixel 1100, with one or each of the charges being used as a pixel value of an asymmetric pixel. In some embodiments, the charges may be summed after readout, or alternatively summed during readout, to provide a combined pixel value for the pixel 1100.
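To make the readout options concrete, here is a hypothetical sketch of how the two photodetector charges might be combined; the function and variable names are illustrative, and the gains shown follow the convention that each sub-pixel value is referenced to the combined (symmetric) value:

```python
# Illustrative dual-photodetector readout for a pixel like pixel 1100.
def read_pixel(charge_left: float, charge_right: float):
    """Form a combined pixel value and per-photodetector asymmetric gains."""
    combined = charge_left + charge_right            # summed (symmetric) pixel value
    gain_left = charge_left / max(combined, 1e-6)    # roughly 0.5 each when in focus
    gain_right = charge_right / max(combined, 1e-6)
    return combined, gain_left, gain_right
```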
As shown in
Similarly to the pixel described with reference to
The processor 1504 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the processor 1504 may be a microprocessor, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), or combinations of such devices. As described herein, the term “processor” is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, or other suitably configured computing element or elements.
It should be noted that the components of the electronic device 1500 may be controlled by multiple processors. For example, select components of the electronic device 1500 may be controlled by a first processor and other components of the electronic device 1500 may be controlled by a second processor, where the first and second processors may or may not be in communication with each other. In some embodiments, the processor 1504 may include the image processor described with reference to
The power source 1506 may be implemented with any device capable of providing energy to the electronic device 1500. For example, the power source 1506 may be one or more batteries or rechargeable batteries. Additionally or alternatively, the power source 1506 may be a power connector or power cord that connects the electronic device 1500 to another power source, such as a wall outlet.
The memory 1508 may store electronic data that may be used by the electronic device 1500. For example, the memory 1508 may store electronic data or content such as, for example, audio and video files, documents and applications, device settings and user preferences, timing signals, control signals, data structures or databases, image data, or focus settings. The memory 1508 may be configured as any type of memory. By way of example only, the memory 1508 may be implemented as random access memory, read-only memory, Flash memory, removable memory, other types of storage elements, or combinations of such devices.
The electronic device 1500 may also include one or more sensors 1510 positioned substantially anywhere on the electronic device 1500. The sensor(s) 1510 may be configured to sense substantially any type of characteristic, such as, but not limited to, pressure, light, touch, heat, movement, relative motion, biometric data, and so on. For example, the sensor(s) 1510 may include a heat sensor, a position sensor, a light or optical sensor, an accelerometer, a pressure transducer, a gyroscope, a magnetometer, a health monitoring sensor, and so on. Additionally, the one or more sensors 1510 may utilize any suitable sensing technology, including, but not limited to, capacitive, ultrasonic, resistive, optical, piezoelectric, and thermal sensing technology.
The I/O mechanism 1512 may transmit data to and/or receive data from a user or another electronic device. An I/O device may include a display, a touch sensing input surface such as a track pad, one or more buttons (e.g., a graphical user interface “home” button), one or more cameras (e.g., the cameras described with reference to
The foregoing description, for purposes of explanation, uses specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art, after reading this description, that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art, after reading this description, that many modifications and variations are possible in view of the above teachings.
This application is a nonprovisional of and claims the benefit under 35 U.S.C. 119(e) of U.S. Provisional Patent Application No. 62/700,103, filed Jul. 18, 2018, and entitled “Camera Flare Determination Using Asymmetric Pixels,” which is incorporated herein by reference as if fully disclosed herein.