The present invention relates generally to image sensors, and more particularly to detecting flicker in an image sensor.
Image sensors can capture an image using a rolling shutter mode or a global shutter mode. In the global shutter mode, the pixels in the image sensor capture the image at a single point in time. Each pixel in the image sensor begins and ends an integration or exposure period at the same time. In the rolling shutter mode, different lines of pixels (e.g., rows) have different exposure times as the signals are read out of the pixels line by line. Each line of pixels will start and end its exposure slightly offset in time from the other lines of pixels in the image sensor. Thus, different lines of the image are captured at slightly different points in time.
Light sources such as incandescent bulbs and fluorescent tubes typically flicker at twice the frequency of their power supply, usually 100 Hz or 120 Hz. This flicker can be captured by some or all of the pixels in an image sensor when an image is captured. The flicker can produce horizontal bands in the captured image, depending on the state of the light source at the point in time that each line in the captured image was exposed. A person viewing the captured image may detect the horizontal band or bands in the image.
In some situations, an image sensor can avoid capturing flicker in an image by using exposure times that are integer multiples of the assumed flicker period. For example, to avoid flicker caused by a 60 Hz power supply, the image sensor may choose integration times that are multiples of 1/120 of a second. In this way, each line of the image sees an integer number of flicker periods. But if the power supply is a 50 Hz power supply instead of a 60 Hz power supply, a horizontal band caused by the 50 Hz flicker will move predictably through the captured images. An image sensor or imaging system may be able to detect the 50 Hz flicker and responsively adjust the exposure time to avoid it.
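As a non-limiting illustration of the exposure-time rule above, the following sketch (not part of the described embodiments; the function name and values are assumptions) rounds a requested exposure down to a whole number of flicker periods:

```python
# Illustrative sketch: choose an exposure time that is an integer multiple of
# the assumed flicker period, so every line of a rolling-shutter image
# integrates over whole flicker cycles.

def flicker_safe_exposure(requested_s, mains_hz):
    """Round a requested exposure down to a multiple of the flicker period.

    Light flickers at twice the mains frequency, so the flicker period is
    1 / (2 * mains_hz) seconds (1/120 s for 60 Hz mains, 1/100 s for 50 Hz).
    """
    period = 1.0 / (2.0 * mains_hz)
    multiples = int(requested_s / period)
    # Fall back to one full flicker period if the request is shorter than that.
    return max(multiples, 1) * period

print(flicker_safe_exposure(0.0200, 60))  # 2 * (1/120) s, about 0.01667 s
```

A 20 ms request under 60 Hz mains is trimmed to 2 flicker periods (1/60 s), so each line integrates the same total light regardless of when its exposure starts.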
An image sensor, however, is not always able to detect flicker when it uses exposure times that are multiples of an assumed flicker period. For example, when an image sensor operating at 60 frames per second uses exposure times that are multiples of the 50 Hz flicker period, flicker produced by a 60 Hz power supply may not be detectable because the horizontal band will not move in the image. Instead, the horizontal band will appear at the same location in each captured image, making it difficult to distinguish the band from the content of the scene being imaged.
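The aliasing described above can be made concrete with a short sketch (illustrative only, not from the described embodiments): the band's phase advance per frame is the flicker frequency divided by the frame rate, modulo one. A nonzero drift makes the band crawl through successive frames, where it can be detected; zero drift pins the band in place.

```python
# Hedged sketch of flicker-band aliasing: how far the horizontal band shifts
# (as a fraction of one flicker cycle) between consecutive frames.

def band_drift_per_frame(flicker_hz, frame_rate_hz):
    """Fraction of one flicker cycle the band shifts between frames."""
    return (flicker_hz / frame_rate_hz) % 1.0

# 50 Hz mains -> 100 Hz flicker, captured at 60 frames per second: the band
# drifts, so it can be detected by comparing frames.
assert band_drift_per_frame(100, 60) > 0.0

# 60 Hz mains -> 120 Hz flicker at 60 frames per second: zero drift, so the
# band is stationary and blends into the scene content.
assert band_drift_per_frame(120, 60) == 0.0
```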
In one aspect, an image sensor can include an imaging area and one or more flicker detection regions. The imaging area includes one or more pixels, as does each flicker detection region. The pixel(s) in at least one flicker detection region are sampled multiple times while at least one pixel in the imaging area captures an image.
In another aspect, a processor can be operatively connected to the one or more flicker detection regions. The processor may be adapted to enable the pixel(s) in at least one flicker detection region to be sampled multiple times while at least one pixel in the imaging area captures an image. The processor can receive the samples and analyze them to detect flicker in the image being captured. If flicker is detected, the processor can compensate for the flicker. As one example, the processor can adjust an exposure time for capturing the image.
In yet another aspect, a method for capturing an image in an image sensor can include capturing the image with at least one pixel in the imaging area while substantially simultaneously accumulating charge in the pixel(s) in at least one flicker detection region. Multiple readout operations can be performed to obtain samples from the one or more pixels in the at least one flicker detection region while the at least one pixel in the imaging area captures the image. The samples can be analyzed to detect flicker in the image, and if flicker is detected, the flicker can be compensated for.
In another aspect, a method for enabling a flicker detection mode in an image sensor can include determining whether flicker is to be detected using at least one flicker detection region when an image is to be captured by at least one pixel in the imaging area. If flicker is to be detected, the charge in the one or more pixels in the at least one flicker detection region is read out multiple times while the image is being captured by the at least one pixel in the imaging area. If flicker is not to be detected, the one or more pixels in the at least one flicker detection region capture the image along with the at least one pixel in the imaging area.
Embodiments of the invention are better understood with reference to the following drawings. The elements of the drawings are not necessarily to scale relative to each other. Identical reference numerals have been used, where possible, to designate identical features that are common to the figures.
Embodiments described herein provide an image sensor that includes an imaging area and one or more flicker detection regions. The imaging area includes pixels that capture one or more images. Each flicker detection region includes pixels that are sampled multiple times while an image is being captured. The samples can be analyzed by a processor to detect flicker in the scene being imaged.
Referring now to
In the illustrated embodiment, the electronic device 100 is implemented as a smart telephone. Other embodiments, however, are not limited to this construction. Other types of computing or electronic devices can include one or more cameras, including, but not limited to, a netbook or laptop computer, a tablet computing device, a wearable communications device, a wearable health assistant, a digital camera, a printer, a scanner, a video recorder, and a copier.
As shown in
The I/O member 108 can be implemented with any type of input or output member. By way of example only, the I/O member 108 can be a switch, a button, a capacitive sensor, or other input mechanism. The I/O member 108 allows a user to interact with the electronic device 100. For example, the I/O member 108 may be a button or switch to alter the volume, return to a home screen, and the like. The electronic device can include one or more input members or output members, and each member can have a single I/O function or multiple I/O functions.
The display 110 can be operably or communicatively connected to the electronic device 100. The display 110 can be implemented with any type of suitable display, such as a retina display or an active matrix color liquid crystal display. The display 110 provides a visual output for the electronic device 100. In some embodiments, the display 110 can function to receive user inputs to the electronic device. For example, the display 110 can be a multi-touch capacitive sensing touchscreen that can detect one or more user inputs.
The electronic device 100 can also include a number of internal components.
The one or more processors 200 can control some or all of the operations of the electronic device 100. The processor(s) 200 can communicate, either directly or indirectly, with substantially all of the components of the electronic device 100. For example, one or more system buses 210 or other communication mechanisms can provide communication between the processor(s) 200, the memory 202, the I/O interfaces 204, the cameras 102, 104, the display 110, the I/O member 108, and/or the sensors 208. The processor(s) 200 can be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the one or more processors 200 can be a microprocessor, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), or combinations of multiple such devices. As described herein, the term “processor” is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, or other suitably configured computing element or elements.
The one or more storage or memory devices 202 can store electronic data that can be used by the electronic device 100. For example, the memory 202 can store electrical data or content such as, for example, audio files, document files, timing signals, and image data. The memory 202 can be configured as any type of memory. By way of example only, the memory 202 can be implemented as random access memory, read-only memory, Flash memory, removable memory, or other types of storage elements, in any combination.
The one or more I/O interfaces 204 can receive data from a user or one or more other electronic devices. For example, an I/O interface 204 can receive input from the I/O member 108 shown in
The one or more power sources 206 can be implemented with any device capable of providing energy to the electronic device 100. For example, the power source 206 can be a battery or a connection cable that connects the electronic device 100 to another power source such as a wall outlet.
The one or more sensors 208 can be implemented with any type of sensor. Example sensors include, but are not limited to, audio sensors (e.g., microphones), light sensors (e.g., ambient light sensors), gyroscopes, and accelerometers. The sensors 208 can be used to provide data to the processor 200, which may be used to enhance or vary functions of the electronic device.
It should be noted that
As described with reference to
The camera 102 includes an imaging stage 300 that is in optical communication with an image sensor 302. The imaging stage 300 is operably connected to the enclosure 106 and positioned in front of the image sensor 302. The imaging stage 300 can include conventional elements such as a lens, a filter, an iris, and a shutter. The imaging stage 300 directs, focuses or transmits light 304 within its field of view onto the image sensor 302. The image sensor 302 captures one or more images of a subject scene by converting the incident light into electrical signals.
The image sensor 302 is supported by a support structure 306. The support structure 306 can be a semiconductor-based material including, but not limited to, silicon, silicon-on-insulator (SOI) technology, silicon-on-sapphire (SOS) technology, doped and undoped semiconductors, epitaxial layers formed on a semiconductor substrate, well regions or buried layers formed in a semiconductor substrate, and other semiconductor structures.
Various elements of imaging stage 300 or image sensor 302 can be controlled by timing signals or other signals supplied from a processor or memory, such as processor 200 in
Referring now to
The imaging area 404 may be in communication with a column select 408 through one or more column select lines 410 and a row select 412 through one or more row select lines 414. The row select 412 selects a particular pixel 406 or group of pixels, such as all of the pixels 406 in a certain row. The column select 408 selectively receives the data output from the selected pixels 406 or groups of pixels (e.g., all of the pixels in a particular column).
The row select 412 and/or the column select 408 may be in communication with the image processor 402. The image processor 402 can process data from the pixels 406 and provide that data to the processor 200 and/or other components of the electronic device 100. It should be noted that in some embodiments, the image processor 402 can be incorporated into the processor 200 or separate therefrom.
One or more flicker detection regions 416 may be positioned adjacent to the imaging area 404. In the illustrated embodiment, one flicker detection region is positioned adjacent to the top of the imaging area 404 and another flicker detection region is located adjacent to the bottom of the imaging area 404. Other embodiments can position the one or more flicker detection regions at different locations. Additionally, each flicker detection region 416 is shown as a single horizontal line of pixels. Other embodiments can include any given number of pixels arranged in any configuration.
Each flicker detection region includes pixels 418. In some embodiments, the pixels 418 are configured in the same manner as the pixels 406 in the imaging area 404. Other embodiments can configure the pixels 418 differently. Two illustrative pixel configurations are described in more detail later in conjunction with
The pixels 418 in a flicker detection region may be optionally divided into sub-regions. As one example, the pixels 418 can be divided horizontally into two or four sub-regions 420, 421, 422, 423. Dividing the pixels into sub-regions 420, 421, 422, 423 allows the top flicker detection region to be separate from the bottom flicker detection region regardless of the orientation of the image sensor (e.g., portrait or landscape). For example, if the top of the scene being imaged is flickering differently from the bottom of the scene, the top of the scene can be observed using two sub-regions, such as sub-regions 421 and 422, or 422 and 423, or 420 and 423, or 420 and 421 depending on the orientation of the image sensor.
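The sub-region pairing described above can be sketched as a lookup (a hypothetical mapping, not from the described embodiments; the orientation names, and which pair corresponds to which orientation, are assumptions based on reading sub-regions 420 and 421 as the halves of the top detection row and 422 and 423 as the halves of the bottom row):

```python
# Hypothetical sketch: which pair of flicker-detection sub-regions currently
# observes the top of the scene, for each device orientation.  Region numbers
# follow the description (420-423); orientation labels are assumptions.

TOP_OF_SCENE = {
    "landscape":          (420, 421),  # top detection row, both halves
    "landscape_inverted": (422, 423),  # bottom row becomes the scene's top
    "portrait":           (421, 422),  # adjacent halves of the two rows
    "portrait_inverted":  (420, 423),  # the other two halves
}

def top_scene_subregions(orientation):
    """Return the pair of sub-regions that sees the top of the scene."""
    return TOP_OF_SCENE[orientation]

assert top_scene_subregions("landscape") == (420, 421)
```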
In some embodiments, the one or more flicker detection regions 416 include pixels 418 that are physically separate and distinct from the pixels 406 in the imaging area 404. In other embodiments, the flicker detection region(s) include pixels that are implemented within the imaging area 404 but are logically separated from the pixels in the imaging area. For example, the pixels in the top row and in the bottom row of an imaging area can be used for flicker detection instead of image capture. And in yet another embodiment, the pixels in the one or more flicker detection regions can have multiple modes, with one mode being flicker detection and another mode image capture. Mode enablement is described in more detail in conjunction with
Referring now to
One terminal of the transfer transistor 504 is connected to the photodetector 502 while the other terminal is connected to the sense region 506. One terminal of the reset transistor 508 and one terminal of the readout transistor 510 are connected to a supply voltage (Vdd) 514. The other terminal of the reset transistor 508 is connected to the sense region 506, while the other terminal of the readout transistor 510 is connected to a terminal of the row select transistor 512. The other terminal of the row select transistor 512 is connected to a column select line 410.
By way of example only, in one embodiment the photodetector 502 is implemented as a photodiode (PD) or pinned photodiode, the sense region 506 as a floating diffusion (FD), and the readout transistor 510 as a source follower transistor (SF). The photodetector 502 can be an electron-based photodiode or a hole-based photodiode. It should be noted that the term photodetector as used herein is meant to encompass substantially any type of photon or light detecting component, such as a photodiode, pinned photodiode, photogate, or other photon sensitive region. Additionally, the term sense region as used herein is meant to encompass substantially any type of charge storing or charge converting region.
Those skilled in the art will recognize that the pixel 500 can be implemented with additional or different components in other embodiments. For example, a row select transistor can be omitted and a pulsed power supply mode used to select the pixel, the sense region can be shared by multiple photodetectors and transfer transistors, or the reset and readout transistors can be shared by multiple photodetectors, transfer gates, and sense regions.
When an image is to be captured, an integration period for the pixel begins and the photodetector 502 accumulates photo-generated charge in response to incident light. When the integration period ends, the accumulated charge in the photodetector 502 is transferred to the sense region 506 by selectively pulsing the gate of the transfer transistor 504. Typically, the reset transistor 508 is used to reset the voltage on the sense region 506 (node 516) to a predetermined level prior to the transfer of charge from the photodetector 502 to the sense region 506. When charge is to be read out of the pixel, the gate of the row select transistor is pulsed through the row select 412 and row select line 414 to select the pixel (or row of pixels) for readout. The readout transistor 510 senses the voltage on the sense region 506 and the row select transistor 512 transmits the voltage to the column select line 410. The column select line 410 is connected to readout circuitry (and optionally an image processor) through the column select 408.
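The integrate, reset, transfer, and readout sequence above can be sketched as a behavioral toy model (not circuit-accurate, and not from the described embodiments; class and method names, and the unit-free charge values, are illustrative assumptions):

```python
# Behavioral sketch of the 4T pixel operating sequence: integrate charge in
# the photodetector, reset the sense region, transfer charge, then read out.

class Pixel4T:
    def __init__(self):
        self.photodetector = 0.0   # accumulated photo-generated charge
        self.sense_region = 0.0    # floating diffusion (node 516)

    def integrate(self, photocurrent, seconds):
        # Integration period: the photodetector accumulates charge.
        self.photodetector += photocurrent * seconds

    def reset_sense(self):
        # Reset transistor pulses the sense region to a known level (here 0).
        self.sense_region = 0.0

    def transfer(self):
        # Pulse the transfer gate: move all charge from PD to sense region.
        self.sense_region += self.photodetector
        self.photodetector = 0.0

    def read_out(self):
        # Source follower senses the FD level; row select puts it on the
        # column line.  Reading is non-destructive in this sketch.
        return self.sense_region

px = Pixel4T()
px.integrate(photocurrent=2.0, seconds=0.01)  # integration period
px.reset_sense()                              # reset FD before transfer
px.transfer()                                 # end of integration
print(px.read_out())                          # 0.02
```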
In some embodiments, an image capture device, such as a camera, may not include a shutter over the lens, and so the image sensor may be constantly exposed to light. In these embodiments, the photodetectors may have to be reset or depleted before a desired image is to be captured. Once the charge from the photodetectors has been depleted, the transfer gate and the reset gate are turned off, isolating the photodetectors. The photodetectors can then begin integration and collecting photo-generated charge.
The pixels in the flicker detection region(s) can be implemented similarly to pixel 500. Other embodiments can configure the pixels in the flicker detection region or regions differently. Two different implementations are described in more detail in conjunction with
The pixels in a flicker detection region can be operatively connected to respective column select lines, such as column select lines 410 shown in
Alternatively, the pixels in the flicker detection region can be operatively connected to separate readout circuitry and column lines.
The common flicker sense node 606 connects to the gate of a readout transistor 608. In some embodiments, the readout transistor 608 is a source follower transistor. One terminal of the readout transistor 608 is connected to a terminal of a column select transistor 610. The other terminal of the column select transistor 610 is connected to a column line 612. The gate of the column select transistor 610 can be connected to a row select line (not shown). The row select line can be used to select or turn on the column select transistor 610 when the charge or signal is to be sampled from the flicker detection region 602. The other terminal of the readout transistor 608 can be connected to a supply voltage. The column select lines 410 and the column line 612 can each be connected to readout circuitry 614. In some embodiments, the readout circuitry can be shared between two or more column select lines 410 and/or column lines 612. Each readout circuitry 614 can include an analog-to-digital converter to convert the analog signals obtained from the pixels 406 and the flicker detection region 602 to digital signals.
Referring now to
The photo-current summing node 706 is connected to a flicker readout circuitry 710. In some embodiments, the flicker readout circuitry can include at least one column and ADC dedicated to the flicker detection region (e.g., column line 718 and ADC 722), in addition to the circuitry for the imaging area 404. The flicker readout circuitry 710 can include circuitry that is similar in structure to some or all of the circuitry in the pixels 406. For example, the flicker readout circuitry 710 can include a transfer transistor 712. One terminal of the transfer transistor 712 is connected to the photo-current summing node 706, while the other terminal is connected to the gate of a readout transistor 714. In some embodiments, the readout transistor 714 is a source follower transistor. One terminal of the readout transistor 714 is connected to a terminal of a column select transistor 716. The other terminal of the column select transistor 716 is connected to a column line 718. The column line 718 is dedicated to the flicker readout circuitry 710. Any pixels in the flicker detection column other than the flicker readout circuitry 710 are disconnected from the column line 718 and can be connected to a separate column line (not shown).
The gate of the column select transistor 716 can be connected to a row select line (not shown). The row select line can be used to select or turn on the column select transistor 716 when the charge or signal is to be sampled from the flicker detection region 702. The other terminal of the readout transistor 714 can be connected to a supply voltage. The column select lines 410 and the column line 718 can each be connected to readout circuitry 720, which includes an analog-to-digital converter 722.
A capacitor 724 can be connected to the photo-current summing node 706 through a switch 726. The size of the capacitor 724 can be programmable to adjust for various lighting conditions and/or integration times of the summed photo-current detected by the pixels 700 in the flicker detection region(s). For example, a smaller capacitor value can be selected for low illumination conditions and/or a short integration time, such that the flicker readout circuitry 710 can be operated in parallel with the normal operation of the pixels 406. Additionally, in some embodiments, the capacitor 724 can reduce the threshold voltage or current needed to turn on the readout transistor 714.
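The capacitor-selection trade-off above can be sketched numerically (an illustrative sketch only; the capacitor values, voltage limit, and current levels are assumptions, not from the described embodiments). For a summed photo-current I integrated for time t, the node voltage is V = I·t / C, so a smaller capacitor gives a larger voltage swing, which suits low illumination and short integration times:

```python
# Hedged sketch: pick the smallest programmable capacitor that keeps the
# summing-node voltage in range for a given photo-current and integration
# time (V = I * t / C).  All component values are illustrative assumptions.

def pick_capacitor(photocurrent_a, integration_s, v_max=1.0,
                   available_f=(1e-12, 4e-12, 16e-12)):
    charge = photocurrent_a * integration_s       # Q = I * t
    for cap in sorted(available_f):               # prefer small caps: bigger swing
        if charge / cap <= v_max:                 # V = Q / C stays in range
            return cap
    return max(available_f)                       # saturating fallback

# Low illumination and short integration -> smallest capacitor is chosen.
assert pick_capacitor(1e-11, 5e-3) == 1e-12

# Bright scene and long integration -> a larger capacitor avoids clipping.
assert pick_capacitor(1e-9, 10e-3) == 16e-12
```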
Other embodiments can average the samples differently. For example, the samples can be averaged as described in conjunction with
Next, as shown in block 804, the averaged samples are analyzed to determine if flicker is present in the scene being imaged. The averaged samples can be analyzed by a processor, such as image processor 402 (
A determination can then be made at block 806 as to whether or not flicker is detected. If not, the method ends. When flicker is detected, the process passes to block 808 where a frequency of the flicker can be determined. Compensation for the flicker is then determined at block 810. In one embodiment, the exposure time of the image capture is adjusted to reduce or eliminate flicker in the captured image.
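As one hedged sketch of the analyze/detect/compensate sequence (not the patent's method; the sample rate, candidate frequencies, and function names are assumptions), the flicker-region samples can be correlated against candidate flicker frequencies, and the exposure rounded to a whole number of periods of the winner:

```python
import math

# Illustrative sketch: estimate the dominant flicker frequency from repeated
# flicker-region samples, then choose a compensating exposure time.

def dominant_flicker_hz(samples, sample_rate_hz, candidates=(100.0, 120.0)):
    """Correlate the mean-removed sample series against candidate frequencies."""
    n = len(samples)
    mean = sum(samples) / n
    best_hz, best_power = None, 0.0
    for hz in candidates:
        re = sum((s - mean) * math.cos(2 * math.pi * hz * i / sample_rate_hz)
                 for i, s in enumerate(samples))
        im = sum((s - mean) * math.sin(2 * math.pi * hz * i / sample_rate_hz)
                 for i, s in enumerate(samples))
        power = re * re + im * im
        if power > best_power:
            best_hz, best_power = hz, power
    return best_hz

def compensated_exposure(requested_s, flicker_hz):
    """Round the exposure down to an integer number of flicker periods."""
    period = 1.0 / flicker_hz
    return max(int(requested_s / period), 1) * period

# Synthesize 100 Hz flicker sampled at 1 kHz (many samples per frame).
fs = 1000.0
samples = [1.0 + 0.5 * math.sin(2 * math.pi * 100.0 * i / fs) for i in range(200)]
assert dominant_flicker_hz(samples, fs) == 100.0
assert abs(compensated_exposure(0.025, 100.0) - 0.02) < 1e-12
```

In a real implementation the candidate set, windowing, and thresholds would depend on the frame rate and the sample timing; this sketch only shows the shape of the computation.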
As described earlier, the pixels in the one or more flicker detection regions can have multiple modes, with one mode being flicker detection and another mode image capture.
Next, as shown in block 902, a determination can be made as to whether or not the flicker detection mode is selected. If so, the process passes to block 904 where one or more pixels in at least one flicker detection region are used to detect flicker and the method ends. When the flicker detection mode is not selected, the pixel(s) in the flicker detection region(s) may be used to capture an image along with the pixels in the imaging area (block 906), and the method ends. The signals and/or the timing of the signals provided to the pixels can change based on which mode is selected. By way of example only, the processor 402 can provide or enable the signals to be transmitted to the pixels for each mode. In some embodiments, the processor can produce the signals or read the signals from a memory (e.g., memory 202).
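The mode branch above can be sketched as a selection between two signal sequences (an illustrative sketch only; the signal names and sequences are assumptions, not the timing signals of the described embodiments):

```python
# Hedged sketch of the dual-mode decision: the timing signals driven to the
# flicker-region pixels depend on whether flicker detection is selected.

MODE_SIGNALS = {
    # Flicker detection: sample the region repeatedly while the frame integrates.
    "flicker_detection": ["reset", "integrate", "sample", "sample", "sample"],
    # Image capture: behave like imaging-area pixels (one readout per frame).
    "image_capture":     ["reset", "integrate", "transfer", "readout"],
}

def signals_for(flicker_mode_selected):
    mode = "flicker_detection" if flicker_mode_selected else "image_capture"
    return MODE_SIGNALS[mode]

assert signals_for(True).count("sample") == 3
assert signals_for(False)[-1] == "readout"
```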
Other embodiments can perform the method shown in
Flicker detection regions can detect the presence of flicker as well as the absence of flicker. Compensation for detected flicker can occur more quickly than in conventional image sensors because samples are read out of the flicker detection region(s) multiple times per frame (e.g., per image capture). Additionally or alternatively, flicker occurring at a variety of different frequencies can be detected more easily.
Various embodiments have been described in detail with particular reference to certain features thereof, but it will be understood that variations and modifications can be effected within the spirit and scope of the disclosure. And even though specific embodiments have been described herein, it should be noted that the application is not limited to these embodiments. In particular, any features described with respect to one embodiment may also be used in other embodiments, where compatible. Likewise, the features of the different embodiments may be exchanged, where compatible.
This application claims the benefit under 35 U.S.C. §119(e) of U.S. Provisional Patent Application No. 61/913,851, filed Dec. 9, 2013, and entitled "Image Sensor Flicker Detection," the entirety of which is incorporated by reference as if fully disclosed herein.
Number | Date | Country
---|---|---
20150163392 A1 | Jun 2015 | US

Number | Date | Country
---|---|---
61913851 | Dec 2013 | US