Embodiments disclosed herein relate in general to multi-aperture imaging (“MAI”) systems (where “multi” refers to two or more apertures) and more specifically to thin MAI systems with high color resolution and/or optical zoom.
Small digital cameras integrated into mobile (cell) phones, personal digital assistants and music players are becoming ubiquitous. Each year, mobile phone manufacturers add more imaging features to their handsets, causing these mobile imaging devices to converge towards feature sets and image quality that customers expect from standalone digital still cameras. Concurrently, the size of these handsets is shrinking, making it necessary to reduce the total size of the camera accordingly while adding more imaging features. Optical Zoom is a primary feature of many digital still cameras but one that mobile phone cameras usually lack, mainly due to camera height constraints in mobile imaging devices, cost and mechanical reliability.
Mechanical zoom solutions are common in digital still cameras but are typically too thick for most camera phones. Furthermore, the F/# ("F number") in such systems typically increases with the zoom factor (ZF), resulting in poor light sensitivity and higher noise (especially in low-light scenarios). In mobile cameras, this also compromises resolution, due to the small pixel size of their image sensors and the diffraction-limited optics associated with the F/#.
One way of implementing zoom in mobile cameras is by over-sampling the image and cropping and interpolating it in accordance with the desired ZF. While this method is mechanically reliable, it results in thick optics and in an expensive image sensor due to the large number of pixels associated therewith. As an example, if one is interested in implementing a 12 Megapixel camera with X3 ZF, one needs a sensor of 108 Megapixels.
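The over-sampling requirement grows quadratically with the ZF, since cropping to 1/ZF of the linear FOV retains only 1/ZF² of the pixels. A minimal sketch of this arithmetic (the helper name is ours, not from the text above):

```python
# Hypothetical helper illustrating the over-sampling arithmetic: cropping to
# 1/ZF of the linear FOV keeps 1/ZF^2 of the pixels, so the sensor must
# over-sample by ZF^2 to preserve the output resolution.
def sensor_megapixels_for_crop_zoom(output_mp: float, max_zf: float) -> float:
    return output_mp * max_zf ** 2

print(sensor_megapixels_for_crop_zoom(12, 3))  # 108.0, matching the example above
```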
Another way of implementing zoom, as well as increasing the output resolution, is by using a dual-aperture imaging (“DAI”) system. In its basic form, a DAI system includes two optical apertures which may be formed by one or two optical modules, and one or two image sensors (e.g., CMOS or CCD) that grab the optical image or images and convert the data into the electronic domain, where the image can be processed and stored.
The design of a thin MAI system with improved resolution requires a careful choice of parameters coupled with advanced signal processing algorithms to support the output of a high quality image. Known MAI systems, in particular ones with short optical paths, often trade off functionalities and properties, for example zoom against color resolution, or image resolution and quality against camera module height. Therefore, there is a need for, and it would be advantageous to have, thin MAI systems that produce an image with high resolution (and specifically high color resolution) together with zoom functionality.
Moreover, known signal processing algorithms used together with existing MAI systems often further degrade the output image quality by introducing artifacts when combining information from different apertures. A primary source of these artifacts is the image registration process, which has to find correspondences between the different images that are often captured by different sensors with different color filter arrays (CFAs). There is therefore a need for, and it would be advantageous to have an image registration algorithm that is more robust to the type of CFA used by the cameras and which can produce better correspondence between images captured by a multi-aperture system.
Embodiments disclosed herein teach the use of multi-aperture imaging systems to implement thin cameras (with short optical paths of less than about 9 mm) and/or to realize optical zoom systems in such thin cameras. Embodiments disclosed herein further teach new color filter arrays that optimize the color information which may be achieved in a multi-aperture imaging system with or without zoom. In various embodiments, a MAI system disclosed herein includes at least two sensors or a single sensor divided into at least two areas. Hereinafter, the description refers to "two sensors", with the understanding that they may represent sections of a single physical sensor (imager chip). Exemplarily, in a dual-aperture imaging system, a left sensor (or left side of a single sensor) captures an image coming from a first aperture while a right sensor (or right side of a single sensor) captures an image coming from a second aperture. In various embodiments disclosed herein, one sensor is a "Wide" sensor while another sensor is a "Tele" sensor.
The Tele sensor may be a Clear sensor (i.e. a sensor without color filters) or a standard CFA sensor. Hereinafter, the two (or more than two) sensors together with their respective apertures are referred to as two (or more than two) Wide and Tele "subset cameras" (or simply "subsets"). Each sensor provides a separate image (referred to respectively as a Wide image and a Tele image), except for the case of a single sensor, where two images are captured (grabbed) by the single sensor (see the example above). In some embodiments, zoom is achieved by fusing the two images, resulting in higher color resolution that approaches that of a high quality dual-aperture zoom camera. Some thin MAI systems disclosed herein therefore provide zoom, super-resolution, high dynamic range and enhanced user experience.
In some embodiments, in order to reach optical zoom capabilities, a different magnification image of the same scene is grabbed by each subset, resulting in field of view (FOV) overlap between the two subsets. In some embodiments, the two subsets have the same zoom (i.e. same FOV). In some embodiments, the Tele subset is the higher zoom subset and the Wide subset is the lower zoom subset. Post-processing is applied on the two images grabbed by the MAI system to fuse them and output one fused (combined) zoom image, processed according to a user ZF input request. In some embodiments, the resolution of the fused image may be higher than the resolution of the Wide/Tele sensors. As part of the fusion procedure, up-sampling may be applied on the Wide image to scale it to the Tele image.
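As a rough sketch of that scaling step (the function name and the nearest-neighbor resampling are our assumptions; the magnification ratio is taken to equal the Tele/Wide focal-length ratio):

```python
import numpy as np

# Sketch, not the patent's implementation: up-sample the Wide overlap region
# so it matches the Tele image scale prior to fusion.
def upsample_wide_to_tele(wide_overlap: np.ndarray, tele_shape: tuple) -> np.ndarray:
    th, tw = tele_shape
    wh, ww = wide_overlap.shape
    # Nearest-neighbor resampling keeps the sketch dependency-free; a real
    # pipeline would use a higher-order interpolator.
    rows = np.arange(th) * wh // th
    cols = np.arange(tw) * ww // tw
    return wide_overlap[rows[:, None], cols[None, :]]
```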
In an embodiment there is provided a multi-aperture imaging system comprising a first camera subset that provides a first image, the first camera subset having a first sensor with a first plurality of sensor pixels covered at least in part with a non-standard CFA, the non-standard CFA used to increase a specific color sampling rate relative to a same color sampling rate in a standard CFA; a second camera subset that provides a second image, the second camera subset having a second sensor with a second plurality of sensor pixels either Clear or covered with a standard CFA; and a processor configured to process the first and second images into a combined output image.
In some embodiments, the first and the second camera subsets have identical FOVs and the non-standard CFA may cover an overlap area that includes all the pixels of the first sensor, thereby providing increased color resolution. In some such embodiments, the processor is further configured to, during the processing of the first and second images into a combined output image, register respective first and second Luma images obtained from the first and second images, the registered first and second Luma images used together with color information to form the combined output image. In an embodiment, the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image, whereby the output image is formed by transferring information from the second image to the first image. In another embodiment, the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image, whereby the output image is formed by transferring information from the first image to the second image.
In some embodiments, the first camera subset has a first FOV, the second camera subset has a second, smaller FOV than the first FOV, and the non-standard CFA covers an overlap area on the first sensor that captures the second FOV, thereby providing both optical zoom and increased color resolution. In some such embodiments, the processor is further configured to, during the processing of the first and second images into a combined output image and based on a ZF input, register respective first and second Luma images obtained from the first and second images, the registered first and second Luma images used together with color information to form the combined output image. For a ZF input that defines an FOV greater than the second FOV, the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image and the processing includes forming the output image by transferring information from the second image to the first image. For a ZF input that defines an FOV smaller than or equal to the second FOV, the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image, and the processing includes forming the output image by transferring information from the first image to the second image.
In an embodiment there is provided a multi-aperture imaging system comprising a first camera subset that provides a first image, the first camera subset having a first sensor with a first plurality of sensor pixels covered at least in part with a standard CFA; a second camera subset that provides a second image, the second camera subset having a second sensor with a second plurality of sensor pixels either Clear or covered with a standard CFA; and a processor configured to register first and second Luma images obtained respectively from the first and second images and to process the registered first and second Luma images together with color information into a combined output image.
In some embodiments, the first and the second camera subsets have identical first and second FOVs. In some such embodiments, the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image and the processing includes forming the output image by transferring information from the second image to the first image. In other such embodiments, the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image and the processing includes forming the output image by transferring information from the first image to the second image.
In some embodiments, the first camera subset has a first FOV, the second camera subset has a second, smaller FOV than the first FOV, and the processor is further configured to register the first and second Luma images based on a ZF input. For a ZF input that defines an FOV greater than the second FOV, the registration includes finding a corresponding pixel in the second Luma image for each pixel in the first Luma image and the processing includes forming the output image by transferring information from the second image to the first image. For a ZF input that defines an FOV smaller than or equal to the second FOV, the registration includes finding a corresponding pixel in the first Luma image for each pixel in the second Luma image, and the processing includes forming the output image by transferring information from the first image to the second image.
Non-limiting examples of embodiments disclosed herein are described below with reference to figures attached hereto that are listed following this paragraph. The drawings and descriptions are meant to illuminate and clarify embodiments disclosed herein, and should not be considered limiting in any way.
Embodiments disclosed herein relate to multi-aperture imaging systems that include at least one Wide sensor with a single CFA or with two different CFAs and at least one Tele sensor. The description continues with particular reference to dual-aperture imaging systems that include two (Wide and Tele) subsets with respective sensors. A three-aperture imaging system is described later.
The Wide sensor includes an overlap area (described below) that captures the Tele FOV; the overlap area may include a standard CFA or a non-standard CFA.
The Tele sensor may be Clear (providing a Tele Clear image scaled relative to the Wide image) or may include a standard (Bayer or non-Bayer) CFA. In the latter case, it is desirable to define primary and auxiliary sensors based on the applied ZF. If the ZF is such that the output FOV is larger than the Tele FOV, the primary sensor is the Wide sensor and the auxiliary sensor is the Tele sensor. If the ZF is such that the output FOV is equal to, or smaller than, the Tele FOV, the primary sensor is the Tele sensor and the auxiliary sensor is the Wide sensor. The point of view defined by the output image is that of the primary sensor.
Processing Flow
In use, an image is acquired with imaging system 100 and is processed according to the steps described below.
In step 1004, the data from the Wide and Tele images is processed together with the registration information from step 1002 to form a high quality output zoom image. In cases where the Tele sensor is a Clear-only sensor, the high resolution luminance component is taken from the Tele sensor and the color resolution is taken from the Wide sensor. In cases where the Tele sensor includes a CFA, both color and luminance data are taken from the Tele subset to form the high quality zoom image. In addition, color and luminance data are taken from the Wide subset.
Exemplary Process for Fusing a Zoom Image
1. Special Demosaicing
In this step, the Wide image is interpolated to reconstruct the missing pixel values. Standard demosaicing is applied in the non-overlap area. If the overlap area includes a standard CFA, standard demosaicing is applied there as well. If the overlap area includes a non-standard CFA, a special demosaicing algorithm is applied, depending on the CFA pattern used. In addition, in case the Tele sensor has a CFA, standard demosaicing is applied to reconstruct the missing pixel values in each pixel location and to generate a full RGB color image.
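A minimal demosaicing sketch (ours, not the patent's specific algorithm): each missing channel value is reconstructed as the mean of its sampled direct neighbors, mirroring the averaging rules listed later in this section.

```python
import numpy as np

# Sketch: reconstruct a missing color value at each unsampled location as the
# mean of its sampled 4-neighbors. 'values' holds one channel plane (zero
# where unsampled) and 'sampled' marks the CFA locations of that channel.
def interpolate_channel(values: np.ndarray, sampled: np.ndarray) -> np.ndarray:
    out = values.astype(float).copy()
    h, w = values.shape
    for y in range(h):
        for x in range(w):
            if sampled[y, x]:
                continue
            neighbors = [values[ny, nx]
                         for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                         if 0 <= ny < h and 0 <= nx < w and sampled[ny, nx]]
            if neighbors:
                out[y, x] = sum(neighbors) / len(neighbors)
    return out
```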
2. Registration Preparation
This step of the algorithm calculates the mapping between the overlap areas in the two luminance images. The registration step does not depend on the type of CFA used (or the lack thereof), as it is applied on luminance images. The same registration step can therefore be applied on Wide and Tele images captured by standard CFA sensors, as well as by any combination of CFAs or Clear sensor pixels disclosed herein. The registration process chooses either the Wide image or the Tele image to be a primary image. The other image is defined as an auxiliary image. The registration process considers the primary image as the baseline image and registers the overlap area in the auxiliary image to it, by finding for each pixel in the overlap area of the primary image its corresponding pixel in the auxiliary image. The output image point of view is determined according to the primary image point of view (camera angle). Various correspondence metrics could be used for this purpose, among which are a sum of absolute differences and correlation.
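The text leaves the correspondence metric open; a block-matching sketch using the sum of absolute differences (SAD) over the two Luma images, under simplifying assumptions of our own (pure horizontal disparity, fixed block size), might look as follows:

```python
import numpy as np

# Sketch: for each block of the primary Luma image, search a horizontal
# disparity range in the auxiliary Luma image and keep the SAD-minimizing
# offset. The result is a coarse registration map.
def register_sad(luma_primary, luma_aux, block=8, max_disp=16):
    h, w = luma_primary.shape
    disp_map = np.zeros((h // block, w // block), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = luma_primary[y:y + block, x:x + block].astype(float)
            best, best_d = np.inf, 0
            for d in range(-max_disp, max_disp + 1):
                xs = x + d
                if xs < 0 or xs + block > w:
                    continue
                cand = luma_aux[y:y + block, xs:xs + block].astype(float)
                sad = float(np.abs(ref - cand).sum())
                if sad < best:
                    best, best_d = sad, d
            disp_map[by, bx] = best_d
    return disp_map
```

Working on Luma planes is what makes the step CFA-agnostic: whatever filters the sensors carry, the registration only ever sees single-channel luminance.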
In an embodiment, the choice of the Wide image or the Tele image as the primary and auxiliary images is based on the ZF chosen for the output image. If the chosen ZF is larger than the ratio between the focal lengths of the Tele and Wide cameras, the Tele image is set to be the primary image and the Wide image is set to be the auxiliary image. If the chosen ZF is smaller than or equal to the ratio between the focal lengths of the Tele and Wide cameras, the Wide image is set to be the primary image and the Tele image is set to be the auxiliary image. In another embodiment independent of a zoom factor, the Wide image is always the primary image and the Tele image is always the auxiliary image. The output of the registration stage is a map relating Wide image pixel indices to matching Tele image pixel indices.
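The selection rule in the preceding paragraph transcribes directly to code (the names are ours):

```python
# Direct transcription of the primary/auxiliary selection rule above.
def choose_primary(zf: float, tele_focal: float, wide_focal: float) -> str:
    ratio = tele_focal / wide_focal
    return "Tele" if zf > ratio else "Wide"  # the other image is the auxiliary

assert choose_primary(zf=2.5, tele_focal=6.0, wide_focal=3.0) == "Tele"
assert choose_primary(zf=2.0, tele_focal=6.0, wide_focal=3.0) == "Wide"  # ZF == ratio
```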
4. Combination into a High Resolution Image
In this final step, the primary and auxiliary images are used to produce a high resolution image. One can distinguish between several cases:
a. If the Wide image is the primary image and the Tele image was generated from a Clear sensor, LumaWide is calculated and is replaced or averaged with LumaTele in the overlap area between the two images to create a luminance output image, matching corresponding pixels according to the registration map: LumaOut = c1*LumaWide + c2*LumaTele (see the sketch following this list). The values of c1 and c2 may change between different pixels in the image. Then, RGB values of the output are calculated from LumaOut and RWide, GWide and BWide.
b. If the Wide image is the primary image and the Tele image was generated from a CFA sensor, LumaTele is calculated and is combined with LumaWide in the overlap area between the two images, according to the flow described in 4a.
c. If the Tele image is the primary image generated from a Clear sensor, the RGB values of the output are calculated from the LumaTele image and RWide, GWide, and BWide (matching pixels according to the registration map).
d. If the Tele image is the primary image generated from a CFA sensor, the RGB values of the output (matching pixels according to the registration map) are calculated either by using only the Tele image data, or by also combining data from the Wide image. The choice depends on the zoom factor.
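A minimal sketch of the luminance combination in cases (a) and (b), assuming the Tele Luma has already been warped onto the Wide grid by the registration map (uniform c1, c2 here, though the text allows them to vary per pixel):

```python
import numpy as np

# Sketch: blend the registered Tele Luma into the Wide Luma over the overlap
# area, LumaOut = c1*LumaWide + c2*LumaTele; outside the overlap the Wide
# Luma is kept unchanged.
def fuse_luma(luma_wide, luma_tele_registered, overlap_mask, c1=0.5, c2=0.5):
    luma_out = luma_wide.astype(float).copy()
    m = overlap_mask
    luma_out[m] = c1 * luma_wide[m] + c2 * luma_tele_registered[m]
    return luma_out
```

The output RGB values are then computed from LumaOut together with the Wide chroma (RWide, GWide, BWide), as described in cases (a) and (b) above.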
Certain portions of the registered Wide and Tele images are used to generate the output image based on the ZF of the output image. In an embodiment, if the ZF of the output image defines an FOV smaller than the Tele FOV, the fused high resolution image is cropped to the required FOV and digital interpolation is applied to scale up the image to the required output image resolution.
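A sketch of that crop-and-interpolate step (the names, the centered crop and the nearest-neighbor scaling are our assumptions):

```python
import numpy as np

# Sketch: crop the fused image to the FOV implied by the requested ZF
# relative to the ZF at the Tele FOV, then scale back up to the output size.
def crop_and_upscale(fused: np.ndarray, zf: float, zf_tele: float) -> np.ndarray:
    h, w = fused.shape
    scale = zf_tele / zf                    # < 1 when the requested ZF exceeds zf_tele
    ch, cw = int(h * scale), int(w * scale)
    y0, x0 = (h - ch) // 2, (w - cw) // 2   # centered crop
    crop = fused[y0:y0 + ch, x0:x0 + cw]
    rows = np.arange(h) * ch // h           # nearest-neighbor digital interpolation
    cols = np.arange(w) * cw // w
    return crop[rows[:, None], cols[None, :]]
```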
Exemplary and Non-Limiting Pixel Interpolations Specifications for the Overlap Area
In order to reconstruct the missing R22 pixel, we perform R22=(R31+R13)/2. The same operation is performed for all missing Blue pixels.
In order to reconstruct the missing B22 pixel, we perform B22=(B12+B21+B32+B23)/4. The same operation is performed for all missing Red pixels.
In order to reconstruct the missing C22 pixel, we perform C22=(C12+C21+C32+C23)/4. The same operation is performed for all missing Yellow pixels.
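Rendered literally in code, with a hypothetical pixel grid p[(row, col)] holding the sampled values, the averaging rules above read:

```python
# Hypothetical sample values; indices follow the text's (row, column) numbering.
p = {(3, 1): 100.0, (1, 3): 110.0,                            # red diagonal neighbors of (2, 2)
     (1, 2): 60.0, (2, 1): 62.0, (3, 2): 58.0, (2, 3): 64.0}  # direct 4-neighbors of (2, 2)

R22 = (p[(3, 1)] + p[(1, 3)]) / 2                          # R22 = (R31 + R13) / 2
B22 = (p[(1, 2)] + p[(2, 1)] + p[(3, 2)] + p[(2, 3)]) / 4  # B22 = (B12 + B21 + B32 + B23) / 4
print(R22, B22)  # 105.0 61.0  (C22 follows the same 4-neighbor form)
```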
Case 1: W is Center Pixel
In order to reconstruct the missing 22 pixels, we perform the following:
B22=(B12+B32)/2
R22=(R21+R23)/2
G22=(W22−R22−B22) (assuming that W includes the same amount of R, G and B colors).
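Case 1 as a worked sketch (the numeric values are hypothetical); the green estimate leans on the stated assumption that a White (W) pixel collects roughly equal parts of R, G and B:

```python
# Hypothetical neighbor values around the missing (2, 2) position.
B12, B32 = 70.0, 74.0
R21, R23 = 90.0, 94.0
W22 = 250.0

B22 = (B12 + B32) / 2     # 72.0
R22 = (R21 + R23) / 2     # 92.0
G22 = W22 - R22 - B22     # 86.0, from the assumption W ≈ R + G + B
print(B22, R22, G22)
```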
Case 2: R22 is Center Pixel
In order to reconstruct the missing 22 pixels, we perform the following:
B22=(B11+B33)/2
W22=(2*W21+W24)/3
G22=(W22−R22−B22) (assuming that W contains the same amount of R, G and B colors). The same operation is performed for Blue as the center pixel.
In order to reconstruct the missing 22 pixels, we perform the following:
B22=(B12+B32)/2
R22=(R21+R23)/2.
In order to reconstruct the missing 32 pixels, we perform the following:
G32=(2*G31+2*G22+G43)/5
R32=(R41+2*R42+2*R33+R23+R21)/7.
In order to reconstruct the missing 22 pixels, we perform the following:
B22=(2*B12+2*B23+B31)/5
R22=(2*R21+2*R32+R13)/5
and similarly for all other missing pixels.
In order to reconstruct the missing 22 pixels, we perform the following:
B22=(2*B12+2*B32+B13)/5
R22=(2*R21+2*R23+R11)/5.
In order to reconstruct the missing 32 pixels, we perform the following:
G32=(2*G22+G52)/3
R32=(2*R33+2*R42+R41+R21+R23)/7.
In order to reconstruct the missing 22 pixels, we perform the following:
B22=(B12+B32+B23+B21)/4
R22=(R11+R13+R31+R33)/4.
In order to reconstruct the missing 32 pixels, we perform the following:
G32=(2*G22+G52)/3
R32=(R42+R31+R33)/3.
Triple-Aperture Zoom Imaging System with Improved Color Resolution
As mentioned, a multi-aperture zoom or non-zoom imaging system disclosed herein may include more than two apertures. A non-limiting and exemplary embodiment 1100 of a triple-aperture imaging system is shown in the accompanying drawings.
While this disclosure has been described in terms of certain embodiments and generally associated methods, alterations and permutations of the embodiments and methods will be apparent to those skilled in the art. For example, multi-aperture imaging systems with more than two Wide or Wide-Tele subsets (and sensors) or with more than one Tele subset (and sensor) may be constructed and used according to principles set forth herein. Similarly, non-zoom multi-aperture imaging systems with more than two sensors, at least one of which has a non-standard CFA, may be constructed and used according to principles set forth herein. The disclosure is to be understood as not limited by the specific embodiments described herein, but only by the scope of the appended claims.
This broadening reissue application is a continuation of U.S. patent application Ser. No. 16/383,618, filed Apr. 14, 2019, now U.S. Pat. No. RE48,444 E, which is a reissue application of U.S. patent application Ser. No. 15/375,090, filed Dec. 11, 2016, now U.S. Pat. No. 9,876,952, which is a continuation of U.S. patent application Ser. No. 14/386,823, filed Apr. 22, 2014, now U.S. Pat. No. 9,538,152, which was a National Phase application from PCT application PCT/IB2013/060356, which claimed priority from U.S. Provisional Patent Application No. 61/730,570 having the same title and filed Nov. 28, 2012, the latter incorporated herein by reference in its entirety. The following three co-pending applications are also continuation reissue applications of U.S. patent application Ser. No. 16/383,618, filed Apr. 14, 2019: U.S. patent application Ser. No. 16/384,140, filed Apr. 15, 2019, U.S. patent application Ser. No. 16/384,197, filed Apr. 15, 2019, and U.S. patent application Ser. No. 16/384,244, filed Apr. 15, 2019.