Synthetically enlarged camera aperture

Information

  • Patent Grant
  • Patent Number
    11,695,896
  • Date Filed
    Tuesday, February 2, 2021
  • Date Issued
    Tuesday, July 4, 2023
Abstract
Methods for obtaining a shallow depth of field (DOF) effect and improved signal-to-noise ratio (SNR) in an image by synthetically increasing the camera aperture of a compact camera, using at least one actuator included in such a camera for other known purposes, for example for providing optical image stabilization (OIS). The synthetically enlarged camera aperture enables taking a plurality of images at different aperture positions. The plurality of images is processed into an image with shallow DOF and improved SNR.
Description
FIELD

Embodiments disclosed herein relate in general to digital cameras, and in particular to miniature folded and non-folded digital cameras.


BACKGROUND

Compact cameras, such as those that are incorporated in smartphones, typically have small apertures with a size of a few millimeters (mm) (e.g. 1-5 mm). The relatively small aperture of the camera (compared with cameras with larger apertures) causes at least the following handicaps:


a) the amount of light that can be absorbed by the camera image sensor in a given period of time is limited. This results in a poor signal-to-noise ratio (SNR) when capturing an image in low light situations; and


b) the small aperture, when coupled with a relatively short focal length (e.g. 3-15 mm) due to the physical dimensions of the camera, causes a relatively wide depth of field (DOF). This is contrary to the shallow DOF or “bokeh” effect that is a sought-after property in smartphone devices. Note that “shallow DOF” and “bokeh” are used herein interchangeably.


In known art, the bokeh effect is achieved with a dual-camera setup, by calculating a depth map from two camera images obtained from two separate cameras and by digitally blurring the images according to the depth map.


Compact cameras in smartphones and other hand-held personal electronic devices have different types of actuators. In an example, they often have an optical image stabilization (OIS) actuator that can move the lens barrel (or simply the “lens”) of the camera in a plane parallel to the image sensor plane. In folded cameras, in which an optical path from an object to be photographed is folded toward the image sensor by an optical path folding element (OPFE), for example a prism, OIS is known to be achieved by shifting the lens barrel laterally, in parallel to the sensor plane, or by tilting the prism (see for example co-owned published international patent application WO2016166730).


SUMMARY

In the context of the following disclosure, DOF is defined as the distance (in meters, cm, etc.) around the plane of focus (POF) in which objects appear acceptably sharp in an image. A shallow DOF is such that the distance is small (e.g. less than 20% of the object distance from the camera) and a wide DOF is such that the distance is large (e.g. more than 30% of the object distance from the camera).
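For reference, the single-frame DOF of a compact camera can be approximated with standard thin-lens formulas. The sketch below is an illustration only, not part of the disclosure; the circle-of-confusion value and the specific parameters are assumptions chosen from the exemplary ranges given later in this description:

```python
def depth_of_field(f_mm, f_number, focus_dist_mm, coc_mm=0.002):
    """Approximate near/far limits of acceptable sharpness (thin-lens model).

    coc_mm is an assumed circle-of-confusion diameter for a small sensor.
    """
    hyperfocal = f_mm ** 2 / (f_number * coc_mm) + f_mm
    near = focus_dist_mm * (hyperfocal - f_mm) / (hyperfocal + focus_dist_mm - 2 * f_mm)
    far = focus_dist_mm * (hyperfocal - f_mm) / (hyperfocal - focus_dist_mm)
    return near, far

# Typical compact-camera numbers: EFL 5 mm, f/2, focused at 1 m.
near, far = depth_of_field(f_mm=5.0, f_number=2.0, focus_dist_mm=1000.0)
dof = far - near
# DOF comes out around 330 mm, i.e. more than 30% of the 1 m object
# distance -- "wide" per the definition above.
print(f"DOF = {dof:.0f} mm ({100 * dof / 1000:.1f}% of object distance)")
```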


In various exemplary embodiments, there are provided methods for synthetically enlarging a camera aperture and for obtaining shallow DOF effects in folded and non-folded (also referred to as “upright”, “straight”, “standing” or “vertical”) compact cameras using dedicated and/or existing actuators, and in particular OIS actuators included in such cameras. Such miniature cameras are incorporated, for example, in smartphones, tablet computers, laptop computers, smart televisions, smart home assistant devices, drones, baby monitors, video conference systems, surveillance cameras, cars, robots and other personal electronic devices. Known OIS actuators, for example similar to those disclosed in co-owned published international patent application WO2016166730 (folded case) and in co-owned international patent application WO20160156996 (non-folded case), may be “modified” by increasing the size and/or length of their magnets and/or coils and/or rails to enable a longer movement range (e.g. up to about ±2 mm) of elements in folded and non-folded compact cameras. Henceforth, such OIS actuators will be referred to as “modified OIS actuators”.


The following description refers to relative movements of one camera element (for example the lens, prism, or both) vs. another camera element (for example the image sensor) in an exemplary orthogonal XYZ coordinate system. The exemplary coordinate system is for reference and for understanding inventive features disclosed herein, and should not be considered limiting.


The new use of dedicated and/or existing camera actuators in general and OIS actuators (regular or modified) in particular, coupled with an image acquisition system and a post-processing algorithm, can synthetically increase the size of the aperture, providing better signal-to-noise ratio (SNR) and a shallower DOF. The term “synthetically increasing” or “synthetically enlarging” as applied herein to a camera aperture refers to the camera aperture size being effectively (but not physically) increased by capturing a plurality of N images with the aperture in N different positions. The camera aperture is shifted laterally relative to the sensor by a significant amount (for example by a few mm) and several images are captured, each with the aperture located in a different position relative to the sensor. To clarify, the physical aperture size remains constant. Then, by aligning all captured images with respect to a certain in-focus object at a certain distance from the camera and by averaging them, objects outside the plane of focus will blur due to the parallax effect, thereby providing a shallow DOF effect. In some embodiments, the modifications to the OIS actuators to obtain modified OIS actuators enable large enough movement of the aperture of the optical system relative to the original position that the resulting parallax effect will be significant, i.e. a DOF shallower by 10%, 20% or even 50% than the DOF of a single frame.
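The size of this parallax blur can be estimated from simple geometry. In the sketch below (illustrative only; the pixel pitch and object distances are assumed values), an aperture shift b produces a disparity of b·f/z on the sensor for an object at distance z; after the frames are aligned on the focus plane at distance z_f, the residual shift that causes the blur is b·f·(1/z − 1/z_f):

```python
def residual_shift_px(aperture_shift_mm, f_mm, obj_dist_mm, focus_dist_mm,
                      pixel_pitch_mm=0.001):
    """Residual image shift (in pixels) of an object after the frame stack
    has been aligned on the focus plane; sign indicates direction."""
    shift_mm = aperture_shift_mm * f_mm * (1 / obj_dist_mm - 1 / focus_dist_mm)
    return shift_mm / pixel_pitch_mm

# Assumed numbers: 2 mm aperture shift, EFL 5 mm, 1 um pixels, focused at 1 m.
# A background object at 2 m is displaced by ~5 px and therefore blurs when
# the frames are averaged, while an object in the focus plane stays registered.
bg = residual_shift_px(2.0, 5.0, obj_dist_mm=2000.0, focus_dist_mm=1000.0)
focused = residual_shift_px(2.0, 5.0, obj_dist_mm=1000.0, focus_dist_mm=1000.0)
print(round(abs(bg), 3), round(abs(focused), 3))  # → 5.0 0.0
```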


In exemplary embodiments there is provided a method comprising providing a camera that includes a camera aperture, a lens having a lens optical axis, an image sensor and an actuator, and operating the actuator to synthetically enlarge the camera aperture to obtain a shallow DOF effect and improved SNR in an image formed from a plurality of images obtained with the image sensor.


In some exemplary embodiments, the actuator is an OIS actuator.


In some exemplary embodiments, the actuator is a modified OIS actuator with an extended actuation range. The extended actuation range may be a range of up to ±2 mm, and more specifically between ±1-2 mm.


In an exemplary embodiment, the operating the actuator to synthetically enlarge the camera aperture includes operating the actuator to move the camera aperture to a plurality of positions, wherein each of the plurality of images is obtained in a respective camera aperture position.


In an exemplary embodiment, the operating the actuator to synthetically enlarge the camera aperture includes operating the actuator to move the lens relative to the image sensor in a first direction substantially perpendicular to the lens optical axis.


In an exemplary embodiment, the operating the actuator to synthetically enlarge the camera aperture includes operating the actuator to move the lens relative to the image sensor in a second direction substantially perpendicular to the lens optical axis, wherein the second direction is not parallel to the first direction.


In some exemplary embodiments, the first and second directions are orthogonal to each other.


In some exemplary embodiments, the camera is a non-folded camera.


In some exemplary embodiments, the camera is a folded camera.


In an exemplary embodiment, the camera is a folded camera that further includes an optical path folding element (OPFE) that folds light from a first optical axis to the lens optical axis.


In an exemplary embodiment in which the camera is folded the operating the actuator to synthetically enlarge the camera aperture includes operating the actuator to move the camera aperture to a plurality of positions, wherein each of the plurality of images is obtained in a respective camera aperture position.


In an exemplary embodiment in which the camera is folded, the operating the actuator to synthetically enlarge the camera aperture includes operating the actuator to move the lens relative to the image sensor in a first direction substantially perpendicular to the lens optical axis.


In an exemplary embodiment in which the camera is folded, the operating the actuator to synthetically enlarge the camera aperture includes operating the actuator to move the lens relative to the image sensor in a second direction substantially perpendicular to the lens optical axis, wherein the second direction is not parallel to the first direction.





BRIEF DESCRIPTION OF THE DRAWINGS

Aspects, embodiments and features disclosed herein will become apparent from the following detailed description when considered in conjunction with the accompanying drawings, in which:



FIG. 1A shows an aperture and lens barrel centered relative to an image sensor in a non-folded camera in (a) top view and (b) side view;



FIG. 1B shows the aperture and lens barrel of FIG. 1A after a lateral shift in a first direction relative to the image sensor in (a) top view and (b) side view;



FIG. 1C shows the aperture and lens barrel of FIG. 1A after a lateral shift in a second direction orthogonal to the first direction relative to the image sensor in (a) top view and (b) side view;



FIG. 2A shows an exemplary embodiment of a folded camera disclosed herein in (a) top view and (b) side view;



FIG. 2B shows the aperture, prism and lens barrel of the camera of FIG. 2A after a lateral shift of the lens in a first direction relative to the image sensor in (a) top view and (b) side view;



FIG. 2C shows the aperture, prism and lens barrel of the camera of FIG. 2A after a shift of the prism in a second direction orthogonal to the first direction relative to the image sensor in (a) top view and (b) side view;



FIG. 2D shows the aperture, prism and the lens barrel of the camera of FIG. 2A after a shift of the lens in a second direction orthogonal to the first direction relative to the image sensor in (a) top view and (b) side view;



FIG. 3A shows schematically an example of five possible positions of the camera aperture relative to the image sensor, corresponding to five different frames captured by a camera of FIGS. 1 and/or 2;



FIG. 3B shows schematically an example of nine possible positions of the camera aperture relative to the image sensor, corresponding to nine different frames captured by a camera of FIGS. 1 and/or 2;



FIG. 3C shows schematically an example of three possible positions of the camera aperture relative to the image sensor, corresponding to three different frames captured by a camera of FIGS. 1 and/or 2;



FIG. 4 shows in a flow chart stages of an exemplary method that uses a synthetically increased camera aperture to obtain a shallow DOF and improved SNR.





DETAILED DESCRIPTION


FIGS. 1A-C show an exemplary embodiment of a camera 100 exhibiting a synthetically enlarged camera aperture disclosed herein. Camera 100 can be, for example, an upright camera. Camera 100 comprises a lens 104 and an image sensor 106. Camera 100 may further comprise other elements, known in the art, which are not shown for simplicity, such as a protective shield, an infra-red (IR) filter, a focusing mechanism (e.g. actuator), electrical connectivity, etc. Lens 104 may be, for example, a fixed focal length lens characterized by an effective focal length (EFL) and an aperture 102. Aperture 102 defines the aperture of camera 100, and the two terms (camera aperture and lens aperture) are used herein interchangeably. The ratio between the EFL and the lens aperture diameter is known as the camera or lens “f-number”. For example, the EFL of lens 104 may be in the range of 3-15 mm, the diameter of aperture 102 may be in the range of 1-6 mm, and the f-number of the lens may be in the range of 1.2-3.2. However, all these exemplary numbers (EFL, aperture diameter, f-number) are not limiting. Camera 100 also comprises a first actuator, which is not shown in the figures. The first actuator may move (shift, actuate) lens 104 and aperture 102 with respect to image sensor 106 in the X-Y plane (parallel to the image sensor plane). Camera 100 may also include a second actuator (not shown) which may move (shift, actuate) lens 104 in the Z direction with respect to image sensor 106 for focusing purposes. In an example embodiment, the first actuator may shift the lens in the X-Y plane with a moving range on the order of ±2 mm along each axis. The first and/or second actuators may be similar in structure to actuators used in a non-folded compact camera for OIS, but with a significantly increased range, from (typically) a few hundreds of microns to the order of ±2 mm along each axis.
In some embodiments, the first and/or second actuator may not be similar to an OIS actuator. The following non-limiting description gives an example using modified OIS actuators, with the underlying understanding that other methods of actuation may be used. The significantly increased movement range may be achieved by using current and known OIS technology (for example ball-bearing voice coil motor (VCM) technology) with larger magnets and coils and with longer rails, which allows a larger range of movement (e.g. a ±1-2 mm range of movement). Other actuation methods may also be applied, provided the lens can be shifted over the specified range with respect to the sensor (e.g. stepper motors, piezoelectric motors, shape memory alloy (SMA) actuators, etc.).


Returning now to FIGS. 1A-C, each figure shows the lens and lens aperture position vs. the image sensor from, respectively, a top view (a) and a schematic side view or 3D rendering (b). FIG. 1A shows upright camera 100 with aperture 102 and lens 104 centered vs. image sensor 106 in an “original position”. FIGS. 1B and 1C show camera 100 with aperture 102 and lens 104 shifted, respectively, in a first direction (+X) and in a second direction (−Y) relative to image sensor 106 (i.e. shifted in two orthogonal directions vs. an optical axis 110 parallel to the Z direction, along which light enters the lens toward image sensor 106). The shift of the lens from its original position in FIGS. 1B and 1C can be in the range of a few mm, for example 2 mm. The two positions in FIGS. 1B and 1C are only an example, and other shift directions and positions are also possible in the X-Y plane, for example a −X shift, a +Y shift and a shift in combined X and Y directions. In particular, note that while the shift directions described herein are orthogonal to each other, in some embodiments shifts may occur in directions that are not orthogonal to other shift directions.



FIGS. 2A-D show an exemplary embodiment of a folded camera 200 exhibiting a synthetically enlarged camera aperture disclosed herein. Each figure shows the aperture and lens position vs. the image sensor from, respectively, a top view (a) and a schematic side view or 3D rendering (b). Camera 200 includes a camera aperture 202, a lens 204, an image sensor 206 and an OPFE (for example a prism) 208. Like camera 100, camera 200 may include other elements, known in the art, which are not shown for simplicity. FIG. 2A shows the “original position” of camera aperture 202, centered on a first optical axis 210. In the folded camera, light entering the lens along first optical axis 210 is folded to continue along a second optical axis 212 toward image sensor 206. FIG. 2B shows lens 204 shifted relative to image sensor 206 in a first (e.g. +Y) direction. FIG. 2C shows camera aperture 202 and prism 208 shifted relative to lens 204 in a second direction (e.g. −X) which is orthogonal to the first direction. FIG. 2D shows lens 204 shifted relative to image sensor 206 in a third direction (e.g. +Z). The motion directions shown in FIGS. 2B-2D are only an example, and other shift directions may exist, in particular any linear combination of the shift directions shown.


In FIG. 2B, the lens is moved in a direction similar to that in FIG. 1B. In FIG. 2C, the lens is stationary, but the prism moves to create the same optical effect as in FIG. 1C. In FIG. 2D, the lens moves in a third direction, resulting in an optical effect similar to that in FIG. 2C.


An actuator (not shown) as disclosed in co-owned published international patent applications WO2016166730 (folded) and WO20160156996 (non-folded), may be modified to increase the movement (range) of the lens barrel from a few hundreds of microns to a movement on the order of ±2 mm, to enable shifting the aperture and/or lens along a first direction. The prism may be moved in a second direction, to bring it closer or further away from the edge of the lens, thus also shifting the camera aperture. In other words, in the exemplary coordinate systems shown in FIGS. 1 and 2, a “longer range” modified OIS actuator can shift the lens and camera aperture by up to about ±2 mm in the X-Y directions in a non-folded camera, and by up to about ±2 mm in the X-Y directions or in Y-Z directions in a folded camera. The prism can be moved for example using the same technology that is used to move the lens, but in a range that positions the prism closer/farther from the edge of the lens.


An exemplary method that uses a synthetically increased camera aperture to obtain a shallow DOF and improved SNR is provided with reference to FIG. 4. The exemplary method can be performed, for example, with a camera similar to camera 100 or camera 200:


Acquisition stage (FIG. 4, step 402): a plurality N of images of the same scene is acquired in rapid succession (for example in 10-50 ms per image, to prevent significant object movement during the acquisition process) while moving the aperture between images, so that each image is captured with the aperture at a different position. In an example of an exposure rate of 10-60 ms, N would typically be in the range of 3-9 frames. In an example of high illumination conditions, a high frame rate may be obtained such that N may be on the order of a few tens or even a few hundreds of frames.



FIG. 3A shows schematically an example of N=5 possible positions 302-310 of the aperture relative to image sensor 106, corresponding to N=5 different frames captured by the camera (e.g. camera 100 or 200). In other examples, N may differ from 5. FIG. 3B shows schematically an example of N=9 possible positions 312-328 of the aperture relative to image sensor 106, corresponding to N=9 different frames captured by the camera (e.g. camera 100 or 200). FIG. 3C shows schematically an example of N=3 possible positions 330-334 of the aperture relative to image sensor 106, corresponding to N=3 different frames captured by the camera (e.g. camera 100 or 200).


In an embodiment, the first or second actuator may be a closed-loop actuator, such that a position indication and a settling indication (i.e. arrival of the actuator at the target position) may be provided to the camera. In an embodiment, the exposure of the camera may be synchronized with the motion of the actuator, such that the acquisition of the sequence of N frames may be constructed from a repeated sequence of (1) frame exposure, (2) actuator motion to a new position, and (3) actuator settling. Actuator settling may be such that the actuator does not shift the lens during exposure by more than 1 pixel, 2 pixels or 5 pixels.
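The move/settle/expose sequence above can be sketched as a simple control loop. Everything in the sketch below is a mock: the `ToyActuator` model, its step size and tolerance, and the `acquire_stack` helper are hypothetical names invented for illustration, not interfaces from the disclosure. It only shows commanding a position, polling the closed-loop settling indication, and exposing once settled:

```python
class ToyActuator:
    """Mock closed-loop actuator: tracks a target and reports a settled indication."""
    def __init__(self, step_mm=0.5, tol_mm=0.01):
        self.position = 0.0
        self.target = 0.0
        self.step = step_mm   # max travel per control tick (assumed dynamics)
        self.tol = tol_mm     # settling tolerance

    def move_to(self, target_mm):
        self.target = target_mm

    def update(self):
        # One control tick: move a bounded step toward the target.
        delta = self.target - self.position
        self.position += max(-self.step, min(self.step, delta))

    def settled(self):
        return abs(self.target - self.position) <= self.tol

def acquire_stack(actuator, aperture_positions_mm, max_ticks=100):
    """Repeat: move -> wait for settling -> expose. Returns (position, frame) pairs."""
    frames = []
    for i, pos in enumerate(aperture_positions_mm):
        actuator.move_to(pos)
        ticks = 0
        while not actuator.settled():       # wait on the settling indication
            actuator.update()
            ticks += 1
            if ticks > max_ticks:
                raise RuntimeError("actuator failed to settle")
        frames.append((actuator.position, f"frame_{i}"))  # expose here
    return frames

stack = acquire_stack(ToyActuator(), [0.0, 2.0, -2.0, 0.0])
print([round(p, 2) for p, _ in stack])  # → [0.0, 2.0, -2.0, 0.0]
```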


Processing stage: this stage includes three steps:


a) Frame stack alignment (FIG. 4, step 404): continuing the example in step 402, the N (e.g. 5) frames are aligned according to a certain region of interest (ROI). Alignment may be performed using several methods. According to one example, a known frame alignment procedure such as image registration can be used. According to a second example, frames may be aligned by calculating the expected image shift resulting from the known lens or prism shift. According to a third example, data from an inertial measurement system (e.g. a gyroscope or accelerometer) may be used to calculate image shift caused by camera handshake. Lastly, a combination of some or all of the alignments above may be used. The ROI may be the same ROI that was used to determine the focus of the camera, may be an ROI indicated by a user via a user interface, or may be chosen automatically, by detecting the position of an object of interest (for example a face, detected by known face-detection methods) in the image and by choosing the ROI to include that object of interest. The alignment may compensate for image shifts. The alignment may also include correction of distortion that can be introduced by the shifting of the aperture. The alignment may include shifting the images, or may include cropping the images around some point, which may not be the center point of the image. After the alignment, all objects positioned in the same focus plane that corresponds to the object in the chosen ROI are aligned between the frames. Objects at distances outside that plane are misaligned.
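The second alignment example (undoing the image shift computed from the known lens or prism position) can be sketched with integer pixel shifts. This is a minimal illustration under assumed conditions: real alignment would use sub-pixel registration and distortion correction, and the toy frames, known shifts and numpy-based shifting below are not from the disclosure:

```python
import numpy as np

def align_stack(frames, known_shifts_px):
    """Undo the known per-frame (dy, dx) image shift so that the focus plane
    registers across frames; out-of-plane objects remain misaligned."""
    aligned = []
    for frame, (dy, dx) in zip(frames, known_shifts_px):
        aligned.append(np.roll(frame, shift=(-dy, -dx), axis=(0, 1)))
    return aligned

# Toy stack: an in-focus "object" (a single bright pixel) displaced per frame
# by the image shift expected from the known aperture position.
base = np.zeros((9, 9))
base[4, 4] = 1.0
shifts = [(0, 0), (0, 2), (2, 0)]   # expected shift of the focus plane per frame
frames = [np.roll(base, s, axis=(0, 1)) for s in shifts]
aligned = align_stack(frames, shifts)

# After alignment, the in-focus object sits at the same coordinate in every frame.
print(all(f[4, 4] == 1.0 for f in aligned))  # prints True
```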


b) Frame averaging (FIG. 4, step 406): the frames are averaged together, using for example a known procedure. In this averaging process, pixels that belong to aligned objects will not suffer any blur in the averaging (i.e. the resulting object in the output image will look the same as in any of the captured images, only with better SNR, approximately increased by a factor √N, where N is the number of averaged images), while pixels that belong to misaligned objects will be averaged with misaligned pixels and will therefore suffer blurring and will appear blurred in the output image. The blurring of objects outside the focus plane will result in a shallower DOF; the resulting DOF may be shallower by 10%, 20% or 50% than the original DOF (of a single frame). The same effect occurs optically when the lens has a much larger aperture: the object that is in the focus plane (i.e. at the depth position where the lens is focused) is sharp, and objects in out-of-focus planes (i.e. outside the depth position where the lens is focused) are blurry. Therefore, the proposed system synthetically enlarges the aperture. In some examples, the system may discard some of the N frames, such that discarded frames are not included in the averaging. Frames may be discarded, for example, due to blurry images, mis-focus of the ROI, etc.
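The √N SNR gain from averaging can be verified numerically. The sketch below uses synthetic, seeded noise on a flat patch (an illustration only, not the disclosure's pipeline): averaging N = 9 aligned frames reduces the noise standard deviation by roughly a factor of 3:

```python
import numpy as np

rng = np.random.default_rng(0)           # seeded for reproducibility
N = 9                                    # number of aligned frames
signal = np.full((100, 100), 100.0)      # a flat in-focus patch (assumed scene)
frames = [signal + rng.normal(0.0, 1.0, signal.shape) for _ in range(N)]

single_noise = np.std(frames[0] - signal)
averaged = np.mean(frames, axis=0)       # frame averaging (step 406)
avg_noise = np.std(averaged - signal)

# Noise drops by ~sqrt(N): for N = 9, roughly a factor of 3.
print(f"single-frame noise {single_noise:.2f} -> averaged noise {avg_noise:.2f}")
```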


c) Post processing (FIG. 4, step 408): extra stages of processing, such as refinement of the blur, adding additional blur, sharpening, etc., may be applied to the synthetically-increased-camera-aperture output image, using for example a known procedure. The output of the camera may include one or all of the original N frames in addition to the synthetically-increased-camera-aperture output image.


Processing steps 404-408 may be performed immediately after frame acquisition step 402, or at a later time. Processing steps 404-408 may be done in the host device (e.g. in the host CPU, GPU, etc.), or outside the host device (e.g. by cloud computing).


The added blur for objects outside the chosen plane of focus (defined by the object in the chosen ROI) will result in a shallow DOF effect in the output image, compared with any of the images in the stack (the input image to the algorithm). On the focused objects, the averaging of frames will result in better signal to noise ratio compared with any of the images in the stack.


While this disclosure describes a limited number of embodiments, it will be appreciated that many variations, modifications and other applications of such embodiments may be made. In general, the disclosure is to be understood as not limited by the specific embodiments described herein, but only by the scope of the appended claims.


All references mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual reference was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present application.

Claims
  • 1. A method, comprising: in a camera that includes a camera aperture, an image sensor and an actuator, operating the actuator to move the camera aperture relative to the image sensor to a plurality of different aperture positions;capturing a plurality of images, wherein each image of the plurality of images is captured at a respective different camera aperture position; andprocessing the plurality of images into a single image that exhibits a shallow depth of focus effect.
  • 2. The method of claim 1, wherein the actuator is an optical image stabilization (OIS) actuator.
  • 3. The method of claim 1, wherein each image capture includes a respective camera exposure synchronized with a respective camera aperture movement.
  • 4. The method of claim 1, wherein the actuator is a closed-loop actuator and wherein the method further comprises providing an aperture position and a settling indication.
  • 5. The method of claim 1, wherein the capturing of each of the plurality of images includes performing a sequence of frame exposure, movement of the actuator to a new position and actuator settling.
  • 6. The method of claim 1, wherein the camera is a folded camera.
  • 7. The method of claim 2, wherein the camera is a non-folded camera.
  • 8. The method of claim 2, wherein the OIS actuator has an extended actuation range.
  • 9. The method of claim 8, wherein the extended actuation range includes a range of up to ±2 mm.
  • 10. The method of claim 9, wherein the extended actuation range is between ±1-2 mm.
  • 11. A camera, comprising: a camera aperture;an image sensor; andan actuator operative to move the camera aperture relative to the image sensor to a plurality of different aperture positions,wherein the camera is operative to capture a plurality of images, wherein each image of the plurality of images is captured at a respective different aperture position, and to process the plurality of images into a single image that exhibits a shallow depth of focus effect.
  • 12. The camera of claim 11, wherein the actuator is an optical image stabilization (OIS) actuator.
  • 13. The camera of claim 11, wherein the actuator is a closed-loop actuator and wherein the camera is further operative to provide an aperture position and a settling indication.
  • 14. The camera of claim 11, wherein the camera is a folded camera.
  • 15. The camera of claim 11, wherein the camera is a non-folded camera.
  • 16. The camera of claim 12, wherein the OIS actuator has an extended actuation range.
  • 17. The camera of claim 16, wherein the extended actuation range includes a range of up to ±2 mm.
  • 18. The camera of claim 17, wherein the extended actuation range is between ±1-2 mm.
CROSS REFERENCE TO RELATED APPLICATIONS

This is a continuation application from U.S. patent application Ser. No. 16/121,049 filed Sep. 4, 2018 (now allowed), and is related to and claims the benefit of U.S. Provisional patent application 62/567,287 filed Oct. 3, 2017, which is incorporated herein by reference in its entirety.

US Referenced Citations (289)
Number Name Date Kind
4199785 McCullough et al. Apr 1980 A
5005083 Grage et al. Apr 1991 A
5032917 Aschwanden Jul 1991 A
5041852 Misawa et al. Aug 1991 A
5051830 von Hoessle Sep 1991 A
5099263 Matsumoto et al. Mar 1992 A
5248971 Mandl Sep 1993 A
5287093 Amano et al. Feb 1994 A
5394520 Hall Feb 1995 A
5436660 Sakamoto Jul 1995 A
5444478 Lelong et al. Aug 1995 A
5459520 Sasaki Oct 1995 A
5657402 Bender et al. Aug 1997 A
5682198 Katayama et al. Oct 1997 A
5768443 Michael et al. Jun 1998 A
5926190 Turkowski et al. Jul 1999 A
5940641 McIntyre et al. Aug 1999 A
5982951 Katayama et al. Nov 1999 A
6101334 Fantone Aug 2000 A
6128416 Oura Oct 2000 A
6148120 Sussman Nov 2000 A
6208765 Bergen Mar 2001 B1
6268611 Pettersson et al. Jul 2001 B1
6549215 Jouppi Apr 2003 B2
6611289 Yu et al. Aug 2003 B1
6643416 Daniels et al. Nov 2003 B1
6650368 Doron Nov 2003 B1
6680748 Monti Jan 2004 B1
6714665 Hanna et al. Mar 2004 B1
6724421 Glatt Apr 2004 B1
6738073 Park et al. May 2004 B2
6741250 Furlan et al. May 2004 B1
6750903 Miyatake et al. Jun 2004 B1
6778207 Lee et al. Aug 2004 B1
7002583 Rabb, III Feb 2006 B2
7015954 Foote et al. Mar 2006 B1
7038716 Klein et al. May 2006 B2
7199348 Olsen et al. Apr 2007 B2
7206136 Labaziewicz et al. Apr 2007 B2
7248294 Slatter Jul 2007 B2
7256944 Labaziewicz et al. Aug 2007 B2
7305180 Labaziewicz et al. Dec 2007 B2
7339621 Fortier Mar 2008 B2
7346217 Gold, Jr. Mar 2008 B1
7365793 Cheatle et al. Apr 2008 B2
7411610 Doyle Aug 2008 B2
7424218 Baudisch et al. Sep 2008 B2
7509041 Hosono Mar 2009 B2
7533819 Barkan et al. May 2009 B2
7619683 Davis Nov 2009 B2
7738016 Toyofuku Jun 2010 B2
7773121 Huntsberger et al. Aug 2010 B1
7809256 Kuroda et al. Oct 2010 B2
7880776 LeGall et al. Feb 2011 B2
7918398 Li et al. Apr 2011 B2
7964835 Olsen et al. Jun 2011 B2
7978239 Deever et al. Jul 2011 B2
8115825 Culbert et al. Feb 2012 B2
8149327 Lin et al. Apr 2012 B2
8154610 Jo et al. Apr 2012 B2
8238695 Davey et al. Aug 2012 B1
8274552 Dahi et al. Sep 2012 B2
8390729 Long Mar 2013 B2
8391697 Cho et al. Mar 2013 B2
8400555 Georgiev et al. Mar 2013 B1
8439265 Ferren et al. May 2013 B2
8446484 Muukki et al. May 2013 B2
8483452 Ueda et al. Jul 2013 B2
8514491 Duparre Aug 2013 B2
8547389 Hoppe et al. Oct 2013 B2
8553106 Scarff Oct 2013 B2
8587691 Takane Nov 2013 B2
8619148 Watts et al. Dec 2013 B1
8803990 Smith Aug 2014 B2
8896655 Mauchly et al. Nov 2014 B2
8976255 Matsuoto et al. Mar 2015 B2
9019387 Nakano Apr 2015 B2
9025073 Attar et al. May 2015 B2
9025077 Attar et al. May 2015 B2
9041835 Honda May 2015 B2
9137447 Shibuno Sep 2015 B2
9185291 Shabtay et al. Nov 2015 B1
9215377 Sokeila et al. Dec 2015 B2
9215385 Luo Dec 2015 B2
9270875 Brisedoux et al. Feb 2016 B2
9286680 Jiang et al. Mar 2016 B1
9344626 Silverstein et al. May 2016 B2
9360671 Zhou Jun 2016 B1
9369621 Malone et al. Jun 2016 B2
9413930 Geerds Aug 2016 B2
9413984 Attar et al. Aug 2016 B2
9420180 Jin Aug 2016 B2
9438792 Nakada et al. Sep 2016 B2
9485432 Medasani et al. Nov 2016 B1
9578257 Attar et al. Feb 2017 B2
9618748 Munger et al. Apr 2017 B2
9681057 Attar et al. Jun 2017 B2
9723220 Sugie Aug 2017 B2
9736365 Laroia Aug 2017 B2
9736391 Du et al. Aug 2017 B2
9768310 Ahn et al. Sep 2017 B2
9800798 Ravirala et al. Oct 2017 B2
9851803 Fisher et al. Dec 2017 B2
9894287 Qian et al. Feb 2018 B2
9900522 Lu Feb 2018 B2
9927600 Goldenberg et al. Mar 2018 B2
10951834 Cohen Mar 2021 B2
20020005902 Yuen Jan 2002 A1
20020030163 Zhang Mar 2002 A1
20020063711 Park et al. May 2002 A1
20020075258 Park et al. Jun 2002 A1
20020122113 Foote Sep 2002 A1
20020167741 Koiwai et al. Nov 2002 A1
20030030729 Prentice et al. Feb 2003 A1
20030093805 Gin May 2003 A1
20030160886 Misawa et al. Aug 2003 A1
20030202113 Yoshikawa Oct 2003 A1
20040008773 Itokawa Jan 2004 A1
20040012683 Yamasaki et al. Jan 2004 A1
20040017386 Liu et al. Jan 2004 A1
20040027367 Pilu Feb 2004 A1
20040061788 Bateman Apr 2004 A1
20040141065 Hara et al. Jul 2004 A1
20040141086 Mihara Jul 2004 A1
20040240052 Minefuji et al. Dec 2004 A1
20050013509 Samadani Jan 2005 A1
20050046740 Davis Mar 2005 A1
20050157184 Nakanishi et al. Jul 2005 A1
20050168834 Matsumoto et al. Aug 2005 A1
20050185049 Iwai et al. Aug 2005 A1
20050200718 Lee Sep 2005 A1
20060054782 Olsen et al. Mar 2006 A1
20060056056 Ahiska et al. Mar 2006 A1
20060067672 Washisu et al. Mar 2006 A1
20060102907 Lee et al. May 2006 A1
20060125937 LeGall et al. Jun 2006 A1
20060170793 Pasquarette et al. Aug 2006 A1
20060175549 Miller et al. Aug 2006 A1
20060187310 Janson et al. Aug 2006 A1
20060187322 Janson et al. Aug 2006 A1
20060187338 May et al. Aug 2006 A1
20060227236 Pak Oct 2006 A1
20070024737 Nakamura et al. Feb 2007 A1
20070126911 Nanjo Jun 2007 A1
20070177025 Kopet et al. Aug 2007 A1
20070188653 Pollock et al. Aug 2007 A1
20070189386 Imagawa et al. Aug 2007 A1
20070257184 Olsen et al. Nov 2007 A1
20070285550 Son Dec 2007 A1
20080017557 Witdouck Jan 2008 A1
20080024614 Li et al. Jan 2008 A1
20080025634 Border et al. Jan 2008 A1
20080030592 Border et al. Feb 2008 A1
20080030611 Jenkins Feb 2008 A1
20080084484 Ochi et al. Apr 2008 A1
20080106629 Kurtz et al. May 2008 A1
20080117316 Orimoto May 2008 A1
20080129831 Cho et al. Jun 2008 A1
20080218611 Parulski et al. Sep 2008 A1
20080218612 Border et al. Sep 2008 A1
20080218613 Janson et al. Sep 2008 A1
20080219654 Border et al. Sep 2008 A1
20090086074 Li et al. Apr 2009 A1
20090109556 Shimizu et al. Apr 2009 A1
20090122195 Van Baar et al. May 2009 A1
20090122406 Rouvinen et al. May 2009 A1
20090128644 Camp et al. May 2009 A1
20090219547 Kauhanen et al. Sep 2009 A1
20090252484 Hasuda et al. Oct 2009 A1
20090295949 Ojala Dec 2009 A1
20090324135 Kondo et al. Dec 2009 A1
20100013906 Border et al. Jan 2010 A1
20100020221 Tupman et al. Jan 2010 A1
20100060746 Olsen et al. Mar 2010 A9
20100097444 Lablans Apr 2010 A1
20100103194 Chen et al. Apr 2010 A1
20100165131 Makimoto et al. Jul 2010 A1
20100196001 Ryynänen et al. Aug 2010 A1
20100238327 Griffith et al. Sep 2010 A1
20100259836 Kang et al. Oct 2010 A1
20100283842 Guissin et al. Nov 2010 A1
20100321494 Peterson et al. Dec 2010 A1
20110058320 Kim et al. Mar 2011 A1
20110063417 Peters et al. Mar 2011 A1
20110063446 McMordie et al. Mar 2011 A1
20110064327 Dagher et al. Mar 2011 A1
20110080487 Venkataraman et al. Apr 2011 A1
20110128288 Petrou et al. Jun 2011 A1
20110164172 Shintani et al. Jul 2011 A1
20110229054 Weston et al. Sep 2011 A1
20110234798 Chou Sep 2011 A1
20110234853 Hayashi et al. Sep 2011 A1
20110234881 Wakabayashi et al. Sep 2011 A1
20110242286 Pace et al. Oct 2011 A1
20110242355 Goma et al. Oct 2011 A1
20110298966 Kirschstein et al. Dec 2011 A1
20120026366 Golan et al. Feb 2012 A1
20120044372 Cote et al. Feb 2012 A1
20120062780 Morihisa Mar 2012 A1
20120069235 Imai Mar 2012 A1
20120075489 Nishihara Mar 2012 A1
20120105579 Jeon et al. May 2012 A1
20120124525 Kang May 2012 A1
20120154547 Aizawa Jun 2012 A1
20120154614 Moriya et al. Jun 2012 A1
20120196648 Havens et al. Aug 2012 A1
20120229663 Nelson et al. Sep 2012 A1
20120249815 Bohn et al. Oct 2012 A1
20120287315 Huang et al. Nov 2012 A1
20120320467 Baik et al. Dec 2012 A1
20130002928 Imai Jan 2013 A1
20130016427 Sugawara Jan 2013 A1
20130063629 Webster et al. Mar 2013 A1
20130076922 Shihoh et al. Mar 2013 A1
20130093842 Yahata Apr 2013 A1
20130094126 Rappoport et al. Apr 2013 A1
20130113894 Mirlay May 2013 A1
20130135445 Dahi et al. May 2013 A1
20130155176 Paripally et al. Jun 2013 A1
20130182150 Asakura Jul 2013 A1
20130201360 Song Aug 2013 A1
20130202273 Ouedraogo et al. Aug 2013 A1
20130235224 Park et al. Sep 2013 A1
20130250150 Malone et al. Sep 2013 A1
20130258044 Betts-LaCroix Oct 2013 A1
20130270419 Singh et al. Oct 2013 A1
20130278785 Nomura et al. Oct 2013 A1
20130321668 Kamath Dec 2013 A1
20140009631 Topliss Jan 2014 A1
20140049615 Uwagawa Feb 2014 A1
20140118584 Lee et al. May 2014 A1
20140192238 Attar et al. Jul 2014 A1
20140192253 Laroia Jul 2014 A1
20140218587 Shah Aug 2014 A1
20140313316 Olsson et al. Oct 2014 A1
20140362242 Takizawa Dec 2014 A1
20150002683 Hu et al. Jan 2015 A1
20150042870 Chan et al. Feb 2015 A1
20150070781 Cheng et al. Mar 2015 A1
20150092066 Geiss et al. Apr 2015 A1
20150103147 Ho et al. Apr 2015 A1
20150138381 Ahn May 2015 A1
20150154776 Zhang et al. Jun 2015 A1
20150162048 Hirata et al. Jun 2015 A1
20150195458 Nakayama et al. Jul 2015 A1
20150215516 Dolgin Jul 2015 A1
20150237280 Choi et al. Aug 2015 A1
20150242994 Shen Aug 2015 A1
20150244906 Wu et al. Aug 2015 A1
20150253543 Mercado Sep 2015 A1
20150253647 Mercado Sep 2015 A1
20150261299 Wajs Sep 2015 A1
20150271471 Hsieh et al. Sep 2015 A1
20150281678 Park et al. Oct 2015 A1
20150286033 Osborne Oct 2015 A1
20150316744 Chen Nov 2015 A1
20150334309 Peng et al. Nov 2015 A1
20160044250 Shabtay et al. Feb 2016 A1
20160070088 Koguchi Mar 2016 A1
20160154202 Wippermann et al. Jun 2016 A1
20160154204 Lim et al. Jun 2016 A1
20160212358 Shikata Jul 2016 A1
20160212418 Demirdjian et al. Jul 2016 A1
20160241751 Park Aug 2016 A1
20160291295 Shabtay et al. Oct 2016 A1
20160295112 Georgiev et al. Oct 2016 A1
20160301840 Du et al. Oct 2016 A1
20160353008 Osborne Dec 2016 A1
20160353012 Kao et al. Dec 2016 A1
20170019616 Zhu et al. Jan 2017 A1
20170070731 Darling et al. Mar 2017 A1
20170187962 Lee et al. Jun 2017 A1
20170214846 Du et al. Jul 2017 A1
20170214866 Zhu et al. Jul 2017 A1
20170242225 Fiske Aug 2017 A1
20170289458 Song et al. Oct 2017 A1
20180013944 Evans, V et al. Jan 2018 A1
20180017844 Yu et al. Jan 2018 A1
20180024329 Goldenberg et al. Jan 2018 A1
20180059379 Chou Mar 2018 A1
20180120674 Avivi et al. May 2018 A1
20180150973 Tang et al. May 2018 A1
20180176426 Wei et al. Jun 2018 A1
20180198897 Tang et al. Jul 2018 A1
20180241922 Baldwin et al. Aug 2018 A1
20180295292 Lee et al. Oct 2018 A1
20180300901 Wakai et al. Oct 2018 A1
20190121103 Bachar et al. Apr 2019 A1
20200103726 Shabtay Apr 2020 A1
Foreign Referenced Citations (39)
Number Date Country
101276415 Oct 2008 CN
201514511 Jun 2010 CN
102739949 Oct 2012 CN
103024272 Apr 2013 CN
103841404 Jun 2014 CN
1536633 Jun 2005 EP
1780567 May 2007 EP
2523450 Nov 2012 EP
S59191146 Oct 1984 JP
04211230 Aug 1992 JP
H07318864 Dec 1995 JP
08271976 Oct 1996 JP
2002010276 Jan 2002 JP
2003298920 Oct 2003 JP
2004133054 Apr 2004 JP
2004245982 Sep 2004 JP
2005099265 Apr 2005 JP
2006238325 Sep 2006 JP
2007228006 Sep 2007 JP
2007306282 Nov 2007 JP
2008076485 Apr 2008 JP
2010204341 Sep 2010 JP
2011085666 Apr 2011 JP
2013106289 May 2013 JP
20070005946 Jan 2007 KR
20090058229 Jun 2009 KR
20100008936 Jan 2010 KR
20140014787 Feb 2014 KR
101477178 Dec 2014 KR
20140144126 Dec 2014 KR
20150118012 Oct 2015 KR
2000027131 May 2000 WO
2004084542 Sep 2004 WO
2006008805 Jan 2006 WO
2010122841 Oct 2010 WO
2014072818 May 2014 WO
2017025822 Feb 2017 WO
2017037688 Mar 2017 WO
2018130898 Jul 2018 WO
Non-Patent Literature Citations (16)
Entry
Statistical Modeling and Performance Characterization of a Real-Time Dual Camera Surveillance System, Greiffenhagen et al., Publisher: IEEE, 2000, 8 pages.
A 3MPixel Multi-Aperture Image Sensor with 0.7 μm Pixels in 0.11 μm CMOS, Fife et al., Stanford University, 2008, 3 pages.
Dual camera intelligent sensor for high definition 360 degrees surveillance, Scotti et al., Publisher: IET, May 9, 2000, 8 pages.
Dual-sensor foveated imaging system, Hua et al., Publisher: Optical Society of America, Jan. 14, 2008, 11 pages.
Defocus Video Matting, McGuire et al., Publisher: ACM SIGGRAPH, Jul. 31, 2005, 11 pages.
Compact multi-aperture imaging with high angular resolution, Santacana et al., Publisher: Optical Society of America, 2015, 10 pages.
Multi-Aperture Photography, Green et al., Publisher: Mitsubishi Electric Research Laboratories, Inc., Jul. 2007, 10 pages.
Multispectral Bilateral Video Fusion, Bennett et al., Publisher: IEEE, May 2007, 10 pages.
Super-resolution imaging using a camera array, Santacana et al., Publisher: Optical Society of America, 2014, 6 pages.
Optical Splitting Trees for High-Precision Monocular Imaging, McGuire et al., Publisher: IEEE, 2007, 11 pages.
High Performance Imaging Using Large Camera Arrays, Wilburn et al., Publisher: Association for Computing Machinery, Inc., 2005, 12 pages.
Real-time Edge-Aware Image Processing with the Bilateral Grid, Chen et al., Publisher: ACM SIGGRAPH, 2007, 9 pages.
Superimposed multi-resolution imaging, Carles et al., Publisher: Optical Society of America, 2017, 13 pages.
Viewfinder Alignment, Adams et al., Publisher: EUROGRAPHICS, 2008, 10 pages.
Dual-Camera System for Multi-Level Activity Recognition, Bodor et al., Publisher: IEEE, Oct. 2014, 6 pages.
Engineered to the task: Why camera-phone cameras are different, Giles Humpston, Publisher: Solid State Technology, Jun. 2009, 3 pages.
Related Publications (2)
Number Date Country
20210168037 A1 Jun 2021 US
20230171368 A9 Jun 2023 US
Provisional Applications (1)
Number Date Country
62567287 Oct 2017 US
Continuations (1)
Number Date Country
Parent 16121049 Sep 2018 US
Child 17165324 US