Examples disclosed herein relate in general to digital cameras and in particular to correction of images obtained with folded digital cameras.
Compact digital cameras having folded optics, also referred to as “folded cameras”, are known; see e.g. co-owned international patent application PCT/IB2016/057366. In handheld electronic devices (also referred to herein as “handheld devices”) such as smartphones, tablets, etc., a folded Tele camera is often part of a multi-camera setup and accompanied by one or more additional cameras, e.g. an Ultra-Wide camera and a Wide camera. An Ultra-Wide camera has a larger field of view (FOVUW) than a Wide camera, which in turn has a larger FOVW than a Tele camera having FOVT (assuming similar image sensor sizes).
Images are acquired from a certain point of view (POV) of a camera. The POV is the direction defined by the vector that has the location of a camera's aperture as starting point and an object point at the center of the FOV as end point (see
As e.g. described in the co-owned PCT Patent Application No. PCT/IB2016/057366 and with reference to
Modern cameras that are included in handheld devices often include optical image stabilization (OIS) for mitigating undesired camera motion caused by a user's hand motion (often referred to as “hand-shake”). For OIS, optical components are moved to reduce movements of imaged objects on a camera's image sensor. The lens module and/or the image sensor and/or the OPFE and/or the entire camera can be moved. An inertial measurement unit (IMU) included in the handheld device provides motion data along 6 degrees of freedom, namely and with reference to
There is a need for and it would be advantageous to have OIS for scanning Tele cameras.
Henceforth and for simplicity, the terms “electronic device”, “electronic handheld device”, “handheld device” or just “device” are used interchangeably. Henceforth and for simplicity, the term “smartphone” may be used to represent all electronic handheld devices having scanning folded cameras and implementing methods for OIS in such cameras described herein.
In various embodiments, there are provided Tele folded cameras operative to compensate for an undesired rotational motion of a handheld electronic device that includes such a camera, wherein the compensation depends on the undesired rotational motion and on a point of view of the Tele folded camera.
In various embodiments, a handheld electronic device comprises: a Tele folded camera comprising an OPFE for folding light from a first optical path that forms an angle of less than 90 degrees to a normal of the device toward a second optical path substantially orthogonal to the normal of the device, a lens with a lens optical axis along the second optical path, and an image sensor, wherein the device is a handheld electronic device; an OPFE actuator for tilting the OPFE in one or more directions to direct a point of view (POV) of the Tele folded camera towards a segment of a scene; a motion sensor for sensing an undesired rotational motion of the device; and
at least one actuator for moving at least one component of the Tele folded camera to compensate for the undesired rotational motion of the device, wherein the compensation depends on the undesired rotational motion of the device and on the Tele folded camera POV.
In some embodiments, the undesired rotational motion is around the device normal.
In some embodiments, a device further comprises a Wide camera having a field of view FOVW larger than a field of view FOVT of the Tele camera.
In some embodiments, the sensing of the rotational motion includes measuring the rotational motion in three directions.
In some embodiments, the actuator for moving the component of the Tele folded camera to compensate for the device's undesired rotational motion is the OPFE actuator for tilting the OPFE in one or more directions to direct a point of view (POV) of the Tele folded camera towards a segment of a scene.
In some embodiments, the moving of the component of the Tele folded camera to compensate for the device's undesired rotational motion includes moving the lens.
In some embodiments, the moving of the component of the Tele folded camera to compensate for the device's undesired rotational motion includes moving the image sensor.
In some embodiments, a device further comprises a processing unit configured to perform a coordinate transformation to align coordinates of the Tele camera with coordinates of the handheld device or vice versa.
In some embodiments, a device further comprises a processing unit configured to perform a coordinate transformation that aligns coordinates of a reference coordinate system with coordinates of the handheld device and coordinates of the Tele camera.
In some embodiments, the coordinate transformation is performed using Rodrigues' rotation formula.
In some embodiments, the motion sensor includes an inertial measurement unit (IMU).
In some embodiments, a device further comprises a microcontroller unit (MCU) configured to read out the motion sensor and to provide control signals to the rotational motion compensation actuator. In some embodiments, the MCU is included in an application processor (AP).
In some embodiments, a device further comprises an application processor configured to provide POV control signals to the OPFE actuator for tilting the OPFE.
In various embodiments, there are provided methods comprising: providing a handheld device comprising a Tele folded camera that includes an OPFE for folding light from a first optical axis that forms an angle of less than 90 degrees to a normal of the device toward a second optical axis substantially orthogonal to a normal of the device, a lens with a lens axis along the second optical axis, and an image sensor; providing an OPFE actuator for tilting the OPFE in one or more directions to direct a point of view (POV) of the Tele folded camera towards a segment of a scene; sensing an undesired rotational motion of the device; and compensating for the undesired rotational motion, wherein the compensation depends on the undesired rotational motion and on the Tele folded camera's POV.
In some embodiments, the compensating for the undesired rotational motion includes moving a component of the Tele folded camera.
In some embodiments, the compensating for the undesired rotational motion includes compensating for a rotational motion around the device's normal direction.
In some embodiments, a method further comprises performing a coordinate transformation to align coordinates of the Tele camera with coordinates of an IMU.
In some embodiments, a method further comprises performing a coordinate transformation to align coordinates of the IMU with coordinates of the Tele camera.
In some embodiments, a method further comprises performing a coordinate transformation to align coordinates of a reference coordinate system with coordinates of the IMU and coordinates of the Tele camera.
In some embodiments, the performing the coordinate transformation includes performing the transformation using Rodrigues' rotation formula.
In some embodiments, the sensing of an undesired rotational motion of the device includes sensing the undesired rotational motion in three directions.
In some embodiments, the compensating for the undesired rotational motion of the device includes rotating the OPFE.
In some embodiments, the compensating for the undesired rotational motion of the device includes moving the lens.
In some embodiments, the compensating for the undesired rotational motion of the device includes moving the image sensor.
In some embodiments, the compensating for the undesired rotational motion includes calculating a changed POV caused by the undesired rotational motion in the X direction by using the equation: PFP = PI·cos(hnd_pitch) + cross(PI, RP)·sin(hnd_pitch) + RP·dot(PI, RP)·(1-cos(hnd_pitch)).
In some embodiments, the compensating for the undesired rotational motion includes calculating a changed POV caused by the undesired rotational motion in the Y direction by using the equation: PFY = PI·cos(hnd_yaw) + cross(PI, RY)·sin(hnd_yaw) + RY·dot(PI, RY)·(1-cos(hnd_yaw)).
In some embodiments, the compensating for the undesired rotational motion includes calculating a changed POV caused by the undesired rotational motion in the Z direction by using the equation: PFR = PI·cos(hnd_roll) + cross(PI, RR)·sin(hnd_roll) + RR·dot(PI, RR)·(1-cos(hnd_roll)).
In some embodiments, the compensating for the undesired rotational motion includes calculating a direction of a changed POV caused by the undesired rotational motion in the X, Y and Z directions together by using the equation: PF′ = PI + (PI-PFP) + (PI-PFY) + (PI-PFR).
In some embodiments, the compensating for the undesired rotational motion includes calculating a vector of a changed POV caused by the undesired rotational motion in the X, Y and Z directions together by using the equation: PF″ = PF′·EFLT/PF′z.
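As a non-limiting illustration only, the per-axis rotations, their combination and the focal-length scaling described above may be sketched in code. The vector values, rotation axes RP/RY/RR, hand-shake angles and EFL value below are hypothetical, and the cross-product argument order mirrors the equations as written:

```python
import math

def cross(a, b):
    # Cross product a x b of 3-vectors
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def rodrigues(p, axis, ang):
    # Per-axis rotation in the form used by the equations above:
    # p*cos(ang) + cross(p, axis)*sin(ang) + axis*dot(p, axis)*(1 - cos(ang))
    c, s = math.cos(ang), math.sin(ang)
    cp = cross(p, axis)
    return tuple(p[i]*c + cp[i]*s + axis[i]*dot(p, axis)*(1 - c) for i in range(3))

# Hypothetical IMU rotation axes and initial POV vector
RP, RY, RR = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)
PI = (0.1, 0.2, 1.0)
hnd_pitch, hnd_yaw, hnd_roll = 1e-3, -2e-3, 5e-4  # small hand-shake angles (rad)

PFP = rodrigues(PI, RP, hnd_pitch)
PFY = rodrigues(PI, RY, hnd_yaw)
PFR = rodrigues(PI, RR, hnd_roll)

# Direction of the changed POV for all three rotations together (PF')
PF1 = tuple(PI[i] + (PI[i] - PFP[i]) + (PI[i] - PFY[i]) + (PI[i] - PFR[i])
            for i in range(3))

# Scale so the Z component equals the Tele effective focal length (PF'')
EFL_T = 15.0  # hypothetical EFL value
PF2 = tuple(v * EFL_T / PF1[2] for v in PF1)
```

A zero rotation angle leaves the POV vector unchanged, which provides a quick sanity check on the formula.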
Non-limiting examples of embodiments disclosed herein are described below with reference to figures attached hereto that are listed following this paragraph. The drawings and descriptions are meant to illuminate and clarify embodiments disclosed herein, and should not be considered limiting in any way. Like elements in different drawings may be indicated by like numerals.
In a first exemplary method for OIS (“Example 1”), consider OIS for Wide camera 204 that (for the sake of simplicity) may correct for pitch rotation only. To detect the amount of undesired hand motion, one may read out the value for pitch rotation around the X axis from the IMU (“XIMU”) and move e.g. the lens in one particular direction (dir1) by a particular amount, wherein the amount (or stroke) of movement is proportional to XIMU, i.e. the lens stroke SW fulfills SW = CW·XIMU (with some constant CW). The same holds for OIS of STC 202 at the zero position. By moving the lens by ST = CT·XIMU (with some constant CT) in dir1, the hand motion is compensated.
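As an illustration of Example 1's proportional control, a minimal sketch follows; the constants C_W and C_T and the rotation value are hypothetical:

```python
# Minimal sketch of Example 1's proportional OIS control (pitch only):
# the lens stroke is proportional to the pitch rotation read from the IMU.
# C_W and C_T are hypothetical camera-specific constants.

def lens_stroke(x_imu, c):
    """Lens stroke proportional to the measured pitch rotation x_imu."""
    return c * x_imu

C_W = 0.12   # hypothetical constant for the Wide camera
C_T = 0.45   # hypothetical constant for the STC at zero position

s_w = lens_stroke(0.001, C_W)  # stroke compensating a 0.001 rad pitch hand-shake
s_t = lens_stroke(0.001, C_T)
```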
Consider Example 1 (hand motion in pitch direction) with STC 202 at a non-zero position. OIS for Wide camera 204 may be performed as in Example 1. However, for OIS of STC 202, the method of Example 1 no longer allows hand motion compensation, i.e. there is (in general) no CT such that moving the Tele lens by ST = CT·XIMU compensates the hand motion. This is because the coordinate systems of STC 202 and the IMU are no longer aligned.
For a second exemplary method for OIS (“Example 2”), refer to
In a first step 302, a command triggered by a human user or by a program and processed by a FOV scanner 442 (
In step 304, the IMU is read out and provides rotational movements around the Pitch, Yaw and Roll directions, i.e. XIMU, YIMU and ZIMU respectively. Usually, an IMU provides angular rate data, which must be integrated to determine the rotation angle. The IMU data may be used to calculate the undesired rotational motion of the device.
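The integration of IMU angular-rate samples into a rotation angle mentioned above may be sketched as follows; the function name, sample values and sample period are assumptions for illustration:

```python
# Minimal sketch: integrating IMU angular-rate samples (rad/s) into a rotation
# angle (rad) by trapezoidal integration; all values are hypothetical.

def integrate_rate(samples, dt):
    """Trapezoidal integration of angular-rate samples over a fixed period dt."""
    angle = 0.0
    for r0, r1 in zip(samples, samples[1:]):
        angle += 0.5 * (r0 + r1) * dt
    return angle

# A constant 0.02 rad/s rate sampled every 1 ms for 10 ms gives ~0.0002 rad
pitch_angle = integrate_rate([0.02] * 11, 0.001)
```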
In step 306, a coordinate transformation is performed. The coordinate transformation is required because the STC's POV change caused by an undesired rotational motion of the device and the sensing of the undesired rotational motion occur in different coordinate systems.
A processing unit such as an AP or a MCU may be configured for performing the coordinate transformation (e.g. AP 440 of device 400 or device 480, or MCU 470 of device 400 in
In some examples, the transformation may be performed in order to express the coordinates of the STC in the coordinate system of the IMU. Device rotations and compensation motions may then be calculated in the IMU's coordinate system.
In some examples, a 2D chart such as chart 320 shown in
If the STC is directed to a non-zero POV, a coordinate transformation from the IMU's coordinates to the STC's coordinates must be performed. In some examples, Rodrigues' rotation formula may be used. The IMU's pitch/yaw/roll rotation values may be named “hnd_pitch”, “hnd_yaw” and “hnd_roll”. The IMU provides hnd_pitch, hnd_yaw and hnd_roll in a coordinate system having the following unit vectors:
In general, OIS corrects small angles only. Therefore, in some situations and approximately, one may treat the pitch/yaw/roll rotations independently. For any (slight) rotation of the device, Rodrigues' rotation formula may be applied to pitch/yaw/roll rotations independently, wherein the (slight) rotation may be represented by the sum over the pitch/yaw/roll rotations. A hand motion only by hnd_pitch, or only by hnd_yaw or only by hnd_roll (in the IMU's coordinates RP, RY and RR) applied to any initial POV vector PI may result in the following final POV vector PF (“cross(x, y)” indicates the cross product of vectors x and y, “dot(x, y)” indicates the dot product of vectors x and y):
From the above equations it is evident that, for compensating undesired rotational hand motion in a STC, one must compensate rotational hand motion around all three directions: yaw, pitch and roll. This is in contrast to a non-scanning camera such as Wide camera 204, where one may compensate the undesired rotational hand motion around yaw and pitch only.
In other examples for coordinate transformation, the transformation may be performed to express the coordinates of the IMU in the coordinate system of the STC. Hand motion rotations and compensation motions may then be calculated in the STC's coordinate system. As above, Rodrigues' rotation formula may be used.
In yet other examples for coordinate transformation, the transformation may be to a third coordinate system (“reference system”). Both the coordinates of the STC and of the IMU are expressed in the reference coordinate system. Hand motion rotations and compensation motions may then be calculated in the reference coordinate system. As above, Rodrigues' rotation formula may be used.
In step 308, movement for OIS may be performed. In some examples, OIS may be performed by moving the STC's OPFE. In other examples, a lens such as lens 102 and/or an image sensor such as image sensor 106 may be moved for OIS. Assuming ideal OIS, the movement of the OPFE and/or lens and/or sensor may lead to a POV vector modification POIS that exactly cancels the effect of the hand motion on the POV vector, i.e.: PF + POIS = PI. Thus, after performing step 308 the STC is again directed towards PI. In other examples, the entire STC may be moved for OIS, i.e. the OPFE, lens and image sensor are moved together as one unit.
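The ideal-OIS condition PF + POIS = PI may be illustrated with a minimal sketch; all vector values below are hypothetical:

```python
# Minimal sketch of ideal OIS: the compensating POV modification P_OIS exactly
# cancels the hand-motion effect, i.e. PF + P_OIS = PI (values hypothetical).

PI = (0.1, 0.2, 1.0)            # POV vector before the hand motion
PF = (0.1004, 0.1998, 1.0001)   # POV vector after the hand motion

P_OIS = tuple(pi - pf for pi, pf in zip(PI, PF))       # required compensation
restored = tuple(pf + po for pf, po in zip(PF, P_OIS)) # equals PI up to rounding
```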
In some embodiments, steps 304-308 may be repeated for stabilizing the STC continuously. The OIS cycles that include steps 304-308 may be performed at frequencies of e.g. 500 Hz-100 kHz. STC images or image streams are captured while the above OIS method is performed.
In some embodiments, an IMU may be fixedly attached to the OPFE, so that when moving the OPFE, the IMU moves accordingly, too. This allows for using coordinate systems having identical basis vectors for both the STC and the IMU, so that the coordinate transform of step 306 is not required.
In some embodiments, a sensor actuator may actuate the image sensor for correcting POV aberrations of a STC image. As described in the co-owned international patent application PCT/IB2021/056311, a STC image undergoes POV aberrations. One aberration is a rotation of the STC image on the image sensor (“rotational POV aberration”). When an undesired rotational hand motion is compensated by moving an OPFE as disclosed herein, the moving of the OPFE introduces a POV aberration. A sensor actuator may be used to rotate an image sensor around a normal of the image sensor for compensating the rotational POV aberration.
In other examples, device 400 may comprise a STC that includes two OPFEs as well as an OPFE actuator for each of the two OPFEs. In some examples, the OPFE actuators may actuate the OPFEs for performing OIS as disclosed herein. In other examples, a lens actuator may actuate a lens, or a sensor actuator may actuate a sensor, for performing OIS as disclosed herein. A STC based on two OPFEs is described for example in PCT/IB2021/054186. In such a STC, the optical path within the camera is folded twice, so it may be referred to as a double-folded scanning Tele camera.
Handheld device 400 further comprises a Wide (or Ultra-Wide) camera module 430 which includes a second lens module 434 that forms an image recorded by a second image sensor 432. A second lens actuator 436 may move lens module 434 for focusing and/or OIS. In some examples, the STC can scan the entire FOVW or an even larger FOV. In other examples, the STC can scan a FOV that is smaller than FOVW.
In some examples, object tracker 448 may be configured to track an object in FOVW and provide tracking data to FOV scanner 442 and/or OIS controller 444. Based on the tracking data, FOV scanner 442 and/or OIS controller 444 may provide control signals to OPFE actuator 414 which actuate an OPFE rotation for tracking an object with the STC. As an example, one may track an object so that it is located at the center of FOVT. Examples 3-7 described below refer to this tracking scenario, where the Wide camera image data is used to provide tracking information which triggers Tele FOV scanning.
In some examples, tracking information and OIS information may interfere and coordination between tracking and OIS may be required for achieving a desired object tracking and/or OIS outcome.
As a third exemplary method for OIS, consider a device such as device 400 or 480 including a Wide camera and a STC, both without OIS. The STC may track an object at rest so that the object's center is located at the center of FOVT. The tracking may occur in real-time (RT), i.e. we assume that there is no delay between the detection of a tracking deviation and its compensation. A device's rotational motion caused by a user's hand motion will be detected as an object movement in the Wide camera. In response, a tracking movement of the STC will be triggered and the object's location in the Tele FOV will be updated. In conclusion, in the RT scenario the object tracker performs OIS in the sense that the object will always be located at the center of FOVT and will not be affected by a user's hand motion.
As a fourth exemplary method for OIS, consider a device such as device 400 or 480 including a Wide camera without OIS and a STC with OIS. As in Example 3, we assume RT object tracking on FOVW so that a (non-moving) object's center is located at the center of FOVT. OIS may be performed in RT as well. A device's rotational motion caused by a user's hand motion will be detected as an object movement in the Wide camera. In response, a tracking movement ΔT of the STC will be triggered. Simultaneously, the device's rotational motion will also be detected by the STC's OIS, and an OIS movement ΔOIS of the STC will be triggered in response. The OIS movement may be performed according to the OIS method disclosed herein. ΔT and ΔOIS are identical in terms of direction and magnitude, i.e. a STC movement of 2·ΔT = 2·ΔOIS will be triggered, which is double the amount of movement required (i) for keeping the object at the center of FOVT (the desired tracking outcome) and (ii) for suppressing the impact of hand motion on the STC image (the desired OIS outcome). In conclusion, the desired outcome is achieved neither for Tele tracking nor for Tele OIS. Therefore, in some examples, the STC's OIS is disabled when using object tracking.
As a fifth exemplary method for OIS, consider a device such as device 400 or 480 including a Wide camera without OIS and a STC with OIS. Object tracking may be performed on FOVW so that a (non-moving) object's center is located at the center of FOVT. However, object tracking and OIS may not be performed in RT. In general, OIS is performed at higher frequencies than object tracking. As an example, OIS may be performed at 500 Hz-100 kHz and object tracking at 1 Hz-100 Hz. In some examples, for preventing undesired interference between OIS and object tracking, one may disable OIS when using object tracking. In other examples, one may separate the control of OIS and object tracking in the frequency domain. As an example, for a device's rotational motion caused by a user's hand motion that occurs at a frequency higher than e.g. 30 Hz, one may use OIS for device motion correction. For frequencies lower than e.g. 30 Hz, one may not use OIS for device motion correction; instead, the low-frequency device motion will be compensated by the object tracker.
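The frequency-domain separation described above may be sketched with a simple first-order low-pass split; the filter implementation, sample rate and test signal are assumptions, while the 30 Hz crossover follows the text:

```python
import math

# Sketch of splitting a motion signal in the frequency domain: components above
# the crossover go to OIS, components below go to the object tracker. A
# first-order low-pass filter is used here as a minimal, hypothetical example.

def split_motion(samples, fs, crossover_hz=30.0):
    """Return (low_freq, high_freq) components of a sampled motion signal."""
    alpha = 1.0 / (1.0 + fs / (2.0 * math.pi * crossover_hz))
    low, lows, highs = samples[0], [], []
    for x in samples:
        low += alpha * (x - low)   # first-order low-pass (tracker's share)
        lows.append(low)
        highs.append(x - low)      # residual high-frequency part (OIS's share)
    return lows, highs

fs = 1000.0  # hypothetical sample rate in Hz
signal = [math.sin(2 * math.pi * 2 * n / fs) for n in range(1000)]  # 2 Hz motion
lows, highs = split_motion(signal, fs)
```

A slow 2 Hz hand motion ends up almost entirely in the low-frequency (tracker) component, as the text's division of labor suggests.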
As a sixth exemplary method for OIS, consider a device such as device 400 or 480 including a Wide camera with OIS and a STC without OIS. Object tracking may be performed on FOVW so that a (non-moving) object's center is located at the center of FOVT. Object tracking and OIS may be performed in RT. Because of the Wide camera's OIS, a device's rotational motion caused by a user's hand motion will have no impact on the Wide image stream. As the object does not move in FOVW, no tracking movement of the STC will be triggered. In conclusion, there is no hand motion compensation, and the object will no longer be located at the center of FOVT, leading to an undesired object tracking outcome. In some examples, for preventing this undesired outcome, one may disable the Wide camera's OIS when performing object tracking. In other examples, the object tracking control signals that are supplied to the STC may additionally include the Wide camera's OIS control signals. By superimposing the two signals, the benefits of both Wide camera OIS and proper STC tracking may be enjoyed.
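The superposition of tracking and Wide OIS control signals may be sketched as follows; the signal names and values are hypothetical:

```python
# Sketch of superimposing the Wide camera's OIS control signal onto the object
# tracking signal supplied to the STC, so that Wide OIS and STC tracking do not
# conflict. Signal names and values are hypothetical.

def stc_control(tracking_signal, wide_ois_signal):
    """Combined STC control: tracking correction plus Wide OIS correction."""
    return tuple(t + o for t, o in zip(tracking_signal, wide_ois_signal))

delta_track = (0.002, -0.001)   # hypothetical tracking correction (pitch, yaw)
delta_ois = (0.0005, 0.0002)    # hypothetical Wide OIS correction
combined = stc_control(delta_track, delta_ois)
```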
As a seventh exemplary method for OIS, consider a device such as device 400 or 480 with both the Wide camera and the STC having OIS. We assume RT tracking so that an object's center is located at the center of FOVT. A device's rotational motion caused by a user's hand motion will be corrected by an OIS movement in both the Wide camera and the STC in RT. In conclusion, a user's hand motion will not impact the desired output of the object tracker.
Calibration data may be stored in a first memory 424, e.g. in an EEPROM (electrically erasable programmable read only memory) and/or in a second memory 438 and/or in a third memory 450 such as a NVM (non-volatile memory). The calibration data may comprise calibration data between Wide camera 430 and STC 410. The calibration data may further comprise calibration data between an OPFE's position and the STC's corresponding POV.
Handheld device 400 further comprises an inertial measurement unit (IMU, for example a gyroscope) 460 that supplies motion information of device 400. For example, a microcontroller unit (MCU) 470 may be used to read and process data of IMU 460. In some examples, the MCU may be controlled by an OIS controller 444 which is part of AP 440. In some examples, steps 304 and 306 may be performed by the MCU and step 308 may be performed by OPFE actuator 414 (and/or lens actuator 436 and/or sensor actuator 418, in case OIS is performed by lens shift or sensor shift respectively). In some examples, MCU 470 may be integrated into AP 440.
Another embodiment of a handheld device numbered 480 and comprising a multi-aperture camera with at least one STC as disclosed herein is shown in
In some examples, additional data may be used for hand motion estimation. Additional data may e.g. be image data from the Wide camera 430 or data from additional sensing units present in the handheld device.
In some examples, image data from Wide camera 430 may be used to estimate an “optical flow” from a plurality of images as known in the art, wherein OIS controller 444 may use the data of the optical flow together with data from IMU 460 for estimating motion of device 400. In other examples, only optical flow data estimated from image data of camera 410 and/or camera 430 may be used for estimating motion of device 400.
Image generator 446 may be configured to generate images and image streams. In some examples, image generator 446 may be configured to use only first image data from camera 430. In other examples, image generator 446 may use image data from camera 410 and/or camera 430.
While this disclosure has been described in terms of certain embodiments and generally associated methods, alterations and permutations of the embodiments and methods will be apparent to those skilled in the art. The disclosure is to be understood as not limited by the specific embodiments described herein, but only by the scope of the appended claims.
Unless otherwise stated, the use of the expression “and/or” between the last two members of a list of options for selection indicates that a selection of one or more of the listed options is appropriate and may be made.
It should be understood that where the claims or specification refer to “a” or “an” element, such reference is not to be construed as there being only one of that element.
Furthermore, for the sake of clarity the term “substantially” is used herein to imply the possibility of variations in values within an acceptable range. According to one example, the term “substantially” used herein should be interpreted to imply possible variation of up to 5% over or under any specified value. According to another example, the term “substantially” used herein should be interpreted to imply possible variation of up to 2.5% over or under any specified value.
According to a further example, the term “substantially” used herein should be interpreted to imply possible variation of up to 1% over or under any specified value.
All patents and/or patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual reference was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention.
This application is a 371 of international application PCT/IB2021/056617 filed Jul. 22, 2021, and is related to and claims the benefit of priority from U.S. provisional patent application No. 63/064,565 filed Aug. 12, 2020, which is incorporated herein by reference in its entirety.
Filing document: PCT/IB2021/056617, filed Jul. 22, 2021 (WO).
Publishing document: WO 2022/034402 A, published Feb. 17, 2022 (WO).
20100321494 | Peterson et al. | Dec 2010 | A1 |
20110058320 | Kim et al. | Mar 2011 | A1 |
20110063417 | Peters et al. | Mar 2011 | A1 |
20110063446 | McMordie et al. | Mar 2011 | A1 |
20110064327 | Dagher et al. | Mar 2011 | A1 |
20110080487 | Venkataraman et al. | Apr 2011 | A1 |
20110128288 | Petrou et al. | Jun 2011 | A1 |
20110164172 | Shintani et al. | Jul 2011 | A1 |
20110229054 | Weston et al. | Sep 2011 | A1 |
20110234798 | Chou | Sep 2011 | A1 |
20110234853 | Hayashi et al. | Sep 2011 | A1 |
20110234881 | Wakabayashi et al. | Sep 2011 | A1 |
20110242286 | Pace et al. | Oct 2011 | A1 |
20110242355 | Goma et al. | Oct 2011 | A1 |
20110298966 | Kirschstein et al. | Dec 2011 | A1 |
20120026366 | Golan et al. | Feb 2012 | A1 |
20120044372 | Cote et al. | Feb 2012 | A1 |
20120062780 | Morihisa | Mar 2012 | A1 |
20120069224 | Cilia | Mar 2012 | A1 |
20120069235 | Imai | Mar 2012 | A1 |
20120075489 | Nishihara | Mar 2012 | A1 |
20120105579 | Jeon et al. | May 2012 | A1 |
20120124525 | Kang | May 2012 | A1 |
20120154547 | Aizawa | Jun 2012 | A1 |
20120154614 | Moriya et al. | Jun 2012 | A1 |
20120196648 | Havens et al. | Aug 2012 | A1 |
20120229663 | Nelson et al. | Sep 2012 | A1 |
20120249815 | Bohn et al. | Oct 2012 | A1 |
20120287315 | Huang et al. | Nov 2012 | A1 |
20120320467 | Baik et al. | Dec 2012 | A1 |
20130002928 | Imai | Jan 2013 | A1 |
20130016427 | Sugawara | Jan 2013 | A1 |
20130063629 | Webster et al. | Mar 2013 | A1 |
20130076922 | Shihoh et al. | Mar 2013 | A1 |
20130093842 | Yahata | Apr 2013 | A1 |
20130094126 | Rappoport et al. | Apr 2013 | A1 |
20130113894 | Mirlay | May 2013 | A1 |
20130135445 | Dahi et al. | May 2013 | A1 |
20130155176 | Paripally et al. | Jun 2013 | A1 |
20130182150 | Asakura | Jul 2013 | A1 |
20130201360 | Song | Aug 2013 | A1 |
20130202273 | Ouedraogo et al. | Aug 2013 | A1 |
20130235224 | Park et al. | Sep 2013 | A1 |
20130250150 | Malone et al. | Sep 2013 | A1 |
20130258044 | Betts-LaCroix | Oct 2013 | A1 |
20130270419 | Singh et al. | Oct 2013 | A1 |
20130278785 | Nomura et al. | Oct 2013 | A1 |
20130321668 | Kamath | Dec 2013 | A1 |
20140009631 | Topliss | Jan 2014 | A1 |
20140049615 | Uwagawa | Feb 2014 | A1 |
20140118584 | Lee et al. | May 2014 | A1 |
20140160311 | Hwang et al. | Jun 2014 | A1 |
20140192238 | Attar et al. | Jul 2014 | A1 |
20140192253 | Laroia | Jul 2014 | A1 |
20140218587 | Shah | Aug 2014 | A1 |
20140313316 | Olsson et al. | Oct 2014 | A1 |
20140362242 | Takizawa | Dec 2014 | A1 |
20150002683 | Hu et al. | Jan 2015 | A1 |
20150042870 | Chan et al. | Feb 2015 | A1 |
20150070781 | Cheng et al. | Mar 2015 | A1 |
20150092066 | Geiss et al. | Apr 2015 | A1 |
20150103147 | Ho et al. | Apr 2015 | A1 |
20150138381 | Ahn | May 2015 | A1 |
20150154776 | Zhang et al. | Jun 2015 | A1 |
20150162048 | Hirata et al. | Jun 2015 | A1 |
20150195458 | Nakayama et al. | Jul 2015 | A1 |
20150215516 | Dolgin | Jul 2015 | A1 |
20150237280 | Choi et al. | Aug 2015 | A1 |
20150242994 | Shen | Aug 2015 | A1 |
20150244906 | Wu et al. | Aug 2015 | A1 |
20150253543 | Mercado | Sep 2015 | A1 |
20150253647 | Mercado | Sep 2015 | A1 |
20150261299 | Wajs | Sep 2015 | A1 |
20150271471 | Hsieh et al. | Sep 2015 | A1 |
20150281678 | Park et al. | Oct 2015 | A1 |
20150286033 | Osborne | Oct 2015 | A1 |
20150316744 | Chen | Nov 2015 | A1 |
20150334309 | Peng et al. | Nov 2015 | A1 |
20150370040 | Georgiev | Dec 2015 | A1 |
20160044250 | Shabtay et al. | Feb 2016 | A1 |
20160070088 | Koguchi | Mar 2016 | A1 |
20160154202 | Wippermann et al. | Jun 2016 | A1 |
20160154204 | Lim et al. | Jun 2016 | A1 |
20160165111 | Uemura | Jun 2016 | A1 |
20160198088 | Wang | Jul 2016 | A1 |
20160212358 | Shikata | Jul 2016 | A1 |
20160212418 | Demirdjian et al. | Jul 2016 | A1 |
20160241751 | Park | Aug 2016 | A1 |
20160291295 | Shabtay et al. | Oct 2016 | A1 |
20160295112 | Georgiev et al. | Oct 2016 | A1 |
20160301840 | Du et al. | Oct 2016 | A1 |
20160353008 | Osborne | Dec 2016 | A1 |
20160353012 | Kao et al. | Dec 2016 | A1 |
20170019616 | Zhu et al. | Jan 2017 | A1 |
20170070731 | Darling et al. | Mar 2017 | A1 |
20170187962 | Lee et al. | Jun 2017 | A1 |
20170214846 | Du et al. | Jul 2017 | A1 |
20170214866 | Zhu et al. | Jul 2017 | A1 |
20170242225 | Fiske | Aug 2017 | A1 |
20170289458 | Song et al. | Oct 2017 | A1 |
20180013944 | Evans, V et al. | Jan 2018 | A1 |
20180017844 | Yu et al. | Jan 2018 | A1 |
20180024329 | Goldenberg et al. | Jan 2018 | A1 |
20180059379 | Chou | Mar 2018 | A1 |
20180120674 | Avivi et al. | May 2018 | A1 |
20180150973 | Tang et al. | May 2018 | A1 |
20180176426 | Wei et al. | Jun 2018 | A1 |
20180196238 | Goldenberg | Jul 2018 | A1 |
20180198897 | Tang et al. | Jul 2018 | A1 |
20180241922 | Baldwin et al. | Aug 2018 | A1 |
20180295292 | Lee et al. | Oct 2018 | A1 |
20180300901 | Wakai et al. | Oct 2018 | A1 |
20190121103 | Bachar et al. | Apr 2019 | A1 |
20190121216 | Shabtay et al. | Apr 2019 | A1 |
20190147606 | Zhuang | May 2019 | A1 |
20190204084 | Song | Jul 2019 | A1 |
20200221026 | Fridman et al. | Jul 2020 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
101276415 | Oct 2008 | CN |
201514511 | Jun 2010 | CN |
102739949 | Oct 2012 | CN |
103024272 | Apr 2013 | CN |
1536633 | Jun 2005 | EP |
1780567 | May 2007 | EP |
2523450 | Nov 2012 | EP |
103841404 | Jun 2014 | CN
S59191146 | Oct 1984 | JP |
04211230 | Aug 1992 | JP |
H07318864 | Dec 1995 | JP |
08271976 | Oct 1996 | JP |
2002010276 | Jan 2002 | JP |
2003298920 | Oct 2003 | JP |
2004133054 | Apr 2004 | JP |
2004245982 | Sep 2004 | JP |
2005099265 | Apr 2005 | JP |
2006238325 | Sep 2006 | JP |
2007228006 | Sep 2007 | JP |
2007306282 | Nov 2007 | JP |
2008076485 | Apr 2008 | JP |
2010204341 | Sep 2010 | JP |
2011085666 | Apr 2011 | JP |
2013106289 | May 2013 | JP |
20070005946 | Jan 2007 | KR |
20090058229 | Jun 2009 | KR |
20100008936 | Jan 2010 | KR |
20140014787 | Feb 2014 | KR |
101477178 | Dec 2014 | KR |
20140144126 | Dec 2014 | KR |
20150118012 | Oct 2015 | KR |
2000027131 | May 2000 | WO |
2004084542 | Sep 2004 | WO |
2006008805 | Jan 2006 | WO |
2010122841 | Oct 2010 | WO |
2014072818 | May 2014 | WO |
2017025822 | Feb 2017 | WO |
2017037688 | Mar 2017 | WO |
2018130898 | Jul 2018 | WO |
Other Publications

Entry |
---|
Statistical Modeling and Performance Characterization of a Real-Time Dual Camera Surveillance System, Greiffenhagen et al., Publisher: IEEE, 2000, 8 pages. |
A 3MPixel Multi-Aperture Image Sensor with 0.7μm Pixels in 0.11μm CMOS, Fife et al., Stanford University, 2008, 3 pages. |
Dual camera intelligent sensor for high definition 360 degrees surveillance, Scotti et al., Publisher: IET, May 9, 2000, 8 pages. |
Dual-sensor foveated imaging system, Hua et al., Publisher: Optical Society of America, Jan. 14, 2008, 11 pages. |
Defocus Video Matting, McGuire et al., Publisher: ACM SIGGRAPH, Jul. 31, 2005, 11 pages. |
Compact multi-aperture imaging with high angular resolution, Santacana et al., Publisher: Optical Society of America, 2015, 10 pages. |
Multi-Aperture Photography, Green et al., Publisher: Mitsubishi Electric Research Laboratories, Inc., Jul. 2007, 10 pages. |
Multispectral Bilateral Video Fusion, Bennett et al., Publisher: IEEE, May 2007, 10 pages. |
Super-resolution imaging using a camera array, Santacana et al., Publisher: Optical Society of America, 2014, 6 pages. |
Optical Splitting Trees for High-Precision Monocular Imaging, McGuire et al., Publisher: IEEE, 2007, 11 pages. |
High Performance Imaging Using Large Camera Arrays, Wilburn et al., Publisher: Association for Computing Machinery, Inc., 2005, 12 pages. |
Real-time Edge-Aware Image Processing with the Bilateral Grid, Chen et al., Publisher: ACM SIGGRAPH, 2007, 9 pages. |
Superimposed multi-resolution imaging, Carles et al., Publisher: Optical Society of America, 2017, 13 pages. |
Viewfinder Alignment, Adams et al., Publisher: EUROGRAPHICS, 2008, 10 pages. |
Dual-Camera System for Multi-Level Activity Recognition, Bodor et al., Publisher: IEEE, Oct. 2014, 6 pages. |
Engineered to the task: Why camera-phone cameras are different, Giles Humpston, Publisher: Solid State Technology, Jun. 2009, 3 pages. |
Office Action in related KR patent application 2021-7038426, dated Jan. 15, 2022. |
Related Publications

Number | Date | Country
---|---|---
20230164437 A1 | May 2023 | US |
Provisional Applications

Number | Date | Country
---|---|---
63064565 | Aug 2020 | US |