Optical image stabilization in a scanning folded camera

Information

  • Patent Grant
  • Patent Number
    11,968,453
  • Date Filed
    Thursday, July 22, 2021
  • Date Issued
    Tuesday, April 23, 2024
  • Field of Search
    • CPC
    • H04N23/687
    • H04N23/61
    • H04N23/683
    • H04N23/58
    • H04N23/6812
    • H04N23/45
    • H04N23/698
    • H04N23/55
    • G03B17/17
  • International Classifications
    • H04N13/246
    • G03B17/17
    • H04N23/45
    • H04N23/55
    • H04N23/58
    • H04N23/61
    • H04N23/68
    • H04N23/698
Abstract
A Tele folded camera operative to compensate for an undesired rotational motion of a handheld electronic device that includes such a camera, wherein the compensation depends on the undesired rotational motion and on a point of view of the Tele folded camera.
Description
FIELD

Examples disclosed herein relate in general to digital cameras and in particular to correction of images obtained with folded digital cameras.


BACKGROUND

Compact digital cameras having folded optics, also referred to as “folded cameras”, are known; see e.g. co-owned international patent application PCT/IB2016/057366. In handheld electronic devices (also referred to herein as “handheld devices”) such as smartphones, tablets etc., a folded Tele camera is often part of a multi-camera setup and accompanied by one or more additional cameras, e.g. an Ultra-Wide camera and a Wide camera. An Ultra-Wide camera has a larger field of view (FOVUW) than a Wide camera, which in turn has a larger FOVW than a Tele camera having FOVT (assuming similar image sensor sizes).



FIG. 1A shows schematically a folded Tele camera numbered 100 from a perspective view. Camera 100 includes a lens 102 with a lens optical axis 110, an optical path folding element (OPFE) 104 and an image sensor 106. OPFE 104 folds a first optical path along an axis 108 substantially parallel to the X axis, coming from an object, scene or panoramic view section 114, into a second optical path along an axis 110 substantially parallel to the Z axis. Camera 100 is designed to rotate OPFE 104 around the X axis relative to the image sensor, i.e. in the Y-Z plane, a rotation indicated by an arrow 112. That is, folded Tele camera 100 is a “scanning” Tele camera (“STC”). FIG. 1B shows OPFE 104 after rotation by 30 degrees from the zero position.



FIG. 1C shows, in a top view, a handheld device 120 including a STC 100 having lens 102, OPFE 104 and image sensor 106. A device normal (“N”) is orthogonal to a screen 116 of device 120 and points towards the observer. The camera's optical axis is parallel to the X axis. In other examples, STC 100 may be included in device 120 so that the camera's optical axis is parallel to the Y axis.


Images are acquired from a certain point of view (POV) of a camera. The POV is the direction defined by the vector that has the location of the camera's aperture as starting point and the object point at the center of the FOV as end point (see FIG. 3A, with two POV vectors 324 and 328 corresponding to two FOVTs 326 and 332). Instead of a POV vector, one may also speak of a FOV center direction (FOVCD) vector. As an example, in spherical coordinates (r, θ, φ) defined according to the ISO convention, the POV for a camera at r=0 is defined by (1, θ, φ), with the polar angle θ and azimuthal angle φ defining the location of the object point at the center of the Tele FOV. The length of the POV vector may be 1 (unit vector), may have some constant length (e.g. the EFL), or may have a varying length, e.g. chosen so that the vector comes to lie on a specific plane.
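As an illustration only (not part of the patent disclosure), the following is a minimal sketch of this POV definition, assuming NumPy and the ISO spherical convention described above:

```python
import numpy as np

def pov_vector(theta: float, phi: float, length: float = 1.0) -> np.ndarray:
    """POV (FOV center direction) vector for a camera at r=0.

    ISO convention: theta is the polar angle from the Z axis, phi the
    azimuthal angle. `length` may be 1 (unit vector) or e.g. the EFL.
    """
    return length * np.array([
        np.sin(theta) * np.cos(phi),
        np.sin(theta) * np.sin(phi),
        np.cos(theta),
    ])

print(pov_vector(0.0, 0.0))             # zero position: [0. 0. 1.]
print(pov_vector(np.radians(30), 0.0))  # POV tilted 30 degrees towards X
```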


As described e.g. in the co-owned PCT patent application PCT/IB2016/057366 and with reference to FIGS. 1A-1B, rotation of the OPFE may be performed around the X axis and around the Y axis for “scanning” with the FOV in 2 dimensions (2D).


Modern cameras included in handheld devices often feature optical image stabilization (OIS) for mitigating undesired camera motion caused by a user's hand motion (often referred to as “hand-shake”). For OIS, optical components are moved to reduce movements of imaged objects on a camera's image sensor. The lens module and/or the image sensor and/or the OPFE and/or the entire camera can be moved. An inertial measurement unit (IMU) included in the handheld device provides motion data along 6 degrees of freedom, namely and with reference to FIG. 1C, linear movements in X-Y-Z, roll (tilt around the Z axis), yaw (tilt around the Y axis) and pitch (tilt around the X axis). Usually, OIS is provided for pitch and yaw rotation compensation only, and not for roll rotation, as pitch and yaw rotations account for the major share of the image deterioration caused by hand-shake. The coordinate systems of the IMU, of a regular (i.e. non-scanning) camera and of the handheld device including them can be aligned and do not evolve in time. For a STC this does not hold: the relation between a handheld device's coordinate system and that of a STC changes when FOV scanning is performed. Therefore, OIS as known in the art cannot be used for hand motion compensation in a STC.


There is a need for and it would be advantageous to have OIS for scanning Tele cameras.


SUMMARY

Henceforth and for simplicity, the terms “electronic device”, “electronic handheld device”, “handheld device” or just “device” are used interchangeably. Henceforth and for simplicity, the term “smartphone” may be used to represent all electronic handheld devices having scanning folded cameras and implementing methods for OIS in such cameras described herein.


In various embodiments, there are provided Tele folded cameras operative to compensate for an undesired rotational motion of a handheld electronic device that includes such a camera, wherein the compensation depends on the undesired rotational motion and on a point of view of the Tele folded camera.


In various embodiments, a handheld electronic device comprises: a Tele folded camera comprising an OPFE for folding light from a first optical path that forms an angle of less than 90 degrees with a normal of the device toward a second optical path substantially orthogonal to the normal of the device, a lens with a lens optical axis along the second optical path, and an image sensor; an OPFE actuator for tilting the OPFE in one or more directions to direct a point of view (POV) of the Tele folded camera towards a segment of a scene; a motion sensor for sensing an undesired rotational motion of the device; and at least one actuator for moving at least one component of the Tele folded camera to compensate for the undesired rotational motion of the device, wherein the compensation depends on the undesired rotational motion of the device and on the Tele folded camera POV.


In some embodiments, the undesired rotational motion is around the device normal.


In some embodiments, a device further comprises a Wide camera having a field of view FOVW larger than a field of view FOVT of the Tele camera.


In some embodiments, the sensing of the rotational motion includes measuring the rotational motion in three directions.


In some embodiments, the actuator for moving the component of the Tele folded camera to compensate for the device's undesired rotational motion is the OPFE actuator for tilting the OPFE in one or more directions to direct a point of view (POV) of the Tele folded camera towards a segment of a scene.


In some embodiments, the moving of the component of the Tele folded camera to compensate for the device's undesired rotational motion includes moving the lens.


In some embodiments, the moving of the component of the Tele folded camera to compensate for the device's undesired rotational motion includes moving the image sensor.


In some embodiments, a device further comprises a processing unit configured to perform a coordinate transformation to align coordinates of the Tele camera with coordinates of the handheld device or vice versa.


In some embodiments, a device further comprises a processing unit configured to perform a coordinate transformation that aligns coordinates of a reference coordinate system with coordinates of the handheld device and coordinates of the Tele camera.


In some embodiments, the coordinate transformation is performed using Rodrigues' rotation formula.


In some embodiments, the motion sensor includes an inertial measurement unit (IMU).


In some embodiments, a device further comprises a microcontroller unit (MCU) configured to read out the motion sensor and to provide a control signal to the rotational motion compensation actuator. In some embodiments, the MCU is included in an application processor (AP).


In some embodiments, a device further comprises an application processor configured to provide POV control signals to the OPFE actuator for tilting the OPFE.


In various embodiments, there are provided methods comprising: providing a handheld device comprising a Tele folded camera that includes an OPFE for folding light from a first optical path that forms an angle of less than 90 degrees with a normal of the device toward a second optical path substantially orthogonal to the normal of the device, a lens with a lens optical axis along the second optical path, and an image sensor; providing an OPFE actuator for tilting the OPFE in one or more directions to direct a point of view (POV) of the Tele folded camera towards a segment of a scene; sensing an undesired rotational motion of the device; and compensating for the undesired rotational motion, wherein the compensation depends on the undesired rotational motion and on the Tele folded camera's POV.


In some embodiments, the compensating for the undesired rotational motion includes moving a component of the Tele folded camera.


In some embodiments, the compensating for the undesired rotational motion includes compensating for a rotational motion around the device's normal direction.


In some embodiments, a method further comprises performing a coordinate transformation to align coordinates of the Tele camera with coordinates of an IMU.


In some embodiments, a method further comprises performing a coordinate transformation to align coordinates of the IMU with coordinates of the Tele camera.


In some embodiments, a method further comprises performing a coordinate transformation to align coordinates of a reference coordinate system with coordinates of the IMU and coordinates of the Tele camera.


In some embodiments, the performing the coordinate transformation includes performing the transformation using Rodrigues' rotation formula.


In some embodiments, the sensing of an undesired rotational motion of the device includes sensing the undesired rotational motion in three directions.


In some embodiments, the compensating for the undesired rotational motion of the device includes rotating the OPFE.


In some embodiments, the compensating for the undesired rotational motion of the device includes moving the lens.


In some embodiments, the compensating for the undesired rotational motion of the device includes moving the image sensor.


In some embodiments, the compensating for the undesired rotational motion includes calculating a changed POV caused by the undesired rotational motion in the X direction by using the equation: PFP=(PI·cos(hnd_pitch)+cross(PI, RP)·sin(hnd_pitch)+RP·(dot(PI, RP)·(1-cos(hnd_pitch)))).


In some embodiments, the compensating for the undesired rotational motion includes calculating a changed POV caused by the undesired rotational motion in the Y direction by using the equation: PFY=(PI·cos(hnd_yaw)+cross(PI, RY)·sin(hnd_yaw)+RY·(dot(PI, RY)·(1-cos(hnd_yaw)))).


In some embodiments, the compensating for the undesired rotational motion includes calculating a changed POV caused by the undesired rotational motion in the Z direction by using the equation: PFR=(PI·cos(hnd_roll)+cross(PI, RR)·sin(hnd_roll)+RR·(dot(PI, RR)·(1-cos(hnd_roll)))).


In some embodiments, the compensating for the undesired rotational motion includes calculating a direction of a changed POV caused by the undesired rotational motion in the X, Y and Z directions together by using the equation: PF′=PI+(PI-PFP)+(PI-PFY)+(PI-PFR).


In some embodiments, the compensating for the undesired rotational motion includes calculating a vector of a changed POV caused by the undesired rotational motion in the X, Y and Z directions together by using the equation: PF=PF′·EFLT/PFz.





BRIEF DESCRIPTION OF THE DRAWINGS

Non-limiting examples of embodiments disclosed herein are described below with reference to figures attached hereto that are listed following this paragraph. The drawings and descriptions are meant to illuminate and clarify embodiments disclosed herein, and should not be considered limiting in any way. Like elements in different drawings may be indicated by like numerals.



FIG. 1A shows schematically a known folded scanning camera from a perspective view;



FIG. 1B shows the OPFE in the Tele camera of FIG. 1A after rotation by 30 degrees from a zero position;



FIG. 1C shows a scanning camera such as shown in FIGS. 1A-B integrated as a “rear” or “world-facing” camera in a smartphone;



FIG. 2A shows exemplarily a smartphone including a first, scanning Tele camera at a zero position, as well as a second, Wide camera;



FIG. 2B shows the smartphone of FIG. 2A with the Tele camera at a non-zero position;



FIG. 2C shows the smartphone of FIG. 2A with the Tele camera at another non-zero position;



FIG. 3A shows a 2-dimensional (2D) chart used to derive a coordinate system for the Tele camera;



FIG. 3B shows impact of rotational device motion caused by hand shake on the 2D chart of FIG. 3A;



FIG. 3C shows in a flow chart main steps of a method for scanning Tele camera OIS disclosed herein;



FIG. 4A shows schematically in a block diagram an embodiment of a handheld device that includes multi-aperture cameras with at least one scanning Tele camera disclosed herein;



FIG. 4B shows schematically in a block diagram another embodiment of a handheld device that includes multi-aperture cameras with at least one scanning Tele camera disclosed herein.





DETAILED DESCRIPTION


FIG. 2A shows exemplarily a smartphone 200 comprising a STC 202 at a zero position, and a Wide camera 204. Wide camera 204 is not a scanning camera, and its POV (“POVW”) is parallel to a device normal N (parallel to the Z axis) of the smartphone. Device normal N is parallel (or anti-parallel) to the normal of the largest-area surface of smartphone 200. A coordinate system of the IMU of smartphone 200 (such as IMU 460 in FIGS. 4A and 4B, not shown here) may be aligned with a coordinate system of smartphone 200, such as the coordinate system shown in FIG. 2A, in which the three axes are parallel to the three symmetry axes of smartphone 200, so that the Z axis of the IMU's (and smartphone 200's) coordinate system is parallel to POVW. The POV of STC 202 (“POVT”) is directed to its zero position (“POVT,0”), corresponding to an OPFE rotation state such as shown in FIG. 1A. With POVT at the zero position, the coordinate systems of the IMU, Wide camera 204 and STC 202 are aligned.


In a first exemplary method for OIS (“Example 1”), consider OIS for Wide camera 204 that (for the sake of simplicity) may correct for pitch rotation only. For detecting the amount of undesired hand motion, one may read out the value for pitch rotation around the X axis from the IMU (“XIMU”) and move e.g. the lens in one particular direction (“dir1”) by a particular amount, wherein the amount (or stroke) of movement is proportional to XIMU, i.e. the lens stroke SW fulfills SW=CW·XIMU (with some constant CW). The same holds for OIS of STC 202 at the zero position: by moving the lens by ST=CT·XIMU (with some constant CT) in dir1, the hand motion is compensated.
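The proportionality described above can be sketched as follows (a toy illustration only; the constants and function names are assumptions, not values from the patent):

```python
C_W = 0.8  # assumed stroke constant for the Wide lens, e.g. um per degree
C_T = 2.4  # assumed stroke constant for the Tele lens (longer EFL, larger stroke)

def ois_stroke(x_imu: float, c: float) -> float:
    """Lens stroke along dir1 that compensates a pitch rotation x_imu (degrees)."""
    return c * x_imu

s_wide = ois_stroke(0.05, C_W)  # stroke for a 0.05-degree hand-shake
s_tele = ois_stroke(0.05, C_T)  # valid for the STC only at its zero position
```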



FIG. 2B shows smartphone 200 with STC 202 at a non-zero position. POVT forms an angle of α degrees with POVW. For example, α=30 degrees corresponds to an OPFE rotation state such as shown in FIG. 1B. The coordinate systems of the IMU, Wide camera 204 and STC 202 are no longer aligned.


Consider Example 1 (hand motion in the pitch direction) with STC 202 at a non-zero position. OIS for Wide camera 204 may be performed as in Example 1. However, for OIS of STC 202, the method of Example 1 no longer compensates the hand motion, i.e. there is (in general) no constant CT such that moving the Tele lens by ST=CT·XIMU compensates the hand motion. This is because the coordinate systems of STC 202 and the IMU are no longer aligned.


For a second exemplary method for OIS (“Example 2”), refer to FIG. 2C. Compared to FIG. 2A, POVT is rotated by 90 degrees around the Y axis, i.e. POVT and POVW are perpendicular to each other. As in Example 1, we consider OIS for the Wide camera for correction of pitch rotation only. Hand motion can be fully compensated by reading the IMU's value for rotation XIMU and by moving a lens of the Wide camera (not shown) by SW=CW·XIMU (with some constant CW) in dir1. However, the hand motion cannot be compensated by moving a lens of the STC (not shown, but similar to lens 102) by ST=CT·XIMU (with some constant CT) in dir1. Instead, the actuation direction must be modified from dir1 to a particular direction dir2 which is different from dir1. The hand motion can then be compensated by moving the STC lens by ST=CT·XIMU in dir2. In general, for a STC the OIS axes depend on the POV or scanning state of the STC and are thus not constant, as is the case for a Wide camera.
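A hedged sketch of this POV dependence follows (the rotation model is an illustrative assumption; the exact actuation direction follows from the coordinate transformation described below):

```python
import numpy as np

def ois_direction(dir1: np.ndarray, alpha_rad: float) -> np.ndarray:
    """In-plane OIS actuation direction for a POV scanned by alpha_rad.

    Models dir2 as dir1 rotated by the scan angle; at alpha = 0 this reduces
    to dir1, at alpha = 90 degrees to a perpendicular direction.
    """
    rot = np.array([[np.cos(alpha_rad), -np.sin(alpha_rad)],
                    [np.sin(alpha_rad),  np.cos(alpha_rad)]])
    return rot @ dir1

dir1 = np.array([1.0, 0.0])                   # zero-position OIS axis
dir2 = ois_direction(dir1, np.radians(90.0))  # -> [0., 1.], as in Example 2
```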



FIG. 3A shows a 2-dimensional (2D) chart 320 for deriving a coordinate system for a STC. An aperture 322 of the STC is located at coordinates (0, 0, 0). A zero state STC POVT (POVT,0) 324 corresponds to a first optical path which is parallel to a device normal N (see FIGS. 2A-C) and may have the coordinates (0, 0, EFLT), with EFLT being the EFL of the STC. FOVT 326 corresponds to the FOVT of the STC at POVT,0 324. A desired or target POVT 328 (“POVT,T”) with corresponding FOVT 332 is shown as well.



FIG. 3B shows 2D chart 320 of FIG. 3A after the handheld device that includes the STC underwent a rotational “roll” motion around the Z axis, e.g. because of a user's hand motion. POVT,0 324 did not undergo any change. However, the corresponding FOVT changed to a rotated FOVT 326′. In contrast, the rotational motion changed POVT,T 328 to POVT,T 328′. The change of a POV such as POVT,T 328 in response to a rotational device motion depends not only on the angle or amount of rotation, but also on the position of POVT.



FIG. 3C shows in a flow chart main steps of a method for STC OIS disclosed herein.


In a first step 302, a command triggered by a human user or by a program and processed by a FOV scanner 442 (FIG. 4A) directs FOVT to a region of interest (ROI) within a scene. The scanning may be performed by rotating an OPFE with an OPFE actuator 414 (FIG. 4A). The FOV scanning by OPFE rotation is not performed instantaneously, but requires some settling time, which may be about 1-50 ms for scanning 2-5 degrees and about 5-500 ms for scanning 10-45 degrees. After the settling time, the STC is operational for capturing Tele images. The STC may be focused to an object by a user command or autonomously. The STC's scanning direction may be given by an initial (or target) POV vector PI.


In step 304, the IMU is read out and provides rotational movements around the pitch, yaw and roll directions, i.e. XIMU, YIMU and ZIMU respectively. Usually, an IMU provides angular velocity data, which is integrated to determine the rotation angle. The IMU data may be used to calculate the undesired rotational motion of the device.
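A minimal sketch of this readout step (sample rate, data layout and names are assumptions, not from the patent):

```python
import numpy as np

def integrate_rates(rates_dps: np.ndarray, dt_s: float) -> np.ndarray:
    """Integrate (N, 3) angular-rate samples [pitch, yaw, roll] in deg/s
    over one OIS cycle, returning rotation angles in degrees."""
    return rates_dps.sum(axis=0) * dt_s

rates = np.array([[0.5, -0.2, 0.1]] * 10)  # ten gyro samples at 1 kHz
hnd_pitch, hnd_yaw, hnd_roll = np.radians(integrate_rates(rates, dt_s=1e-3))
```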


In step 306, a coordinate transformation is performed. The coordinate transformation is required because the STC's POV change caused by an undesired rotational motion of the device and the sensing of the undesired rotational motion occur in different coordinate systems.


A processing unit such as an AP or a MCU may be configured to perform the coordinate transformation (e.g. AP 440 of device 400 or device 480, or MCU 470 of device 400 in FIG. 4A). In some examples, the AP or MCU may solve the below equations analytically, or may use a polynomial fit or a linear fit to solve them approximately. In other examples, the AP or MCU may not perform calculations but instead use a look-up table (LUT) for the coordinate transformation. In some examples, and such as e.g. shown in FIG. 4A, the coordinate transformation may be performed by a MCU such as MCU 470 connected to STC module 410.
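The LUT option mentioned above might look like the following sketch (grid spacing and stored values are assumptions for illustration):

```python
import numpy as np

# Precomputed table: scan angle around Y -> POV unit vector, at 5-degree steps.
scan_angles = np.radians(np.arange(-35, 36, 5))
lut = {float(a): np.array([np.sin(a), 0.0, np.cos(a)]) for a in scan_angles}

def lookup_pov(alpha_rad: float) -> np.ndarray:
    """Nearest-neighbor lookup; a real implementation might interpolate."""
    keys = np.array(list(lut.keys()))
    return lut[float(keys[np.argmin(np.abs(keys - alpha_rad))])]

print(lookup_pov(np.radians(12.0)))  # returns the entry stored for 10 degrees
```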


In some examples, the transformation may be performed in order to express the coordinates of the STC in the coordinate system of the IMU. Device rotations and compensation motions may then be calculated in the IMU's coordinate system.


In some examples, a 2D chart such as chart 320 shown in FIG. 3B may be used to express the coordinates of the STC in the IMU's coordinate system. Chart 320 may resemble a calibration chart used for calibrating the STC or for calibrating a dual-camera, e.g. one including a Tele camera and a Wide camera. STC aperture 322 may be located at (0, 0, 0). The handheld device may be pointed towards chart 320 in “landscape” orientation, i.e., with reference to the coordinate system of FIG. 3B, the long side of a smartphone as shown in FIG. 1C may be parallel to the X axis and the short side parallel to the Y axis, with the STC aperture pointing towards the chart in the Z direction. All POVs that the STC can reach are given by “POV vectors” (or “camera pointing vectors”) P pointing to coordinates lying on chart 320. The coordinates of the zero-state position may be (0, 0, EFLT), with EFLT being the EFL of the STC. At the zero position, the coordinates of the IMU (and of the handheld device) coincide with the STC's coordinates.


If the STC is directed to a non-zero POV, a coordinate transform from the IMU's coordinates to the STC's coordinates must be performed. In some examples, Rodrigues' rotation formula may be used. The IMU's pitch/yaw/roll rotation values may be named “hnd_pitch”, “hnd_yaw” and “hnd_roll”. The IMU provides hnd_pitch, hnd_yaw and hnd_roll in a coordinate system having the following unit vectors:

    • Pitch unit vector RP: RP=(1, 0, 0),
    • Yaw unit vector RY: RY=(0, 1, 0),
    • Roll unit vector RR: RR=(0, 0, 1).


In general, OIS corrects small angles only. Therefore, in some situations and approximately, one may treat the pitch/yaw/roll rotations independently. For any (slight) rotation of the device, Rodrigues' rotation formula may be applied to pitch/yaw/roll rotations independently, wherein the (slight) rotation may be represented by the sum over the pitch/yaw/roll rotations. A hand motion only by hnd_pitch, or only by hnd_yaw or only by hnd_roll (in the IMU's coordinates RP, RY and RR) applied to any initial POV vector PI may result in the following final POV vector PF (“cross(x, y)” indicates the cross product of vectors x and y, “dot(x, y)” indicates the dot product of vectors x and y):

    • POV vector PFP after rotation by hnd_pitch around RP (hnd_yaw, hnd_roll=0): PFP=(PI·cos(hnd_pitch)+cross(PI, RP)·sin(hnd_pitch)+RP·(dot(PI, RP)·(1-cos(hnd_pitch))));
    • POV vector PFY after rotation by hnd_yaw around RY (hnd_pitch, hnd_roll=0): PFY=(PI·cos(hnd_yaw)+cross(PI, RY)·sin(hnd_yaw)+RY·(dot(PI, RY)·(1-cos(hnd_yaw))));
    • POV vector PFR after rotation by hnd_roll around RR (hnd_pitch, hnd_yaw=0): PFR=(PI·cos(hnd_roll)+cross(PI, RR)·sin(hnd_roll)+RR·(dot(PI, RR)·(1-cos(hnd_roll)))).


      For small angles, a final POV vector (before normalization) PF′ that underwent pitch, yaw and roll rotations may be given by:

      PF′=PI+(PI-PFP)+(PI-PFY)+(PI-PFR)

      A normalization may be performed in order to ensure that the final POV vector PF comes to lie on chart 320. In some examples, PF may be obtained by normalizing PF′ with EFLT/PFz, wherein PFz is the z-component of PF′, i.e.:

      PF=PF′·EFLT/PFz.
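
The rotation equations and the normalization above translate directly into code. The following is a minimal sketch under the stated small-angle assumption (NumPy, the EFL value and the example vectors are assumptions, not values from the patent):

```python
import numpy as np

R_P, R_Y, R_R = np.eye(3)  # pitch, yaw, roll unit vectors (1,0,0), (0,1,0), (0,0,1)

def rodrigues(p_i: np.ndarray, axis: np.ndarray, angle: float) -> np.ndarray:
    """Rodrigues' rotation of POV vector p_i by `angle` around unit `axis`."""
    return (p_i * np.cos(angle)
            + np.cross(p_i, axis) * np.sin(angle)
            + axis * np.dot(p_i, axis) * (1.0 - np.cos(angle)))

def final_pov(p_i, hnd_pitch, hnd_yaw, hnd_roll, efl_t):
    """Final POV vector PF on the chart plane after a small hand rotation."""
    p_fp = rodrigues(p_i, R_P, hnd_pitch)
    p_fy = rodrigues(p_i, R_Y, hnd_yaw)
    p_fr = rodrigues(p_i, R_R, hnd_roll)
    p_f_prime = p_i + (p_i - p_fp) + (p_i - p_fy) + (p_i - p_fr)  # small angles
    return p_f_prime * efl_t / p_f_prime[2]  # PF = PF' * EFL_T / PFz

EFL_T = 15.0                       # assumed Tele EFL in mm
P_I = np.array([3.0, 0.0, EFL_T])  # initial (target) POV vector on the chart
P_F = final_pov(P_I, hnd_pitch=0.001, hnd_yaw=-0.002, hnd_roll=0.0005, efl_t=EFL_T)
```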


From the above equations it is evident that for compensating undesired rotational hand motion in a STC, one must compensate rotational hand motion around all three directions: yaw, pitch and roll. This is in contrast to a non-scanning camera such as Wide camera 204, where one may compensate the undesired rotational hand motion around yaw and pitch only.


In other examples for coordinate transformation, the transformation may be performed to express the coordinates of the IMU in the coordinate system of the STC. Hand motion rotations and compensation motions may then be calculated in the STC's coordinate system. As above, Rodrigues' rotation formula may be used.


In yet other examples for coordinate transformation, the transformation may be to a third coordinate system (“reference system”). Both the coordinates of the STC and of the IMU are expressed in the reference coordinate system. Hand motion rotations and compensation motions may then be calculated in the reference coordinate system. As above, Rodrigues' rotation formula may be used.


In step 308, movement for OIS may be performed. In some examples, OIS may be performed by moving the STC's OPFE. In other examples, a lens such as lens 102 and/or an image sensor such as image sensor 106 may be moved for OIS. Assuming ideal OIS, the movement of OPFE and/or lens and/or sensor leads to a POV vector modification POIS that exactly cancels the effect of the hand motion on the POV vector, i.e.: PF+POIS=PI. After performing step 308, the STC is thus again directed towards PI. In other examples, the entire STC may be moved for OIS, i.e. OPFE, lens and image sensor are moved together as one unit.
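In terms of the vectors above, the required correction is simply the difference between the target and the perturbed POV (a sketch; the numeric values are placeholders):

```python
import numpy as np

P_I = np.array([3.0, 0.0, 15.0])    # desired POV vector (on chart 320)
P_F = np.array([3.1, -0.05, 15.0])  # POV vector after hand motion, from step 306

# OIS must realize a POV modification P_OIS with PF + P_OIS = PI, i.e.:
P_OIS = P_I - P_F                   # mapped to an OPFE / lens / sensor stroke
```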


In some embodiments, steps 304-308 may be repeated for stabilizing the STC continuously. The OIS cycles that include steps 304-308 may be performed at frequencies of e.g. 500 Hz-100 kHz. STC images or image streams are captured while the above OIS method is performed.


In some embodiments, an IMU may be fixedly attached to the OPFE, so that when moving the OPFE, the IMU moves accordingly, too. This allows for using coordinate systems having identical basis vectors for both the STC and the IMU, so that the coordinate transform of step 306 is not required.


In some embodiments, a sensor actuator may actuate the image sensor for correcting POV aberrations of a STC image. As described in the co-owned international patent application PCT/IB2021/056311, a STC image undergoes POV aberrations. One aberration is a rotation of the STC image on the image sensor (“rotational POV aberration”). When an undesired rotational hand motion is compensated by moving an OPFE as disclosed herein, the moving of the OPFE introduces a POV aberration. A sensor actuator may be used to rotate an image sensor around a normal of the image sensor for compensating the rotational POV aberration.



FIG. 4A shows schematically an embodiment of a handheld device numbered 400 and including multi-aperture cameras with at least one STC disclosed herein. Device 400 comprises a STC module 410 that includes an OPFE 412 as well as an OPFE actuator 414 for FOV scanning and/or OIS, and a Tele lens module 420 that forms a Tele image recorded by an image sensor 416. A Tele lens actuator 422 may move lens module 420 for focusing and/or OIS. Handheld device 400 may further comprise an application processor (AP) 440 that includes a FOV scanner 442, an OIS controller 444, an image generator 446 and an object tracker 448.


In other examples, device 400 may comprise a STC that includes two OPFEs as well as an OPFE actuator for each of the two OPFEs. In some examples, the OPFE actuators may actuate the OPFEs for performing OIS as disclosed herein. In other examples, a lens actuator may actuate a lens or a sensor actuator may actuate a sensor for performing OIS as disclosed herein. A STC camera based on two OPFEs is described for example in PCT/IB2021/054186. In such a STC, the optical path within the camera is folded twice, so that one speaks of a double-folded scanning Tele camera.


Handheld device 400 further comprises a Wide (or Ultra-Wide) camera module 430 which includes a second lens module 434 that forms an image recorded by a second image sensor 432. A second lens actuator 436 may move lens module 434 for focusing and/or OIS. In some examples, the STC can scan the entire FOVW or an even larger FOV. In other examples, the STC can scan a FOV that is smaller than FOVW.


In some examples, object tracker 448 may be configured to track an object in FOVW and provide tracking data to FOV scanner 442 and/or OIS controller 444. Based on the tracking data, FOV scanner 442 and/or OIS controller 444 may provide control signals to OPFE actuator 414 that actuate an OPFE rotation for tracking an object with the STC. As an example, one may track an object so that it stays at the center of FOVT. The third to seventh exemplary methods described below refer to this tracking scenario, where the Wide camera image data is used to provide tracking information which triggers Tele FOV scanning.


In some examples, tracking information and OIS information may interfere and coordination between tracking and OIS may be required for achieving a desired object tracking and/or OIS outcome.


As a third exemplary method for OIS, consider a device such as device 400 or 480 including a Wide camera and a STC, both not having OIS. The STC may track an object at rest so that the object's center is located at the center of FOVT. The tracking may occur in real-time (RT), i.e. we assume that there is no delay between the detection of a tracking deviation and its compensation. A device's rotational motion caused by a user's hand motion will be detected as an object movement in the Wide camera. In response, a tracking movement of the STC will be triggered and the object's location in the Tele FOV will be updated. In conclusion, in the RT scenario the object tracker performs OIS in the sense that the object will always be located at the center of FOVT and will not be affected by the hand motion of a user.


As a fourth exemplary method for OIS, consider a device such as device 400 or 480 including a Wide camera not having OIS and a STC having OIS. As in the third example, we assume RT object tracking on FOVW so that a (non-moving) object's center is located at the center of FOVT. OIS may be performed in RT as well. A device's rotational motion caused by a user's hand motion will be detected as an object movement in the Wide camera. In response, a tracking movement ΔT for the STC will be triggered. Simultaneously, the device's rotational motion will also be detected by the STC's OIS, and an OIS movement ΔOIS of the STC will be triggered in response. The OIS movement may be performed according to the OIS method disclosed herein. ΔT and ΔOIS are identical in terms of direction and magnitude, i.e. a STC movement of 2·ΔT=2·ΔOIS will be triggered, which is double the amount of movement required (i) for keeping the object at the center of FOVT (the desired tracking outcome) and (ii) for suppressing the impact of hand motion on the STC image (the desired OIS outcome). In conclusion, the desired outcome is achieved neither for Tele tracking nor for Tele OIS. Therefore, in some examples, the STC's OIS is disabled when using object tracking.


As a fifth exemplary method for OIS, consider a device such as device 400 or 480 including a Wide camera not having OIS and a STC having OIS. Object tracking may be performed on FOVW so that a (non-moving) object's center is located at the center of FOVT. However, object tracking and OIS may not be performed in RT. In general, OIS is performed at higher frequencies than object tracking. As an example, OIS may be performed at 500 Hz-100 kHz and object tracking at 1 Hz-100 Hz. In some examples, for preventing undesired interference between OIS and object tracking, one may disable OIS when using object tracking. In other examples, one may separate the control of OIS and object tracking in the frequency domain, as sketched below. As an example, a device's rotational motion caused by a user's hand motion that occurs at a frequency higher than e.g. 30 Hz may be corrected by OIS. Device motion at frequencies lower than e.g. 30 Hz may not be corrected by OIS; instead, the low-frequency device motion will be compensated by the object tracker.
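A hedged sketch of this frequency-domain separation (cutoff, filter type and sample rate are assumptions for illustration):

```python
import numpy as np

def split_motion(signal: np.ndarray, fs_hz: float, cutoff_hz: float = 30.0):
    """Split a motion signal into (ois_part, tracker_part) at cutoff_hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs_hz)
    high = spectrum * (freqs >= cutoff_hz)  # fast hand-shake -> OIS
    low = spectrum * (freqs < cutoff_hz)    # slow drift -> object tracker
    return np.fft.irfft(high, len(signal)), np.fft.irfft(low, len(signal))

t = np.arange(0, 1, 1 / 1000)  # 1 s of motion sampled at 1 kHz
gyro = 0.10 * np.sin(2 * np.pi * 5 * t) + 0.02 * np.sin(2 * np.pi * 80 * t)
ois_part, tracker_part = split_motion(gyro, fs_hz=1000.0)
```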


As a sixth exemplary method for OIS, consider a device such as device 400 or 480 including a Wide camera having OIS and a STC not having OIS. Object tracking may be performed on FOVW so that a (non-moving) object's center is located at the center of FOVT. Object tracking and OIS may be performed in RT. Because of the Wide camera's OIS, a device's rotational motion caused by a user's hand motion will have no impact on the Wide image stream. As the object does not move in FOVW, no tracking movement of the STC will be triggered. In conclusion, there is no hand motion compensation, and the object will no longer be located at the center of FOVT, leading to an undesired object tracking outcome. In some examples, to prevent this undesired outcome, one may disable the Wide camera's OIS when performing object tracking. In other examples, the object tracking control signals supplied to the STC may additionally include the Wide camera's OIS control signals. By superimposing the two signals, the benefits of both Wide camera OIS and proper STC tracking may be enjoyed.


As a seventh exemplary method for OIS, consider a device such as device 400 or 480 with both the Wide camera and the STC having OIS. We assume RT tracking so that an object's center is located at the center of FOVT. A device's rotational motion caused by a user's hand motion will be corrected by an OIS movement in both the Wide camera and the STC in RT. In conclusion, a user's hand motion will not impact the desired output of the object tracker.


Calibration data may be stored in a first memory 424, e.g. in an EEPROM (electrically erasable programmable read only memory) and/or in a second memory 438 and/or in a third memory 450 such as a NVM (non-volatile memory). The calibration data may comprise calibration data between Wide camera 430 and STC 410. The calibration data may further comprise calibration data between an OPFE's position and the STC's corresponding POV.


Handheld device 400 further comprises an inertial measurement unit (IMU, for example a gyroscope) 460 that supplies motion information of device 400. For example, a microcontroller unit (MCU) 470 may be used to read and process data of IMU 460. In some examples, the MCU may be controlled by OIS controller 444, which is part of AP 440. In some examples, step 304 and step 306 may be performed by the MCU, and step 308 may be performed by OPFE actuator 414 (and/or Tele lens actuator 422 and/or sensor actuator 418, in case OIS is performed by lens shift or sensor shift respectively). In some examples, MCU 470 may be integrated into AP 440.


Another embodiment of a handheld device, numbered 480 and comprising a multi-aperture camera with at least one STC as disclosed herein, is shown in FIG. 4B. An MCU (not shown) for reading and processing motion data from IMU 460 and for supplying OIS control signals may be included in STC module 410, e.g. in the driver of OPFE actuator 414.


In some examples, additional data may be used for hand motion estimation. Additional data may e.g. be image data from the Wide camera 430 or data from additional sensing units present in the handheld device.


In some examples, image data from Wide camera 430 may be used to estimate an “optical flow” from a plurality of images as known in the art, wherein OIS controller 444 may use the data of the optical flow together with data from IMU 460 for estimating motion of device 400. In other examples, only optical flow data estimated from image data of camera 410 and/or camera 430 may be used for estimating motion of device 400.
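As an illustration of the optical-flow option (a sketch assuming OpenCV; the pixel pitch and EFL values are placeholders, and f0/f1 stand for two consecutive grayscale Wide frames):

```python
import cv2
import numpy as np

def rotation_from_flow(prev_gray, curr_gray, pixel_pitch_mm, efl_mm):
    """Estimate small (yaw, pitch) rotation angles from mean optical flow."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mean_shift_px = flow.reshape(-1, 2).mean(axis=0)  # (dx, dy) in pixels
    # Small-angle approximation: angle ~ image shift / focal length
    return np.arctan(mean_shift_px * pixel_pitch_mm / efl_mm)

# angles = rotation_from_flow(f0, f1, pixel_pitch_mm=0.0008, efl_mm=5.0)
```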


Image generator 446 may be configured to generate images and image streams. In some examples, image generator 446 may be configured to use only first image data from camera 430. In other examples, image generator 446 may use image data from camera 410 and/or camera 430.


While this disclosure has been described in terms of certain embodiments and generally associated methods, alterations and permutations of the embodiments and methods will be apparent to those skilled in the art. The disclosure is to be understood as not limited by the specific embodiments described herein, but only by the scope of the appended claims.


Unless otherwise stated, the use of the expression “and/or” between the last two members of a list of options for selection indicates that a selection of one or more of the listed options is appropriate and may be made.


It should be understood that where the claims or specification refer to “a” or “an” element, such reference is not to be construed as there being only one of that element.


Furthermore, for the sake of clarity the term “substantially” is used herein to imply the possibility of variations in values within an acceptable range. According to one example, the term “substantially” used herein should be interpreted to imply possible variation of up to 5% over or under any specified value. According to another example, the term “substantially” used herein should be interpreted to imply possible variation of up to 2.5% over or under any specified value.


According to a further example, the term “substantially” used herein should be interpreted to imply possible variation of up to 1% over or under any specified value.


All patents and/or patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual reference was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention.

Claims
  • 1. A device, comprising: a Tele folded camera comprising an optical path folding element (OPFE) for folding light from a first optical path that forms an angle of less than 90 degrees to a normal of the device toward a second optical path orthogonal to the normal of the device, a lens with a lens optical axis along the second optical path, and an image sensor, wherein the device is a handheld electronic device;an OPFE actuator for tilting the OPFE in one or more directions to direct a point of view (POV) of the Tele folded camera towards a segment of a scene such that coordinate systems of the Tele folded camera and the device are not aligned;a motion sensor for sensing an undesired rotational motion of the device;a processing unit configured to perform a coordinate transformation to align the coordinate system of the Tele folded camera with the coordinate system of the device or vice versa and to calculate compensation motion to compensate for the undesired rotational motion of the device in the aligned coordinate system of the Tele folded camera, wherein the compensation motion depends on the undesired rotational motion of the device and on the Tele folded camera POV; andat least one actuator for moving at least one component of the Tele folded camera according to the compensation motion to compensate for the undesired rotational motion of the device.
  • 2. The device of claim 1, wherein the undesired rotation motion is around the device normal.
  • 3. The device of claim 1, further comprising a Wide camera having a field of view FOVW larger than a field of view FOVT of the Tele camera.
  • 4. The device of claim 1, wherein the sensing of the undesired rotational motion includes sensing of the undesired rotational motion in three directions.
  • 5. The device of claim 1, wherein the compensating of the undesired rotational motion includes compensating the undesired rotational motion in three directions.
  • 6. The device of claim 1, wherein the at least one actuator for moving the at least one component of the Tele folded camera to compensate for the device undesired rotational motion includes an OPFE actuator that moves the OPFE.
  • 7. The device of claim 1, wherein the at least one actuator for moving the at least one component of the Tele folded camera to compensate for the device undesired rotational motion includes a lens actuator that moves the lens.
  • 8. The device of claim 1, wherein the at least one actuator for moving the at least one component of the Tele folded camera to compensate for the device undesired rotational motion includes a sensor actuator that moves the sensor.
  • 9. A device, comprising: a Tele folded camera comprising an optical path folding element (OPFE) for folding light from a first optical path that forms an angle of less than 90 degrees to a normal of the device toward a second optical path orthogonal to the normal of the device, a lens with a lens optical axis along the second optical path, and an image sensor, wherein the device is a handheld electronic device;an OPFE actuator for tilting the OPFE in one or more directions to direct a point of view (POV) of the Tele folded camera towards a segment of a scene such that coordinate systems of the Tele folded camera and the device are not aligned;a motion sensor for sensing an undesired rotational motion of the device;a processing unit configured to perform a coordinate transformation that aligns coordinates of a reference coordinate system with a coordinate system of the device and a coordinate system of the Tele folded camera and to calculate compensation motion to compensate for the undesired rotational motion of the device in the aligned coordinate system of the Tele folded camera, wherein the compensation motion depends on the undesired rotational motion of the device and on the Tele folded camera POV; andat least one actuator for moving at least one component of the Tele folded camera according to the compensation motion to compensate for the undesired rotational motion of the device.
  • 10. The device of claim 1, wherein the coordinate transformation is performed using Rodrigues' rotation formula.
  • 11. The device of claim 1, wherein the motion sensor includes an inertial measurement unit (IMU).
  • 12. The device of claim 1, further comprising a microcontroller unit (MCU) configured to read out the motion sensor and to provide a control signal to the at least one actuator for moving the at least one component of the Tele folded camera to compensate for the undesired rotational motion.
  • 13. The device of claim 12, wherein the MCU is included in an application processor.
  • 14. The device of claim 1, further comprising an application processor configured to provide POV control signals to the OPFE actuator for tilting the OPFE.
  • 15. The device of claim 1, wherein the Tele folded camera is a double-folded Tele camera comprising two OPFEs.
  • 16. The device of claim 3, wherein Wide image data is used to track an object in FOVW and wherein the tracking information is used to direct the POV of the Tele folded camera towards the tracked object for object tracking with the Tele folded camera.
  • 17. The device of claim 16, wherein the moving of a component of the Tele folded camera to compensate for the undesired rotational motion of the device is disabled during the object tracking with the Tele folded camera.
  • 18. The device of claim 16, wherein the Wide camera additionally includes a component which is moved to compensate for the undesired rotational motion of the device, and wherein the moving of the Wide camera component is disabled during the object tracking with the Tele folded camera.
  • 19. The device of claim 16, wherein the moving a component of the Tele folded camera to compensate for the undesired rotational motion of the device is performed at a frequency range different from a frequency range that is used for the object tracking with the Tele folded camera.
  • 20. The device of claim 16, wherein a frequency range <30 Hz is used for the object tracking with the Tele folded camera, and wherein a frequency range >30 Hz is used to compensate for the undesired rotational motion of the device.
  • 21. The device of claim 16, wherein a frequency range <100 Hz is used for the object tracking with the Tele folded camera, and wherein a frequency range >200 Hz is used to compensate for the undesired rotational motion of the device.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a 371 of international application PCT/IB2021/056617 filed Jul. 22, 2021, and is related to and claims the benefit of priority from U.S. provisional patent application No. 63/064,565 filed Aug. 12, 2020, which is incorporated herein by reference in its entirety.

PCT Information
Filing Document Filing Date Country Kind
PCT/IB2021/056617 7/22/2021 WO
Publishing Document Publishing Date Country Kind
WO2022/034402 2/17/2022 WO A
Related Publications (1)
Number Date Country
20230164437 A1 May 2023 US
Provisional Applications (1)
Number Date Country
63064565 Aug 2020 US