The present invention relates to a control apparatus, an image pickup apparatus, a lens apparatus, a control method, and a program (or storage medium).
In a central projection type optical system, an image point movement on an imaging plane under camera shakes differs between a central part of an image (image center) and a periphery of the image (image periphery). As illustrated in
The image pickup apparatus disclosed in JP 2018-173632 calculates a correction amount for the image stabilization at the predetermined image point position using an image height dependent expression in an ideal optical system without considering the aberration of the central projection method. Thus, if the image stabilization is performed based on the correction amount calculated by the above expression for an actual optical system having a distortion residue, a correction residue or overcorrection occurs. Moreover, as illustrated in
JP 2018-173632 further discloses a method of more properly providing an image stabilization by adding design value information on a distortion of an optical system stored in a memory to the calculated correction amount, but this method complicates a calculation process. Furthermore, it also remains difficult to calculate the correction amount at the image point position with the image-point moving direction that has the skew relationship with the image-point moving direction at the image center.
The present invention provides a control apparatus, an image pickup apparatus, a lens apparatus, a control method, and a storage medium, each of which can easily and satisfactorily provide an image stabilization at a predetermined image point position including a center of an optical axis.
A control apparatus according to one aspect of the present invention includes at least one processor or circuit configured to execute a plurality of tasks including a first acquiring task configured to acquire information on an image shift sensitivity to a tilt of an imaging optical system corresponding to an image point position of the imaging optical system, which information includes an influence of a distortion of the imaging optical system, and a second acquiring task configured to acquire an image-stabilization driving amount that is used for an image stabilization by an image stabilizer configured to provide the image stabilization. The second acquiring task acquires the image-stabilization driving amount corresponding to a predetermined image point position using the information on the image shift sensitivity corresponding to the predetermined image point position.
An image pickup apparatus and a lens apparatus each including the above control apparatus also constitute another aspect of the present invention. A control method corresponding to the above control apparatus and a storage medium storing a program that causes a computer to execute the control method also constitute another aspect of the present invention.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Referring now to the accompanying drawings, a detailed description will be given of embodiments according to the present invention. Corresponding elements in respective figures will be designated by the same reference numerals, and a duplicate description thereof will be omitted.
In the following description, in a three-dimensional orthogonal coordinate system (X-axis direction, Y-axis direction, and Z-axis direction), the X-axis direction is a long side direction of the imaging plane, the Y-axis direction is a short side direction of the imaging plane, and the Z-axis direction is an optical axis direction of the optical system.
The imaging optical system 101 includes a focus optical system 1011, a magnification varying (zoom) optical system 1012, a diaphragm (aperture stop) 1013, and an OIS optical system 1014. The imaging optical system 101 forms an object image on an imaging plane of the image sensor 201 using light from the object at an in-focus position within a set angle of view. The focus optical system 1011 provides focusing. The magnification varying optical system 1012 provides a magnification variation (zooming) in order to change an imaging angle of view. The diaphragm 1013 adjusts a light amount captured from the object. The OIS optical system 1014 provides the image stabilization to an image blur that occurs during still or motion image capturing by decentering itself from the optical axis of the imaging optical system 101. Here, OIS is an image stabilization performed by moving the OIS optical system 1014.
The lens microcomputer 102 controls the OIS optical system 1014. More specifically, the lens microcomputer 102 determines an OIS driving amount of the OIS actuator 105 using an image-stabilization (IS) driving amount from the camera microcomputer 202 and a position signal from the OIS encoder 103 that detects the position of the OIS optical system 1014. The OIS driving amount is determined so as not to exceed a movable range of the OIS actuator 105. When the OIS actuator 105 receives an OIS driving amount signal from the OIS driver 104, the OIS actuator 105 moves the OIS optical system 1014 in a direction including a component of a direction orthogonal to the Z-axis direction to decenter it from the optical axis of the imaging optical system 101 and thereby provides the image stabilization. That is, the OIS actuator 105 functions as one of the image stabilizers that provide image stabilization.
The lens memory 106 stores optical design information on the imaging optical system 101. The optical design information includes information on a tilt-image shift sensitivity of the imaging optical system 101 for each image height (information on an image shift sensitivity to a tilt of the imaging optical system 101 according to an image point position of the imaging optical system 101). The information on the tilt-image shift sensitivity is information obtained by using the designed value of the imaging optical system 101 and includes the influence of the distortion of the imaging optical system 101. Use of the information on the tilt-image shift sensitivity can provide a satisfactory image stabilization at a predetermined image point position of the imaging optical system 101, when the image pickup system 1 generates a rotation blur so that the X-Y plane orthogonal to the optical axis is tilted to the optical axis. The camera memory 211 may store the optical design information on the imaging optical system 101 including information on the tilt-image shift sensitivity. Both the lens memory 106 and the camera memory 211 may store the optical design information on the imaging optical system 101 including the information on the tilt-image shift sensitivity.
The imaging optical system 101 has a distortion DIST(h) expressed by the following expression:
DIST(h)=(h−h0)/h0
h0=f tan ω
where f is a focal length of the imaging optical system 101, ω is a half angle of view, h is a distance (real image height) from the optical axis of the imaging optical system 101 to a position on the image plane where a principal ray having the half angle of view ω incident from the object plane is imaged, and h0 is an ideal image height of the central projection method.
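As a numerical check of the definition above, the distortion can be evaluated directly; the sample values of f, ω, and h below are chosen only for illustration:

```python
import math

def distortion(h: float, f: float, omega: float) -> float:
    """DIST(h) = (h - h0) / h0, where h0 = f * tan(omega) is the ideal
    image height of the central projection method."""
    h0 = f * math.tan(omega)
    return (h - h0) / h0

# A real image height 5% shorter than the ideal one gives DIST ≈ -0.05
# (barrel distortion).
print(distortion(19.0, 50.0, math.atan(0.4)))
```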
Having a distortion means that the distortion amount is not zero at one or more image heights within the imaging range. The imaging optical system having the distortion includes an imaging optical system having a magnification varying function and a focusing function and having the distortion in a certain magnification varying state or in-focus state.
The image sensor 201 includes a CCD (Charge Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, or another image sensor. The image sensor 201 converts an object image formed on the imaging plane of the image sensor 201 by the imaging optical system 101 into an electric signal, and outputs it as an image signal. The image signal as an analog signal is converted into a digital signal by an unillustrated A/D converter and then output.
The camera microcomputer 202 controls the entire image pickup system 1. For example, the camera microcomputer 202 reads out the image signal as image data from the image sensor 201. The camera microcomputer 202 performs processing such as image processing for the image data based on the optical design information, displaying the image data on the display/operation unit 203, and saving the image data in the recording medium 204. The camera microcomputer 202 issues instructions, such as focusing, zoom magnification changing, and diaphragm adjusting of the imaging optical system 101, to the lens microcomputer 102. Some of the settings relating to the processing may be changed by an operation unit such as a display/operation unit 203 and an unillustrated button.
The camera microcomputer 202 acquires the IS driving amount (image-stabilization driving amount that is used for the image stabilization by the image stabilizer) according to a flow of
The gyro sensor 205 outputs information on an angular velocity of the image pickup system 1 as a motion detecting signal. The acceleration sensor 206 outputs information on a moving amount of the image pickup system 1 in a translational direction as a motion detecting signal. The camera microcomputer 202, upon receiving the motion detecting signal transmitted from each sensor, transmits the IS driving amount to the lens microcomputer 102 or the IIS control unit 207 in the camera microcomputer 202 so as to provide the image stabilization to an object image against a motion of the image pickup system 1. In the image stabilization, either OIS or IIS may be performed, or both OIS and IIS may be performed with a determined share of image stabilization (such as 50% of OIS and 50% of IIS).
The IIS control unit 207 controls the image sensor 201. More specifically, the IIS control unit 207 determines the IIS driving amount of the IIS actuator 210 using the IS driving amount from the camera microcomputer 202 and the position signal from the IIS encoder 208 that detects the position of the image sensor 201. The IIS driving amount is determined so as not to exceed the movable range of the IIS actuator 210. When the IIS actuator 210 receives the IIS driving amount signal from the IIS driver 209, it moves the image sensor 201 in a direction including a component of a direction perpendicular to the Z-axis direction to decenter it from the optical axis of the imaging optical system 101 and provides the image stabilization. That is, the IIS actuator 210 functions as one of the image stabilizers that provide the image stabilization.
The lens apparatus 100 may include a gyro sensor 107 and an acceleration sensor 108. In this case, in the OIS, the lens microcomputer 102 determines the OIS driving amount using the IS driving amount acquired using the motion detecting signals output from these sensors and the position signal from the OIS encoder 103.
A description will now be given of processing during the image stabilization at a predetermined image point position. When the gyro sensor 205 or the acceleration sensor 206 detects a motion of the image pickup system 1, each sensor outputs the motion detecting signal (information on a blur) to the camera microcomputer 202. The camera microcomputer 202 acquires the IS driving amount using the information on the tilt-image shift sensitivity stored by the lens memory 106, IS position information on the imaging plane, and the motion detecting signals. The camera microcomputer 202 transmits the acquired IS driving amount to the lens microcomputer 102 or the IIS control unit 207.
Acquisition of Information on Tilt-Image Shift Sensitivity
In this embodiment, the tilt-image shift sensitivity is an image-point moving amount in a direction orthogonal to a predetermined rotation axis when the imaging optical system 101 is tilted to that rotation axis orthogonal to the optical axis of the imaging optical system 101 on the imaging plane.
A description will now be given of an image-point moving amount tx0 in the +X-axis direction at a center position O on the imaging plane, which is the image center when a rotation blur amount ωy about the Y-axis occurs, and an image-point moving amount tx at a predetermined image point position A.
The image-point moving amount tx0 is expressed by the following expression (1):
tx0=ωy·LS (1)
where LS is a tilt-image shift sensitivity at an image height of 0.
Assume the imaging plane (X-Y plane) in a polar coordinate system (R-Θ coordinate system) with a center position O as an origin, and (r, θ) is a coordinate of the predetermined image point position A. That is, in this embodiment, the predetermined image point position A is a position on the imaging plane represented by a plurality of parameters. The image height on the horizontal axis in
kLS_r(hr)=LSr(hr)/LS (2)
where LSr(hr) is a tilt-image shift sensitivity at an image height hr.
The image-point moving amount tx0 is expressed by the following expressions (3) to (5) using a parallel component trx0 parallel to a straight line OA and a vertical component tθx0 perpendicular to the straight line OA:
trx0=tx0·cos θ=ωy·LS·cos θ (3)
tθx0=tx0·(−sin θ)=−ωy·LS·sin θ (4)
|tx0|=(trx0²+tθx0²)^(1/2) (5)
The parallel component trx0 has a positive sign in the direction separating from the center position O (R direction), and the vertical component tθx0 has a positive sign in the direction orthogonal to the R direction toward the counterclockwise direction about the center position O (θ direction). The R direction and the θ direction are also referred to as a meridional direction and a sagittal direction, respectively.
Next, consider the image-point moving amount tx at the predetermined image point position A. The parallel component trx parallel to the straight line OA is affected by the tilt-image shift sensitivity LSr(r) at the image height r, and the vertical component tθx perpendicular to the straight line OA is affected by the tilt-image shift sensitivity LS at the image height of 0. From the above, the image-point moving amount tx is expressed by the following expressions (6) to (8) using the parallel component trx and the vertical component tθx:
trx=kLS_r(r)·trx0=kLS_r(r)·ωy·LS·cos θ (6)
tθx=kLS_r(0)·tθx0=−ωy·LS·sin θ (7)
|tx|=(trx²+tθx²)^(1/2) (8)
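Expressions (6) to (8) can be sketched as a minimal Python function; LS and the coefficient kLS_r(r) would come from the stored optical design information:

```python
import math

def image_point_shift_x(omega_y, LS, k_ls_r, theta):
    """Image-point moving amount at image point A(r, theta) for a rotation
    blur omega_y about the Y-axis, per expressions (6) to (8).
    k_ls_r is the tilt-image shift sensitivity coefficient kLS_r(r)."""
    t_rx = k_ls_r * omega_y * LS * math.cos(theta)   # component parallel to OA (6)
    t_thx = -omega_y * LS * math.sin(theta)          # component perpendicular to OA (7)
    return t_rx, t_thx, math.hypot(t_rx, t_thx)      # magnitude |tx| (8)
```

With kLS_r(r) = 1 (no distortion influence) the magnitude reduces to the image-center amount tx0 = ωy·LS of expression (1).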
In this way, the image-point moving amount tx at the predetermined image point position A when the rotation blur amount ωy occurs about the Y-axis can be calculated. Similarly, an image-point moving amount ty at the predetermined image point position A in the polar coordinate system when the rotation blur amount ωx occurs about the X-axis is expressed by the following expressions (9) to (11) using a parallel component try parallel to the straight line OA and a vertical component tθy perpendicular to the straight line OA:
try=kLS_r(r)·try0=kLS_r(r)·ωx·LS·sin θ (9)
tθy=kLS_r(0)·tθy0=ωx·LS·cos θ (10)
|ty|=(try²+tθy²)^(1/2) (11)
As described above, the image-point moving amount t at the predetermined image point position A when the rotation blur amount (ωx, ωy) occurs about the predetermined rotation axis orthogonal to the optical axis on the imaging plane can be expressed by the following expressions (12) to (14) using a parallel component tr parallel to the straight line OA and a vertical component tθ perpendicular to the straight line OA:
tr=trx+try=kLS_r(r)·LS·(ωy·cos θ+ωx·sin θ)=K1(r,θ)·ωy+K2(r,θ)·ωx (12)
tθ=tθx+tθy=LS·(−ωy·sin θ+ωx·cos θ)=K3(r,θ)·ωy+K4(r,θ)·ωx (13)
|t|=(tr²+tθ²)^(1/2) (14)
Coefficients (K1, K2, K3, K4) in the expressions (12) and (13) are given as follows:
K1(r,θ)=kLS_r(r)·LS·cos θ
K2(r,θ)=kLS_r(r)·LS·sin θ
K3(r,θ)=−LS·sin θ
K4(r,θ)=LS·cos θ
As expressed by the expressions (12) to (14), the image-point moving amount t is determined by the correction coefficient information (K1, K2, K3, K4), which includes the tilt-image shift sensitivity and the position information (r, θ) on the image point position, and by the rotation blur amount (ωx, ωy). In this embodiment, the lens memory 106 previously stores, as the information on the tilt-image shift sensitivity, a correction coefficient table of the correction coefficient information (K1, K2, K3, K4) in a matrix format defined by the image point position illustrated in
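The correction coefficient table might be organized as in the following sketch; the grid spacing and the stored values here are placeholders, since a real table is derived from the design values of the imaging optical system 101:

```python
# Hypothetical correction coefficient table in a matrix format defined by
# the image point position (r, theta); all values are placeholders.
R_GRID = [0.0, 5.0, 10.0, 15.0]                  # image heights (mm)
TH_GRID = [0.0, 1.5708, 3.1416, 4.7124]          # theta samples (rad)
TABLE = {(i, j): (1.0, 0.0, 0.0, 1.0)            # (K1, K2, K3, K4) per node
         for i in range(len(R_GRID))
         for j in range(len(TH_GRID))}

def lookup_coefficients(r, theta):
    """Select the table entry nearest the requested image point position,
    mirroring the table selection step described in the text."""
    i = min(range(len(R_GRID)), key=lambda n: abs(R_GRID[n] - r))
    j = min(range(len(TH_GRID)), key=lambda n: abs(TH_GRID[n] - theta))
    return TABLE[(i, j)]
```

A finer grid or interpolation between nodes is a design choice; the document only requires that the coefficients be selectable by the image point position.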
The information on the tilt-image shift sensitivity may include the tilt-image shift sensitivity for each image height in order to reduce the information stored in the lens memory 106, or may provide the image-point moving amount t using position information on a predetermined image point position that is a target of the image stabilization. The position information on the image point position may be information on a polar coordinate system or information on a predetermined coordinate system (such as an orthogonal coordinate system).
Setting of Image-Stabilization Position Information on Imaging Plane
This embodiment can switch a setting mode of the image pickup system 1 to an image-center IS mode that sets a predetermined image point position (image stabilization position) that is a target of the image stabilization to the center of the imaging plane, or an IS spot setting mode that can set an image stabilization point to a predetermined image point position. In a case where the IS spot setting mode is set, the image stabilization position can be set on the display/operation unit 203. The position settable by the display/operation unit 203 may be linked with the image point position where autofocus is performed or the image point position where automatic photometry (light metering) is performed. The image point position where the autofocus is performed may be a position automatically detected by the pupil detection, person detection, or the like. The IS position information (r, θ) on the imaging plane is sent to the camera microcomputer 202, and the correction coefficient information to be used is selected from the correction coefficient table.
Motion Detecting Signal
The gyro sensor 205 detects angular velocities about a plurality of rotation axes of the image pickup system 1 and outputs information on the rotation blur amount as a motion detecting signal. In this embodiment, the gyro sensor 205 detects angular velocities about the X-axis and the Y-axis, and outputs information on the rotation blur amount (ωx, ωy). The acceleration sensor 206 detects accelerations in a plurality of axial directions of the image pickup system 1 and outputs information on the translational blur amount as a motion detecting signal. In this embodiment, the acceleration sensor 206 detects accelerations in the X-axis and Y-axis directions, and outputs information on the translational blur amount (ax, ay). The gyro sensor 205 may include a plurality of sensors, each of which detects an angular velocity about a single axis. Similarly, the acceleration sensor 206 may include a plurality of sensors, each of which detects acceleration in a single direction.
Acquisition of Image-Stabilization Driving Amount
The camera microcomputer 202 acquires an IS driving amount using the information on the tilt-image shift sensitivity, the IS position information, and the motion detecting signal. For example, in a case where an image blur at a predetermined image point position A due to the rotation blur is corrected by IIS, the image sensor 201 may be moved so as to cancel the image-point moving amount t. An IS driving amount x in the X-axis direction and an IS driving amount y in the Y-axis direction of the IIS actuator 210 are expressed by the following expressions (15) and (16):
x=tr·cos θ−tθ·sin θ=ωy{sin²θ+kLS_r(r)·cos²θ}LS+ωx{kLS_r(r)−1}LS·sin θ·cos θ=K′1(r,θ)·ωy+K′2(r,θ)·ωx (15)
y=tr·sin θ+tθ·cos θ=ωy{kLS_r(r)−1}LS·sin θ·cos θ+ωx{kLS_r(r)·sin²θ+cos²θ}LS=K′3(r,θ)·ωy+K′4(r,θ)·ωx (16)
Coefficients (K′1, K′2, K′3, K′4) in the expressions (15) and (16) are given as follows:
K′1(r,θ)={sin²θ+kLS_r(r)·cos²θ}LS
K′2(r,θ)={kLS_r(r)−1}LS·sin θ·cos θ
K′3(r,θ)={kLS_r(r)−1}LS·sin θ·cos θ
K′4(r,θ)={kLS_r(r)·sin²θ+cos²θ}LS
As expressed by the expressions (15) and (16), the IS driving amount (x, y) includes the correction coefficient information (K′1, K′2, K′3, K′4) and the rotation blur amount (ωx, ωy). Thus, the correction coefficient table of the correction coefficient information (K′1, K′2, K′3, K′4) in a matrix format may be stored as the information on the tilt-image shift sensitivity in the lens memory 106. By using the correction coefficient information (K′1, K′2, K′3, K′4) instead of the above correction coefficient information (K1, K2, K3, K4), the IS driving amount (x, y) at the predetermined image point position A when the rotation blur amount (ωx, ωy) occurs can be easily obtained.
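Expressions (15) and (16) can be sketched directly; kLS_r(r) and LS are taken from the stored information on the tilt-image shift sensitivity:

```python
import math

def is_driving_amount(omega_x, omega_y, theta, LS, k_ls_r):
    """IS driving amounts (x, y) of expressions (15) and (16);
    k_ls_r is kLS_r(r) at the image height r of the target image point."""
    s, c = math.sin(theta), math.cos(theta)
    k1 = (s * s + k_ls_r * c * c) * LS            # K'1(r, theta)
    k2 = (k_ls_r - 1.0) * LS * s * c              # K'2(r, theta) = K'3(r, theta)
    k4 = (k_ls_r * s * s + c * c) * LS            # K'4(r, theta)
    x = k1 * omega_y + k2 * omega_x               # (15)
    y = k2 * omega_y + k4 * omega_x               # (16)
    return x, y
```

At the image center (kLS_r(0) = 1) the cross terms vanish and the driving amount reduces to (x, y) = (ωy·LS, ωx·LS).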
In the case of OIS, the OIS eccentric (decentering) sensitivity TS(h) for each image height of the OIS optical system 1014 increases as the image height becomes higher, so that the IS driving amount may be acquired based on the OIS eccentric sensitivity TS(h). Thereby, the image stabilization can be performed with high accuracy.
Regarding the image blur derived from the translation blur, the IS driving amount may be acquired using the information on the translation blur amount from the acceleration sensor 206. The IS driving amount for the translation blur may be acquired by converting the translation blur amount (ax, ay) into the rotation blur amount (ωx, ωy) using the in-focus object distance information. In a case where the rotation blur and translation blur occur at the same time, the IS driving amount may be acquired by adding the IS driving amount for the translation blur and the IS driving amount for the rotation blur. The IS driving amount for the translation blur at the predetermined image point position may be acquired by multiplying the converted rotation blur amount by the correction coefficient included in the information on the tilt-image shift sensitivity.
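The document states that the translation blur amount (ax, ay) may be converted into a rotation blur amount using the in-focus object distance information, but does not give the conversion. The following is only a plausible small-angle sketch, under the assumption that the acceleration has already been integrated twice into a displacement:

```python
def translation_to_rotation_blur(dx, dy, object_distance):
    """Hypothetical conversion of a translational displacement (dx, dy)
    into an equivalent rotation blur (omega_x, omega_y) using the in-focus
    object distance; this exact formula is an assumption, not taken from
    the document."""
    # Small-angle approximation: a camera translation d shifts the apparent
    # object direction by roughly d / L radians at object distance L.
    omega_y = dx / object_distance   # X translation acts like rotation about Y
    omega_x = dy / object_distance   # Y translation acts like rotation about X
    return omega_x, omega_y
```

The converted amounts could then be multiplied by the correction coefficients of expressions (15) and (16), as the text suggests.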
In a case where the in-focus position is close to the short distance end, the translational component of the object plane generated by the rotation blur becomes large. The IS driving amount for the image blur caused by the translational component according to the object distance may be acquired by the above method.
The tilt-image shift sensitivity changes according to the object distance and the focal length (imaging angle of view) on which the imaging optical system 101 is focused. In this embodiment, the lens memory 106 stores a plurality of correction coefficient tables that differ according to an in-focus position determined by the focus optical system 1011 and a focal length determined by the magnification varying optical system 1012. Thereby, the image stabilization can be satisfactorily provided at a predetermined image point position even during the magnification variation (zooming) or focusing.
The lens apparatus 100 may be detachably attached to the image pickup apparatus 200. In this case, the information on the proper tilt-image shift sensitivity may be used for each lens apparatus 100. Thereby, even where a different lens apparatus 100 is attached to the image pickup apparatus 200 and used, the image blur at a predetermined image point position can be satisfactorily corrected.
This embodiment uses information on the tilt-image shift sensitivity that is wider in scope than that of the first embodiment. Since the configuration of the image pickup system 1 and the processing in the image stabilization in this embodiment are the same as those in the first embodiment, a detailed description thereof will be omitted.
In this embodiment, a distortion amount by which the object image of the imaging optical system 101 is deformed into a barrel shape is larger than that in the first embodiment. In an imaging optical system with a small distortion amount, an image-point moving amount at any image point position in a direction orthogonal to an image-point moving direction at the image center when the imaging optical system is tilted is almost similar to an image-point moving amount at the image center. On the other hand, the imaging optical system 101 according to this embodiment has a large distortion amount. Then, the image-point moving amount in the direction orthogonal to the image-point moving direction at the image center when the imaging optical system 101 is tilted becomes smaller as a position is more separated from the image center. Thus, in this embodiment, the lens memory 106 stores the information on a significant tilt-image shift sensitivity for the image-point moving amount in the direction orthogonal to the image-point moving direction at the image center caused by the rotation blur.
Accordingly, this embodiment adds the tilt-image shift sensitivity to the information described in the first embodiment, and creates information that includes the influence of the image-point moving amount for each image height in a direction parallel to the rotation axis when the imaging optical system 101 is tilted. The tilt-image shift sensitivity coefficient kLS_θ(hθ) at the image height hθ against the tilt-image shift sensitivity LS at the image center is expressed by the following expression (17):
kLS_θ(hθ)=LSθ(hθ)/LS (17)
When a rotation blur amount (ωx, ωy) occurs, a parallel component tr parallel to the straight line OA, which is a polar coordinate system component of the image-point moving amount t at a predetermined image point position A, and a vertical component tθ perpendicular to the straight line OA are expressed by the following expressions (12a) and (13a), respectively:
tr=trx+try=kLS_r(r)·kLS_θ(0)·LS(ωy·cos θ+ωx·sin θ)=K1(r,θ)·ωy+K2(r,θ)·ωx (12a)
tθ=tθx+tθy=kLS_r(0)·kLS_θ(r)·LS(−ωy sin θ+ωx cos θ)=K3(r,θ)·ωy+K4(r,θ)·ωx (13a)
Coefficients (K1, K2, K3, K4) in the expression (12a) and (13a) are given as follows:
K1(r,θ)=kLS_r(r)·kLS_θ(0)·LS·cos θ
K2(r,θ)=kLS_r(r)·kLS_θ(0)·LS·sin θ
K3(r,θ)=−kLS_r(0)·kLS_θ(r)·LS·sin θ
K4(r,θ)=kLS_r(0)·kLS_θ(r)·LS·cos θ
An IS driving amount x in the X-axis direction and an IS driving amount y in the Y-axis direction of the IIS actuator 210 are expressed by the following expressions (15a) and (16a):
x=tr·cos θ−tθ·sin θ=ωy{kLS_θ(r)·sin²θ+kLS_r(r)·cos²θ}LS+ωx{kLS_r(r)−kLS_θ(r)}LS·sin θ·cos θ=K′1(r,θ)·ωy+K′2(r,θ)·ωx (15a)
y=tr·sin θ+tθ·cos θ=ωy{kLS_r(r)−kLS_θ(r)}LS·sin θ·cos θ+ωx{kLS_r(r)·sin²θ+kLS_θ(r)·cos²θ}LS=K′3(r,θ)·ωy+K′4(r,θ)·ωx (16a)
Coefficients (K′1, K′2, K′3, K′4) in the expressions (15a) and (16a) are given as follows:
K′1(r,θ)={kLS_θ(r)·sin²θ+kLS_r(r)·cos²θ}LS
K′2(r,θ)={kLS_r(r)−kLS_θ(r)}LS·sin θ·cos θ
K′3(r,θ)={kLS_r(r)−kLS_θ(r)}LS·sin θ·cos θ
K′4(r,θ)={kLS_r(r)·sin²θ+kLS_θ(r)·cos²θ}LS
As described above, this embodiment acquires the IS driving amount based on the tilt-image shift sensitivities for each image height in the direction parallel to the rotation axis and the direction orthogonal to the rotation axis. Thereby, even when the image pickup system 1 uses the imaging optical system 101 having a large distortion amount, the image stabilization can be satisfactorily provided at a predetermined image point position.
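Expressions (15a) and (16a) can be sketched as a minimal Python function; the coefficients kLS_r(r) and kLS_θ(r) come from the stored design information:

```python
import math

def is_driving_amount_2(omega_x, omega_y, theta, LS, k_r, k_th):
    """IS driving amounts (x, y) of expressions (15a) and (16a);
    k_r = kLS_r(r) and k_th = kLS_theta(r) at the image height r."""
    s, c = math.sin(theta), math.cos(theta)
    k1 = (k_th * s * s + k_r * c * c) * LS        # K'1(r, theta)
    k2 = (k_r - k_th) * LS * s * c                # K'2(r, theta) = K'3(r, theta)
    k4 = (k_r * s * s + k_th * c * c) * LS        # K'4(r, theta)
    x = k1 * omega_y + k2 * omega_x               # (15a)
    y = k2 * omega_y + k4 * omega_x               # (16a)
    return x, y
```

Setting kLS_θ(r) = 1 recovers expressions (15) and (16) of the first embodiment.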
An optical system designed by a fisheye lens projection method (such as an equidistant projection method and an equisolid angle projection method) also has a significant tilt-image shift sensitivity characteristic against the image-point moving amount in the θ direction. Thus, the IS driving amount may be acquired based on the tilt-image shift sensitivities for each image height in the R and θ directions.
In a case where a large image-stabilization angle is guaranteed as a specification of the image stabilization mechanism, the IS driving amount may be determined based on the tilt-image shift sensitivity according to this embodiment.
Referring now to the accompanying drawings, a description will be given of examples of the imaging optical system 101 according to the present invention.
In each sectional view, a left side is an object side and a right side is an image side. The optical system L0 according to each example includes a plurality of lens units. As used herein, a lens unit is a group of lenses that move or stand still integrally during zooming, focusing, or image stabilization. That is, in the optical system L0 according to each example, a distance between adjacent lens units changes during zooming or focusing. The lens unit may include one or more lenses. The lens unit may include a diaphragm (aperture stop).
SP denotes the diaphragm. IP denotes an image plane, on which an imaging plane of an image sensor (photoelectric conversion element), such as a CCD sensor or a CMOS sensor, is disposed. The OIS optical system is eccentric to the optical axis of the optical system L0 during OIS.
The projection method of the optical system L0 according to Examples 1, 2, and 4 is a central projection method (Y=f tan θ). The projection method of the optical system L0 according to Example 3 is an equisolid angle projection method (Y=2·f·sin(θ/2)).
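The two projection methods can be compared numerically as follows:

```python
import math

def image_height(f, theta, projection="central"):
    """Ideal image height Y for the projection methods named in the text."""
    if projection == "central":          # Y = f * tan(theta); Examples 1, 2, 4
        return f * math.tan(theta)
    if projection == "equisolid":        # Y = 2 * f * sin(theta / 2); Example 3
        return 2.0 * f * math.sin(theta / 2.0)
    raise ValueError(f"unknown projection: {projection}")
```

The equisolid angle projection compresses large field angles, which is why the fisheye-type optical system of Example 3 exhibits the strong image-height dependence discussed in the second embodiment.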
In the spherical aberration diagram, Fno denotes an F-number, and the diagram indicates the spherical aberration amounts for the d-line (wavelength 587.6 nm) and the g-line (wavelength 435.8 nm). In the astigmatism diagram, S denotes an astigmatism amount in a sagittal image plane, and M denotes an astigmatism amount in a meridional image plane. The distortion diagram illustrates a distortion amount for the d-line. A chromatic aberration diagram illustrates a lateral chromatic aberration for the g-line. ω denotes an imaging half angle of view (°).
Numerical examples 1 to 4 corresponding to Examples 1 to 4, respectively, will be illustrated below.
In the surface data in each numerical example, r denotes a radius of curvature of each optical surface, and d (mm) denotes an on-axis distance (a distance on the optical axis) between an m-th surface and an (m+1)-th surface, where m is a surface number counted from the light incident surface. nd denotes a refractive index of each optical member for the d-line, and vd denotes an Abbe number of the optical member. The Abbe number vd of a certain material is expressed as follows:
vd=(Nd−1)/(NF−NC)
where Nd, NF, and NC are refractive indexes for the d-line (wavelength 587.6 nm), F-line (wavelength 486.1 nm), and C-line (wavelength 656.3 nm) in the Fraunhofer lines.
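The Abbe number expression can be verified with typical crown-glass refractive indices; the sample indices below are illustrative, not taken from the numerical examples:

```python
def abbe_number(nd, nF, nC):
    """Abbe number vd = (Nd - 1) / (NF - NC) for the d-, F-, and C-lines."""
    return (nd - 1.0) / (nF - nC)

# Illustrative crown-glass indices give an Abbe number of roughly 64.
print(abbe_number(1.5168, 1.52238, 1.51432))
```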
In each numerical example, all values of d, a focal length (mm), an F-number, and a half angle of view (°) are values when the optical system L0 according to each example is focused on an object at infinity (infinity object). A backfocus (BF) is a distance on the optical axis from the final surface of the lens (the lens surface closest to the image plane) to the paraxial image plane in terms of an air conversion length. An overall optical length is a length obtained by adding the backfocus to a distance on the optical axis from the frontmost surface of the lens (the lens surface closest to the object) to the final surface of the lens.
In a case where the optical surface is aspherical, an asterisk * is added to the right side of the surface number. The aspherical shape is expressed as follows:
X=(h²/R)/[1+{1−(1+k)(h/R)²}^(1/2)]+A4·h⁴+A6·h⁶+A8·h⁸+A10·h¹⁰+A12·h¹²
where X is a displacement amount from the surface vertex in the optical axis direction, h is a height from the optical axis in the direction perpendicular to the optical axis, R is a paraxial radius of curvature, k is a conical constant, and A4, A6, A8, A10, and A12 are aspherical coefficients of each order. "e±XX" in each aspherical coefficient means "×10±XX".
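A sketch of the even aspherical sag consistent with the variables defined above (the dict-based coefficient argument is a convenience of this sketch, not part of the numerical data format):

```python
import math

def asphere_sag(h, R, k, coeffs):
    """Displacement X from the surface vertex at height h, for paraxial
    radius of curvature R, conical constant k, and a dict mapping the
    order (4, 6, 8, 10, 12) to the aspherical coefficients A4..A12."""
    X = (h * h / R) / (1.0 + math.sqrt(1.0 - (1.0 + k) * (h / R) ** 2))
    for order, A in coeffs.items():
        X += A * h ** order
    return X
```

With k = 0 and no coefficients this reduces to the exact spherical sag R − (R² − h²)^(1/2).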
The tilt-image shift sensitivity data and the eccentric sensitivity data for the eccentricity of the OIS optical system are illustrated in each numerical example. A method of acquiring them will be described with reference to
The tilt-image shift sensitivity for each image height in the tilt direction (R direction) can be acquired by dividing, by the tilt angle ωx, the image-point moving amount ΔyLSr(hr), which is a difference of an imaging position on the image plane IP corresponding to each half angle of view between
The eccentric sensitivity of the OIS optical system for each image height in the eccentric direction (R direction) is acquired by dividing, by the eccentricity y of the OIS optical system, the image-point moving amount ΔyTSr(hr), which is a difference of the imaging position on the image plane IP corresponding to each half angle of view between
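Both sensitivities are simple ratios of an image-point moving amount to the perturbation that caused it; as a sketch:

```python
def tilt_shift_sensitivity(delta_y_ls, tilt_angle):
    """LSr(hr): image-point moving amount divided by the tilt angle."""
    return delta_y_ls / tilt_angle

def ois_eccentric_sensitivity(delta_y_ts, eccentricity):
    """TSr(hr): image-point moving amount divided by the eccentricity of
    the OIS optical system."""
    return delta_y_ts / eccentricity
```

Evaluating these ratios at each half angle of view yields the per-image-height sensitivity data listed in the numerical examples.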
As described above, the configuration according to the present invention can easily and satisfactorily provide an image stabilization at a predetermined image point position including the center of the optical axis.
Each embodiment expresses the information on the image shift sensitivity to the tilt of the imaging optical system 101 according to the image point position as a correction coefficient table, that is, correction coefficient information in a matrix format defined by the image point position, but the present invention is not limited to this form. The information may instead be the tilt-image shift sensitivity LSr(hr) or LSθ(hθ), or the off-axis correction coefficient information acquired from the tilt-image shift sensitivity. That is, the information on the image shift sensitivity may be any information that can provide a moving amount of a predetermined image point position against the tilt of the imaging optical system 101.
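A correction coefficient table in matrix format defined by the image point position can be consulted at an arbitrary image point by interpolating between grid entries. The following is a minimal sketch of such a lookup under the assumption of a rectangular grid; the grid spacing, table values, and function names are hypothetical, not part of the disclosed embodiments:

```python
def bilinear_lookup(table, xs, ys, px, py):
    """Correction coefficient at image point (px, py) by bilinear
    interpolation in a matrix-format table over grid points xs × ys.
    table[i][j] holds the coefficient at image point (xs[i], ys[j])."""
    def bracket(grid, v):
        # Index of the grid cell containing v, and the fractional position in it.
        i = max(0, min(len(grid) - 2, sum(g <= v for g in grid) - 1))
        t = (v - grid[i]) / (grid[i + 1] - grid[i])
        return i, min(max(t, 0.0), 1.0)
    i, tx = bracket(xs, px)
    j, ty = bracket(ys, py)
    top = table[i][j] * (1 - tx) + table[i + 1][j] * tx
    bot = table[i][j + 1] * (1 - tx) + table[i + 1][j + 1] * tx
    return top * (1 - ty) + bot * ty

# Hypothetical 2x2 correction coefficient table over an image point grid:
xs, ys = [0.0, 1.0], [0.0, 1.0]
coeffs = [[0.0, 1.0], [2.0, 3.0]]
c = bilinear_lookup(coeffs, xs, ys, 0.5, 0.5)  # interpolated coefficient
```

Interpolating in this way keeps the stored table small while still providing a coefficient for every image point position on the imaging plane.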
Each embodiment has described the tilt-image shift sensitivity as information determined for each image height in the direction (R direction) orthogonal to the tilt rotation axis of the imaging optical system 101 and in the direction parallel to that rotation axis. However, the tilt-image shift sensitivity may be information determined for each image point position over the entire imaging plane for a predetermined tilt direction. In that case, it may be the tilt-image shift sensitivity directly acquired from the image-point moving amount over the entire imaging plane, acquired using the design values of the imaging optical system 101.
Each numerical example acquires the image point position using the imaging position of the principal ray, but it may instead be acquired using the peak position of the MTF (Modulation Transfer Function).
The camera microcomputer 202 may perform image stabilization using an electronic image stabilization function that changes the effective pixel area of the image sensor 201. That is, the camera microcomputer 202 may function as one of the image stabilizers.
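Electronic image stabilization of the kind mentioned here shifts the effective pixel area (a crop window) within the full sensor readout to cancel residual image motion. The following is a minimal sketch of that idea only; the function name, sensor dimensions, and clamping policy are assumptions for illustration, not the disclosed implementation of the camera microcomputer 202:

```python
def electronic_is_crop(full_w, full_h, crop_w, crop_h, shift_x_px, shift_y_px):
    """Place the effective pixel area within the full sensor, shifted by the
    requested stabilization amount (in pixels) and clamped to the available
    margin so the crop never leaves the sensor.
    Returns (left, top, width, height) of the effective pixel area."""
    margin_x = (full_w - crop_w) // 2
    margin_y = (full_h - crop_h) // 2
    dx = max(-margin_x, min(margin_x, shift_x_px))
    dy = max(-margin_y, min(margin_y, shift_y_px))
    return margin_x + dx, margin_y + dy, crop_w, crop_h
```

The clamp makes the limitation of this method explicit: the correctable range is bounded by the margin sacrificed around the effective pixel area, which is why electronic stabilization is typically combined with an optical image stabilizer rather than replacing it.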
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
Each of the above embodiments can provide a control apparatus, an image pickup apparatus, a lens apparatus, a control method, and a storage medium, each of which can easily and satisfactorily provide an image stabilization at a predetermined image point position including a center of an optical axis.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2021-033221, filed on Mar. 3, 2021, which is hereby incorporated by reference herein in its entirety.
Number | Date | Country | Kind |
---|---|---|---|
2021-033221 | Mar 2021 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
20050157181 | Kawahara | Jul 2005 | A1 |
20110001858 | Shintani | Jan 2011 | A1 |
20130004151 | Wakamatsu | Jan 2013 | A1 |
20140111659 | Miyasako | Apr 2014 | A1 |
20150326785 | Tsubaki | Nov 2015 | A1 |
20200260010 | Nakajima et al. | Aug 2020 | A1 |
20210092296 | Kuribayashi | Mar 2021 | A1 |
20220174216 | Ozone | Jun 2022 | A1 |
Number | Date | Country |
---|---|---|
2018173632 | Nov 2018 | JP |
WO-2020195232 | Oct 2020 | WO |
Entry |
---|
Extended European Search Report issued in European Appln. No. 22158954.2 dated Jul. 28, 2022. |
Number | Date | Country | |
---|---|---|---|
20220286615 A1 | Sep 2022 | US |