The present invention relates to lens control processing performed in an optical apparatus such as a video camera.
It is often the case that an optical apparatus including an optical system of an inner focus type has a function of moving, when moving a lens unit for zooming, a lens unit for focusing based on electronic cam tracking data stored in a memory in advance to correct image plane variation accompanying zooming. This function facilitates zooming while an in-focus state is maintained.
In such a lens system of the inner focus type, even at a same focal length, different object distances change the in-focus position of the focus lens 905 for focusing on an object that is a focusing target. Continuously plotting, while changing the object distance at each focal length, the in-focus positions of the focus lens 905 at which in-focus object images are formed on the image pickup surface 906 provides plural electronic cam tracks for the respective object distances as shown in
During zooming performed by moving the zoom lens 902, moving the focus lens 905 so as to trace a cam track selected from the plural electronic cam tracks according to the object distance enables zooming while maintaining an in-focus state.
As shown in
Japanese Patent No. 2901116 discloses an optical apparatus using a TV-AF method which determines a cam track to be traced by a focus lens using an AF evaluation value during zooming. This apparatus compares, during zooming, the AF evaluation values (focus signals) obtained by minutely moving the focus lens in a close direction and an infinite direction with each other to determine a direction in which an in-focus position exists (hereinafter referred to as “in-focus direction”). Then, the apparatus moves a center position of the minute movement of the focus lens by a predetermined amount in the determined in-focus direction to repeat the minute movement, thereby determining one cam track to be traced by the focus lens during zooming.
However, as shown in
Further, a large movement amount of the focus lens to obtain the AF evaluation values for the plural cam tracks may greatly exceed a depth of focus, which causes defocusing.
Moreover, in the TV-AF method, a cycle of obtaining the AF evaluation value corresponds to a cycle of a vertical synchronizing signal, and hence accuracy of determining the cam track decreases as a zooming speed becomes higher. As a result, a frequency of erroneously determining the cam track increases.
Under these circumstances, Japanese Patent No. 2795439 discloses an optical apparatus capable of maintaining an in-focus state even during high-speed zooming and performing zooming without any dependence on a captured scene or camera work. This apparatus detects a distance (object distance) to a focusing target object, and restricts a movable range of a focus lens for correcting a cam track (that is, for obtaining an AF evaluation value) based on the detected distance.
However, Japanese Patent No. 2795439 does not describe how a cam track to be traced should be corrected within the restricted movable range of the focus lens. Hence, a simple restriction on the movable range of the focus lens causes the following problems in a real image pickup environment.
First, the apparatus disclosed in Japanese Patent No. 2795439 restricts the movable range of the focus lens based on a distance detection result and its detection accuracy. Then, the apparatus determines one cam track to be traced from cam tracks within the movable range. However, as shown in
As described above, in the TV-AF method, the cycle of obtaining the AF evaluation values corresponds to the cycle of the vertical synchronizing signal, and hence the number of times of obtaining the AF evaluation values is reduced as the zooming speed increases. As a result, the number of times of moving the center position of the minute movement is reduced. In other words, since the cam track to be traced by the focus lens is determined within the movable range with a small number of times of moving the center position of the movement, erroneous determination of the cam track may often be made, which reduces determination accuracy of the cam track.
Similarly, during a long time exposure such as a so-called slow shutter, even if the zooming speed is not high, the cycle of obtaining the AF evaluation value corresponds to an exposure cycle. As a result, the determination accuracy of the cam track is reduced.
Thus, the optical apparatus disclosed in Japanese Patent No. 2795439 does not determine the cam track to be traced by the focus lens from plural cam tracks for all object distances from an infinite end to a closest end, but narrows down the cam track to be traced within the restricted movable range based on the detected distance. However, when high-speed zooming is performed, the number of times and the amount of the center position movement need to be set by taking the number of times of obtaining the AF evaluation value into consideration. Nonetheless, Japanese Patent No. 2795439 discloses no setting method that takes the number of times of obtaining the AF evaluation value into consideration.
The present invention is directed to a lens control apparatus, an optical apparatus and a lens control method enabling high quality zooming while maintaining an in-focus state even during high-speed zooming.
The present invention provides as one aspect thereof a lens control apparatus configured to move a first lens unit for zooming and a second lens unit for focusing. The apparatus includes a focus signal generator configured to generate a focus signal indicating a focus state of an optical system from a photoelectrical conversion signal of an optical image formed by the optical system including the first and second lens units, a memory configured to store data generated for each predetermined in-focus distance and indicating a relationship between a position of the first lens unit and a position of the second lens unit, a controller configured to control, based on the data, movement of the second lens unit accompanying movement of the first lens unit, and a detector configured to detect information corresponding to a distance to a focusing target object. The controller moves the second lens unit in an infinite direction and a close direction within a movable range set based on the information corresponding to the distance. The controller changes a movement amount of a center position of the movement of the second lens unit in the infinite and close directions according to at least one selected from (a) information on an operation of the optical system, (b) information on a state of the optical system, and (c) information on a photoelectric conversion operation of the optical image.
The present invention provides as another aspect thereof an optical apparatus including an optical system configured to include a first lens unit for zooming and a second lens unit for focusing, and the above lens control apparatus.
The present invention provides as still another aspect thereof a lens control method for moving a first lens unit for zooming and a second lens unit for focusing. The method includes a focus signal generation step of generating a focus signal indicating a focus state of an optical system from a photoelectrical conversion signal of an optical image formed by the optical system including the first and second lens units, a control step of controlling movement of the second lens unit accompanying movement of the first lens unit based on data generated for each predetermined in-focus distance and indicating a relationship between a position of the first lens unit and a position of the second lens unit, and a detection step of detecting information corresponding to a distance to a focusing target object. The control step moves the second lens unit in an infinite direction and a close direction within a movable range set based on the information corresponding to the distance. The control step changes a movement amount of a center position of the movement of the second lens unit in the infinite and close directions according to at least one selected from (a) information on an operation of the optical system, (b) information on a state of the optical system, and (c) information on a photoelectric conversion operation of the optical image.
Other aspects of the present invention will become apparent from the following description and the attached drawings.
Exemplary embodiments of the present invention will hereinafter be described with reference to the accompanying drawings.
First, a basic technology for an embodiment of the present invention will be described before description of the embodiment.
In
Additionally, reference characters p2, . . . , p6 denote positions on an in-focus cam track to be traced by the focus lens calculated based on the two representative cam tracks. The positions on the in-focus cam track are calculated by the following expression (1):
p(n+1) = |p(n) − a(n)| / |b(n) − a(n)| × |b(n+1) − a(n+1)| + a(n+1)   (1)
According to this expression (1), for example, when the focus lens is located at the position p0 in
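For reference, the internal-division calculation of expression (1) can be sketched as follows (a minimal illustration; the function and variable names are not from the patent):

```python
def next_focus_position(p_n, a_n, b_n, a_n1, b_n1):
    """Expression (1): keep the internal division ratio of the focus lens position
    between the two representative cam tracks (a: infinite side, b: close side)
    when the zoom lens moves from boundary Z(n) to boundary Z(n+1)."""
    ratio = abs(p_n - a_n) / abs(b_n - a_n)   # internal division ratio at Z(n)
    return ratio * abs(b_n1 - a_n1) + a_n1    # same ratio applied at Z(n+1)

# Example: a focus position lying 40% of the way from track a toward track b
# keeps that 40% ratio at the next zoom area boundary.
p1 = next_focus_position(p_n=120.0, a_n=100.0, b_n=150.0, a_n1=110.0, b_n1=170.0)
```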
Next, description will be made of a case where there is not a restriction in which a stop position of the zoom lens should be set only on a boundary of zoom areas having stored representative cam track data.
In
a0, a1, . . . , ak−1, ak, . . . , an
b0, b1, . . . , bk−1, bk, . . . , bn
When a zoom lens position is Zx which is not located on the boundary of the zoom areas (hereinafter referred to as “zoom area boundary”) and a focus lens position is px, ax and bx are calculated by the following expressions (2) and (3):
ax = ak − (Zk − Zx) × (ak − ak−1) / (Zk − Zk−1)   (2)
bx = bk − (Zk − Zx) × (bk − bk−1) / (Zk − Zk−1)   (3)
In other words, an internal division ratio is obtained from a current zoom lens position and two zoom area boundary positions (e.g., Zk and Zk−1 in
Further, according to an internal division ratio obtained from ax, px and bx, two cam track data for a same focal length in the stored four representative cam track data are internally divided as in the case of the expression (1). This way, pk and pk−1 can be obtained.
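A sketch of this interpolation, assuming illustrative variable names, is given below; ax and bx follow expressions (2) and (3), after which expression (1) can be applied with ax, px and bx:

```python
def interpolate_boundary_tracks(Zx, Zk, Zk_1, ak, ak_1, bk, bk_1):
    """Expressions (2) and (3): obtain the focus positions ax and bx of the two
    representative cam tracks at a zoom position Zx lying between the zoom area
    boundaries Zk-1 and Zk (all positions in the same units, e.g. pulses)."""
    ax = ak - (Zk - Zx) * (ak - ak_1) / (Zk - Zk_1)
    bx = bk - (Zk - Zx) * (bk - bk_1) / (Zk - Zk_1)
    return ax, bx
```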
During zooming from a wide-angle side to a telephoto side, a moving speed of the focus lens necessary for maintaining an in-focus state can be known from a difference between a focus position pk which is a tracking movement destination and a current focus position px and a time period necessary for moving the zoom lens from Zx to Zk.
During zooming from the telephoto side to the wide-angle side, a moving speed of the focus lens for maintaining an in-focus state can be known from a difference between a focus position pk−1 which is a tracking movement destination and the current focus position px and a time period necessary for moving the zoom lens from Zx to Zk−1.
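The required focus moving speed can be sketched as below (illustrative names; positions are assumed to be in pulses and the zooming speed in pulses per second):

```python
def focus_speed(px, p_target, Zx, Z_target, zoom_speed_pps):
    """Focus lens speed needed to reach the tracking movement destination p_target
    (pk toward the telephoto side, pk-1 toward the wide-angle side) in the time the
    zoom lens takes to move from Zx to Z_target."""
    zoom_time_s = abs(Z_target - Zx) / zoom_speed_pps   # time to reach the boundary
    return (p_target - px) / zoom_time_s                # signed speed [pulse/s]
```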
In
Next, description will be made of a cam track tracing method for solving the aforementioned problem in which the cam track to be traced by the focus lens during zooming from the wide-angle side to the telephoto side cannot be determined.
Thus, continuously maintaining the focus lens at the peak (point P) of the AF evaluation value can solve the problem in which a cam track to be traced during zooming cannot be determined. However, the peak of the AF evaluation value is a relative value, which is sequentially changed due to change of an angle of view or a contrast change of an object by the zooming. Thus, minutely moving the focus lens in the infinite direction and the close direction to observe changes in AF evaluation value enables detection of a direction in which the peak (in-focus position) of the AF evaluation value is present.
In
At the point P, when the focus lens is minutely moved, the AF evaluation value fluctuates in a small width. This fluctuation is as shown in
In
When the focus lens is located at a point F0 away by L from the center cam track to the infinite side, in order to obtain an AF evaluation value at the point F0, the focus lens is moved to a point F1 (point away by L from the center cam track to the infinite side) with the same inclination as that of the center cam track 1900. Thus, an AF evaluation value E0 is obtained.
Then, the focus lens is moved to a point F2 away by L from the center cam track 1900 to the close side.
Then, in order to obtain an AF evaluation value at the point F2, the focus lens is moved to a point F3 (point away by L from the center cam track 1900 to the close side) with the same inclination as that of the center cam track 1900, and thereby an AF evaluation value E2 is obtained. Similarly, the focus lens is moved to a point F4 away by L from the center cam track 1900 to the infinite side. Repeating these operations enables minute movement of the focus lens around the center cam track 1900.
In the above movement of the focus lens, since the cam track to be traced (in-focus cam track) is the center cam track 1900, the AF evaluation values E0, E2, E4, . . . of the close side and the infinite side obtained by the minute movement are approximately constant as shown in
However, the center cam track is not always a cam track (in-focus cam track) to be traced by the focus lens. Referring to
When the AF evaluation values are as shown in
Thus, the position F′5 is moved closer by an amount W to the close side than the virtual center cam track 1901. Then, the minute movement is performed again with a cam track (virtual center cam track) 1902 located closer by the amount W to the close side than the cam track 1901.
Similarly, with the virtual center cam track 1902 set as a center, the minute movement is performed several times to compare the obtained AF evaluation values with one another. The amount (center movement amount) W is set based on the comparison result, and the change of the center cam track is repeated until an in-focus cam track is determined.
As described above, this embodiment determines (specifies), based on the AF evaluation values obtained in the minute movement of the focus lens during zooming, a cam track according to which the AF evaluation values are approximately constant while changing the center cam track. As a result, the problem in which the cam track to be traced by the focus lens during zooming from the wide-angle side to the telephoto side cannot be determined can be solved.
The aforementioned zooming control is generally performed, for a reason that focus detection is performed by using an image pickup signal from the image pickup element, in synchronization with a vertical synchronizing signal of an image (video).
At Step S703, the microcomputer detects a state of an operation system of a camera body. The microcomputer receives information on a zoom switch unit operated by a user, and displays magnification variation (zooming) operation information such as a zoom lens position on the display in order to notify the user of zooming execution.
At Step S704, the microcomputer performs AF processing. In other words, the microcomputer performs autofocus processing according to changes of the AF evaluation signal.
At Step S705, the microcomputer performs zooming processing. In other words, the microcomputer performs processing of a compensating operation for maintaining an in-focus state during zooming. Specifically, in order to trace the cam track shown in
At Step S706, the microcomputer selects any of the moving directions and the moving speeds for the zoom lens and the focus lens calculated from the processing at Steps S704 and S705 to use during the AF processing and the zooming. This process is for moving the zoom lens between a controlled telephoto end and a controlled wide-angle end and the focus lens between a controlled closest end and a controlled infinite end, each controlled end being provided by software so that the lens does not come into contact with its mechanical end.
At Step S707, the microcomputer outputs a control signal to a motor driver according to information of the moving direction and information of the moving speed for zooming and focusing set at Step S706 to control drive and stop of each lens. After completion of the processing at Step S707, the microcomputer returns to step S703.
The series of processing shown in
Hereinafter, referring to
At Step S400 shown in
At Step S401, the microcomputer compares the AF evaluation values obtained by the minute movement on the close side and the infinite side of the center cam track with one another to determine a direction of a center movement. In this determination method, as shown in
When the focus lens is continuously reciprocated a predetermined number of times in a same area with respect to the center cam track to repeatedly obtain AF evaluation values, the microcomputer determines that this area includes an in-focus point (in-focus cam track). When performing the center movement, the microcomputer sets the signs + and − to indicate the close side and the infinite side, respectively.
Subsequently, at Step S402, the microcomputer determines a necessity of the center movement. If the center movement is necessary (that is, if the lens system is in an out-of-focus state where no AF evaluation value indicating an in-focus state is obtained), the microcomputer calculates the center movement amount W at Step S403. If no center movement is necessary, in other words, if the lens system is in an in-focus state, the microcomputer proceeds to Step S405.
The center movement amount W is calculated at Step S403 based on a focal length (zoom lens position) and the like. As shown in
On the other hand, since the spaces between the cam tracks are wider on the telephoto side than those on the wide-angle side, if the center movement amount W is equal to that on the wide-angle side, changing between cam tracks for different object distances is difficult. Thus, the center movement amount W is changed depending on the focal length, and its value is determined based on the space between the cam tracks. The center movement amount W is set to a value enough for the focus lens to minutely move (cover) on all the cam tracks from a cam track for an infinite object distance to a cam track for a closest object distance.
A sign is added to the determined center movement amount W based on the moving direction of the focus lens. At Step S404, the microcomputer adds the center movement amount W to a minute movement center focus lens position Px to update Px as follows:
Px = Px ± W
When the center movement occurs, the focus lens is minutely moved by using this updated Px as a new minute movement center focus lens position.
Next, at Step S405, the microcomputer performs processing shown in
At Step S501 shown in
At Step S601, the microcomputer clears a zoom area variable v. At Step S602, the microcomputer calculates a zoom lens position Z(v) on a boundary of the zoom area v with the following expression (6). This zoom lens position Z(v) corresponds to the zoom lens positions Z0, Z1, Z2, . . . shown in
Z(v) = (telephoto end zoom lens position − wide-angle end zoom lens position) × v/s + wide-angle end zoom lens position   (6)
At Step S603, the microcomputer determines whether or not the zoom lens position Z(v) obtained at Step S602 is coincident with the current zoom lens position Zx. If the zoom lens position Z(v) is coincident with the current zoom lens position Zx, the microcomputer at step S607 sets a boundary flag to 1 which indicates that the zoom lens position Zx is located on the boundary of the zoom area v.
If the zoom lens position Z(v) is not coincident with the current zoom lens position Zx at Step S603, the microcomputer at Step S604 determines whether or not a relationship of Zx<Z(v) is established. If the relationship is established at Step S604, the current zoom lens position Zx is located between Z(v−1) and Z(v), and the microcomputer at Step S606 sets the boundary flag to 0. If the relationship is not established at Step S604, the microcomputer at step S605 increments the zoom area v and then returns to step S602.
Repeating the above processing enables, when the processing goes out of the flowchart of
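The zoom area determination of Steps S601 to S607 can be sketched as follows (an illustration assuming that zoom positions are expressed in pulses and that s denotes the number of zoom areas, as used in expression (6)):

```python
def find_zoom_area(Zx, z_wide, z_tele, s):
    """Steps S601 to S607: return the zoom area v for the current zoom position Zx
    and a boundary flag telling whether Zx lies exactly on a zoom area boundary."""
    for v in range(s + 1):
        Zv = (z_tele - z_wide) * v / s + z_wide   # expression (6)
        if Zv == Zx:
            return v, True        # boundary flag = 1: Zx is on the boundary of area v
        if Zx < Zv:
            return v, False       # boundary flag = 0: Zx lies between Z(v-1) and Z(v)
    return s, False               # fallback: treat positions beyond the tele end as area s
```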
Returning to
First, at Step S502, the microcomputer clears an object distance variable n. At Step S503, the microcomputer determines whether or not the current zoom lens position is located on the zoom area boundary. If the boundary flag is 0, the microcomputer proceeds to Step S505 and subsequent steps since the current zoom lens position is not located on the zoom area boundary.
At Step S505, the microcomputer sets Z(v) to Zk, and sets Z (v−1) to Zk−1. Then, at Step S506, the microcomputer reads four table data A(n, v−1), A(n, v), A(n+1, v−1) and A(n+1, v). At Step S507, the microcomputer calculates ax and bx with the expressions (2) and (3).
On the other hand, if the boundary flag is 1 at Step S503, the microcomputer at Step S504 loads an in-focus position A(n, v) corresponding to a zoom lens position (v in this description) at the object distance n and A(n+1, v) corresponding to a zoom lens position at the object distance n+1. The microcomputer respectively stores them as ax and bx in the memory.
At Step S508, the microcomputer determines whether or not the minute movement center focus lens position Px is equal to or higher than ax. If Px is equal to or higher than ax, the microcomputer at step S509 determines whether or not the minute movement center focus lens position Px is equal to or higher than bx. If Px is not equal to or higher than bx, the minute movement center focus lens position Px is located between the object distances n and n+1, and the microcomputer stores cam track parameters for this case in the memory at Steps S513 to S515. At step S513, the microcomputer sets α=Px−ax. At Step S514, the microcomputer sets β=bx−ax. At Step S515, the microcomputer sets γ=n.
The microcomputer determines “No” at Step S508 when the minute movement center focus lens position Px is a superinfinite position. In this case, the microcomputer at Step S512 sets α=0, and then proceeds to Step S514 and subsequent steps thereof to store cam track parameters for the infinite object distance in the memory.
The microcomputer determines “Yes” at Step S509 when the minute movement center focus lens position Px is located further to the close side. In this case, the microcomputer increments the object distance n at Step S510, and determines whether or not the object distance n is further to the infinite side than a position m corresponding to the closest position at Step S511. If the position n is further to the infinite side than the closest distance position m, the microcomputer returns to Step S503.
The microcomputer determines “No” at Step S511 when the minute movement center focus lens position Px is located at a superclose position. In this case, the microcomputer proceeds to Step S512 and subsequent steps thereof to store cam track parameters for the closest object distance in the memory.
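A simplified sketch of Steps S508 to S515 follows; it assumes that the bracketing in-focus positions ax and bx have already been computed for every adjacent pair of object distances at the current zoom position (the names and the list layout are illustrative, not the patent's):

```python
def cam_track_parameters(Px, ab_pairs):
    """ab_pairs[n] = (ax, bx): in-focus positions for object distances n and n+1.
    Returns the cam track parameters (alpha, beta, gamma) for the minute movement
    center focus lens position Px, including the super-infinite and super-close
    cases (Steps S512, S514 and S515)."""
    for n, (ax, bx) in enumerate(ab_pairs):
        if Px < ax:                      # super-infinite position: alpha = 0
            return 0.0, bx - ax, n
        if ax <= Px < bx:                # Px lies between object distances n and n+1
            return Px - ax, bx - ax, n   # alpha, beta, gamma
    ax, bx = ab_pairs[-1]                # super-close position: parameters for the
    return 0.0, bx - ax, len(ab_pairs) - 1   # closest object distance (assumed)
```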
Referring back to
Then, at Step S406, the microcomputer calculates a zoom lens position (movement destination position from the current zoom lens position) Z′x where the zoom lens will reach after one vertical synchronizing time period (1V). The zoom lens position Z′x after one vertical synchronizing time period can be calculated by the following expression (7), where Zsp(pps) denotes a zooming speed determined at Step S400.
Z′x = Zx ± Zsp ÷ vertical synchronizing frequency   (7)
In this expression, pps represents a unit for a rotation speed of a stepping motor used as the zoom motor, specifically a step amount (1 step=1 pulse) of rotation per second. Signs + and − of the expression (7) respectively indicate a telephoto direction and a wide-angle direction corresponding to the moving direction of the zoom lens.
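A sketch of expression (7) follows (illustrative names; the sign is chosen from the zooming direction):

```python
def zoom_position_after_1v(Zx, Zsp_pps, v_sync_hz, toward_tele):
    """Expression (7): zoom lens position reached after one vertical synchronizing
    period (1V) at the zooming speed Zsp [pps]."""
    step = Zsp_pps / v_sync_hz                     # pulses moved during 1V
    return Zx + step if toward_tele else Zx - step
```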
Then, at Step S407, the microcomputer determines a zoom area v′ where the zoom lens position Z′x is located. At Step S407, the microcomputer performs processing similar to that shown in
At step S408, the microcomputer determines whether or not the zoom lens position Z′x after one vertical synchronizing time period is located on the boundary of the zoom area. In the case of the boundary flag=0, the microcomputer determines that the zoom lens position Z′x is not located on the boundary, and then proceeds to Step S409 and subsequent steps thereof.
At Step S409, the microcomputer sets Zk←Z(v′) and Zk−1←Z(v′−1). At Step S410, the microcomputer reads four table data A(γ, v′−1), A(γ, v′), A(γ+1, v′−1) and A(γ+1, v′), where an object distance γ is specified by processing of
On the other hand, if the microcomputer determines “Yes” at Step S408, the microcomputer at Step S412 loads an in-focus position A(γ, v′) corresponding to the zoom area v′ at the object distance γ and an in-focus position A(γ+1, v′) corresponding to the zoom area v′ at the object distance γ+1. Then, the microcomputer stores these in-focus positions as a′x and b′x in the memory.
Next at Step S413, the microcomputer calculates an in-focus position (target position) P′x of the minute movement center focus lens when the zoom lens position reaches Z′x. By using the expression (1), a tracking target position after one vertical synchronizing period of time can be represented by the following expression (8):
P′x = (b′x − a′x) × α/β + a′x   (8)
Next at Steps S414 to S417, the microcomputer performs sign determination of an amplitude L based on a minute movement count N. As shown in
Similarly, at Step S414, when the minute movement count N is one of 2, 3, 6 and 7, the focus lens is to be moved further to the close side than the minute movement center cam track, so the microcomputer adds the amplitude L to the minute movement center focus lens position. Thus, at Step S416, the focus lens position P′L ahead by 1V becomes P′x+L.
Next at Step S417, the microcomputer increments the minute movement count N. Then at Step S418, when the minute movement count N reaches 8, the microcomputer initializes the minute movement count N to 0. A maximum value of the minute movement count N is not necessarily 8. In this embodiment, four minute movements in the infinite direction, the infinite direction, the close direction and the close direction constitute one cycle. However, one cycle may be constituted by two alternating movements in the infinite direction and the close direction.
The above processing enables calculation of the focus lens position P′L ahead by 1V. After the completion of this processing, at Step S706 of
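Expression (8) and the sign handling of Steps S414 to S418 can be gathered into one sketch, shown below. The text states that counts 2, 3, 6 and 7 place the focus lens on the close side (P′x+L); the remaining counts are assumed here to place it on the infinite side (P′x−L). All names are illustrative.

```python
def focus_position_after_1v(a_next, b_next, alpha, beta, L_amp, n_count):
    """Expression (8) plus Steps S414 to S418: compute the minute-movement-center
    in-focus position P'x at the next zoom position, then offset it by the
    amplitude L according to the minute movement count N (wrapping at 8)."""
    p_center = (b_next - a_next) * alpha / beta + a_next   # expression (8)
    if n_count in (2, 3, 6, 7):
        p_next = p_center + L_amp   # close side of the minute-movement center
    else:
        p_next = p_center - L_amp   # infinite side (assumed for counts 0, 1, 4, 5)
    return p_next, (n_count + 1) % 8
```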
As shown in
ΔF=P′L−PL
ΔF={(b′x−a′x)×α/β+a′x±L}−(Px±L)±W
The sign of L is set based on the minute movement count N, and the center movement amount W is calculated based on a comparison result of the AF evaluation values.
Thus, depending on whether the difference ΔF is positive or negative, a direction for moving the focus lens is selected from the close direction and the infinite direction. As a result, the minute movement of the focus lens enables determination of a cam track to be traced.
The presupposed technology of the present invention has been described. Hereinafter, differences of an embodiment of the present invention from the presupposed technology will be described.
In this case, a microcomputer in the interchangeable lens performs a zoom operation described below in response to a signal transmitted from the camera body. Alternative embodiments of the present invention include not only the video camera but also various other image pickup apparatuses such as a digital still camera.
In
The lens units 101, 102, 104 and 105 and the aperture stop 103 constitute an image taking optical system. The image taking optical system is a rear focus (inner focus) optical system including four lens units having positive, negative, positive and positive optical powers in order from the object side. In the figure, each lens unit includes one lens. In reality, however, each lens unit may include plural lenses.
Reference numeral 106 denotes an image pickup element such as a CCD sensor or a CMOS sensor. A light flux from an object that has passed through the image taking optical system forms an image on the image pickup element 106. The image pickup element 106 photoelectrically converts the object image to output an image pickup signal (photoelectric conversion signal). The image pickup signal is amplified to an optimal level by an amplifier (AGC) 107 and input to a camera signal processing circuit 108.
The camera signal processing circuit 108 converts the input image pickup signal into a standard television signal (video signal) to output the signal to an amplifier 110. The television signal amplified to an optimal level by the amplifier 110 is output to a magnetic recording/reproducing device 111 to be recorded in a magnetic recording medium such as a magnetic tape. As the recording medium, other media such as a semiconductor memory and an optical disk may be used.
The television signal amplified by the amplifier 110 is transmitted to an LCD display circuit 114 to be displayed as a captured image on an LCD 115. The LCD 115 also displays an image to notify a user of an image pickup mode, an image pickup state and various warnings. Such an image is generated by a character generator 113 controlled by a camera microcomputer 116, and is mixed into the television signal by the LCD display circuit 114, thereby being superimposed on the captured image and displayed therewith.
The image pickup signal input to the camera signal processing circuit 108 can be simultaneously compressed by using an internal memory, and then recorded in a still image recording medium 112 such as a card medium.
The image pickup signal input to the camera signal processing circuit 108 is also input to an AF signal processing circuit 109 as a focus signal generator. An AF evaluation value signal (focus signal) generated by the AF signal processing circuit 109 and indicating a focus state of the image taking optical system is read by communication with the camera microcomputer 116.
The camera microcomputer 116 reads states of a zoom switch 130 and an AF switch 131, and detects a state of a photo switch 134.
In a state where the photo switch 134 is half-pressed, a focusing operation by AF is started, and focus lock is activated in an in-focus state. In a state where the photo switch 134 is fully pressed, the focus lock is performed irrespective of an in-focus or out-of-focus state to capture an image in a memory (not shown) of the camera signal processing circuit 108, and a still image is recorded in a magnetic tape or the still image recording medium 112.
The camera microcomputer 116 determines whether a current image pickup mode is a moving image pickup mode or a still image pickup mode according to a state of a mode switch 133, and controls the magnetic recording/reproducing device 111 and the still image recording medium 112 via the camera signal processing circuit 108. The camera microcomputer 116 accordingly supplies a television signal suited to recording. When the mode switch 133 is set in a reproduction mode, the camera microcomputer 116 controls reproduction of the television signals recorded in the magnetic recording/reproducing device 111 and the still image recording medium 112.
The camera microcomputer 116 includes a computer zoom unit 119 provided as a controller. When the zoom switch 130 is operated while the AF switch 131 is OFF, the computer zoom unit 119 outputs a zoom signal to a zoom motor driver 122 via a motor controller 118 according to an internal program. The zoom signal is for moving the zoom lens 102 in a telephoto direction or a wide-angle direction corresponding to a direction where the zoom switch 130 is operated.
The zoom motor driver 122 moves the zoom lens 102 via a zoom motor 121 in response to reception of the zoom signal in a direction corresponding to the operation direction of the zoom switch 130.
A cam data memory (memory) 120 stores representative cam track data shown in
The camera microcomputer 116 includes an AF control unit 117.
When the AF switch 131 is ON and the zoom switch 130 is operated, zooming needs to be performed while maintaining an in-focus state. In this case, the computer zoom unit 119 obtains the AF evaluation value signal from the AF signal processing circuit 109 and distance information to an object (focusing target object) from an object distance detection circuit (detector) 127. Then, the computer zoom unit 119 moves the zoom lens 102 and the focus lens 105 based on the cam track data, the AF evaluation value signal and the distance information.
A detection signal (information corresponding to distance) from the object distance detection circuit 127 is subjected to calculation processing by a distance information processor 128 of the camera microcomputer 116, and is output as the distance information to the computer zoom unit 119.
When the zoom switch 130 is not operated while the AF switch 131 is ON, the AF control unit 117 outputs a signal to the focus motor driver 126 via the motor controller 118 so as to move the focus lens 105 such that a level of the AF evaluation value signal becomes a maximum. Thus, the focus lens 105 is moved via the focus motor 125 to perform an autofocus operation.
The object distance detection circuit 127 measures a distance to the object by a triangulation method using an active sensor, and outputs the distance information which is a measuring result. In this case, as the active sensor, an infrared sensor which is often used in a compact camera can be used.
This embodiment describes an example where distance detection is performed by the triangulation method. However, other distance detection methods can be employed. For example, a TTL phase difference detection method can be employed in which a signal (phase difference signal as information corresponding to distance) corresponding to the object distance is obtained.
In this case, an element such as a half-prism or a half-mirror for dividing a light flux that has passed through an exit pupil of the image taking optical system is provided, and the light flux emerging from the element is guided to at least two line sensors via a sub-mirror or an image-forming mirror. Then, correlation calculation is performed on outputs of these line sensors, thereby obtaining a deviation direction and a deviation amount of the outputs. An object distance is obtained from the deviation direction and amount.
Of light fluxes from the object 201, a light flux passing through the first optical path forms an image on the line sensor 203 by the image-forming lens 202, and a light flux passing through the second optical path forms an image on the line sensor 205 by the image forming lens 204.
The two line sensors are separated from each other by the base length B. Hence, as obvious from
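Although the patent does not give the formula, a typical triangulation relationship for such a configuration is sketched below: the image deviation measured by correlating the two line sensor outputs relates the base length B and the focal length of the image-forming lenses to the object distance (illustrative only):

```python
def triangulation_distance(base_length_m, focal_length_m, deviation_m):
    """Typical triangulation relation (not taken from the patent text): an object at
    distance D shifts the two line-sensor images relative to each other by roughly
    X = B * f / D, so D can be estimated as B * f / X."""
    return base_length_m * focal_length_m / deviation_m

# Example: B = 30 mm, f = 10 mm, measured deviation 0.06 mm -> about 5 m
distance_m = triangulation_distance(0.030, 0.010, 0.00006)
```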
For the detector, a method of obtaining a signal equivalent to a distance to the object by measuring a propagation time of an ultrasonic wave using an ultrasonic sensor can also be used.
The distance information from the object distance detection circuit 127 is transmitted to the distance information processor 128. The distance information processor 128 performs the following three processes PROCESS 1 to PROCESS 3.
(PROCESS 1) The distance information processor 128 calculates, of the cam tracks for various object distances shown in
Specifically, the distance information processor 128 calculates an object distance (meters) corresponding to a virtual cam track which internally divides a γ column cam track and a γ+1 column cam track at an internal division ratio of α/β, by using the current lens positions and the cam track parameters α, β and γ. The cam track parameters α, β and γ are converted into the object distance by using predetermined correlation table data. Thus, a real distance to the object can be obtained.
(PROCESS 2) The distance information processor 128 performs reverse conversion of the distance to the object obtained from the object distance detection circuit 127 by using the correlation table data described above in PROCESS 1 to obtain a cam track expressed by the cam track parameters α, β and γ. In the reverse conversion, of the correlation table data, wide-angle side data where the cam tracks of
(PROCESS 3) The distance information processor 128 calculates a difference between the real distance to the object obtained in PROCESS 1 and the distance to the object obtained from the object distance detection circuit 127 in PROCESS 2, and a direction of the difference.
Among these PROCESSES 1, 2 and 3, PROCESS 2 enables determination of a cam track corresponding to the distance information detected by the object distance detection circuit 127.
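A sketch of how PROCESS 1 and PROCESS 3 fit together is shown below; the correlation table is represented by a hypothetical callable to_meters, which stands in for the predetermined correlation table data mentioned above (PROCESS 2, the reverse conversion, is omitted here):

```python
def distance_difference(alpha, beta, gamma, to_meters, detected_distance_m):
    """PROCESS 1: convert the current cam track parameters into a real object
    distance via the correlation table.  PROCESS 3: compare it with the distance
    reported by the object distance detection circuit and return the difference
    (its sign gives the direction)."""
    current_distance_m = to_meters(gamma, alpha / beta)        # PROCESS 1
    diff_m = detected_distance_m - current_distance_m          # PROCESS 3
    return current_distance_m, diff_m
```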
The camera microcomputer 116 also performs exposure control. The camera microcomputer 116 refers to a luminance level of the television signal generated by the camera signal processing circuit 108, and controls an iris driver 124 to drive an IG meter 123 so that the luminance level can be appropriate for exposure. Thus, the camera microcomputer 116 controls an aperture diameter (aperture amount or aperture value) of the aperture stop 103.
The aperture diameter of the aperture stop 103 is detected by an iris encoder 129 to perform feedback control of the aperture stop 103. When appropriate exposure control cannot be performed only by the aperture stop 103, an exposure time period of the image pickup element 106 is controlled by a timing generator (TG) 132 to perform a fast shutter or a slow shutter, the latter being called long-time exposure. When there is a shortage of exposure due to image pickup under low illuminance or the like, the camera microcomputer 116 controls a gain of the television signal via the amplifier 107.
The user can manually set an image pickup mode and a camera function which are suited to each of various image pickup conditions.
Next, referring to
In this embodiment, based on the distance information obtained by the object distance detection circuit 127, a cam track to be traced (followed) by the focus lens 105 is determined (generated), and thereby zooming is performed while maintaining an in-focus state.
At Step S400, the computer zoom unit 119 determines a moving speed (hereinafter referred to as “zooming speed”) of the zoom lens 102 during zooming.
At Step S300, the computer zoom unit 119 calculates cam track parameters αd, βd and γd corresponding to the distance information obtained by the object distance detection circuit 127, for example, by the following method.
First, in order to obtain a correlation between the distance information and the representative cam tracks shown in
Concerning a focal length, of the cam track data shown in
Accordingly, at a point of time when the zoom lens 102 is located on the wide-angle side, interpolation calculation is performed based on the cam track parameters on the telephoto side, whereby one cam track to be traced by the focus lens 105 can be determined.
Step S300 is executed for each predetermined cycle (e.g., one vertical synchronizing cycle). Thus, even when the object distance changes during zooming, based on the distance information from the object distance detection circuit 127, latest cam tracks to be traced by the focus lens 105 are sequentially updated.
At Step S301, the computer zoom unit 119 determines a correction range of the cam tracks based on the cam track parameters αd, βd and γd corresponding to the distance information obtained by the object distance detection circuit 127. This correction range corresponds to a movable range of the focus lens 105 during a correction operation of the cam tracks using the AF evaluation values, in other words, a movable range for the minute movement of the focus lens 105 performed to obtain the AF evaluation values, for example, a range between an upper limit 201 and a lower limit 202 shown in
In this embodiment, for example, when the distance information (object distance) 203 obtained by the object distance detection circuit 127 is 5 m, the correction range is restricted within an increase/decrease range of ±50 cm with respect to the object distance. In other words, the upper limit 201 is equivalent to a cam track corresponding to an object distance of 4.5 m, and the lower limit 202 is equivalent to a cam track corresponding to an object distance of 5.5 m. This increase/decrease range may be determined according to detection accuracy of the object distance detection circuit 127.
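As a small example of the correction range described above (the tolerance of ±50 cm is the example from the text and would in practice follow the detection accuracy of the object distance detection circuit 127):

```python
def correction_range_limits(detected_distance_m, tolerance_m=0.5):
    """Upper limit 201 corresponds to the nearer distance (e.g. 4.5 m), lower limit
    202 to the farther distance (e.g. 5.5 m) around the detected object distance."""
    return detected_distance_m - tolerance_m, detected_distance_m + tolerance_m

upper_limit_m, lower_limit_m = correction_range_limits(5.0)   # (4.5, 5.5)
```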
In this embodiment, after a cam track to be followed (hereinafter referred to as “following cam track”) is roughly determined based on the distance information, the following cam track is redetermined (regenerated) more accurately by the minute movement of the focus lens 105 performed to obtain the AF evaluation values. The correction range is set to restrict a movable range of the focus lens 105 during the minute movement.
Employing such a configuration eliminates necessity of setting detection resolution (detection accuracy) of the object distance detection circuit 127 so high, accordingly enabling realization of a low-cost and compact optical apparatus.
Conventionally, an object distance has been undeterminable during zooming from the wide-angle side to the telephoto side. As a result, a center position (movement center) of the minute movement of the focus lens is greatly moved during the zooming to determine a following cam track among many cam tracks, creating a possibility of defocusing (image blur).
On the other hand, in this embodiment, the correction range is set, and a following cam track is determined among a small number of cam tracks restricted within the correction range. Thus, the movement amount of the center position (center movement amount) of the focus lens in the minute movement is small, enabling suppression of defocusing.
The small center movement amount of the focus lens provides an advantage of reducing a moving speed of the focus lens.
As shown in
On the other hand, restricting the correction range to reduce the center movement amount enables mitigation of such a problem.
As a real operation in this embodiment, determination of the following cam track by the minute movement of the focus lens is performed only within the correction range between the upper limit 201 and the lower limit 202 shown in
Thus, in this embodiment, the correction range is set according to the distance detection resolution of the object distance detection circuit 127, and accurate determination of the following cam track is performed using the AF evaluation values obtained only within the correction range. As a result, erroneous operations or defocusing caused by use of the AF evaluation values to redetermine the following cam track can be suppressed.
Referring back to
At Step S402, the computer zoom unit 119 determines a necessity of the center movement based on the result of Step S401, and starts calculation of the center movement amount from Step S302 if the center movement is necessary. If no center movement needs to be performed, in other words, if the lens system is in an in-focus state, the computer zoom unit 119 proceeds to Step S405.
The calculation of the center movement amount implemented at Steps S302 to S311 will be described below in detail.
Determination of the following cam track during zooming can be performed, if the zooming is started from the wide-angle side, while suppressing defocusing more on the wide-angle side than on the telephoto side. A reason is as follows.
All the cam tracks are densely set on the wide angle side. Hence, even when any of the cam tracks on the infinite side and the close side is selected as the following cam track, all the cam tracks can be covered by a small center movement. Additionally, a depth of focus (distance range where defocusing cannot be recognized by human eyes) is larger (deeper) on the wide-angle side, and thus the following cam track can be determined while suppressing defocusing.
However, an angle of view is wider on the wide-angle side, and hence various objects enter an AF target angle of view, and detection of a distance to a main object that is a focusing target object is difficult. As a result, there is a problem of a difficulty of determining the following cam track.
On the telephoto side, the cam tracks are away from one another. Hence, in order to determine one following cam track from the plural cam tracks, the center movement amount needs to be set larger than that on the wide-angle side. However, due to a small depth of focus on the telephoto side, the large center movement amount causes a difficulty of determining the following cam track while suppressing defocusing. On the other hand, the angle of view on the telephoto side is narrower than that on the wide-angle side, and hence the distance to the main object can be easily obtained, which facilitates determination of the following cam track.
In other words, the wide-angle side has characteristics that the determination of the following cam track is difficult while defocusing hardly occurs. The telephoto side has characteristics that the determination of the following cam track is easy while defocusing easily occurs.
For such characteristics, the depth of focus is an important factor. The depth of focus is determined based on the aperture diameter (aperture amount) of the aperture stop 103. When the aperture diameter of the aperture stop 103 is large, the depth of focus is small, and hence defocusing easily occurs. Conversely, when the aperture diameter is small, the depth of focus is large, and defocusing does not easily occur.
Thus, in this embodiment, according to the aperture diameter (depth of focus) of the aperture stop 103, the center movement amount that is a movement amount of the movement center (center position) in the minute movement of the focus lens 105 during zooming is changed, thereby facilitating determination of the following cam track while suppressing defocusing. The aperture diameter of the aperture stop 103 and the depth of focus are information on an operation or a state of the image taking optical system.
When the depth of focus is superimposed thereon, the range of the depth of focus is a range surrounded with an upper limit side curved line 221 and a lower limit side curved line 222. For example, a wide-angle side depth of field (the depth of field is a value obtained by converting the depth of focus into an object distance) is ±1 m around 5 m, a telephoto side depth of field is ±30 cm, which is smaller than the wide-angle side depth of field, and a range of the depth of field (that is, a range of the depth of focus) is smaller than the correction range.
As shown in
The use of the object distance detection circuit 127 can solve the above-described problem of presence of various objects in the AF target angle of view on the wide angle side. The object distance detection circuit 127 detects object distances at one or several points in the angle of view irrespective of a size of the angle of view, and hence the object distance detection circuit 127 can detect a distance to the main object even when the angle of view is wide.
In other words, in the wide-angle side range where “depth of focus>correction range” is set at the time of starting zooming, through use of the object distance detection circuit 127, a following cam track present within the correction range is determined by the minute movement of the focus lens 105 and cam track changing by the center movement. This processing is important for zooming in which defocusing is suppressed.
Next, referring back to Step S302 of
At Step S302, the computer zoom unit 119 obtains the aperture diameter (aperture value) from the aperture stop 103. At Step S303, the computer zoom unit 119 calculates the depth of focus based on the aperture value obtained at Step S302 and the zoom lens position.
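The patent does not spell out the calculation at Step S303; a commonly used approximation, given here only as an illustration, relates the depth of focus to the aperture value and the permissible circle of confusion:

```python
def depth_of_focus(f_number, coc_m):
    """Common approximation (assumed, not stated in the patent): the image-side
    depth of focus is roughly +/- F * delta around the image plane, i.e. a total
    of 2 * F * delta, where F is the aperture value and delta the permissible
    circle of confusion."""
    return 2.0 * f_number * coc_m

# Example: F4 with a 0.03 mm circle of confusion gives a 0.24 mm depth of focus.
dof_m = depth_of_focus(4.0, 0.00003)
```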
At Step S304, the computer zoom unit 119 obtains a zooming speed Zsp and an exposure time period (shutter speed) of the image pickup element 106 determined by the timing generator (TG) 132.
At Step S305, the computer zoom unit 119 determines the size relationship between the depth of focus and the correction range. When the correction range is smaller than the depth of focus (“correction range < depth of focus”), the computer zoom unit 119 proceeds to Step S306 to calculate a zooming time period Tz (the zooming time period in the range where “depth of focus > correction range” is established) to the point P where the depth of focus becomes equal to the correction range. The zooming time period can be calculated as follows:
Zooming time period Tz [s] = distance to point P [pulse] ÷ zooming speed Zsp [pps]
At Step S307, the computer zoom unit 119 calculates a center movement frequency M during the zooming time period Tz. The center movement frequency M is proportional to the number of times of obtaining the AF evaluation values synchronized with the vertical synchronizing cycle V, and calculated by using the vertical synchronizing cycle V. In other words, the center movement frequency M can be calculated as follows:
Center movement frequency M = zooming time period Tz [s] × vertical synchronizing cycle V [Hz]
The center movement frequency M corresponds to the number of times of obtaining the AF evaluation values.
Next, the computer zoom unit 119 calculates, by center movements performed M times during the zooming time period Tz, the center movement amount W′ that enables the minute movement of the focus lens 105 within the correction range from the lower limit 202 to the upper limit 201. This center movement amount W′ can be calculated by dividing the correction range by the center movement frequency M.
First at Step S308, the computer zoom unit 119 takes an error K into consideration for the center movement frequency M to calculate the following C:
C = M − K (K is an integer of 1 or more)
The above calculation presumes an image pickup condition where the object distance is not changed.
Next at Step S309, the computer zoom unit 119 calculates one center movement amount W′ as follows:
Center movement amount W′ = correction range ÷ C
When the shutter speed is slower than the vertical synchronizing cycle V (that is, when the exposure time is longer than one vertical synchronizing time period), the center movement frequency M is calculated as follows:
Center movement frequency M = zooming time period Tz [s] ÷ shutter speed [s]
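Steps S306 to S309, including the slow-shutter variant above, can be gathered into one sketch (illustrative names; the distance to the point P is assumed to be expressed in pulses):

```python
def center_movement_amount(correction_range, dist_to_p_pulse, zsp_pps,
                           v_sync_hz, shutter_s=None, margin_k=1):
    """Tz = distance to point P / zooming speed (Step S306); the center movement
    frequency M is one per 1V, or one per exposure for a slow shutter (Step S307);
    C = M - K allows for errors (Step S308); one center movement amount W' splits
    the correction range into C steps (Step S309)."""
    tz_s = dist_to_p_pulse / zsp_pps
    if shutter_s is not None and shutter_s > 1.0 / v_sync_hz:
        m = tz_s / shutter_s            # slow shutter: one AF value per exposure
    else:
        m = tz_s * v_sync_hz            # one AF evaluation value per 1V
    c = max(int(m) - margin_k, 1)       # C = M - K, kept at least 1
    return correction_range / c         # center movement amount W'
```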
The center movement amount W′ may be changed according to the focal length (zoom lens position) in addition to the zooming speed and the shutter speed (exposure time period of the image pickup element 106). The zooming speed and the focal length are information on an operation or a state of the image taking optical system. The shutter speed is information on a photoelectric conversion operation of the optical image.
Next, referring to
Referring back to Step S305 of
At Step S310, the computer zoom unit 119 calculates the center movement amount W′ as follows:
Center movement amount W′ = correction range ÷ C′
In the expression, C′ denotes a value that satisfies the relationship of W′ < depth of focus; a value obtained beforehand by a sufficient number of actual measurements, with which no defocusing occurs until completion of zooming, is used.
After completion of the processing, at Step S311, the computer zoom unit 119 updates the minute movement position Px as follows by adding the center movement amount W′ thereto:
Minute movement position Px=Px+W′
Next at Step S405, the computer zoom unit 119 calculates the cam track parameters corresponding to the center cam track of the minute movement. This processing is similar to that of Step S405 shown in
As described above, in this exemplary embodiment, within the correction range (within moving range) for performing the minute movement of the focus lens which is a range restricted based on the object distance, the movement amount of the center position in the minute movement is changed according to the aperture value (in other words, to the depth of focus). As a result, defocusing can be suppressed while improving determination accuracy of the following cam track by using the AF evaluation values.
Changing the center movement amount according to the zooming speed, the focal length and the shutter speed enables further improvement of the determination accuracy of the following cam track.
Thus, according to this embodiment, the center movement amount is changed according to at least one of information on the operation or the state of the optical system and information on the photoelectric conversion operation of the optical image. This prevents zooming from being completed without any following cam track being determined, whether due to a reduction in the number of times of obtaining the AF evaluation values caused by their detection cycle during a fast zooming operation or a slow shutter operation, or due to the characteristics of the cam tracks on the telephoto side. Furthermore, zooming can be performed while suppressing defocusing caused by the minute movement of the focus lens for determining the following cam track.
Furthermore, the present invention is not limited to these embodiments and various variations and modifications may be made without departing from the scope of the present invention.
This application claims the benefit of Japanese Patent Application No. 2009-005505, filed on Jan. 14, 2009, which is hereby incorporated by reference herein in its entirety.