Preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
A lens block of the video camera includes a lens group that has a focus lens 1 configured to focus a subject image, incident through an image-capture lens 1c, on an image-capture surface of the image capture device; a position detector configured to detect the position of each lens; a lens drive mechanism configured to drive each lens; and a lens driver configured to control movement of the lens drive mechanism. Lenses other than the focus lens 1 and the image-capture lens 1c, such as a wobbling lens used to determine the direction of the focal position, are omitted from the lens block shown in the drawing.
The focus lens 1 includes the position detector 1a configured to detect the position of the focus lens 1, that is, the focus position; the lens drive mechanism 1b configured to move the focus lens in the direction of the optical axis; and the lens driver 2 configured to control the lens drive mechanism. Likewise, a wobbling lens (not shown) includes a position detector and a wobbling lens drive mechanism configured to move the lens position in the direction of the optical axis in order to perform appropriate wobbling. The lens block also includes an aperture stop (not shown) configured to limit the amount of light that can pass through; the aperture stop includes an aperture stop position detector configured to detect the aperture size of the aperture stop and an aperture stop drive mechanism configured to open and close the aperture stop.
The lens driver 2 is supplied with respective detected signals, including a signal from the position detector 1a indicating the focus position, a signal indicating the amount of wobbling, and a signal indicating the aperture size of the aperture stop. The lens driver 2, including a lens CPU and a lens drive circuit, is configured to move the focus (focal point) of the focus lens 1 according to instructions transmitted from the control unit 9. The lens driver 2 is connected with a user interface (not shown) configured to set auto-focus modes or initiate the auto-focus operation, so that the lens driver 2 is supplied with operation signals according to operation of the user interface. The lens driver 2 includes a storage (not shown), such as a ROM or EEPROM, on which information is stored, such as focal length data of the focus lens 1 and the wobbling lens, aperture ratio data, the name of the manufacturer, and the manufacturer's serial number.
The lens driver 2 generates lens drive signals based on the stored information, the respective detected signals, and the focus control signals or wobbling control signals (described later) supplied from the control unit 9. The lens driver 2 supplies the generated lens drive signals to the lens drive mechanism 1b to move the focus lens 1 to a desired focus position. The lens driver 2 also supplies generated lens drive signals to the wobbling lens drive mechanism to wobble the wobbling lens, so that the direction of the focus position of the focus lens 1 can be detected. The lens driver 2 further generates aperture stop drive signals to control the aperture size of the aperture stop.
In the video camera shown in the drawing, electric signals output from the image capture device 3 are subjected to appropriate signal processing in the image signal generator 5, and image signals complying with a prescribed standard are generated. The image signals are transmitted to a subsequent circuit group (the image signal processor 6), and are also input to the evaluation value calculator 7. The evaluation value calculator 7 is configured to extract high-frequency components from the image signals in a specific region provided within a captured image frame, and calculates evaluation values corresponding to the image contrast. In imaging a typical subject, the evaluation value generally increases as the subject image approaches the in-focus state, and reaches a relative maximum when the subject image is in-focus. The evaluation value is updated once for each field of image signals. Auto-focus operation using evaluation values is well-known technology in the art, one example of which is described in detail in Japanese Unexamined Patent Application Publication No. 10-213736, previously disclosed by the applicant of the present invention.
The aforementioned processing is performed for each of the three primary colors R (Red), G (Green), and B (Blue). For example, the camera block includes a color separating prism (not shown). The color separating prism separates light incident from the lens block into the three primary colors R, G, and B, and supplies the R component light to an R component image capture device, the G component light to a G component image capture device, and the B component light to a B component image capture device, respectively.
The subject images for each color formed on the image capture device 3 are photo-electrically converted into electric signals by the image capture device 3 and output to the image signal generator 5, where prescribed processing is performed. The image signal generator 5, for example, includes a preamplifier (not shown) and an A/D (Analog/Digital) converter. The level of the electric signals input to the image signal generator 5 is amplified by the preamplifier, correlated double sampling is performed on the signals to eliminate reset noise, and the A/D converter converts the analog signals into digital image signals.
Further, the image signal generator 5 is configured to perform gain control, black level stabilization, dynamic range control, and the like on the supplied image signals for each color, and to supply the image signals thus obtained to the image signal processor 6, the evaluation value calculator 7, and the luminance addition value calculator 8.
The image signal processor 6 performs various signal processing on the image signals supplied from the image signal generator 5 and generates output image signals. For example, the image signal processor 6 performs knee correction to compress image signals at or above a certain level, gamma correction to set the correct level for image signals according to a configured gamma curve, and white clip or black clip processing to limit image signal levels to a prescribed range. The image signal processor 6 also performs edge enhancement processing, linear matrix processing, encoding processing, or the like to generate output image signals in a desired format.
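The following is a minimal sketch, in Python with NumPy, of how such a chain of level corrections might look on normalized signal values; the function name, the knee point, the knee slope, and the gamma value are illustrative assumptions rather than parameters taken from this description.

    import numpy as np

    def process_levels(y, knee_point=0.8, knee_slope=0.3, gamma=0.45):
        """Illustrative knee/gamma/clip chain on signals normalized to [0, 1]."""
        y = np.asarray(y, dtype=np.float64)
        # Knee correction: compress signal levels at or above the knee point.
        y = np.where(y > knee_point, knee_point + (y - knee_point) * knee_slope, y)
        # Gamma correction: map levels along a configured gamma curve.
        y = np.clip(y, 0.0, None) ** gamma
        # White clip / black clip: limit levels to the prescribed range.
        return np.clip(y, 0.0, 1.0)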
The evaluation value calculator 7 extracts high-frequency components from the image signals in a specific region provided within the captured image frame, calculates evaluation values ID corresponding to the image contrast, and supplies the calculated evaluation values ID to the control unit 9.
The image signal generator 5 (including the preamplifier and A/D converter), the image signal processor 6, the evaluation value calculator 7, and the like perform their respective processing using the vertical synchronization signal VD, the horizontal synchronization signal HD, and the clock signal CLK synchronized with the image signals supplied to these units. The vertical synchronization signal VD, the horizontal synchronization signal HD, and the clock signal CLK may alternatively be obtained from a clock signal generator.
The evaluation value calculator 7 is described in more detail below.
The luminance signal generator 21 performs the following operation using the image signals R, G, B supplied from the image signal generator 5 to generate a luminance signal DY:
DY = 0.30R + 0.59G + 0.11B. The luminance signal DY is generated in this manner because, in order to determine whether a subject image is in-focus or out-of-focus, it is sufficient to simply detect changes in the level of contrast and determine whether the contrast is high or low.
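As a minimal sketch (assuming R, G, and B are supplied as same-sized arrays of pixel values), the operation above might be written as follows:

    import numpy as np

    def luminance(r, g, b):
        """Weighted sum of the R, G, B planes (standard luma weights)."""
        return 0.30 * np.asarray(r) + 0.59 * np.asarray(g) + 0.11 * np.asarray(b)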
The evaluation value generator 22 generates the evaluation values ID0 to ID13. The evaluation values ID0 to ID13 are obtained by summing the frequency components of image signals in a specific region (hereinafter called “evaluation frame”) provided within the captured image frame, and provide values corresponding to the blurring of the image.
Each of the evaluation values ID0 to ID13 is given a name indicating its attributes, in the form: data used_evaluation frame size_evaluation value calculation method.
The data used in the evaluation value names are broadly divided into “IIR” and “Y”. “IIR” implies data including high-frequency components obtained from the luminance signal DY using a HPF (high-pass filter); whereas “Y” implies data using original frequency components of the luminance signal DY without using a HPF.
When a HPF is used, it is an IIR-type (infinite impulse response type) HPF. Evaluation values are divided into IIR0, IIR1, IIR3, and IIR4 according to the type of HPF; these represent HPFs having different cutoff frequencies. By providing HPFs with different cutoff frequencies, for example, using a HPF with a high cutoff frequency in the vicinity of the in-focus position yields larger changes in the evaluation value than a HPF with a low cutoff frequency would. Conversely, when the captured image is largely out of focus, a HPF with a low cutoff frequency yields larger changes in the evaluation value than a HPF with a high cutoff frequency would. In this manner, HPFs having different cutoff frequencies may be selected according to the focusing state during auto-focus operation in order to obtain the optimal evaluation value.
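A minimal sketch of a first-order IIR high-pass filter applied to one scan line is shown below; the filter structure and the mapping from coefficient to cutoff frequency are illustrative assumptions, not the actual IIR0 to IIR4 filters.

    import numpy as np

    def iir_highpass(line, alpha):
        """First-order IIR high-pass: y[n] = alpha * (y[n-1] + x[n] - x[n-1]).
        A larger alpha gives a lower cutoff frequency."""
        line = np.asarray(line, dtype=np.float64)
        out = np.empty_like(line)
        prev_in, prev_out = 0.0, 0.0
        for i, x in enumerate(line):
            prev_out = alpha * (prev_out + x - prev_in)
            prev_in = x
            out[i] = prev_out
        return out

    # High cutoff (small alpha) responds strongly near the in-focus position;
    # low cutoff (large alpha) responds when the image is largely out of focus.
    near_focus = iir_highpass([0, 0, 1, 1, 0, 0], alpha=0.5)
    far_focus = iir_highpass([0, 0, 1, 1, 0, 0], alpha=0.9)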
The evaluation frame size is the size of the image region used in evaluation value generation. As shown in the drawing, the following evaluation frame sizes are used:
Evaluation frame size W1: 116 pixels×60 pixels
Evaluation frame size W2: 96 pixels×60 pixels
Evaluation frame size W3: 232 pixels×120 pixels
Evaluation frame size W4: 192 pixels×120 pixels
Evaluation frame size W5: 576 pixels×180 pixels
Thus, different evaluation values can be generated corresponding to the frame sizes by setting one of the plurality of frame sizes. Hence, an appropriate evaluation value can be obtained by setting one of the evaluation values ID0 to ID13, regardless of the size of the target subject.
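The following sketch illustrates extracting such an evaluation frame, under the assumption (not stated above) that each frame is centered in the captured image:

    def evaluation_frame(image, width, height):
        """Return the central width x height region of a 2-D luminance array."""
        rows, cols = image.shape
        top = (rows - height) // 2
        left = (cols - width) // 2
        return image[top:top + height, left:left + width]

    # e.g. evaluation_frame(dy, 116, 60) for frame size W1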
Evaluation value calculation methods include the HPeak, HIntg, VIntg, and Satul methods. The HPeak method calculates horizontal-direction evaluation values by a peak-hold scheme; the HIntg method calculates horizontal-direction evaluation values by total integration; the VIntg method calculates vertical-direction evaluation values by integration; and the Satul method counts the number of saturated luminance values.
The HPeak method is an evaluation value calculation method in which a HPF is used to determine high-frequency components from horizontal-direction image signals, and is used to compute the evaluation values ID0, ID1, ID2, ID3, ID9, ID10, and ID11.
The high-frequency components of the luminance signals DY are extracted by the HPF 31 and converted to absolute values by the absolute value processing circuit 32. Subsequently, the multiplication circuit 33 multiplies these values by the horizontal-direction frame control signals WH to obtain the absolute values of high-frequency components within the evaluation frame. That is, since frame control signals WH whose multiplication value is "0" outside the evaluation frame are supplied to the multiplication circuit 33, only the absolute values of horizontal-direction high-frequency components within the evaluation frame are supplied to the line peak hold circuit 34.
Here, the vertical-direction frame control signals WV form a square wave; the horizontal-direction frame control signals WH, however, have the characteristics of a triangular wave rather than a mere square wave, so that the multiplied value of the frame control signals WH is reduced in the periphery of the frame (at both ends). Thus, as the subject image within the frame approaches the in-focus state, it is possible to reduce the effects caused by high-luminance edges around the periphery of the frame (noise, drastic changes in the evaluation values, or the like), and variability in the evaluation values caused by movement of the subject can be decreased. The line peak hold circuit 34 holds the peak value for each line. The vertical-direction integration circuit 35 adds the peak values held for each line within the evaluation frame in the vertical direction, based on the vertical-direction frame control signals WV, thereby obtaining the evaluation value. This method is called the HPeak method because horizontal-direction (H) peaks are held temporarily.
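A minimal sketch of the HPeak computation is given below, reusing the iir_highpass sketch above; it assumes the luminance plane has already been cropped to the evaluation frame, approximates the triangular horizontal weighting WH with a simple linear ramp, and treats the vertical weighting WV as uniform.

    import numpy as np

    def hpeak(dy_frame, alpha=0.5):
        """HPeak: HPF -> abs -> horizontal weight -> line peak hold -> sum."""
        rows, cols = dy_frame.shape
        # Triangular horizontal weight WH de-emphasizes the frame periphery.
        wh = 1.0 - np.abs(np.linspace(-1.0, 1.0, cols))
        total = 0.0
        for line in dy_frame:
            weighted = np.abs(iir_highpass(line, alpha)) * wh
            total += weighted.max()    # line peak hold (cf. circuit 34)
        return total                   # vertical integration (cf. circuit 35)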
The HIntg method is defined as a total-integration type horizontal-direction evaluation value calculation method.
The HIntg method is divided into IIR1 and Y types: the IIR1 type employs high-frequency components as the data, whereas the Y type employs the original luminance signals DY. Luminance addition values are obtained by a luminance addition value calculation filter circuit, which corresponds to the total-integration type horizontal-direction evaluation value calculation filter with the HPF 31 removed.
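A minimal sketch, again reusing iir_highpass: HIntg totals the absolute high-frequency components over the whole frame, and the luminance addition value corresponds to the same total integration with the HPF removed.

    import numpy as np

    def hintg(dy_frame, alpha=0.5):
        """Total integration of horizontal high-frequency components (IIR1 type)."""
        return float(sum(np.abs(iir_highpass(line, alpha)).sum()
                         for line in dy_frame))

    def luminance_addition(dy_frame):
        """The same total integration with the HPF removed (Y type)."""
        return float(np.asarray(dy_frame).sum())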
The VIntg method is a total-integration type vertical-direction evaluation value calculation method, used to obtain the evaluation values ID4 and ID5. In both the HPeak method and the HIntg method, values are added in the horizontal direction to generate evaluation values; in the VIntg method, by contrast, high-frequency components are added in the vertical direction. Consider, for example, an image whose upper half is white and lower half is black, such as a scene containing a horizon: there are high-frequency components only in the vertical direction and none in the horizontal direction, so the horizontal-direction evaluation value of the HPeak method does not function effectively. The evaluation value of the VIntg method is therefore used so that AF functions effectively for such scenes.
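A minimal sketch of the VIntg idea, filtering along columns instead of rows (the per-column filtering shown here is an assumption about how the vertical high-frequency components are obtained):

    import numpy as np

    def vintg(dy_frame, alpha=0.5):
        """Total integration of vertical-direction high-frequency components."""
        # Filter each column (the vertical direction), then integrate.
        return float(sum(np.abs(iir_highpass(col, alpha)).sum()
                         for col in np.asarray(dy_frame, dtype=np.float64).T))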
The Satul method is a calculation method in which the number of saturated luminance signals DY, that is, those whose luminance level is equal to or above a prescribed level, within the evaluation frame is determined, and the outcome is used as the evaluation value ID8. In calculating the evaluation value ID8, the luminance level of the luminance signal DY is compared with a threshold a, the number of pixels for which the luminance level is equal to or above the threshold a in the evaluation frame is counted for each field, and the count is taken as the evaluation value ID8.
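A minimal sketch of the Satul count for one field (the threshold is passed in as a parameter):

    import numpy as np

    def satul(dy_frame, threshold):
        """Count pixels whose luminance is at or above the saturation threshold."""
        return int((np.asarray(dy_frame) >= threshold).sum())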
The configuration of the video camera is further described below, referring back to the drawing.
The control unit 9, for example, includes a CPU (Central Processing Unit), RAM (Random Access Memory), and ROM (Read Only Memory), and loads computer programs stored in the ROM into the RAM to run them, whereby prescribed control and processing such as the auto-focus operation are performed. The control unit 9 receives the evaluation values calculated by the evaluation value calculator 7 once for each field, and searches for the peak of the evaluation values. The auto-focus operation is triggered by instructions from a one-shot switch 13 that directs activation of the auto-focus operation. The control unit 9 and the lens driver 2 of the lens block are configured to communicate with each other using predetermined formats and protocols, and collaborate to control the auto-focus operation. The lens driver 2 supplies various information, such as the focus position or the value indicating the aperture stop size, to the control unit 9. The lens driver 2 also generates lens drive signals, based on the focus control signals or wobbling control signals supplied from the control unit 9, to drive the focus lens 1 and the wobbling lens. The control unit 9 generates the focus control signals for driving the focus lens 1 or the wobbling control signals for driving the wobbling lens, based on the evaluation values ID calculated by the evaluation value calculator 7 and the various information retrieved from the lens driver 2, and supplies these signals to the lens driver 2.
Each of the lens driver 2 and the control unit 9 incorporates a microcomputer and a memory, and performs the auto-focus operation by retrieving and running a program stored in the non-volatile memory.
A memory 10 is a storage unit into which data are written and from which data are read out by the control unit 9. The storage unit is configured to store information such as the focus position of the focus lens 1 and the evaluation values calculated by the evaluation value calculator 7. The memory 10 includes a non-volatile memory such as a semiconductor memory.
Indicators 11G and 11R are one example of display units; they include green and red light-emitting diodes (LEDs), respectively. The indicator 11G or 11R lights up based on the outcome of the control unit 9's assessment of the reliability of the subject image being in-focus. Naturally, neither the type nor the color of the indicators is limited to the above example.
An interface 12 (hereinafter called "IF unit") is one example of a signal output unit. The IF unit 12 outputs signals according to the outcome of the assessed reliability of the subject image being in-focus to the outside of the auto-focus apparatus or video camera. Operation signals input from outside are transmitted from the IF unit 12 to the control unit 9, so that the operation of the video camera is controlled based on the operation signals acquired from the outside.
A liquid crystal monitor driver 14 is configured to generate, from the image signals output from the image signal processor 6, drive signals for displaying images together with characters, icons, or the like as directed by the control unit 9. The drive signals are supplied to the monitor 15 based on the respective synchronization signals and the clock signal included in the image signals.
The monitor 15 is one example of display units, for which a liquid crystal display device may be used. The monitor 15 receives drive signals supplied from the monitor driver 14 to display images according to supplied signals. The monitor 15 may be a view finder for video cameras.
In the video camera configured as described above, when the focus converges at the peak of the evaluation values while the video camera operates to search for that peak, the history of the evaluation values is examined and a user is provided with the resulting information. If, for example, the history of the evaluation values satisfies a prescribed condition, the green indicator 11G lights up to indicate that the subject image is in-focus. If, on the other hand, the history of the evaluation values does not satisfy the prescribed condition, the red indicator 11R lights up to indicate that the subject image may be out-of-focus. According to this embodiment, after the focus has converged by the auto-focus operation, whether the subject image is in-focus or out-of-focus is determined, and a user is informed of the outcome of the determination by lighting up the indicators 11G, 11R, displaying the outcome on the monitor 15, or the like.
Alternatively, special icons 16, which are small pictures representing particular computer processing or items, may be provided on a screen of the monitor 15 or a view finder for displaying the obtained outcome. The icons may be changed in shape and color according to the outcome for discrimination.
Moreover, the outcome may be divided into three phases, or four or more phases, as follows, which may be shown with three indicators such as green, yellow, and red:
“highly reliable to obtain an accurate in-focus status”,
“fairly reliable to obtain an accurate in-focus status”, and
“unreliable to obtain an accurate in-focus status”.
A method of determining the reliability of whether a subject image is in-focus or out-of-focus is described below with reference to the drawings.
The vertical axes of the graphs 8A, 8B, and 8C respectively indicate the luminance addition values, the evaluation values, and the movement of the focus lens, and the horizontal axes of all three indicate time.
The curves shown on the graphs are plotted from data obtained once for each field of the image signals, or from a plurality of data obtained on an irregular basis.
In this embodiment, the velocity of focusing varies with the focus position and the evaluation value; however, the velocity of focusing is not limited to this method and may be configured to remain constant regardless of distance.
By contrast, the evaluation value may change according to change in focus status.
When the focus lens returns to the position corresponding to the point at which the relative maximum has been detected, the evaluation value obtained generally differs from the relative maximum, as shown in the drawing. Specifically, the evaluation value obtained when the focus lens returns to and stops at the position corresponding to the point at which the relative maximum has been detected is generally smaller than the evaluation value obtained while the focus lens is passing the focus position at which the relative maximum is detected.
Graphs 9A, 9B, and 9C respectively illustrate fluctuations of the luminance addition values, the evaluation values, and the focus while the focus lens of the video camera searches for the position corresponding to the peak of the evaluation values in a case where an inaccurate focus may be determined.
According to the embodiments of the present invention, whether a subject image is in-focus or out-of-focus at the focus position calculated by the auto-focus unit is determined with high reliability by examining the histories of the evaluation values and the luminance addition values as described above.
Here, conditions used in criteria for determining whether a subject image is in-focus or out-of-focus are described below.
The present embodiments employ two conditions A and B in the criteria. The condition A is used to determine whether or not auto-focus operation terminates normally, using the history of the evaluation values.
In the condition A, if the relative maximum of the evaluation values is defined as ea and the evaluation value obtained when the focus lens returns to and stops at the position corresponding to the point at which the relative maximum has been detected is defined as eb, then the value obtained by dividing the evaluation value eb by the relative maximum ea must be larger than a prescribed threshold. The condition A is represented by the following equation 1.
α < eb/ea   (1)
where α represents a constant.
The aforementioned α is defined based on results obtained from experiments or tests.
For example, when the value obtained by dividing the evaluation value eb by the relative maximum ea is larger than the prescribed threshold (equation 1 is satisfied), as shown in the drawing, the control unit 9 determines that the subject image is in-focus.
By contrast, when the value obtained by dividing the evaluation value eb by the relative maximum ea is smaller than the prescribed threshold, that is, when eb/ea < α, as shown in the drawing, the control unit 9 determines that the subject image is out-of-focus.
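As a minimal sketch, the condition A check reduces to a single ratio test (assuming ea > 0; the constant alpha is determined experimentally, as noted above):

    def condition_a(ea, eb, alpha):
        """Equation 1: alpha < eb / ea.
        ea: relative maximum; eb: value after returning and stopping."""
        return eb / ea > alpha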
Next, the condition B is described. The condition B adds a luminance condition to the above condition A in order to determine more rigorously whether a subject image is in-focus or out-of-focus; that is, the condition B is a more rigorous version of the condition A. According to the condition B, when a luminance change is detected, that is, unless the following equations 1 and 2 are both satisfied simultaneously, the control unit 9 determines that wobbling may have occurred while the relative maximum was being detected and hence that the subject image is in an out-of-focus state.
In the condition B, if the relative maximum of the evaluation values is defined as ea and the evaluation value obtained when the focus lens returns to and stops at the position corresponding to the point at which the relative maximum has been detected is defined as eb, the value obtained by dividing the evaluation value eb by the relative maximum ea must be larger than a first threshold. In addition, as shown in the drawing, the ratio of the luminance addition value Y2 obtained when the focus lens returns to and stops at that position to the luminance addition value Y0 obtained when the relative maximum has been detected must be within a prescribed range. The condition B is represented by the following equations 1 and 2.
ea × α < eb   (1)
where α represents a constant, and
γ1 < Y2/Y0 < γ2   (2)
where γ1 and γ2 represent constants.
The condition B thus includes a condition (equation 2) to determine whether or not the value indicating the luminance change is within a prescribed range. If the condition B is not satisfied (see the drawing for an example), the control unit 9 determines that the subject image is out-of-focus.
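A minimal sketch of the condition B check, assuming Y0 and Y2 are the luminance addition values at detection time and at the stopped position (the parameter names are illustrative):

    def condition_b(ea, eb, alpha, y0, y2, gamma1, gamma2):
        """True only if equations 1 and 2 are satisfied simultaneously."""
        evaluation_ok = ea * alpha < eb           # equation 1
        luminance_ok = gamma1 < y2 / y0 < gamma2  # equation 2
        return evaluation_ok and luminance_ok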
A plurality of methods for providing information to or notifying a user may be prepared by changing the combination of a condition and a display method, chosen from those listed below.
Conditions:
equation 1
equation 1 and equation 2
Display methods:
indicator
icons on the view finder or monitor screen
signal output (signals are transmitted from the video camera to external equipment by allocating a specific signal line)
Next, auto-focus processing by the video camera according to the present embodiment will be described with reference to the flowchart shown in the drawing.
The control unit 9 periodically stores evaluation values and focus positions in the memory 10 as background processing; that is, the control unit 9 stores the evaluation values and the focus positions in the background and, as shown in the flowchart, operates to search for the peak of the evaluation values based on the stored information.
After initiating the AF1 cycle operation, the control unit 9 retrieves the evaluation values and focus positions stored in the memory 10, and sets the direction of movement of the focus lens 1 based on the retrieved evaluation values and focus positions.
The control unit 9 then determines whether or not the relative maximum has been detected (step S2), as shown in the flowchart.
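A minimal sketch of the step S2 decision, detecting a relative maximum in the stored history; noise handling and hysteresis, which a real implementation would need, are omitted:

    def find_relative_maximum(history):
        """history: list of (focus_position, evaluation_value) samples.
        Return the index where the evaluation value rose then fell, else None."""
        for i in range(1, len(history) - 1):
            prev_e = history[i - 1][1]
            cur_e = history[i][1]
            next_e = history[i + 1][1]
            if prev_e <= cur_e and cur_e > next_e:
                return i
        return None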
In the determination processing at step S2, when the relative maximum is detected, the control unit 9 controls the lens driver 2 to return the focus lens to the position corresponding to the point at which the relative maximum has been detected (step S4).
The control unit 9 analyzes the history of the evaluation values. That is, the control unit 9 analyzes the relation between the evaluation value at the relative maximum and the evaluation value at the current position of the focus lens, and determines whether the subject image is in-focus or out-of-focus using the aforementioned conditions A and B (step S5).
The control unit 9 then provides information based on the outcome of determining, at the aforementioned step S5, whether the subject image is in-focus or out-of-focus (step S6).
If, for example, the history of the evaluation values satisfies the aforementioned condition A (or condition B), the green indicator 11G lights up to indicate a reliable in-focus status of the subject image. If, on the other hand, the history of the evaluation values does not satisfy the prescribed condition, the red indicator 11R lights up to indicate that the apparent in-focus status of the subject image is unreliable. Alternatively, in addition to lighting the indicators 11G and 11R, a user may be informed or notified by displaying the prescribed icons 16 on the screen of the monitor 15 or the like, by outputting the result of determining whether the subject image is in-focus or out-of-focus (focal determination signals) from the control unit 9 to the monitor driver 14. Further, the focal determination signals may be output from the control unit 9 to external equipment through the IF unit 12, which is used as a signal output unit, and the user may be informed by some other display method using the external equipment.
Moreover, it is possible to provide a plurality of results of determining whether the subject image is in-focus or out-of-focus by using a condition C, which adds the relatively less rigorous equation 3 to equation 1 so as to provide a plurality of thresholds on the ratio of the evaluation values. Accordingly, it is possible to assess reliability with a more specific determination as to whether the subject image is in-focus or out-of-focus, and to provide detailed information on the focal determination to a user.
If the relative maximum of the evaluation values is defined as ea and the evaluation value obtained when the focus lens returns to and stops at the position corresponding to the point at which the relative maximum has been detected is defined as eb, the condition C is represented by the following equations:
α < eb/ea   (1)
β < eb/ea   (3)
where α and β represent constants (α > β).
If, for example, the outcome of determining whether the subject image is in-focus or out-of-focus does not satisfy the equation 3 at step S5, a red indicator lights up or an icon “▪” is displayed on the monitor 15 to indicate that it is unreliable to obtain an accurate in-focus status.
In addition, if the outcome of determining whether the subject image is in-focus or out-of-focus satisfies only equation 3 at step S5, a yellow indicator lights up or an icon "□" is displayed on the monitor 15 to indicate that it is a little unreliable, or only fairly reliable, to obtain an accurate in-focus status.
Furthermore, if the outcome of determining whether the subject image is in-focus or out-of-focus satisfies the equation 1 at step S5, a green indicator lights up or an icon “o” is displayed on the monitor 15 to indicate that it is highly reliable to obtain an accurate in-focus status.
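A minimal sketch mapping the two thresholds of the condition C to the three indications described above (the returned labels are illustrative; ea is assumed nonzero):

    def focus_reliability(ea, eb, alpha, beta):
        """Three-phase classification using equations 1 and 3 (alpha > beta)."""
        ratio = eb / ea
        if ratio > alpha:
            return "green"     # highly reliable in-focus status
        if ratio > beta:
            return "yellow"    # fairly reliable
        return "red"           # unreliable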
According to the configuration of this embodiment, the control unit 9 searches for the peak of the evaluation values in auto-focus processing. If the relative maximum of the evaluation values is detected, the evaluation value is computed again when the focus lens returns to the focus position corresponding to the point at which the peak of the evaluation values was detected. The control unit 9 analyzes the history of the evaluation values; that is, the control unit 9 analyzes the relation between the evaluation value at the relative maximum and the evaluation value obtained when the focus lens returns to that focus position, and assesses the reliability of the subject image being in-focus. The above method may provide an accurate result of whether a subject image is in-focus or out-of-focus without adverse effects of fluctuation in the evaluation values caused by movement of the focus lens.
Moreover, since a user is informed (warned) of the outcome of whether a subject image is in-focus or out-of-focus, the user is able to grasp the current in-focus state. As a result, the user can select whether to stop the focus lens or to re-operate to search for the peak of the evaluation values after examining the reported result. Thus, the present embodiment may eliminate the case, in one-shot operation, in which the auto-focus operation terminates with the focus lens stopped at a position where the subject image is still out-of-focus, so that the subject image remains blurred.
Next, a second embodiment of the present invention is described below. In this embodiment, when the control unit 9 has determined the reliability as to whether a subject image is in-focus or out-of-focus, the control unit 9 selects whether to stop the focus lens or to re-operate to search for the peak of the evaluation values based on the outcome of the determination, instead of merely providing the user with the obtained information (step S6 in the flowchart).
If, for example, the outcome of determining whether a subject image is in-focus or out-of-focus does not satisfy equation 1 at step S5, the control unit 9 re-operates to search for the peak of the evaluation values. If, on the other hand, the outcome satisfies equation 1 at step S5, the control unit 9 allows the focus lens to remain unmoved until a subsequent auto-focus restart is instructed. Further, if the outcome satisfies neither equation 1 nor equation 2 at step S5, the control unit 9 may re-operate to search for the peak of the evaluation values. In this embodiment, equation 1 and equation 2 are used as examples; however, a combination of equation 1 and equation 3 may also be used for determining the reliability of obtaining a focus state of the subject image.
When commercial or professional video cameras are employed, it may be preferable that the focus lens be fixed so that experts can control the focus. Accordingly, it is possible to improve the operability of a video camera or the like if a user is allowed to appropriately select whether to only obtain information on the reliability of the focus state of the subject image or to re-operate to search for the peak of the evaluation values. Furthermore, since the control unit 9 is configured to automatically operate to search for the peak of the evaluation values, without the user re-pressing the switch 13, depending on the outcome of the determination as to whether a subject image is in-focus or out-of-focus, the accuracy of the focus state can be improved while also improving the usability of the video camera, without forcing the user to perform a new operation. This embodiment may provide effects similar to those obtained in the first embodiment.
Next, a third embodiment of the present invention is described below. According to the present embodiment, the video camera described above is further provided with an angular velocity sensor configured to detect wobbling of the video camera in the pan and tilt directions.
The control unit 9 operates to search for the peak of the evaluation values and detects the relative maximum of the evaluation values at step S5 in the flowchart described above.
Further, in this embodiment, the control unit 9 assesses whether the angular velocity signals detected by the angular velocity sensor are within a prescribed range of magnitudes at the point where the relative maximum of the evaluation values has been detected. If the angular velocity signals are outside the prescribed range of magnitudes, that is, if the angular velocity signals satisfy the following equation 4, the control unit 9 determines that the video camera has wobbled, and hence that the apparent in-focus status of the subject image is unreliable regardless of the outcome determined by using the evaluation values. The equation used for this determination using the angular velocity signals is as follows.
Vpan < Vmin or Vmax < Vpan, or
Vtilt < Vmin or Vmax < Vtilt   (4)
where Vpan and Vtilt represent the angular velocity signals in the pan and tilt directions, respectively, and
Vmax and Vmin (Vmax > Vmin) are constants.
If the detected angular velocity signals satisfy the equation 4, the control unit 9 determines that the video camera has wobbled, and hence that obtaining an in-focus status of the subject image is unreliable regardless of the outcome determined by using the evaluation values. Thus, a more accurate result of focus adjustment and improved reliability are secured by determining whether a subject image is in-focus or out-of-focus while eliminating cases of wobbling of the subject or the video camera. This embodiment may provide effects similar to those obtained in the first embodiment.
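A minimal sketch of the equation 4 test (assuming signed angular velocities, with the stable band [Vmin, Vmax] given as constants):

    def camera_wobbled(v_pan, v_tilt, v_min, v_max):
        """Equation 4: true if either angular velocity leaves [v_min, v_max]."""
        def outside(v):
            return v < v_min or v_max < v
        return outside(v_pan) or outside(v_tilt)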
It should be noted that the present invention is not limited to the above-described embodiments; for example, an image capture apparatus according to the embodiments of the present invention can be applied to a digital camera in place of the above-described video camera, and various other changes and modifications can of course be made without deviating from the gist of the invention.
Further, the auto-focus operation triggered by operation signals from the one-shot switch 13 is described above; however, the present invention can also be applied to a full auto-focus operation that constantly performs automatic focusing regardless of instructions from the switch 13.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.