This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2016-229352, filed Nov. 25, 2016, the entire contents of which are incorporated herein by reference.
The present invention relates to an imaging device that performs still-image photographing by using a mechanical shutter, a control method thereof, and a recording medium.
Conventionally, as an example of an imaging device, a digital single-lens reflex camera is known.
The digital single-lens reflex camera generally includes a focal-plane shutter (hereinafter also simply referred to as a “shutter”), and an exposure time of still-image photographing is controlled by the shutter. The digital single-lens reflex camera also includes a function called a live view for displaying an image formed on an image sensor on a display device, and framing is performed by using the function.
In the digital single-lens reflex camera above, ranging, photometry, and subject analysis such as person recognition are performed during the live view, and the position of a lens and an exposure amount in still-image photographing are determined. Here, in order to frame a moving subject, it is important to rapidly switch between still-image photographing and the live view.
As an example of the digital single-lens reflex camera, the following camera is known. In this camera, an aperture of a shutter device starts to open from a shielded state toward a fully open state, and when the aperture has opened by a prescribed amount before becoming fully open, the capturing of an image signal from an image sensor is started and the operations of focusing means and exposure determination means are performed (see, for example, Japanese Patent No. 5760188). An object of this camera is considered to be that, after still-image photographing, the shutter starts to open from a closed state (a state in which the image sensor is shielded from light), and that, immediately after the shutter has been driven to a position at which a shutter blade (hereinafter also simply referred to as a “blade”) no longer affects the image formed on the image sensor, the capturing of an image signal from the image sensor and other operations are started.
In one aspect of the present invention, an imaging device is provided that includes an image sensor that forms an image with light from a subject by using a photographing lens, converts the formed image into an electrical signal, and outputs the electrical signal, and a display device that performs a live-view display according to an output of the image sensor. The imaging device includes: a mechanical shutter that controls the image sensor to be in a light-shielding state or in an exposed state; a fully open detection sensor that detects that the mechanical shutter is in a fully open state; a ranging operation circuit that performs a ranging operation according to the output of the image sensor; and a control circuit that controls the image sensor and the mechanical shutter to photograph a still image and image a live view. The control circuit controls the image sensor to start imaging of the live view after photographing the still image, and the ranging operation circuit performs the ranging operation by using the output of the image sensor obtained by imaging the live view after the fully open detection sensor has detected that the mechanical shutter is in the fully open state.
In another aspect of the present invention, an imaging device is provided that includes an image sensor that forms an image with light from a subject by using a photographing lens, converts the formed image into an electrical signal, and outputs the electrical signal, and a display device that performs a live-view display according to an output of the image sensor. The imaging device includes: a mechanical shutter that controls the image sensor to be in a light-shielding state or in an exposed state; a fully open detection sensor that detects that the mechanical shutter is in a fully open state; a photometric operation circuit that measures brightness of the subject according to the output of the image sensor; and a control circuit that controls the image sensor and the mechanical shutter to photograph a still image and image a live view. The control circuit controls the image sensor to start imaging of the live view after photographing the still image, and the photometric operation circuit performs the photometric operation by using the output of the image sensor obtained by imaging the live view after the fully open detection sensor has detected that the mechanical shutter is in the fully open state.
In yet another aspect of the present invention, a method for controlling an imaging device that includes an image sensor that converts a subject image formed by a photographing lens into an electrical signal and outputs the electrical signal, a mechanical shutter that controls the image sensor to be in a light-shielding state or in an exposed state, and a display device that performs a live-view display according to an output of the image sensor is provided. The method includes: starting, by the image sensor, imaging of a live view after photographing a still image; detecting that the mechanical shutter is in a fully open state, after the starting; and performing a ranging operation or a photometric operation according to the output of the image sensor obtained by imaging the live view, after the detecting.
In yet another aspect of the present invention, a non-transitory computer-readable recording medium storing a program for causing a computer of an imaging device to perform a process is provided. The imaging device includes an image sensor that converts a subject image formed by a photographing lens into an electrical signal and outputs the electrical signal, a mechanical shutter that controls the image sensor to be in a light-shielding state or in an exposed state, and a display device that performs a live-view display according to an output of the image sensor. The process includes: starting, by the image sensor, imaging of a live view after photographing a still image; detecting that the mechanical shutter is in a fully open state, after the starting; and performing a ranging operation or a photometric operation according to the output of the image sensor obtained by imaging the live view, after the detecting.
Embodiments of the present invention are described below in detail with reference to the drawings.
As illustrated in the drawings, the camera according to this embodiment includes a body unit 100, a lens unit 200, and an electronic viewfinder (EVF) unit 300.
The lens unit 200 is detachably attached via a not-illustrated lens mount provided on a front face of the body unit 100, and is exchangeable in this camera.
The lens unit 200 includes a photographing lens 201 (201a and 201b), a diaphragm 202, a lens driving mechanism 203, and a diaphragm driving mechanism 204.
The photographing lens 201 is driven in an optical-axis direction by a not-illustrated direct current (DC) motor that is provided in the lens driving mechanism 203. The diaphragm 202 is driven by a not-illustrated stepping motor that is provided in the diaphragm driving mechanism 204. By driving the diaphragm 202 using the diaphragm driving mechanism 204, an amount of light that passes through the photographing lens 201 is adjusted. Respective components in the lens unit 200, such as the lens driving mechanism 203 or the diaphragm driving mechanism 204, are controlled by the control microcomputer 110 described later.
The body unit 100 has the configuration below.
A light flux from a not-illustrated subject that enters the body unit 100 via the photographing lens 201 and the diaphragm 202 within the lens unit 200 (a subject image that has passed through an optical system) passes through a shutter unit 101 in an open state, and an image is formed on an image sensor 102. The shutter unit 101 is a focal-plane shutter unit that is provided on an optical axis of the lens unit 200. The image sensor 102 performs photoelectric conversion on the formed subject image so as to generate an analog electrical signal. Photoelectric conversion performed by the image sensor 102 is controlled by an image sensor driving integrated circuit (IC) 103. The image sensor driving IC 103 converts the analog electrical signal obtained as a result of photoelectric conversion performed by the image sensor 102 into a digital electrical signal to be processed by an image processing IC 104. The image processing IC 104 converts the digital electrical signal obtained as a result of conversion performed by the image sensor driving IC 103 into an image signal.
The image processing IC 104 is connected, for example, to the image sensor 102, the image sensor driving IC 103, a synchronous dynamic random access memory (SDRAM) 105 which functions as a storage area, a rear liquid crystal monitor 106, a backlight device 107 that irradiates a liquid crystal within the rear liquid crystal monitor 106 with light from the rear, and a recording medium 109 via a communication connector 108. These components are configured so as to be able to provide an electronic recording display function in addition to an electronic imaging function. The SDRAM 105 is implemented by a commercially available memory IC.
The recording medium 109 is an external recording medium such as one of various types of semiconductor memory cards or an external hard disk drive (HDD), and the recording medium 109 is mounted so as to be communicable with the body unit 100 via the communication connector 108 and to be exchangeable.
The image processing IC 104 is also connected to the control microcomputer 110, which controls the respective components within the body unit 100 and within the lens unit 200. The control microcomputer 110 includes, for example, a not-illustrated timer that measures a photographing interval at the time of continuous photographing, and has functions of counting, mode setting, detection, determination, computation, and the like, in addition to controlling the entire operation of the camera. As an example, the control microcomputer 110 causes the rear liquid crystal monitor 106 to display a report that indicates an operation state of the camera to a user (a photographer). The control microcomputer 110 includes, for example, a CPU and a memory, and its functions are implemented by the CPU executing a program stored in the memory. Specifically, the control microcomputer 110 is implemented by an application specific integrated circuit (ASIC). The control microcomputer 110 is connected to a shutter driving control circuit 111, a camera operation switch (SW) 112, a not-illustrated power source circuit, and the like.
When the lens unit 200 is mounted on the body unit 100, the control microcomputer 110 and the respective components of the lens unit 200 (such as the lens driving mechanism 203 and the diaphragm driving mechanism 204) are electrically connected to each other via a not-illustrated communication connector such that signals can be transmitted and received.
The shutter driving control circuit 111 controls the movements of a front curtain and a rear curtain, not illustrated, in the shutter unit 101. In addition, the shutter driving control circuit 111 exchanges, with the control microcomputer 110, a signal for controlling the opening/closing operation of the shutter unit 101, a signal indicating that the front curtain is fully open, a signal indicating that the rear curtain is fully open, and other signals.
The camera operation switch 112 is configured by a switch group including operation buttons needed for a user to operate the camera, such as a release switch that issues an instruction to perform a photographing operation, a mode change switch that switches the photographing mode to a continuous photographing mode, a normal photographing mode, or the like, and a power switch that switches the power source on and off.
The not-illustrated power source circuit converts a voltage of a not-illustrated battery serving as a power source into a voltage needed by each of the circuit units of the camera, and supplies the converted voltage.
The EVF unit 300 consists of an EVF liquid crystal device 301, a backlight device 302 that irradiates the EVF liquid crystal device 301 with light from the rear, and an eyepiece 303, and a user can view, for example, a live-view display through the eyepiece 303.
In the camera according to this embodiment, the live-view display can be displayed on both the rear liquid crystal monitor 106 and the EVF liquid crystal device 301, and a user can select which of them the live-view display will be displayed on according to the photographing situation.
A photographing operation and a live-view operation of the camera according to this embodiment are described next.
In the camera according to this embodiment, the photographing operation is performed as below.
First, the image processing IC 104 is controlled by the control microcomputer 110, and when image data (a digital electrical signal) that is output from the image sensor 102 and the image sensor driving IC 103 is input to the image processing IC 104, the image processing IC 104 stores the image data in the SDRAM 105, which is a memory for temporary storage. The SDRAM 105 is also used as a work area in which the image processing IC 104 performs image processing. In addition, the image processing IC 104 may perform image processing for converting the image data into joint photographic experts group (JPEG) data or the like, and may store the data in the recording medium 109.
Upon receipt of a signal for controlling the driving of the shutter unit 101 from the control microcomputer 110, the shutter driving control circuit 111 controls the shutter unit 101 so as to perform an opening and closing operation. At this time, prescribed image processing is performed on the image data from the image sensor 102 and the image sensor driving IC 103, and the image data is recorded in the recording medium 109, whereby the photographing operation is completed.
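The recording path described above can be summarized by the following minimal sketch in C. The function names (sensor_read_raw, sdram_store, jpeg_encode, medium_write) and the buffer handling are assumptions introduced for illustration; the text above states only that the image data is buffered in the SDRAM 105, converted (for example, to JPEG data) by the image processing IC 104, and recorded in the recording medium 109 via the communication connector 108.

```c
/* A sketch of the still-image recording path; all extern functions are
 * hypothetical placeholders for the hardware blocks named in the text. */
#include <stddef.h>
#include <stdint.h>

extern size_t sensor_read_raw(uint8_t *dst, size_t max_len);   /* image sensor 102 + driving IC 103 */
extern void   sdram_store(const uint8_t *data, size_t len);    /* temporary storage in SDRAM 105    */
extern size_t jpeg_encode(const uint8_t *raw, size_t raw_len,
                          uint8_t *out, size_t out_max);       /* conversion by image processing IC 104 */
extern int    medium_write(const uint8_t *data, size_t len);   /* recording medium 109 via connector 108 */

int record_still_image(uint8_t *raw_buf, size_t raw_max,
                       uint8_t *jpg_buf, size_t jpg_max)
{
    size_t raw_len = sensor_read_raw(raw_buf, raw_max);  /* digital electrical signal */
    sdram_store(raw_buf, raw_len);                       /* buffer for later processing */

    size_t jpg_len = jpeg_encode(raw_buf, raw_len, jpg_buf, jpg_max);
    return medium_write(jpg_buf, jpg_len);               /* complete the photographing operation */
}
```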
In the camera according to this embodiment, the live-view operation is performed as below.
A light flux from the photographing lens 201 is guided to the image sensor 102. The image sensor 102 continuously performs exposure, for example, at a rate of about 60 frames per second (60 fps). At this time, image data output from the image sensor 102 and the image sensor driving IC 103 is converted into a video signal by the image processing IC 104 and is supplied to the rear liquid crystal monitor 106 such that a video image of the subject can be displayed on the rear liquid crystal monitor 106. Alternatively, the video signal may be supplied to the EVF liquid crystal device 301 such that the video image of the subject is displayed on the EVF liquid crystal device 301. The display above is referred to as a “live-view display”, and is well known. Hereinafter, the “live view” or the “live-view display” may be simply referred to as an “LV”.
During the LV operation, a light flux from the photographing lens 201 is always guided to the image sensor 102. Therefore, photometric processing of the brightness of the subject and known ranging processing on the subject can be performed by the image processing IC 104 on the basis of the image data output from the image sensor 102 and the image sensor driving IC 103. Hereinafter, the photometric processing of the brightness of the subject that is performed in this manner by the image processing IC 104 and the control microcomputer 110 is referred to as “LV photometry”.
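As a concrete illustration of LV photometry, the following sketch in C computes a mean luminance of one live-view frame and the exposure correction needed to reach a target level. The structure lv_frame_t, the use of a simple mean, and the target-level approach are assumptions for this sketch; the text only states that brightness is computed from the image data output by the image sensor 102 and the image sensor driving IC 103.

```c
/* A minimal LV-photometry sketch; the frame layout is a hypothetical raw buffer. */
#include <stdint.h>
#include <math.h>

typedef struct {
    const uint16_t *pixels;   /* digital output of the image sensor driving IC */
    uint32_t        width;
    uint32_t        height;
} lv_frame_t;

/* Average luminance of one live-view frame (simple mean of raw pixel values). */
static double lv_photometry_mean(const lv_frame_t *frame)
{
    uint64_t sum = 0;
    uint32_t count = frame->width * frame->height;
    for (uint32_t i = 0; i < count; i++) {
        sum += frame->pixels[i];
    }
    return (double)sum / (double)count;
}

/* Exposure correction (in EV steps) needed to bring the next frame to a target mean. */
static double lv_exposure_correction_ev(const lv_frame_t *frame, double target_mean)
{
    double mean = lv_photometry_mean(frame);
    if (mean <= 0.0) {
        return 0.0;                       /* completely dark frame: leave exposure unchanged */
    }
    return log2(target_mean / mean);      /* >0: brighten the next frame, <0: darken it */
}
```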
The shutter unit 101 in the camera according to this embodiment is described next.
The shutter unit 101 includes a front curtain, a rear curtain, a photointerrupter (PI) for the front curtain, and a PI for the rear curtain, and the shutter unit 101 has a configuration in which, when the shutter unit 101 is in the fully open state, a member provided at the end of the front curtain shields the PI for the front curtain from light and a member provided at the end of the rear curtain shields the PI for the rear curtain from light. Hereinafter, the PI for the front curtain is referred to as “PI(F)” (“F” is the initial for “First”), and the PI for the rear curtain is referred to as “PI(S)” (“S” is the initial for “Second”).
In the shutter unit 101, each of the front curtain and the rear curtain is driven according to a publicly known method. Accordingly, a method for driving a curtain is not particularly described here. As an example, the curtain may be driven according to a general method in which a spring is provided in a member that moves the curtain, the force of the spring is charged by the rotational force of a motor, and the force of the spring is released at the time of exposure. Alternatively, as described in Japanese Laid-Open Patent Publication No. 2006-047345 or Japanese Laid-Open Patent Publication No. 2014-191225, the curtain may be directly driven by an actuator at the time of exposure. In this embodiment, assume, as an example, that the curtain is driven according to the general method using a spring.
An aperture of the shutter unit 101 at the time when the front curtain and the rear curtain are open to a drivable range (namely, in the fully open state) is not determined by the front curtain and the rear curtain, but is determined by another member such as the exterior of the shutter unit 101. An aperture at this time is defined as an image frame, and the upper end and the lower end of the aperture are respectively defined as an image frame upper end and an image frame lower end.
The operation of the camera according to this embodiment is described next with reference to a timing chart.
The timing chart illustrates the operations of the front curtain and the rear curtain and the output state of PI(S). When the rear curtain is in the fully open state and its end member shields PI(S) from light, PI(S) outputs a Low signal.
In a case in which the shutter unit 101 is of a type in which, after the front curtain enters the fully open state after exposure, the front curtain may enter the image frame again and a portion of the front curtain may shield the image sensor 102 from light, the timing of “A: fully open” is the timing at which a state in which both PI(F) and PI(S) are shielded (a Low-signal output state) is detected.
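A minimal sketch in C of this fully-open decision is given below. The register-read helpers pi_front_is_low() and pi_rear_is_low() are hypothetical; the text states only that each photointerrupter outputs a Low signal while it is shielded by the corresponding curtain end member.

```c
/* A sketch of the "A: fully open" decision based on PI(F) and PI(S). */
#include <stdbool.h>

extern bool pi_front_is_low(void);  /* PI(F): Low while shielded by the front-curtain end member */
extern bool pi_rear_is_low(void);   /* PI(S): Low while shielded by the rear-curtain end member  */

/* Returns true when the shutter unit is judged to be fully open. For a shutter
 * whose front curtain may re-enter the image frame after exposure, both PI(F)
 * and PI(S) must be shielded (both Low); otherwise PI(S) alone is checked. */
static bool shutter_fully_open(bool front_curtain_may_reenter)
{
    if (front_curtain_may_reenter) {
        return pi_front_is_low() && pi_rear_is_low();
    }
    return pi_rear_is_low();
}
```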
The timing chart also illustrates the driving timing, the imaging operation of the image sensor 102, and the operation of the display device.
Here, the relationship among PI(S), the imaging operation, and the ranging operation illustrated in the timing chart is described.
After “A: fully open”, frame (3), which is obtained after the output of PI(S) becomes a Low signal, can be used in ranging. In this embodiment, assume that an image sensor including pixels from which a phase difference can be obtained (an image sensor having a phase-difference AF function) is employed as the image sensor 102. Accordingly, when the image processing IC 104 processes the frame image of frame (3), a focusing state and a focusing position of the photographing lens 201 can be computed from the output of the phase-difference pixels. When the computation has been successfully completed, the front curtain starts to be driven in a direction in which the image sensor 102 is shielded from light, in preparation for the next still-image photographing (see “start transition to ‘immediately before exposure’”). At this time, the diaphragm 202 may also be driven. In this case, at the point in time at which the computation above has been completed, frame (4), which is in the process of being captured, becomes a frame image in which a portion is shielded from light. However, even if a darker image is displayed because the aperture of the diaphragm 202 is reduced, the display merely becomes gradually darker until blackout (a black display) is reached at “B: immediately before exposure”, and therefore this does not look unnatural. After the computation above and after the capturing of frame (4) that was in progress has been completed, the continuous capturing of the LV operation is terminated. After the termination of capturing, the photographing lens 201 is driven (see “start to drive lens”).
The overall operation of the camera according to this embodiment is performed as described below.
In S402, live-view preparatory processing is performed. In a live view, the amount of light that enters the image sensor 102 is adjusted according to the sensitivity of the image sensor 102 (hereinafter also referred to as an “imaging sensitivity”), the speed of the electronic shutter, and the diaphragm position (the position of the diaphragm 202) within the lens unit 200; however, the luminance of the subject is not known at the time when the live view is started. Therefore, in the live-view preparatory processing, LV photometry is performed according to an imaging output (an output from the image sensor 102 and the image sensor driving IC 103) obtained with a prescribed imaging sensitivity, a prescribed electronic-shutter speed, and a prescribed diaphragm position, and the exposure of the start frame of the live view is determined according to the obtained photometric value.
In S403, a live-view display is started. By doing this, a user can confirm a subject image by viewing the eyepiece 303 of the EVF unit 300, and can also confirm the subject image by viewing the rear liquid crystal monitor 106.
In S404, LV photometry is performed, and control is performed to update exposure according to the obtained photometric value in such a way that the exposure of the live view becomes a target exposure.
In S405, it is determined whether a release has been turned on by operating a release switch of the camera operation switch 112.
When the determination result in S405 is Yes, the processes of S406 to S408 are performed, and the processing moves on to S409. When the determination result in S405 is No, the processing moves on to S409.
In S406, still-image exposure is determined. More specifically, an aperture value, a shutter speed, and an imaging sensitivity in still-image photographing are determined according to the photometric value obtained in S404.
In S407, a focusing state is computed (ranging is performed) from a frame image of the live view, and the position of the photographing lens 201 in still-image photographing is computed.
In S408, still-image photographing processing is performed. Details of this processing are described later.
In S409, it is determined whether the power source of the camera has been turned off by operating the power switch of the camera operation switch 112.
When the determination result in S409 is No, the processing returns to S404, and when the determination result in S409 is Yes, the operation of the camera is terminated.
In the operation above, the live-view display and LV photometry are repeated until the power source is turned off, and the still-image photographing processing of S408 is performed each time the release is turned on.
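A minimal sketch of this overall flow in C is given below. The function names are hypothetical placeholders for the operations named in steps S402 to S409; the step correspondence is indicated in the comments.

```c
/* A sketch of the S402-S409 flow; all extern functions are hypothetical. */
#include <stdbool.h>

extern void live_view_prepare(void);                  /* S402 */
extern void live_view_start_display(void);            /* S403 */
extern void lv_photometry_and_update_exposure(void);  /* S404 */
extern bool release_switch_is_on(void);               /* S405 */
extern void determine_still_exposure(void);           /* S406 */
extern void compute_focus_position(void);             /* S407 */
extern void still_image_photographing(void);          /* S408 */
extern bool power_switch_is_off(void);                /* S409 */

void camera_main_loop(void)
{
    live_view_prepare();            /* S402: rough exposure from prescribed settings */
    live_view_start_display();      /* S403: EVF / rear monitor live view            */

    do {
        lv_photometry_and_update_exposure();   /* S404 */
        if (release_switch_is_on()) {          /* S405 */
            determine_still_exposure();        /* S406 */
            compute_focus_position();          /* S407: ranging from a live-view frame */
            still_image_photographing();       /* S408 */
        }
    } while (!power_switch_is_off());          /* S409 */
}
```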
The still-image photographing processing of S408 is performed as described below.
In S501, the diaphragm is driven. Here, the diaphragm 202 is driven in order to change from the aperture value in the live view (LvAv) to the aperture value in still-image photographing (still-image Av) that is determined in S406 described above or in S516 described later.
In S502, the capturing of the LV operation is terminated.
In S503, the photographing lens 201 is driven to the lens position computed in S407 described above or in S513 described later in order to bring the still image into focus. Here, while a continuous photographing operation is continued, the image sensor may still be performing imaging at the time when the ranging operation of S513 described later is finished (S515: the ranging operation has been finished). In this case, after the output of the image sensor for the corresponding frame has been obtained, the driving of the focus lens in S503 is started, and a live-view display is performed according to the obtained output of that frame.
In S504, the front curtain of the shutter unit 101 is closed, and the image sensor 102 is shielded from light by the front curtain.
In S505, electric charges stored in the image sensor 102 in the light-shielding state are swept out, and the image sensor 102 enters into a state in which it stores electric charges corresponding to the amount of light that enters it while the shutter unit 101 is open (a state in which the storage of a captured still image is started).
In S506, still-image exposure is performed. Here, the front curtain that has shielded the image sensor 102 from light is first opened, and after an exposure time needed for the still-image exposure determined in S406 has passed, the rear curtain of the shutter unit 101 shields the image sensor 102 from light again.
In S507, the driving of a diaphragm is started. Here, the driving of the diaphragm is started in order to change from an aperture value in still-image photographing (still-image Av) to an aperture value in a live view (LvAv).
In S508, an operation to open the rear curtain of the shutter unit 101 (the opening driving of the rear curtain) is started.
In S509, a photographed still image is captured. More specifically, an output of the image sensor 102 in a state in which the image sensor 102 is shielded from light by the rear curtain is transferred to the image processing IC 104 via the image sensor driving IC 103. Then, the image processing IC 104 converts the output into an image, and records the image in the recording medium 109 via the communication connector 108.
In S510, the capturing of the LV operation is started. The capturing of the LV operation is started at a timing that matches a driving cycle of the display device such that an image is displayed immediately after capturing.
In S511, the start of the capturing of a frame to be used in ranging is awaited. As an example, when the speed of continuous photographing is to be increased, the start of the capturing of the first frame after the start of the capturing of the LV operation may be awaited. Alternatively, when the speed of continuous photographing is to be reduced, the camera may wait for a prescribed time period in order to lengthen the continuous photographing interval, may await the start of the capturing of the next frame, and may perform ranging by using the frame closest to the next still-image photographing.
In S512, it is determined whether the rear curtain is in the fully open state and whether the driving of the diaphragm started in S507 has been completed. By this determination, it can be determined whether an image formed on the image sensor 102 has been captured while the diaphragm is at the aperture value in the live view (LvAv) and while the shutter unit 101 is in the fully open state.
In S511 and S512, a start timing of the capturing of a frame may be a timing immediately before a timing at which the image sensor 102 starts integration. Alternatively, a vertical synchronizing signal of the image sensor 102, which is a start point of a frame, may be used as the start timing of the capturing of the frame.
When the determination result in S512 is No, the processing returns to S511, and when the determination result in S512 is Yes, the processing moves on to S513.
In S513, a ranging operation is performed by using a frame captured after the determination result in S512 becomes Yes. In this embodiment, the image sensor 102 includes pixels from which a phase difference can be obtained, and therefore a focusing state of the subject is obtained from a single frame. A lens driving amount needed to bring the subject into focus (a driving amount of the photographing lens 201) is calculated according to the obtained focusing state. This lens driving amount corresponds to the driving of the lens to a lens position at which the subject is in focus.
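The conversion from a phase-difference detection result to a lens driving amount can be sketched as follows. The coefficients and the sign convention are assumptions introduced for illustration; the text above states only that a focusing state is obtained from the phase-difference pixels and that a driving amount of the photographing lens 201 is calculated from it.

```c
/* A sketch of converting a phase-difference shift into a lens driving amount.
 * The structure and conversion coefficients are hypothetical. */
typedef struct {
    double phase_shift_pixels;   /* shift between the two phase-difference images  */
    double k_defocus;            /* pixels -> defocus on the sensor plane (mm)     */
    double k_drive;              /* defocus (mm) -> lens driving amount (pulses)   */
} af_result_t;

/* Positive result: drive in one direction; negative: drive in the other
 * (the sign convention is an assumption for this sketch). */
static long lens_drive_amount(const af_result_t *af)
{
    double defocus_mm = af->phase_shift_pixels * af->k_defocus;
    return (long)(defocus_mm * af->k_drive);
}
```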
When a manual focusing mode has been set by operating the camera operation switch 112 and the user manually performs focusing, neither a frame for ranging nor the ranging operation is needed. However, also in this case, a single frame may be obtained after the rear curtain is fully opened such that the appearance of the display during continuous photographing is equivalent to the appearance when ranging is performed.
In S514, it is determined whether the ranging operation of S513 has been successfully completed (finished). More specifically, it is determined whether the shutter unit 101 has maintained the fully open state (whether PI(S) has maintained the light-shielding state) and whether a normal focusing state has been detected in the ranging operation of S513.
In S515, it is determined from the determination result of S514 whether the ranging operation of S513 has been successfully completed (finished), that is, whether the shutter unit 101 has maintained the fully open state (whether PI(S) has maintained the light-shielding state) and whether a normal focusing state has been detected in the ranging operation of S513.
When the manual focusing mode has been set, it is determined in S514 only whether the shutter unit 101 has maintained the fully open state (whether PI(S) has maintained the light-shielding state), and in S515 only this determination result of S514 is used.
When the determination result of S515 is No, the processing returns to S511, and when the determination result of S515 is Yes, the processing moves on to S516.
In S516, a photometric value is computed according to the output (a still image) of the image sensor 102 obtained in S509, and an aperture value, a shutter speed, and an imaging sensitivity in still-image photographing of the next frame are determined. The aperture value may be computed according to a frame captured after the determination result of S512 becomes Yes. In S516, subject analysis such as person recognition may be further performed by using the frame captured after the determination result of S512 becomes Yes.
In S517, it is determined whether the release has maintained the ON state.
When the determination result in S517 is Yes, the processing returns to S501, S502, and S504, and when the determination result in S517 is No, the still-image photographing processing returns.
In the still-image photographing processing above, after still-image photographing, the capturing of the LV operation is restarted, and the ranging operation is performed by using a frame captured after it is detected by PI(S) that the shutter unit 101 is in the fully open state and the diaphragm has returned to the live-view aperture value.
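A minimal sketch of the S510 to S516 portion of this flow in C follows. The helper functions are hypothetical; the essential point taken from the text is that a live-view frame is used for ranging only after PI(S) indicates that the rear curtain is fully open and the driving of the diaphragm started in S507 has been completed.

```c
/* A sketch of the S510-S516 loop; all extern functions are hypothetical. */
#include <stdbool.h>

extern void lv_capture_start(void);                        /* S510 */
extern void wait_for_next_ranging_frame(void);             /* S511 */
extern bool rear_curtain_fully_open(void);                 /* S512: PI(S) outputs Low   */
extern bool diaphragm_drive_completed(void);               /* S512: drive started in S507 */
extern bool ranging_on_frame(long *lens_drive_amount);     /* S513-S515: true when finished */
extern void compute_next_still_exposure(void);             /* S516 */

void post_exposure_ranging(long *lens_drive_amount)
{
    lv_capture_start();                                    /* S510 */

    for (;;) {
        wait_for_next_ranging_frame();                     /* S511 */
        if (!rear_curtain_fully_open() ||                  /* S512 */
            !diaphragm_drive_completed()) {
            continue;                                      /* frame not usable yet: wait again */
        }
        if (ranging_on_frame(lens_drive_amount)) {         /* S513-S515 */
            break;                                         /* ranging finished */
        }
    }

    compute_next_still_exposure();                         /* S516 */
}
```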
As described above, according to the first embodiment, by providing sensors (PI(F) and PI(S)) that detect that the shutter unit 101 is in the fully open state and by appropriately coordinating them with imaging control, the speed of continuous photographing is improved, image loss is reduced, and ranging (and, as needed, photometry, subject analysis, and the like) can be reliably performed.
An imaging device according to a second embodiment of the present invention is different from the imaging device according to the first embodiment in a portion of still-image photographing processing. Accordingly, in the description of the second embodiment, differences are principally described, and the same components as the components described in the first embodiment are described by using the same reference numerals.
In still-image photographing processing according to the second embodiment, the determination of whether a ranging operation has been successfully completed (finished) is performed, for example, by calculating a time.
In the still-image photographing processing according to the second embodiment, the processes of S501 to S510 are first performed similarly to the still-image photographing processing according to the first embodiment.
After S510, the processes of S601 to S603 are performed.
In S601, a ranging operation is performed by using a frame captured with the rear curtain in the fully open state. Also in the second embodiment, the image sensor 102 includes pixels from which a phase difference can be obtained, and therefore a focusing state of the subject is obtained from a single frame, and a lens driving amount (a driving amount of the photographing lens 201) needed to bring the subject into focus is calculated according to the focusing state. This lens driving amount corresponds to the driving of the lens to a lens position at which the subject is in focus.
In S602, it is determined whether the ranging operation of S601 has been successfully completed (finished). More specifically, it is determined, by calculating a time, whether the rear curtain was in the fully open state (whether PI(S) was in the light-shielding state) and whether the driving of the diaphragm started in S507 had been completed at the timing of the capturing of the frame used in the ranging operation of S601.
Specifically, when relational expression (1) described below is satisfied at a point in time at which the ranging operation of S601 is completed, it is determined that the rear curtain is in the fully open state (PI(S) is in the light-shielding state) at the timing of the capturing of a frame used in the ranging operation of S601.
Ta > Tb  (1)
In this relational expression, Ta is the time that has elapsed since the output of a Low signal from PI(S) was detected, and Tb is the sum of the time needed to capture the frame used in the ranging operation and the ranging operation time.
Relational expression (1) is satisfied when the time from the start of the capturing of the frame used in the ranging operation to the present time is shorter than the time from the point at which the rear curtain entered the fully open state to the present time. In this case, it is determined that the rear curtain was in the fully open state (PI(S) was in the light-shielding state) at the timing at which the frame used in the ranging operation of S601 was captured.
The frame-capturing time included in Tb may be the time from a timing immediately before the timing at which the image sensor 102 starts integration to the end of the capturing of the frame. Alternatively, the vertical synchronizing signal of the image sensor 102, which is the start point of a frame, may be used as the start timing of the capturing of the frame. The ranging operation time is the time from the end of the capturing of the frame used in the ranging operation to the start of the determination of whether the ranging operation has been successfully completed (finished) (the determination start time of S602). When a time margin is provided for the determination start time in consideration of a time measurement error or the like, Tb may be a time obtained by further adding this margin.
In addition, the determination of whether the driving of the diaphragm started in S507 has been completed can be performed similarly by counting a time from the completion of the driving of the diaphragm to the present time.
Further, in S602, it is also determined whether a focusing state has been successfully detected in S601.
In S601 and S602, when the manual focusing mode has been set and no frame for the ranging operation is needed, the ranging operation is determined to have been successfully completed (finished) after one frame has been obtained after the rear curtain is fully opened, in consideration of the appearance of the display during continuous photographing. In this case, when relational expression (2) described below is satisfied, it is determined that the ranging operation has been successfully completed (finished).
Ta > Tc  (2)
In this relational expression, Ta is as described above, and Tc is the time needed to capture a single frame.
Relational expression (2) is different from relational expression (1) in that the right side does not include a ranging operation time, but Tc may be a time obtained by further adding a fixed time equivalent to the ranging operation time in order to make a display during continuous photographing appear similar to the display in a case in which the ranging operation is performed.
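The checks based on relational expressions (1) and (2) can be sketched in C as follows. The millisecond timer and the structure fields are assumptions; the text defines only Ta, Tb, Tc, and the optional time margin.

```c
/* A sketch of the time-based validity check of the second embodiment. */
#include <stdbool.h>
#include <stdint.h>

typedef struct {
    uint32_t t_pi_s_low_ms;       /* time at which PI(S) started to output Low      */
    uint32_t frame_capture_ms;    /* time needed to capture the ranging frame       */
    uint32_t ranging_ms;          /* time needed for the ranging operation          */
    uint32_t margin_ms;           /* optional margin for time-measurement error     */
    bool     manual_focus;        /* manual focusing mode: use expression (2)       */
} ranging_timing_t;

extern uint32_t now_ms(void);     /* free-running millisecond counter (assumption)  */

/* Returns true when the frame used in S601 was captured with the rear curtain
 * fully open: Ta > Tb for autofocus, Ta > Tc (plus any margin) for manual focus. */
static bool ranging_frame_valid(const ranging_timing_t *t)
{
    uint32_t ta = now_ms() - t->t_pi_s_low_ms;                           /* Ta */
    uint32_t threshold;

    if (t->manual_focus) {
        threshold = t->frame_capture_ms + t->margin_ms;                  /* Tc (+ margin) */
    } else {
        threshold = t->frame_capture_ms + t->ranging_ms + t->margin_ms;  /* Tb            */
    }
    return ta > threshold;
}
```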
In S603, it is determined from the determination result of S602 whether the ranging operation has been successfully completed (finished). More specifically, it is determined from the determination result of S602 whether the rear curtain is in the fully open state (whether PI(S) is in the light-shielding state) and whether the driving of the diaphragm started in S507 has been completed at the timing of the capturing of a frame used in the ranging operation in S601, and whether a focusing state has been successfully detected in S601.
When the determination result of S603 is No, the processing returns to S601.
When the determination result of S603 is Yes, the processes of S516 and S517 are performed similarly to the still-image photographing processing according to the first embodiment.
According to the second embodiment in which the still-image photographing processing above is performed, effects similar to those in the first embodiment can be achieved.
In the first and second embodiments described above, the shutter unit 101 is not limited to a focal-plane mechanical shutter, and a mechanical shutter of another type may be employed.
The present invention is not limited to the embodiments above as they are, and at the implementation stage, components can be modified and embodied without departing from the gist of the embodiments above. Various inventions can be made by appropriately combining the plurality of components disclosed in the embodiments above. As an example, some of the components disclosed in the embodiments may be omitted. Further, components disclosed in different embodiments may be appropriately combined.