The present invention relates to an image pickup apparatus and a method for controlling the same, and more particularly to a technique for performing distance measurement and focus detection during image pickup.
The image pickup surface phase difference method, which is one of the focus detection techniques used in digital cameras, obtains three-dimensional data of a subject by indirectly detecting a focus shift at each point on the subject. In recent years, techniques that observe light field information from a subject to obtain three-dimensional information of an object have also been proposed.
A Time of Flight (TOF) method (Hansard et al., ‘Time-of-Flight Cameras: Principles, Methods and Applications’, Springer Publishing (2013), hereinafter referred to as Hansard et al.) is one of the techniques used for three-dimensional object recognition in fields such as terrain observation and automatic driving. Light Detection And Ranging (LIDAR), which also uses the TOF method, is another known technique. Since these techniques can directly detect a distance to an object, they have been put to practical use in various fields.
A system using a general TOF method has a light source for projecting light to an object and a light receiver, and calculates the propagation time of light that is emitted by the light source, reaches the object, and is then received by the light receiver, to estimate a propagation distance of the light, that is, a distance to the object. Such a system may employ an image sensor or the like having an array of photodetectors as the light receiver so as to estimate a distance to each position on the object. That is, the system can obtain three-dimensional structure information of the subject, unevenness information of the terrain, and the like.
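For illustration, the relation between the propagation time and the distance used by the TOF method can be sketched as follows; since the light travels to the object and back, the object distance is half of the round-trip propagation distance (the function and variable names below are illustrative only, not part of the cited references):

```python
# Minimal sketch of pulse-based TOF ranging (illustrative only; names are hypothetical).
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_round_trip(t_emit: float, t_receive: float) -> float:
    """Return the object distance in meters from emission/reception timestamps (seconds).

    The light travels to the object and back, so the one-way distance is half
    of the round-trip propagation distance.
    """
    round_trip_time = t_receive - t_emit
    return SPEED_OF_LIGHT * round_trip_time / 2.0

# Example: a round trip of 20 ns corresponds to an object about 3 m away.
print(distance_from_round_trip(0.0, 20e-9))  # ≈ 2.998 m
```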
For example, Japanese Laid-Open Patent Publication (kokai) No. 2018-77143 describes a three-dimensional measurement apparatus including a light source for irradiating an object (subject) with light and an image sensor. For another example, Japanese Laid-Open Patent Publication (kokai) No. 2016-90785 discloses a technique using an image pickup apparatus including an image-forming optical system, a light source, and an image sensor including focus detection pixels. In this technique, distance measurement using the TOF method is performed by using pixels not used for focus detection among the pixels of the image sensor. This technique allows an image pickup apparatus including an image-forming optical system and an image sensor to carry out focus detection and distance detection with improved accuracy, and further to carry out image pickup of a subject and to obtain three-dimensional information of the subject with high resolution.
In image pickup apparatuses, light enters an image sensor after passing through an image-forming optical system. In image pickup apparatuses in the related art, light is projected from the light source to the subject without passing through the image-forming optical system. This causes shading due to a difference between the region of light projection from the light source to the subject and the image pickup view angle of the image sensor, and a difference between the light projection direction and the image pickup direction. Even in image pickup apparatuses in which light is projected from a light source to a subject through an image-forming optical system, a situation may occur in which the light projection region is different from the image pickup view angle of the image sensor. In particular, in a case where conditions (F value or aperture value, focal length, the optical system itself, or the like) of the image-forming optical system are changed, the degree of coincidence between the light projection region and the image pickup view angle changes, and there is a concern that the projection efficiency of light from the light source decreases.
In addition, image pickup apparatuses in the related art are broadly classified into a group of techniques that measure distances over a wide range at high speed by simultaneously projecting light over a wide light projection region with a divergent light source, and a group of techniques that measure a distance by scanning with a laser beam having high directivity. In the techniques of the former group, the light intensity per unit solid angle is rapidly attenuated as the distance from the light source to the object increases because of the divergence of the light from the light source. Therefore, the techniques of the former group are not suitable for distance measurement from a medium distance to a long distance. Meanwhile, in the techniques of the latter group, since the light intensity per unit solid angle is hardly attenuated, it is possible to measure a distance from a medium distance to a long distance. However, these techniques require the light projection region to be scanned two-dimensionally, and there is a problem in that it takes time until the distance measurement is completed.
The present invention provides image pickup apparatuses capable of measuring a distance to a subject in a wide range at high speed with high accuracy and methods for controlling the same.
Accordingly, a first aspect of the present invention provides an image pickup apparatus comprising: an image-forming optical system; an image pickup device comprising a microlens array; and a light source comprising another microlens array, where the light source is configured to emit pulsed light through the another microlens array. The image pickup apparatus further comprises a partial reflecting mirror disposed in the image-forming optical system, and the partial reflecting mirror is configured to project light emitted by the light source to a subject and to guide light from the subject to the image pickup device. The image pickup apparatus further comprises: at least one memory; and at least one processor configured to execute instructions stored in the at least one memory to calculate a distance from the subject to an image pickup surface of the image pickup device using a signal output by the image pickup device receiving light emitted by the light source and reflected on the subject. The partial reflecting mirror is disposed to make an angle between an optical axis of the image-forming optical system and a normal line to a surface of the partial reflecting mirror larger than 0° and smaller than 90°. The image pickup device and the light source are separately disposed at a position where light from the subject travels after being reflected on the partial reflecting mirror and a position where light from the subject travels after passing through the partial reflecting mirror so that a pupil distance of the light source comprising the another microlens array and a pupil distance of the image pickup device comprising the microlens array coincide with each other and the image pickup device and the light source are conjugate with each other with respect to the subject via the image-forming optical system.
Accordingly, a second aspect of the present invention provides an image pickup apparatus comprising: an image-forming optical system; and an image pickup device. The image pickup device comprises a light receiver and a light source, where the light receiver is configured to receive light from a subject through the image-forming optical system, and the light source is configured to emit pulsed light toward the subject. The image pickup apparatus further comprises: at least one memory; and at least one processor configured to execute instructions stored in the at least one memory to calculate, by a TOF method, a distance from the subject to an image pickup surface of the image pickup device using a signal output by the light receiver receiving light emitted by the light source and reflected on the subject.
Accordingly, a third aspect of the present invention provides an image pickup apparatus comprising: an image pickup device; a light source configured to emit pulsed light to a subject; an image-forming optical system configured to guide light from the subject to the image pickup device; at least one memory; and at least one processor. The at least one processor is configured to execute instructions stored in the at least one memory to: calculate, by a TOF method, a distance from the subject to an image pickup surface of the image pickup device using a signal output by the image pickup device receiving light emitted by the light source and reflected on the subject; determine a condition for light emission of the light source according to a condition for pickup of an image of the subject; and control light emission of the light source to the subject on a basis of the determined condition for light emission.
Accordingly, a fourth aspect of the present invention provides a method for controlling an image pickup apparatus. The method comprises: focusing an image-forming optical system on a predetermined subject; and projecting light in an infrared wavelength band from a light source, through a microlens array, to a partial reflecting mirror disposed in the image-forming optical system. The method further comprises, upon the light in an infrared wavelength band being reflected on the partial reflecting mirror and projected to the subject through the image-forming optical system, measuring a distance from the subject to an image pickup surface of an image pickup device by a TOF method, using a signal output by the image pickup device receiving light reflected on the subject and traveling through the image-forming optical system. The method further comprises: generating a distance map from a result of measurement of the distance by the TOF method; performing image pickup using a signal output from the image pickup device receiving light in a visible light wavelength band, traveling from the subject and entering the image-forming optical system; and storing image data generated by the distance map and the image pickup in a storage unit. In the image-forming optical system, the partial reflecting mirror is disposed to make an angle between an optical axis of the image-forming optical system and a normal line to a surface of the partial reflecting mirror larger than 0° and smaller than 90°. The light source and the image pickup device are separately disposed at a position where light from the subject travels after being reflected on the partial reflecting mirror and a position where light from the subject travels after passing through the partial reflecting mirror so that a pupil distance of the light source comprising microlenses and a pupil distance of the image pickup device comprising a plurality of pixels and microlenses disposed for the respective pixels coincide with each other and that the image pickup device and the light source are conjugate with each other with respect to the subject via the image-forming optical system.
Accordingly, a fifth aspect of the present invention provides a method for controlling an image pickup apparatus. The method comprises: projecting light from a light source included in an image pickup device to a subject; and upon the light being projected from the light source and reflected on the subject, calculating a distance from the subject to an image pickup surface of the image pickup device by a TOF method, using a signal output by a first light receiver included in the image pickup device, receiving the light reflected on the subject. The method further comprises: generating a distance map from a result of calculation of the distance by the TOF method; performing image pickup using a signal output from a second light receiver included in the image pickup device, receiving light in a visible light wavelength band, traveling from the subject to enter the image pickup device; and storing image data generated by the distance map and the image pickup in a storage unit.
Accordingly, a sixth aspect of the present invention provides a method for controlling an image pickup apparatus. The method comprises: projecting light from a light source to a subject; and upon the light being projected from the light source and reflected on the subject, calculating a distance from the subject to an image pickup surface of an image pickup device by a TOF method, using a signal output by the image pickup device receiving the light reflected on the subject. The method further comprises: generating a distance map from a result of calculation of the distance by the TOF method; adjusting an image-forming optical system by using a result of calculation of the distance, to focus the image-forming optical system on a partially selected region of the subject; performing image pickup on the subject with the image pickup device, with the image-forming optical system focused on the partially selected region; and storing image data generated by the distance map and the image pickup in a storage unit.
According to the present invention, it is possible to provide image pickup apparatuses capable of measuring a distance to a subject in a wide range at high speed with high accuracy and methods for controlling such an image pickup apparatus.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
The present invention will now be described in detail below with reference to the accompanying drawings showing embodiments thereof.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The camera MPU 111 performs overall control of the image pickup apparatus 100. The operation switches 114 receive various instructions from a user, and transmit the received instructions to the camera MPU 111. The display unit 113 displays a subject image or various setting conditions for the image pickup apparatus 100. It should be noted that the display unit 113 includes a touch panel. The memory 115 includes a ROM which stores various information necessary for controlling the image pickup apparatus 100, and a storage medium such as an SD card which stores image data or the like. The focus detection unit 112 includes an image pickup sensor, and performs focus detection on a subject by a contrast AF method based on an image pickup signal.
In
The image-forming optical system 103 may have a known configuration generally used in a digital camera. That is, the image-forming optical system 103 may be a lens barrel (interchangeable lens) detachable from a lens-interchangeable camera such as a single-lens reflex camera, or a collapsible lens barrel or a fixed lens barrel provided in an image pickup apparatus body on which the image pickup device 101 is mounted. Further, the image-forming optical system 103 may be capable of changing an F value by an aperture (not shown) and/or changing a focal length by driving a zoom lens. Further, the image-forming optical system 103 may be configured to change the intensity of transmitted light with an ND filter (not shown), the polarization state with a polarizing filter, or the like. Lenses and an aperture of the image-forming optical system 103 are driven by the lens drive circuit 110 under the control of the camera MPU 111.
The partial reflecting mirror 104 is disposed at a position to split light from the subject into light traveling to the image pickup device 101 and light traveling to the light source 102. In other words, the partial reflecting mirror 104 is disposed in the image-forming optical system 103, at a position to project light emitted by the light source 102 to the subject and to guide light from the subject to the image pickup device 101. The partial reflecting mirror 104 is configured so that the angle between a normal vector of a reflecting surface of the partial reflecting mirror 104 and the optical axis AO of the image-forming optical system is larger than 0° and smaller than 90°. The partial reflecting mirror 104 is a multilayer mirror in which thin films of silicon oxide (SiO2) and thin films of niobium oxide (Nb2O5) are layered on one surface, which faces the point O, of a pellicle film made of resin, and an antireflection film (AR coat) is applied to the other surface of the pellicle film.
A ratio between a transmittance and a reflectance of the partial reflecting mirror 104 is approximately six to four. The transmittance is set to be slightly higher than the reflectance, which enables the image pickup device 101 to efficiently receive light traveling from the subject through the image-forming optical system 103 and entering the image pickup device 101. The material of the substrate of the partial reflecting mirror 104 is not limited to resin, and various transparent materials such as glass can be used. The configuration of the coating that gives the mirror its partial reflecting function is not limited to the above-described configuration.
In the light source 102, surface emitting lasers (Vertical Cavity Surface Emitting Lasers) with a center emission wavelength of about 800 nm are used as the light emission parts 203. A wavelength outside the visible light wavelength band used for pickup of a subject image is used for the light emitted by the light source 102, which enables easy distinction between signals for distance measurement using the TOF method and signals for image pickup.
The light emission parts 203 are not limited to the above-described configuration, and for example, stripe lasers, light emitting diodes (LEDs), quantum dot elements, and organic EL elements can be used. Since many types of minute light emitters, including surface emitting lasers, emit divergent light, the microlenses 204 are disposed to suppress the divergence of the light from the light emission parts 203. The light source 102 may have a single light emission part (single light source), but requires a microlens array.
It should be noted that the light source 102 is not limited to the configuration described with reference to
In the present embodiment, the light source 102 is configured such that each microlens 204 corresponding to a light emission part 203 is decentered toward the center of the entire light emitting region of the light source 102, by an amount that increases with the distance of the microlens 204 from the center. Such a configuration bends the propagation direction of light emitted by each light emission part 203 toward the optical axis of the image-forming optical system, which makes light from all light emission parts 203 intersect at almost one point on the optical axis. In the present embodiment, this point is called a light source pupil 214, and the distance from the light emitting surface to the light source pupil is called a light source pupil distance 215.
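One simple geometric way to determine such a decentering amount, under the assumption that the chief ray from each light emission part 203 is aimed at the light source pupil 214 by similar triangles, is sketched below (the numerical values and names are illustrative assumptions, not dimensions of the present embodiment):

```python
# Hypothetical sketch: decentering each microlens so that the chief ray from its
# light emission part is aimed at the light source pupil on the optical axis.
# Assumes a simple similar-triangle geometry; not taken from the specification.

def microlens_decenter(x_emitter_mm: float,
                       lens_height_mm: float,
                       pupil_distance_mm: float) -> float:
    """Lateral shift (toward the array center) of the microlens relative to its emitter.

    x_emitter_mm:      lateral position of the light emission part from the array center
    lens_height_mm:    separation between the emitting surface and the microlens plane
    pupil_distance_mm: light source pupil distance measured from the emitting surface
    """
    # The chief ray from (x_emitter, 0) toward (0, pupil_distance) crosses the
    # microlens plane at x_emitter * (1 - lens_height / pupil_distance), so the
    # lens must be shifted inward by x_emitter * lens_height / pupil_distance.
    return x_emitter_mm * lens_height_mm / pupil_distance_mm

# The farther an emitter is from the center, the larger the inward shift,
# which matches the qualitative behavior described above.
print(microlens_decenter(10.0, 0.05, 60.0))   # ≈ 0.0083 mm
print(microlens_decenter(2.0, 0.05, 60.0))    # ≈ 0.0017 mm
```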
The light source 102 is configured so that the light source pupil 214 coincides with a sensor pupil, which will be described later, of the image pickup device 101. Such a configuration keeps the light projection region matched to the subject region corresponding to the image pickup surface M of the image pickup device 101, and thus the distance measurement can be performed with high light projection efficiency. It should be noted that the light source 102 may have a single light emission part, and in this case, the above-described light source pupil is not defined, and it can be handled as if the light source pupil and the sensor pupil coincide with each other.
Meanwhile, in a case where the sizes of the image pickup surface M of the image pickup device 101 and the entire light emitting region of the light source 102 are different from each other, the light source pupil distance and the sensor pupil distance may be adjusted to be different so that the image pickup view angle and the light projection region approximately coincide with each other. In the present embodiment, the sizes of the image pickup surface M of the image pickup device 101 and the entire light emitting region of the light source 102 are the same as each other. In other words, the sizes and shapes of an effective pixel region on the image pickup device 101 and a region of the light source 102 where the light emission parts 203 are disposed coincide with each other. Accordingly, the sensor pupil distance and the light source pupil distance coincide with each other.
It should be noted that a slight difference between the sensor pupil distance and the light source pupil distance due to a manufacturing process or an assembly error of the image pickup apparatus 100 can be tolerated. In the description of the present embodiment, expressions “same” and “coincide with” are not strictly interpreted, and have a certain range which allows intentional shift, an assembly error, or the like as long as desired performance is obtained.
When the focal length of the image-forming optical system 103 is short or the F value is large, the amount of light from the light source 102 per unit area on a plane at a certain distance decreases. Therefore, the image pickup apparatus 100 may be configured to change the intensity of light emitted by the light source 102 in accordance with a change in the focal length or the F value. By transmitting information on the focal length and the F value of the image-forming optical system 103 from the image-forming optical system 103, the lens drive circuit 110, or the camera MPU 111 to the light source drive circuit 105 as needed, it is possible to perform the distance measurement with an appropriate amount of light.
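A minimal sketch of such intensity control is shown below; the quadratic scaling with the F value and the focal length, as well as the reference values, are illustrative assumptions rather than the control law of the light source drive circuit 105:

```python
# Hypothetical sketch of scaling the drive power of the light source with the
# image-forming optical system's settings, so that the irradiance on a plane at a
# given distance stays roughly constant. The quadratic model and the reference
# values are illustrative assumptions, not values from the specification.

def light_source_power(base_power_mw: float,
                       f_number: float,
                       focal_length_mm: float,
                       ref_f_number: float = 4.0,
                       ref_focal_length_mm: float = 50.0,
                       max_power_mw: float = 200.0) -> float:
    """Scale emission power up when the F value is large or the focal length is short."""
    aperture_factor = (f_number / ref_f_number) ** 2              # smaller aperture -> more power
    field_factor = (ref_focal_length_mm / focal_length_mm) ** 2   # wider field -> more power
    return min(base_power_mw * aperture_factor * field_factor, max_power_mw)

# Stopping down from F4 to F8 at the same focal length asks for ~4x the power.
print(light_source_power(10.0, 8.0, 50.0))   # 40.0 mW
print(light_source_power(10.0, 4.0, 25.0))   # 40.0 mW
```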
In the image pickup device 101, the effective pixel region has a horizontal size of 22.32 mm, a vertical size of 14.88 mm, an effective pixel number of 6000 in the horizontal direction, and an effective pixel number of 4000 in the vertical direction.
The TOF control unit 106 controls driving of the light source drive circuit 105 according to a command from the camera MPU 111 when performing the distance measurement by the TOF method. The TOF calculation unit 107 calculates a distance from a predetermined point on the subject to the image pickup surface M of the image pickup device 101 using a signal output by the pixel 302 of the image pickup device 101. The image pickup device drive circuit 108 controls driving of the image pickup device 101. The image processing circuit 109 generates image data from signals output by the pixels 301, 303, and 304 of the image pickup device 101.
Next, an image pickup sequence performed by the image pickup apparatus 100 will be described.
In S601, the camera MPU 111 detects that an AF button is pressed. It should be noted that the pressing of the AF button indicates a state where a so-called release button constituted by a two-stage switch is half-pressed, and thus, start of an AF operation is instructed to the camera MPU 111. In S602, the camera MPU 111 controls the focus detection unit 112 to perform focus detection processing by the contrast AF method. In S603, the camera MPU 111 drives the lens drive circuit 110 based on the focus detection signal from the focus detection unit 112 to move the focus lens in the optical axis direction. Accordingly, in S604, the camera MPU 111 can bring the image-forming optical system 103 into a state in which the subject is focused (in focus).
In S605, the camera MPU 111 drives the light source 102 by driving the light source drive circuit 105 through the TOF control unit 106. As a result, a periodic rectangular pulsed laser beam with a center wavelength of 800 nm is output by the light source 102.
Since the light source 102 and the image pickup device 101 are disposed in a conjugate relationship through the image-forming optical system 103, the position of the point R on the light emitting surface of the light source 102 and the position of the point T on the image pickup surface M of the image pickup device 101 coincide with each other. Moreover, as described above, in the present embodiment, since the image pickup surface M of the image pickup device 101 and the entire light emitting region of the light source 102 have the same shape and the same size, there is less wasted light projected outside the image pickup view angle, and thus it is possible to increase the light projection efficiency.
The light receiving timing of the IR pixels (pixels 302) of the image pickup device 101 is synchronized with the periodic rectangular pulse, and the image pickup device 101 thereby detects a time lag between the pulsed laser light emitted by the light source 102 and the corresponding received reflected light, and generates a detection signal or a signal related thereto. Then, the TOF calculation unit 107 calculates a distance 506 between the point S on the subject H and the image pickup surface M from the generated signal. It should be noted that the detection of the pulsed light emitted by the light source 102 and of the reflected light thereof, and the calculation using the signals, can use a known TOF method, for example, the phase detection method of Hansard et al. described above. Various techniques have been studied and proposed for the TOF method, any of which can be used in the present embodiment; it is not necessary to limit the method to a specific technique.
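As a purely illustrative sketch of a phase-detection TOF calculation of the kind cited above, the following shows a common four-phase demodulation; the sampling convention, the names, and the modulation frequency are assumptions and do not represent the specific processing of the TOF calculation unit 107:

```python
import math

# Minimal sketch of the four-phase demodulation commonly used in indirect
# (phase-detection) TOF. Variable names and the sampling convention are
# illustrative assumptions.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(c0: float, c90: float, c180: float, c270: float,
                 modulation_frequency_hz: float) -> float:
    """Distance from four correlation samples taken at 0°, 90°, 180°, 270° phase offsets."""
    phase = math.atan2(c90 - c270, c0 - c180)  # phase delay of the received light
    if phase < 0.0:
        phase += 2.0 * math.pi                 # keep the phase in [0, 2*pi)
    # The phase delay corresponds to the round-trip path, so divide by 2 for one way.
    return SPEED_OF_LIGHT * phase / (4.0 * math.pi * modulation_frequency_hz)

# At 10 MHz modulation, a quarter-period phase delay (pi/2) corresponds to ~3.75 m.
print(tof_distance(1.0, 2.0, 1.0, 0.0, 10e6))
```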
When the distance measurement by the TOF method is performed in S605, in S606, the camera MPU 111 generates a distance map of the subject based on a result of the distance measurement by the TOF method in S605, and stores the generated distance map in an internal memory of the camera MPU 111. After that, the camera MPU 111 advances the processing to S609. Meanwhile, during the execution of S605, a user can operate the touch panel provided on the display unit 113 of the image pickup apparatus 100 to select an arbitrary region in the image as a focusing region (AF region). Therefore, in S607, the camera MPU 111 determines whether or not a specific region in the image is selected. In a case where the camera MPU 111 determines that the specific region is not selected (NO in S607), the processing proceeds to S609. In a case where the camera MPU 111 determines that the specific region is selected (YES in S607), the processing proceeds to S608. In S608, the camera MPU 111 determines the region selected in S607 as the focusing region.
In S609, the camera MPU 111 drives the focus lens based on the distance map created and stored in S606 and the focusing region determined in S608, and performs focus adjustment (focusing) on the focusing region. By driving the focus lens using the absolute distance information to the subject obtained by the TOF method in this way, it is possible to perform the focus adjustment at high speed.
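As an illustration of why the absolute distance removes the need for a focus search, a simplified thin-lens model (an assumption, not the actual optical model of the image-forming optical system 103) can convert the measured subject distance directly into a target image-side distance for the focus lens drive:

```python
# Simplified thin-lens illustration (an assumption, not the actual focusing model of
# the apparatus) of converting an absolute subject distance obtained by TOF into the
# image-side distance at which the image pickup surface must lie to be in focus.

def image_distance_mm(subject_distance_mm: float, focal_length_mm: float) -> float:
    """Solve 1/f = 1/a + 1/b for the image distance b, given subject distance a."""
    if subject_distance_mm <= focal_length_mm:
        raise ValueError("subject closer than the focal length cannot be imaged")
    return (subject_distance_mm * focal_length_mm) / (subject_distance_mm - focal_length_mm)

# A subject 2 m away with a 50 mm lens needs the image plane ~51.3 mm behind the lens;
# a focus lens drive target can be derived directly from this value instead of searching.
print(image_distance_mm(2000.0, 50.0))  # ≈ 51.28 mm
```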
In S610 after the focus adjustment, the camera MPU 111 determines whether or not a shooting button is pressed. It should be noted that the pressing of the shooting button indicates a state where a so-called release button configured by a two-stage switch is fully pressed, and thus, start of the shooting is instructed to the camera MPU 111. In a case where the camera MPU 111 determines that the shooting button is not pressed (NO in S610), in the present embodiment, the camera MPU 111 determines that the user stops the shooting, and thus, the present processing ends. Meanwhile, in a case where the camera MPU 111 determines that the shooting button is pressed (YES in S610), the processing proceeds to S611.
In S611, the camera MPU 111 stores image data of a picked-up image and the distance map created in S606 in a storage medium such as an SD card included in the memory 115, and thus, the present processing ends. It should be noted that the image data is generated using signals from R, G, B pixels (pixels 301, 303, 304) of the image pickup device 101, and the distance map is generated using signals from the IR pixels (pixels 302).
It should be noted that the order of each processing in the flowchart of
The image pickup apparatus 100 according to the first embodiment includes a focus detection unit 112 having an image pickup sensor that performs focus adjustment by a contrast AF method. Meanwhile, the image pickup apparatus according to the second embodiment is different from the image pickup apparatus 100 according to the first embodiment in that the image pickup apparatus according to the second embodiment includes an image pickup device 101A, which enables focus detection by an image pickup surface phase difference method, and a focus detection unit 118 associated therewith.
The pixel 901 is different from the pixel 301 of
In the image pickup sequence in the image pickup apparatus 100A, the camera MPU 111 controls the focus detection unit 118 to perform focus detection by the image pickup surface phase difference method in the focus detection in S602. After the focus detection starts, the camera MPU 111 determines whether or not the focus detection is possible in S1001. In a case where the camera MPU 111 determines that the focus detection is possible (YES in S1001), the processing proceeds to S603. In a case where the camera MPU 111 determines that the focus detection is not possible (NO in S1001), the processing proceeds to S1002. The case where the determination in S1001 is “NO” is a case where it is difficult to detect the defocus amount by the image pickup surface phase difference method. Therefore, in S1002, the camera MPU 111 performs a search processing that drives the focus lens until the defocus amount can be detected, and when the defocus amount can be detected, the processing proceeds to S603. The processing after S603 is the same as the image pickup sequence shown in
It should be noted that the image pickup sequence in the image pickup apparatus 100A is not limited to the flow in
When creating the distance map of the subject by the TOF method and performing the focus adjustment by the image pickup surface phase difference method, the camera MPU 111 holds the partial reflecting mirror 104 at the down position by the mirror drive circuit 119. In other words, as described in the first and second embodiments, the partial reflecting mirror 104 reflects light emitted by the light source 102 to irradiate the subject at the down position, and guides the reflected light to the image pickup device 101A. As a result, the distance measurement by the TOF method is possible.
Meanwhile, when the partial reflecting mirror 104 is located at the down position during a main shooting (S611) by the image pickup device 101A, the amount of light incident on the image pickup device 101A decreases because a portion of the light is reflected by the partial reflecting mirror 104. Accordingly, image pickup sensitivity decreases. Therefore, the camera MPU 111 moves the partial reflecting mirror 104 to the up position during the main shooting by the mirror drive circuit 119 so that the incident light from the subject is directly incident on the image pickup device 101A. Thereby, the image pickup sensitivity can be increased.
The image pickup apparatus 100C is significantly different from the image pickup apparatus 100 according to the first embodiment in that the image pickup apparatus 100C does not include the partial reflecting mirror 104, the light source 102, and the light source drive circuit 105, and the image pickup device 101B includes light emitters described below and includes a light emitter drive circuit 1105 which drives the light emitters. It should be noted that among components of the image pickup apparatus 100C, the same reference numerals are assigned to the same components as those of the image pickup apparatus 100 according to the first embodiment, and common descriptions are omitted.
A main optical system of the image pickup apparatus 100C includes an image-forming optical system 1103 and an image pickup device 101B. A broken line AB shown in
For example, the light emitter 1202 has a light emission part made of a gallium arsenide (GaAs) based compound semiconductor with a center emission wavelength of 850 nm. The light emitter drive circuit 1105 causes the light emitter 1202 to emit light according to a command from the camera MPU 111.
In the light emitting unit, the light emitters 1202 are arranged one-dimensionally and in the same plane along each of two facing sides on an outer periphery of the light receiving unit. Their light emission parts are, for example, light emitting diodes (LEDs), but are not limited thereto, and surface emitting lasers (VCSEL), stripe lasers, quantum dot elements, organic EL elements, or the like can also be used. Although light emitted by surface emitting lasers and many other small light emitters is divergent light, in the light emitter 1202, the divergence of light emitted by each light emission part is suppressed by a microlens.
In the light emitters 1202, the distance between the light emission parts and the microlenses is set so that the projection region of light from the light emitters 1202 covers the image pickup view angle, and the light emitters 1202 and the image pickup focal plane do not have a conjugate relationship. In other words, the image pickup device 101B is designed such that the light emitted by the minute light emitters 1202 covers a wide area of the subject. Specifically, as shown in
It should be noted that the arrangement of the light emitters 1202 and the light receivers 1203 and/or the light projection region in the image pickup device 101B are not limited to the above-described configuration. For example, the image pickup device 101B may be designed so that light from the light emitters 1202 is projected onto a desired area within the image pickup view angle on which the distance measurement is to be performed. In particular, by setting the light projection region to an area equivalent to the image pickup view angle or an area inside it, the light projection efficiency of the light emitters 1202 can be increased.
In a case where the focal length of the image-forming optical system 1103 is short or the F value is large, the amount of light from the light emitters 1202 per unit area on a surface at a certain distance from the image pickup device 101B decreases. Therefore, a configuration in which the light emission intensity of the light emitters 1202 can be changed according to a change in the focal length or the F value is also desirable. By transmitting information on the focal length and/or the F value of the image-forming optical system 1103 from the image-forming optical system 1103, the lens drive circuit (not shown), or the camera MPU 111 to the light emitter drive circuit 1105 as needed, the distance measurement can be performed with an optimal amount of light. It should be noted that, here, the configuration in which the light emitters 1202 are disposed in one row is described, but a configuration in which the light emitters 1202 are replaced with a single light emission part is also possible.
Each light receiver 1203 of the image pickup device 101B has a structure capable of performing the focus detection and the focus adjustment using a pupil division phase difference method.
The sub photoelectric converter 1503 corresponds to a first pupil division region 1602, the sub photoelectric converter 1504 corresponds to a second pupil division region 1601, and a separation part 1604 corresponds to a third pupil division region 1605. Therefore, in the incident light from the subject, light which has passed through the first pupil division region 1602 and light which has passed through the second pupil division region 1601 enter the sub photoelectric converter 1503 and the sub photoelectric converter 1504 via the microlens 1507, respectively. Moreover, light which has passed through the third pupil division region 1605 enters the separation part 1604 via the microlens 1507.
Hereinafter, a subject image based on signals output from the sub photoelectric converters 1503 of respective pixels of the image pickup device 101B is referred to as an “A image”, and a subject image based on signals output from the sub photoelectric converters 1504 of respective pixels of the image pickup device 101B is referred to as a “B image”. Moreover, a subject image of signals obtained by adding the signal of the A image and the signal of the B image for each pixel is referred to as an “A+B image”. The image pickup device 101B can detect a defocus amount (a focus shift amount) of a subject image having a luminance distribution in the x direction, by detecting an image shift amount (relative position) between the A image and the B image.
The image shift amount is calculated by, for example, shifting the relative position of the A image and the B image, obtaining, for each shift amount, the sum (reliability value) of squares of the differences between the A image signal and the B image signal for each pixel, and defining the shift amount that gives the smallest reliability value as the image shift position. This is because, as the reliability value decreases, the accuracy of the calculation of the image shift amount increases. As described above, in the image pickup apparatus 100C, the image pickup device 101B performs the focus detection by the pupil division phase difference method, and the lens drive circuit 110 drives lenses under the control of the camera MPU 111 based on the focus detection information. Accordingly, the image pickup apparatus 100C can perform the focus adjustment by a phase difference method for aligning the focus position of the image-forming optical system 1103 with the subject position.
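A minimal sketch of the correlation described above is shown below: the B image is shifted against the A image, the sum of squared differences is used as the reliability value, and the shift giving the smallest value is taken as the image shift amount. The conversion coefficient from the image shift amount to a defocus amount is a hypothetical placeholder:

```python
import numpy as np

# Minimal sketch of the image shift calculation described above: shift the B image
# against the A image, use the sum of squared differences as the reliability value,
# and take the shift with the smallest value. The conversion to a defocus amount via
# a single coefficient is an illustrative assumption.

def image_shift(a_image: np.ndarray, b_image: np.ndarray, max_shift: int = 10) -> int:
    """Return the integer shift (in pixels) of the B image that best matches the A image."""
    best_shift, best_reliability = 0, float("inf")
    valid = slice(max_shift, len(a_image) - max_shift)   # exclude wrapped-around samples
    for shift in range(-max_shift, max_shift + 1):
        shifted_b = np.roll(b_image, shift)
        reliability = float(np.sum((a_image[valid] - shifted_b[valid]) ** 2))
        if reliability < best_reliability:
            best_shift, best_reliability = shift, reliability
    return best_shift

def defocus_from_shift(shift_pixels: int, conversion_coefficient: float) -> float:
    """Convert the image shift amount into a defocus amount (coefficient is hypothetical)."""
    return shift_pixels * conversion_coefficient

a = np.sin(np.linspace(0.0, 6.0, 200))
b = np.roll(a, 3)                      # B image displaced by 3 pixels relative to A
print(image_shift(a, b))               # -3 (shifting B back by 3 aligns it with A)
```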
Next, an image pickup sequence in the image pickup apparatus 100C will be described.
In S1801, the camera MPU 111 detects that the AF button is pressed. It should be noted that the processing of S1801 is the same as the above-described S601. In S1802, the camera MPU 111 controls the focus detection unit 118, performs focus detection processing by the pupil division phase difference method using the signal output from the image pickup device 101B, and determines whether or not the focus detection is possible. In a case where the camera MPU 111 determines that the focus is detected (YES in S1802), the processing proceeds to S1807, and in a case where the camera MPU 111 determines that the focus is not detected (NO in S1802), the processing proceeds to S1803.
In the case where the focus is not detected in S1802, the defocus of the subject is too large, and thus, in S1803, the camera MPU 111 performs the distance measurement by the TOF method. Specifically, a signal is transmitted from the TOF control unit 106 to the light emitter drive circuit 1105 through the camera MPU 111, and the light emitter drive circuit 1105 drives the light emitters 1202 included in the image pickup device 101B. The light emitters 1202 output periodic rectangular pulsed light with a center wavelength of 850 nm. The light projected onto the subject from the light emitters 1202 is partially reflected or scattered at the subject, passes through the image-forming optical system 1103 to enter the image pickup surface M of the image pickup device 101B, and is received by the IR pixels 1303.
In the image pickup apparatus 100C, as described above, the light projection region given by the group of light emitters 1202 is equivalent to the subject field region corresponding to the image pickup view angle of the group of light receivers 1203. Therefore, it is possible to reduce the wasted light projected outside the image pickup view angle, and thus to increase the light projection efficiency. In addition, the light receiving timing of the IR pixels 1303 of the image pickup device 101B is synchronized with the periodic rectangular pulse, and the image pickup device 101B thereby detects a time lag between the pulsed laser light from the light emitters 1202 and the light received by the IR pixels 1303. The image pickup device 101B generates a detection signal of the detected time lag between the emitted light and the received light or a signal related thereto, and the TOF calculation unit 107 calculates the distance between the subject and the image pickup surface from the generated signal.
It should be noted that the detection of the pulsed light emitted by the light emitters 1202 and of the reflected light thereof, and the calculation using the signals, can use a known TOF method, for example, the phase detection method of Hansard et al. described above. Various techniques have been studied and proposed for the TOF method, any of which can be used in the present embodiment; it is not necessary to limit the method to a specific technique.
When the distance measurement by the TOF method is performed in S1803, in S1804, the camera MPU 111 generates a distance map of the subject based on a result of the distance measurement by the TOF method in S1803, and stores the generated distance map in the internal memory of the camera MPU 111. After that, the camera MPU 111 advances the processing to S1807. Meanwhile, during the execution of S1803, a user can operate the touch panel provided on the display unit 113 of the image pickup apparatus 100C to select an arbitrary region in the image as a focusing region. Therefore, in S1805, the camera MPU 111 determines whether or not a specific region in the image is selected. In a case where the camera MPU 111 determines that the specific region is not selected (NO in S1805), the processing proceeds to S1807. In a case where the camera MPU 111 determines that the specific region is selected (YES in S1805), the processing proceeds to S1806. In S1806, the camera MPU 111 determines the region selected in S1805 as the focusing region.
In S1807, in a case where the processing is directly advanced from S1802 to S1807, the camera MPU 111 drives the focus lens constituting the image-forming optical system 1103 based on the focus detection information detected in S1802 to perform the focus adjustment (focusing). In addition, in S1807, in a case where the processing has passed through S1804 and the determination in S1805 is NO, the camera MPU 111 drives the focus lens based on the distance map stored in S1804 to perform the focus adjustment (focusing). In S1807, in a case where the processing has passed through S1806 (the determination in S1805 is YES), the camera MPU 111 drives the focus lens based on the distance map stored in S1804 and the focusing region determined in S1806 to perform the focus adjustment (focusing) on the focusing region. By driving the focus lens using the absolute distance information to the subject obtained by the TOF method, it is possible to perform the focus adjustment at high speed.
The processing of S1808 and S1809 is the same as the processing of S610 and S611 in the first embodiment, respectively. That is, in S1808, the camera MPU 111 determines whether the shooting button is pressed and the start of the shooting is instructed to the camera MPU 111. In a case where the camera MPU 111 determines that the shooting button is not pressed (NO in S1808), the camera MPU 111 determines that the user stops the shooting, and thus, the present processing ends. Meanwhile, in a case where the camera MPU 111 determines that the shooting button is pressed (YES in S1808), the processing proceeds to S1809.
In S1809, the camera MPU 111 stores the image data of a picked-up image and the distance map created in S1804 in a storage medium such as an SD card included in the memory 115, and thus, the present processing ends. It should be noted that the image data is generated using signals from the R pixels 1301, the G pixels 1302, and the B pixels 1304 of the image pickup device 101B, and the distance map is generated using signals from the IR pixels 1303.
In the image pickup apparatus 100C, it is desirable to change, according to the F value or the focal length of the image-forming optical system 1103, the light emission intensity of the light emitters 1202 in the focus detection by the TOF method. For example, in a case where the F value is large in the same shooting situation, the intensity of the light incident on the image pickup device 101B is small because the effective diameter of the imaging lens is small. Conversely, in a case where the F value is small in the same shooting situation, the intensity of the light incident on the image pickup device 101B is large. Accordingly, the intensities of the reflected light and the scattered light from the subject change according to the F value of the image-forming optical system 1103, which makes the TOF signal obtained by the light receiver 1203 unstable and decreases the detection accuracy, or may lead to a case where the detection is impossible. A similar problem may occur in a case where the size of the subject field changes according to the focal length of the image-forming optical system 1103 and the light projection intensity per unit solid angle of the image pickup view angle changes.
Therefore, to stably perform the TOF focus detection, the light emitters 1202 may be configured so that their light emission intensity can be controlled by the TOF control unit 106 according to the F value, or the light emission intensity of the light emitters 1202 may be controlled according to the focal length of the image-forming optical system 1103. It should be noted that, in a case where the intensity of the IR light received by the image pickup device 101B is too small or too large in the first measurement by the TOF method, an image pickup sequence in which the light emitters 1202 are controlled to correct the light projection intensity in the next measurement is even more desirable.
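One simple way such a correction between measurements could be realized is sketched below; the target level and the clamping limits are illustrative assumptions, not parameters of the TOF control unit 106:

```python
# Hypothetical sketch of correcting the light projection intensity between TOF
# measurements based on the IR level received in the previous measurement.
# The target window and the clamping limits are illustrative assumptions.

def next_emission_power(current_power_mw: float,
                        received_level: float,
                        target_level: float = 0.5,
                        min_power_mw: float = 1.0,
                        max_power_mw: float = 200.0) -> float:
    """Scale the drive power so the received IR level approaches the target level.

    received_level and target_level are normalized signal levels (e.g. a fraction
    of the IR pixel's full-well capacity).
    """
    if received_level <= 0.0:
        return max_power_mw               # nothing received: try full power next time
    scaled = current_power_mw * (target_level / received_level)
    return min(max(scaled, min_power_mw), max_power_mw)

# A first measurement that nearly saturates the IR pixels (0.95) halves the power,
# while a weak return (0.1) asks for ~5x more power in the next measurement.
print(next_emission_power(50.0, 0.95))  # ≈ 26.3 mW
print(next_emission_power(20.0, 0.10))  # 100.0 mW
```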
Meanwhile, in the image pickup sequence in the flowchart of
In this situation, the focus detection by the TOF method using the IR light which does not require visible light is effective, and it is preferable that the first focus detection in the drive sequence is performed by the TOF method. However, in the focus detection by the TOF method, in some cases, a distance resolution may be insufficient or the distance measurement accuracy may be low with respect to a subject field depth. Therefore, the camera MPU 111 preferably controls the focus detection unit 118 after the focus adjustment using the focus detection result by the TOF method to perform the focus detection by the pupil division phase difference method, and determines whether or not a state after the focus adjustment by the TOF method falls into a focusing range. As a result, the camera may go to the shooting operation when the state is the in-focus state, and may drive the focus lens again based on the focus detection result by the pupil division phase difference method when the state is the out-of-focus state so as to perform the focus adjustment.
Next, a variation of the image pickup device 101B will be described. In the image pickup apparatus 100C, the image pickup device 101B separately includes the light emitters 1202 and the light receivers 1203. Alternatively, an image pickup device including light receiving/emitting elements having both light receiving and light emitting functions may be used. In this case, in the in-focus state, light output from a light emission part of one pixel of the image pickup device reaches substantially one point on the subject, is reflected or scattered on the subject, goes back to be formed into an image on the original pixel again, and is received by a light receiving part. Accordingly, it is possible to perform the distance measurement by the TOF method using the light emission signal and the light reception signal. As a result, it is possible to increase a resolution of a distance measurement area by the TOF method.
For example, as the light receiving/emitting elements, light receiving/emitting elements (Japanese Laid-Open Patent Publication (kokai) No. S58-134483) focused on a light emitting/receiving function of LEDs, gallium nitride (GaN) based light receiving/emitting elements having a multiple quantum well structure, and light emitting and receiving elements using a nanorod are known. The light receiving/emitting elements having a multiple quantum well structure are disclosed in Y. Wang et al., Proc. SPIE 10823, Nanophotonics and Micro/Nano Optics IV, 108230H (25 Oct. 2018). The light receiving/emitting elements using a nanorod are disclosed in N. Oh et al., Science 355, 616 (2017) or the like, but light receiving/emitting elements to be used are not limited thereto.
The image pickup apparatus 100D is significantly different from the image pickup apparatus 100 according to the first embodiment in that the image pickup apparatus 100D does not include the partial reflecting mirror 104 and has a light source 2102 instead of the light source 102. Among components of the image pickup apparatus 100D, the same reference numerals are assigned to the same components as those of the image pickup apparatus 100 according to the first embodiment, and common descriptions are omitted. Further, since the image-forming optical system 1103 of the image pickup apparatus 100D is substantially the same as the image-forming optical system 1103 of the image pickup apparatus 100C according to the fourth embodiment, the same reference numerals are assigned, and descriptions thereof are omitted.
In the light source 2102, the microlens array is disposed on a distance control unit 2201 which can control the distance to the light emission parts 2203, so that emission conditions (emission state) of the light from the light emission parts 2203 can be changed according to an image pickup condition, as described later. It should be noted that the light source 2102 may be constituted by a single light emitter or light emission part.
In the light source 2102, surface emitting lasers (VCSELs) with a center emission wavelength of about 800 nm are used as the light emission parts 2203. A wavelength outside the visible light wavelength band used for pickup of a subject image by the image pickup device 101, such as an infrared wavelength, is used for the light emitted by the light source 2102, which enables easy distinction between signals for distance measurement using the TOF method and signals for image pickup. It should be noted that the light emission parts 2203 are not limited to the above-described configuration, and for example, stripe lasers, light emitting diodes (LEDs), quantum dot elements, and organic EL elements can be used. Since light emitted by many minute light emitters, including surface emitting lasers, is divergent light, the microlenses 2204 are disposed to suppress the divergence of the light from the light emission parts 2203.
The configuration of the light source 2102 is not limited to the above configuration. For example, by adjusting the emission direction so that the light emitted by the light emission parts 2203 is collimated, the light source 2102 can project the light with sufficient intensity even in a case where the distance to the subject is long. In this case, in order to achieve highly efficient light projection by the light source 2102 in consideration of a manufacturing variation of the light source 2102, the light source 2102 may be configured such that the light projection region lies slightly inside the subject area which can be picked up by the image pickup device 101.
When the focal length of the image-forming optical system 1103 is short or the F value is large, the amount of light from the light source 2102 per unit area on a plane located at a predetermined distance from the image pickup device 101 decreases. Therefore, the image pickup apparatus 100D may be configured to change the intensity of light emitted by the light source 2102 in accordance with a change in the focal length or the F value of the image-forming optical system 1103. The image-forming optical system 1103, the lens drive circuit 110, or the camera MPU 111 may transmit information on the focal length and the F value of the image-forming optical system 1103 to the light source drive circuit 105, as needed, so as to change the intensity of the light emitted by the light source 2102. This enables the distance measurement with an appropriate amount of light.
It should be noted that, since the image pickup view angle changes according to the focal length of the image-forming optical system 1103, it is desirable to control the light source drive circuit 105 and the control lens 2220, which control the degree of divergence or convergence of light emitted by the light source 2102, so that the intensity of the emitted light per unit solid angle is constant. The camera MPU 111 controls the light source drive circuit 105 and the control lens 2220 so that the angle range in which the image-forming optical system 1103 can receive light and the angle range of the projection light from the light source 2102 coincide with each other according to the focal length of the image-forming optical system 1103.
In the image pickup apparatus 100D, it is desirable to change the light projection conditions to desired conditions according to a difference in the AF mode of the focus detection unit 112. For example, as shown in
The image pickup device 101 is equivalent to the image pickup device 101 included in the image pickup apparatus 100 according to the first embodiment, and since configurations thereof and the like are described with reference to
Next, an image pickup sequence in the image pickup apparatus 100D will be described. The image pickup sequence in the image pickup apparatus 100D can be executed according to the flowchart in
Light (a ray at the optical center is indicated by a solid line 2501) emitted at a position R on the emission surface of the light source 2102 reaches a point S on the subject J. Light emitted by the light emission parts 2203 has a certain width, and scattered light 2502, including reflected light of the emitted light, is produced at the point S on the subject J. A portion of the scattered light 2502 passes through the region (between dashed lines 2504 and 2505) that can be traversed by light passing through the opening of the image-forming optical system 1103, and is formed into an image at a point T on the image pickup surface M of the image pickup device 101. It should be noted that in
The image pickup apparatus 100D is configured such that the size and shape of the effective pixel area on the image pickup device 101 and the size and shape of the area where the light emission parts 2203 are arranged on the emission surface of the light source 2102 coincide with each other. Accordingly, there is less wasted light projected outside the image pickup view angle, and thus it is possible to increase the light projection efficiency. In addition, the light receiving timing of the IR pixels (pixels 302) of the image pickup device 101 is synchronized with the periodic rectangular pulse, and the image pickup device 101 thereby detects a time lag between the light emitted by the light source 2102 and the reflected light received by the IR pixels, and generates a detection signal or a signal related thereto. Then, the TOF calculation unit 107 calculates the distance between the point S on the subject J and the image pickup surface M from the generated signal. These operations constitute the distance measurement processing by the TOF method in S605.
It should be noted that the detection of the pulsed light emitted by the light source 2102 and of the reflected light thereof, and the calculation using the signals, can use a known TOF method, for example, the phase detection method of Hansard et al. described above. Various techniques have been studied and proposed for the TOF method, any of which can be used in the present embodiment; it is not necessary to limit the method to a specific technique.
In the image pickup apparatus 100E, the image pickup device 101 included in the image pickup apparatus 100D according to the fifth embodiment is changed to the image pickup device 101A included in the image pickup apparatus 100A according to the second embodiment, and the image pickup apparatus 100E includes a focus detection unit 118 associated therewith. Other configurations of the image pickup apparatus 100E are the same as those of the image pickup apparatus 100D. Therefore, among components of the image pickup apparatus 100E, the same reference numerals are assigned to the same components as those of the image pickup apparatus 100D according to the fifth embodiment and the image pickup apparatus 100 according to the first embodiment, and common descriptions are omitted.
As described in the second embodiment, the image pickup device 101A is an image pickup device capable of performing the focus detection using the image pickup surface phase difference method, the structure thereof is described with reference to
Next, an image pickup sequence in the image pickup apparatus 100E will be described.
In S2601, the camera MPU 111 detects that the AF button is pressed. It should be noted that processing of S2601 is the same as that of S601 described above. In S2602, the camera MPU 111 controls the focus detection unit 118 to perform the focus detection processing by the image pickup surface phase difference method using the signal output from the image pickup device 101A. In S2603, the camera MPU 111 determines whether or not the focus detection is possible. In a case where the camera MPU 111 determines that the focus detection is possible (YES in S2603), the processing proceeds to S2604, and in a case where the camera MPU 111 determines that the focus detection is not possible (NO in S2603), the processing proceeds to S2606.
In S2604, the camera MPU 111 drives the lens drive circuit 110 based on the focus detection signal from the focus detection unit 118 to move the focus lens in the optical axis direction. Accordingly, in S2605, the camera MPU 111 can bring the image-forming optical system 1103 into the state in which the subject is focused (in focus), and thereafter, the processing proceeds to S2611.
In a case where the focus cannot be detected by the image pickup surface phase difference method, in S2606, the camera MPU 111 performs the distance measurement by the TOF method. Here, the distance measurement by the TOF method can be performed in the same manner as the distance measurement by the TOF method in the fifth embodiment, and detailed descriptions thereof are omitted. Moreover, the processing of S2606 to S2612 is performed in the same manner as the processing of S605 to S611 of the flowchart of
Embodiments of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiments and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiments, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiments and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiments. The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2019-084356, filed Apr. 25, 2019, and Japanese Patent Application No. 2020-028123, filed Feb. 21, 2020, which are hereby incorporated by reference herein in their entirety.