Field of the Invention
The present invention relates to a three-dimensional image capturing apparatus using digital holography.
Description of the Related Art
Digital holography is conventionally known as a method of measuring a distribution of a transparent object, mainly a refractive index distribution. Here, a basic technology of the digital holography will be described. First, an incident light beam emitted from a light source with high coherency, such as a laser, is split into an object light beam and a reference light beam by using a beam splitter or the like. Next, a specimen is disposed in the path of the object light beam so that only the object light beam passes through the specimen. Then, the object light beam and the reference light beam are combined, i.e., caused to interfere with each other, by using a beam splitter or the like, and an interference fringe that is generated by the interference between the object light beam and the reference light beam is recorded by a digital sensor. Finally, a distribution of the transparent object, for example a phase distribution, is calculated by a calculator based on the recorded information.
The digital holography mainly includes two methods: an Off-Axis method and a phase shift method. The Off-Axis method can calculate a phase based on a single hologram, but attention must be paid to the pitch of the interference fringe and the like. The phase shift method can fully utilize the NA of an optical system, but it is necessary to photograph at least three holograms in order to calculate the phase. Of these methods, the Off-Axis method, which reduces the number of photographing times, is commonly used when photographing speed is required. U.S. Pat. No. 6,078,392 discloses a digital holography technology based on the Off-Axis method.
Recently, the digital holography has been applied as a method of photographing the transparent object three-dimensionally. As methods (multiplexing methods) of obtaining information relating to a depth direction, i.e., a direction along an optical axis, there are mainly angle multiplexing and wavelength multiplexing. The angle multiplexing is a method of sequentially illuminating the object with object light beams having different incident angles to calculate three-dimensional information based on the information for each incident angle. The wavelength multiplexing is a method of using object light beams with different wavelengths to calculate the three-dimensional information based on the information for each wavelength. In order to obtain a high-quality three-dimensional image, the angle multiplexing of these multiplexing methods is commonly used.
Hereinafter, a method of photographing the transparent object three-dimensionally by using the angle multiplexing will be described. In the digital holography described above, an incident angle of the object light beam with respect to a specimen is changed, and interference fringes for a plurality of angles are recorded. It is possible to change the angle of the object light beam with respect to the specimen by changing an angle of a galvanometer mirror that is disposed on a plane conjugate with the specimen. Finally, the distribution of the transparent object, for example the refractive index distribution, is calculated by a calculator based on the recorded information for each angle. US Patent Application Publication No. 2009/125242 discloses a method of photographing the transparent object three-dimensionally by the angle multiplexing.
In order to obtain a high-quality reproduced image in the method of photographing the transparent object three-dimensionally by the angle multiplexing, a large number of photographing times is required. Specifically, around 100 to 150 photographing times are required, and US Patent Application Publication No. 2009/125242 discloses around 100 photographing times. In order to perform such photography sequentially, a long photographing time is required. In the method of US Patent Application Publication No. 2009/125242, photographing a moving object is achieved with a small number of photographing pixels to reduce the photographing time per shot. However, when the number of photographing pixels is increased to widen the photographing area, which is a common requirement, the number of photographing times becomes a problem to be solved.
If a configuration where a plurality of object light beams are simultaneously incident is adopted so as to reduce the number of photographing times, the object light beams interfere with each other and accordingly the configuration does not work as it is. On the other hand, it is conceivable that the interference of the object light beams is avoided by the wavelength multiplexing. However, the wavelength multiplexing needs light sources whose number corresponds to the number of images for overlap photographing, and thus the system is complicated. Although it is also possible to avoid the interference of the object light beams by polarization multiplexing, the polarization multiplexing has at most two degrees of freedom for the polarization, and accordingly there is an upper limit for the multiplexing. In addition, it is necessary to prepare an element for controlling the polarization of each object light beam, add a polarizer to a sensor, or prepare two sensors, and thus the system is complicated.
The present invention provides an image capturing apparatus and an image capturing method which are capable of acquiring highly-accurate three-dimensional data with a simple configuration by a small number of photographing times.
An image capturing apparatus as one aspect of the present invention is capable of performing three-dimensional tomography of an object by digital holography, and includes a splitting element configured to split a light beam emitted from a light source into an object light beam and a reference light beam, an illumination system configured to control a plurality of object light beams that are generated from the object light beam and that travel in directions different from each other to be incident on the object simultaneously, a composite element configured to cause the plurality of object light beams to interfere with the reference light beam, an image sensor configured to acquire a hologram generated by the interference of each of the plurality of object light beams with the reference light beam, and a controller configured to control the illumination system so that the plurality of object light beams interfere with each other on the image sensor.
An image capturing method as another aspect of the present invention is capable of performing three-dimensional tomography of an object by digital holography, and the method includes the steps of splitting a light beam emitted from a light source into an object light beam and a reference light beam, controlling a plurality of object light beams that are generated from the object light beam and that travel in directions different from each other to be incident on the object simultaneously, combining the plurality of object light beams with the reference light beam, and acquiring a hologram generated by the interference of each of the plurality of object light beams with the reference light beam, and the step of controlling the plurality of object light beams to be incident on the object simultaneously includes controlling the plurality of object light beams to interfere with each other on an image sensor.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Exemplary embodiments of the present invention will be described below with reference to the accompanied drawings. In each of the drawings, the same elements will be denoted by the same reference numerals and the duplicate descriptions thereof will be omitted.
First, referring to
In
Next, referring to
An angle of a beam in an object that is determined depending on an incident angle is defined by an angle with respect to an optical axis and an azimuth angle (θ,φ). The upper limit of θ is determined by NA=n·sin θ, where NA denotes the numerical aperture and n denotes a refractive index of the oil of the object 205. When the NA is 1.4, the upper limit of θ is 62.7 degrees, and accordingly a range of θ is from 0 to 62.7 degrees. A range of φ is from 0 to 360 degrees. In this embodiment, θ is set to 34 degrees with a sufficient margin.
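The upper limit quoted above can be verified directly from the relation NA = n·sin θ; a minimal check, taking n = 1.575 for the oil (the value used for the specimen later in this description):

```python
import math

# Upper limit of theta from NA = n * sin(theta), with NA = 1.4 and the
# oil refractive index n = 1.575 quoted later in this description.
theta_max = math.degrees(math.asin(1.4 / 1.575))
print(round(theta_max, 1))  # about 62.7 degrees
```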
A basic concept of the image capturing apparatus 200 in this embodiment is to reduce the number of photographing times by acquiring a plurality of real image spectra while a plurality of object light beams are simultaneously incident.
Next, each spectrum will be described referring to expressions. In expression (1) below, symbols EO1, EO2, and ER denote electric fields of the first object light beam, the second object light beam, and the reference light beam, respectively. Symbols AO1, AO2 and AR denote amplitudes of the electric fields of the first object light beam, the second object light beam, and the reference light beam, respectively. Symbols ψ1 and ψ2 denote phases of the first object light beam and the second object light beam, respectively. Symbols α and β are values calculated based on the beam angle θ in the object.
EO1(x,y) = AO1(x,y)exp[iψ1(x,y)+2πixα]
EO2(x,y) = AO2(x,y)exp[iψ2(x,y)+2πiyβ]
ER(x,y) = 2AR    (1)
As represented by expression (2) below, the hologram can be calculated as the square of an absolute value of a sum of the three electric fields described above.
The spectrum of the hologram can be calculated by the Fourier transform of expression (2), and it is represented by expression (3) below.
In expression (3), the superscript "-" represents a complex conjugate, and the operator "*" represents the cross-correlation that is represented by expression (5) below. Based on the electric fields represented by expression (1), the spectra of the terms where the influence of the inclination is eliminated are represented by expression (4) below. In expression (4), symbol FT denotes the Fourier transform, and (ξ,η) denotes a position of the spectrum in the frequency space.
ẼO1(ξ,η) = FT[AO1(x,y)exp[iψ1(x,y)]]
ẼO2(ξ,η) = FT[AO2(x,y)exp[iψ2(x,y)]]
ẼR(ξ,η) = FT[ER(x,y)]    (4)
f*g(x,y) = ∫∫ f̄(x′,y′)g(x′+x, y′+y)dx′dy′    (5)
On the right side of expression (3), the first term, the second term, and the third term correspond to the zeroth order light beam 431, the real image spectrum 432 of the first object light beam 411, and the real image spectrum 433 of the second object light beam 412, respectively. The fourth term and the fifth term correspond to the virtual image spectrum 434 of the first object light beam 411 and the virtual image spectrum 435 of the second object light beam 412, respectively. The sixth term and the seventh term correspond to the crosstalk 436 between the virtual image spectrum of the first object light beam 411 and the real image spectrum of the second object light beam 412, and the crosstalk 437 between the real image spectrum of the first object light beam 411 and the virtual image spectrum of the second object light beam 412, respectively.
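The term correspondence above can be checked numerically. The following sketch builds two tilted object beams and a plane reference beam as in expression (1) (flat amplitudes, zero object phase; the grid size and the carrier frequencies α and β are illustrative assumptions, not values from this description), forms the hologram of expression (2), and locates the peaks of its spectrum as in expression (3):

```python
import numpy as np

N = 128
x = np.arange(N)
X, Y = np.meshgrid(x, x)                 # X varies along columns, Y along rows
alpha, beta = 16 / N, 24 / N             # carrier frequencies, cycles per pixel

E_O1 = np.exp(2j * np.pi * X * alpha)    # A_O1 = 1, psi_1 = 0
E_O2 = np.exp(2j * np.pi * Y * beta)     # A_O2 = 1, psi_2 = 0
E_R = 2.0 * np.ones((N, N))              # E_R = 2 A_R with A_R = 1

hologram = np.abs(E_O1 + E_O2 + E_R) ** 2        # expression (2)
spec = np.fft.fftshift(np.fft.fft2(hologram))    # expression (3)

def mag_at(fx, fy):
    """Spectrum magnitude at (fx, fy), given in cycles per pixel."""
    c = N // 2
    return abs(spec[c + int(round(fy * N)), c + int(round(fx * N))])

# Real images at (alpha, 0) and (0, beta); virtual images mirrored through
# the origin; crosstalk at (alpha, -beta) and (-alpha, beta).
for fx, fy in [(alpha, 0), (0, beta), (-alpha, 0), (alpha, -beta)]:
    print((round(fx * N), round(fy * N)), mag_at(fx, fy) > 1.0)
```

With pure carrier tones on integer frequency bins, each term of expression (3) appears as an isolated peak, so the positions can be read off exactly.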
Hereinafter, the crosstalk will be described in detail. Positions of the spectrum of the crosstalk are represented by (ξ6,η6)=(−α,β) for the sixth term and (ξ7,η7)=(α,−β) for the seventh term. The position of ξ is determined depending on the first object light beam 411, and the position of η is determined depending on the second object light beam 412. On the other hand, the positions of the real image spectra of the first object light beam 411 and the second object light beam 412 are represented by (ξ1,η1)=(α,0) and (ξ2,η2)=(0,β), respectively. Accordingly, the positions of the crosstalk for the sixth term and for the seventh term can be described as (ξ6,η6)=(ξ1,−η2) and (ξ7,η7)=(−ξ2,η1), respectively. FIG. 3D is a diagram of illustrating a position vector of each spectrum. The position of the crosstalk can be determined based on a vector sum 444 of a position vector 441 of the real image spectrum of an object light beam and an inverse vector 443 of a position vector 442 of the real image spectrum of other object light beams (i.e., vector sum of the position vector 441 and the inverse vector 443).
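The vector relation above can be checked with a few lines of arithmetic; α and β below are example values, not values from this description:

```python
# Each crosstalk position equals the position vector of one real-image
# spectrum plus the inverse of the position vector of the other.
alpha, beta = 0.2, 0.3
v1 = (alpha, 0.0)                        # real image of the first object beam
v2 = (0.0, beta)                         # real image of the second object beam

def plus_inverse(a, b):
    """a + (-b): vector sum of a position vector and an inverse vector."""
    return (a[0] - b[0], a[1] - b[1])

print(plus_inverse(v1, v2) == (alpha, -beta))   # seventh-term crosstalk
print(plus_inverse(v2, v1) == (-alpha, beta))   # sixth-term crosstalk
```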
This conclusion will be generalized in a case where N object light beams are incident. In this case, the hologram is represented by expression (6) below.
The spectrum of this hologram is obtained by the Fourier transform of expression (6), and it is represented by expression (7) below.
In this embodiment, a region where the real image spectrum has a value is represented by S(ξ,η) (represented by S(f) when the spatial frequency is denoted by f). In other words, a function whose value is 1 when ξ and η satisfy expression (8) below and 0 in other cases is defined as S(ξ,η).
In this case, a region where the virtual image spectrum has a value is represented by S(−ξ,−η) according to the fourth term, and a region where the spectrum of the crosstalk has a value is represented by S(ξ,η)*S(ξ,η) according to the fifth and sixth terms. In other words, it corresponds to the autocorrelation of S(ξ,η). In order to extract the real image spectrum, it is desirable that the real image spectrum is separated from each of the virtual image spectrum and the spectrum of the crosstalk. In other words, it is desirable that a region where both of S(ξ,η) and S(ξ,η)*S(ξ,η) indicate values other than zero does not exist (i.e., a solution that simultaneously satisfies the conditions of |S(f)|>0 and |S(f)*S(f)|>0 (f: spatial frequency, *: cross-correlation) does not exist). In order to achieve this, it is necessary that at least the zeroth order light component of the real image spectrum and the spectrum of the crosstalk occurring due to the component do not overlap with each other. The zeroth order light component of the real image spectrum is the real image spectrum at (ξ1,η1)=(α,0) or (ξ2,η2)=(0,β), and it corresponds to the spectrum of the object light beam that travels in a straight line from the specimen. Accordingly, S(ξ,η) is redefined as the region where the zeroth order light component of the real image spectrum occurs. In other words, S(ξ,η) is defined as represented by expression (9) below.
The object light beam may be incident so that the region where both of S(ξ,η) and S(ξ,η)*S(ξ,η) indicate values other than zero does not exist.
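This non-overlap condition can be sketched numerically. Below, S is modeled as small disks at the zeroth-order real-image positions, and the crosstalk support S*S as disks at the pairwise difference vectors; the radial frequency, disk radius, and azimuth sets are illustrative assumptions:

```python
import numpy as np

def s_overlaps_autocorr(azimuths_deg, rho=1.0, r=0.15):
    """True if S overlaps its own autocorrelation S*S, i.e. if some
    difference vector of two beam positions lands back near a beam position."""
    c = [rho * np.array([np.cos(np.deg2rad(a)), np.sin(np.deg2rad(a))])
         for a in azimuths_deg]
    diffs = [p - q for i, p in enumerate(c) for j, q in enumerate(c) if i != j]
    # a real-image disk (radius r) and a crosstalk disk (radius 2r, the
    # autocorrelation of two r-disks) overlap when centers are closer than 3r
    return any(np.linalg.norm(ci - d) < 3 * r for ci in c for d in diffs)

print(s_overlaps_autocorr([0, 120, 240]))  # 120-degree spacing: no overlap
print(s_overlaps_autocorr([0, 60, 300]))   # 60-degree spacing: overlap
```

In the 60-degree case, the difference of the beams at 0 and 60 degrees lands exactly on the beam at 300 degrees, so such a combination cannot be separated.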
In the above description, the case where all the object light beams interfere with the reference light beam is considered, but this embodiment is not limited thereto. The object light beams and the reference light beams can be prepared by splitting a plurality of light beams that are not coherent with each other, and then they can be controlled to interfere on a sensor surface to photograph an overlaid hologram.
Hereinafter, a case where M light sources that are not coherent with each other exist will be considered. The Nm object light beams obtained by splitting the light beam of the m-th (m: 1 to M) light source and the corresponding reference light beam are denoted by EOmj and ERm, respectively. These are controlled to interfere on the sensor surface to acquire the hologram. In this case, the intensity of the hologram is represented by expression (10) below.
In order to extract the real image spectrum, it is desirable that the real image spectrum is separated from each of the virtual image spectrum, the spectrum of the crosstalk, and the real image spectrum obtained from another light source. The photographed overlaid hologram is represented as a linear combination of the holograms obtained by the respective light sources, and accordingly the above consideration on each term can be adopted independently. In other words, a region where the zeroth order light component of the real image spectrum obtained from the m-th light source occurs is denoted by Sm(ξ,η). In this case, it is desired that a region where both of Sm(ξ,η) and Sm′(ξ,η) (m≠m′), both of Sm(ξ,η) and Sm′(ξ,η)*Sm′(ξ,η), or both of Sm(ξ,η) and Sm′(−ξ,−η) indicate values other than zero does not exist. As a result, the plurality of real image spectra can be extracted. The plurality of light beams that are not coherent with each other can be prepared by various methods. For example, there are light beams with different wavelengths or light beams with polarizations orthogonal to each other.
Next, referring to
In
Subsequently, the specimen 505 (object) is disposed in the paths of the plurality of object light beams (the first object light beam 531 and the second object light beam 532) so that only the plurality of object light beams pass through the specimen 505. In order to increase the angle of the beam with respect to the optical axis in the specimen, the condenser lens 504 and an objective lens 506 are disposed in front of and behind the specimen 505, respectively. A beam splitter 507 (composite element) combines each of the plurality of object light beams (the first object light beam 531 and the second object light beam 532) with the reference light beam 512, that is, it causes each of the plurality of object light beams to interfere with the reference light beam. The digital sensor 508 (image sensor such as a CMOS sensor) records (acquires) an interference fringe (hologram) that is generated by the interference of each of the object light beams with the reference light beam. As described above, the image capturing apparatus 500 changes the angles of the plurality of object light beams with respect to the specimen 505 so that interference fringes for a plurality of angles can be recorded.
The image capturing apparatus 500 does not need a mechanism for preventing the interference of the plurality of object light beams, such as laser light sources whose number corresponds to the number of overlaid images, a unit for changing the wavelength of the incident light, or a polarizer for controlling the polarization. Furthermore, the array of the sensor recording the hologram does not have to be provided with an element that can distinguish the two object light beams, such as a polarizer or a color filter. Accordingly, three-dimensional data can be acquired with a simpler configuration.
A shape of the aperture stop 521 for forming the illumination shape corresponds to the angles (θ,φ) of (34,0) degrees and (34,90) degrees of the beams, and it also corresponds to the real image spectra 432 and 433 of
As described above, in order to extract the real image spectrum, it is desirable that the real image spectrum is separated from each of the virtual image spectrum and the spectrum caused by the crosstalk. In view of the description of the combination of the plurality of object light beams using the autocorrelation, it is desirable that the combination of the object light beams is selected so that the real image spectrum does not overlap with the autocorrelation of the real image spectrum. In more detail, it is necessary to determine the combination of the object light beams so that at least a distance from a center position of each real image spectrum to a center position of each spectrum obtained by the autocorrelation of the real image spectrum is longer than a radius of a simulated aperture stop that extracts the real image spectrum. The simulated aperture stop means an aperture stop which is set by simulation on the computer for extracting the real image spectrum from spectra.
Next, Embodiment 1 of the present invention will be described. In this embodiment, as an example of a configuration that controls a position of crosstalk occurring due to interference of a plurality of object light beams, a configuration that controls a position of a spectrum of the crosstalk to be located at a position different from a position of a real image spectrum of each of the plurality of object light beams will be described. Specifically, a configuration that controls the spectrum of the crosstalk to be kept away from the vicinity of the center of a frequency space where the real image spectrum exists will be described.
In order to keep the spectrum of the crosstalk away from the vicinity of the center of the frequency space where the real image spectrum exists, the angle between the position vector of the real image spectrum of one object light beam and the inverse vector of the position vector of the real image spectrum of another object light beam only needs to be small. In order to achieve this, the angle between the plurality of object light beams only needs to be set to a large angle, for example an azimuth-angle difference φ larger than 90 degrees. However, when the angle is too large, for example 180 degrees, the real image spectrum of one object light beam overlaps with the virtual image spectrum of the other object light beam, and therefore it is difficult to extract the real image spectrum of the object light beam. Accordingly, it is preferred that the range of the azimuth-angle difference φ is larger than 90 degrees and smaller than 150 degrees (90 to 150 degrees).
The combination of the angles of the plurality of object light beams is not only a condition on a pair of object light beams, but also a condition that is applied to more object light beams. For example, when the azimuth angle φ of the first object light beam is set to 0 degrees, a combination where the azimuth angle φ of the second object light beam is set to an angle that is larger than 90 degrees and smaller than 150 degrees and the azimuth angle φ of the third object light beam is set to an angle that is larger than −150 degrees and smaller than −90 degrees is considered. More specifically, for example, 0, 120, and 240 degrees are considered as the azimuth angles φ of the three object light beams. In this case, for any combination of the object light beams, the difference of the azimuth angles φ is larger than 90 degrees and smaller than 150 degrees.
As described above, in this embodiment, the angle between the position vector of the real image spectrum of one object light beam and the position vector of the real image spectrum of another object light beam is set to an angle that is larger than 90 degrees and smaller than 150 degrees. As a result, the spectrum of the crosstalk can be kept away from the area of the real image spectrum. Furthermore, the plurality of real image spectra do not overlap with each other, and the real image spectrum and the virtual image spectrum do not overlap with each other. The term "do not overlap with each other" means that the main components of the respective spectra do not overlap with each other; it is sufficient that at least the zeroth order light components of the respective real image spectra do not overlap with each other.
In order to obtain a high-quality three-dimensional reproduced image, a large number of object light beams are required. In this embodiment, as an example, it is assumed that 120 object light beams are required. When the plurality of object light beams are not simultaneously incident, as a typical example, the azimuth angle φ may be set to 0 degree for 1st photography, 3 degrees for 2nd photography, . . . , and 357 degrees for 120th photography. When the combination of the object light beams described in this embodiment is used, the angle may be set to 0, 120, and 240 degrees for 1st photography, 3, 123, and 243 degrees for 2nd photography, . . . , and 117, 237, and 357 degrees for 40th photography. While a conventional apparatus needs photography (tomography) 120 times when applying the 120 object light beams, this embodiment can reduce the number of photographing times (image capturing times) to 40 times.
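The shot schedule described above (three beams per exposure, 120 degrees apart, stepped by 3 degrees) can be generated mechanically; the function name and parameterization below are ours:

```python
def schedule(n_total=120, per_shot=3, step_deg=3):
    """Azimuth angles (degrees) of the simultaneously incident beams for
    each exposure: per_shot beams offset by 360/per_shot degrees, with the
    whole group stepped by step_deg degrees between exposures."""
    offset = 360 // per_shot                      # 120-degree spacing per shot
    shots = n_total // per_shot
    return [[(s * step_deg + k * offset) % 360 for k in range(per_shot)]
            for s in range(shots)]

shots = schedule()
print(len(shots), shots[0], shots[1], shots[-1])
```

This reproduces the sequence quoted above: 40 exposures, starting with (0, 120, 240) degrees and ending with (117, 237, 357) degrees.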
For the 1st photography, the distribution of
Next, a method of calculating the reproduced image will be described. Based on the spectrum of the hologram that is obtained by the 1st photography illustrated in
In this embodiment, the reference light beam is perpendicular to the sensor. Alternatively, the reference light beam may be incident obliquely on the sensor. When the reference light beam is inclined with respect to the sensor, the spectrum of the crosstalk does not move although the real image spectrum is kept away from the origin of the frequency space, and accordingly it is possible to apply a larger number of object light beams simultaneously. Hereinafter, this will be described referring to expressions.
When the reference light beam is incident obliquely with respect to the sensor, an electric field ER of the reference light beam is represented by expression (11) below.
ER(x,y) = 2ARexp[2πixαR]    (11)
For a simple description, it is assumed that the reference light beam is inclined with respect to an x direction. When the spectrum of the hologram is calculated by using the electric field ER of expression (11), expression (12) below is obtained.
According to the second term and the third term of expression (12), it is understood that the real image spectrum is shifted by ξ=−αR due to the inclination of the reference light beam. On the other hand, according to the sixth term and the seventh term of expression (12), the spectrum of the crosstalk is not changed by the inclination of the reference light beam.
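The behavior read off from expression (12) can be tabulated as a small sketch (α, β, and the reference tilt αR below are example values, not values from this description):

```python
def term_positions(alpha, beta, alpha_r):
    """Peak positions (xi, eta) of the terms of expression (12):
    real images shift by -alpha_r in xi, crosstalk does not depend on it."""
    return {
        "real_1": (alpha - alpha_r, 0.0),               # second term
        "real_2": (0.0 - alpha_r, beta),                # third term
        "crosstalk": ((-alpha, beta), (alpha, -beta)),  # sixth and seventh terms
    }

p_straight = term_positions(0.2, 0.3, 0.0)
p_tilted = term_positions(0.2, 0.3, 0.1)
print(p_straight["crosstalk"] == p_tilted["crosstalk"])  # unchanged by tilt
print(p_straight["real_1"], p_tilted["real_1"])          # shifted by -alpha_r
```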
Next, Embodiment 2 of the present invention will be described. In this embodiment, as an example of a configuration that controls a position of crosstalk occurring due to interference of a plurality of object light beams, a configuration that controls spectra of crosstalks to be collected into a predetermined region (partial region in a frequency space) will be described.
In order to collect the spectra of the crosstalk in a partial region of the frequency space, the angle between the position vector of the real image spectrum of one object light beam and the inverse vector of the position vector of the real image spectrum of another object light beam only has to be large. In order to achieve this, the angle between the plurality of object light beams may be small, and the azimuth angles φ may be within a certain range.
Referring to
While the case where the two object light beams are simultaneously incident is described in this embodiment, more object light beams can be simultaneously incident as long as the center of an adjacent real image spectrum is not included in the simulated aperture stop. When a real image spectrum is close to the spectrum of the crosstalk, the diameter of the simulated aperture stop which extracts the real image spectrum closest to the crosstalk can be set to be smaller than the diameter of the simulated aperture stop which extracts another real image spectrum.
In order to obtain a high-quality three-dimensional reproduced image, a large number of object light beams are required. In this embodiment, as an example, it is assumed that 120 object light beams are required. When the plurality of object light beams are not simultaneously incident, as a typical example, the azimuth angle φ may be set to 0 degree for 1st photography, 3 degrees for 2nd photography, . . . , and 357 degrees for 120th photography. When the combination of the object light beams described in this embodiment is used, the angles may be set as follows:
for 1st photography, 0, 30, 60, and 90 degrees,
for 2nd photography, 3, 33, 63, and 93 degrees, . . . ,
for 10th photography, 27, 57, 87, and 117 degrees,
for 11th photography, 120, 150, 180, and 210 degrees,
for 12th photography, 123, 153, 183, and 213 degrees, . . . ,
for 20th photography, 147, 177, 207, and 237 degrees,
for 21st photography, 240, 270, 300, and 330 degrees,
for 22nd photography, 243, 273, 303, and 333 degrees, . . . ,
for 30th photography, 267, 297, 327, and 357 degrees.
While a conventional apparatus needs photography 120 times when applying the 120 object light beams, this embodiment can reduce the number of photographing times to 30 times.
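The 30-exposure schedule listed above (four beams 30 degrees apart per exposure, stepped by 3 degrees within each 120-degree sector) can be generated as follows; the function name is ours:

```python
def schedule_e2():
    """Azimuth angles (degrees) for each exposure: four beams 30 degrees
    apart, stepped by 3 degrees over 10 exposures per 120-degree sector,
    three sectors covering all 120 azimuths in 30 exposures."""
    return [[(sector * 120 + step * 3 + k * 30) % 360 for k in range(4)]
            for sector in range(3) for step in range(10)]

shots = schedule_e2()
print(len(shots), shots[0], shots[10], shots[29])
```

All 120 azimuths (0 to 357 degrees in 3-degree steps) are covered exactly once.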
Next, Embodiment 3 of the present invention will be described. While Embodiment 2 describes the configuration of the image capturing apparatus using the plurality of object light beams with a single wavelength, this embodiment can acquire a number of wavefronts at the same time by overlap photography (overlap tomography) of the hologram by using the plurality of object light beams with different wavelengths or with different polarizations.
In this embodiment, the plurality of object light beams include a plurality of object light beams with different wavelengths from each other, which are split from a plurality of light beams emitted from a plurality of light sources (a first light source and a second light source). The controller controls the plurality of object light beams with different wavelengths so as not to interfere with each other. In this embodiment, a case where overlap photography (overlap tomography) of holograms is performed by using light sources with different wavelengths of 543 nm and 553 nm (a plurality of light sources with different wavelengths) will be described. A specimen is polystyrene beads having a refractive index of 1.587 and a diameter of 10 μm that are immersed in oil having a refractive index of 1.575.
In this embodiment, the first object light beams and the second object light beams are incident in directions symmetric with respect to the optical axis. The direction symmetric with respect to the optical axis corresponds to replacing α and β with −α and −β in expression (1), respectively. In this case, the fifth term of expression (7) that represents the spectrum of the crosstalk is changed as represented by expression (13) below.
This means that the spectrum of the crosstalk occurs at the position of (αj−αj′, βj−βj′), which is the same as the position of the spectrum of the crosstalk that is represented by the seventh term of expression (7). In other words, if the first object light beams and the second object light beams are incident in the directions symmetric with respect to the optical axis, it is possible to collect the spectra of the crosstalk at the same positions. By adopting this arrangement, the region that the spectra of the crosstalk occupy can be minimized. The first object light beams and the second object light beams do not necessarily have to be incident in strictly symmetric directions; the incident directions can be determined within a range where the real image spectrum does not substantially overlap with the spectrum of the crosstalk.
In order to obtain a high-quality three-dimensional reproduced image, similarly to Embodiment 2, 120 object light beams are required. When the combination of the object light beams described in this embodiment is used, the azimuth angle of each object light beam may be set as follows:
for 1st photography, 0, 30, and 60 degrees with respect to the first object light beams, and 180, 210, and 240 degrees with respect to the second object light beams,
for 2nd photography, 3, 33, and 63 degrees with respect to the first object light beams, and 183, 213, and 243 degrees with respect to the second object light beams, . . . ,
for 10th photography, 27, 57, and 87 degrees with respect to the first object light beams, and 207, 237, and 267 degrees with respect to the second object light beams,
for 11th photography, 270, 300, and 330 degrees with respect to the first object light beams, and 90, 120, and 150 degrees with respect to the second object light beams,
for 12th photography, 273, 303, and 333 degrees with respect to the first object light beams, and 93, 123, and 153 degrees with respect to the second object light beams, . . . ,
for 20th photography, 297, 327, and 357 degrees with respect to the first object light beams, and 117, 147, and 177 degrees with respect to the second object light beams.
While a conventional apparatus needs photography 120 times for acquiring the 120 object light beams, this embodiment can reduce the number of photographing times to 20 times.
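The 20-exposure schedule listed above, with the second-wavelength beams incident in directions symmetric (180 degrees apart) with respect to the first-wavelength beams, can be generated as follows; the function name is ours:

```python
def schedule_e3():
    """Per exposure: azimuths (degrees) of the three first-wavelength beams
    and the three second-wavelength beams, the latter incident in directions
    symmetric with respect to the optical axis (shifted by 180 degrees)."""
    shots = []
    for base in (0, 270):                 # the two 10-exposure blocks above
        for step in range(10):
            first = [(base + step * 3 + k * 30) % 360 for k in range(3)]
            second = [(a + 180) % 360 for a in first]
            shots.append((first, second))
    return shots

shots = schedule_e3()
print(len(shots), shots[0], shots[10])
```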
In this embodiment, it is preferred that the azimuth angle of the second object light beams is set within a range from +90 degrees to +270 degrees with respect to the azimuth angle of the second reference light beam. In other words, the reference light beam and the plurality of object light beams that are obtained based on a light beam emitted from at least one of the plurality of light sources are incident on the digital sensor (image sensor) at azimuth angles φR and φ, respectively, and it is preferred that the azimuth angles φR and φ satisfy a condition of φR+90°≦φ≦φR+270°. In this range, the second reference light beam and the second object light beams are incident on the sensor in directions roughly symmetric with respect to the optical axis, and accordingly the real image spectrum of the second object light beams can be kept away from the origin. As a result, the overlap with the real image spectrum, the virtual image spectrum, and the spectrum of the crosstalk can be avoided.
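The preferred condition φR+90° ≦ φ ≦ φR+270° can be expressed as a small predicate (the function name is ours; azimuths are compared modulo 360 degrees):

```python
def azimuth_ok(phi_obj_deg, phi_ref_deg):
    """True when the object-beam azimuth lies between +90 and +270 degrees
    relative to the reference-beam azimuth, modulo 360 degrees."""
    d = (phi_obj_deg - phi_ref_deg) % 360
    return 90 <= d <= 270

print(azimuth_ok(180, 0), azimuth_ok(30, 0))
```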
In this embodiment, the second reference light beam is obliquely incident on the sensor. In this case, the real image spectrum of the second object light beams moves to a region different from the position indicated by the solid-line arrow 721 in the drawing. Referring to the drawing, a condition for avoiding the overlap of the spectra will be described.
When the condition represented by expression (14) is satisfied, the overlap of the spectrum of the crosstalk and the real image spectrum can be avoided.
Furthermore, there is a condition relating to an angle Δφ1 between the position vectors of the real image spectra of the first object light beams (i.e., the (maximum) difference between the azimuth angles of the plurality of object light beams split from a light beam emitted from the first light source) and an angle Δφ2 between the position vectors of the real image spectra of the second object light beams (i.e., the (maximum) difference between the azimuth angles of the plurality of object light beams split from a light beam emitted from the second light source). This condition will be described with reference to the drawing.
|Δφ1|≦240°−|Δφ2| (15)
When the angles Δφ1 and Δφ2 are determined so as to satisfy expression (15), the overlap of the spectrum of the crosstalk and the real image spectrum can be avoided.
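Expression (15) can be verified numerically. In the sketch below (function name is illustrative), the spreads of 60 degrees correspond to the three beams per light source at 0, 30, and 60 degrees used in this embodiment:

```python
def satisfies_expr_15(delta_phi1, delta_phi2):
    """Expression (15): |Delta phi 1| <= 240 deg - |Delta phi 2|."""
    return abs(delta_phi1) <= 240 - abs(delta_phi2)

# Three beams per source at 0, 30, 60 degrees give a 60-degree spread each:
print(satisfies_expr_15(60, 60))    # True: 60 <= 240 - 60
# Wider spreads can violate the condition:
print(satisfies_expr_15(150, 120))  # False: 150 > 240 - 120
```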
In this embodiment, functions that are not zero at a position of a spatial frequency f from the zeroth order light component for the m-th, m′-th, and j-th real image spectra of the plurality of object light beams are denoted by Sm(f), Sm′(f), and Sj(f), and cross-correlation is denoted by *. Then, conditional expressions (A): |Sm(f)|>0, (B): |Sm′(f)|>0 (m′=1 to M, m≠m′), (C): |Sm′(f)*Sj(f)|>0 (m′=1 to M, j=1 to M), and (D): |Sm′(−f)|>0 (m′=1 to M) are considered. In this case, it is preferred that no solution exists that simultaneously satisfies the pair of expressions (A) and (B), the pair of expressions (A) and (C), or the pair of expressions (A) and (D).
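One rough way to test conditions (A) through (D) numerically is to model each spectrum support as a disk in the frequency plane. This is an assumption for illustration only (the actual supports depend on the object and the optics): a real image spectrum is placed at its carrier frequency with radius r, a virtual image spectrum at the sign-flipped carrier, and a crosstalk term, being the cross-correlation of two disk supports, at the difference of two carriers with radius 2r.

```python
import math

def disks_overlap(c1, r1, c2, r2):
    """True if two disks in the frequency plane overlap."""
    return math.hypot(c1[0] - c2[0], c1[1] - c2[1]) < r1 + r2

def spectra_separated(centers, r):
    """Under the disk-support model, check that no real image spectrum m
    overlaps (B) another real image spectrum, (C) any crosstalk support at
    c_m' - c_j with radius 2r, or (D) any virtual image support at -c_m'."""
    M = len(centers)
    for m in range(M):
        for mp in range(M):
            if mp != m and disks_overlap(centers[m], r, centers[mp], r):
                return False  # (A) and (B) hold simultaneously
            for j in range(M):
                cross = (centers[mp][0] - centers[j][0],
                         centers[mp][1] - centers[j][1])
                if disks_overlap(centers[m], r, cross, 2 * r):
                    return False  # (A) and (C) hold simultaneously
            virt = (-centers[mp][0], -centers[mp][1])
            if disks_overlap(centers[m], r, virt, r):
                return False  # (A) and (D) hold simultaneously
    return True

# A carrier far from the origin stays clear of the crosstalk support there...
print(spectra_separated([(4.0, 0.0)], 1.0))  # True
# ...but a carrier too close to the origin collides with it.
print(spectra_separated([(2.5, 0.0)], 1.0))  # False
```

Note that the j=m′ crosstalk term is a disk of radius 2r centered at the origin, which is why each carrier must lie at least 3r from the origin in this model, consistent with keeping the real image spectra away from the origin as described above.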
While this embodiment uses light beams with a plurality of wavelengths, it is not limited thereto. As long as a plurality of light beams that do not interfere with each other is used, it is possible to obtain an effect similar to this embodiment. For example, two light beams with linear polarization where polarization directions are orthogonal to each other may be used.
In each embodiment, as the angles of the object light beams, a combination of a plurality of azimuth angles φ with the angle θ with respect to the optical axis fixed is described; alternatively, a combination of a plurality of angles θ with respect to the optical axis with the azimuth angle φ fixed may be used. Combinations of both the angles θ with respect to the optical axis and the azimuth angles φ can also be used.
According to each embodiment, an image capturing apparatus and an image capturing method capable of acquiring highly accurate three-dimensional data with a simple configuration and a small number of photographing times can be provided.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2016-002702, filed on Jan. 8, 2016, which is hereby incorporated by reference herein in its entirety.