The present invention relates to an imaging apparatus.
WO2014/020791A discloses a technology that uses a polarization color filter plate having three light transmission regions with different polarization characteristics and color characteristics (wavelength selectivity), and an imaging element comprising three types of polarization filters with different polarization characteristics, to capture an image of three wavelength ranges (a so-called multispectral image of three bands).
One embodiment according to the technology of the present disclosure provides an imaging apparatus that captures a multispectral image of four bands or more.
(1) An imaging apparatus comprising an imaging optical system including a first optical element having a plurality of aperture regions of which at least one transmits light beams of a plurality of types of wavelength ranges, and a second optical element that polarizes the light beams transmitted through the first optical element in a plurality of directions, in a vicinity of a pupil of the imaging optical system, an image sensor that receives the light beams transmitted through the imaging optical system, the image sensor receiving light beams transmitted through a plurality of types of third optical elements having different transmission wavelength ranges and a plurality of types of fourth optical elements having different transmission polarization directions, and a signal processing unit that processes signals output from the image sensor to generate a plurality of image signals, in which the number of transmission wavelength ranges of at least one of the aperture regions of the first optical element is equal to or less than the number of transmission wavelength ranges of the third optical element.
(2) The imaging apparatus according to (1) above, in which all of the aperture regions of the first optical element transmit light beams of the plurality of types of wavelength ranges.
(3) The imaging apparatus according to (1) above, in which at least one of the aperture regions of the first optical element transmits a light beam of one type of a wavelength range.
(4) The imaging apparatus according to any one of (1) to (3) above, in which the second optical element allows a part of the light beams transmitted through the first optical element to pass as an unpolarized light beam.
(5) The imaging apparatus according to (1) or (2) above, in which the number of aperture regions provided in the first optical element is equal to or less than the number of transmission polarization directions of the fourth optical element, and all of the aperture regions have the same number of transmission wavelength ranges as the number of transmission wavelength ranges of the third optical element.
(6) The imaging apparatus according to any one of (1) to (4) above, in which at least one of the aperture regions of the first optical element has a transmission wavelength range that partially overlaps with a plurality of the wavelength ranges of the third optical element.
(7) The imaging apparatus according to any one of (1) to (6) above, in which in a case in which the number of transmission wavelength ranges of the imaging optical system is defined as k, the number of transmission wavelength ranges of the third optical element is defined as m, and the number of transmission polarization directions of the fourth optical element is defined as n, the imaging optical system and the image sensor satisfy a relationship of k≤m×n.
(8) The imaging apparatus according to any one of (1) to (7) above, in which at least one set of the aperture regions of the first optical element transmits light beams of the same wavelength ranges.
(9) The imaging apparatus according to any one of (1) to (8) above, in which at least one of the aperture regions of the first optical element is disposed on an optical axis.
(10) The imaging apparatus according to any one of (1) to (9) above, in which the signal processing unit performs interference removal processing to generate the plurality of image signals.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
[Configuration of Imaging Apparatus]
The imaging apparatus according to the present embodiment is an imaging apparatus that captures a multispectral image of four bands. The imaging apparatus according to the present embodiment mainly comprises an imaging optical system 10, an image sensor 100, and a signal processing unit 200.
[Imaging Optical System]
The imaging optical system 10 is composed of a combination of a plurality of lenses 12. The imaging optical system 10 includes a bandpass filter unit 16 and a polarization filter unit 18 in a vicinity of a pupil thereof. In addition, the imaging optical system 10 includes a focus adjustment mechanism (not shown). The focus adjustment mechanism adjusts a focus by, for example, moving the entire imaging optical system 10 back and forth along an optical axis L.
The bandpass filter unit 16 is an example of a first optical element. The bandpass filter unit 16 is configured by a frame 16A comprising two aperture regions 16A1 and 16A2, and two bandpass filters 16B1 and 16B2 provided in the frame 16A. Note that in the following, if necessary, the two aperture regions 16A1 and 16A2 are distinguished from each other by referring to one aperture region 16A1 provided in the frame 16A as a first aperture region 16A1 and referring to the other aperture region 16A2 as a second aperture region 16A2. In addition, the two bandpass filters 16B1 and 16B2 are distinguished from each other by referring to the bandpass filter 16B1 provided in the first aperture region 16A1 as a first bandpass filter 16B1 and referring to the bandpass filter 16B2 provided in the second aperture region 16A2 as a second bandpass filter 16B2.
The two aperture regions 16A1 and 16A2 provided in the frame 16A have a circular aperture shape and are disposed symmetrically with the optical axis L interposed therebetween. The frame 16A has a light shielding property and allows light beams to pass through only the two aperture regions 16A1 and 16A2.
Each of the first bandpass filter 16B1 and the second bandpass filter 16B2 is configured by a so-called multi-bandpass filter, and transmits the light beams of a plurality of types of the wavelength ranges. In the imaging apparatus according to the present embodiment, each of the first bandpass filter 16B1 and the second bandpass filter 16B2 transmits the light beams of two types of the wavelength ranges. Note that the first bandpass filter 16B1 and the second bandpass filter 16B2 transmit the light beams of the wavelength ranges different from each other.
The first bandpass filter 16B1 transmits the light beams of the wavelength ranges λ11 and λ12, and the second bandpass filter 16B2 transmits the light beams of the wavelength ranges λ21 and λ22.
The polarization filter unit 18 is an example of a second optical element. The polarization filter unit 18 polarizes the light beams transmitted through the bandpass filter unit 16 in a plurality of directions. The polarization filter unit 18 is configured by a frame 18A comprising two aperture regions 18A1 and 18A2, and one polarization filter 18B2 provided in one aperture region 18A2 of the frame 18A. Note that in the following, if necessary, the two aperture regions 18A1 and 18A2 are distinguished from each other by referring to one aperture region 18A1 provided in the frame 18A as a first aperture region 18A1 and referring to the other aperture region 18A2 as a second aperture region 18A2. The polarization filter 18B2 is provided in the second aperture region 18A2.
The frame 18A has a light shielding property and allows the light beams to pass through only the two aperture regions 18A1 and 18A2. The two aperture regions 18A1 and 18A2 correspond to the two aperture regions 16A1 and 16A2 of the bandpass filter unit 16 and are disposed so as to overlap with the aperture regions 16A1 and 16A2 at the same position. That is, the first aperture region 18A1 has the same aperture shape (circular shape) as the first aperture region 16A1 of the bandpass filter unit 16 and is disposed so as to overlap with the first aperture region 16A1 at the same position. In addition, the second aperture region 18A2 has the same aperture shape as the second aperture region 16A2 of the bandpass filter unit 16 and is disposed so as to overlap with the second aperture region 16A2 at the same position. Therefore, the light beam, which passes through the first aperture region 16A1 of the bandpass filter unit 16, passes through the first aperture region 18A1 of the polarization filter unit 18, and the light beam, which passes through the second aperture region 16A2 of the bandpass filter unit 16, passes through the second aperture region 18A2 of the polarization filter unit 18.
The polarization filter 18B2 provided in the second aperture region 18A2 transmits the light beam of a polarization direction θ (for example, an azimuthal angle of 60°). The polarization direction (polarization azimuth) is represented by an angle (azimuthal angle) formed by a polarization transmission axis with an x-axis (horizontal axis) in an xy plane orthogonal to the optical axis L.
The polarization filter is provided only in the second aperture region 18A2. Therefore, the first aperture region 18A1 transmits an unpolarized light beam.
In the imaging optical system 10 having the configuration described above, a pupil region is split into two regions by the bandpass filter unit 16 and the polarization filter unit 18. That is, the pupil region is split into a first pupil region defined by the first aperture region 16A1 of the bandpass filter unit 16 and the first aperture region 18A1 of the polarization filter unit 18, and a second pupil region defined by the second aperture region 16A2 of the bandpass filter unit 16 and the second aperture region 18A2 of the polarization filter unit 18. The light beams having different characteristics are emitted from the pupil regions. That is, the light beam, which is the unpolarized light beam, having the wavelength range λ11 (first light beam), light beam, which is the unpolarized light beam, having the wavelength range λ12 (second light beam), the light beam, which is the light beam of the polarization direction θ, having the wavelength range λ21 (third light beam), and the light beam, which is the light beam of the polarization direction θ, having the wavelength range λ22 (fourth light beam) are emitted. The first light beam and the second light beam are the light beams which pass through the first aperture region 16A1 of the bandpass filter unit 16 and the first aperture region 18A1 of the polarization filter unit 18. The third light beam and the fourth light beam are the light beams which pass through the second aperture region 16A2 of the bandpass filter unit 16 and the second aperture region 18A2 of the polarization filter unit 18.
[Image Sensor]
In the image sensor 100 according to the present embodiment, one pixel block PB (X, Y) is configured by four (2×2) adjacent pixels P1 to P4, and the pixel blocks PB (X, Y) are regularly arranged along the horizontal direction (x-axis direction) and the vertical direction (y-axis direction). Hereinafter, if necessary, the pixels P1 to P4 are distinguished from each other by referring to the pixel P1 as a first pixel P1, to the pixel P2 as a second pixel P2, to the pixel P3 as a third pixel P3, and to the pixel P4 as a fourth pixel P4. The pixels P1 to P4 have different optical characteristics.
The image sensor 100 includes a pixel array layer 110, a polarization filter element array layer 120, a spectral filter element array layer 130, and a micro lens array layer 140. The layers are disposed in the order of the pixel array layer 110, the polarization filter element array layer 120, the spectral filter element array layer 130, and the micro lens array layer 140 from an image plane side to an object side.
The pixel array layer 110 is configured by two-dimensionally arranging a large number of photodiodes 112. One photodiode 112 configures one pixel. The photodiodes 112 are regularly arranged along the horizontal direction (x-axis direction) and the vertical direction (y-axis direction).
The polarization filter element array layer 120 is configured by two-dimensionally arranging two types of the polarization filter elements 122A and 122B having different transmission polarization directions (polarization directions of the transmitted light beams). Hereinafter, if necessary, the polarization filter elements 122A and 122B are distinguished from each other by referring to the polarization filter element 122A as a first polarization filter element 122A and referring to the polarization filter element 122B as a second polarization filter element 122B. The polarization filter elements 122A and 122B are arranged at the same intervals as the photodiodes 112, one for each pixel. The first polarization filter element 122A transmits the light beam of a first polarization direction θ1 (for example, the azimuthal angle of 90°). The second polarization filter element 122B transmits the light beam of a second polarization direction θ2 (for example, the azimuthal angle of 0°), which is different from the first polarization direction θ1. The polarization filter elements 122A and 122B are examples of a fourth optical element.
The polarization filter elements 122A and 122B are regularly arranged in each pixel block PB (X, Y).
The spectral filter element array layer 130 is configured by two-dimensionally arranging two types of the spectral filter elements 132A and 132B having different transmission wavelength characteristics from each other. Hereinafter, if necessary, the spectral filter elements 132A and 132B are distinguished from each other by referring to the spectral filter element 132A as a first spectral filter element 132A and referring to the spectral filter element 132B as a second spectral filter element 132B. The spectral filter elements 132A and 132B are arranged at the same intervals as the photodiodes 112, one for each pixel. The spectral filter elements 132A and 132B are examples of a third optical element.
The micro lens array layer 140 is configured by two-dimensionally arranging a large number of micro lenses 142. The micro lenses 142 are arranged at the same intervals as the photodiodes 112, one for each pixel. The micro lenses 142 are provided to efficiently condense the light beams from the imaging optical system 10 on the photodiodes 112.
In the image sensor 100 configured as described above, in each pixel block PB (X, Y), each of the pixels P1 to P4 receives the light beams from the imaging optical system 10 as follows. That is, the first pixel P1 receives the light beams from the imaging optical system 10 via the first spectral filter element 132A (wavelength range λA) and the first polarization filter element 122A (polarization direction θ1). In addition, the second pixel P2 receives the light beams from the imaging optical system 10 via the first spectral filter element 132A (wavelength range λA) and the second polarization filter element 122B (polarization direction θ2). In addition, the third pixel P3 receives the light beams from the imaging optical system 10 via the second spectral filter element 132B (wavelength range λB) and the second polarization filter element 122B (polarization direction θ2). In addition, the fourth pixel P4 receives the light beams from the imaging optical system 10 via the second spectral filter element 132B (wavelength range λB) and the first polarization filter element 122A (polarization direction θ1). In this way, each of the pixels P1 to P4 of the pixel block PB (X, Y) has different combinations of the spectral filter elements 132A and 132B and the polarization filter elements 122A and 122B, so that the light beams of different wavelength ranges and the polarization characteristics can be received.
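The pixel-to-filter mapping described above can be pictured with the following minimal sketch. It is illustrative only: the spatial positions of the pixels P1 to P4 within the 2×2 block, the function name, and the array handling are assumptions made for this example and are not taken from the embodiment.

```python
# Illustrative sketch only. The filter combinations restate the description above,
# but the positions of P1..P4 inside the 2x2 block are an assumption.
import numpy as np

# (spectral filter element, polarization filter element) assigned to each pixel of a block
PIXEL_FILTERS = {
    "P1": ("132A (wavelength range lambda_A)", "122A (polarization direction theta_1)"),
    "P2": ("132A (wavelength range lambda_A)", "122B (polarization direction theta_2)"),
    "P3": ("132B (wavelength range lambda_B)", "122B (polarization direction theta_2)"),
    "P4": ("132B (wavelength range lambda_B)", "122A (polarization direction theta_1)"),
}

def split_2x2_blocks(raw):
    """Separate a mosaicked sensor frame into four sub-images, one per pixel
    position of the 2x2 block (assumed layout: P1 top-left, P2 top-right,
    P4 bottom-left, P3 bottom-right)."""
    raw = np.asarray(raw)
    return {
        "P1": raw[0::2, 0::2],
        "P2": raw[0::2, 1::2],
        "P3": raw[1::2, 1::2],
        "P4": raw[1::2, 0::2],
    }
```

These four sub-images correspond to the image signals D1 to D4 that the signal processing unit separates and extracts before the interference removal described below.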
[Signal Processing Unit]
The signal processing unit 200 processes signals output from the image sensor 100 to generate image signals (image data) of the multispectral image of four bands. That is, the image signals of four types of the wavelength ranges λ11, λ12, λ21, and λ22 transmitted through the bandpass filter unit 16 of the imaging optical system 10 are generated.
The signal processing unit 200 comprises an analog signal processing unit 200A, an image generation unit 200B, and a coefficient storage unit 200C.
The analog signal processing unit 200A takes in an analog pixel signal output from each pixel of the image sensor 100, performs predetermined signal processing (for example, correlated double sampling processing, amplification processing, and the like), converts the processed pixel signal into a digital signal, and outputs the converted digital signal.
The image generation unit 200B performs predetermined signal processing on the pixel signal after being converted into the digital signal to generate an image signal of each of the wavelength ranges λ11, λ12, λ21, and λ22.
Each pixel block PB (X, Y) includes the first pixel P1, the second pixel P2, the third pixel P3, and the fourth pixel P4. Therefore, by separating and extracting the pixel signals of the first pixel P1, the second pixel P2, the third pixel P3, and the fourth pixel P4 from each pixel block PB (X, Y), four image signals D1 to D4 are generated. However, interference (crosstalk) occurs in these four image signals. That is, since the light beam of each wavelength range is incident on each of the pixels P1 to P4, each generated image is an image in which the images of the wavelength ranges are mixed at a predetermined ratio. Therefore, the image generation unit 200B performs interference removal processing to generate the image signal of each wavelength range.
Hereinafter, the interference removal processing performed by the signal processing unit 200 will be described.
In each pixel block PB (X, Y), the pixel signal (signal value) obtained from the first pixel P1 is referred to as α1, the pixel signal obtained from the second pixel P2 is referred to as α2, the pixel signal obtained from the third pixel P3 is referred to as α3, and the pixel signal obtained from the fourth pixel P4 is referred to as α4. From each pixel block PB (X, Y), the four pixel signals α1 to α4 can be obtained. The image generation unit 200B calculates four pixel signals β1 to β4 corresponding to the light beams of the wavelength ranges λ11, λ12, λ21, and λ22 from the four pixel signals α1 to α4, and removes the interference. Specifically, the image generation unit 200B calculates the four pixel signals β1 to β4 corresponding to the light beams of the wavelength ranges λ11, λ12, λ21, and λ22 by Equation 1 using the matrix A, and removes the interference.
Note that the pixel signal β1 is the pixel signal corresponding to the light beam of the wavelength range λ11, the pixel signal β2 is the pixel signal corresponding to the light beam of the wavelength range λ12, the pixel signal β3 is the pixel signal corresponding to the light beam of the wavelength range λ21, and the pixel signal β4 is the pixel signal corresponding to the light beam of the wavelength range λ22. Therefore, the image signal of the wavelength range λ11 is generated from the pixel signal β1, the image signal of the wavelength range λ12 is generated from the pixel signal β2, the image signal of the wavelength range λ21 is generated from the pixel signal β3, and the image signal of the wavelength range λ22 is generated from the pixel signal β4. Hereinafter, the reason why the interference can be removed by Equation 1 will be described.
The interference occurs by the light beam of each of the wavelength ranges λ11, λ12, λ21, and λ22 mixed into each of the pixels P1 to P4. A ratio (interference ratio) at which the light beam of each of the wavelength ranges λ11, λ12, λ21, and λ22 emitted from the imaging optical system 10 is received by each of the pixels P1 to P4 is bij (i=1 to 4, j=1 to 4). Here, b11 is a ratio of the light beam of the wavelength range λ11 received by the first pixel P1, b12 is a ratio of the light beam of the wavelength range λ12 received by the first pixel P1, b13 is a ratio of the light beam of the wavelength range λ21 received by the first pixel P1, and b14 is a ratio of the light beam of the wavelength range λ22 received by the first pixel P1. In addition, b21 is a ratio of the light beam of the wavelength range λ11 received by the second pixel P2, b22 is a ratio of the light beam of the wavelength range λ12 received by the second pixel P2, b23 is a ratio of the light beam of the wavelength range λ21 received by the second pixel P2, and b24 is a ratio of the light beam of the wavelength range λ22 received by the second pixel P2. In addition, b31 is a ratio of the light beam of the wavelength range λ11 received by the third pixel P3, b32 is a ratio of the light beam of the wavelength range λ12 received by the third pixel P3, b33 is a ratio of the light beam of the wavelength range λ21 received by the third pixel P3, and b34 is a ratio of the light beam of the wavelength range λ22 received by the third pixel P3. In addition, b41 is a ratio of the light beam of the wavelength range λ11 received by the fourth pixel P4, b42 is a ratio of the light beam of the wavelength range λ12 received by the fourth pixel P4, b43 is a ratio of the light beam of the wavelength range λ21 received by the fourth pixel P4, and b44 is a ratio of the light beam of the wavelength range λ22 received by the fourth pixel P4. The ratio bij is uniquely determined by the transmission wavelength range and the transmission polarization direction (including the unpolarized light beam) set in the imaging optical system 10, and the transmission wavelength range and the transmission polarization direction set in each pixel of the image sensor 100. That is, the ratio bij is uniquely determined by a combination of the transmission wavelength range set in each of the aperture regions 16A1 and 16A2 of the bandpass filter unit 16 and the transmission wavelength range set in each of the spectral filter elements 132A and 132B of the image sensor 100 and a combination of the transmission polarization direction (including the unpolarized light beam) set in each of the aperture regions 18A1 and 18A2 of the polarization filter unit 18 and the polarization direction set in each of the polarization filter elements 122A and 122B of the image sensor 100. Therefore, the ratio bij in which the light beam of each of the wavelength ranges λ11, λ12, λ21, and λ22 emitted from the imaging optical system 10 is received by each of the pixels P1 to P4 can be obtained in advance.
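In the embodiment, the ratios bij are obtained in advance, for example from design data or from measurement. Purely as an illustration of how such a matrix could be estimated, the sketch below uses a simplified model for the first embodiment: a 0/1 spectral match, Malus's law for the polarization factor, and a factor of 1/2 for unpolarized light through a pixel polarizer. The assignment of the wavelength range λA to λ11 and λ21 and of λB to λ12 and λ22, as well as all names, are assumptions made for this example and are not the procedure of the embodiment.

```python
# Simplified, idealized estimate of the interference-ratio matrix B for the
# first embodiment. Real filters would use measured transmittances instead of
# the 0/1 band match and the ideal Malus's-law polarizer model used here.
import numpy as np

UNPOLARIZED = None  # marker for an aperture region without a polarization filter

def polarization_factor(theta_pupil_deg, theta_pixel_deg):
    """Fraction of light from a pupil region passed by a pixel polarizer element."""
    if theta_pupil_deg is UNPOLARIZED:
        return 0.5  # an ideal polarizer passes half of unpolarized light
    d = np.deg2rad(theta_pupil_deg - theta_pixel_deg)
    return float(np.cos(d) ** 2)  # Malus's law

# Light beams 1..4 emitted from the imaging optical system of the first embodiment:
# (wavelength range label, polarization direction at the pupil in degrees)
beams = [("l11", UNPOLARIZED), ("l12", UNPOLARIZED), ("l21", 60.0), ("l22", 60.0)]

# Pixels P1..P4: (bands passed by the spectral filter element, pixel polarizer direction).
# Assumption: lambda_A covers l11 and l21, lambda_B covers l12 and l22.
pixels = [({"l11", "l21"}, 90.0), ({"l11", "l21"}, 0.0),
          ({"l12", "l22"}, 0.0), ({"l12", "l22"}, 90.0)]

B = np.array([[(1.0 if band in passband else 0.0) * polarization_factor(theta_b, theta_p)
               for band, theta_b in beams]
              for passband, theta_p in pixels])
print(B)  # 4x4 matrix of interference ratios b_ij, up to a common scale factor
```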
The following relationship is satisfied between the pixel signals α1 to α4 obtained by the pixels P1 to P4 of each pixel block PB (X, Y) and the pixel signals β1 to β4 corresponding to the light beams of the wavelength ranges λ11, λ12, λ21, and λ22.
Regarding the pixel signal α1 obtained by the first pixel P1, “b11*β1+b12*β2+b13*β3+b14*β4=α1 . . . Equation 2” is satisfied (“*” denotes multiplication).
Regarding the pixel signal α2 obtained by the second pixel P2, “b21*β1+b22*β2+b23*β3+b24*β4=α2 . . . Equation 3” is satisfied.
Regarding the pixel signal α3 obtained by the third pixel P3, “b31*β1+b32*β2+b33*β3+b34*β4=α3 . . . Equation 4” is satisfied.
Regarding the pixel signal α4 obtained by the fourth pixel P4, “b41*β1+b42*β2+b43*β3+b44*β4=α4 . . . Equation 5” is satisfied.
Here, the simultaneous equations of Equations 2 to 5 can be expressed by Equation 6 using a matrix B.
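The numbered equations themselves appear only in the drawings. Written out from Equations 2 to 5 above, Equation 6 and the solution referred to in Equations 7 and 1 presumably take the following matrix form (a reconstruction from the surrounding text, not a reproduction of the drawings):

```latex
% Reconstructed from Equations 2 to 5; B is the matrix of the interference ratios b_ij.
\begin{gather*}
\underbrace{\begin{pmatrix}
b_{11} & b_{12} & b_{13} & b_{14}\\
b_{21} & b_{22} & b_{23} & b_{24}\\
b_{31} & b_{32} & b_{33} & b_{34}\\
b_{41} & b_{42} & b_{43} & b_{44}
\end{pmatrix}}_{B}
\begin{pmatrix}\beta_1\\ \beta_2\\ \beta_3\\ \beta_4\end{pmatrix}
=
\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\\ \alpha_4\end{pmatrix}
\qquad\text{(Equation 6)}
\\[4pt]
\begin{pmatrix}\beta_1\\ \beta_2\\ \beta_3\\ \beta_4\end{pmatrix}
= B^{-1}
\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\\ \alpha_4\end{pmatrix}
= A
\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\\ \alpha_4\end{pmatrix},
\qquad A = B^{-1}
\qquad\text{(Equations 7 and 1)}
\end{gather*}
```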
β1 to β4, which are the solutions of the simultaneous equations of Equations 2 to 5, are calculated by multiplying both sides of Equation 6 by an inverse matrix B⁻¹ of the matrix B.
In this way, the pixel signals β1 to β4 corresponding to the wavelength ranges λ11, λ12, λ21, and λ22 can be calculated from the signal values (pixel signals) α1 to α4 of the pixels P1 to P4 based on the ratio at which the light beams of the wavelength ranges λ11, λ12, λ21, and λ22 emitted from the imaging optical system 10 are received by the pixels P1 to P4 of the pixel block PB (X, Y).
In Equation 1, the inverse matrix B⁻¹ of Equation 7 is set to A (B⁻¹=A). Therefore, the elements aij of the matrix A in Equation 1 can be acquired by obtaining the inverse matrix B⁻¹ of the matrix B.
The coefficient storage unit 200C stores the elements aij of the matrix A for performing the interference removal processing, as a coefficient group.
The image generation unit 200B acquires the coefficient group from the coefficient storage unit 200C, calculates the pixel signals β1 to β4 corresponding to the wavelength ranges λ11, λ12, λ21, and λ22 by Equation 1 from the pixel signals α1 to α4 obtained from the pixels P1 to P4 of each pixel block PB (X, Y), and generates the image signals of the wavelength ranges λ11, λ12, λ21, and λ22.
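Taken together, the roles of the coefficient storage unit 200C and the image generation unit 200B can be pictured with the following minimal numpy sketch. It is illustrative only: the function name, the array shapes, and the run-time inversion are assumptions, whereas in the embodiment the coefficient group aij is stored in advance in the coefficient storage unit 200C.

```python
# Minimal sketch of the interference removal for the four-band first embodiment.
import numpy as np

def remove_interference(alpha_images, B):
    """alpha_images: array of shape (4, H, W) holding the sub-images built from
    the pixels P1..P4 of every pixel block; B: 4x4 matrix of the interference
    ratios b_ij obtained in advance. Returns the four beta images corresponding
    to the wavelength ranges lambda_11, lambda_12, lambda_21, and lambda_22."""
    A = np.linalg.inv(B)                          # coefficient group a_ij (Equation 1)
    alpha = np.asarray(alpha_images, dtype=float)
    # beta_i(x, y) = sum_j a_ij * alpha_j(x, y), evaluated at every block position
    return np.tensordot(A, alpha, axes=([1], [0]))
```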
The image signals of the wavelength ranges λ11, λ12, λ21, and λ22 generated by the image generation unit 200B are output to the outside and stored in a storage device (not shown), if necessary. In addition, the image signals thereof are displayed on a display (not shown), if necessary.
[Action of Imaging Apparatus]
The light beams incident on the imaging optical system 10 become four types of the light beams of different characteristics, which are incident on the image sensor 100. Specifically, the light beams become the light beam, which is the unpolarized light beam, having the wavelength range λ11 (first light beam), the light beam, which is the unpolarized light beam, having the wavelength range λ12 (second light beam), the light beam, which is the light beam of the polarization direction θ, having the wavelength range λ21 (third light beam), and the light beam, which is the light beam of the polarization direction θ, having the wavelength range λ22 (fourth light beam), which are incident on the image sensor 100.
In each pixel block PB (X, Y) of the image sensor 100, the light beam of each of the wavelength ranges emitted from the imaging optical system 10 is received in each of the pixels P1 to P4 at a predetermined ratio. That is, the light beam of each of the wavelength ranges is received at the predetermined ratio bij by the actions of the polarization filter elements 122A and 122B and the spectral filter elements 132A and 132B provided in each of the pixels P1 to P4.
The signal processing unit 200 calculates the pixel signals β1 to β4 corresponding to the light beams of the wavelength ranges λ11, λ12, λ21, and λ22 from the pixel signals α1 to α4 obtained from the pixels P1 to P4 of each pixel block PB (X, Y) of the image sensor 100, and generates the image signals of the wavelength ranges λ11, λ12, λ21, and λ22. That is, the signal processing unit 200 performs arithmetic processing (interference removal processing) by Equation 1 using the matrix A, calculates the pixel signals β1 to β4 corresponding to the light beam of the wavelength ranges λ11, λ12, λ21, and λ22 from the pixel signals α1 to α4 of the pixels P1 to P4 obtained from the image sensor 100, and generates the image signals of the wavelength ranges λ11, λ12, λ21, and λ22.
In this way, with the imaging apparatus according to the present embodiment, the image of four types of wavelength ranges (multispectral image of four bands) can be captured by using one imaging optical system 10 and one (single plate) image sensor 100.
In addition, in the imaging apparatus according to the present embodiment, the bandpass filter unit 16 of the imaging optical system 10 transmits the light beams of a plurality of types of the wavelength ranges from one aperture region. As a result, it is possible to increase the aperture size as compared with a configuration in which the light beam of each wavelength range is individually extracted, and the sensitivity can be improved. In particular, by setting the number of transmission wavelength ranges set in each aperture region (the number of transmission wavelength ranges of the bandpass filter provided in each aperture region) to the same number as the number of transmission wavelength ranges of the image sensor (the number of types of the spectral filter elements provided in the image sensor), it is possible to increase the aperture size as much as possible and to improve the sensitivity.
In addition, in the imaging apparatus according to the present embodiment, one of the aperture regions of the polarization filter unit 18 of the imaging optical system 10 is transparent. As a result, it is possible to prevent a decrease in an amount of the light beams due to the polarized light beam. Therefore, it is possible to improve the sensitivity by this configuration as well. In addition, with this configuration, it is possible to capture a polarized image at the same time.
The imaging apparatus according to the present embodiment is different from the imaging apparatus according to the first embodiment in that polarization filters 18B1 and 18B2 are provided in both the first aperture region 18A1 and the second aperture region 18A2 of the polarization filter unit 18. Other configurations are the same as those of the imaging apparatus according to the first embodiment. Therefore, here, only the configuration of the polarization filter unit 18 will be described.
The polarization filter 18B1 provided in the first aperture region 18A1 and the polarization filter 18B2 provided in the second aperture region 18A2 transmit light beams of transmission polarization directions different from each other.
Note that in a case in which the polarization filter is provided in each of all of the aperture regions as in the present embodiment, a configuration can also be adopted in which one of a plurality of types of the polarization filter elements provided in the image sensor is transparent.
In addition, in a case of capturing the multispectral image of four bands, the interference can be prevented by adopting the following configuration. That is, the transmission polarization directions of the first polarization filter 18B1 and the second polarization filter 18B2 provided in the polarization filter unit 18 are set to be directions orthogonal to each other, and the transmission polarization directions of the first polarization filter element 122A and the second polarization filter element 122B provided in the image sensor 100 are set to be directions orthogonal to each other. For example, the first polarization filter 18B1 of the polarization filter unit 18 uses the polarization filter which transmits the light beam of the azimuthal angle of 90°, and the second polarization filter 18B2 thereof uses the polarization filter which transmits the light beam of the azimuthal angle of 0°. On the other hand, the first polarization filter element 122A of the image sensor 100 uses the polarization filter element which transmits the light beam of the azimuthal angle of 90°, and the second polarization filter element 122B uses the polarization filter element which transmits the light beam of the azimuthal angle of 0°. As a result, in the image sensor 100, it is possible to separate the light beam of each wavelength range and to receive the light beam. Therefore, in the signal processing unit 200, it is possible to generate the image signal of each wavelength range without performing the interference removal processing.
The imaging apparatus according to the present embodiment has different configurations of the bandpass filter unit 16 and the polarization filter unit 18 from the imaging apparatus according to the first embodiment. Other configurations are the same as those of the imaging apparatus according to the first embodiment. Therefore, here, only the configurations of the polarization filter unit 18 and the bandpass filter unit 16 will be described.
The bandpass filter unit 16 according to the present embodiment has three aperture regions (a first aperture region 16A1, a second aperture region 16A2, and a third aperture region 16A3) in the frame 16A.
The first aperture region 16A1 has a rectangular aperture shape, and is disposed in a region on one side of the frame 16A that is split into two equal parts. The second aperture region 16A2 and the third aperture region 16A3 both have a circular aperture shape, and are disposed in the other region of the frame 16A that is split into two equal parts.
The bandpass filters 16B1, 16B2, and 16B3 having different transmission wavelength ranges are provided in the aperture regions 16A1 to 16A3, respectively. Hereinafter, the three bandpass filters 16B1, 16B2, and 16B3 are distinguished from each other by referring to the bandpass filter 16B1 provided in the first aperture region 16A1 as a first bandpass filter 16B1, referring to the bandpass filter 16B2 provided in the second aperture region 16A2 as the second bandpass filter 16B2, and referring to the bandpass filter 16B3 provided in the third aperture region 16A3 as a third bandpass filter 16B3.
The first bandpass filter 16B1 is configured by a multi-bandpass filter and transmits the light beams of two types of the wavelength ranges λ11 and λ12. The second bandpass filter 16B2 transmits the light beam of one type of the wavelength range λ2, and the third bandpass filter 16B3 transmits the light beam of one type of the wavelength range λ3. The polarization filter unit 18 according to the present embodiment has three aperture regions (a first aperture region 18A1, a second aperture region 18A2, and a third aperture region 18A3) in the frame 18A.
The aperture regions 18A1 to 18A3 have the same aperture shape as the corresponding aperture regions 16A1 to 16A3 of the bandpass filter unit 16 and are disposed so as to overlap with the aperture regions 16A1 to 16A3 at the same position. Therefore, the light beam, which passes through each of the aperture regions 16A1 to 16A3 of the bandpass filter unit 16, is incident on each of the corresponding aperture regions 18A1 to 18A3.
The first aperture region 18A1 is transparent. Therefore, the unpolarized light beam passes through the first aperture region 18A1.
The polarization filter 18B2 is provided in the second aperture region 18A2 and the third aperture region 18A3. The polarization filter 18B2 transmits the light beam of the polarization direction θ (for example, the azimuthal angle of 60°).
With the imaging apparatus having the configuration described above, the light beam, which passes through the first aperture region 16A1 of the bandpass filter unit 16, passes through the first aperture region 18A1 of the polarization filter unit 18 and is incident on the image sensor 100. In addition, the light beam, which passes through the second aperture region 16A2 of the bandpass filter unit 16, passes through the second aperture region 18A2 of the polarization filter unit 18 and is incident on the image sensor 100. In addition, the light beam, which passes through the third aperture region 16A3 of the bandpass filter unit 16, passes through the third aperture region 18A3 of the polarization filter unit 18 and is incident on the image sensor 100.
The bandpass filter (first bandpass filter 16B1) which transmits the light beams of two types of the wavelength ranges λ11 and λ12 is provided in the first aperture region 16A1 of the bandpass filter unit 16. In addition, the bandpass filters (second bandpass filter 16B2 and third bandpass filter 16B3) which transmit the light beams of one type of the wavelength ranges λ2 and λ3, respectively, are provided in the second aperture region 16A2 and the third aperture region 16A3. Therefore, four types of the light beams having different characteristics are emitted from the imaging optical system 10. That is, the light beam, which is the unpolarized light beam, having the wavelength range λ11 (first light beam), the light beam, which is the unpolarized light beam, having the wavelength range λ12 (second light beam), the light beam, which is the light beam of the polarization direction θ, having the wavelength range λ2 (third light beam), and the light beam, which is the light beam of the polarization direction θ, having the wavelength range λ3 (fourth light beam) are emitted. Therefore, in the imaging apparatus according to the present embodiment, it is also possible to capture the image of four wavelength ranges (multispectral image of four bands) as in the imaging apparatus according to the first embodiment.
Note that in the present embodiment, the first aperture region 18A1 of the polarization filter unit 18 is transparent, but as in the second embodiment, a polarization filter that polarizes the passing light beam in a predetermined polarization direction may be provided.
The imaging apparatus according to the present embodiment has a different configuration of the bandpass filter unit 16 from the imaging apparatus according to the first embodiment. Specifically, the transmission wavelength characteristic of the second bandpass filter 16B2 provided in the second aperture region 16A2 of the bandpass filter unit 16 is different from that of the imaging apparatus according to the first embodiment. Other configurations are the same as those of the imaging apparatus according to the first embodiment. Therefore, here, only the transmission wavelength characteristic of the second bandpass filter 16B2 will be described.
The second bandpass filter 16B2 according to the present embodiment transmits the light beam of one type of the wavelength range λ2, and the wavelength range λ2 is set so as to partially overlap with both the transmission wavelength range of the first spectral filter element 132A and the transmission wavelength range of the second spectral filter element 132B of the image sensor 100.
By using the second bandpass filter 16B2 having the configuration described above, the image sensor 100 receives the light beams of substantially four wavelength ranges. That is, the light beam transmitted through the second bandpass filter 16B2 is the light beam of one type of the wavelength range λ2, but it is substantially separated into two types of the wavelength ranges by the first spectral filter element 132A and the second spectral filter element 132B. Therefore, together with the light beams of the two wavelength ranges separated by the first bandpass filter 16B1, the light beams of four types of the wavelength ranges are received in total.
In this way, even in a case in which the second bandpass filter 16B2 is configured to transmit the light beam of one type of the wavelength range λ2, the first spectral filter element 132A and the second spectral filter element 132B provided in the image sensor 100 can be used to substantially separate the light beam into the light beams of two types of the wavelength ranges. In addition, with this configuration, it is possible to simplify the creation of the bandpass filter. Further, it is possible to reduce a wavelength shift that occurs in a case in which the light beam is incident on the filter obliquely.
In the imaging apparatus according to the first embodiment, in a case of capturing the multispectral image of four bands, one pixel block is configured by four pixels, and two types of the polarization filter elements and two types of the spectral filter elements are used in combination to change the optical characteristic of each pixel. For the polarization filter element provided in each pixel of one pixel block, the polarization filter elements having polarization characteristics different from each other can also be used. Hereinafter, in a case of capturing the multispectral image of four bands, a case will be described in which one pixel block is configured by four pixels and each pixel comprises the polarization filter element having a different transmission polarization direction. Note that the configuration is the same as that of the imaging apparatus according to the first embodiment except that the arrangement of the polarization filter elements is different. Therefore, here, only the arrangement of the polarization filter elements will be described.
In each pixel block PB (X, Y), four types of the polarization filter elements having transmission polarization directions different from each other are provided, one in each of the pixels P1 to P4.
Note that the number of aperture regions provided in the bandpass filter unit can be set to be equal to or less than the number of the transmission polarization directions of the image sensor (equal to or less than the number of types of the polarization filter elements provided in the image sensor). In this case, it is possible to increase the aperture size as much as possible by setting the number of transmission wavelength ranges set in each aperture region to be the same as the number of transmission wavelength ranges of the image sensor. In addition, as a result, the sensitivity can be improved.
The imaging apparatus according to the present embodiment captures a multispectral image of nine bands. The basic configuration is the same as that of the imaging apparatus according to the first embodiment, and in the following, only the differences from the first embodiment will be described.
[Imaging Optical System]
The imaging optical system 10 of the imaging apparatus according to the present embodiment differs from the imaging optical system according to the first embodiment in the configurations of the bandpass filter unit 16 and the polarization filter unit 18.
The bandpass filter unit 16 has the three aperture regions 16A1, 16A2, and 16A3 in the frame 16A. Hereinafter, if necessary, the aperture regions 16A1 to 16A3 are distinguished from each other by referring to the aperture region 16A1 as the first aperture region 16A1, referring to the aperture region 16A2 as the second aperture region 16A2, and referring to the aperture region 16A3 as the third aperture region 16A3.
The aperture regions 16A1 to 16A3 have the same aperture shape (rectangular shape) and are disposed symmetrically with respect to the optical axis L. Specifically, the second aperture region 16A2 is disposed on the optical axis, and the first aperture region 16A1 and the third aperture region 16A3 are disposed symmetrically with the second aperture region 16A2 interposed therebetween.
The bandpass filters 16B1 to 16B3 having different transmission wavelength characteristics are provided in the aperture regions 16A1 to 16A3, respectively. Hereinafter, if necessary, the bandpass filters 16B1 to 16B3 are distinguished from each other by referring to the bandpass filter 16B1 provided in the first aperture region 16A1 as a first bandpass filter 16B1, referring to the bandpass filter 16B2 provided in the second aperture region 16A2 as the second bandpass filter 16B2, and referring to the bandpass filter 16B3 provided in the third aperture region 16A3 as the third bandpass filter 16B3.
Each of the bandpass filters 16B1 to 16B3 is configured by a multi-bandpass filter and transmits the light beams of three types of the wavelength ranges. Specifically, the first bandpass filter 16B1 transmits the light beams of the wavelength ranges λ11, λ12, and λ13, the second bandpass filter 16B2 transmits the light beams of the wavelength ranges λ21, λ22, and λ23, and the third bandpass filter 16B3 transmits the light beams of the wavelength ranges λ31, λ32, and λ33. The polarization filter unit 18 according to the present embodiment has three aperture regions (a first aperture region 18A1, a second aperture region 18A2, and a third aperture region 18A3) in the frame 18A.
The three aperture regions 18A1, 18A2, and 18A3 correspond to the three aperture regions 16A1, 16A2, and 16A3 of the bandpass filter unit 16 and are disposed so as to overlap with the aperture regions 16A1, 16A2, and 16A3 at the same position. That is, the first aperture region 18A1 has the same aperture shape as the first aperture region 16A1 of the bandpass filter unit 16 and is disposed so as to overlap with the first aperture region 16A1 at the same position. In addition, the second aperture region 18A2 has the same aperture shape as the second aperture region 16A2 of the bandpass filter unit 16 and is disposed so as to overlap with the second aperture region 16A2 at the same position. In addition, the third aperture region 18A3 has the same aperture shape as the third aperture region 16A3 of the bandpass filter unit 16 and is disposed so as to overlap with the third aperture region 16A3 at the same position. Therefore, the light beam, which passes through the first aperture region 16A1 of the bandpass filter unit 16, passes through the first aperture region 18A1, the light beam, which passes through the second aperture region 16A2 of the bandpass filter unit 16, passes through the second aperture region 18A2, and the light beam, which passes through the third aperture region 16A3 of the bandpass filter unit 16, passes through the third aperture region 18A3.
The polarization filters 18B2 and 18B3 having different transmission polarization directions are provided in the second aperture region 18A2 and the third aperture region 18A3, respectively. Specifically, the second aperture region 18A2 comprises the polarization filter 18B2 which allows the light beam of the polarization direction θ2 to pass, and the third aperture region 18A3 comprises the polarization filter 18B3 which allows the light beam of the polarization direction θ3, which is different from the polarization direction θ2, to pass.
The polarization filter is provided only in the second aperture region 18A2 and the third aperture region 18A3. Therefore, the first aperture region 18A1 transmits the unpolarized light beam.
In the imaging optical system 10 having the configuration described above, the pupil region is split into three regions by the combination of the bandpass filter unit 16 and the polarization filter unit 18. That is, the pupil region is split into the first pupil region defined by the first aperture region 16A1 of the bandpass filter unit 16 and the first aperture region 18A1 of the polarization filter unit 18, the second pupil region defined by the second aperture region 16A2 of the bandpass filter unit 16 and the second aperture region 18A2 of the polarization filter unit 18, and a third pupil region defined by the third aperture region 16A3 of the bandpass filter unit 16 and the third aperture region 18A3 of the polarization filter unit 18. The light beams having different characteristics are emitted from the pupil regions. That is, the unpolarized light beams of the wavelength ranges λ11 (first light beam), λ12 (second light beam), and λ13 (third light beam), the light beams of the polarization direction θ2 having the wavelength ranges λ21 (fourth light beam), λ22 (fifth light beam), and λ23 (sixth light beam), and the light beams of the polarization direction θ3 having the wavelength ranges λ31 (seventh light beam), λ32 (eighth light beam), and λ33 (ninth light beam) are emitted.
[Image Sensor]
In the image sensor 100 according to the present embodiment, one pixel block PB (X, Y) is configured by nine (3×3) adjacent pixels P1 to P9, and the pixel blocks PB (X, Y) are regularly arranged along the horizontal direction (x-axis direction) and the vertical direction (y-axis direction).
The basic configuration is the same as that of the imaging apparatus according to the first embodiment.
The polarization filter element array layer 120 comprises three types of the polarization filter elements 122A, 122B, and 122C having different transmission polarization directions. Hereinafter, if necessary, the polarization filter elements 122A to 122C are distinguished from each other by referring to the polarization filter element 122A as the first polarization filter element 122A, referring to the polarization filter element 122B as the second polarization filter element 122B, and referring to the polarization filter element 122C as the third polarization filter element 122C. The first polarization filter element 122A transmits the light beam of the polarization direction θ1 (for example, the azimuthal angle of 90°). The second polarization filter element 122B transmits the light beam of the polarization direction θ2 (for example, the azimuthal angle of 45°). The third polarization filter element 122C transmits the light beam of the polarization direction θ3 (for example, the azimuthal angle of 0°).
The spectral filter element array layer 130 comprises three types of the spectral filter elements 132A, 132B, and 132C having different transmission wavelength characteristics. Hereinafter, if necessary, the spectral filter elements 132A to 132C are distinguished from each other by referring to the spectral filter element 132A as the first spectral filter element 132A, referring to the spectral filter element 132B as the second spectral filter element 132B, and referring to the spectral filter element 132C as a third spectral filter element 132C.
In the image sensor 100 configured as described above, in each pixel block PB (X, Y), each of the pixels P1 to P9 receives the light beam from the imaging optical system 10 as follows. That is, the first pixel P1 receives the light beams from the imaging optical system 10 via the first spectral filter element 132A and the first polarization filter element 122A. The second pixel P2 receives the light beams from the imaging optical system 10 via the first spectral filter element 132A and the second polarization filter element 122B. The third pixel P3 receives the light beams from the imaging optical system 10 via the first spectral filter element 132A and the third polarization filter element 122C. The fourth pixel P4 receives the light beams from the imaging optical system 10 via the second spectral filter element 132B and the first polarization filter element 122A. The fifth pixel P5 receives the light beams from the imaging optical system 10 via the second spectral filter element 132B and the second polarization filter element 122B. The sixth pixel P6 receives the light beams from the imaging optical system 10 via the second spectral filter element 132B and the third polarization filter element 122C. The seventh pixel P7 receives the light beams from the imaging optical system 10 via the third spectral filter element 132C and the first polarization filter element 122A. The eighth pixel P8 receives the light beams from the imaging optical system 10 via the third spectral filter element 132C and the second polarization filter element 122B. The ninth pixel P9 receives the light beams from the imaging optical system 10 via the third spectral filter element 132C and the third polarization filter element 122C. In this way, the pixels P1 to P9 of the pixel block PB (X, Y) receive the light beams having different characteristics, respectively, by having different combinations of the spectral filter elements 132A to 132C and the polarization filter elements 122A to 122C.
[Signal Processing Unit]
The signal processing unit 200 processes the signals output from the image sensor 100 to generate the image signals of the multispectral image of nine bands. That is, the image signals of nine wavelength ranges λ11, λ12, λ13, λ21, λ22, λ23, λ31, λ32, and λ33 which are transmitted through the bandpass filter unit 16 of the imaging optical system 10 are generated. The signal processing unit 200 is the same as the signal processing unit 200 according to the first embodiment in that it performs the predetermined interference removal processing to generate the image signal of each wavelength range.
[Action of Imaging Apparatus]
The light beams incident on the imaging optical system 10 become nine types of the light beams having different characteristics, which are incident on the image sensor 100. Specifically, the light beams become the light beam, which is the unpolarized light beam, having the wavelength range λ11 (first light beam), the light beam, which is the unpolarized light beam, having the wavelength range λ12 (second light beam), the light beam, which is the unpolarized light beam, having the wavelength range λ13 (third light beam), the light beam, which is the light beam of the polarization direction θ2, having the wavelength range λ21 (fourth light beam), the light beam, which is the light beam of the polarization direction θ2, having the wavelength range λ22 (fifth light beam), the light beam, which is the light beam of the polarization direction θ2, having the wavelength range λ23 (sixth light beam), the light beam, which is the light beam of the polarization direction θ3, having the wavelength range λ31 (seventh light beam), the light beam, which is the light beam of the polarization direction θ3, having the wavelength range λ32 (eighth light beam), and the light beam, which is the light beam of the polarization direction θ3, having the wavelength range λ33 (ninth light beam), which are incident on the image sensor 100.
In each pixel block PB (X, Y) of the image sensor 100, the light beam emitted from the imaging optical system 10 is received in each of the pixels P1 to P9 at a predetermined ratio. That is, the light beams of the wavelength ranges λ11, λ12, λ13, λ21, λ22, λ23, λ31, λ32, and λ33 are received at a predetermined ratio.
The signal processing unit 200 calculates the pixel signals β1 to β9 corresponding to the light beams of the wavelength ranges λ11, λ12, λ13, λ21, λ22, λ23, λ31, λ32, and λ33 from the pixel signals α1 to α9 obtained from the pixels P1 to P9 of each pixel block PB (X, Y) of the image sensor 100, and generates the image signals of the wavelength ranges λ11, λ12, λ13, λ21, λ22, λ23, λ31, λ32, and λ33.
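The nine-band processing can be pictured with the same kind of sketch as in the first embodiment. It is illustrative only: the positions of the pixels P1 to P9 within the 3×3 block, the function names, and the run-time inversion of the 9×9 ratio matrix are assumptions made for this example.

```python
# Minimal sketch of the nine-band sub-image separation and interference removal.
import numpy as np

def split_3x3_blocks(raw):
    """Nine sub-images, one per pixel position of the 3x3 block (a row-major
    layout of P1..P9 is assumed; the actual layout appears only in the drawings)."""
    raw = np.asarray(raw)
    return np.stack([raw[r::3, c::3] for r in range(3) for c in range(3)])  # (9, H/3, W/3)

def remove_interference_9band(alpha_images, B):
    """Solve beta = B^-1 alpha at every block position, where B is the 9x9 matrix
    of interference ratios obtained in advance for the nine wavelength ranges."""
    alpha = np.asarray(alpha_images, dtype=float)
    return np.tensordot(np.linalg.inv(B), alpha, axes=([1], [0]))
```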
In this way, with the imaging apparatus according to the present embodiment, the multispectral image of nine bands can be captured by using one imaging optical system 10 and one image sensor 100.
[Modification Example of Bandpass Filter Unit and Polarization Filter Unit]
The polarization filter unit 18 according to the present embodiment is a combination of the polarization filter units described above.
As in the present embodiment, it is possible to increase the aperture size by appropriately using the multi-bandpass filter. As a result, the sensitivity can be improved.
In the bandpass filter unit 16 of this modification example, the first bandpass filter 16B1 transmits the light beams of two types of the wavelength ranges λ11 and λ12, the second bandpass filter 16B2 transmits the light beams of three types of the wavelength ranges λ21, λ22, and λ23, and the third bandpass filter 16B3 transmits the light beams of three types of the wavelength ranges λ31, λ32, and λ33. The wavelength range λ11 is set so as to partially overlap with both the transmission wavelength range of the first spectral filter element 132A and the transmission wavelength range of the second spectral filter element 132B.
According to the bandpass filter unit having the configuration described above, in the imaging optical system, the light beams of eight types of the wavelength ranges λ11, λ12, λ21, λ22, λ23, λ31, λ32, and λ33 are separated. However, the image sensor can receive the light beams of substantially nine types of the wavelength ranges. That is, the light beams of the wavelength range λ11 separated by the first bandpass filter are substantially separated into two wavelength ranges by the first spectral filter element 132A and the second spectral filter element 132B. Therefore, the image sensor can receive the light beams of substantially nine types of the wavelength ranges.
The imaging apparatus according to the present embodiment captures a multispectral image of twelve bands. Note that the basic configuration is the same as that of the imaging apparatus according to the first embodiment, and in the following, only the differences from the first embodiment will be described.
[Imaging Optical System]
The imaging optical system of the imaging apparatus according to the present embodiment differs from the imaging optical system 10 of the imaging apparatus 1 according to the first embodiment in the configurations of the bandpass filter unit 16 and the polarization filter unit 18.
The bandpass filter unit 16 has the three aperture regions (first aperture region 16A1, second aperture region 16A2, and third aperture region 16A3) in the frame 16A. The bandpass filters (first bandpass filter 16B1, second bandpass filter 16B2, third bandpass filter 16B3) having different transmission wavelength characteristics are provided in the aperture regions 16A1 to 16A3, respectively. Each of the bandpass filters 16B1 to 16B3 is configured by the multi-bandpass filter, and transmits the light beams of four types of the wavelength ranges different from each other. Specifically, the first bandpass filter 16B1 transmits the light beams of wavelength ranges Rλ1, Gλ1, Bλ1, and IRλ1. The second bandpass filter 16B2 transmits the light beams of wavelength ranges Rλ2, Gλ2, Bλ2, and IRλ2. The third bandpass filter 16B3 transmits the light beams of wavelength ranges Rλ3, Gλ3, Bλ3, and IRλ3.
The polarization filter unit 18 has three aperture regions corresponding to the aperture regions 16A1 to 16A3 of the bandpass filter unit 16, and polarization filters having transmission polarization directions different from each other are provided in the three aperture regions, respectively. Specifically, the polarization filter provided in the first aperture region transmits the light beam of the polarization direction θ1, the polarization filter provided in the second aperture region transmits the light beam of the polarization direction θ2, and the polarization filter provided in the third aperture region transmits the light beam of the polarization direction θ3.
From the imaging optical system having the configuration described above, twelve types of the light beams having different characteristics are emitted. That is, the light beam, which is the light beam of the polarization direction θ1, having the wavelength range Rλ1 (first light beam), the light beam, which is the light beam of the polarization direction θ1, having the wavelength range Gλ1 (second light beam), the light beam, which is the light beam of the polarization direction θ1, having the wavelength range Bλ1 (third light beam), the light beam, which is the light beam of the polarization direction θ1, having the wavelength range IRλ1 (fourth light beam), the light beam, which is the light beam of the polarization direction θ2, having the wavelength range Rλ2 (fifth light beam), the light beam, which is the light beam of the polarization direction θ2, having the wavelength range Gλ2 (sixth light beam), the light beam, which is the light beam of the polarization direction θ2, having the wavelength range Bλ2 (seventh light beam), the light beam, which is the light beam of the polarization direction θ2, having the wavelength range IRλ2 (eighth light beam), the light beam, which is the light beam of the polarization direction θ3, having the wavelength range Rλ3 (ninth light beam), the light beam, which is the light beam of the polarization direction θ3, having the wavelength range Gλ3 (tenth light beam), the light beam, which is the light beam of the polarization direction θ3, having the wavelength range Bλ3 (eleventh light beam), and the light beam, which is the light beam of the polarization direction θ3, having the wavelength range IRλ3 (twelfth light beam) are emitted.
[Image Sensor]
The basic configuration is the same as that of the imaging apparatus according to the first embodiment.
The polarization filter element array layer 120 comprises four types of the polarization filter elements (first polarization filter element 122A, second polarization filter element 122B, third polarization filter element 122C, fourth polarization filter element 122D) having different transmission polarization directions. Specifically, the first polarization filter element 122A transmits the light beam of the polarization direction θ1 (for example, the azimuthal angle of 45°). The second polarization filter element 122B transmits the light beam of the polarization direction θ2 (for example, the azimuthal angle of 90°). The third polarization filter element 122C transmits the light beam of the polarization direction θ3 (for example, the azimuthal angle of 135°). The fourth polarization filter element 122D transmits the light beam of the polarization direction θ4 (for example, the azimuthal angle of 0°).
The spectral filter element array layer 130 comprises four types of the spectral filter elements (first spectral filter element 132A, second spectral filter element 132B, third spectral filter element 132C, fourth spectral filter element 132D) having different transmission wavelength characteristics.
As an example, in the imaging apparatus according to the present embodiment, the first spectral filter element 132A is configured by the spectral filter element which transmits the light beam of the red (R) wavelength range Rλ. The second spectral filter element 132B is configured by the spectral filter element which transmits the light beam of the green (G) wavelength range Gλ. The third spectral filter element 132C is configured by the spectral filter element which transmits the light beam of the blue (B) wavelength range Bλ. The fourth spectral filter element 132D is configured by the spectral filter element which transmits the light beam of the infrared (IR) wavelength range IRλ.
The four types of the wavelength ranges Rλ1, Gλ1, Bλ1, and IRλ1 transmitted through the first bandpass filter 16B1 are set within ranges of the transmission wavelength range Rλ of the first spectral filter element 132A, the transmission wavelength range Gλ of the second spectral filter element 132B, the transmission wavelength range Bλ of the third spectral filter element 132C, the transmission wavelength range IRλ of the fourth spectral filter element 132D, respectively. That is, the wavelength range Rλ1 is set within the range of the transmission wavelength range Rλ of the first spectral filter element 132A. In addition, the wavelength range Gλ1 is set within the range of the transmission wavelength range Gλ of the second spectral filter element 132B. In addition, the wavelength range Bλ1 is set within the range of the transmission wavelength range Bλ of the third spectral filter element 132C. In addition, the wavelength range IRλ1 is set within the range of the transmission wavelength range IRλ of the fourth spectral filter element 132D.
Similarly, the four types of the wavelength ranges Rλ2, Gλ2, Bλ2, and IRλ2 transmitted through the second bandpass filter 16B2, and the four types of the wavelength ranges Rλ3, Gλ3, Bλ3, and IRλ3 transmitted through the third bandpass filter 16B3, are set within the ranges of the transmission wavelength ranges Rλ, Gλ, Bλ, and IRλ of the first to fourth spectral filter elements 132A to 132D, respectively.
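The containment relation described above can be checked mechanically; the following is a minimal sketch assuming hypothetical numeric wavelength values (the embodiment does not specify any), shown only for the sub-ranges of the first bandpass filter.

```python
# Hypothetical wavelength values in nm; only the stated containment relation is checked:
# each bandpass sub-range must lie within the corresponding on-sensor spectral filter range.
SPECTRAL_FILTER_RANGES = {          # third optical element (spectral filter elements)
    "R": (580, 680), "G": (480, 600), "B": (400, 520), "IR": (800, 1000),
}
BANDPASS_SUBRANGES = {              # first optical element (pupil-side multi-bandpass filter 16B1)
    "R_lambda1": ("R", (590, 610)), "G_lambda1": ("G", (500, 520)),
    "B_lambda1": ("B", (430, 450)), "IR_lambda1": ("IR", (820, 850)),
}

def contained(sub, outer):
    """True if the interval `sub` lies entirely within the interval `outer`."""
    return outer[0] <= sub[0] and sub[1] <= outer[1]

for name, (parent, rng) in BANDPASS_SUBRANGES.items():
    assert contained(rng, SPECTRAL_FILTER_RANGES[parent]), name
```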
In the image sensor 100 configured as described above, in each pixel block PB (X, Y), each of the pixels P1 to P16 receives the light beam from the imaging optical system 10 as follows. That is, the first pixel P1 receives the light beams from the imaging optical system 10 via the first spectral filter element 132A and the first polarization filter element 122A. The second pixel P2 receives the light beams from the imaging optical system 10 via the first spectral filter element 132A and the second polarization filter element 122B. The third pixel P3 receives the light beams from the imaging optical system 10 via the second spectral filter element 132B and the first polarization filter element 122A. The fourth pixel P4 receives the light beams from the imaging optical system 10 via the second spectral filter element 132B and the second polarization filter element 122B. The fifth pixel P5 receives the light beams from the imaging optical system 10 via the first spectral filter element 132A and the third polarization filter element 122C. The sixth pixel P6 receives the light beams from the imaging optical system 10 via the first spectral filter element 132A and the fourth polarization filter element 122D. The seventh pixel P7 receives the light beams from the imaging optical system 10 via the second spectral filter element 132B and the third polarization filter element 122C. The eighth pixel P8 receives the light beams from the imaging optical system 10 via the second spectral filter element 132B and the fourth polarization filter element 122D. The ninth pixel P9 receives the light beams from the imaging optical system 10 via the third spectral filter element 132C and the first polarization filter element 122A. The tenth pixel P10 receives the light beams from the imaging optical system 10 via the third spectral filter element 132C and the second polarization filter element 122B. The eleventh pixel P11 receives the light beams from the imaging optical system 10 via the fourth spectral filter element 132D and the first polarization filter element 122A. The twelfth pixel P12 receives the light beams from the imaging optical system 10 via the fourth spectral filter element 132D and the second polarization filter element 122B. The thirteenth pixel P13 receives the light beams from the imaging optical system 10 via the third spectral filter element 132C and the third polarization filter element 122C. The fourteenth pixel P14 receives the light beams from the imaging optical system 10 via the third spectral filter element 132C and the fourth polarization filter element 122D. The fifteenth pixel P15 receives the light beams from the imaging optical system 10 via the fourth spectral filter element 132D and the third polarization filter element 122C. The sixteenth pixel P16 receives the light beams from the imaging optical system 10 via the fourth spectral filter element 132D and the fourth polarization filter element 122D.
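The assignment described above can be summarized as a mapping from pixel index to the pair of spectral filter element and polarization filter element; the following sketch writes it out directly from the text.

```python
# Pixel-to-filter assignment in one pixel block PB(X, Y):
# pixel index -> (spectral filter element, polarization filter element).
PIXEL_FILTERS = {
    1:  ("132A", "122A"),  2: ("132A", "122B"),  3: ("132B", "122A"),  4: ("132B", "122B"),
    5:  ("132A", "122C"),  6: ("132A", "122D"),  7: ("132B", "122C"),  8: ("132B", "122D"),
    9:  ("132C", "122A"), 10: ("132C", "122B"), 11: ("132D", "122A"), 12: ("132D", "122B"),
    13: ("132C", "122C"), 14: ("132C", "122D"), 15: ("132D", "122C"), 16: ("132D", "122D"),
}
# All 4 x 4 = 16 combinations of spectral and polarization filter elements appear exactly once.
assert len(set(PIXEL_FILTERS.values())) == 16
```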
[Signal Processing Unit]
The signal processing unit 200 processes the signals output from the image sensor 100 to generate the image signals of the multispectral image of twelve bands. That is, the image signals of twelve types of the wavelength ranges Rλ1, Gλ1, Bλ1, IRλ1, Rλ2, Gλ2, Bλ2, IRλ2, Rλ3, Gλ3, Bλ3, and IRλ3 which are transmitted through the bandpass filter unit 16 of the imaging optical system are generated. The signal processing unit 200 is the same as the signal processing unit 200 according to the first embodiment in that it performs the predetermined interference removal processing to generate the image signal of each wavelength range.
[Action of Imaging Apparatus]
The light beams incident on the imaging optical system 10 become twelve types of the light beams having different characteristics, which are incident on the image sensor 100. Specifically, they are the light beams of the wavelength ranges Rλ1, Gλ1, Bλ1, and IRλ1 polarized in the polarization direction θ1 (first to fourth light beams), the light beams of the wavelength ranges Rλ2, Gλ2, Bλ2, and IRλ2 polarized in the polarization direction θ2 (fifth to eighth light beams), and the light beams of the wavelength ranges Rλ3, Gλ3, Bλ3, and IRλ3 polarized in the polarization direction θ3 (ninth to twelfth light beams).
In each pixel block PB (X, Y) of the image sensor 100, the light beams (first light beam to twelfth light beam) emitted from the imaging optical system 10 are received in each of the pixels P1 to P16 at a predetermined ratio.
The signal processing unit 200 calculates the pixel signals β1 to β12 corresponding to the light beams of the wavelength ranges Rλ1, Gλ1, Bλ1, IRλ1, Rλ2, Gλ2, Bλ2, IRλ2, Rλ3, Gλ3, Bλ3, and IRλ3 from the pixel signals α1 to α16 obtained from the pixels P1 to P16 of each pixel block PB (X, Y) of the image sensor 100, and generates the image signals of the wavelength ranges Rλ1, Gλ1, Bλ1, IRλ1, Rλ2, Gλ2, Bλ2, IRλ2, Rλ3, Gλ3, Bλ3, and IRλ3.
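A minimal sketch of this per-pixel-block computation, assuming the matrix A (determined by the imaging optical system and the image sensor) is given; the zero-filled arrays below are placeholders, not actual coefficient values.

```python
import numpy as np

def compute_band_signals(alpha: np.ndarray, A: np.ndarray) -> np.ndarray:
    """alpha: 16 pixel signals of one pixel block PB(X, Y); returns 12 band signals beta."""
    return A @ alpha

A = np.zeros((12, 16))                 # placeholder coefficients held in the coefficient storage unit
alpha = np.zeros(16)                   # pixel signals alpha1 ... alpha16
beta = compute_band_signals(alpha, A)  # beta1 ... beta12 for R/G/B/IR x lambda1..lambda3
```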
In this way, with the imaging apparatus according to the present embodiment, the multispectral image of twelve bands can be captured by using one imaging optical system 10 and one image sensor 100.
[Imaging Apparatus Capturing Multispectral Image of N Bands]
As described above, according to the present invention, it is possible to capture the multispectral image of four bands or more. Here, the number of types of the polarization filter elements provided in the image sensor is assumed to be n (n≥2), and the number of types of the spectral filter elements is assumed to be m (m≥2). n means the number of transmission polarization directions of the image sensor by the polarization filter elements, and m means the number of transmission wavelength ranges of the image sensor by the spectral filter elements.
The number of bands of the multispectral image captured by the imaging apparatus is assumed to be N. The imaging apparatus according to the embodiment of the present invention can capture the multispectral image of N=m×n bands at maximum. In this case, in the image sensor, one pixel block is configured by m×n pixels having different combinations of the polarization filter element and the spectral filter element.
The number of types of the wavelength ranges emitted from the imaging optical system (including a case of separation in combination with the spectral filter elements) is assumed to be k. That is, the number of transmission wavelength ranges of the imaging optical system is assumed to be k. Since the imaging apparatus according to the embodiment of the present invention can capture the multispectral image of m×n bands at maximum, k≤m×n must be satisfied, and the imaging optical system and the image sensor are set to satisfy this relationship. Specifically, in the imaging optical system, the aperture regions, and the wavelength range and the polarization direction of the light beam transmitted through each aperture region, are set such that k≤m×n is satisfied. In addition, in the image sensor, the pixel blocks, and the combination of the polarization filter element and the spectral filter element of the pixels configuring each pixel block, are set such that k≤m×n is satisfied.
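The constraint can be expressed as a one-line check; the values m = 4, n = 4, and k = 12 below are those of the twelve-band embodiment described above.

```python
# Design constraint: the number of transmission wavelength ranges k of the imaging
# optical system must not exceed m x n, where m is the number of spectral filter types
# and n the number of polarization filter types in the image sensor.
def satisfies_band_limit(k: int, m: int, n: int) -> bool:
    return k <= m * n

assert satisfies_band_limit(12, 4, 4)  # twelve-band embodiment: k = 12 <= 4 x 4
```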
Note that the number of transmission wavelength ranges k of the imaging optical system is counted within the wavelength range in which both the spectral filter elements and the image sensor have sensitivity. For example, even in a configuration in which the imaging optical system transmits the light beams of the four wavelength ranges λ1 to λ4, in a case in which the image sensor does not have sensitivity for the light beam of the wavelength range λ4, the number of transmission wavelength ranges is substantially k=3. Similarly, in a case in which the spectral filter elements do not have sensitivity for the light beam of the wavelength range λ4 (a case in which the light beam of the wavelength range λ4 is not substantially transmitted), the number of transmission wavelength ranges is substantially k=3. Therefore, in a case of capturing the multispectral image of four bands or more, the transmission wavelength ranges of the imaging optical system (the transmission wavelength ranges of the first optical element), that is, the bands to be captured, are set within the wavelength range in which both the spectral filter element (third optical element) and the image sensor have sensitivity.
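A sketch of this counting rule, assuming hypothetical sensor and filter sensitivity ranges (none are specified in the text); only ranges usable by both the spectral filter elements and the image sensor are counted.

```python
SENSOR_SENSITIVITY = (380, 1000)                        # nm, assumed image sensor sensitivity range
SPECTRAL_FILTER_COVERAGE = [(400, 700), (800, 950)]     # assumed coverage of the spectral filter elements

def overlaps(a, b):
    """True if the intervals a and b share any wavelength."""
    return a[0] < b[1] and b[0] < a[1]

def effective_k(optical_ranges):
    """Count only the optical-system ranges usable by both the spectral filters and the sensor."""
    return sum(
        1 for r in optical_ranges
        if overlaps(r, SENSOR_SENSITIVITY) and any(overlaps(r, c) for c in SPECTRAL_FILTER_COVERAGE)
    )

# The last range below lies outside the assumed sensitivity, so effectively k = 3, not 4.
print(effective_k([(430, 450), (530, 550), (630, 650), (1100, 1150)]))  # -> 3
```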
Here, a case will be considered in which the number of types of the polarization filter elements provided in the image sensor is n, the number of types of the spectral filter elements is m, n×m=q is satisfied, and one pixel block is configured by q pixels. In this case, q pixel signals α1, α2, . . . , αq are output from each pixel block of the image sensor. In a case in which the imaging optical system emits the light beams of k wavelength ranges, the arithmetic equation for calculating the k pixel signals β1, β2, . . . , βk corresponding to the light beams of the respective wavelength ranges from the q pixel signals α1, α2, . . . , αq is defined as follows using the matrix A.
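The displayed equation is not reproduced in this text; the following LaTeX is a reconstruction of the relation described above, on the assumption that it takes the same linear form as in the first embodiment (k band signals obtained from q pixel signals through the matrix A).

```latex
\begin{pmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_k \end{pmatrix}
=
A
\begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_q \end{pmatrix},
\qquad A = (a_{ij}) \in \mathbb{R}^{k \times q}
```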
As described above, the matrix A is the inverse matrix B⁻¹ of the matrix B having, as elements, the ratios at which each pixel of each pixel block receives the light beam of each wavelength range.
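A sketch of obtaining A from B, assuming B is arranged so that α = Bβ (row j holds the ratios at which pixel j receives each wavelength range); the use of a pseudo-inverse for a non-square B is an editorial substitution and is not stated in the embodiment.

```python
import numpy as np

def interference_removal_matrix(B: np.ndarray) -> np.ndarray:
    """Return A such that beta = A @ alpha recovers the band signals from the pixel signals.

    B has shape (q, k): B[j, i] is the ratio at which pixel j receives the light beam of
    wavelength range i.  The text defines A as the inverse of B; when k < q, the
    Moore-Penrose pseudo-inverse is used here instead (editorial assumption).
    """
    q, k = B.shape
    return np.linalg.inv(B) if q == k else np.linalg.pinv(B)
```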
Note that it is preferable that the imaging optical system have a configuration in which at least one of the aperture regions of the bandpass filter unit transmits the light beams of a plurality of types of the wavelength ranges. That is, a configuration is adopted in which the bandpass filter (multi-bandpass filter) which transmits the light beams of a plurality of types of wavelength ranges is provided in at least one of the aperture regions. As a result, the aperture area can be increased and the sensitivity can be improved. In this case, the number of wavelength ranges transmitted through one aperture region (the number of transmission wavelength ranges) is, at maximum, the number of types of the spectral filter elements provided in the image sensor. Since the number of types of the spectral filter elements provided in the image sensor means the number of transmission wavelength ranges of the spectral filter elements in the image sensor, the number of transmission wavelength ranges of at least one of the aperture regions of the bandpass filter unit can be set to be equal to or less than the number of transmission wavelength ranges of the spectral filter element.
In a case in which the imaging optical system comprises a set of the bandpass filters having the same transmission wavelength characteristics, the images formed by the light beams passing through the respective bandpass filters are shifted relative to each other in a case in which the focus is shifted. The focus can be detected more easily by using this property. Hereinafter, an imaging apparatus that can detect the focus more easily will be described. Note that the configuration other than the bandpass filter unit is the same as that of the imaging apparatus according to the first embodiment. Therefore, here, only the configuration of the bandpass filter unit will be described.
The bandpass filter unit 16 according to the present embodiment has three aperture regions 16A1 to 16A3, in which the bandpass filters (first bandpass filter 16B1, second bandpass filter 16B2, and third bandpass filter 16B3) are provided, respectively. The first bandpass filter 16B1 and the third bandpass filter 16B3 have the same transmission wavelength characteristics and transmit the light beams of the wavelength ranges λ11, λ12, and λ13. On the other hand, the second bandpass filter 16B2 has different transmission wavelength characteristics from the first bandpass filter 16B1 and the third bandpass filter 16B3, and transmits the light beams of the wavelength ranges λ21, λ22, and λ23.
With the imaging apparatus comprising the bandpass filter unit 16 according to the present embodiment, the image of the wavelength ranges λ11, λ12, and λ13 passing through the first aperture region 16A1 and the image of the same wavelength ranges passing through the third aperture region 16A3 are shifted relative to each other in a case in which the focus is shifted, so that the focus can be detected more easily by comparing the two images.
The bandpass filter unit 16 according to the present embodiment has seven aperture regions 16A1 to 16A7, in which the bandpass filters (first bandpass filter 16B1, second bandpass filter 16B2, third bandpass filter 16B3, fourth bandpass filter 16B4, fifth bandpass filter 16B5, sixth bandpass filter 16B6, and seventh bandpass filter 16B7) are provided, respectively. The first bandpass filter 16B1 is configured by the multi-bandpass filter and transmits the light beams of three types of the wavelength ranges (λ11, λ12, and λ13). The other bandpass filters 16B2 to 16B7 are each configured by a bandpass filter which transmits the light beam of one type of the wavelength range and, except for the second bandpass filter 16B2 and the fifth bandpass filter 16B5, transmit the light beams of wavelength ranges different from each other. The second bandpass filter 16B2 and the fifth bandpass filter 16B5 have the same transmission wavelength characteristics and transmit the light beam of the wavelength range λ21.
With the imaging apparatus comprising the bandpass filter unit 16 according to the present embodiment, the image of the wavelength range λ21 passing through the second aperture region 16A2 and the image of the same wavelength range passing through the fifth aperture region 16A5 are shifted relative to each other in a case in which the focus is shifted, so that the focus can be detected more easily by comparing the two images.
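The embodiment does not specify how the relative shift between the two images is measured; as one possibility, a simple one-dimensional cross-correlation of corresponding image rows can estimate the shift, as in the following sketch.

```python
import numpy as np

def estimate_shift(row_a: np.ndarray, row_b: np.ndarray) -> int:
    """Return the displacement (in pixels) that best aligns row_b to row_a.

    row_a and row_b are corresponding rows of the two images of the same wavelength
    range taken through different aperture regions; 0 means no relative shift (in focus).
    """
    a = row_a - row_a.mean()
    b = row_b - row_b.mean()
    corr = np.correlate(a, b, mode="full")       # full cross-correlation over all lags
    return int(np.argmax(corr)) - (len(b) - 1)   # lag of the correlation peak
```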
In each of the embodiments described above, the aperture shape of the aperture region provided in the bandpass filter unit and the polarization filter unit is a circular shape or a rectangular shape, but the aperture shape of the aperture region is not limited to this.
The bandpass filter unit 16 according to the present embodiment has a central aperture region 16a1 and six outer peripheral aperture regions 16a2 to 16a7, and the bandpass filters (16b1, 16b2, 16b3, 16b4, 16b5, 16b6, and 16b7) having predetermined wavelength transmission characteristics are provided in these aperture regions. Specifically, the bandpass filter (multi-bandpass filter) 16b1 which transmits the light beams of three types of the wavelength ranges λ11, λ12, and λ13 is provided in the central aperture region 16a1. The bandpass filter 16b2 which transmits the light beam of one type of the wavelength range λ31 is provided in the outer peripheral aperture region 16a2. The bandpass filter 16b3 which transmits the light beam of one type of the wavelength range λ22 is provided in the outer peripheral aperture region 16a3. The bandpass filter 16b4 which transmits the light beam of one type of the wavelength range λ32 is provided in the outer peripheral aperture region 16a4. The bandpass filter 16b5 which transmits the light beam of one type of the wavelength range λ23 is provided in the outer peripheral aperture region 16a5. The bandpass filter 16b6 which transmits the light beam of one type of the wavelength range λ33 is provided in the outer peripheral aperture region 16a6. The bandpass filter 16b7 which transmits the light beam of one type of the wavelength range λ21 is provided in the outer peripheral aperture region 16a7.
The polarization filters (18b1, 18b2, and 18b3) are provided in the aperture regions (central aperture region 18a1 and outer peripheral aperture region 18a2, 18a3, 18a4, 18a5, 18a6, and 18a7) of the polarization filter unit 18. Specifically, the polarization filter 18b1 which transmits the light beam of the polarization direction θ1 is provided in the central aperture region 18a1. The polarization filter 18b3 which transmits the light beam of the polarization direction θ2 is provided in the outer peripheral aperture regions 18a2, 18a4, and 18a6. The polarization filter 18b2 which transmits the light beam of the polarization direction θ3 is provided in the outer peripheral aperture regions 18a3, 18a5, and 18a7.
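This assignment can be summarized as a mapping from each aperture region to its transmitted wavelength ranges and polarization direction; the following sketch writes it out from the description above, with plain-text labels standing in for the λ and θ symbols.

```python
# Aperture-region assignment: region -> (transmitted wavelength ranges, polarization direction).
APERTURE_ASSIGNMENT = {
    "16a1/18a1": (["lambda11", "lambda12", "lambda13"], "theta1"),  # central aperture region
    "16a2/18a2": (["lambda31"], "theta2"),
    "16a3/18a3": (["lambda22"], "theta3"),
    "16a4/18a4": (["lambda32"], "theta2"),
    "16a5/18a5": (["lambda23"], "theta3"),
    "16a6/18a6": (["lambda33"], "theta2"),
    "16a7/18a7": (["lambda21"], "theta3"),
}
```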
Generally, in a lens, the aberration is smaller and the image quality is higher closer to the center (optical axis). Therefore, by setting the transmission wavelength range of each aperture region in accordance with the required image quality, the image quality of the obtained image can be controlled for each wavelength range. For example, a wavelength range that requires high image quality can be allocated to the central aperture region.
For the setting of the aperture region (aspect of splitting the pupil region), various aspects, such as an aspect of splitting concentrically, an aspect of splitting along the circumferential direction, and an aspect of splitting in a grid, can be adopted. In addition, the sizes (areas) of the aperture regions do not have to be the same, and may be different for each aperture region.
[Configuration of Imaging Optical System]
In the embodiments described above, the configuration has been adopted in which the bandpass filter unit and the polarization filter unit are configured by optical elements, respectively, but a configuration can be adopted in which the bandpass filter unit and the polarization filter unit are configured by one optical element having the functions of the bandpass filter unit and the polarization filter unit. For example, a configuration can be adopted in which the bandpass filter and the polarization filter are held in one frame.
[Configuration of Image Sensor]
The arrangement of the pixels configuring one pixel block is not limited to that of each of the embodiments described above. The arrangement of the pixels can be appropriately changed in response to the number of pixels configuring one pixel block and the like.
In addition, in the embodiments described above, the configuration has been adopted in which the polarization filter element and the spectral filter element are disposed between the photodiode and the micro lens, but a configuration can be adopted in which one or both of the polarization filter element and the spectral filter element are disposed in front of the micro lens (subject side). Note that by disposing the polarization filter element and the spectral filter element between the micro lens and the photodiode, it is possible to effectively prevent the light beams from being mixed into adjacent pixels. As a result, the interference can be further prevented.
[Interference Removal Processing in Signal Processing Unit]
The signal processing unit can also generate the image signal in each wavelength range without performing the interference removal processing. For example, as described above, depending on the combination of the transmission polarization direction of the polarization filter provided in the imaging optical system and the transmission polarization direction of the polarization filter element provided in the image sensor, the light beam of each wavelength range can be received without causing the interference. Therefore, in such a case, the image signal of each wavelength range can be generated without performing the interference removal processing. In addition, even in a case in which the interference occurs, in a case in which the influence is small or in a case in which a user recognizes the influence to be acceptable, it is possible to generate the image signal in each wavelength range without performing the interference removal processing.
[Configuration of Signal Processing Unit]
The function of the image generation unit 200B (arithmetic unit) in the signal processing unit 200 can be realized by using various processors. The various processors include, for example, a central processing unit (CPU), which is a general-purpose processor that executes software (a program) to realize various functions. The various processors also include a graphics processing unit (GPU), which is a processor specialized in image processing, and a programmable logic device (PLD), which is a processor of which the circuit configuration can be changed after manufacturing, such as a field programmable gate array (FPGA). Further, the various processors also include a dedicated electric circuit, which is a processor having a circuit configuration specially designed for executing specific processing, such as an application specific integrated circuit (ASIC).
The functions of the units may be realized by one processor, or may be realized by a plurality of processors of the same type or different types (for example, a plurality of FPGAs, a combination of a CPU and an FPGA, or a combination of a CPU and a GPU). In addition, a plurality of the functions may be realized by one processor. As a first example of configuring a plurality of functions with one processor, as represented by a computer such as a server, there is a form in which one processor is configured by a combination of one or more CPUs and software, and the processor realizes a plurality of functions. As a second example, as represented by a system on chip (SoC), there is a form in which a processor is used in which the functions of the entire system are realized by a single integrated circuit (IC) chip. In this way, the various functions are configured by one or more of the above various processors as a hardware structure. Further, the hardware structure of these various processors is, more specifically, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined. These electric circuits may be electric circuits that realize the functions described above by using logical sum (OR), logical product (AND), logical negation (NOT), exclusive logical sum (XOR), and logical operations combining the above.
In a case in which the processor described above or the electric circuit executes software (a program), the processor (computer) readable code of the software to be executed is stored in a non-transitory recording medium such as a read only memory (ROM), and the processor refers to the software. The software stored in the non-transitory recording medium includes a program for executing image input, analysis, display control, and the like. The code may be recorded on a non-transitory recording medium such as various magneto-optical recording devices or a semiconductor memory, instead of the ROM. In a case of processing using the software, for example, a random access memory (RAM) can be used as a transitory storage region, and, for example, data stored in an electronically erasable and programmable read only memory (EEPROM) (not shown) can be referred to.
The coefficient storage unit 200C of the signal processing unit 200 can be realized by, for example, a memory such as the read-only memory (ROM), the electrically erasable programmable read-only memory (EEPROM), and the like.
[Configuration of Imaging Apparatus]
The imaging apparatus can also be configured as an interchangeable lens type imaging apparatus in which the imaging optical system can be exchanged. In this case, since the matrix A is uniquely determined for each lens (imaging optical system), the matrix A is prepared for each lens, and the coefficient group thereof is stored in the coefficient storage unit. In a case in which the lens is exchanged, the coefficient group of the matrix A corresponding to the exchanged lens is read out from the coefficient storage unit, the arithmetic processing is executed, and each image is generated.
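A minimal sketch of this per-lens handling, assuming a simple in-memory mapping from a lens identifier to its coefficient group; the lens identifiers and matrix sizes are illustrative, not part of the embodiment.

```python
import numpy as np

# One coefficient group (matrix A) per lens, held in the coefficient storage unit.
COEFFICIENT_STORAGE = {
    "lens_type_1": np.zeros((12, 16)),  # placeholder matrix A for a hypothetical lens type 1
    "lens_type_2": np.zeros((12, 16)),  # placeholder matrix A for a hypothetical lens type 2
}

def on_lens_exchanged(lens_id: str) -> np.ndarray:
    """Read out the coefficient group of the matrix A for the mounted lens."""
    return COEFFICIENT_STORAGE[lens_id]
```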
Number | Date | Country | Kind |
---|---|---|---|
JP2019-108829 | Jun 2019 | JP | national |
The present application is a Continuation of PCT International Application No. PCT/JP2020/021935 filed on Jun. 3, 2020 claiming priority under 35 U.S.C. § 119(a) to Japanese Patent Application No. 2019-108829 filed on Jun. 11, 2019. Each of the above applications is hereby expressly incorporated by reference, in its entirety, into the present application.
Number | Name | Date | Kind |
---|---|---|---|
9118796 | Hiramoto et al. | Aug 2015 | B2 |
20090244355 | Horie | Oct 2009 | A1 |
20100085537 | Ramella-Roman | Apr 2010 | A1 |
20130083172 | Baba | Apr 2013 | A1 |
20140192255 | Shroff | Jul 2014 | A1 |
20160065938 | Kazemzadeh | Mar 2016 | A1 |
20200241262 | Bodkin | Jul 2020 | A1 |
20220272234 | Mordechai | Aug 2022 | A1 |
Number | Date | Country |
---|---|---|
2009258618 | Nov 2009 | JP |
2012247645 | Dec 2012 | JP |
2013077935 | Apr 2013 | JP |
2014020791 | Feb 2014 | WO |
Entry |
---|
“International Search Report (Form PCT/ISA/210) of PCT/JP2020/021935,” dated Jul. 28, 2020, with English translation thereof, pp. 1-5. |
“Written Opinion of the International Searching Authority (Form PCT/ISA/237)” of PCT/JP2020/021935, dated Jul. 28, 2020, with English translation thereof, pp. 1-6. |
Number | Date | Country
---|---|---
20220078359 A1 | Mar 2022 | US
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2020/021935 | Jun 2020 | US
Child | 17528171 | | US