The present technology relates to a signal processing device, a signal processing method, and a program, and enables high-resolution distance information to be obtained easily.
Conventionally, various methods have been proposed for non-contact measurement of the distance (hereinafter referred to as “subject distance”) from an imaging device to a subject. For example, there are an active method that emits infrared rays, ultrasonic waves, lasers, or the like and calculates the subject distance based on the time taken for the reflected wave to return, the angle of the reflected wave, and the like, and a passive method that calculates the subject distance based on stereo images of the subject without requiring a device for emitting infrared rays and the like.
In the passive method, as illustrated in NPL 1 and NPL 2, edge images are generated using an image based on an ordinary ray and an image based on an extraordinary ray obtained by performing imaging through a birefringent material having a birefringent effect, and the subject distance is calculated based on the matching results of corresponding points in the edge images.
Seung-Hwan Baek, et al., “Birefractive stereo imaging for single-shot depth acquisition”, ACM Transactions on Graphics (Proc. SIGGRAPH Asia 2016), vol. 35, no. 6, article 194, 2016.
Andreas Meuleman, et al., “Single-shot Monocular RGB-D Imaging using Uneven Double Refraction”, Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020, pp. 2465-2474.
Incidentally, when a passive method that does not require a device for emitting infrared rays or the like is used to calculate the subject distance, the distance cannot be calculated for a portion where no edge is detected in the edge images, for example, because no matching result of corresponding points can be obtained for that portion. It is therefore difficult to obtain high-resolution distance information.
Therefore, it is an object of the present technology to provide a signal processing device, a signal processing method, and a program that can easily obtain high-resolution distance information.
A first aspect of the present technology provides a signal processing device including:
In the present technology, the polarized imaging unit generates polarized images based on subject light incident through a birefringent material. The polarized imaging unit has an imaging surface perpendicular to an optical axis of the birefringent material. The polarized imaging unit is configured using polarized pixels whose polarization directions have a phase difference of 90°, and the polarization direction matches the horizontal direction and the vertical direction of the birefringent material. The parallax image generator separates images with different polarization angles using the polarized images generated by the polarized imaging unit and generates an ordinary ray image and an extraordinary ray image as parallax images. For example, the parallax image generator generates the ordinary ray image using a polarized pixel whose polarization direction matches one of the horizontal direction and the vertical direction of the birefringent material, and generates the extraordinary ray image using a polarized pixel whose polarization direction matches the other direction.
The polarized imaging unit is configured using polarized pixels having a predetermined polarization direction and non-polarized pixels that are non-polarized, and the polarization direction matches the horizontal direction or the vertical direction of the birefringent material. The parallax image generator generates one of the ordinary ray image and the extraordinary ray image using the polarized pixels, and generates the other image based on an image generated using the polarized pixels and an image generated using the non-polarized pixels.
The polarized imaging unit is configured using polarized pixels having three or more different polarization directions, and the parallax image generator calculates a polarization model based on pixel values of the polarized pixels having three or more different polarization directions and generates the parallax image based on the calculated polarization model. For example, the parallax image generator searches for a polarization direction in which the other image included in one of the ordinary ray image and the extraordinary ray image is minimized, and generates an image having a phase difference of 90° from the image of the searched polarization direction as the parallax image. The parallax image generator searches for a polarization direction in which an edge component of the polarized image based on the polarization model is minimized. The parallax image generator may search for one polarization direction of two polarized images based on the polarization model, in which a sum of differences for each pixel of the two polarized images whose polarization directions have a phase difference of 90° is maximized. The parallax image generator may search for a polarization direction having a phase difference of 45° from one polarization direction of two polarized images based on the polarization model, in which a sum of differences for each pixel of the two polarized images whose polarization directions have a phase difference of 90° is minimized. The parallax image generator may search for one polarization direction of two polarized images based on the polarization model, in which a sum of differences for each pixel between an added image of two polarized images whose polarization directions have a phase difference of 90° and a polarized image having a phase difference of 45° is minimized.
The parallax image generator generates an ordinary ray image and an extraordinary ray image having a parallax in a horizontal direction as parallax images using a predetermined image parallelization function.
The distance measuring unit calculates a distance to a distance measurement position based on a parallax of the distance measurement position of the subject in the ordinary ray image and the extraordinary ray image generated by the parallax image generator.
A second aspect of the present technology provides a signal processing method including:
A third aspect of the present technology provides a program for causing a computer to perform distance measurement using polarized images, the computer executing:
The program of the present technology can be provided, in a computer-readable format, to a general-purpose computer capable of executing various program codes via a storage medium or a communication medium, for example, a storage medium such as an optical disc, a magnetic disk, or a semiconductor memory, or a communication medium such as a network. Providing the program in a computer-readable format allows processing according to the program to be realized on the computer.
An embodiment for implementing the present technology will be described below. The description will proceed in the following order.
The present technology performs imaging of a distance measurement target through a birefringent material to generate polarized images. The present technology separates images with different polarization angles using the generated polarized images, generates an ordinary ray image and an extraordinary ray image as parallax images, and calculates a distance to a distance measurement position based on a parallax of a distance measurement position in the ordinary ray image and the extraordinary ray image.
The birefringent material 21 is a material having a birefringent effect; incident light passing through the birefringent material 21 is divided into an ordinary ray and an extraordinary ray. The birefringent material 21 is, for example, an α-BBO crystal, an yttrium vanadate crystal, calcite, quartz, or the like.
The imaging optical system 22 is configured using a focus lens, a zoom lens, and the like. The imaging optical system 22 drives a focus lens, a zoom lens, and the like to form an optical image of a measurement target subject on the imaging surface of the birefringence imaging unit 20. The imaging optical system 22 may be provided with an iris (aperture) mechanism, a shutter mechanism, or the like.
The polarized imaging unit 25 is configured using a polarization element and an image sensor, and generates a polarized image.
The birefringence imaging unit 20 configured in this way generates, as parallax images, a first polarized image based on ordinary rays and a second polarized image based on extraordinary rays.
When the subject light representing the subject OB is incident on the birefringent material 21, the subject light is divided into an ordinary ray Rx and an extraordinary ray Ry and emitted to the polarized imaging unit 25. That is, the polarized imaging unit 25 receives a ray representing an image Gc obtained by mixing an image based on the ordinary ray Rx and an image based on the extraordinary ray Ry.
The image sensor of the polarized imaging unit 25 photoelectrically converts the light incident through the polarizing filter 252 to generate a polarized image. For example, in the case of
The parallax image generator 30 separates the ordinary ray image Go and the extraordinary ray image Ge from the mixed image generated by the birefringence imaging unit 20 to generate parallax images. The parallax image generator 30 may generate an average image by performing gain adjustment corresponding to the polarizing filter on a non-polarized image generated using non-polarized pixels (not illustrated) having no polarizing filter, and may generate parallax images based on the polarized images for each polarization direction or based on the polarized images and the average image. Note that the average image is an image representing the average change in luminance when the polarization direction is changed. When the image sizes of the polarized images and the average image differ, the parallax image generator 30 performs interpolation processing or the like so that the image sizes (the numbers of pixels in the horizontal and vertical directions) of the polarized images for each polarization direction and the average image become equal.
The distance measuring unit 40 performs corresponding point matching processing using the parallax images generated by the parallax image generator 30, and calculates the parallax of the distance measurement position P. The distance measuring unit 40 calculates the distance to the distance measurement position P on the subject OB based on the calculated parallax.
Next, the configuration and operation of the first embodiment will be described. In the first embodiment, the polarized imaging unit 25 has polarized pixels having at least two orthogonal polarization directions.
The parallax image generator 30 generates, as parallax images, an ordinary ray image based on ordinary rays and an extraordinary ray image based on extraordinary rays from the polarized images acquired by the birefringence imaging unit 20.
The distance measuring unit 40 performs corresponding point matching processing using the ordinary ray image Go and the extraordinary ray image Ge generated by the parallax image generator 30, and calculates the parallax of the distance measurement position P. The distance measuring unit 40 calculates the distance Z(P) to the distance measurement position P on the subject OB based on the calculated parallax.
Next, the operation of the first embodiment will be described. A baseline length B, which is the interval between the acquisition position of the ordinary ray image Go and the acquisition position of the extraordinary ray image Ge and which causes the parallax between the distance measurement positions Po and Pe, is measured in advance. In the birefringence imaging unit 20, the focal length f is the focal length obtained when the distance measurement position P of the subject OB is in focus.
Here, calibration is performed such that the pixel value based on the ordinary ray that has passed through the birefringent material from the distance measurement position P on the subject OB is obtained in the polarized pixel with the polarization direction of 0°, and the pixel value based on the extraordinary ray that has passed through the birefringent material from the distance measurement position P on the subject OB is obtained in the polarized pixel with the polarization direction of 90°.
The parallax image generator 30 generates an ordinary ray image Go representing an optical image of ordinary rays using polarized pixels with the polarization direction of 0°, and an extraordinary ray image Ge representing an optical image of extraordinary rays using polarized pixels with the polarization direction of 90°.
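As a concrete illustration of this separation, the following Python sketch extracts the 0-degree and 90-degree polarized images from a raw frame of the polarized imaging unit 25. The column-alternating pixel layout assumed here is purely illustrative; the actual arrangement of the polarized pixels is not specified in this description.

```python
import numpy as np

def split_orthogonal_polarized_images(raw):
    # Assumption: 0-degree and 90-degree polarized pixels alternate column by
    # column; the real layout of the polarized imaging unit 25 may differ.
    g0 = raw[:, 0::2].astype(np.float64)   # pixels with polarization direction 0 deg
    g90 = raw[:, 1::2].astype(np.float64)  # pixels with polarization direction 90 deg
    # After the calibration of step ST3, the 0-degree image corresponds to the
    # ordinary ray image Go and the 90-degree image to the extraordinary ray image Ge.
    return g0, g90
```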
The distance measuring unit 40 performs matching processing of the distance measurement position P using the ordinary ray image Go and the extraordinary ray image Ge generated by the parallax image generator 30, and calculates a parallax ∥PoPe∥, which is the difference between the distance measurement position Po in the ordinary ray image Go and the distance measurement position Pe in the extraordinary ray image Ge. The distance measuring unit 40 calculates, based on Equation (1), the distance Z(P) to the distance measurement position P in the subject OB based on the calculated parallax ∥PoPe∥, the baseline length B, and the focal length f.
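The body of Equation (1) is not reproduced here; the sketch below assumes it is the usual stereo triangulation relation Z(P) = f·B/∥PoPe∥, with the focal length expressed in pixels so that the units cancel against the parallax.

```python
def distance_from_parallax(parallax_px, baseline_b, focal_length_f):
    # Assumed form of Equation (1): Z(P) = f * B / ||PoPe||.
    # parallax_px: ||PoPe|| in pixels, baseline_b: baseline length B,
    # focal_length_f: focal length f expressed in pixels.
    if parallax_px <= 0:
        raise ValueError("parallax must be positive to triangulate a distance")
    return focal_length_f * baseline_b / parallax_px
```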
The measurement system 10 performs calibration so that an ordinary ray image based on ordinary rays and an extraordinary ray image based on extraordinary rays can be separated from the mixed image generated by the birefringence imaging unit 20.
In step ST1, the measurement system calculates the focal length. As in the conventional calibration method, the measurement system 10 performs calibration using internal parameters, calculates the focal length f, and then the process proceeds to step ST2.
In step ST2, the measurement system adjusts the positions of the birefringent material and the image sensor. The measurement system 10 adjusts the positions of the birefringent material and the image sensor so that the z-axis (optical axis) of the birefringent material is perpendicular to the imaging surface of the image sensor of the polarized imaging unit.
In this calibration method, as illustrated in
In the calibration method described above, a straight line Li connecting the keypoint pairs at equal positions on the checkerboard is calculated for each keypoint pair. For example, a straight line L1 connecting the keypoints Pd1 and Po1, a straight line L2 connecting the keypoints Pd2 and Po2, and a straight line L3 connecting the keypoints Pd3 and Po3 are calculated. An intersection point E of a plurality of straight lines Li (i=1, 2, 3, . . . L, L=3 in
[Math. 2]
Li = {(u, v) | ai·u + bi·v + ci = 0}   (2)
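The text only states that the intersection point E of the straight lines Li is calculated; one common way to obtain it, sketched here under the assumption of a least-squares fit over the line coefficients of Equation (2), is:

```python
import numpy as np

def intersection_of_lines(coeffs):
    # coeffs: list of (ai, bi, ci) tuples for the straight lines Li of Equation (2).
    # The intersection point E is estimated in the least-squares sense by
    # minimizing sum_i (ai*u + bi*v + ci)^2 over (u, v).
    A = np.array([[a, b] for a, b, _ in coeffs], dtype=np.float64)
    d = np.array([-c for _, _, c in coeffs], dtype=np.float64)
    (u, v), *_ = np.linalg.lstsq(A, d, rcond=None)
    return u, v
```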
In this calibration method, the birefringent material 21 is rotated around the y-axis and the x-axis to adjust the position of the intersection point E, and the intersection point E is set to the position of the image center C.
The measurement system adjusts the birefringent material 21 so that the intersection point E is positioned at the image center C, thereby making the z-axis of the birefringent material perpendicular to the imaging surface of the image sensor, and then, the process proceeds to step ST3.
In step ST3, the measurement system adjusts the positions of the birefringent material and the polarizing filter. The measurement system 10 matches the y-axis of the birefringent material with the 0-degree direction of the polarizing filter in the polarized imaging unit so that a polarized image generated using polarized pixels with the polarization direction of 0° becomes an ordinary ray image and a polarized image generated using polarized pixels with the polarization direction of 90° becomes an extraordinary ray image. In step ST3, the y-axis of the birefringent material and the 90-degree direction of the polarizing filter may instead be matched so that the 90-degree polarized image represents the ordinary ray image and the 0-degree polarized image represents the extraordinary ray image.
In this calibration method, as illustrated in
In the above-described calibration method, using keypoint pairs at equal positions on the checkerboard, a circle Cri centered on the keypoint of the keypoint group Pdi and passing through the corresponding keypoint of the keypoint group Pei is calculated for each keypoint pair. For example, a circle Cr1 centered on the keypoint Pd1 and passing through the keypoint Pe1, a circle Cr2 centered on the keypoint Pd2 and passing through the keypoint Pe2, and a circle Cr3 centered on the keypoint Pd3 and passing through the keypoint Pe3 are calculated. An intersection point A of a plurality of circles Cri (i=1, 2, 3, . . . L, L=3 in
In this calibration method, the position of the intersection point A is adjusted by rotating the birefringent material 21 about the z-axis so that a vector connecting the intersection point A and the image center C is aligned in the vertical direction of the image (for example, the upward vertical direction). In this calibration method, the birefringent material 21 is adjusted so that the vector connecting the intersection point A and the image center C is in the vertical direction of the image, whereby the y-axis of the birefringent material corresponds to the 0-degree polarization direction of the polarizing filter.
The measurement system performs calibration in which the y-axis of the birefringent material corresponds to a predetermined polarization direction of the polarizing filter, and then, the process proceeds to step ST4.
In step ST4, the measurement system calculates an image parallelization function. The measurement system 10 calculates an image parallelization function T that converts the polarized image generated by the birefringence imaging unit 20 into a stereo mixed image obtained by mixing right-viewpoint images and left-viewpoint images. The image parallelization function T is calculated using the method described in NPL 2, for example.
In this method, the image parallelization function T is calculated using a baseline length B set in advance. As illustrated in Equation (3), the image parallelization function T is a function that converts the coordinates t(u,v) of the image I before parallelization to the coordinates (u,v) of the image Ir obtained by mixing right-viewpoint images and left-viewpoint images.
[Math. 3]
Ir = T(I) = {Ir(u, v) ← I[t(u, v)]}   (3)
The image parallelization function T can be calculated using, for example, a recursive method. Specifically, as illustrated in Equation (4), the coordinates t(u,v) are calculated from the coordinates (0,v) at the left end to (u,v) at the right end. Here, the baseline b(u,v) of the pixel (u,v) is calculated based on Equation (5). Note that in Equation (5), the focal length f and the distance Zcb to the checkerboard are set in advance before calculating the image parallelization function. ∥PoPe∥ is defined at the keypoints on the checkerboard, and its value at pixels that are not keypoints is calculated by interpolation using the values of neighboring keypoints.
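As a hedged sketch of how the parallelization of Equation (3) can be applied once the coordinate map t(u, v) has been computed (for example, by the recursive calculation of Equations (4) and (5), whose bodies are not reproduced here), the following uses nearest-neighbour sampling; the map layout and the sampling method are assumptions of this sketch, not the literal implementation.

```python
import numpy as np

def apply_image_parallelization(image, t_map):
    # t_map: array of shape (H, W, 2) holding the source coordinates t(u, v)
    # computed during calibration.  Equation (3): Ir(u, v) <- I[t(u, v)].
    h, w = image.shape[:2]
    src = np.rint(t_map).astype(int)
    u_src = np.clip(src[..., 0], 0, w - 1)  # column (u) coordinate
    v_src = np.clip(src[..., 1], 0, h - 1)  # row (v) coordinate
    return image[v_src, u_src]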
After performing the calibration of
In step ST12, the measurement system performs image parallelization processing. The parallax image generator 30 of the measurement system 10 performs image parallelization processing on the polarized image acquired by the birefringence imaging unit 20 using the image parallelization function T calculated by calibration. The parallax image generator 30 performs image parallelization processing to convert the polarized image into a stereo mixed image in which the ordinary ray image and the extraordinary ray image are images from the right viewpoint and the left viewpoint and the distance measurement position has a parallax according to the distance, and then, the process proceeds to step ST13.
In step ST13, the measurement system acquires a 0-degree polarized image. The parallax image generator 30 of the measurement system 10 acquires a 0-degree polarized image (ordinary ray image) generated using the polarized pixel with the polarization direction of 0° as the image from one viewpoint from the stereo mixed image generated in step ST12, and then, the process proceeds to step ST14.
In step ST14, the measurement system acquires a 90-degree polarized image. The parallax image generator 30 of the measurement system 10 acquires a 90-degree polarized image (extraordinary ray image) generated using the polarized pixel with the polarization direction of 90° as the image from the other viewpoint from the stereo mixed image generated in step ST12, and then, the process proceeds to step ST15.
In step ST15, the measurement system performs corresponding point matching. The distance measuring unit 40 of the measurement system 10 performs corresponding point matching using the 0-degree polarized image (ordinary ray image) that is an image from one viewpoint acquired in step ST13, and the 90-degree polarized image (extraordinary ray image) that is an image from the other viewpoint acquired in step ST14 and calculates the positional difference ∥PoPe∥ between the distance measurement position Po in the ordinary ray image and the distance measurement position Pe in the extraordinary ray image, and then, the process proceeds to step ST16.
The distance measuring unit 40 moves the center position (xs, ys) of the reference image, which has a region size equal to that of the template image, within the range illustrated by Equations (7) and (8), and calculates the center position (xst, yst) that minimizes the error between the template image and the reference image of the search range ARs. The distance measuring unit 40 sets the distance measurement position Pe as the position corresponding to the distance measurement position Po where the error is minimized. In this case, the coordinates (xPe, yPe) of the distance measurement position Pe are the coordinates illustrated in Equation (9).
When the distance measuring unit 40 uses, for example, SAD as the error between the template image and the search image, the coordinates (xst, yst) at which the error is minimized are the coordinates (xs, ys) when the evaluation value H illustrated in Equation (10) is obtained. Note that SAD is defined as illustrated in Equation (11).
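Equations (7) to (11) are not reproduced above, so the following SAD block-matching sketch uses illustrative values for the template size and the search range, and assumes that after image parallelization the corresponding point is searched along the horizontal direction only.

```python
import numpy as np

def match_distance_measurement_position(go, ge, po, half=7, search=64):
    # SAD block matching for the corresponding point of Po (steps ST15 / ST26 / ST47).
    # go, ge: ordinary / extraordinary ray images after parallelization.
    # po: (x, y) of the distance measurement position Po, assumed to lie far
    # enough from the image border for the windows below to stay inside it.
    # half, search: illustrative template half-size and horizontal search range,
    # standing in for Equations (7) and (8).
    x0, y0 = po
    template = go[y0 - half:y0 + half + 1, x0 - half:x0 + half + 1].astype(np.float64)
    best_sad, best_x = None, x0
    for xs in range(x0 - search, x0 + search + 1):
        ref = ge[y0 - half:y0 + half + 1, xs - half:xs + half + 1].astype(np.float64)
        if ref.shape != template.shape:
            continue                                # window falls outside the image
        sad = np.abs(template - ref).sum()          # Equation (11): sum of absolute differences
        if best_sad is None or sad < best_sad:      # Equation (10): keep the minimum error
            best_sad, best_x = sad, xs
    pe = (best_x, y0)
    parallax = abs(best_x - x0)                     # ||PoPe|| of Equation (12); the y difference is 0 here
    return pe, parallax
```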
The distance measuring unit 40 performs such corresponding point matching, and calculates the parallax ∥PoPe∥ based on Equation (12).
[Math. 8]
∥PoPe∥ = √((Pox − Pex)^2 + (Poy − Pey)^2)   (12)
The measurement system calculates the distance in step ST16. The distance measuring unit 40 of the measurement system performs the calculation of Equation (1) using the focal length f and the baseline length B set in advance and the parallax ∥PoPe∥ calculated in step ST15 and calculates the distance Z(P) to the distance measurement position P.
As described above, according to the first embodiment, it is possible to generate a polarized image representing an optical image based on ordinary rays and a polarized image representing an optical image based on extraordinary rays and measure the distance to a distance measurement position based on the parallax between the distance measurement positions in the two polarized images. Therefore, corresponding point matching can be performed even in a portion where no edge is detected, and distance information with higher resolution than when edge images are used can be obtained.
Next, a second embodiment will be described. In the first embodiment described above, the case in which the polarized imaging unit 25 is configured using polarized pixels whose polarization directions are orthogonal to each other has been described. In the second embodiment, a case in which the polarized imaging unit 25 is configured using polarized pixels having one polarization direction and non-polarized pixels will be described.
The parallax image generator 30 generates, from the image acquired by the birefringence imaging unit 20, a polarized image based on the ordinary ray using the polarized pixels and an average image using the non-polarized pixels, and generates the extraordinary ray image from these images.
The distance measuring unit 40 performs corresponding point matching processing using the ordinary ray image Go and the extraordinary ray image Ge generated by the parallax image generator 30, and calculates the parallax of the distance measurement position P. The distance measuring unit 40 calculates the distance Z(P) to the distance measurement position P on the subject OB based on the calculated parallax.
In step ST22, the measurement system performs image parallelization processing. The parallax image generator 30 of the measurement system 10 performs image parallelization processing on the polarized image acquired by the birefringence imaging unit 20 using the image parallelization function T calculated by calibration to convert the polarized image into a stereo mixed image in which the ordinary ray image and the extraordinary ray image are images from the right viewpoint and the left viewpoint and the distance measurement position has a parallax according to the distance, and then, the process proceeds to step ST23.
In step ST23, the measurement system acquires a 0-degree polarized image. The parallax image generator 30 of the measurement system 10 acquires a 0-degree polarized image (ordinary ray image Go) generated using the polarized pixel with the polarization direction of 0° as the image from one viewpoint from the stereo mixed image generated in step ST22, and then, the process proceeds to step ST24.
In step ST24, the measurement system acquires an average image. The parallax image generator 30 of the measurement system 10 acquires the average image Gmean generated using the non-polarized pixels in the stereo mixed image generated in step ST22, and then, the process proceeds to step ST25.
In step ST25, the measurement system acquires a 90-degree polarized image. The parallax image generator 30 of the measurement system 10 performs the calculation of Equation (13) using the pixel value I0 of the ordinary ray image Go acquired in step ST23 and the pixel value Imean of the average image Gmean acquired in step ST24 and calculates the pixel value I90 of the 90-degree polarized image, that is, the extraordinary ray image Ge, and then, the process proceeds to step ST26.
[Math. 9]
I90 = 2 × Imean − I0   (13)
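A minimal sketch of the calculation of Equation (13), assuming the gain of the non-polarized pixels has already been adjusted so that the average image satisfies Imean = (I0 + I90) / 2:

```python
import numpy as np

def extraordinary_from_average(i0, i_mean):
    # Equation (13): I90 = 2 * Imean - I0.
    # i0: 0-degree polarized image (ordinary ray image Go),
    # i_mean: average image Gmean generated from the non-polarized pixels.
    return 2.0 * np.asarray(i_mean, dtype=np.float64) - np.asarray(i0, dtype=np.float64)
```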
In step ST26, the measurement system performs corresponding point matching. The distance measuring unit 40 of the measurement system 10 performs corresponding point matching using the 0-degree polarized image (ordinary ray image) that is an image from one viewpoint acquired in step ST23, and the 90-degree polarized image (extraordinary ray image) that is an image from the other viewpoint acquired in step ST25 and calculates the positional difference ∥PoPe∥ between the distance measurement position Po in the ordinary ray image and the distance measurement position Pe in the extraordinary ray image, and then, the process proceeds to step ST27.
The measurement system calculates the distance in step ST27. The distance measuring unit 40 of the measurement system performs the calculation of Equation (1) using the focal length f and the baseline length B set in advance and the parallax ∥PoPe∥ calculated in step ST26 and calculates the distance Z(P) to the distance measurement position P.
The measurement system may acquire a 90-degree polarized image in step ST23 and calculate a 0-degree polarized image in step ST25; in this case, the 0-degree polarized image is the extraordinary ray image and the 90-degree polarized image is the ordinary ray image.
As described above, according to the second embodiment, similar to the first embodiment, distance information with higher resolution than when edge images are used can be obtained. The number of polarization directions of the polarized pixels can be reduced compared to the first embodiment.
Next, a third embodiment will be described. In the first embodiment and the second embodiment described above, the case in which the polarized imaging unit 25 is configured using polarized pixels whose polarization directions are orthogonal to each other, and the case in which the polarized imaging unit 25 is configured using polarized pixels having one polarization direction and non-polarized pixels have been described. In the third embodiment, a case in which the polarized imaging unit 25 is configured using three or more types of polarized pixels will be described.
When a polarizing plate is installed perpendicular to the observation direction and partially polarized light is observed through the polarizing plate, the luminance of the transmitted light changes as the polarizing plate is rotated. Here, let Imax be the highest luminance and Imin be the lowest luminance observed while the polarizing plate is rotated, and define a two-dimensional coordinate system (x-axis and y-axis) on the plane of the polarizing plate. The polarization angle θpol, which is the angle of the rotated polarizing plate, is defined as the angle between the polarization axis of the polarizing plate and the x-axis and is measured from the x-axis toward the y-axis. The polarization axis is an axis representing the direction in which light is polarized after passing through the polarizing plate. When the polarizing plate is rotated, the polarization direction has a periodicity of 180°, and the polarization angle takes values from 0° to 180°. It is known that, if the polarization angle θpol at which the maximum luminance Imax is observed is defined as the phase angle φ, the luminance I observed when the polarizing plate is rotated can be represented by the polarization model illustrated in Equation (14).
Equation (14) can be converted to Equation (15). When the observed value (luminance) of the polarized pixel with a polarization direction of 0° is “I0”, the observed value (luminance) of the polarized pixel with a polarization direction of 45° is “I1”, the observed value (luminance) of the polarized pixel with a polarization direction of 90° is “I2”, and the observed value (luminance) of the polarized pixel with a polarization direction of 135° is “I3”, the coefficient a in Equation (15) is the value illustrated in Equation (16). The coefficients b and c in Equation (15) are values illustrated in Equations (17) and (18). Note that Equation (18) represents the average image described above.
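The bodies of Equations (14) to (18) are not reproduced here; the sketch below assumes the commonly used parameterization I(θ) = a·cos 2θ + b·sin 2θ + c, which is consistent with Equation (18) representing the average image. The coefficient formulas are therefore assumptions of this sketch.

```python
import numpy as np

def fit_polarization_model(i0, i45, i90, i135):
    # Cosine fitting (step ST44) from the observed values of the
    # 0/45/90/135-degree polarized pixels, assuming
    # I(theta) = a*cos(2*theta) + b*sin(2*theta) + c.
    i0, i45, i90, i135 = (np.asarray(x, dtype=np.float64) for x in (i0, i45, i90, i135))
    a = (i0 - i90) / 2.0                # assumed form of Equation (16)
    b = (i45 - i135) / 2.0              # assumed form of Equation (17)
    c = (i0 + i45 + i90 + i135) / 4.0   # Equation (18): the average image
    return a, b, c

def evaluate_polarization_model(a, b, c, theta_deg):
    # Pixel value predicted by the fitted model for polarization direction theta.
    th = np.deg2rad(theta_deg)
    return a * np.cos(2.0 * th) + b * np.sin(2.0 * th) + c
```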
In the third embodiment, a case in which an ordinary ray image and an extraordinary ray image are generated as parallax images from a polarization model based on pixel values of three or more polarized pixels will be described.
The parallax image generator 30 calculates the polarization model represented by Equation (14) or (15) for each pixel using the pixel values of the polarized image for each polarization direction, and obtains the clearest parallax image.
The parallax image generator 30 generates a polarized image Gθs in the clearest polarization direction illustrated in
The distance measuring unit 40 performs corresponding point matching processing using the parallax images generated by the parallax image generator 30, and calculates the parallax of the distance measurement position P. The distance measuring unit 40 calculates the distance Z(P) to the distance measurement position P on the subject OB based on the calculated parallax.
Next, the operation of the third embodiment will be described. The baseline length B and the focal length f are measured in advance. Since the use of three or more types of polarized pixels with different polarization directions makes it possible to estimate the pixel value in a desired polarization direction as described later, it is not necessary in the calibration to perform the processing of matching the polarizing filter with the y-axis direction of the birefringent material.
In step ST31, the measurement system calculates the focal length. The measurement system 10 performs the same processing as the conventional calibration method or step ST1 in
In step ST32, the measurement system adjusts the positions of the birefringent material and the image sensor. The measurement system 10 adjusts the positions of the birefringent material and the image sensor so that the z-axis (optical axis) of the birefringent material is perpendicular to the imaging surface of the image sensor of the polarized imaging unit, and then, the process proceeds to step ST33.
In step ST33, the measurement system calculates an image parallelization function. The measurement system 10 calculates an image parallelization function T that converts the polarized image generated by the birefringence imaging unit 20 into a stereo mixed image obtained by mixing right-viewpoint images and left-viewpoint images. The image parallelization function T is calculated using the method described in NPL 2, for example.
After performing the calibration of
In step ST42, the measurement system performs image parallelization processing. The parallax image generator 30 of the measurement system 10 performs image parallelization processing on the polarized image acquired by the birefringence imaging unit 20 using the image parallelization function T calculated by calibration to convert the polarized image into a stereo mixed image in which the ordinary ray image and the extraordinary ray image are images from the right viewpoint and the left viewpoint and the distance measurement position has a parallax according to the distance, and then, the process proceeds to step ST43.
In step ST43, the measurement system acquires three or more types of polarized images. The parallax image generator 30 of the measurement system 10 acquires polarized images for each of three or more polarization directions from the stereo mixed image generated in step ST42. For example, if the polarized imaging unit 25 has polarized pixels with polarization directions of 0°, 45°, 90°, and 135°, the parallax image generator 30 acquires a polarized image generated using the polarized pixels with a polarization direction of 0°, a polarized image generated using the polarized pixels with a polarization direction of 45°, a polarized image generated using the polarized pixels with a polarization direction of 90°, and a polarized image generated using the polarized pixels with a polarization direction of 135°, and then, the process proceeds to step ST44.
In step ST44, the measurement system performs cosine fitting. The parallax image generator 30 of the measurement system 10 calculates a polarization model for each polarized pixel block using the pixel values of the polarized image for each polarization direction. When the pixel value of the polarized image for each polarization direction for each pixel is obtained by interpolation processing, the parallax image generator 30 calculates the polarization model for each pixel, and then, the process proceeds to step ST45.
In step ST45, the measurement system searches for the polarization direction in which the polarized image becomes the clearest. In the first search method, the parallax image generator 30 of the measurement system 10 performs the calculation of Equation (19) using a function e for edge extraction such as the Sobel method, the Laplacian method, or the Canny method. The parallax image generator 30 sets the angle β at which the evaluation value H indicating the edge component is minimized as the polarization direction θs in which the polarized image becomes the clearest, that is, the polarization direction θs in which a polarized image is obtained in which the extraordinary ray image is least mixed into the ordinary ray image or the ordinary ray image is least mixed into the extraordinary ray image. In Equation (19), e(Iβ)i is the pixel value (luminance) of the i-th pixel in the edge image, and “1 to K” indicates a predetermined image range used for searching the polarization direction; the predetermined image range may be the entire screen region or an image range that is set in advance so as to include the measurement target subject.
In this way, the parallax image generator 30 sets the polarization direction θs in which the polarized image becomes the clearest as the polarization direction in which the edge component is minimized.
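A sketch of this first search method under the assumptions above, using the fitted coefficients a, b, c from the cosine fitting and a simple finite-difference gradient in place of the Sobel, Laplacian, or Canny edge function e:

```python
import numpy as np

def search_clearest_direction_edge(a, b, c, step_deg=1.0):
    # First search method (Equation (19)): choose the angle beta whose
    # polarized image has the smallest total edge component.
    best_h, best_beta = None, 0.0
    for beta in np.arange(0.0, 180.0, step_deg):    # polarization has a 180-degree period
        img = evaluate_polarization_model(a, b, c, beta)
        gy, gx = np.gradient(img)                   # stand-in for the edge function e
        h = np.abs(gx).sum() + np.abs(gy).sum()     # evaluation value H over the image range
        if best_h is None or h < best_h:
            best_h, best_beta = h, beta
    return best_beta                                # polarization direction theta_s
```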
Alternatively, the parallax image generator 30 may search for the polarization direction in which the polarized image becomes the clearest using another search method. In the second search method, the search is performed using polarized images whose polarization directions have a phase difference of 90°. The parallax image generator 30 calculates the difference value |Iβ − Iβ−90| using the pixel value Iβ of the polarized image in the polarization direction β and the pixel value Iβ−90 of the polarized image in the polarization direction (β−90). In two polarized images whose polarization directions have a phase difference of 90°, the difference value is maximized if the ordinary ray image (extraordinary ray image) does not include the extraordinary ray image (ordinary ray image), and the difference value decreases as the ordinary ray image (extraordinary ray image) is mixed into the extraordinary ray image (ordinary ray image). Therefore, the parallax image generator 30 performs the calculation illustrated in Equation (20) and sets, as the polarization direction θs at which the polarized image becomes the clearest, the angle β at which the evaluation value H indicating the sum of the differences for each pixel in a predetermined image range of the polarized images whose polarization directions have a phase difference of 90° is maximized.
In this way, the parallax image generator 30 sets the polarization direction β in which the difference between the polarized images whose polarization directions have a phase difference of 90° is maximized as the polarization direction θs in which the polarized image becomes the clearest.
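Under the same assumptions, the second search method can be sketched as follows; the 90-degree-apart images are synthesized from the fitted model rather than captured directly.

```python
import numpy as np

def search_clearest_direction_difference(a, b, c, step_deg=1.0):
    # Second search method (Equation (20)): choose the angle beta that
    # maximizes the sum of per-pixel differences between the polarized
    # images whose polarization directions differ by 90 degrees.
    best_h, best_beta = None, 0.0
    for beta in np.arange(0.0, 90.0, step_deg):     # |I(beta) - I(beta - 90)| repeats every 90 degrees
        diff = np.abs(evaluate_polarization_model(a, b, c, beta)
                      - evaluate_polarization_model(a, b, c, beta - 90.0)).sum()
        if best_h is None or diff > best_h:
            best_h, best_beta = diff, beta
    return best_beta                                # polarization direction theta_s
```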
Since the two polarization directions in which the polarized images become clear have a phase difference of 90° from each other, the parallax image generator 30 may perform the calculation illustrated in Equation (21) and set, as the polarization direction θs at which the polarized image becomes the clearest, the angle having a phase difference of 45° from the angle β at which the evaluation value H indicating the sum of the differences for each pixel in a predetermined image range of the polarized images whose polarization directions have a phase difference of 90° is minimized.
Next, as a third method, search may be performed using three polarized images whose polarization directions have a phase difference of 45°. In the third method, the parallax image generator 30 performs calculation illustrated in Equation (22) using a pixel value Iβ of the polarized image in the polarization direction β, a pixel value Iβ+45 of the polarized image in the polarization direction (β+45), and a pixel value Iβ−90 of the polarized image in the polarization direction (β−90), and a polarization direction β in which the evaluation value H indicating the sum of differences in a predetermined image range between an added image of the polarized images whose polarization directions have a phase difference of 90° and a polarized image having a phase difference of 45° is minimized is set as the polarization direction θs in which the polarized image becomes the clearest.
The parallax image generator 30 adds the pixel value Iβ of the polarized image in the polarization direction β and the pixel value Iβ−90 of the polarized image in the polarization direction (β−90) to generate an added image representing the ordinary ray image and the extraordinary ray image. The parallax image generator 30 subtracts the pixel value Iβ+45 of the polarized image in the polarization direction (β+45) from the pixel value of the added image.
The parallax image generator 30 sets the polarization direction β in which the difference between the added image and the polarized image in the polarization direction (β+45) is minimized as the polarization direction θs in which the polarized image becomes the clearest.
Next, as a fourth method, the parallax image generator 30 performs search using three polarized images whose polarization directions have a phase difference of 45°. In the fourth method, the parallax image generator 30 performs calculation illustrated in Equation (23) using the pixel value Iβ of the polarized image in the polarization direction β, the pixel value Iβ−45 of the polarized image in the polarization direction (β−45), and the pixel value Iβ−90 of the polarized image in the polarization direction (β−90), and the polarization direction β in which the evaluation value H indicating the sum of differences in a predetermined image range between the added image of the polarized images whose polarization directions have a phase difference of 90° and a polarized image having a phase difference of 45° is minimized is set as the polarization direction θs in which the polarized image becomes the clearest.
The parallax image generator 30 subtracts the pixel value Iβ of the polarized image in the polarization direction β from the pixel value Iβ−45 of the polarized image in the polarization direction (β−45), and generates a difference image in which the ordinary ray image is attenuated in the image including the ordinary ray image and the extraordinary ray image. The parallax image generator 30 subtracts the pixel value Iβ−90 of the polarized image in the polarization direction (β−90) from the pixel value of the difference image.
The parallax image generator 30 sets the polarization direction β in which the difference between the difference image and the polarized image in the polarization direction (β−90) is minimized as the polarization direction θs in which the polarized image becomes the clearest.
The parallax image generator 30 searches for a polarization direction in which the polarized image becomes the clearest based on any one of the first to fourth search methods, and then, the process proceeds to step ST46. The parallax image generator 30 may use another search method if the polarization direction cannot be searched by any one of the first to fourth search methods and may determine the polarization direction in which the polarized image becomes the clearest using the search results of a plurality of search methods.
In step ST46, the measurement system generates a polarized image based on the search result. The parallax image generator 30 of the measurement system 10 generates the polarized image in the polarization direction θs searched in step ST45 and the polarized image in the polarization direction (θs+90) or the polarization direction (θs−90) based on Equation (14) or (15), and then, the process proceeds to step ST47.
In step ST47, the measurement system performs corresponding point matching. The distance measuring unit 40 of the measurement system 10 performs corresponding point matching using the polarized image in the polarization direction θs generated in step ST46 (corresponding to one of the ordinary ray image and the extraordinary ray image) and the polarized image in the polarization direction (θs+90) or the polarization direction (θs−90) (corresponding to the other of the ordinary ray image and the extraordinary ray image), and calculates the positional difference ∥PoPe∥ between the position Po of the distance measurement target in the ordinary ray image and the position Pe of the distance measurement target in the extraordinary ray image, and then, the process proceeds to step ST48.
The measurement system calculates the distance in step ST48. The distance measuring unit 40 of the measurement system performs the calculation of Equation (1) using the focal length f and the baseline length B set in advance and the parallax ∥PoPe∥ calculated in step ST47 and calculates the distance Z(P) to the distance measurement position P.
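Putting the steps of the third embodiment together, a usage sketch reusing the illustrative helpers defined above might look as follows; the inputs (the four polarized images, the position po, the baseline length, and the focal length) are assumed to come from the acquisition and calibration steps, and none of this is the literal implementation.

```python
def measure_distance_third_embodiment(i0, i45, i90, i135, po, baseline_b, focal_length_f):
    # Steps ST44 to ST48, using the illustrative helpers sketched earlier.
    a, b, c = fit_polarization_model(i0, i45, i90, i135)               # step ST44: cosine fitting
    theta_s = search_clearest_direction_edge(a, b, c)                  # step ST45: first search method
    go = evaluate_polarization_model(a, b, c, theta_s)                 # step ST46: one of Go / Ge
    ge = evaluate_polarization_model(a, b, c, theta_s + 90.0)          # step ST46: the other of Go / Ge
    pe, parallax = match_distance_measurement_position(go, ge, po)     # step ST47: corresponding point matching
    return distance_from_parallax(parallax, baseline_b, focal_length_f)  # step ST48: Equation (1)
```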
As described above, according to the third embodiment, as in the first and second embodiments, corresponding point matching can be performed even in a portion where no edge is detected, and distance information with higher resolution than when edge images are used can be obtained. High-resolution distance information can be obtained based on the polarization characteristics of the subject.
The pixel configuration of the polarized imaging unit is not limited to the configurations of the first to third embodiments, and may be the configurations of
By providing white pixels in this way, as disclosed in the Patent Literature “WO 2016/136085”, the dynamic range in generating normal line information can be expanded as compared to the case in which white pixels are not provided. Since the white pixels have a good S/N ratio, the calculation of the color difference is less susceptible to noise.
Note that the configurations illustrated in
With such a pixel configuration, the distance to the distance measurement position can be measured based on the polarized image, and the polarization characteristics of each pixel can be obtained. A non-polarized color image can be obtained.
The technology according to the present disclosure can be applied to various fields. For example, the technology according to the present disclosure may be realized as a device equipped in any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, and a robot. The technology may be realized as a device mounted in equipment that is used in a production process in a factory or equipment that is used in a construction field.
If applied to such a field, it is possible to obtain high-resolution distance information even for a subject with few edges without using a plurality of imaging devices. Therefore, the surrounding environment can be grasped three-dimensionally with high accuracy, and fatigue of drivers and workers can be reduced. Automated driving and the like can be performed more safely.
A series of processes described in the specification can be executed by hardware, software, or a composite configuration of both. When executing processing by software, a program recording a processing sequence is installed in a memory within a computer incorporated in dedicated hardware and executed. Alternatively, the program can be installed and executed in a general-purpose computer capable of executing various processes. For example, the program can be recorded in advance in a hard disk, SSD (Solid State Drive), or ROM (Read Only Memory) as a recording medium. Alternatively, the program can be temporarily or permanently stored (recorded) in a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), a MO (Magneto optical) disk, a DVD (Digital Versatile Disc), a BD (Blu-Ray Disc (registered trademark)), a magnetic disk, and a semiconductor memory card. Such a removable recording medium can be provided as so-called package software.
The program may be transferred from a download site to the computer wirelessly or by wire via a network such as a local area network (LAN) or the Internet, in addition to being installed in the computer from the removable recording medium. The computer can receive the program transferred in this way and install the program in a recording medium such as a built-in hard disk.
The effects described in the present specification are merely examples and are not limiting, and there may be additional effects not described. The present technology should not be construed as being limited to the embodiments described above. The embodiments disclose the present technology in the form of examples, and it is obvious that a person skilled in the art can modify or substitute the embodiments without departing from the gist of the present technology. That is, the claims should be taken into consideration in order to determine the gist of the present technology.
The signal processing device of the present technology can also have the following configuration.
The present technology includes the following imaging devices.
Number | Date | Country | Kind |
---|---|---|---|
2020-193172 | Nov 2020 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2021/038544 | 10/19/2021 | WO |