This application is the U.S. National Phase under 35 U.S.C. § 371 of International Application No. PCT/JP2018/034519, filed on Sep. 18, 2018, the entire contents of which are hereby incorporated by reference.
The present invention relates to a distance measuring apparatus, an imaging device, a distance measuring system, a distance measuring method, and an imaging method.
In order to support work such as factory tasks and equipment maintenance work, a head mounted video display apparatus has been utilized. In many cases, a worker is holding an article necessary for the work at hand, and thus a simple input method to the video display apparatus is required. Input means based on a voice operation or a gesture operation has been devised.
In order to realize the gesture operation, it is necessary to recognize a target object (for example, the fingers of a wearer) that performs the gesture, and to further recognize a motion of the target object. A three-dimensional recognition technique using a range image is utilized for recognition of a target object and a motion thereof. However, a small and light distance measuring apparatus is required in order to reduce a load on the wearer. For example, in Patent document 1, a method of measuring a distance to a target object by attaching a special diffraction grating substrate to an image sensor and using a projection pattern generated on the image sensor by light transmitted through the diffraction grating substrate has been devised.
In Patent document 1 described above, distance information (information indicating a distance to a target object) is generated from a relationship between the contrast of an imaging result and a focus position. However, in a case where a background is included in the imaging result, there is a possibility that an accurate distance cannot be measured due to the influence of the background, and appropriate distance information cannot thus be generated.
It is an object of the present invention to provide a technique for generating distance information of a photographic subject more accurately in consideration of an influence of a background.
The problem is solved by the invention described in claims, for example.
According to the present invention, it becomes possible to generate distance information of a photographic subject more accurately in consideration of an influence of a background.
In embodiments described below, the invention will be described in a plurality of sections or embodiments when required as a matter of convenience. However, these sections or embodiments are not irrelevant to each other unless otherwise stated, and the one relates to the entire or a part of the other as a modification example, details, or a supplementary explanation thereof.
Further, in the embodiments described below, in a case of referring to the number of elements (including number of pieces, values, amount, range, and the like), the number of the elements is not limited to a specific number unless otherwise stated or except the case where the number is apparently limited to a specific number in principle, and the number larger or smaller than the specified number may also be applicable.
Moreover, in the embodiments described below, it goes without saying that the components (including element steps and the like) are not always indispensable unless otherwise stated or except the case where the components are apparently indispensable in principle.
Similarly, in the embodiments described below, when the shape of the components, positional relation thereof, and the like are mentioned, the substantially approximate and similar shapes and the like are included therein unless otherwise stated or except the case where it is conceivable that they are apparently excluded in principle. The same goes for the numerical value and the range described above.
Further, the same components are in principle denoted by the same reference numeral throughout the drawings for describing the embodiments, and the repetitive description thereof will be omitted.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
An embodiment of a distance measuring apparatus according to the present invention will be described. In this embodiment, when distance information of a target object is generated, it is possible to reduce an influence of a background and to generate the distance information with high accuracy.
In the present embodiment, the distance measuring apparatus according to the present invention is mounted on a head mounted video display apparatus to be used as a distance measuring sensor for gesture recognition.
First, a configuration of the head mounted video display apparatus according to the present embodiment will be described.
The video display apparatus 101 includes a distance measuring sensor unit 102, video display units 103, a light source unit 104, and an entire control unit (not illustrated in the drawings). The head mounted video display apparatus 101 may not have an eyeglass type so long as the head mounted video display apparatus 101 has a shape that can be worn on a head, and the video display unit may be one for one eye.
The distance measuring sensor unit 102 may not be installed at a central portion of the video display apparatus 101, but may be installed at an edge thereof, for example. In this case, by installing the distance measuring sensor unit 102 at the edge compared with a case where it is installed at the central portion as illustrated in
A GPU (Graphics Processing Unit) 202 is a calculating unit specialized in real-time image processing, and mainly bears processes regarding image processing. An entire control unit 203 is realized by a CPU or the like, and controls the entire processing in the video display apparatus 101. A light source control unit 204 controls the light source unit 104. The light source unit 104 irradiates near infrared light, for example. A distance measuring unit 205 controls the distance measuring sensor unit 102. A video display control unit 211 controls the video display unit 103. A camera control unit 212 controls a camera unit 213 to photograph a still image, a moving image and the like of the outside world, for example. The camera unit 213 is imaging means.
A gesture recognizing unit 209 detects and recognizes gesture on the basis of distance information generated by the distance measuring unit 205 (information indicating the distance to the photographic subject). A gesture operation input unit 210 inputs the gesture recognized by the gesture recognizing unit 209 as an input operation to the video display apparatus 101.
A processing flow of the video display apparatus 101 illustrated in
Before describing a configuration of the distance measuring unit 205, a basic principle of imaging and distance measuring using the distance measuring sensor unit 102 will be described.
<Principle of Photographing Infinity Object>
The pattern substrate 404 is made of material transparent to visible light, such as glass or plastic, for example. The photographing pattern 405 is formed by depositing metal such as aluminum or chromium by a sputtering method used for a semiconductor process, for example. Shading of the pattern is provided by the contrast between regions in which aluminum is deposited and regions in which it is not deposited.
Note that formation of the photographing pattern 405 is not limited to this. For example, the pattern may be formed by shading by means of printing with an ink jet printer. The pattern may be formed by any means so long as modulation of a transmission factor can be realized. Further, when photographing by far infrared rays is executed, material transparent to the far infrared rays, such as germanium, silicon, or chalcogenide, that is, material transparent to the wavelength that is the photographing target, may be used for the pattern substrate 404, and material that blocks the far infrared rays may be used for the photographing pattern 405.
Note that the method of forming the photographing pattern 405 on the pattern substrate 404 has been mentioned herein, but as illustrated in
As illustrated in
Note that the image signal (analog image data) is converted into a digital signal via an analog/digital converting circuit, for example, and the digital signal is outputted as digital image data. In the present specification, a case where the distance measuring sensor unit 102 outputs the image data will be described.
The fringe scanning processor 207 removes noise by fringe scanning on the image data (the sensor image) outputted from the image sensor 403, and outputs the image data to the image processor 208. For example, the fringe scanning processor 207 generates a complex sensor image having a complex number from the sensor image. The image processor 208 executes predetermined image processing for the image data outputted from the fringe scanning processor 207 to convert a data format thereof if necessary, store the image data in a storage device (not illustrated in the drawings) of the imaging device 402, and output them to an external host computer or an external recording medium.
Subsequently, a photographing principle in the imaging device 402 will be described. First, the photographing pattern 405 is a concentric circle-shaped pattern in which the pitch becomes finer in inverse proportion to the radius from the center, and is defined as Formula (1) as follows by using a radius r from a reference coordinate that is the center of the concentric circles and a coefficient β.
I(r) = 1 + cos(βr²)   Formula (1)
A transmission factor of the photographing pattern 405 is modulated so as to be proportional to this formula. A plate with such fringes is called a Gabor zone plate or a Fresnel zone plate.
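As an illustration, the transmittance of Formula (1) can be generated numerically. This is only a sketch: the grid size and the coefficient β below are assumed values, not ones specified in the present disclosure.

```python
import numpy as np

def zone_plate(n=256, beta=0.02):
    """Transmission factor of a Gabor zone plate following Formula (1):
    I(r) = 1 + cos(beta * r^2), rescaled to [0, 1] so that it can serve
    as a transmittance. The grid size n and coefficient beta are assumed."""
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2].astype(float)
    return (1.0 + np.cos(beta * (x ** 2 + y ** 2))) / 2.0

pattern = zone_plate()   # concentric rings whose pitch narrows with radius
```

The rescaling to [0, 1] reflects that a physical transmittance cannot be negative; the cosine term itself matches Formula (1).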
It is assumed that as illustrated in
I_F(x) = 1 + cos[β(x + k)² + Φ]   Formula (2)
Note that Φ indicates an initial phase of transmissivity distribution of Formula (1).
Next, development processing by a correlation developing method and a moire developing method will be described in relation to the processing in the image processor 208.
In the correlation developing method, by calculating a cross correlation function between the projection image of the photographing pattern 405 illustrated in
I_B(x) = cos(βx² + Φ)   Formula (3)
Since the developing pattern 1101 is used only in the image processing, it does not need to be offset by 1 as in Formula (1), and there is no problem even if it has negative values. The Fourier transforms of Formulas (1) and (3) become Formula (4) and Formula (5), respectively, as follows.
Here, F denotes the Fourier transform operation, u is a frequency coordinate in the x direction, and δ(·) is the delta function. What is important in these formulas is that the expression after the Fourier transform is again a Fresnel zone plate or a Gabor zone plate. Therefore, the developing pattern after the Fourier transform may be generated directly on the basis of this mathematical formula, which makes it possible to reduce the calculation amount.
Next, multiplying Formula (4) by Formula (5) yields Formula (6) as follows.
The term exp(−iku) expressed by this exponential function is the signal component. Applying an inverse Fourier transform to this term converts it as Formula (7) as follows, and a bright spot is obtained at the position of the shift k on the original x axis.
F⁻¹[exp(−iku)] = 2πδ(x + k)   Formula (7)
This bright spot indicates a light flux at infinity, and is nothing other than a photographed image by the imaging device 402 illustrated in
Note that the correlation developing method is not limited to the Fresnel zone plate or the Gabor zone plate, and may be realized by any pattern, for example a random pattern, so long as the autocorrelation function of the pattern has a single peak.
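A minimal one-dimensional sketch of the correlation developing method follows; the grid size, β, and the shift k are assumed values. The spectra of the projection (Formula (2), Φ = 0) and of the developing pattern are multiplied and transformed back, and the bright spot appears at an offset whose magnitude equals k (its sign depends on the correlation convention used).

```python
import numpy as np

N, beta, k = 1024, 0.002, 37             # assumed grid size, coefficient, and shift
x = np.arange(N) - N // 2

i_f = 1.0 + np.cos(beta * (x + k) ** 2)  # projection of the photographing pattern, Formula (2)
i_b = np.cos(beta * x ** 2)              # developing pattern, Formula (3) with phase 0

# Cross-correlation evaluated in the frequency domain (Formulas (4)-(6)):
corr = np.fft.ifft(np.fft.fft(i_f) * np.conj(np.fft.fft(i_b)))
peak = int(np.argmax(np.abs(corr)))
offset = peak - N if peak > N // 2 else peak   # map circular index to signed offset
```

β is chosen small enough that the finest fringes stay below the sampling limit; the correlation peak then dominates the chirp sidelobes by a wide margin.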
Next, in the moire developing method, the projection image of the photographing pattern 405 illustrated in
It can be seen that the third term of this expansion is the signal component: straight, equally spaced fringes appear over the area in which the two patterns overlap, aligned in the direction of the shift between them. A fringe generated at a relatively low spatial frequency by such an overlap is called a moire fringe. The two-dimensional Fourier transform of this third term becomes Formula (9) as follows.
Here, F denotes the Fourier transform operation, u is a frequency coordinate in the x direction, and δ(·) is the delta function. It can be seen from this result that a peak of spatial frequency occurs at the position u = ±kβ/π in the spatial frequency spectrum of the moire fringe. This bright spot indicates a light flux at infinity, and is nothing other than a photographed image by the imaging device 402 illustrated in
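The moire developing method can be sketched the same way (parameter values are assumptions): multiplying the two patterns in the spatial domain produces a low-frequency moire fringe whose FFT peak sits near the bin corresponding to u = kβ/π predicted by Formula (9).

```python
import numpy as np

N, beta, k = 1024, 0.002, 37             # assumed grid size, coefficient, and shift
x = np.arange(N) - N // 2

# Product of the shifted projection and the developing pattern (Formula (8)).
moire = (1.0 + np.cos(beta * (x + k) ** 2)) * (1.0 + np.cos(beta * x ** 2))

spec = np.abs(np.fft.fft(moire))
peak_bin = int(np.argmax(spec[4:N // 2])) + 4   # skip the large DC neighbourhood
expected_bin = k * beta / np.pi * N             # u = k*beta/pi, Formula (9)
```

The first few bins are skipped because the constant and single-pattern terms of the expansion concentrate there, while the moire signal component stands well above the spread-out chirp spectra.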
<Noise Cancellation>
Although the signal component was focused on in the conversion from Formula (6) to Formula (7) and in the conversion from Formula (8) to Formula (9), the terms other than the signal component actually impede the development. Therefore, noise cancellation based on fringe scanning is executed.
For the fringe scanning, it is necessary to use a plurality of patterns each having a different initial phase Φ as the photographing pattern 405.
Here, the developing pattern 1101 of the complex number can be expressed by Formula (11) as follows.
I_CB(x) = exp(−iβx²)   Formula (11)
Since the developing pattern 1101 is used in calculation processing, there is no problem even if it is a complex number. In the case of the moire developing method, multiplying Formula (10) by Formula (11) yields Formula (12) as follows.
I_CF(x)·I_CB(x) = exp[iβ(x + k)²]·exp(−iβx²) = exp[2iβkx + iβk²]   Formula (12)
It can be seen that the term exp(2iβkx) expressed by this exponential function is the signal component, that the unnecessary terms as in Formula (8) do not occur, and that the noise is thus cancelled.
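Formula (12) can be checked numerically (assumed parameters): the product of the complex images is a single complex exponential, so its spectrum has one dominant peak and no conjugate twin, confirming that the unnecessary terms are cancelled.

```python
import numpy as np

N, beta, k = 1024, 0.002, 37                 # assumed values
x = np.arange(N) - N // 2

icf = np.exp(1j * beta * (x + k) ** 2)       # complex sensor image after fringe scanning
icb = np.exp(-1j * beta * x ** 2)            # complex developing pattern, Formula (11)
prod = icf * icb                             # = exp(i(2*beta*k*x + beta*k^2)), Formula (12)

spec = np.abs(np.fft.fft(prod))
pos = int(np.argmax(spec))                   # single signal peak near bin k*beta/pi * N
```

Contrast this with the real-valued product of Formula (8), whose spectrum contains a mirrored peak at −u and additional low-frequency terms.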
Similarly, when the correlation developing method is also confirmed, Fourier transforms of Formula (10) and Formula (11) respectively become Formula (13) and Formula (14) as follows.
Next, by multiplying Formula (13) by Formula (14), it becomes Formula (15) as follows.
It can be seen that the term exp(−iku) expressed by this exponential function is the signal component, that the unnecessary terms as in Formula (8) do not occur, and that the noise is thus cancelled.
Note that the example described above uses a plurality of patterns of four phases. However, Φ is not limited to these four phases, and may be set so as to equally divide the angle between 0 and 2π.
In order to realize photographing by the plurality of patterns described above, there are a method of switching patterns by time division (time division fringe scanning) and a method of switching patterns by space division (space division fringe scanning).
In order to realize the time division fringe scanning, for example, the photographing pattern 405 is configured by a liquid crystal display element that can electrically switch between and display (that is, change) the plurality of initial phase patterns illustrated in
On the other hand, in order to realize the space division fringe scanning, for example, as illustrated in
Subsequently, the fringe scanning calculation by the fringe scanning processor 207 will be described.
First, the fringe scanning processor 207 obtains sensor images of a plurality of phase patterns outputted from the image sensor 403 (one image in the case of the space division fringe scanning, or a plurality of images in the case of the time division fringe scanning). In a case where the space division fringe scanning is used, the fringe scanning processor 207 divides the obtained sensor image for each phase (S1701). In a case where the time division fringe scanning is used, the process at S1701 is not executed. Next, the fringe scanning processor 207 initializes a complex sensor image for output (S1702).
Subsequently, the fringe scanning processor 207 repeats processes at S1703 to S1705 for each initial phase. For example, in the fringe scanning using the four phases as illustrated in
Finally, the fringe scanning processor 207 outputs the complex sensor image (S1707). The processes described above by the fringe scanning processor 207 correspond to Formula (10).
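The accumulation at S1703 to S1705 can be sketched as follows for the four-phase case. The weighting by exp(−iΦ), the normalization, and the parameter values are assumptions, chosen so that the constant offsets and the conjugate terms cancel and the result is a complex image of the form exp(iβ(x + k)²).

```python
import numpy as np

N, beta, k = 1024, 0.002, 25                 # assumed values
x = np.arange(N) - N // 2
phases = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]

# One real sensor image per initial phase (Formula (2)).
frames = [1.0 + np.cos(beta * (x + k) ** 2 + phi) for phi in phases]

# S1703-S1705: multiply each frame by its complex phase factor and accumulate.
# The offsets sum to zero and the conjugate chirp terms cancel over the four phases.
complex_img = sum(np.exp(-1j * phi) * f for phi, f in zip(phases, frames)) / 2.0
```

The cancellation works for any set of phases that equally divides the angle between 0 and 2π, as noted above.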
Subsequently, image processing by the image processor 208 will be described.
First, the image processor 208 obtains a complex sensor image outputted from the fringe scanning processor 207, and executes two-dimensional fast Fourier transform (FFT) calculation for the complex sensor image (S1801). Next, the image processor 208 generates the developing pattern 1101 (the second grid pattern) used for the development processing, multiplies it by the complex sensor image subjected to the two-dimensional FFT calculation (S1802), and executes inverse two-dimensional FFT calculation (S1803). Since this calculation result is a complex number, the image processor 208 converts it into a real number by taking its absolute value or extracting its real part, thereby developing (restoring) the image to be photographed (S1804). Then, the image processor 208 executes a contrast emphasizing process (S1805) and color balance adjustment (S1806) for the obtained development image, and outputs it as a photographed image. With this, the image processing by the image processor 208 based on the correlation developing method is terminated.
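The development steps S1801 to S1804 can be condensed into a short one-dimensional sketch (parameters assumed): FFT the complex sensor image, multiply by the FFT of the complex developing pattern, inverse-FFT, and take the absolute value. A single point source then develops into a single bright spot at the offset −k.

```python
import numpy as np

N, beta, k = 1024, 0.002, 25                     # assumed values
x = np.arange(N) - N // 2

complex_img = np.exp(1j * beta * (x + k) ** 2)   # complex sensor image of one point source

# S1801-S1803: FFT, multiply by the developing pattern's spectrum, inverse FFT.
icb = np.exp(-1j * beta * x ** 2)                # complex developing pattern, Formula (11)
dev = np.fft.ifft(np.fft.fft(complex_img) * np.fft.fft(icb))

developed = np.abs(dev)                          # S1804: convert to a real image
spot = int(np.argmax(developed))
offset = spot - N if spot > N // 2 else spot     # circular index -> signed position
```

The contrast emphasis and color balance steps (S1805, S1806) are omitted as they do not affect the spot location.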
On the other hand,
First, the image processor 208 obtains a complex sensor image outputted from the fringe scanning processor 207. The image processor 208 generates the developing pattern 1101 to be used for development processing; multiplies it by the complex sensor image (S1901); obtains a frequency spectrum by two-dimensional FFT calculation (S1902); and cuts out data on a necessary frequency domain from this frequency spectrum (S1903). Subsequent processes in
<Photographing Principle of Finite Distance Object>
Next,
On the other hand, imaging of an object with a finite distance will be described.
For that reason, if a developing pattern designed for parallel light is used as it is to execute the development processing, a single bright spot cannot be obtained. However, if the developing pattern 1101 is enlarged in accordance with the uniformly enlarged projection image of the photographing pattern 405, a single bright spot can be obtained again for the enlarged projection image 2102. For this purpose, the coefficient β of the developing pattern 1101 can be corrected to β/α². This makes it possible to selectively reproduce light from the point 2101 positioned at a distance that is not necessarily infinite. Therefore, it is possible to focus photographing on an arbitrary position; in other words, it is possible to calculate the distance to an arbitrary position. This principle allows distance measurement as a distance measuring sensor.
First, an initial value (infinity or distance zero, or the like) of the focus position is set (S2201); a magnification ratio α is calculated from the focus position; a coefficient β of the developing pattern 1101 is calculated (S2202); and the development processing is executed (S2203). This development processing is a process equivalent to the development processing described with reference to
Then, as illustrated in
Then, the focus position is set by shifting it by Δf (S2207), and the processes at S2202 to S2205 are repeated until scanning over a variable focus range set in advance is completed (S2206). After the scanning is completed, the focus position at which the contrast becomes maximum is searched for in each region (S2208), and distance information is outputted (S2209).
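The focus scan S2201 to S2209 may be sketched as below for a single point source. The true enlargement α, the scan list, and the contrast measure (peak over mean of the developed image) are illustrative assumptions; the enlargement that maximizes contrast recovers the one used to synthesize the image, which is what ties contrast to distance.

```python
import numpy as np

N, beta, k = 1024, 0.002, 25
x = np.arange(N) - N // 2
alpha_true = 1.2                                 # assumed enlargement of the projection

# Complex sensor image of a finite-distance point: the projected pattern is
# enlarged by alpha_true, so its effective coefficient is beta / alpha_true**2.
img = np.exp(1j * (beta / alpha_true ** 2) * (x + k) ** 2)

def contrast(dev):
    a = np.abs(dev)
    return a.max() / a.mean()                    # simple stand-in for S2205

best_alpha, best_c = None, -1.0
for alpha in [1.0, 1.1, 1.2, 1.3, 1.4]:          # focus scan, S2202-S2207
    icb = np.exp(-1j * (beta / alpha ** 2) * x ** 2)   # corrected developing pattern
    dev = np.fft.ifft(np.fft.fft(img) * np.fft.fft(icb))
    c = contrast(dev)
    if c > best_c:                               # S2208: keep maximum-contrast focus
        best_alpha, best_c = alpha, c
```

When the scanned α matches α_true the development collapses to a sharp spot; any mismatch leaves a residual chirp and a visibly lower peak-to-mean contrast.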
<Influence on Distance Measurement Accuracy by Interference of Background>
In the distance measurement that has been described, for example, in a case where a point 2302 forming a person who is a distance measuring target in
First,
I_t(x_0) = a_t0·exp(iθ_t0)   Formula (19)
|I_t(x_0)|² = a_t0²   Formula (20)
On the other hand,
I_b(x_0) = a_b0·exp(iθ_b0)   Formula (21)
I(x_0) = I_t(x_0) + I_b(x_0) = a_t0·exp(iθ_t0) + a_b0·exp(iθ_b0)   Formula (22)
|I(x_0)|² = a_t0² + a_b0² + 2a_t0·a_b0·cos[θ_t0 − θ_b0]   Formula (23)
When Formula (20) and Formula (23) are compared with each other, Formula (24) as follows, which is an interference component between the point 2302 and the point 2303, exists. This is a feature of this lensless camera system that is not found in a conventional camera, and this interference component affects the distance measurement performance.
2a_t0·a_b0·cos[θ_t0 − θ_b0]   Formula (24)
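The interference term can be verified numerically with assumed amplitudes and phases; note that the squared magnitude of the sum in Formula (22) produces a term depending on the phase difference of the target and background components.

```python
import numpy as np

a_t0, a_b0 = 1.5, 0.8          # assumed amplitudes of target and background components
th_t0, th_b0 = 0.9, 2.3        # assumed phases

total = a_t0 * np.exp(1j * th_t0) + a_b0 * np.exp(1j * th_b0)   # Formula (22)
interference = 2 * a_t0 * a_b0 * np.cos(th_t0 - th_b0)          # Formula (24)
power = a_t0 ** 2 + a_b0 ** 2 + interference                    # Formula (23)
```

Because the interference term depends on both amplitudes, a strong background component distorts the intensity measured for the target point, which is the degradation addressed below.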
In view of the principle described above, configurations of the distance measuring sensor unit 102 and the distance measuring unit 205 according to the present embodiment will be described.
The distance measuring unit 205 includes an interframe difference processor 206, the fringe scanning processor 207, and the image processor 208.
The interframe difference processor 206 obtains the sensor image outputted from the distance measuring sensor unit 102, and generates a difference image between the obtained sensor image and the sensor image obtained one frame before. In other words, the interframe difference processor 206 generates a difference image between frames of sensor images (or, for example, between frames of complex sensor images).
The fringe scanning processor 207 obtains the difference image outputted from the interframe difference processor 206, and executes fringe scanning calculation.
The image processor 208 obtains the complex sensor image generated by the fringe scanning processor 207, and generates distance information.
First, the interframe difference processor 206 obtains a sensor image Si(n) from the distance measuring sensor unit 102 (S2701). The "n" is a variable stored in a memory, and represents the number of times the sensor image has been obtained from the distance measuring sensor unit 102. The "n" is initialized to zero when the video display apparatus 101 is activated. The sensor image Si(n) is the sensor image obtained at the n-th time.
The interframe difference processor 206 stores the obtained sensor image Si(n) in the memory (S2702).
Next, the interframe difference processor 206 initializes an output image So(n) (S2703). In a case where a sensor image Si(n−1) is stored (S2704: Yes), the interframe difference processor 206 generates a difference image between the sensor image Si(n) and the sensor image Si(n−1), and sets it as the output image So(n) (S2705).
In a case where the sensor image Si(n−1) is not stored (S2704: No), the interframe difference processor 206 sets the sensor image Si(n) as the output image So(n) (S2706). Thus, the interframe difference processor 206 determines whether or not to generate the difference image on the basis of whether a sensor image from a previous frame is stored. In a case where it is determined that the difference image is not to be generated, the sensor image itself is outputted as the output image.
Finally, the interframe difference processor 206 outputs the output image So(n) (S2707) and increments the “n” (S2708), and the processing flow is terminated.
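The flow S2701 to S2708 can be sketched as a small stateful class; the class and method names below are illustrative, not taken from the present disclosure.

```python
import numpy as np

class InterframeDifference:
    """Sketch of the interframe difference processor (S2701-S2708): keeps the
    previous sensor image and outputs the difference; the first frame is passed
    through unchanged because no previous frame is stored yet (S2704: No)."""

    def __init__(self):
        self.prev = None   # stored Si(n-1)
        self.n = 0         # obtain counter, initialized at activation

    def process(self, si):
        if self.prev is None:
            so = si.copy()             # S2706: output the sensor image as-is
        else:
            so = si - self.prev        # S2705: difference image So(n)
        self.prev = si.copy()          # S2702: store Si(n) for the next frame
        self.n += 1                    # S2708: increment n
        return so                      # S2707: output So(n)
```

A stationary background contributes identical values to consecutive frames and therefore vanishes from the difference, which is the mechanism exploited for background suppression.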
Note that although it is not described in
First, the interframe difference processor 206 obtains the sensor image outputted from the distance measuring sensor unit 102 (S2801); executes processes equivalent to those in
The fringe scanning processor 207 obtains the difference image generated by the interframe difference processor 206, and executes fringe scanning calculation (S2803). The fringe scanning calculation is a process equivalent to the processes at S1701 to S1707 illustrated in
The image processor 208 executes distance measuring process by using the complex sensor image generated by the fringe scanning processor 207, and outputs distance information (S2804). The distance measuring process is a process equivalent to the processes at S2201 to S2209 illustrated in
Note that the processing order of the interframe difference processor 206 and the fringe scanning processor 207 may be exchanged.
The interframe difference processor 206 generates a difference image from a complex sensor image outputted from the fringe scanning processor 207 (S3003). In the interframe difference processor 206, the images for generating the difference image are changed from the sensor images to the complex sensor images. However, it is possible to realize processes equivalent to those of
The image processor 208 obtains the difference image outputted from the interframe difference processor 206, and executes a distance measuring process to output distance information (S3004).
As shown by Formula (10), in the fringe scanning calculation, a plurality of sensor images obtained with the respective initial phase patterns is combined into one complex sensor image. For this reason, by executing the fringe scanning processor 207 first, it is possible to reduce the calculation amount in the interframe difference processor 206.
Note that in the example described above, the difference image from the image one frame before has been used for the distance measuring process. However, the number of frames between the two images used for the difference may be set arbitrarily. For example, in a case where the moving distance of a photographic subject is small with respect to the frame rate of the distance measuring sensor, the signal component of the difference image becomes small, and the distance measurement may not be possible. In such a case, by using a difference image from an image several frames before, it becomes possible to execute accurate distance measurement.
Further, in a case where the moving distance of the photographic subject is large with respect to the frame rate of the distance measuring sensor, accurate distance measurement becomes possible by increasing the frame rate of the distance measuring sensor.
Further, in a case where the sensor image Si(n−1) is not stored, the interframe difference processor 206 according to the present embodiment outputs the obtained sensor image Si(n) to use it for the distance measuring process. For example, as illustrated in
Note that as illustrated in
Further, as illustrated in
In a case where all distance measuring processes are executed by a difference image as illustrated in
On the other hand, in a case where the distance measuring process is executed using the normal sensor image as illustrated in
Further, in a case where Si(n−1) is older than a set threshold value, or in a case where a signal component of the difference image is equal to or larger than the set threshold value, the distance measurement using the normal sensor image Si(n) may be executed.
On the contrary, in a case where the signal component of the difference image is equal to or less than the threshold value, the distance measurement using the normal sensor image Si(n) may be executed. Alternatively, previous distance information may be outputted as a result of the distance measurement because there is no motion of the target object.
As described above, the interframe difference processor 206 generates the difference image between the frames of the sensor images, and the image processor 208 generates the distance information indicating the distance to the photographic subject on the basis of the calculation of the difference image and the developing pattern 1101. Thus, the video display apparatus 101 generates the difference image between the frames of the sensor images. Therefore, it becomes possible to realize the distance measuring apparatus capable of reducing an influence of a background and generating highly accurate distance information.
A different point between the present embodiment and the first embodiment is that the distance measuring sensor unit and the distance measuring unit are mounted on separate apparatuses.
A configuration illustrated in
A configuration illustrated in
In the present embodiment, by mounting the processes executed by the distance measuring unit 205 illustrated in
On a side of the video display apparatus 3405 illustrated in
On a side of the calculating unit 3407 illustrated in
As shown by Formula (10), in the fringe scanning calculation, a plurality of sensor images obtained with the respective initial phase patterns is combined into one complex sensor image. For this reason, by mounting the fringe scanning processor 207 on the video display apparatus 3405 side, it is possible to reduce the amount of data to be transmitted compared with a case where normal sensor images are transmitted.
Processing flows of the video display apparatus 3405 and the calculating unit 3407 respectively illustrated in
The processing flow of the video display apparatus illustrated in
First, the light source control unit 204 controls the light source unit 104, thereby adjusting an amount of light to be irradiated (S3701).
Next, the fringe scanning processor 207 obtains a sensor image from the distance measuring sensor unit 102 (S3702), and executes fringe scanning calculation to generate a complex sensor image (S3703). Then, the interframe difference processor 206 generates a difference image of complex sensor images, and outputs it as a distance measuring image (S3704).
Subsequently, the compressing unit 3501 obtains the distance measuring image outputted from the interframe difference processor 206 and compresses it (S3705). The transmitting/receiving unit 3502 transmits the compressed data to the calculating unit 3407 (S3706).
Finally, the transmitting/receiving unit 3502 receives the data transmitted from the calculating unit 3407 (S3707), and the video display control unit 211 causes the video display unit 103 to display video on the basis of the received information (S3708).
The processing flow of the calculating unit illustrated in
First, the transmitting/receiving unit 3602 receives the compressed data transmitted from the video display apparatus 3405 (S3801), and the extending unit 3601 decompresses (extends) the compressed data (S3802).
Subsequently, the distance measuring unit 3605 executes a distance measuring process on the decompressed data, that is, on the image obtained by the distance measuring sensor unit 102 and subjected to the fringe scanning process and the interframe difference processing, and generates distance information (S3803). The gesture recognizing unit 209 executes gesture recognition by using the distance information obtained from the distance measuring unit 3605 (S3804), and the gesture recognition result is received as an input operation by inputting it to the gesture operation input unit 210 (S3805).
Finally, the transmitting/receiving unit 3602 transmits a response according to the input operation (S3806).
By dividing the functions and mounting them on separate apparatuses, processes to transmit and receive data are required, and as a result, processing delay may occur. However, by compressing the data to be transmitted and received, it becomes possible to minimize the processing delay.
Note that the interframe difference processor 206 or the fringe scanning processor 207 may be mounted on the calculating unit 3407 side. In this case, since a calculation amount in the video display apparatus 3405 can be reduced, it is possible to realize miniaturization and low cost of the video display apparatus 3405.
A problem when a distance measuring apparatus according to the present invention is mounted on a head mounted video display apparatus and used will be described with reference to
A photographic subject 4001 is a photographic subject at n=k−1, and a sensor image 4002 is a sensor image Si(k−1) that is outputted when an image of the photographic subject 4001 is taken by the distance measuring sensor unit 102. Further, a photographic subject 4003 is a photographic subject at n=k, and a sensor image 4004 is a sensor image Si(k) that is outputted when an image of the photographic subject 4003 is taken by the distance measuring sensor unit 102.
A difference image 4005 is a difference image So(k) between the Si(k) and the Si(k−1).
In this case, since information other than the hand portion moved by the photographic subject is excluded from So(k), the video display apparatus 101 can generate accurate distance information on the hand portion.
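The interframe difference described above can be pictured with the following minimal sketch (our own illustration with numpy arrays standing in for sensor images; the function name and toy values are assumptions, not from the embodiment):

```python
import numpy as np

def interframe_difference(si_prev, si_curr):
    """Compute So(k) = Si(k) - Si(k-1). Pixels belonging to the
    stationary background cancel to zero, and only the portion that
    moved between frames (e.g. the wearer's hand) remains."""
    return si_curr.astype(np.int32) - si_prev.astype(np.int32)

# Toy example: a static background plus one bright spot that moves.
bg = np.full((4, 4), 10, dtype=np.uint8)
si_prev = bg.copy(); si_prev[1, 1] += 5   # "hand" at (1, 1) in frame k-1
si_curr = bg.copy(); si_curr[2, 2] += 5   # "hand" at (2, 2) in frame k
so = interframe_difference(si_prev, si_curr)
# Background pixels cancel to 0; only the moved portion survives.
```

As long as the sensor itself does not move between frames, the background contributes nothing to So(k), which is why the hand portion can be measured accurately.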
In this case, the overall positional relationship between the sensor image (4104), which is outputted when an image of the photographic subject 4103 is taken by the distance measuring sensor unit 102, and the sensor image 4002 is out of alignment. For this reason, the difference image (4105) between them cannot exclude the information on the background, and accurate distance measurement cannot be executed.
As described above, in a case where the positional relationship between the distance measuring sensor and the photographic subject changes, such as when the wearer moves his or her head, the problem is that the information on the background cannot be excluded even if the difference image is used, and accurate distance measurement cannot be executed.
In the present embodiment, a method of solving this problem by obtaining movement information of the distance measuring sensor from the cross correlation of the sensor images obtained by the distance measuring sensor unit 102 will be described.
First, an outline of the method of solving the problem according to the present embodiment will be described with reference to
A positional relationship between a photographic subject and a distance measuring sensor in
If the positional relationship between the background and the photographic subject is the same in the region used for the difference image, a difference image 4205 in the region excludes information other than the hand portion moved by the photographic subject, and this makes it possible to generate accurate distance information on the hand portion of the photographic subject.
At this time, the difference region determining unit 3901 determines the region in which the positional relationship between the background and the photographic subject is the same.
Next, movement information of the distance measuring sensor unit 102 is obtained using the calculation result of the cross correlation function (S4303). The sensor image outputted from the distance measuring sensor unit 102 is a projection image of the photographing pattern 405. The projected position of the photographing pattern 405 shifts depending upon the angle at which light from the light source enters. For that reason, if the angle of a target object with respect to the distance measuring sensor unit 102 changes because the wearer moves his or her head, the pattern projected onto the sensor also moves. The amount of this movement is obtained from the cross correlation.
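One common way to realize this step is to locate the peak of the circular cross correlation of the two sensor images, computed via the FFT. The following numpy sketch illustrates the idea under that assumption; it is not the apparatus's actual implementation, and the function name is ours:

```python
import numpy as np

def estimate_shift(si_prev, si_curr):
    """Estimate the (dy, dx) translation of si_curr relative to si_prev
    from the peak location of their circular cross correlation."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(si_prev)) * np.fft.fft2(si_curr)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap peak coordinates so shifts past the midpoint map to
    # negative displacements.
    dy, dx = (p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))
    return (dy, dx), corr.max()

# Simulate the projected pattern moving by (1, 2) between frames n-1 and n.
rng = np.random.default_rng(0)
si_prev = rng.random((16, 16))
si_curr = np.roll(si_prev, shift=(1, 2), axis=(0, 1))
shift, peak_value = estimate_shift(si_prev, si_curr)
# shift recovers the (dy, dx) movement of the projected pattern.
```

The peak height (`peak_value`) also indicates how reliable the estimate is, which is used later when deciding whether the difference image can be trusted.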
Finally, a region where the positional relationship between the background and the photographic subject is the same is determined from the movement information of the distance measuring sensor (S4304).
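Given the estimated movement (dy, dx), the region in which the background keeps the same positional relationship in both frames is simply the rectangle the two (shifted) frames have in common. A hypothetical illustration of that determination:

```python
def common_region(shape, dy, dx):
    """Return the rectangle (y0, y1, x0, x1) of the sensor where both
    frames observe the same scene, assuming frame n is frame n-1
    shifted by (dy, dx) pixels. Only this region is usable for the
    interframe difference."""
    h, w = shape
    y0, y1 = max(0, dy), min(h, h + dy)
    x0, x1 = max(0, dx), min(w, w + dx)
    return y0, y1, x0, x1
```

For example, a downward-rightward movement of (1, 2) on a 16×16 sensor leaves the region rows 1–15, columns 2–15 usable for the difference image.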
Note that since the photographing pattern 405 is projected onto the entire sensor, it is possible to obtain the moving amount even by cutting out a partial region of the sensor images Si(n−1) and Si(n) and calculating the cross correlation function.
Thus, by cutting out the partial region and calculating the cross correlation function, it is possible to reduce a calculation amount.
The fringe scanning processor 207 obtains the difference image, and executes fringe scanning calculation to generate a complex sensor image (S4604). Finally, the image processor 208 executes distance measurement using the complex sensor image, and generates distance information (S4605).
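To illustrate the idea behind the fringe scanning calculation, a generic four-step phase-shift sketch is shown below: sensor images captured with pattern phases 0, π/2, π, and 3π/2 are combined as Σ I_φ·e^{iφ}, which cancels the phase-independent (DC) term and leaves a single complex fringe term. This is a textbook phase-shift formulation, not necessarily the exact formula used by the fringe scanning processor 207:

```python
import numpy as np

def fringe_scan(images, phases):
    """Combine phase-shifted sensor images into one complex sensor image
    as sum(I_phi * exp(1j * phi)); terms that do not rotate with the
    pattern phase cancel out."""
    out = np.zeros_like(images[0], dtype=complex)
    for img, phi in zip(images, phases):
        out = out + img * np.exp(1j * phi)
    return out

# Toy per-pixel signal model: I_phi = B + cos(theta + phi), with B = 3.
theta = np.linspace(0.0, 1.0, 8)
phases = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
images = [3.0 + np.cos(theta + p) for p in phases]
complex_image = fringe_scan(images, phases)
# The DC level B cancels; the result equals 2 * exp(-1j * theta).
```

The resulting complex sensor image retains the fringe phase at each pixel, which is what the subsequent distance-measurement step operates on.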
Note that in a case where the peak of the cross correlation function between the sensor images Si(n−1) and Si(n) is smaller than a set threshold value in the difference region determining unit 3901, there is a possibility that the wearer has moved his or her head so much that the photographic subject is out of the imaging range of the distance measuring sensor. Therefore, the video display apparatus 101 may execute the distance measuring process using Si(n) alone, without the difference image, or may stop the distance measuring process.
Further, the same also applies to a case where the difference image region determined by the difference region determining unit 3901 is narrower than a set threshold value.
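The two fallback conditions above can be summarized in a small decision routine (the function name, return values, and thresholds are illustrative assumptions, not taken from the embodiment):

```python
def select_measurement_mode(corr_peak, region_area, peak_threshold, area_threshold):
    """Choose how to run the distance measuring process:
    - if the cross-correlation peak is too small, or the usable
      difference-image region is too narrow, fall back to measuring
      from the single sensor image Si(n) (or stop measuring);
    - otherwise, measure from the difference image as usual."""
    if corr_peak < peak_threshold or region_area < area_threshold:
        return "single_image_or_stop"
    return "difference_image"
```

Either condition alone is enough to trigger the fallback, since both indicate that the difference image would not reliably cancel the background.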
The configuration and method described above allow distance measurement that follows the motion of the wearer even in a case where the distance measuring sensor is mounted on the head mounted video display apparatus.
In the third embodiment, the difference image region has been determined from the cross correlation of the sensor images outputted from the distance measuring sensor unit 102. However, in the present embodiment, the posture detecting unit 4702 detects posture of a wearer using information obtained by the sensor unit 4701, and determines a region of a difference image.
The sensor unit 4701 is an acceleration sensor, a direction sensor, or the like, for example, and is capable of sensing a motion of the video display apparatus 101. By using a dedicated sensor for detecting the posture of the wearer, it is possible to speed up processes and reduce a calculation amount.
Specifically, the sensor unit 4701 senses motion information of the video display apparatus 101, and sends out the motion information to the posture detecting unit 4702. The posture detecting unit 4702 generates movement information (information indicating a moving direction and a moving speed) of the video display apparatus 101 from the motion information. The posture detecting unit 4702 sends out the movement information to the distance measuring unit 205. The difference region determining unit 3901 of the distance measuring unit 205 determines a region of a difference image on the basis of the movement information.
Further, the interframe difference processor 206 may be configured to determine, on the basis of the movement information, whether or not to generate a difference image between frames. For example, in a case where the movement information indicates that the moving amount is large, the interframe difference processor 206 determines that it is not meaningful to take a difference from the immediately preceding image, and does not generate the difference image.
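Such gating could look like the following sketch (the movement representation and threshold are assumptions for illustration only):

```python
def should_take_difference(moving_amount, threshold):
    """Decide, from the movement information supplied by the posture
    detecting unit, whether a difference from the immediately preceding
    frame is meaningful: a large moving amount means the background has
    shifted too far between frames to cancel in the difference image."""
    return moving_amount < threshold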
Note that the present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments described above have been explained in detail in order to explain the present invention clearly; the present invention is not necessarily limited to one that includes all of the configurations explained.
Further, a part of the configuration of one embodiment can be replaced by a configuration of another embodiment, and a configuration of another embodiment can be added to the configuration of one embodiment.
Further, for a part of the configuration of each of the embodiments, another configuration can be added, or that part can be deleted or replaced.
Further, a part or all of the respective configurations, functions, processing units, and processing means described above may be realized by hardware, for example, by designing them as an integrated circuit. Further, the respective configurations and functions described above may be realized by software, in which a processor interprets and executes programs that realize the respective functions. Information on the programs, tables, and files that realize the respective functions can be placed in a recording device such as a memory, a hard disk, or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD.
Further, control lines and information lines are illustrated only so far as they are thought to be necessary for the explanation; not all of the control lines and information lines on a product are necessarily illustrated. In practice, it may be considered that almost all of the components are connected to each other.
101 . . . video display apparatus, 102 . . . distance measuring sensor unit, 103 . . . video display unit, 104 . . . light source unit, 201 . . . CPU, 202 . . . GPU, 203 . . . entire control unit, 204 . . . light source control unit, 205 . . . distance measuring unit, 206 . . . interframe difference processor, 207 . . . fringe scanning processor, 208 . . . image processor, 209 . . . gesture recognizing unit, 210 . . . gesture operation input unit, 212 . . . camera control unit, 213 . . . camera unit, and 214 . . . video display control unit.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2018/034519 | 9/18/2018 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2020/059029 | 3/26/2020 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
10551532 | Nakamura | Feb 2020 | B2 |
10887504 | Yamaguchi | Jan 2021 | B2 |
10951793 | Nakamura | Mar 2021 | B2 |
20180095200 | Nakamura et al. | Apr 2018 | A1 |
20190339485 | Nakamura et al. | Nov 2019 | A1 |
20210112266 | Nakade | Apr 2021 | A1 |
Number | Date | Country |
---|---|---|
04-301507 | Oct 1992 | JP |
2014-187484 | Oct 2014 | JP |
2018-060549 | Apr 2018 | JP |
2018-061109 | Apr 2018 | JP |
2017149687 | Sep 2017 | WO |
Entry |
---|
International Search Report issued in corresponding International Patent Application No. PCT/JP2018/034519, dated Dec. 11, 2018, with English translation. |
Number | Date | Country | |
---|---|---|---|
20220043265 A1 | Feb 2022 | US |