The present invention relates to an image display apparatus and an image display method.
In order to support factory work and equipment maintenance work, head mounted image display apparatuses have been utilized. In many cases, a worker holds an article necessary for the objective work, and a simple input method to the image display apparatus is therefore required. As simple input methods, input means by a voice operation or a gesture operation have been devised.
In order to realize the gesture operation, it is necessary to recognize the target object performing the gesture, and further to recognize the motion of the target object. In order to recognize a target object and its motion, a three-dimensional recognition technique using a range image is utilized. For example, as in Patent Document 1, a method of measuring the distance between an image display unit and a target object by using a TOF (Time Of Flight) sensor to realize gesture recognition has been devised.
As methods of obtaining a range image for recognizing gesture, there are cited the method of using a TOF sensor as described in Patent Document 1 above, a method of using an active TOF type or Structured Light type three-dimensional distance measuring sensor, and a passive method using a stereo camera. However, each of these methods has the problem that, when installed in a head mounted image display apparatus, the apparatus becomes large and expensive. As a result, the size of the image display apparatus is increased, and this becomes a load on its user.
Further, as described above, in order to optimize a visual field area, usage of a wide angle lens or a fisheye lens, or optimization of the method of installing the distance measuring sensor may be cited. However, these also become factors that increase the size of the apparatus, and as a result, a burden is imposed on the user who wears it.
It is an object of the present invention to provide an image display apparatus and a display method capable of reducing a load on a user when the user wears the apparatus.
The foregoing and other objects, and novel features of the present invention will become more apparent from the detailed description of the present specification and the appended drawings.
An outline of a representative aspect of the invention disclosed in the present application will be briefly explained as follows.
According to a representative embodiment of the present invention, there is provided an image display apparatus capable of being mounted on a head of a user. The image display apparatus includes: an image sensor configured to convert an optical image into an image signal and output the converted image signal, the optical image being captured by a plurality of light receiving elements arranged in an array on an imaging surface; a modulator provided on a light receiving surface of the image sensor, the modulator having a first pattern that includes a plurality of patterns different from each other, the modulator being configured to modulate intensity of light; a gesture detecting unit configured to divide image data obtained by receiving light transmitted through the modulator by the image sensor in accordance with the plurality of patterns included in the first pattern, the gesture detecting unit being configured to restore an image by calculation based on a second pattern corresponding to the first pattern, the gesture detecting unit being configured to obtain an image for detecting gesture of the user; a gesture recognizing unit configured to specify the gesture of the user by using the image obtained by the gesture detecting unit, the gesture recognizing unit being configured to recognize an instruction corresponding to the specified gesture; and an image display unit configured to display a result based on the instruction recognized by the gesture recognizing unit.
Effects obtained by the representative aspect of the invention disclosed in the present application will be briefly explained as follows.
Namely, according to the representative embodiment of the present invention, it is possible to reduce a load on a user when the user wears the apparatus.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. Note that the same components are, in principle, denoted by the same reference numerals throughout the drawings for describing the embodiments, and repetitive description thereof is omitted. On the other hand, a component that has been explained with a reference numeral in one drawing may not be illustrated again when another drawing is explained, but may still be referred to by the same reference numeral.
The image display apparatus 101A displays information (for example, an image or the like) in the image display units 103 positioned at the portions corresponding to the lens portions of the eyeglasses. Further, the image display apparatus 101A specifies gesture of the user by means of the distance measuring sensor unit 102 and the like; recognizes an operation (or an instruction) corresponding to the specified gesture; and changes the content to be displayed in the image display units 103 in accordance with the recognized operation. As illustrated in
Further, the image display units 103 described above may not be provided at positions corresponding to both eyes, and may be provided at a position corresponding to one eye. Further, the distance measuring sensor unit 102 may not be provided at a center portion of the image display apparatus 101, but may be provided at an end portion of the image display apparatus 101, for example. In this case, compared with a case where the distance measuring sensor unit 102 is installed at the center, it is possible to improve designability because the distance measuring sensor unit 102 becomes less noticeable.
Further, the light source unit 104 is set in advance so as to be capable of irradiating an area beyond a distance measuring range of the distance measuring sensor unit 102. For example, in a case where an irradiation range of one light source is narrow, a plurality of light sources is provided. For example, the light source units 104 may be respectively provided at both sides of the image display apparatus 101. This makes it possible to widen the irradiation range.
Subsequently, functions of the image display apparatus 101 according to the first embodiment will be described with reference to
The distance measuring sensor unit 102 is a part for imaging a photographic subject. Details thereof will be described later. The image display unit 103 is a part for displaying an image, and is a display device or the like. The light source unit 104 is a part for irradiating light. For example, the light source unit 104 irradiates near infrared light.
The entire control unit 203 is a part that executes overall control of the image display apparatus 101A. The entire control unit 203 is realized by a CPU (Central Processing Unit) or the like, for example.
The light source control unit 204 is a part for controlling the light source unit 104, and operates the light source unit 104 at predetermined intervals. The light source control unit 204 is realized by a CPU or the like.
The gesture detecting unit 206 is a part for controlling the distance measuring sensor unit 102 to obtain an image for detecting gesture on the basis of image data obtained by the distance measuring sensor unit 102. The gesture detecting unit 206 is realized by a GPU (Graphics Processing Unit) or the like. The gesture detecting unit 206 has the image dividing unit 208, the image processing unit 209, and the distance measurement processing unit 210. When the image for detecting gesture is obtained, the gesture detecting unit 206 transmits the image to the gesture recognizing unit 213. Details of the gesture detecting unit 206 will be described later.
The gesture recognizing unit 213 is a part for recognizing gesture on the basis of the image detected by the gesture detecting unit 206, and executing a process based on the gesture. The gesture recognizing unit 213 is realized by a CPU or the like. When the image for detecting gesture is obtained from the gesture detecting unit 206, the gesture recognizing unit 213 analyzes the obtained image, specifies gesture of a target object (for example, a hand of the user), and recognizes an instruction corresponding to the specified gesture, thereby recognizing the gesture. The gesture recognizing unit 213 in advance stores information in which gesture and meaning of the gesture are associated with each other, and recognizes the instruction indicated by the specified gesture.
When the gesture is recognized, the gesture recognizing unit 213 transmits an instruction signal to the image display control unit 214 on the basis of the meaning of the gesture.
The image display control unit 214 is a part for controlling the image display unit 103. The image display control unit 214 receives the instruction signal from the gesture recognizing unit 213 or the like, and causes the image display unit 103 to display an image on the basis of the instruction signal. In a case where an instruction signal indicating enlargement of the image displayed by the image display unit 103 is received, for example, the image display control unit 214 enlarges and displays the displaying image.
The camera control unit 216 is a part for controlling the camera unit 217. When an instruction signal by the entire control unit 203 or the like is received, the camera control unit 216 operates the camera unit 217. Further, when an image is obtained from the camera unit 217 as a result that the camera unit 217 is operated, the camera control unit 216 transmits the image to the image display control unit 214.
The camera unit 217 is an imaging means, and is a part configured to take a still image or a moving image of an outside world. The camera unit 217 photographs the outside world in response to an instruction signal by the camera control unit 216, and transmits a photographed result to the camera control unit 216.
Subsequently, a processing procedure by the image display apparatus 101 will be described with respect to
First, at predetermined timing, the light source control unit 204 controls the light source unit 104 to adjust an amount of light to be irradiated and irradiate light (Step S1). Subsequently, the gesture detecting unit 206 obtains an image for detecting gesture on the basis of image data obtained by the distance measuring sensor unit 102 (Step S2). The gesture recognizing unit 213 recognizes gesture on the basis of the image obtained by the gesture detecting unit 206 (Step S3). The image display control unit 214 executes a display control in response to an instruction signal based on a recognition result by the gesture recognizing unit 213 (Step S4), and terminates the processing.
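The flow of Steps S1 to S4 above can be summarized as a small sketch. All class and method names below are hypothetical stand-ins for the units described in the text, not names taken from the document:

```python
class GestureDisplayPipeline:
    """Hypothetical sketch of the Step S1-S4 loop described above."""

    def __init__(self, light_source, detector, recognizer, display):
        self.light_source = light_source
        self.detector = detector
        self.recognizer = recognizer
        self.display = display

    def run_once(self):
        self.light_source.irradiate()                    # Step S1: adjust and irradiate light
        image = self.detector.detect_gesture_image()     # Step S2: obtain the detection image
        instruction = self.recognizer.recognize(image)   # Step S3: recognize the gesture
        self.display.show(instruction)                   # Step S4: display control
```

Each collaborator corresponds to one of the control units above (light source control unit, gesture detecting unit, gesture recognizing unit, image display control unit); the pipeline merely fixes their calling order.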
Subsequently, before details of the configurations of the gesture detecting unit 206 and the distance measuring sensor unit 102 are described, the basic principle of imaging and distance measurement using the distance measuring sensor unit 102 will be described.
<Photographing Principle of Infinite Object>
A structure of the distance measuring sensor unit 102 will be described with reference to
The pattern substrate 804 is made of material transparent to visible light, such as glass or plastic, for example. The photographing pattern 805 is formed by depositing metal such as aluminum or chromium by a sputtering method used in semiconductor processes, for example. Shading is given to the pattern by areas in which aluminum is deposited and areas in which it is not deposited.
Note that formation of the photographing pattern 805 is not limited to this. For example, the pattern may be formed by shading by means of printing of an ink jet printer. The pattern may be formed by any means so long as modulation of a transmission factor can be realized. Further, the visible light has been described herein as an example. For example, when photographing by far infrared ray is executed, material transparent to the far infrared ray, such as germanium, silicon, or chalcogenide, that is, material transparent to a wavelength that becomes a photographing target may be used for the pattern substrate 804, for example, and material that blocks the far infrared ray may be used for the photographing pattern 805.
Note that the method of forming the photographing pattern 805 on the pattern substrate 804 has been mentioned herein, but as illustrated in
Returning to
When photographing is executed with the above configuration, the intensity of the light passing through the photographing pattern 805 is modulated by the photographing pattern 805, and the transmitted light is received by the image sensor 803. The image signal outputted from the image sensor 803 is subjected to image processing by the image processing unit 209 included in the gesture detecting unit 206, and is outputted to the gesture recognizing unit 213.
Subsequently, a photographing principle in the distance measuring sensor unit 102 will be described. First, the photographing pattern 805 is a concentric pattern in which the pitch becomes finer in inverse proportion to the radius from the center, defined as I(r) = 1 + cos(βr²) (Formula (1)) by using a radius r from the reference coordinate at the center of the concentric circles and a coefficient β. The transmission factor of the photographing pattern 805 is modulated in proportion to this formula.
A plate with such fringes is called a Gabor zone plate or a Fresnel zone plate.
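As a rough numerical illustration, the transmission pattern of Formula (1) can be generated as follows. The pattern size and the value of β are arbitrary assumptions for the sketch, not values taken from the document; the cosine is rescaled to [0, 1] so it can serve directly as a transmission factor:

```python
import numpy as np

def gabor_zone_plate(size, beta, phase=0.0):
    """Transmission factor proportional to 1 + cos(beta * r^2 + phase),
    rescaled to [0, 1]; r is measured from the pattern center."""
    half = size / 2.0
    y, x = np.mgrid[0:size, 0:size]
    r2 = (x - half) ** 2 + (y - half) ** 2
    return (1.0 + np.cos(beta * r2 + phase)) / 2.0

pattern = gabor_zone_plate(256, beta=0.05)
```

The `phase` argument anticipates the fringe scanning described later, where several patterns with different initial phases Φ are needed.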
It is assumed that as illustrated in
Next, development processing by a correlation developing method and a moire developing method of the image processing unit 209 will be described.
In the correlation developing method, the image processing unit 209 calculates a cross correlation function between the projection image of the photographing pattern 805 (
Fourier transforms of Formulas (1) and (3) are respectively as follows:
Here, F indicates calculation of the Fourier transform, u is a frequency coordinate in the x direction, and δ with parentheses is a delta function. What is important in this formula is that the formula after the Fourier transform also becomes a Fresnel zone plate or a Gabor zone plate. Therefore, the image processing unit 209 may directly generate the developing pattern after the Fourier transform on the basis of this formula, which makes it possible to reduce the calculation amount. Next, multiplying Formulas (4) and (5) yields
(Formula (6)). The term exp(−iku) expressed by this exponential function is the signal component, and subjecting this term to an inverse Fourier transform converts it as F⁻¹[e^(−iku)] = 2πδ(x + k) (Formula (7)). A bright spot is thus obtained at the position k on the original x axis. This bright spot indicates a light flux at infinity, and is none other than an image photographed by the distance measuring sensor unit 102 illustrated in
Note that the correlation developing method may be realized by a pattern that is not limited to the Fresnel zone plate or the Gabor zone plate, for example, a random pattern so long as an autocorrelation function of the pattern has a single peak.
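The correlation developing method can be sketched in one dimension. The length N, the coefficient β, and the projection shift k below are arbitrary assumed values; the DC offsets of the two patterns are omitted so that they do not dominate the correlation, and the cross-correlation is computed via the FFT as the method suggests:

```python
import numpy as np

# Assumed parameters (not from the document).
N, beta, k = 512, 0.005, 30
x = np.arange(N) - N // 2

# Projection of the photographing pattern shifted by k, and the developing
# pattern, both in the zone-plate form of Formula (1) with the DC term dropped.
proj = np.cos(beta * (x + k) ** 2)
dev = np.cos(beta * x ** 2)

# Circular cross-correlation via FFT; a single dominant bright spot appears
# at a circular shift of +/-k, as the derivation above predicts.
corr = np.fft.ifft(np.fft.fft(proj) * np.conj(np.fft.fft(dev)))
peak = int(np.argmax(np.abs(corr)))
```

Because the cosine chirps only add coherently when the shift cancels the offset k, every other shift contributes a much smaller, incoherent sum.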
Next, in the moire developing method, the image processing unit 209 generates moire fringes (
IF(x)·IB(x) = {1 + cos[β(x + k)² + Φ]}·cos(βx² + Φ) = ½[2cos(βx² + Φ) + cos(2βx² + 2kβx + βk² + 2Φ) + cos(2kβx + βk²)]
(Formula (8)). It can be seen that the third term of this expansion is the signal component: straight, equally spaced fringes appearing over the area where the two patterns overlap, in the direction of the shift between them. A fringe generated at a relatively low spatial frequency by such an overlap of fringes is called a moire fringe. The two-dimensional Fourier transform of this third term becomes
Here, F indicates calculation of the Fourier transform, u is a frequency coordinate in the x direction, and δ with parentheses is a delta function. It can be seen from this result that a peak of spatial frequency occurs at the position u = ±kβ/π in the spatial frequency spectrum of the moire fringe. This bright spot indicates a light flux at infinity, and is none other than an image photographed by the distance measuring sensor unit 102 illustrated in
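The moire development can likewise be sketched in one dimension. N, β, and k are assumed values and the initial phase Φ is set to 0 for simplicity; the spectrum of the product then shows its strongest peak near the moire spatial frequency kβ/π derived above:

```python
import numpy as np

# Assumed parameters (not from the document).
N, beta, k = 512, 0.005, 30
x = np.arange(N) - N // 2

i_f = 1.0 + np.cos(beta * (x + k) ** 2)   # projected pattern shifted by k
i_b = np.cos(beta * x ** 2)               # developing pattern (Phi = 0)
product = i_f * i_b

# Remove the mean, then locate the strongest positive-frequency component.
spectrum = np.abs(np.fft.fft(product - product.mean()))
peak_bin = int(np.argmax(spectrum[: N // 2]))
expected_bin = k * beta / np.pi * N       # u = k*beta/pi, in FFT bins
```

The two chirp terms of Formula (8) spread their energy across many frequency bins, so the pure-tone moire term dominates the spectrum.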
<Noise Cancellation>
The conversion from Formula (6) to Formula (7) and the conversion from Formula (8) to Formula (9) focus on the signal component, but in practice the terms other than the signal component impede the development. Therefore, the image processing unit 209 executes noise cancellation based on fringe scanning. By using the orthogonality of trigonometric functions, when the multiplication result of Formula (6) is integrated with respect to Φ as
in the correlation developing method, the noise terms are cancelled and a constant multiple of the signal term remains. Similarly, when the multiplication result of Formula (8) is integrated with respect to Φ as ∫₀²π IF(x)·IB(x) dΦ = π cos(2kβx + βk²) (Formula (11)) in the moire developing method, the noise terms are cancelled and a constant multiple of the signal term remains.
Note that each of Formulas (10) and (11) is indicated in integral form, but actually the similar effect can be obtained by calculating the total sum of a combination of (as illustrated in
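The discrete version of this cancellation can be verified numerically: summing the product of Formula (8) over the four equally spaced initial phases Φ = 0, π/2, π, 3π/2 cancels every Φ-dependent term exactly, leaving a constant multiple of the signal term (N, β, and k are assumed values):

```python
import numpy as np

# Assumed parameters (not from the document).
N, beta, k = 512, 0.005, 30
x = np.arange(N) - N // 2

total = np.zeros(N)
for phi in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2):
    i_f = 1.0 + np.cos(beta * (x + k) ** 2 + phi)  # projection with initial phase phi
    i_b = np.cos(beta * x ** 2 + phi)              # matching developing pattern
    total += i_f * i_b

# Only the Phi-independent signal term survives the four-phase sum:
# 4 * (1/2) * cos(2*k*beta*x + beta*k^2).
signal = 2.0 * np.cos(2 * k * beta * x + beta * k ** 2)
```

Both cos(βx² + Φ) and the 2Φ chirp term sum to zero over these four phases, which is the discrete analogue of the orthogonality used in Formulas (10) and (11).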
In the fringe scanning that has been explained above, it is necessary to use a plurality of patterns each having a different initial phase as the photographing pattern 805. In order to realize this, for example, there is a method of switching patterns by space division.
In order to realize space division fringe scanning, as illustrated in
Subsequently, an outline of the image processing by the image processing unit 209 based on an imaging principle described above will be described.
Since this calculation result becomes a complex number, the image processing unit 209 executes a real number converting process in which an absolute value or a real part is taken to convert an image of a photographing target into a real number and develop it (Step S15). The image processing unit 209 then executes, with respect to the obtained image, contrast enhancement processing (Step S16) and color balance adjustment (Step S17), and outputs it as a photographing image. As described above, the image processing by the image processing unit 209 is terminated.
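A minimal sketch of the real number conversion of Step S15 and a simple linear stretch for the contrast enhancement of Step S16 follows. The document does not specify the actual enhancement or color balance algorithms, so the stretch below is only an assumed illustration:

```python
import numpy as np

def to_real(developed):
    """Step S15 sketch: convert the complex developed image to a real image
    by taking the absolute value (taking the real part is the alternative)."""
    return np.abs(developed)

def stretch_contrast(img):
    """Step S16 sketch: linearly stretch intensities to the range [0, 1]."""
    lo, hi = float(img.min()), float(img.max())
    if hi == lo:
        return np.zeros_like(img, dtype=float)
    return (img - lo) / (hi - lo)
```

In the pipeline above, `stretch_contrast(to_real(developed))` would be followed by the color balance adjustment of Step S17 before output.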
On the other hand,
<Photographing Principle of Finite Distance Object>
Next,
On the other hand, imaging of an object with a finite distance will be described.
In a case where a spherical wave from a point 2401 that constitutes an object irradiates the photographing pattern 805 and a projection image 2402 is projected to the image sensor 803, the projection image is enlarged substantially evenly. Note that the gesture detecting unit 206 can calculate this magnification ratio α as
by using a distance f between the photographing pattern 805 and the point 2401.
Therefore, if a developing pattern designed for parallel light is used as it is to execute the development processing, a single bright spot cannot be obtained. If the developing pattern 1501 is enlarged in accordance with the evenly enlarged projection image of the photographing pattern 805, however, a single bright spot can be obtained again for the enlarged projection image 2402. For this reason, the coefficient β of the developing pattern 1501 can be corrected by setting it to β/α².
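As one hedged sketch of this correction: if the projection of the pattern is assumed to be magnified by α = (f + d) / f, where d is the pattern-to-sensor spacing (this α expression is a common pinhole-shadow approximation assumed here, and d is an assumed parameter — the document's own α formula is not reproduced above), the correction β → β/α² can be written as:

```python
def refocus_beta(beta, f, d):
    """Correct the developing-pattern coefficient for an object at distance f,
    assuming a shadow magnification alpha = (f + d) / f (an assumption here;
    d is the pattern-to-sensor spacing)."""
    alpha = (f + d) / f
    return beta / alpha ** 2
```

For f → ∞, α → 1 and the parallel-light pattern is recovered; for nearer objects the corrected β is smaller, matching the evenly enlarged projection. Sweeping f and looking for the sharpest bright spot is what allows focusing on, and hence measuring, an arbitrary distance.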
This makes it possible to selectively reproduce light from the point 2301 positioned at a distance that is not necessarily infinite. Therefore, photographing can be executed with the focus on an arbitrary position; in other words, the distance to the arbitrary position can be calculated. This principle allows the unit to function as a distance measuring sensor.
In view of the principle described above, a configuration of the gesture detecting unit 206 according to the present embodiment will be described.
Patterns of the modulator 2502 (the photographing pattern 805) are configured to two-dimensionally arrange a plurality of initial phase patterns like patterns when the initial phases Φ of
In the example of
Thus, the modulator 2502 has the photographing pattern 805 provided on the light receiving surface of the image sensor 803 to modulate the intensity of light. The gesture detecting unit 206 calculates the distance between the photographing pattern 805 and the point 2401 as described above, whereby the distance can be obtained. As described above, the gesture detecting unit 206 restores the image by calculation based on the second pattern in divided image units, and obtains the image for detecting the gesture of the user. Note that the gesture detecting unit 206 may also restore the image collectively by means of a known calculation technique instead of restoring it in divided image units.
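The division of the sensor image into per-phase sub-images can be sketched as follows. The 2×2 quadrant layout, one quadrant per initial phase, is an assumption based on the description of the two-dimensionally arranged initial-phase patterns above:

```python
import numpy as np

def split_phase_subimages(img):
    """Divide a sensor image into four quadrant sub-images, one per initial
    phase of the spatially multiplexed photographing pattern (assumed 2x2 layout)."""
    h, w = img.shape
    return {
        0: img[: h // 2, : w // 2],   # e.g. Phi = 0
        1: img[: h // 2, w // 2 :],   # e.g. Phi = pi/2
        2: img[h // 2 :, : w // 2],   # e.g. Phi = pi
        3: img[h // 2 :, w // 2 :],   # e.g. Phi = 3*pi/2
    }
```

Each sub-image would then be developed with the developing pattern of the matching initial phase and combined by the fringe scanning summation described earlier.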
In the present embodiment, near infrared light is utilized as the light source installed in the image display apparatus 101A. By irradiating the target object with near infrared light, the distance can be measured even in a dark room. In order to block unnecessary visible light and transmit only the infrared light reflected by the photographic subject to the distance measuring sensor, an infrared transmitting filter or a bandpass filter that blocks visible light is installed, for example.
Next, a method of shifting the gesture recognition area in order to improve usability of the gesture operation in the head mounted image display apparatus will be described; for example, by setting the position at which the gesture operation is carried out outside the visual field range of the user, a more natural operation is realized.
At this time, a visual field of the distance measuring sensor, that is, the gesture recognition area is defined as an area indicated by a reference numeral “2803”.
As described above, the modulator 2502 of the distance measuring sensor unit 102 has the photographing pattern 805 that includes the plurality of patterns different from each other, and modulates the intensity of light. The gesture detecting unit 206 divides, into divided units, the image data obtained by the image sensor 803 receiving the light transmitted through the modulator 2502; restores the image by calculation based on the developing pattern 1501 corresponding to the photographing pattern 805; and obtains the image for detecting the gesture of the user. This makes it possible to obtain an image of an object in the outside world without using a lens for forming an image. Therefore, it is possible to reduce the load on a user when the user wears the apparatus. Namely, it becomes possible to realize a small image display apparatus that can recognize gesture in the optimal gesture recognition area.
Subsequently, a second embodiment will be described. The present embodiment is different from the first embodiment in the configuration and the installing method of the distance measuring sensor unit 102. It is indicated that, compared with the first embodiment, an even smaller image display apparatus 101 can be realized.
At this time, a visual field of a distance measuring sensor, that is, a gesture recognition area becomes an area 3003 indicated by a straight line that passes through an end of the image sensor 803 and the pattern center.
At this time, the gesture recognition area becomes an area 3103 indicated by a straight line that passes through an end of the image sensor 803 and the pattern center. Thus, compared with
Here, a photographing pattern center will be described.
An example of a shifting method of the photographing pattern will be described with reference to
Further, by shifting the photographing pattern center in this manner, an image processing unit 209 executes a process to shift an image segmenting position at the time of development processing in accordance with the shift amount and segment an image.
According to the configuration described above, compared with a case where the distance measuring sensor is diagonally installed in the head mounted image display apparatus illustrated in the first embodiment, it becomes possible to realize a small image display apparatus. Further, as described above, the image display apparatus 101A can realize a more natural operation by shifting the position of the photographing pattern 805 and setting the position at which a user carries out a gesture operation to be outside a visual field range of the user. As a result, the image display apparatus 101A can improve usability of the gesture operation.
The present embodiment is different from the first embodiment in that a position of a modulator 2502 or a position of a photographing pattern 805 of the modulator 2502 is dynamically shifted. An example of a head mounted image display apparatus 101 is illustrated. For example, it is illustrated that usability of a user is improved by dynamically shifting a gesture recognition area so that the user is allowed to carry out a natural gesture depending upon a posture state such as an upright position or a sitting position.
Here, the sensor unit 3504 is a gyro sensor or a nine-axis sensor, for example, and is a part for obtaining information indicating a state of the user. By being installed in the head mounted image display apparatus 101B, for example, the sensor unit obtains information on angular velocity, acceleration, or terrestrial magnetism. However, the sensor unit 3504 may be any sensor so long as the state of the user can be obtained. The sensor unit 3504 transmits sensor information to the posture detecting unit 3503.
The posture detecting unit 3503 is a part for obtaining sensor information from the sensor unit 3504 and detecting a posture of the user on the basis of the sensor information. The posture detecting unit 3503 is realized by a CPU or the like, for example. The posture detecting unit 3503 transmits a detection result to the distance measuring sensor control unit 3502.
The gesture recognition area determining unit 3505 is a part for determining a suitable gesture recognition area in accordance with the posture detected by the posture detecting unit 3503. The gesture recognition area determining unit 3505 is realized by a CPU or the like. The gesture recognition area determining unit 3505 obtains the detection result from the posture detecting unit 3503, and determines the gesture recognition area on the basis of the obtained result. When the gesture recognition area is determined, the gesture recognition area determining unit 3505 transmits a signal indicating a shift instruction of a modulator 2502 in a distance measuring sensor unit 102 or a photographing pattern 805 to a distance measuring sensor control unit 3502 on the basis of the determination.
The distance measuring sensor control unit 3502 is a part for shifting the position of the modulator 2502 included in the distance measuring sensor unit 102 or of the photographing pattern 805. The distance measuring sensor control unit 3502 is realized by a CPU or the like. When an instruction signal is obtained from the gesture recognition area determining unit 3505, the distance measuring sensor control unit 3502 shifts the modulator 2502 in the distance measuring sensor unit 102 or the photographing pattern 805 on the basis of the instruction signal.
A processing flow of the image display apparatus 101B illustrated in
Here, the example in which the gesture recognition area is determined in accordance with the posture has been described. As another example, the image display apparatus 101B may include input means (or a setting unit) via which the user designates a gesture recognition area (or a shift amount). The gesture recognition area may be designated in accordance with a hand by which the user carries out gesture (right hand or left hand), or the gesture recognition area may be designated in accordance with an application used by the user.
When the area is designated, at least one of a shift direction and an angle can be designated. Since the gesture recognition area can be set flexibly in this manner, a natural gesture operation in accordance with the usage environment of the user can be realized, and this makes it possible to improve usability.
Next, a shift example of the photographing pattern 805 of the modulator 2502 will be described with reference to
In the example of
For example, in a case where the user is at an upright position and wants to carry out a gesture operation at an upper side, the modulator 2502 is shifted to a position in the upper direction. In a case where the user is in a sitting position and it is assumed that the user is sitting on a chair and carries out a gesture operation on a desk, for example, it is good usability for the user when the gesture recognition area is shifted downward from a front face of the user. Therefore, the photographing pattern 805 of the modulator 2502 is shifted to a lower position. These are one example of the upright position and the sitting position. By shifting the position of the modulator 2502 so as to become a natural gesture operation in accordance with a posture of the user, it is possible to dynamically change the gesture recognition area.
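A trivial sketch of this posture-dependent selection follows; the mapping values are hypothetical examples corresponding to the upright and sitting cases above, and in the actual apparatus the decision would be made by the gesture recognition area determining unit:

```python
# Hypothetical posture-to-shift mapping (values are illustrative only).
SHIFT_BY_POSTURE = {"upright": "up", "sitting": "down"}

def pattern_shift_direction(posture, default="center"):
    """Return the direction in which to shift the photographing pattern
    for the given detected posture; unknown postures keep the default."""
    return SHIFT_BY_POSTURE.get(posture, default)
```

A finer-grained implementation would return a shift amount or angle rather than a direction label, as suggested by the area designation described earlier.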
In the example of
Here, the method of shifting the position of the modulator 2502 may be a method of mechanically shifting the modulator 2502, or may be a method of electrically shifting only the pattern. However, in a case where the modulator 2502 is shifted mechanically, a mechanism for moving the modulator 2502 is required, and there is a possibility that this causes an increase in a size of the apparatus.
Therefore, a method of electrically shifting only a pattern inexpensively will be described.
In this case, the distance measuring sensor control unit 3502 defines the portion in which light is blocked on the basis of the detection result by the posture detecting unit 3503, and controls the liquid crystal display 3901 of the distance measuring sensor unit 102 accordingly. Further, the distance measuring sensor control unit 3502 notifies the gesture detecting unit 206 of the blocked portion. The gesture detecting unit 206 identifies the photographing pattern of the portion through which light is transmitted, as determined by the blocked portion, and generates a developing pattern corresponding to that photographing pattern.
According to the configuration described above, the image display apparatus 101B changes the position of the photographing pattern 805 in accordance with the posture of the user, whereby it is possible to adjust an area where the user is expected to carry out gesture. Namely, the image display apparatus 101B dynamically shifts the gesture recognition area, for example, in accordance with a posture situation of the user such as the upright position or the sitting position or the hand by which the gesture is carried out, whereby it is possible to improve usability of the user.
In the present embodiment, a structure of the distance measuring sensor used in a head mounted image display apparatus 101 according to the present invention that realizes enlargement of the visual field is illustrated.
The present embodiment is different from the first embodiment to the third embodiment in that a structure of the distance measuring sensor illustrated in the first embodiment to the third embodiment is modified.
In other words, in the present embodiment, the angle of the parallel light flux 4403 illustrated in
In view of the foregoing, a gesture recognition range when the modulator 2502 is shifted as described in the second embodiment or the third embodiment will be described with reference to
An arrangement example of the light receiving element array 4201 and the microlens array 4202 according to the present embodiment will be described with reference to
A center 4804 of a microlens is shifted by a shift angle θ from the center 4303 of the microlens illustrated in
As a result, as illustrated in
According to the configuration and the method described above, it becomes possible to enlarge the gesture recognition range. This is not a problem in a case where an image sensor with a high CRA (Chief Ray Angle) characteristic is utilized; even so, the present configuration makes it possible to secure an optimal visual field.
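As a rough geometric illustration of why offsetting a microlens center by a shift angle θ steers the acceptance direction of its light receiving element: a chief ray arriving at angle θ is focused onto the element when the lens is displaced laterally by s = f·tan θ. The following minimal sketch assumes this simple paraxial geometry; the function name and the numeric values in the usage note are hypothetical, as the text gives no parameters.

```python
import math

def microlens_shift(focal_length_um, theta_deg):
    """Lateral offset (in the same units as focal_length_um) between a
    microlens center and its light receiving element so that a chief ray
    arriving at angle theta_deg lands on the element: s = f * tan(theta).
    Illustrative paraxial geometry only."""
    return focal_length_um * math.tan(math.radians(theta_deg))
```

For example, under this assumed geometry, a 3 µm focal length and a 20° shift angle would call for a lateral offset of roughly 1.09 µm, illustrating how a fixed per-lens shift widens the visual field captured by the array.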
The present invention is not limited to the image display apparatus 101A or the image display apparatus 101B described in the first to fourth embodiments above. The display function and the function of calculating image data may be separated via a communicating unit.
For example, as illustrated in
In this case, the image data obtained by the distance measuring sensor unit 102 may be compressed before being transmitted to the calculating unit 107.
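The compress-then-transmit step could be sketched as below. The `link_send` callback and the use of zlib are illustrative assumptions, since the document does not specify a codec or a transport for the communicating unit.

```python
import zlib

def send_range_image(raw: bytes, link_send) -> int:
    """Compress sensor image data before transmitting it to the external
    calculating unit, reducing bandwidth on the communication link.
    `link_send` is a stand-in for the communicating unit's transmit call.
    Returns the number of bytes actually sent."""
    payload = zlib.compress(raw, level=6)
    link_send(payload)
    return len(payload)

def receive_range_image(payload: bytes) -> bytes:
    """On the calculating unit side, restore the original image data."""
    return zlib.decompress(payload)
```

Because the sensor image is decompressed losslessly on the calculating unit side, the distance measurement processing there operates on the same data it would have received over an uncompressed link.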
Note that the present invention is not limited to the embodiments described above, and various modifications are included. For example, the embodiments described above have been explained in detail in order to describe the present invention clearly, and the present invention is not necessarily limited to one that includes all of the configurations explained.
Further, a part of the configuration of one embodiment can be replaced with a configuration of another embodiment, and a configuration of another embodiment can be added to the configuration of one embodiment. Further, for a part of the configuration of each embodiment, another configuration can be added, or the part can be deleted or substituted.
Further, a part or all of the respective configurations, functions, processing units, and processing means described above may be realized by hardware, for example, by designing them as an integrated circuit. Further, the respective configurations and functions described above may be realized by software, with a processor interpreting and executing programs that realize the respective functions. Information such as the programs, tables, and files that realize the respective functions can be placed in a recording device such as a memory, a hard disk, or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD.
Further, control lines and information lines are illustrated only insofar as they are thought to be necessary for explanation; not all of the control lines and information lines in a product are necessarily illustrated. In practice, it may be considered that almost all of the components are connected to each other.
The present invention can be utilized for an apparatus that displays an image.
101, 101A, 101B . . . image display apparatus, 102 . . . distance measuring sensor unit, 103 . . . image display unit, 104 . . . light source unit, 106 . . . communicating unit, 107 . . . calculating unit, 203 . . . entire control unit, 204 . . . light source control unit, 206 . . . gesture detecting unit, 208 . . . image dividing unit, 209 . . . image processing unit, 210 . . . distance measurement processing unit, 213 . . . gesture recognizing unit, 214 . . . image display control unit, 216 . . . camera control unit, 217 . . . camera unit, 2502 . . . modulator, 3502 . . . distance measuring sensor control unit, 3503 . . . posture detecting unit, 3504 . . . sensor unit, 4201 . . . light receiving element array, 4202 . . . microlens array, 4301 . . . light receiving element, 4302 . . . microlens.
Number | Date | Country | Kind |
---|---|---|---|
2018-007855 | Jan 2018 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
8890813 | Minnen | Nov 2014 | B2 |
9880634 | Sugaya | Jan 2018 | B2 |
10146375 | Shahar | Dec 2018 | B2 |
11360568 | Fukushima | Jun 2022 | B2 |
20120140096 | Ostlund | Jun 2012 | A1 |
20120146903 | Arihara | Jun 2012 | A1 |
20120249416 | Maciocci et al. | Oct 2012 | A1 |
20150219808 | Gill | Aug 2015 | A1 |
20150253193 | Schilz et al. | Sep 2015 | A1 |
20150317518 | Fujimaki et al. | Nov 2015 | A1 |
20170060242 | Gill | Mar 2017 | A1 |
20170086256 | Chen | Mar 2017 | A1 |
20170214862 | Matsubara | Jul 2017 | A1 |
20190020789 | Shimano | Jan 2019 | A1 |
20200201446 | Kim | Jun 2020 | A1 |
20210141236 | Shimano | May 2021 | A1 |
Number | Date | Country |
---|---|---|
2014-518596 | Jul 2014 | JP |
2015-180869 | Oct 2015 | JP |
2015-213212 | Nov 2015 | JP |
2017145348 | Aug 2017 | WO |
Entry |
---|
International Search Report of PCT/JP2018/044570 dated Feb. 19, 2019. |
Number | Date | Country | |
---|---|---|---|
20220283648 A1 | Sep 2022 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16963274 | US | |
Child | 17742589 | US |