The present application is based upon and claims priority to Chinese Patent Application No. 201810230767.1, filed on Mar. 20, 2018, and the entire contents thereof are incorporated herein by reference.
The present disclosure relates to an augmented reality display device and method, and augmented reality glasses.
Augmented reality (AR) technology is a projection method in which virtual objects and virtual scenes are superimposed and displayed on the real world. When a virtual scene and a real scene are superimposed, the virtual objects and the real objects may shield each other because of their different positions in space and their different distances from the user, i.e., their different depth values.
The above information disclosed in this Background section is only used to enhance understanding of the background of the disclosure, and thus it may include information that does not constitute prior art known to those skilled in the art.
According to an aspect of the disclosure, an augmented reality display device includes an adjustable light transmissive sheet including a plurality of pixels. Light transmission of each of the plurality of pixels is controllable. The augmented reality display device includes a spatial three-dimensional reconstruction component, configured to obtain a depth value of each real point of a real scene in a user's field of view. The augmented reality display device includes a control unit, configured to compare a depth value of a virtual point displayed in a pixel with the depth value of the real point of the real scene corresponding to the pixel. When the depth value of the real point is greater than the depth value of the virtual point, the pixel is controlled to be opaque. When the depth value of the real point is smaller than the depth value of the virtual point, the pixel is controlled to be transparent.
In one exemplary arrangement of the disclosure, the augmented reality display device further includes a virtual scene generator electrically connected to the control unit, and configured not to generate a virtual scene at the pixel corresponding to the virtual point when the depth value of the real point is smaller than the depth value of the virtual point.
In one exemplary arrangement of the disclosure, the spatial three-dimensional reconstruction component includes
In one exemplary arrangement of the disclosure, the light includes structured light.
In one exemplary arrangement of the disclosure, the structured light includes standard stripe or grid light.
In one exemplary arrangement of the disclosure, the augmented reality display device further includes
In one exemplary arrangement of the disclosure, the augmented reality display device further includes
In one exemplary arrangement of the disclosure, the adjustable light transmissive sheet includes a liquid crystal light transmissive sheet.
According to an aspect of the disclosure, there are provided augmented reality glasses. The augmented reality display device includes
According to an aspect of the disclosure, there is provided an augmented reality display method. The method includes
In one exemplary arrangement of the disclosure, the augmented reality display method further includes
In one exemplary arrangement of the disclosure, obtaining the depth value of each real point of the real scene in the user's field of view includes emitting light, the light being reflected by the real scene in the user's field of view to form reflected light, and
In one exemplary arrangement of the disclosure, the augmented reality display method further includes monitoring eye movement information of the user in real time, and determining a sight of the user according to the eye movement information, to determine a pixel corresponding to the real point.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure, as claimed.
This section provides a summary of various implementations or examples of the technology described in the disclosure, and is not a comprehensive disclosure of the full scope or all features of the disclosed technology.
The above and other features and advantages of the present disclosure will become more obvious by the detailed description of the exemplary arrangements with reference to the drawings.
Exemplary arrangements will now be described more fully with reference to the accompanying drawings. However, the exemplary arrangements can be embodied in a variety of forms and should not be construed as being limited to the arrangements set forth herein. Rather, these arrangements are provided so that the present disclosure will be comprehensive and complete, and will comprehensively convey the concepts of the exemplary arrangements to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their detailed description will be omitted.
Augmented reality (AR) technology can be classified into two types according to the implementation principle: video perspective AR and optical perspective AR. Referring to the schematic diagram of the video perspective augmented reality display shown in
Referring to the schematic block diagram of the electrical connections of the augmented reality display device of the present disclosure shown in
Referring to
The adjustable light transmissive sheet 62 may include a plurality of pixels, and the light transmission of each of the plurality of pixels can be controlled. When a pixel operates in a light transmitting state, the user can see the external real scene through the position of the pixel. When a pixel operates in an opaque state, the user's field of view at the position of the pixel is shielded, and the user cannot see the real scene in that direction.
By controlling the light transmittance of each pixel, it is possible to control whether the real scene is visible at each pixel, thus presenting a correct shielding relationship between the real scene and the virtual scene. The adjustable light transmissive sheet 62 may be a liquid crystal light transmissive sheet, in which the light transmission of each pixel can be controlled. For example, the adjustable light transmissive sheet 62 may have a liquid crystal structure in which each pixel is a liquid crystal light valve. By controlling the driving voltage of each pixel, the light transmittance of each pixel can be independently controlled. However, the present disclosure is not limited thereto, and in other arrangements of the present disclosure, other pixelated or matrixed structures in which each pixel can be individually controlled may also be used.
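The per-pixel control described above can be sketched as follows. This is an illustrative example only: the driver callback `set_pixel_voltage` and the two voltage levels are hypothetical, since the actual drive scheme depends on the liquid crystal mode used.

```python
# Hedged sketch of driving each liquid crystal light valve of the
# adjustable light-transmissive sheet independently. The driver
# interface (set_pixel_voltage) and voltage levels are assumptions;
# real hardware and LC modes will differ.

OPAQUE_VOLTAGE = 5.0       # assumed drive voltage for an opaque valve
TRANSPARENT_VOLTAGE = 0.0  # assumed drive voltage for a transparent valve

def apply_transparency_mask(mask, set_pixel_voltage):
    """Drive each light valve according to `mask`.

    mask: 2D list of booleans, True meaning the pixel is transparent.
    set_pixel_voltage: callback (row, col, voltage) to the LC driver.
    """
    for row, line in enumerate(mask):
        for col, transparent in enumerate(line):
            voltage = TRANSPARENT_VOLTAGE if transparent else OPAQUE_VOLTAGE
            set_pixel_voltage(row, col, voltage)
```

A matrixed sheet would typically be refreshed row by row rather than pixel by pixel, but the logical mapping from a transparency mask to drive signals is the same.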
The spatial three-dimensional reconstruction component may include a light emitter 8 and a light receiver 9 or the like. The light emitter 8 may be used to emit light, and the real scene in the user's field of view reflects the light to form the reflected light. The light receiver 9 may be used to receive the reflected light and determine a depth value of each real point of the real scene in the user's field of view according to the reflected light.
The spatial three-dimensional reconstruction component can determine the depth value of each real point of the real scene by using the time of flight (TOF) method. The light emitter 8 emits a light pulse to the real scene, the real scene reflects the light to form the reflected light, and the light receiver 9 receives the reflected light. The depth value of each real point of the real scene is obtained by detecting the round-trip time of the light pulse.
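The TOF depth calculation above can be sketched in a few lines. Since the pulse travels to the real point and back, the depth is half the total distance traveled at the speed of light; the function below is an illustration of that arithmetic, not the device's actual implementation.

```python
# Illustrative sketch of the time-of-flight (TOF) depth calculation:
# depth = (speed of light) x (round-trip time) / 2.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_depth(round_trip_time_s: float) -> float:
    """Return the depth in meters of a real point given the round-trip
    time in seconds of a light pulse.

    The pulse travels to the point and back, so the one-way distance
    is half of the total distance traveled.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a round trip of 20 nanoseconds corresponds to a depth of
# roughly 3 meters.
depth = tof_depth(20e-9)
```

This also shows why TOF sensing needs very fine timing: resolving a 1 cm depth difference requires timing resolution on the order of tens of picoseconds.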
The spatial three-dimensional reconstruction component can also determine the depth value of each real point of the real scene by using the structured light projection method. The light emitter 8 projects the structured light onto the real scene, the real scene reflects the structured light, and the light receiver 9 receives the reflected structured light. The stripes of the reflected structured light are deformed by the unevenness of the target, and the shape and the spatial coordinates of the target can be obtained through an analysis process. The analysis processing methods are already known and will not be described here. The depth value of each real point of the real scene in the user's field of view is obtained from the spatial coordinates. The structured light may be standard stripe or grid light, and so on.
The spatial three-dimensional reconstruction component may further determine the depth value of each real point of the real scene in the user's field of view by using interferometry, stereo vision, depth from defocus measurements or the like, which will not be described here.
The augmented reality display device further includes a virtual scene generator for generating a virtual scene, and the virtual scene is reflected by the lens 61 to the user. The virtual scene generator can be a display screen, a projection device, or the like. The virtual scene generator is electrically connected to the control unit. When the depth value of the real point is smaller than the depth value of the virtual point, the pixel corresponding to the virtual point is controlled such that the virtual scene is not generated there. This prevents the virtual scene from being displayed in the case that the real scene shields the virtual scene, which would otherwise confuse the user in determining the positions of objects.
The control unit 10 may receive the depth value of each virtual point of the virtual scene, and may be configured to compare the depth value of the virtual point displayed in the same pixel with the depth value of the corresponding real point. The following two cases may be obtained after the comparison.
When the depth value of the real point is greater than the depth value of the virtual point, it is determined that the virtual scene shields the real scene at the pixel, and the pixel is controlled to be opaque such that the user sees the virtual scene instead of the real scene. Referring to the schematic diagram of a display effect of an augmented reality display device of the present disclosure shown in
When the depth value of the real point is smaller than the depth value of the virtual point, it is determined that the real scene shields the virtual scene at the pixel, and the virtual scene generator is controlled to re-draw the virtual image, such that in the new virtual image, the virtual image at the pixel is not displayed, and thus the user sees the real scene instead of the virtual scene. Referring to the schematic diagram of a further display effect of an augmented reality display device of the present disclosure shown in
The augmented reality display device may further include an eye movement capture device 7 configured to monitor the eye movement information of the user in real time. The control unit 10 determines the sight line of the user according to the eye movement information, in order to determine the pixel corresponding to each real point.
Specifically, the eye movement information capture device 7 tracks the eye movement of the user in real time and determines the direction of the sight line of the user. The control unit 10 can determine the pixel of the adjustable light-transmissive sheet 62 corresponding to each real point in the real scene in the user's field of view according to the line connecting the sight line with each point on the three-dimensional model of the real scene, and then control whether the pixel is transparent, to control whether the user can view that point of the real scene. The eye movement information capture device 7 can accurately determine the field of view of the user, so that the control unit only needs to determine and control the pixels within the field of view, thus reducing the computation load of the control unit and improving the operation speed.
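The mapping from a real point to a pixel can be sketched as a line-plane intersection: the pixel is where the line from the eye to the real point crosses the plane of the adjustable light-transmissive sheet. The geometry below (sheet parallel to the x-y plane at a fixed z, square pixels of uniform pitch) is an assumption made for illustration, not the calibration model of the actual device.

```python
# Hedged sketch of mapping a real point to a pixel on the adjustable
# light-transmissive sheet by intersecting the sight line with the
# sheet plane. Assumed geometry: the sheet lies in the plane
# z = sheet_z, with square pixels of size pixel_pitch whose pixel
# (0, 0) has its corner at `origin` in the plane.

def real_point_to_pixel(eye, real_point, sheet_z, pixel_pitch, origin):
    """Return (row, col) of the pixel through which the user sees
    `real_point`, or None if the sight line does not cross the sheet
    between the eye and the point.

    eye, real_point: (x, y, z) coordinates in meters.
    sheet_z: z coordinate of the sheet plane.
    pixel_pitch: pixel size in meters.
    origin: (x, y) of pixel (0, 0) on the sheet.
    """
    ex, ey, ez = eye
    px, py, pz = real_point
    if pz == ez:
        return None  # sight line is parallel to the sheet plane
    t = (sheet_z - ez) / (pz - ez)  # parameter along the segment eye->point
    if not 0.0 <= t <= 1.0:
        return None  # the sheet is not between the eye and the point
    ix = ex + t * (px - ex)
    iy = ey + t * (py - ey)
    col = int((ix - origin[0]) / pixel_pitch)
    row = int((iy - origin[1]) / pixel_pitch)
    return row, col
```

The same intersection, run once per real point on the three-dimensional model, yields the per-pixel correspondence that the control unit needs before comparing depth values.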
Referring to the specific schematic flowchart of an augmented reality display device of the present disclosure shown in
The spatial three-dimensional reconstruction component conducts three-dimensional modeling of the real scene in the user's field of view to obtain the depth value of each real point of the real scene at 602. The eye movement information capture device 7 tracks the eye movement of the user in real time at 604, and determines the direction of the sight line of the user at 606. The control unit 10 can determine the pixel of the adjustable light-transmissive sheet 62 corresponding to each real point in the real scene in the user's field of view according to a connecting line between the sight line and each point on the three-dimensional model of the real scene at 608. Concurrently, the virtual scene generator generates the virtual scene and the depth value of each virtual point of the virtual scene. The control unit 10 receives the depth value of each virtual point of the virtual scene at 610, and compares the depth value of the virtual point displayed in the same pixel with the depth value of the real point at 612. When the depth value of the real point is determined to be greater than the depth value of the virtual point at 614, it is determined that the virtual scene shields the real scene at this pixel, and the pixel is controlled to be opaque at 616, so that the user sees the virtual scene instead of the real scene. When the depth value of the real point is determined to be smaller than the depth value of the virtual point at 614, it is determined that the real scene shields the virtual scene at the pixel, and the virtual scene generator is controlled to re-draw the virtual image at 618, such that in the new virtual image, the virtual image at the pixel is not displayed, and thus the user sees the real scene instead of the virtual scene. At 620, a correct shielding relationship between the real scene and the virtual scene is presented.
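The per-pixel comparison and the two resulting actions (making the pixel opaque, or suppressing the virtual point) can be sketched as follows. The plain dictionaries keyed by pixel are assumptions made for illustration; an actual device would operate on dense depth maps.

```python
# Minimal sketch of the per-pixel shielding decision described above.
# Data structures (dicts mapping a (row, col) pixel to a depth value in
# meters) are illustrative assumptions.

def resolve_shielding(real_depth, virtual_depth):
    """Compare real and virtual depth values pixel by pixel.

    Returns (opaque_pixels, suppressed_virtual_pixels):
      - opaque_pixels: pixels to switch opaque, because the virtual
        point is closer and shields the real scene;
      - suppressed_virtual_pixels: pixels where the virtual image must
        be re-drawn as blank, because the real point is closer and
        shields the virtual scene.
    """
    opaque, suppressed = set(), set()
    for pixel, v_depth in virtual_depth.items():
        r_depth = real_depth.get(pixel)
        if r_depth is None:
            continue  # no real point along this sight line
        if r_depth > v_depth:
            opaque.add(pixel)      # virtual point closer: hide real scene
        elif r_depth < v_depth:
            suppressed.add(pixel)  # real point closer: hide virtual point
    return opaque, suppressed
```

Pixels showing only the virtual scene or only the real scene are left in their default state; the comparison matters only where a virtual point and a real point share the same pixel.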
In addition, the present disclosure further provides augmented reality glasses. Referring to the schematic structure diagram of one exemplary arrangement of the augmented reality glasses shown in
In this exemplary arrangement, the augmented reality glasses may include two frames 11 and two temples 12. The display assembly 6 is provided in the frame 11; for example, the lens 61 and the adjustable light-transmissive sheet 62 are provided in the frame 11. The spatial three-dimensional reconstruction component is provided on the frame 11; for example, the light emitter 8 is provided on one frame 11, and the light receiver 9 is provided on the other frame 11 symmetrically with the light emitter 8. The control unit 10 is provided on the temple 12. The eye movement information capture device 7 may include two units, which are respectively provided on the upper sides of the two frames 11.
It will be understood by those skilled in the art that the augmented reality display device can also be provided on a helmet or a mask to form a head-mounted augmented reality display device. Of course, it can also be used in automobiles, aircraft, etc., for example, in a head-up display (HUD) or in a flight aid instrument used on an aircraft.
Further, the present disclosure provides an augmented reality display method corresponding to the augmented reality display device described above. Referring to the block flowchart of the augmented reality display method shown in
At 10, the depth value of each real point of the real scene in the user's field of view is obtained;
at 20 the depth value of each virtual point of the virtual scene is received; and
at 30, the depth value of the virtual point displayed in the same pixel is compared with that of the real point; when the depth value of the real point is greater than that of the virtual point, the pixel is controlled to be opaque, and when the depth value of the real point is smaller than that of the virtual point, the pixel is controlled to be transparent.
In this exemplary arrangement, the augmented reality display method further includes: when the depth value of the real point is smaller than the depth value of the virtual point, controlling the pixel corresponding to the virtual point such that the virtual scene is not generated there.
In this exemplary arrangement, obtaining the depth value of each real point of the real scene in the user's field of view includes: emitting light, the light being reflected by the real scene in the user's field of view to form reflected light; and receiving the reflected light and determining the depth value of each real point of the real scene in the user's field of view according to the reflected light.
In this exemplary arrangement, the augmented reality display method further includes monitoring eye movement information of the user in real time, and determining the sight line of the user according to the eye movement information, in order to determine the pixel corresponding to the real point, that is, the pixel displaying the real point.
The operation of the augmented reality display method has been described in detail in connection with the augmented reality display device above, and will not be repeated here.
As can be seen from the above technical solutions, the present disclosure has at least one of the following advantages and positive effects:
The present disclosure provides an augmented reality display device. An adjustable light transmissive sheet includes a plurality of pixels, and the light transmission of each of the plurality of pixels can be controlled. The spatial three-dimensional reconstruction component can obtain the depth value of each real point of the real scene in the user's field of view. The control unit compares the depth value of the virtual point displayed in the same pixel with that of the real point: when the depth value of the real point is greater than that of the virtual point, the pixel is controlled to be opaque; when the depth value of the real point is smaller than that of the virtual point, the pixel is controlled to be transparent. In one aspect, by controlling the translucency of the pixels of the adjustable light-transmissive sheet, the display of the virtual scene or the real scene is controlled, thus realizing the selective presentation of the real scene in the user's field of view without capturing the real scene and processing the image prior to presenting it to the user. In another aspect, the user can directly view the real scene, which prevents the positional confusion caused by visual deviation. In yet another aspect, the real scene is directly transmitted to the user through the adjustable light-transmissive sheet, so there is no delay in the display of the real scene, and a more realistic real scene can be obtained.
The features, structures, or characteristics described above may be combined in any suitable manner in one or more arrangements, and the features discussed in the various arrangements are interchangeable, if necessary. In the above description, numerous specific details are set forth to provide a thorough understanding of the arrangements of the disclosure. However, those skilled in the art will appreciate that the technical solutions of the present disclosure may be practiced without one or more of the specific details, or other methods, components, materials, and the like may be employed. In other instances, well-known structures, materials or operations are not shown or described in detail to avoid obscuring aspects of the present disclosure.
In the present specification, the terms “a”, “an”, “the”, “this”, “said”, and “at least one” are open-ended and mean that there may be additional elements/components/etc. in addition to the listed elements/components/etc.
It should be understood that the present disclosure is not limited to the detailed structure and arrangement of the components presented in the specification. The present disclosure can have other arrangements and can be implemented and practiced in various forms. The foregoing variations and modifications are intended to fall within the scope of the present disclosure. It is to be understood that the disclosure disclosed and claimed herein extends to all alternative combinations of two or more of the individual features that are mentioned in or apparent from the drawings. All of these different combinations constitute a number of alternative aspects of the present disclosure. The arrangements described in the specification are illustrative of the best mode of the present disclosure, and will enable those skilled in the art to utilize this disclosure.
Number | Date | Country | Kind |
---|---|---|---|
201810230767.1 | Mar 2018 | CN | national |