This application claims priority to German Patent Application No. DE 10 2019 207 454.5, filed on May 21, 2019 with the German Patent and Trademark Office. The contents of the aforesaid Patent Application are incorporated herein for all purposes.
The invention relates to an augmented reality system (AR system) with a headset and a control element, as well as a method for operating such an augmented reality system. The invention moreover relates to a control element of an augmented reality system.
This background section is provided for the purpose of generally describing the context of the disclosure. Work of the presently named inventor(s), to the extent the work is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
An augmented reality system with a headset and a control element is disclosed in US 2017/0357333 A1 (incorporated by reference in its entirety). The control element disclosed in US 2017/0357333 A1 is pin-shaped and has an elongated middle part, on the end of which optical markers are provided. Moreover, the control element has a status LED and a switch.
A need exists to improve an AR system, or respectively to expand its functionality.
The need is addressed by the subject matter of the independent claims. Embodiments of the invention are described in the dependent claims, the following description, and the drawings.
The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features will be apparent from the description, drawings, and from the claims.
In the following description of embodiments of the invention, specific details are described in order to provide a thorough understanding of the invention. However, it will be apparent to one of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well-known features have not been described in detail to avoid unnecessarily complicating the instant description.
Some embodiments provide a control element, in particular a pin-like/elongated control element, in particular a control element in the form of a pin, wherein the control element comprises a first marker and a second marker for determining the orientation of the control element, as well as a light source for emitting a light beam. The control element may also comprise a gyroscope, in particular according to the teaching of US 2017/0357333 A1, for tracking, or respectively determining, its orientation. Moreover, it is in particular provided that the control element comprises a CPU as well as a battery for power supply.
It may be provided that the first marker is different from the second marker. It may in particular be provided that the light source-side marker is designed as a ring or comprises a ring. The light beam may comprise visible light; however, it may also comprise UV light or infrared light. In some embodiments, the light source is a diode, or respectively a laser diode. The light beam may be individualized, for example by a pulse and/or a code. In this manner, when the code or pulse is communicated to the headset, or respectively is known by the headset, the headset may reliably identify an object marked by the light beam. In an additional embodiment, the control element has a rangefinder, or respectively the light source is part of a rangefinder.
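Purely as an illustration of such an individualized light beam, the following sketch shows one conceivable way a headset could verify a known on/off pulse code from per-frame brightness samples of a candidate spot; the fixed bit pattern, the sampling scheme, and all names are assumptions made for the example, not details taken from the disclosure.

```python
# Sketch: identifying an individualized light beam by its pulse code.
# Hypothetical example; the disclosure only states that the beam may be
# individualized by a pulse and/or a code known to the headset. Here the
# code is a fixed on/off bit pattern sampled once per camera frame.

PULSE_CODE = [1, 0, 1, 1, 0, 0, 1, 0]  # shared between control element and headset

def matches_pulse_code(brightness_samples, threshold=0.5, min_agreement=0.9):
    """Check whether per-frame brightness samples of a candidate spot
    follow the known pulse code (any cyclic shift of it)."""
    bits = [1 if b > threshold else 0 for b in brightness_samples]
    n = len(PULSE_CODE)
    if len(bits) < n:
        return False
    window = bits[-n:]                   # most recent full code period
    for shift in range(n):               # sampling may start mid-code
        ref = PULSE_CODE[shift:] + PULSE_CODE[:shift]
        agreement = sum(a == b for a, b in zip(window, ref)) / n
        if agreement >= min_agreement:
            return True
    return False

# Usage: brightness of the candidate spot over the last eight frames.
print(matches_pulse_code([0.9, 0.1, 0.8, 0.9, 0.2, 0.1, 0.7, 0.1]))  # True
```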
In some embodiments, the control element has a contour, a structure, a texture, or the like that allows, or respectively enables, haptic or manual recognition of the orientation of the control element. In some embodiments, the control element has an interface to a headset. In some embodiments, the interface is a wireless interface, or respectively an interface for wireless communication. In some embodiments, the control element has one or more operating elements. These serve in particular to switch on the light source and/or to trigger the recognition of a region marked by the light beam.
Some embodiments provide an augmented reality system with a headset that comprises a transparent display, by means of which a virtual image component may be depicted, wherein the headset has a camera assembly for recording an image of the environment of the headset as well as a tracking system for determining the position (and orientation) of the virtual image component on the transparent display depending on the image of the real environment, and wherein the augmented reality system has an aforementioned control element. A headset pursuant to this disclosure is in particular also a head-mounted display (HMD), or respectively data goggles or AR goggles. A suitable headset pursuant to this disclosure is for example the HoloLens® by Microsoft®. A camera assembly pursuant to this disclosure is in particular a stereo camera assembly having at least two cameras. A tracking system pursuant to this disclosure is in particular a markerless tracking system.
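As a non-authoritative sketch of how a tracking system could place a virtual image component on the transparent display, the following assumes a simple pinhole model in which the tracking system supplies the headset pose (rotation R and translation t from world to display-camera coordinates) and an intrinsic matrix K; these names and the pinhole model are illustrative assumptions, not details of the disclosed system.

```python
import numpy as np

# Sketch: positioning a world-anchored virtual image component on the
# transparent display, given the headset pose from the tracking system.

def project_to_display(anchor_world, R, t, K):
    """Return (u, v) display-pixel coordinates of a 3D world anchor,
    or None if the anchor lies behind the viewer."""
    p_cam = R @ anchor_world + t        # world -> display-camera coordinates
    if p_cam[2] <= 0:
        return None                     # behind the display plane
    uvw = K @ p_cam                     # pinhole projection
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Usage with an identity pose and a simple intrinsic matrix:
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
uv = project_to_display(np.array([0.1, 0.0, 2.0]), np.eye(3), np.zeros(3), K)
```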
It may be provided that the augmented reality system comprises at least two control elements and at least two headsets. These may for example form an interactive group (with two users).
In some embodiments, the headset and the control element are data-linked by means of an in particular wireless communication system.
In some embodiments, the headset or the augmented reality system has a local position-determining module for recognizing, or respectively for determining, the position of a point or a region of the environment of the headset that is marked by the light beam.
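A minimal sketch of such a local position-determining module follows, assuming the marked spot appears as the brightest pixel in a rectified stereo image pair from the camera assembly; the rectification, the brightest-pixel detection, and the parameter names are simplifying assumptions for illustration only.

```python
import numpy as np

# Sketch of a local position-determining module: locate the spot marked
# by the light beam in a rectified stereo pair and triangulate its
# position relative to the headset.

def brightest_pixel(image):
    """Pixel coordinates (u, v) of the brightest spot, e.g. the laser dot."""
    v, u = np.unravel_index(np.argmax(image), image.shape)
    return float(u), float(v)

def triangulate_marked_point(left_img, right_img, focal_px, baseline_m, cx, cy):
    """Local 3D position (x, y, z) of the marked point in headset coordinates."""
    uL, vL = brightest_pixel(left_img)
    uR, _ = brightest_pixel(right_img)
    disparity = uL - uR
    if disparity <= 0:
        return None                       # spot not resolved in both cameras
    z = focal_px * baseline_m / disparity # standard rectified-stereo depth
    x = (uL - cx) * z / focal_px
    y = (vL - cy) * z / focal_px
    return np.array([x, y, z])
```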
Some embodiments provide a method for operating an aforementioned augmented reality system, wherein a point or a region of the environment of the augmented reality system is marked in the field of vision of the transparent display by the light beam.
In some embodiments, the marked point or region is assigned a local position and/or a global position. In some embodiments, the local position or the global position is assigned a function. In some embodiments, the marked region is measured and/or separated.
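By way of illustration only, a marked point with its positions and assigned function could be recorded in a small data structure such as the following; the field names, function names, and example values are hypothetical.

```python
from dataclasses import dataclass, field

# Sketch: assigning a local and/or global position and a function to a
# marked point. The disclosure does not prescribe a data model.

@dataclass
class MarkedPoint:
    local_pos: tuple            # position relative to the headset (LPOS)
    global_pos: tuple = None    # optional earth coordinates (GPOS)
    function: str = None        # e.g. "show_note", "measure", "separate"
    payload: dict = field(default_factory=dict)

markers = []

def mark(local_pos, global_pos=None, function=None, **payload):
    markers.append(MarkedPoint(local_pos, global_pos, function, payload))

# Example: attach an exercise note to a spot on a fitness trail.
mark((0.4, 1.2, 3.0), global_pos=(52.5200, 13.4050, 34.0),
     function="show_note", text="15 push-ups here")
```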
Reference will now be made to the drawings in which the various elements of embodiments will be given numerical designations and in which further embodiments will be discussed.
Specific references to components, process steps, and other elements are not intended to be limiting. Further, it is understood that like parts bear the same or similar reference numerals when referring to alternate FIGS. It is further noted that the FIGS. are schematic and provided for guidance to the skilled reader and are not necessarily drawn to scale. Rather, the various drawing scales, aspect ratios, and numbers of components shown in the FIGS. may be purposely distorted to make certain features or relationships easier to understand.
The augmented reality system moreover comprises a database 14 with virtual image components, or another source of virtual image components. From this database 14, or respectively the other source, the scene generator 15 takes a virtual image component VIRT that is positioned at a specific location so that it may be displayed at this location by the transparent display. The superposition of reality and the virtual image component occurs in the eye of the user.
The control element 20 comprises a marker M1 and a marker M2 as well as a light source 21 for emitting a light beam 211. The control element 20 moreover comprises two operating elements B1 and B2 for operating the light source 21, or respectively for triggering the recognition of a point or a region that is marked by the light beam 211.
Optionally, a global position recognition module 182 may be provided that interacts with a GPS or a similar positioning system 19 so that the local position LPOS may be converted into a global or absolute position GPOS, i.e., a position in earth coordinates.
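One conceivable conversion from LPOS to GPOS is sketched below, assuming LPOS has already been rotated into east-north-up axes using the headset's heading and that a flat-earth approximation around the headset's GPS fix suffices (reasonable over a few hundred metres); the function and parameter names are assumptions.

```python
import math

# Sketch: converting a local position LPOS (metres, east-north-up
# relative to the headset's GPS fix) into a global position GPOS in
# earth coordinates, using a flat-earth approximation.

EARTH_RADIUS_M = 6_371_000.0

def lpos_to_gpos(lpos_enu, headset_lat_deg, headset_lon_deg, headset_alt_m):
    east, north, up = lpos_enu
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(headset_lat_deg))))
    return headset_lat_deg + dlat, headset_lon_deg + dlon, headset_alt_m + up

# Usage: a point 12 m east and 30 m north of the headset's GPS fix.
gpos = lpos_to_gpos((12.0, 30.0, 0.5), 52.5200, 13.4050, 34.0)
```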
A separation module 183 may also be provided, by means of which sections may be separated from the real image RB, a section being identified by a local position signal LPOS that defines the section. The local position-determining module 181, the global position recognition module 182, and the separation module 183 are operated for example via the gesture recognition module 17, wherein the gestures may be performed by a hand of a user or by means of the control element 20. The gesture recognition module 17 interacts with the scene generator 15, or respectively the display 11, such that, by means of the display 11, e.g. selection options, menus, lists, menu structures, or the like may be depicted, wherein, by means of the gesture recognition module 17, certain entries that are shown by the display 11 may be selected.
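As an illustrative sketch of the separation module 183, the following assumes the local position signal LPOS has already been projected to pixel coordinates in the real image RB and that the separated section is a simple axis-aligned window around that pixel; both are simplifying assumptions, not details of the disclosed module.

```python
import numpy as np

# Sketch of a separation module: cut the section identified by a local
# position signal LPOS out of the real image RB.

def separate_section(rb_image, center_uv, half_size=50):
    """Return the image section around the marked pixel position,
    clipped to the image borders."""
    u, v = int(center_uv[0]), int(center_uv[1])
    h, w = rb_image.shape[:2]
    u0, u1 = max(0, u - half_size), min(w, u + half_size)
    v0, v1 = max(0, v - half_size), min(h, v + half_size)
    return rb_image[v0:v1, u0:u1]

# Usage: a 100 x 100 pixel cut-out around the marked point.
section = separate_section(np.zeros((480, 640)), center_uv=(320, 240))
```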
The control element 20, or respectively 20′, may be used to link texts, drawings, or markings to a real location. If, for example, an athlete is performing exercises on a fitness parcours, he may note with the control element 20, or respectively 20′, which exercises are to be performed in the real space. If he repeats his training in the subsequent week, the athlete may display his exercise instructions on the display 11. This information may, for example, optionally be, or be made, accessible to selected persons or groups.
To whom the data are made accessible can, for example, be selected via the gesture recognition module 17 in conjunction with the display 11 (and the selection menu shown therewith).
By means of the control element 20, or respectively 20′, objects (furniture or the like) may also be selected in a real room, and the headset may be instructed to digitize the geometry of the object in order, for example, to assess it at a different location. If, for example, an object is attractive in a furniture store, it may be scanned and placed at home. Conversely, the dimensions of a room may also be measured with the control element 20, or respectively 20′, in order to digitize them, as sketched below.
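If two corners of a room are marked and located as described above, a room dimension is simply the distance between the two marked local positions; a trivial sketch (names and values hypothetical):

```python
import math

# Sketch: measuring a room dimension as the Euclidean distance between
# two local positions marked with the control element's light beam.

def distance_between(marked_a, marked_b):
    """Euclidean distance in metres between two marked local positions."""
    return math.dist(marked_a, marked_b)

# e.g. two opposite corners of a room marked with the light beam:
room_diagonal = distance_between((0.0, 0.0, 2.1), (4.0, 3.0, 2.1))
```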
Another example of a use of the augmented reality system 1 may consist of creating a shopping list and sharing this information via an interface such as Google Maps: for example, a user assigns the list to the supermarket XY so that another user sees the shopping list when he visits that supermarket. This other user may delete an item from the list each time it is placed in the shopping cart.
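A schematic sketch of such a location-bound, shared list follows; storage, synchronization, and the map interface are deliberately abstracted away, and all names are hypothetical.

```python
# Sketch: a shopping list bound to a named place and shared between
# selected users, as in the supermarket scenario above.

shared_lists = {}  # place name -> {"items": [...], "users": {...}}

def attach_list(place, items, allowed_users):
    shared_lists[place] = {"items": list(items), "users": set(allowed_users)}

def lists_visible_at(place, user):
    entry = shared_lists.get(place)
    return entry["items"] if entry and user in entry["users"] else []

def check_off(place, user, item):
    entry = shared_lists.get(place)
    if entry and user in entry["users"] and item in entry["items"]:
        entry["items"].remove(item)  # deleted as soon as it is in the cart

attach_list("Supermarket XY", ["milk", "bread"], {"alice", "bob"})
check_off("Supermarket XY", "bob", "milk")
```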
In another example of a scenario according to
Moreover, by framing a desired object, the control element 20 (as well as 201, 202, 203), or respectively 20′—as shown in
Moreover, the control element 20 (as well as 201, 202, 203), or respectively 20′—as shown in
The invention has been described in the preceding using various exemplary embodiments. Other variations to the disclosed embodiments may be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor, module or other unit or device may fulfil the functions of several items recited in the claims.
The term “exemplary” used throughout the specification means “serving as an example, instance, or exemplification” and does not mean “preferred” or “having advantages” over other embodiments. The term “in particular” used throughout the specification means “serving as an example, instance, or exemplification”.
The mere fact that certain measures are recited in mutually different dependent claims or embodiments does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 10 2019 207 454.5 | May 2019 | DE | national |

| Filing Document | Filing Date | Country | Kind |
| --- | --- | --- | --- |
| PCT/EP2020/059452 | 4/2/2020 | WO | |