This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present techniques, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
Amusement parks and/or theme parks may include various entertainment attractions, restaurants, and rides useful in providing enjoyment to guests. Areas of the amusement park may have different themes that are specifically targeted to certain audiences. For example, certain areas may include themes that are traditionally of interest to children, while other areas may include themes that are traditionally of interest to more mature audiences. Generally, such areas having themes may be referred to as an attraction or a themed attraction. It is recognized that it may be desirable to enhance the immersive experience for guests of such attractions, such as by augmenting the themes with virtual features.
A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
In one embodiment, a wearable visualization device configured to provide a user with an augmented reality, a virtual reality, and/or a mixed reality experience includes a housing and a lens portion extending from the housing. The wearable visualization device includes a first display screen and a second display screen coupled to the housing and configured to project light onto the lens portion, where the lens portion is configured to reflect at least a portion of the light into eyes of the user. The wearable visualization device also includes a camera positioned between the first display screen and the second display screen and configured to acquire image data of the lens portion.
In one embodiment, an augmented reality, virtual reality, and/or mixed reality (AR/VR) system includes a wearable visualization device. The wearable visualization device includes a housing, a lens portion extending from the housing, and a display assembly having a frame removably coupled to the housing. A first display screen, a second display screen, and a camera are coupled to the frame, where the camera is positioned between the first and second display screens.
In one embodiment, a wearable visualization device configured to provide a user with an augmented reality, a virtual reality, and/or a mixed reality experience includes a lens portion and a display screen configured to project virtual features onto a first location on the lens portion. The wearable visualization device includes a camera configured to acquire image data indicative of reflections viewable on the lens portion, where the reflections comprise a first reflection of a first eye of the user and a second reflection of a second eye of the user. The wearable visualization device includes a processor communicatively coupled to the camera and the display screen, where the processor is configured to adjust projection of the virtual features from the first location to a second location on the lens portion based on the image data.
Various refinements of the features noted above may be undertaken in relation to various aspects of the present disclosure. Further features may also be incorporated in these various aspects as well. These refinements and additional features may exist individually or in any combination.
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, it should be understood that references to “one embodiment” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features.
An amusement park may include an augmented reality (AR), a virtual reality (VR), and/or a mixed reality (combination of AR and VR) system (AR/VR system) that is configured to enhance a guest experience of an amusement park attraction by providing guests with AR/VR experiences (e.g., AR experiences, VR experiences, or both). Indeed, combinations of certain hardware configurations, software configurations (e.g., algorithmic structures and/or modeled responses), as well as certain attraction features may be utilized to provide guests with AR/VR experiences that may be customizable, personalized, and/or interactive.
For example, the AR/VR system may include a wearable visualization device, such as a head mounted display (e.g., electronic goggles or displays, eyeglasses), which may be worn by a guest and may be configured to enable the guest to view AR/VR scenes. In particular, the wearable visualization device may be utilized to enhance a guest experience by virtually overlaying features in a real-world environment of the amusement park, by providing adjustable virtual environments to provide different experiences in an amusement park ride, and so forth. Unfortunately, without the disclosed embodiments, it may be expensive and/or time-consuming to manufacture and assemble the wearable visualization device. Moreover, without the disclosed embodiments, it may be difficult to effectively integrate the wearable visualization device with an amusement park ride.
Therefore, embodiments of the present disclosure relate to a wearable visualization device having a multi-piece housing that facilitates manufacture and assembly of the wearable visualization device. In particular, the housing may include one or more detachable panels, such as a chassis, a lid, and a lens mount, which may, in an assembled configuration, form the housing. Certain of the panels may include component mating features (e.g., machined or molded features formed on surfaces of the panels) that are configured to receive and/or couple to various sub-components (e.g., electronic components; optical components) of the wearable visualization device. The component mating features enable the sub-components to be coupled to the panels prior to assembly of the housing, while various portions of the panels may be more easily accessible to an operator (e.g., a human technician; an assembly robot). After installation of the sub-components on one or more of the panels, the panels may be assembled to form the housing. In the assembled configuration, the housing may substantially isolate at least a portion of the sub-components from a surrounding ambient environment.
Embodiments of the wearable visualization device disclosed herein may also include various integration features that facilitate integration of the wearable visualization device with an attraction (e.g., an amusement park ride). For example, the integration features may include a camera that is coupled to the housing of the wearable visualization device and configured to acquire biometric information (e.g., an interpupillary distance) of a guest wearing the wearable visualization device. Particularly, the camera may acquire such biometric information when the guest first equips the wearable visualization device on their head (e.g., such as when the guest initially boards a ride vehicle of the attraction). A processing system of the wearable visualization device may be configured to calibrate certain components (e.g., one or more display screens) of the wearable visualization device based on the acquired biometric information of the guest, such that the wearable visualization device may more effectively provide the guest with AR/VR experiences. In some embodiments, the processing system may further utilize the image data acquired by the camera to determine whether the wearable visualization device is appropriately fitted on the guest's head. As an example, the processing system may utilize the acquired image data to determine whether one or more lenses or displays of the wearable visualization device are appropriately aligned with eyes of the guest (e.g., in a manner that facilitates effective presentation of AR/VR content to the guest). If the processing system determines that the wearable visualization device is misaligned on the guest's head (e.g., with respect to the eyes of the guest), the processing system may generate an alert instructing the guest and/or a ride technician operating the attraction to perform a corrective action. These and other features will be described in detail below with reference to the drawings.
With the foregoing in mind,
In the illustrated embodiment, the wearable visualization device 12 includes a lens portion 16 (e.g., AR/VR eyeglasses, goggles) that is coupled to a housing 18 of the wearable visualization device 12. The lens portion 16 may include one or more lenses 20 or displays (e.g., transparent, semi-transparent, opaque) onto which certain virtual features 24 (e.g., AR features) may be overlaid. In some embodiments, the lenses 20 may enable the user to view a real-world environment 22 (e.g., physical structures in the attraction) through the lenses 20 with certain virtual features 24 overlaid onto the lenses 20 so that the user perceives the virtual features 24 as being integrated into the real-world environment 22. That is, the lens portion 16 may at least partially control a view of the user by overlaying the virtual features 24 onto a line of sight of the user. To this end, the wearable visualization device 12 may enable the user to visualize and perceive a surreal environment 26 (e.g., a game environment) having certain virtual features 24 overlaid onto the physical, real-world environment 22 viewable by the user through the lenses 20.
By way of non-limiting example, the lenses 20 may include transparent (e.g., see-through) light emitting diode (LED) displays or transparent (e.g., see-through) organic light emitting diode (OLED) displays. In some embodiments, the lens portion 16 may be formed from a single-piece construction that spans a certain distance so as to display images to both eyes of the user. That is, in such embodiments, the lenses 20 (e.g., a first lens 28, a second lens 30) may be formed from a single, continuous piece of material, where the first lens 28 may be aligned with a first eye (e.g., left eye) of the user and the second lens 30 may be aligned with a second eye (e.g., right eye) of the user. In other embodiments, the lens portion 16 may be a multi-piece construction that is formed from two or more separate lenses 20.
In some embodiments, the wearable visualization device 12 may completely control the view of the user (e.g., using opaque viewing surfaces). That is, the lenses 20 may include opaque or non-transparent displays configured to display virtual features 24 (e.g., VR features) to the user. As such, the surreal environment 26 viewable by the user may be, for example, a real-time video that includes real-world images of the physical, real-world environment 22 electronically merged with one or more virtual features 24. Thus, in wearing the wearable visualization device 12, the user may feel completely encompassed by the surreal environment 26 and may perceive the surreal environment 26 to be the real-world environment 22 that includes certain virtual features 24. In some embodiments, the wearable visualization device 12 may include features, such as light projection features, configured to project light into one or both eyes of the user so that certain virtual features 24 are superimposed over real-world objects viewable by the user. Such a wearable visualization device 12 may be considered to include a retinal display.
As such, it should be appreciated that the surreal environment 26 may include an AR experience, a VR experience, a mixed reality experience, a computer-mediated reality experience, a combination thereof, or other similar surreal environment. Moreover, it should be understood that the wearable visualization device 12 may be used alone or in combination with other features to create the surreal environment 26. Indeed, as discussed below, the user may wear the wearable visualization device 12 throughout a duration of a ride of an amusement park ride or during another time, such as during a game, throughout a particular area or attraction of an amusement park, during a ride to a hotel associated with the amusement park, at the hotel, and so forth. In some embodiments, when implemented in the amusement park setting, the wearable visualization device 12 may be physically coupled (e.g., tethered via a cable 32) to a structure (e.g., a ride vehicle of the amusement park ride) to block separation of the wearable visualization device 12 from the structure and/or may be electronically coupled to (e.g., via the cable 32) a computing system (e.g., a computer graphics generation system) to facilitate operation of the wearable visualization device 12 (e.g., display of the virtual features 24).
As discussed below, the wearable visualization device 12 is removably coupleable (e.g., toollessly coupleable; coupleable without tools; coupled without threaded fasteners, such as bolts; separable without tools and without breaking the components of the wearable visualization device 12 or the guest interface device 14) to the guest interface device 14 to enable the wearable visualization device 12 to quickly transition between an engaged configuration 34, in which the wearable visualization device 12 is coupled to the guest interface device 14, and a disengaged or detached configuration 36 (see, e.g.,
As discussed below, after installation of the sub-components 48 on one or more of the panels 40, the panels 40 may be assembled (e.g., coupled to one another via fasteners, adhesives, and/or other techniques) to form the housing 18. The housing 18 may therefore encapsulate the sub-components 48 to substantially seal (e.g., hermetically seal) at least a portion of the sub-components 48 within the housing 18 to shield these sub-components 48 from direct exposure to ambient environmental elements (e.g., moisture) surrounding the wearable visualization device 12. It should be understood that, in other embodiments, the housing 18 may be assembled from additional or fewer panels than the lid 42, the chassis 44, and the lens mount 46. Indeed, in certain embodiments, the housing 18 may include 1, 2, 3, 4, 5, 6, or more than six individual panels 40 that, in an assembled configuration, may collectively form the housing 18.
In the illustrated embodiment, the housing 18 includes a forward end portion 50 (e.g., a first end portion) that is proximate to the lenses 20 and a rearward end portion 52 (e.g., a second end portion) that is distal to the lenses 20. In particular, the rearward end portion 52 includes a first peripheral portion 54 (e.g., a first distal end) and a second peripheral portion 56 (e.g., a second distal end) that, as discussed below, facilitate removably coupling the wearable visualization device 12 to the guest interface device 14. The chassis 44 includes a first outer surface 58 that may define a first lateral end portion 60 of the housing 18 and a second outer surface, opposite to the first outer surface 58, which may define a second lateral end portion 62 of the housing 18. The lid 42 includes an upper surface 64 that may define a top portion 66 of the housing 18. The chassis 44 includes a lower surface 68 (see, e.g.,
The chassis 44 may include a recess 72 that extends in a second direction 73, generally opposite to the first direction 71, and that slopes from the rearward end portion 52 of the housing 18 toward the forward end portion 50 of the housing 18. The wearable visualization device 12 may include a first screen 74 (e.g., a first display screen) and a second screen 76 (e.g., a second display screen), collectively referred to herein as screens 78, which may be coupled to the housing 18 and positioned within the recess 72. Particularly, as discussed below, the first screen 74 may be positioned within a first opening 80 (see, e.g.,
In some embodiments, a camera 84 may be positioned within the recess 72 and between the first and second screens 74, 76 (e.g., along a lateral axis of the wearable visualization device 12). Particularly, the camera 84 may be disposed within a camera opening 86 (see, e.g.,
The screens 78 may include any suitable displays that are configured to project virtual features onto the lenses 20. By way of non-limiting example, the screens 78 may include liquid crystal displays (LCDs), LED displays, OLED displays, or other suitable displays. In any case, the first screen 74 may project AR/VR content onto the first lens 28 and the second screen 76 may project AR/VR content onto the second lens 30. In this manner, the screens 78 may facilitate generation of the surreal environment 26 in accordance with the techniques discussed above. It should be understood that, in some embodiments, the screens 78 may include a first section or segment and a second section or segment of a single screen, instead of two separate screens. That is, the first screen 74 may include a first section or segment of a particular screen, and the second screen 76 may include a second section or segment of the particular screen.
As shown in the illustrated embodiment, the first and second lenses 28, 30 may each include a concave curvature that extends from a midline 90 of the lens portion 16. The midline 90 may be substantially aligned with (e.g., parallel to) the centerline 88. The concave curvature of the first and second lenses 28, 30 may facilitate reflection of some of or substantially all of the light projected onto the lenses 20 by the screens 78 back into the eyes of the user. For example, the concave curvature of the first lens 28 may enable the first lens 28 to reflect light (e.g., AR/VR content projected onto the first lens 28 by the first screen 74) into a first eye 92 of the user, and the concave curvature of the second lens 30 may enable the second lens 30 to reflect light (e.g., AR/VR content projected onto the second lens 30 by the second screen 76) into a second eye 94 of the user. As such, the user may view the AR/VR content that may be projected onto the lenses 20 (e.g., by the screens 78) and, therefore, perceive the surreal environment 26. Throughout the following discussion, the lenses 20 and the screens 78 may collectively be referred to as a display system 96 of the wearable visualization device 12.
In some embodiments, an exterior surface 104 (e.g., facing away from the eyes 98 of the user) of the lens portion 16 may be coated with one or more layers of an anti-reflective coating 106. The anti-reflective coating 106 may permit ambient light (e.g., sunlight) to pass through the lens portion 16 (e.g., from the exterior surface 104 to the interior surface 97) substantially without creating reflections in the lens portion 16 (e.g., reflections that may reduce a quality of the virtual features projected onto the lens portion 16 by the screens 78). Additionally or alternatively, the exterior surface 104 may be coated with one or more layers of a scratch-resistant coating 108 that may protect the lens portion 16 from acquiring scratches or other surface blemishes during repetitive usage of the wearable visualization device 12. In some embodiments, one or more layers of the scratch-resistant coating 108 may also be applied to the interior surface 97 of the lens portion 16.
The screens 78 may be configured to raster light (e.g., virtual features; AR/VR content) for projection onto the lenses 20 in line draw directions 129 that extend generally cross-wise and outwardly from the midline 90. For example, the first screen 74 may cyclically raster and update AR/VR content (e.g., along a raster line, represented by line 130) in a first direction 132, from a proximate portion 134 (e.g., laterally inward portion) of the first screen 74 (e.g., near the midline 90) toward a distal portion 136 (e.g., laterally-outward portion) of the first screen 74. The second screen 76 may cyclically raster and update AR/VR content (e.g., along an additional raster line, represented by line 138) in a second direction 139, from a proximate portion 141 (e.g., laterally inward portion) of the second screen 76 toward a distal portion 143 (e.g., laterally outward portion) of the second screen 76.
In this manner, the central region 120 of the lens portion 16, which may encompass the foveal vision of the user, may have a lower latency than regions of the lens portion 16 (e.g., the second sections 126, 128) corresponding to regions of the user's peripheral vision. Indeed, the screens 78 may raster updates to the projected AR/VR content onto the first sections 122, 124 of the lenses 20, which may define the user's central field of view, before rastering AR/VR content along the second sections 126, 128 of the lenses 20, which may define the user's peripheral vision. Accordingly, a user may experience substantially no perceivable latency between, for example, the virtual features viewable on the lens portion 16 and features in the real-world environment (e.g., animatronic figures) that may be coordinated with presentation of the virtual features. As a non-limiting example, a time period involved to raster and/or update AR/VR content displayed across the central region 120 of the lens portion 16, using the screens 78, may be approximately four milliseconds, approximately three milliseconds, or less than three milliseconds.
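By way of a purely illustrative sketch (not part of the disclosed embodiments), the center-out draw order described above could be expressed in software as follows. The function names, the write_column callback, and the assumption that column 0 of each screen is the laterally inward column nearest the midline 90 are hypothetical choices made here for illustration.

```python
from typing import Callable, Sequence


def raster_center_out(first_screen: Sequence, second_screen: Sequence,
                      write_column: Callable[[str, int, object], None]) -> None:
    """Raster both screens column-by-column, inner (foveal) columns first.

    Column 0 of each screen is assumed to be the laterally inward column
    nearest the midline 90, so iterating upward walks outward along the
    line draw directions 129 (e.g., the first direction 132 and the second
    direction 139).
    """
    for column in range(max(len(first_screen), len(second_screen))):
        if column < len(first_screen):
            write_column("first_screen", column, first_screen[column])
        if column < len(second_screen):
            write_column("second_screen", column, second_screen[column])
```

Because both screens walk outward in lockstep under this ordering, the columns covering the central region 120 are the first to receive each frame's update.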
The electronics board 142 may include one or more sensors 150 that facilitate operation of the wearable visualization device 12. As a non-limiting example, the sensors 150 may include orientation and/or position sensors, such as accelerometers, magnetometers, gyroscopes, global positioning system (GPS) receivers, motion tracking sensors, electromagnetic and solid-state motion tracking sensors, one or more inertial measurement units 152 (IMUs), presence sensors, Hall-effect sensors, temperature sensors, voltmeters, and/or other sensors. In some embodiments, the electronics board 142 may include a communication interface 154 (e.g., including a wired or wireless transceiver) that may transmit real-time data captured via the sensors 150 to a computer graphics generation system 156 that may be located remote from the wearable visualization device 12 (e.g., on a ride vehicle) or integrated with the wearable visualization device 12 (e.g., included on the electronics board 142). In some embodiments, the electronics board 142 may be communicatively coupled to the computer graphics generation system 156 via the cable 32.
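As a hedged illustration of the real-time data path described above, and not a format specified by this disclosure, a sample from the sensors 150 might be serialized for the communication interface 154 along the following lines; the field names, the JSON encoding, and the build_sensor_packet helper are assumptions introduced here for illustration.

```python
import json
import time


def build_sensor_packet(imu_reading: dict, device_serial: str) -> bytes:
    """Serialize one real-time sample from the sensors 150 for transmission
    via the communication interface 154 to the computer graphics generation
    system 156 (field names are illustrative assumptions)."""
    packet = {
        "serial": device_serial,
        "timestamp": time.time(),
        "orientation": imu_reading.get("orientation"),    # e.g., from the IMU 152
        "acceleration": imu_reading.get("acceleration"),
    }
    return json.dumps(packet).encode("utf-8")
```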
The electronics board 142 may include a memory 158 that may store individualized data (e.g., self-test results, error logs, hours of operation, serial number) of the wearable visualization device 12 and/or include instructions that facilitate communication to peripheral sub-assemblies and functions of the wearable visualization device 12 including, for example, a light assembly 160 (e.g., a light emitting diode [LED] assembly), the camera 84, and/or the IMU 152. As discussed below, the light assembly 160 may illuminate to provide various lighting effects in response to user input and/or the occurrence of events. Further, in certain embodiments, the light assembly 160 may, for example, indicate (e.g., via display of a particular color or hue of light) which type (e.g., version) of software is currently running on the electronics board 142.
The display driver board 144 may be configured to decode video signals (e.g., which may be received from the computer graphics generation system 156) and write lines of information to the screens 78. By way of example, the display driver board 144 may generate the raster lines (e.g., the lines 130, 138) to update AR/VR content projected by the screens 78. The display driver board 144 may also optimize a resolution and frequency of video information for display by the screens 78. The display driver board 144 may decode high-definition multimedia interface (HDMI) signals into Mobile Industry Processor Interface (MIPI) Alliance display serial interface (DSI) specifications. As such, it should be understood that the electronics board 142, the display driver board 144, and/or the computer graphics generation system 156 may cooperatively control the screens 78 to provide AR/VR experiences to the user in accordance with the techniques discussed above.
In some embodiments, the electronics board 142 may include an expansion port 164 (e.g., an admin port), which may be communicatively coupled to a processor 166 of the electronics board 142 or to another suitable processing system (e.g., the computer graphics generation system 156). The processor 166 may be a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration. The expansion port 164 may be coupled to a plug 170 located on an exterior surface of the housing 18. The expansion port 164 may enable auxiliary devices, such as a keyboard and/or mouse, to be communicatively coupled to the electronics board 142. As such, the auxiliary devices may provide a user (e.g., an authorized administrator) with additional functionality and may enable the user to control features of the wearable visualization device 12 using the auxiliary devices. As another non-limiting example, the expansion port 164 may enable integration of Bluetooth® functionality, expanded memory, one or more microphones, one or more acoustic speakers, or any other suitable auxiliary device or devices with the wearable visualization device 12.
To facilitate maintenance on the wearable visualization device 12, the electronics board 142, the display driver board 144, and/or the screens 78 may each be individually replaceable. For example, to facilitate the following discussion,
The display assembly 140 may include a frame 190 that is configured to support the screens 78, the camera 84, and the display driver board 144. The screens 78, the camera 84, and the display driver board 144 may be coupled to the frame 190 using fasteners, adhesives, and/or other suitable techniques. In some embodiments, the frame 190 may align the camera 84 with respect to the screens 78. That is, the frame 190 may ensure that the camera 84 is placed substantially equidistantly between the first and second screens 74, 76. The frame 190 may include one or more mounting tabs 192 (e.g., component mating features) that are configured to engage with respective mounting prongs 194 (e.g., component mating features) of the chassis 44. To this end, fasteners, adhesives, or other suitable connectors may be used to couple the frame 190 to the chassis 44. The mounting tabs 192 and the mounting prongs 194 may be positioned such that, when the frame 190 is in an engaged configuration with the chassis 44, the first screen 74 is aligned with the first opening 80, the second screen 76 is aligned with the second opening 82, and the camera 84 is aligned with the camera opening 86.
The chassis 44 may include one or more additional mounting prongs 198 (e.g., additional component mating features) that, in some embodiments, may facilitate coupling the electronics board 142 to the chassis 44 via suitable connectors (e.g., fasteners) or adhesives. For example, the electronics board 142 may include one or more apertures 199 (e.g., component mating features) formed therein that are configured to align with the additional mounting prongs 198 (e.g., in an installed configuration of the electronics board 142 within the chassis 44). As such, suitable fasteners and/or adhesives may be used to couple the electronics board 142 to the additional mounting prongs 198. The electronics board 142, the display driver board 144, and/or the screens 78 may be communicatively coupled to one another via the connections 146 (e.g., one or more wired connections and/or optical connections).
The following discussion continues with reference to
As such, it is important to note that, by positioning the camera 84 between the first and second screens 74, 76, a single camera 84 may be used to acquire image data indicative of light (e.g., virtual features) projected onto the first and second lenses 28, 30 by both the first and second screens 74, 76, respectively. For example, the camera 84 may observe light that is projected onto the first lens 28 by the first screen 74 and is reflected back toward the camera 84 (e.g., via the reflective layer 100 of the first lens 28). Similarly, the camera 84 may observe light that is projected onto the second lens 30 by the second screen 76 and is reflected back toward the camera 84 (e.g., via the reflective layer 100 of the second lens 30).
Additionally or alternatively, the camera 84 may acquire image data of the user's eyes 98 by observing reflections of the user's eyes 98 in the lens portion 16 (e.g., when the wearable visualization system 11, having the wearable visualization device 12 and the interface device 14, is fitted on the head of a user). In some embodiments, the processor 166 (or another suitable component of the electronics board 142) may receive the image data acquired by the camera 84 and utilize the acquired image data to determine biometric information of the user. For example, as discussed below, the processor 166 may utilize the acquired image data of the user's eyes 98 to determine an interpupillary distance of the user (e.g., a distance between respective pupils of the eyes 98 of the user). The processor 166 may utilize the derived biometric information to adjust projection of AR/VR images by the screens 78 in a manner that improves a performance (e.g., a perceived user experience) of the wearable visualization device 12.
For example, to facilitate discussion,
As shown in the illustrated embodiment of
The camera 84 may be communicatively coupled to the processor 166 and configured to provide the processor 166 with feedback indicative of the image 200. The processor 166 may be configured to analyze the image 200 to detect respective edges 204 of pupils 206 of the user's eyes 98. Based on locations of the edges 204 within the image 200, the processor 166 may estimate a first pupil circumference 208 of the first eye 92 of the user and a second pupil circumference 209 of the second eye 94 of the user. The processor 166 may determine a first centroid of the first pupil circumference 208 and may determine a second centroid of the second pupil circumference 209. As such, the first centroid may be indicative of an estimated centroid of a first pupil of the first eye 92 and the second centroid may be indicative of an estimated centroid of a second pupil of the second eye 94.
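For illustration only, the edge-and-centroid analysis described above might be sketched as follows using the OpenCV library; the threshold value, the pixel-to-millimeter constant, and the helper names are assumptions introduced here, not the disclosed implementation.

```python
import cv2
import numpy as np

PIXELS_PER_MM = 8.0  # assumed calibration constant relating image pixels to millimeters


def find_pupil_centroids(image_200, max_pupils=2):
    """Estimate the pupil centroids from the reflected-eye image (image 200)."""
    gray = cv2.cvtColor(image_200, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupils 206 appear as the darkest, roughly circular regions of the reflection.
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in sorted(contours, key=cv2.contourArea, reverse=True)[:max_pupils]:
        m = cv2.moments(contour)
        if m["m00"] > 0:
            # Centroid of the detected pupil boundary (edges 204).
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids


def interpupillary_distance_mm(centroids):
    """Distance between the two estimated centroids (interpupillary distance 210, discussed below)."""
    (x1, y1), (x2, y2) = centroids
    return float(np.hypot(x2 - x1, y2 - y1)) / PIXELS_PER_MM
```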
Based on the estimated centroids of the pupils 206, the processor 166 may determine an interpupillary distance 210 (see, e.g.,
The following discussion continues with concurrent reference to
In some embodiments, the processor 166 may be configured to evaluate, based on the image 200, whether the wearable visualization system 11 is appropriately oriented and/or positioned on the user's head. For example, with reference to
With reference to
For example, if the processor 166 determines (e.g., based on the angle 240) that the first pupil 224 of the first eye 92 is positioned below the second pupil 230 of the second eye 94 (e.g., relative to the reference axis 234), the processor 166 may send instructions that cause the first screen 74 to adjust projection of virtual features closer to a lower portion 250 of the first screen 74, such that the projected virtual features of the first screen 74 are overlaid closer toward a lower portion 252 of the first lens 28. Additionally or alternatively, the processor 166 may send instructions that cause the second screen 76 to adjust projection of virtual features closer to an upper portion 254 of the second screen 76, such that the projected virtual features of the second screen 76 are overlaid closer toward an upper portion 256 of the second lens 30. In this manner, the processor 166 may enable the projected virtual features displayed on the first and second lenses 28, 30 to be substantially aligned with the pupils 206 of the user's first and second eyes 92, 94, respectively, even if the wearable visualization system 11 is slightly offset (e.g., tilted) on the user's head.
Conversely, if the processor 166 determines (e.g., based on the angle 240) that the first pupil 224 of the first eye 92 is positioned above the second pupil 230 of the second eye 94 (e.g., relative to the reference axis 234), the processor 166 may send instructions that cause the first screen 74 to adjust projection of virtual features closer to an upper portion 260 of the first screen 74, such that the projected virtual features of the first screen 74 are overlaid closer toward an upper portion 262 of the first lens 28. Additionally or alternatively, the processor 166 may send instructions that cause the second screen 76 to adjust projection of virtual features closer to a lower portion 264 of the second screen 76, such that the projected virtual features of the second screen 76 are overlaid closer toward a lower portion 266 of the second lens 30. Accordingly, as similarly discussed above, the processor 166 may ensure that the projected virtual features displayed on the first and second lenses 28, 30 are substantially aligned with the pupils 206 of the user's first and second eyes 92, 94, respectively, even if the wearable visualization system 11 is slightly offset (e.g., tilted) on the user's head. In other words, the processor 166 may perform a software fix to correct misalignment of the wearable visualization system 11 on the head of the user.
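A minimal sketch of this tilt-compensation logic (and of the alert discussed next) is shown below, assuming image coordinates in which y increases downward and an arbitrary placeholder for the first threshold angle value; the shift_projection and alert_user callbacks are hypothetical stand-ins for the screen-control and alert mechanisms described in this disclosure.

```python
import math

FIRST_THRESHOLD_ANGLE_DEG = 5.0  # assumed placeholder for the first threshold angle value


def tilt_angle_deg(first_centroid, second_centroid):
    """Angle 240 between the interpupillary axis 236 and the reference axis 234,
    computed from the pupil centroids (image coordinates, y increasing downward)."""
    dx = second_centroid[0] - first_centroid[0]
    dy = second_centroid[1] - first_centroid[1]
    return math.degrees(math.atan2(dy, dx))


def compensate_or_alert(first_centroid, second_centroid, shift_projection, alert_user):
    """Apply the software fix for small tilts; otherwise request a hardware fix."""
    angle = tilt_angle_deg(first_centroid, second_centroid)
    if abs(angle) <= FIRST_THRESHOLD_ANGLE_DEG:
        # y grows downward: if the first pupil 224 sits lower than the second
        # pupil 230, move the first screen's projection downward and the second
        # screen's projection upward by half of the offset each.
        vertical_offset = first_centroid[1] - second_centroid[1]
        shift_projection("first_screen", vertical_offset / 2.0)    # positive = downward
        shift_projection("second_screen", -vertical_offset / 2.0)
    else:
        alert_user("Reposition the wearable visualization device on your head.")
```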
In some embodiments, if the processor 166 determines that a magnitude of the angle 240 is larger than the first threshold angle value, the processor 166 may generate an alert instructing the user to manually perform a corrective action (e.g., to reposition the wearable visualization system 11 on the user's head). For example, the processor 166 may instruct the screens 78 to project a message onto the lenses 20 that instructs the user to tilt the wearable visualization system 11 in a particular direction (e.g., left, right) on the user's head to adjust the interpupillary axis 236 to be substantially parallel to the reference axis 234. Additionally or alternatively, the processor 166 may generate an audible alert (e.g., via an acoustic speaker) that provides a recorded message instructing the user to appropriately reposition the wearable visualization system 11. To this end, the processor 166 may instruct the user to perform a hardware fix to correct misalignment of the AR/VR system 10 on the user's head.
It should be noted that, in certain embodiments, the camera 84 may be positioned at or near the lens portion 16 and configured to directly acquire image data of the user's eyes. That is, in such embodiments, the camera 84 may be oriented toward the eyes 98 to acquire an image of the eyes 98, instead of acquiring an image (e.g., the image 200) of a reflection of the eyes 98 that is viewable on the interior surface 97 of the lens portion 16. Moreover, in certain embodiments, the wearable visualization device 12 may include a first camera configured to acquire image data of reflections viewable in the lens portion 16 and an additional camera directed toward the eyes of the user and configured to directly acquire image data of the user's eyes. In some embodiments, the camera 84 may be used to determine a gaze direction of the user or any other suitable usage information of the user.
As noted above, in some embodiments, the AR/VR system 10 may be utilized in conjunction with an attraction (e.g., a passenger ride system). In such embodiments, the processor 166 may be configured to perform the aforementioned steps (e.g., determining the interpupillary distance 210; determining the angle 240) when the user initially equips the wearable visualization device 12 (e.g., when the user fits the wearable visualization device 12 on the guest interface device 14 fitted on the user's head during boarding of the attraction). By way of example, in such embodiments, the processor 166 may be configured to transmit an alert to a central control system of the attraction upon determining that a magnitude of the angle 240 is greater than, for example, the first threshold angle value. As such, the central control system may provide the alert to an operator (e.g., a ride technician monitoring operation of the attraction), such that the operator may assist the user in appropriately positioning the wearable visualization system 11 on the user's head prior to initiation of a ride cycle of the attraction.
In certain embodiments, the processor 166 may instruct the light assembly 160 to illuminate a particular color based on the magnitude of the angle 240. For example, if the magnitude of the angle 240 is less than or equal to the first threshold angle value, the processor 166 may instruct the light assembly 160 to illuminate in a green color or hue, thereby signaling to the user and/or the ride operator that the wearable visualization system 11 is appropriately fitted (e.g., aligned) on the user's head. As noted above, in such instances, the processor 166 may compensate for any minor misalignment of the wearable visualization system 11 on the user's head by performing a software fix (e.g., by adjusting presentation of the virtual features by the screens 78). If the magnitude of the angle 240 is greater than the first threshold angle value, the processor 166 may instruct the light assembly 160 to illuminate in, for example, a red color or hue, thereby signaling to the user and/or the ride operator that repositioning of the wearable visualization system 11 on the user's head may be desired.
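The color selection described above reduces to a simple comparison, sketched here with an assumed placeholder threshold; the function name and the value of five degrees are illustrative assumptions only.

```python
def light_assembly_color(angle_deg: float, first_threshold_angle_deg: float = 5.0) -> str:
    """Green indicates an acceptable fit; red indicates that repositioning may be desired."""
    return "green" if abs(angle_deg) <= first_threshold_angle_deg else "red"
```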
In certain embodiments, one or more of the LEDs 270 may be coupled to respective light pipes 276 (e.g., optical fibers; acrylic rods) that are configured to transmit light emitted by the LEDs 270 from respective first end portions coupled to the LEDs 270 to respective distal end portions 278. As such, the light pipes 276 may enable the LEDs 270 to be positioned within, for example, a central region 279 (see, e.g.,
The following discussion continues with reference to
The wearable visualization device 12 may include a plurality of support grooves 300 that are configured to engage with respective support ribs 302 of the guest interface device 14. In some embodiments, the support grooves 300 are formed within the first and second peripheral portions 54, 56 of the housing 18 and extend along at least a portion of a surface 304 of the housing 18. For example, the support grooves 300 may extend from distal end faces 306 of the housing 18 generally along a direction 308.
The guest interface device 14 includes an interface frame 310 having a first peripheral end 312, a second peripheral end opposite to the first peripheral end 312, and a lip 314 that extends between the first peripheral end 312 and the second peripheral end. The interface frame 310 includes the plurality of support ribs 302 that protrude from an outer surface 318 of the interface frame 310. Particularly, the interface frame 310 may include a first support rib 320 that extends from the first peripheral end 312 and a second support rib that extends from the second peripheral end. As discussed below, the support ribs 302 are configured to engage with corresponding ones of the support grooves 300 to support the wearable visualization device 12 on the interface frame 310 and to facilitate coupling of the wearable visualization device 12 to the interface frame 310.
The interface frame 310 may include one or more tertiary magnets 324 that are coupled to and/or integrated with (e.g., hermetically sealed within) the interface frame 310 (e.g., within the lip 314). Further, the interface frame 310 may include one or more quaternary magnets 326 that are coupled to and/or integrated with (e.g., hermetically sealed within) the first peripheral end 312 and/or the second peripheral end of the interface frame 310.
To couple the wearable visualization device 12 to the guest interface device 14, the user may (e.g., while holding the guest interface device 14 in the user's hands and while the guest interface device 14 is separated from the user's head; while wearing the guest interface device 14 on the user's head) translate the wearable visualization device 12 toward the guest interface device 14 in a direction 340, generally opposite to the direction 308, to enable the support ribs 302 of the guest interface device 14 to engage with the corresponding support grooves 300 of the wearable visualization device 12. The user may translate the wearable visualization device 12 along the support ribs 302 (e.g., in the direction 340) until the distal end faces 306 of the housing 18 abut corresponding receiving faces 342 of the guest interface device 14. As such, the primary magnets 280 of the wearable visualization device 12 may align with and magnetically couple to the quaternary magnets 326 of the guest interface device 14.
At least a portion of the lid 42 of the wearable visualization device 12 may be configured to translate beneath and along the lip 314 of the guest interface device 14 to enable the secondary magnets 290 of the wearable visualization device 12 to engage with and magnetically couple to the tertiary magnets 324 of the guest interface device 14. To this end, the mechanical engagement between the support ribs 302 and the support grooves 300 may support substantially all of a weight of the wearable visualization device 12 (e.g., when coupled to the guest interface device 14), while the magnetic engagement between the primary and quaternary magnets 280, 326 and/or the secondary and tertiary magnets 290, 324 blocks the wearable visualization device 12 from disengaging (e.g., sliding off of) the guest interface device 14. Indeed, it should be understood that a force utilized to magnetically decouple the primary and quaternary magnets 280, 326 and/or to magnetically decouple the secondary and tertiary magnets 290, 324, such as when transitioning the wearable visualization device 12 from the engaged configuration 34 (e.g., as shown in
To remove the wearable visualization device 12 from the guest interface device 14, the user may translate the wearable visualization device 12 away from the guest interface device 14 in the direction 308, generally opposite to the direction 340, to enable the primary magnets 280 of the wearable visualization device 12 to magnetically decouple from the quaternary magnets 326 of the guest interface device 14 and/or to enable the secondary magnets 290 of the wearable visualization device 12 to magnetically decouple from the tertiary magnets 324 of the guest interface device 14. The user may continue to translate the wearable visualization device 12 in the direction 308, relative to the guest interface device 14, to remove (e.g., decouple) the wearable visualization device 12 from the guest interface device 14.
It should be appreciated that, in certain embodiments, the primary magnets 280 or the quaternary magnets 326, and/or the secondary magnets 290 or the tertiary magnets 324, may be replaced with a suitable reaction material (e.g., metallic plates). As such, the magnets 280, 290, 324, and/or 326 may be configured to attract a corresponding reaction material instead of another magnet. Moreover, in certain embodiments, any of the magnets 280, 290, 324, and/or 326 may be replaced with suitable electromagnets that are powered via a wired or wireless power source (e.g., a battery). In such cases, the electromagnets may be deactivated to enable separation of the wearable visualization device 12 from the guest interface device 14 at certain times, such as during an unloading process in which the user is unloading from the ride vehicle of the amusement park ride. Similarly, the electromagnets may be activated to facilitate securement of the wearable visualization device 12 to the guest interface device 14 at certain times, such as during a loading process in which the user is loading onto the ride vehicle of the amusement park ride. The deactivation and activation may be carried out automatically by the AR/VR system 10 based on the location of the wearable visualization device 12.
It should be noted that the magnets 280, 290, 324, and/or 326 are described herein as primary magnets, secondary magnets, tertiary magnets, and quaternary magnets, respectively, to facilitate discussion. However, other terms may be used to refer to the magnets 280, 290, 324, 326 (e.g., first magnets, second magnets, third magnets, and fourth magnets, respectively). Moreover, in certain embodiments, the primary magnets 280 or the quaternary magnets 326, and/or the secondary magnets 290 or the tertiary magnets 324, may be omitted from the AR/VR system 10.
In some embodiments, the wearable visualization device 12 may include a proximity sensor 350 (e.g., a Hall effect sensor) that is coupled to the housing 18 and located near, for example, the first peripheral portion 54 of the housing 18. Particularly, the proximity sensor 350 may be positioned near the distal end face 306 of the housing 18. The proximity sensor 350 may be communicatively coupled to the electronics board 142 and configured to provide the processor 166 (or another suitable processing component) with feedback indicative of whether the wearable visualization device 12 is in the engaged configuration 34 (e.g., mated with the guest interface device 14) or in the disengaged configuration 36 (e.g., detached from the guest interface device 14). Particularly, the proximity sensor 350 may be triggered (e.g., generate a signal) when the wearable visualization device 12 is within a threshold distance of the guest interface device 14 and, thus, may be used to determine when the wearable visualization device 12 is positioned in the engaged configuration 34. By way of example, the proximity sensor 350 may be triggered when the distal end face 306 of the first peripheral portion 54 of the wearable visualization device 12 is within a threshold distance of or in contact with the receiving face 342 of the first peripheral end 312 of the interface frame 310.
In some embodiments, the processor 166 may periodically or continuously monitor the feedback received from the proximity sensor 350. Upon receiving feedback from the proximity sensor 350 indicating that the wearable visualization device 12 is in the engaged configuration 34, the processor 166 may provide an indication to the user confirming that the wearable visualization device 12 has been successfully mated with the guest interface device 14. By way of example, upon receiving feedback indicating that the wearable visualization device 12 is in the engaged configuration 34, the processor 166 may instruct the light assembly 160 to project a particular hue or color of light (e.g., green), may control one or more acoustic speakers of the wearable visualization device 12 to provide an audible message to the user, may control the screens 78 to display a message to the user on the lenses 20, and/or may provide feedback to the user and/or to an operator via another suitable medium.
It should be understood that, in other embodiments, the wearable visualization device 12 may include a plurality of proximity sensors that are positioned along any suitable portion of the wearable visualization device 12. For example, the wearable visualization device 12 may include a first proximity sensor positioned within the first peripheral portion 54 of the housing 18 and a second proximity sensor positioned within the second peripheral portion 56 of the housing 18. In such embodiments, the processor 166 may determine that the wearable visualization device 12 is coupled to the guest interface device 14 upon receiving feedback that both the first and second proximity sensors are triggered. Indeed, in certain embodiments, the processor 166 may determine that the wearable visualization device 12 is coupled to the guest interface device 14 upon receiving feedback that any one particular proximity sensor is triggered or that a threshold quantity of the proximity sensors included in the wearable visualization device 12 are triggered.
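As a hedged sketch of this engagement check, under the assumption of a simple count of triggered sensors and hypothetical helper names, the processor logic might resemble the following.

```python
from typing import Iterable


def is_engaged(proximity_triggered: Iterable[bool], threshold_quantity: int = 1) -> bool:
    """Return True when at least threshold_quantity proximity sensors report that
    the device is within the threshold distance of the guest interface device 14."""
    return sum(1 for triggered in proximity_triggered if triggered) >= threshold_quantity


def confirm_engagement(proximity_triggered, set_light_color):
    """Confirm the engaged configuration 34 to the user (e.g., by a green light)."""
    if is_engaged(proximity_triggered):
        set_light_color("green")
```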
For example, the IMU 152 may include a nine degree of freedom system on a chip equipped with accelerometers, gyroscopes, a magnetometer, and/or a processor for executing sensor fusion algorithms. The processor 166 may utilize feedback received from the IMU 152 to determine an orientation of the wearable visualization device 12 (e.g., relative to a direction of gravity) along various axes. In some embodiments, an orientation, referred to herein as a storage orientation, of the wearable visualization device 12, when the wearable visualization device 12 is positioned in the receptacle 370, may be known and stored on, for example, the memory 158.
The processor 166 may determine that the wearable visualization device 12 is in the storage configuration 374 upon receiving feedback from the IMU 152 that the wearable visualization device 12 is in the storage orientation and upon receiving feedback from a proximity sensor 380 (e.g., a proximity sensor disposed adjacent to the lens mount 46; the proximity sensor 350) that, for example, the lens mount 46 is within a threshold distance of a mating surface 382 of the receptacle 370 or in contact with the mating surface 382. In this way, the processor 166 may avoid inadvertently determining that the wearable visualization device 12 is in the storage configuration 374 when a user temporarily orients the wearable visualization device 12 in the storage orientation (e.g., during the process of mating the wearable visualization device 12 to the guest interface device 14). Instead, the processor 166 may determine that the wearable visualization device 12 is positioned in the storage configuration 374 when receiving feedback from both the IMU 152 and the proximity sensor 350 indicating that the wearable visualization device 12 is positioned within the receptacle 370 at a particular angle and is engaged with (e.g., in physical contact with) the mating surface 382. In accordance with the techniques discussed above, the processor 166 may be configured to provide an audible and/or visual alert or confirmation upon determining that the wearable visualization device 12 is transitioned to the storage configuration 374. As an example, upon determining that the wearable visualization device 12 transitioned to the storage configuration 374, the processor 166 may instruct the light assembly 160 to emit a blue color or other hue of light.
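A minimal sketch of this two-condition storage check follows, assuming the storage orientation is expressed as pitch and roll angles and using an assumed tolerance; the helper names are illustrative only.

```python
def matches_storage_orientation(live_orientation, storage_orientation, tolerance_deg=10.0):
    """Compare the live IMU orientation (pitch, roll in degrees) with the stored
    storage orientation; the tolerance value is an assumption for illustration."""
    return all(abs(live - stored) <= tolerance_deg
               for live, stored in zip(live_orientation, storage_orientation))


def in_storage_configuration(live_orientation, storage_orientation, proximity_triggered):
    """True only when the IMU 152 feedback and the proximity sensor feedback both agree."""
    return matches_storage_orientation(live_orientation, storage_orientation) and proximity_triggered
```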
In some embodiments, the lap bar 372 may move (e.g., release) in response to the wearable visualization device 12 being in the storage configuration 374. It should be appreciated that the receptacle 370 may be positioned in any suitable portion of the ride vehicle (e.g., dashboard, arm rest, wall). The receptacle 370 may be used in other types of attractions (e.g., without a ride vehicle), and the receptacle 370 may be positioned in a wall or structure, such as in a seat or at an exit of the attraction.
As set forth above, embodiments of the present disclosure may provide one or more technical effects useful for reducing overall manufacturing costs and/or manufacturing complexity of the wearable visualization device, for facilitating performance of maintenance activities on the wearable visualization device, and for facilitating integration of the wearable visualization device in an amusement park environment. It should be understood that the technical effects and technical problems in the specification are examples and are not limiting. Indeed, it should be noted that the embodiments described in the specification may have other technical effects and can solve other technical problems.
While the embodiments set forth in the present disclosure may be susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and have been described in detail herein. However, it should be understood that the disclosure is not intended to be limited to the particular forms disclosed. The disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure as defined by the following appended claims.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function] . . . ” or “step for [perform]ing [a function] . . . ”, it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).
This application claims priority to and the benefit of U.S. Provisional Application No. 62/791,735, entitled “AUGMENTED REALITY (AR) HEADSET FOR HIGH THROUGHPUT ATTRACTIONS,” filed Jan. 11, 2019, which is hereby incorporated by reference in its entirety for all purposes.
20160109710 | Smith et al. | Apr 2016 | A1 |
20160171779 | Bar-Zeev et al. | Jun 2016 | A1 |
20160188943 | Franz | Jun 2016 | A1 |
20160210784 | Ramsby et al. | Jul 2016 | A1 |
20160223822 | Harrison et al. | Aug 2016 | A1 |
20160240013 | Spitzer | Aug 2016 | A1 |
20160262608 | Krueger | Sep 2016 | A1 |
20160292918 | Cummings | Oct 2016 | A1 |
20160346704 | Wagner | Dec 2016 | A1 |
20160353089 | Gallup et al. | Dec 2016 | A1 |
20160364907 | Schoenberg | Dec 2016 | A1 |
20160370855 | Lanier et al. | Dec 2016 | A1 |
20160377869 | Lee et al. | Dec 2016 | A1 |
20160379417 | Mount et al. | Dec 2016 | A1 |
20170053445 | Chen et al. | Feb 2017 | A1 |
20170053446 | Chen et al. | Feb 2017 | A1 |
20170053447 | Chen et al. | Feb 2017 | A1 |
20170059831 | Hua et al. | Mar 2017 | A1 |
20170108696 | Harrison et al. | Apr 2017 | A1 |
20170116950 | Onal | Apr 2017 | A1 |
20170131581 | Pletenetskyy | May 2017 | A1 |
20170171538 | Bell et al. | Jun 2017 | A1 |
20170176747 | Vallius et al. | Jun 2017 | A1 |
20170178408 | Bavor, Jr. et al. | Jun 2017 | A1 |
20170188021 | Lo et al. | Jun 2017 | A1 |
20170193679 | Wu et al. | Jul 2017 | A1 |
20170206713 | Lo et al. | Jul 2017 | A1 |
20170208318 | Passmore et al. | Jul 2017 | A1 |
20170212717 | Zhang | Jul 2017 | A1 |
20170220134 | Burns | Aug 2017 | A1 |
20170221264 | Perry | Aug 2017 | A1 |
20170236320 | Gribetz et al. | Aug 2017 | A1 |
20170236332 | Kipman et al. | Aug 2017 | A1 |
20170237789 | Harner et al. | Aug 2017 | A1 |
20170242249 | Wall et al. | Aug 2017 | A1 |
20170255011 | Son et al. | Sep 2017 | A1 |
20170262046 | Clement et al. | Sep 2017 | A1 |
20170262047 | Saito | Sep 2017 | A1 |
20170270841 | An et al. | Sep 2017 | A1 |
20170277256 | Burns et al. | Sep 2017 | A1 |
20170285344 | Benko et al. | Oct 2017 | A1 |
20170293144 | Cakmakci et al. | Oct 2017 | A1 |
20170305083 | Smith et al. | Oct 2017 | A1 |
20170316607 | Khalid et al. | Nov 2017 | A1 |
20170323416 | Finnila | Nov 2017 | A1 |
20170323482 | Coup et al. | Nov 2017 | A1 |
20170336863 | Tilton et al. | Nov 2017 | A1 |
20170337737 | Edwards et al. | Nov 2017 | A1 |
20170345198 | Magpuri et al. | Nov 2017 | A1 |
20170352226 | Matsuzawa et al. | Dec 2017 | A1 |
20170363872 | Border et al. | Dec 2017 | A1 |
20170363949 | Valente et al. | Dec 2017 | A1 |
20170364145 | Blum et al. | Dec 2017 | A1 |
20180003962 | Urey et al. | Jan 2018 | A1 |
20180018515 | Spizhevoy et al. | Jan 2018 | A1 |
20180024370 | Carollo et al. | Jan 2018 | A1 |
20180032101 | Jiang | Feb 2018 | A1 |
20180033199 | Eatedali et al. | Feb 2018 | A9 |
20180052501 | Jones, Jr. | Feb 2018 | A1 |
20180059715 | Chen et al. | Mar 2018 | A1 |
20180059776 | Jiang et al. | Mar 2018 | A1 |
20180095498 | Raffle et al. | Apr 2018 | A1 |
20180098056 | Bohn | Apr 2018 | A1 |
20180104601 | Wagner | Apr 2018 | A1 |
20180164594 | Lee et al. | Jun 2018 | A1 |
20180196262 | Cage | Jul 2018 | A1 |
20180196512 | Kim | Jul 2018 | A1 |
20180203240 | Jones et al. | Jul 2018 | A1 |
20180293041 | Harviainen | Oct 2018 | A1 |
20190082170 | Akahori | Mar 2019 | A1 |
20190094554 | Benesh et al. | Mar 2019 | A1 |
20190171023 | Carlvik | Jun 2019 | A1 |
20190318706 | Peng et al. | Oct 2019 | A1 |
20190333480 | Lang | Oct 2019 | A1 |
20190349576 | Yildiz | Nov 2019 | A1 |
20200341269 | Mills | Oct 2020 | A1 |
Foreign Patent Documents
Number | Date | Country |
---|---|---|
1643501 | Jun 2013 | EP |
2834718 | Oct 2018 | EP |
2562758 | Nov 2018 | GB |
2012141461 | Jul 2012 | JP |
5790187 | Oct 2015 | JP |
5801401 | Oct 2015 | JP |
2015228050 | Dec 2015 | JP |
5913346 | Apr 2016 | JP |
2016528942 | Sep 2016 | JP |
2017522911 | Aug 2017 | JP |
6191929 | Sep 2017 | JP |
6216100 | Oct 2017 | JP |
6237000 | Nov 2017 | JP |
2017532825 | Nov 2017 | JP |
6248227 | Dec 2017 | JP |
100630762 | Sep 2006 | KR |
WO-2008044569 | Apr 2008 | WO |
WO-2014106041 | Jul 2014 | WO |
WO-2018213727 | Nov 2018 | WO |
Other Publications
Entry |
---|
PCT/US2020/013159 Invitation to Pay Additional Fees, Apr. 28, 2020. |
U.S. Appl. No. 16/738,908, filed Jan. 9, 2020, Patrick John Goergen. |
U.S. Appl. No. 16/738,788, filed Jan. 9, 2020, Andrew Brian Raij. |
U.S. Appl. No. 16/738,917, filed Jan. 9, 2020, Douglas Evan Goodner. |
Prior Publication Data
Number | Date | Country |
---|---|---|
20200226838 A1 | Jul 2020 | US |
Related U.S. Application Data: Provisional Application
Number | Date | Country |
---|---|---|
62791735 | Jan 2019 | US |