The subject matter disclosed herein relates to amusement park attractions, and more specifically, to providing interactive experiences in amusement park ride attractions.
Amusement parks or theme parks may include various entertainment attractions that provide enjoyment to guests of the amusement parks. For example, the attractions may include a ride attraction (e.g., closed-loop track, dark ride, thrill ride, or other similar ride). In such ride attractions, motion of the ride vehicle typically consists of programmed movements, or the ride vehicle may include features (e.g., various buttons and knobs) that provide a passenger with varying levels of control over the ride vehicle. It is now recognized that there is a need for an improved ride attraction that provides enhanced passenger control over the ride vehicle to create a more interactive ride experience.
Certain embodiments commensurate in scope with the present disclosure are summarized below. These embodiments are not intended to limit the scope of the disclosure, but rather these embodiments are intended only to provide a brief summary of possible forms of present embodiments. Indeed, present embodiments may encompass a variety of forms that may be similar to or different from the embodiments set forth below.
In an embodiment, a ride system for an amusement park includes a ride vehicle configured to accommodate a rider and configured to travel along a ride path, a head mounted display configured to be worn by the rider, and a control system. The control system is configured to display a virtual instruction to the rider via the head mounted display, receive a signal from a user input device associated with the rider, determine that the signal received from the user input device corresponds to the virtual instruction, and trigger a movement of the ride vehicle in response to determining that the signal corresponds to the virtual instruction.
In an embodiment, a method of providing entertainment in an amusement park includes generating, via a computer graphics generation system of a control system communicatively coupled to a ride vehicle, a gaming environment, wherein the gaming environment includes a plurality of augmented reality (AR) objects, wherein the ride vehicle is configured to carry one or more riders and to travel along a ride path, displaying the gaming environment via one or more head mounted displays associated with each of the one or more riders, receiving, via the control system, one or more signals from one or more user input devices associated with each of the one or more riders, and triggering, via the control system, movement of the ride vehicle based at least in part on the signals received from the one or more user input devices associated with each of the one or more riders, wherein the one or more user input devices include at least one steering device and at least one marking device.
In an embodiment, a ride and game control system is integrated with a ride vehicle configured to carry a rider along a ride track. The ride and game control system includes a game controller configured to receive first input signals from a steering user input device associated with the ride vehicle, receive second input signals from at least one marking user input device associated with the ride vehicle, determine that the first input signals correspond to a virtual instruction, output, in response to determining that the first input signals correspond to the virtual instruction, a first control signal indicative of an instruction to cause a first movement of the ride vehicle, determine that the second input signals correspond to an interaction with an augmented reality (AR) object, and output, in response to determining that the second input signals correspond to the interaction with the AR object, a second control signal indicative of an instruction to cause a second movement of the ride vehicle. The ride and game control system also includes a ride controller communicatively coupled to the game controller and configured to receive the first control signal and the second control signal, and configured to control movement of the ride vehicle based at least in part on the first control signal and the second control signal.
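For illustration only, the following Python sketch models the bifurcated signal flow summarized above, in which a game controller evaluates steering and marking inputs and a ride controller carries out the resulting movements. All class names, method names, and movement labels are hypothetical assumptions, not part of the disclosed system.

```python
# Hypothetical sketch of the two-controller, two-input-path control flow.
from dataclasses import dataclass

@dataclass
class Instruction:
    direction: str  # e.g., "left" or "right", the virtual instruction shown

    def matches(self, steering_direction: str) -> bool:
        return steering_direction == self.direction

class RideController:
    """Receives control signals; a real system would actuate motor/brake/steering."""
    def apply(self, movement: str) -> None:
        print(f"ride vehicle movement: {movement}")

class GameController:
    def __init__(self, ride: RideController):
        self.ride = ride

    def on_steering_input(self, steering_direction: str, instr: Instruction) -> None:
        # First input signals: trigger a first movement when the steering
        # input corresponds to the virtual instruction.
        if instr.matches(steering_direction):
            self.ride.apply("increase velocity")

    def on_marking_input(self, hit_ar_object: bool) -> None:
        # Second input signals: trigger a second movement when the marking
        # input corresponds to an interaction with an AR object.
        if hit_ar_object:
            self.ride.apply("spin")

game = GameController(RideController())
game.on_steering_input("left", Instruction("left"))  # -> increase velocity
game.on_marking_input(True)                          # -> spin
```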
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
One or more specific embodiments of the present disclosure will be described below. In an effort to provide a concise description of these embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
Present embodiments relate to systems and methods of providing an enhanced experience for riders of a ride attraction, such as passengers on a closed-loop track, dark ride, or other similar ride. A ride and game control system associated with a ride vehicle of the ride attraction may provide an augmented reality (AR) environment by way of head mounted displays worn by the riders, or other suitable displays. The AR environment may be combined with other off-board displays and effects to create an overall ride environment. The AR environment may be presented as, or as part of, a game or gameplay environment that the riders may interact with throughout the ride. To provide a more interactive ride experience and to provide varied ride experiences between subsequent visits, the ride and game control system may provide divided (e.g., bifurcated) control over movement of the ride vehicle by the riders. That is, movement of the ride vehicle along and about a track (e.g., path) of the ride attraction may be controlled based at least in part on direct input from one or more user input devices, such as steering of the ride vehicle via a steering wheel user input device and/or changing a speed of the ride vehicle by depressing one or more pedals (e.g., accelerator pedal, brake pedal). Movement of the ride vehicle may also be controlled by the ride and game control system based at least in part on the riders' performance in interacting with the AR environment (e.g., as part of a game). For example, driving instructions may be presented via the AR environment to the rider steering the ride vehicle, and following or not following the driving instructions may cause different resulting movements of the ride vehicle. As another example, interactions by the riders with objects in the AR environment via the user input devices, such as releasing (e.g., shooting, throwing, or ejecting) an AR projectile (e.g., shell, ball, or other item) to mark or collide with a target object, may cause certain movements of the ride vehicle. Therefore, movement of the ride vehicle may be controlled at some times by a driver, at some times by the other riders, or at some times by both the driver and the other riders. The ride vehicle may enable the driver to drive the ride vehicle using the steering wheel user input device at certain times, and then the ride vehicle may override the driver control or supplement the driver control based on the driver's performance (e.g., following instructions) and/or based on the riders' performance (e.g., marking objects of the AR environment with AR projectiles) at other times to affect movement of the ride vehicle. Furthermore, the driver may retain steering control of the ride vehicle throughout some or all of the ride, but the speed of the ride vehicle may change based on the driver's performance and/or the riders' performance during at least some portions of the ride. Control of the movement of the ride vehicle by the riders and the division of such control may provide a more interactive and varied ride experience. Additionally, in an embodiment, the ride and game control system may provide varied ride experiences via a weighting process based on a score or assessed skill level of the riders of the ride vehicle based on their driving and/or interaction with the objects of the AR environment.
With the foregoing in mind,
The track 18 may be a simple track or a controlled path, in which the movement of the ride vehicle 20 may be limited or controlled via an electronic system, a magnetic system, or other similar system. As the ride vehicle 20 moves along the track 18, the track 18 may define linear movement of the ride vehicle 20. The movement of the ride vehicle 20 may be controlled by a control system (e.g., a ride and game control system 24) of the ride vehicle 20, which may include multiple control systems. As well as causing linear movement of the ride vehicle 20 along the track 18, the ride and game control system 24 may cause other motion of the ride vehicle 20, such as rotating, rocking, spinning, vibrating, pivoting, and other similar motions. Additionally, the ride and game control system 24 may provide an augmented reality (AR) environment 25 of AR graphics including AR objects 26 presented to the riders 22 via head mounted displays 28 worn by the riders 22 throughout the duration of the ride attraction 12. The ride and game control system 24 may further coordinate presentation of the AR objects 26 via the head mounted displays 28 and/or the movement of the ride vehicle 20 with other off-board effects, such as visual and/or sound presentations, provided by way of a show effect system 30 that may include a projection game computer, display devices (e.g., projection display devices, digital display devices), lighting systems, and sound effect devices (e.g., speakers) disposed along the track 18.
In operation, the ride attraction 12 may be presented as a game or gameplay interaction between the riders 22 of the ride vehicle 20, the AR environment 25 (e.g., gaming environment), including the AR objects 26, and/or one or more other ride vehicles 20 of the ride attraction 12. The AR objects 26 may include objects, characters, and instructions to the riders. In some embodiments, the game of the ride attraction 12 may be presented as a race between the ride vehicles 20 and/or between characters presented via the AR environment 25. As the ride vehicle 20 moves along the track 18, one rider 22 may control certain direct movements of the ride vehicle 20, such as steering and turning the ride vehicle 20, via a user input device 32. The user input device 32 may be communicatively coupled to the ride and game control system 24, which may cause movement of the ride vehicle 20 based at least in part on signals received from the user input device 32. Various AR objects 26 (including AR instructions) may be presented to the riders 22 throughout the duration of the ride attraction 12 via the ride and game control system 24 and the head mounted displays 28. Each rider may have control of more than one user input device 32, and the user input devices 32 may include a steering user input device 34, a marking user input device 36, or other user input devices 38, such as an accelerator user input device. The riders 22 may interact with the AR objects 26 presented via the head mounted displays 28, such as following instructions presented via the AR objects 26 or interacting with the AR objects 26 via the marking user input devices 36 (e.g., marking the AR objects 26 with AR projectiles). Such interaction with the AR environment 25 by the riders 22 may also trigger or affect movement of the ride vehicle 20 caused by the ride and game control system 24. As such, the ride and game control system 24 may provide divided control of the movement of the ride vehicle 20 triggered by direct user input from the riders 22 (e.g., steering wheel) and by virtual interaction of the riders 22 with the AR environment 25 (e.g., game environment), as discussed in greater detail with reference to
Additionally, the ride vehicle 20 may include multiple user input devices 32, such that each rider 22 is provided with one or more user input devices 32. For example, each rider 22 may be provided with a user input device 32 used to drive the ride vehicle 20, such as the steering user input device 34 and/or the other user input devices 38 (e.g., an accelerator user input device), and a separate user input device 32 used to interact with the AR objects 26, such as the marking user input device 36 that may be used to simulate marking the AR object 26 with AR projectiles or a user input device used to simulate grabbing AR reward objects. The ride and game control system 24 may activate and deactivate control of the user input devices 32 throughout the duration of the ride attraction 12. As such, each rider 22 may have a chance to operate different user input devices 32 to interact with the ride attraction 12 and the AR environment 25 during each operation of the ride attraction 12. Further, the ride and game control system 24 may rotate control of certain user input devices 32 between the riders 22 in the ride vehicle 20, as sketched below. For example, each rider 22 may be provided with the steering user input device 34 and the marking user input device 36. The ride and game control system 24 may activate the steering user input device 34 for one rider 22, while activating the marking user input devices 36 for the other riders 22 within the ride vehicle 20. After a predetermined amount of time, a length of track, a number of interactions, or other metric, the ride and game control system 24 may rotate the steering control to another rider 22, thereby deactivating the steering user input device 34 for the previous driving rider 22 and activating the marking user input device 36 for the previous driving rider 22. Such rotation of control of the types of user input devices 32 throughout the duration of the ride attraction 12 may further provide a more interactive ride experience that may vary during subsequent rides of the ride attraction 12.
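A minimal sketch of one way such rotation of control might be modeled is shown below; the rider labels, device labels, and queue-based scheme are assumptions made for illustration only.

```python
# Hypothetical sketch: rotate steering control among riders after a metric
# (time, track length, interaction count) is met; all names are invented.
from collections import deque

class ControlRotationSketch:
    def __init__(self, riders):
        self.order = deque(riders)

    def active_assignments(self):
        riders = list(self.order)
        # The rider at the front of the queue steers; the marking devices
        # are activated for everyone else.
        assignments = {riders[0]: "steering user input device"}
        for rider in riders[1:]:
            assignments[rider] = "marking user input device"
        return assignments

    def rotate(self):
        # Deactivate steering for the previous driver and activate it for
        # the next rider, e.g., after a predetermined time or track length.
        self.order.rotate(-1)

rotation = ControlRotationSketch(["rider_1", "rider_2", "rider_3"])
print(rotation.active_assignments())  # rider_1 steers
rotation.rotate()
print(rotation.active_assignments())  # rider_2 steers
```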
As previously discussed, the ride and game control system 24 may provide divided control of the movement of the ride vehicle 20 to the riders 22 based at least in part on input signals received from the user input device(s) 32 related to steering of the ride vehicle 20, as well as based at least in part on the interactions of the riders 22 with the AR environment 25, which may be presented as a gameplay setting.
The ride controller 42 of the ride and game control system 24 may be a programmable logic controller (PLC) or other suitable control device. The ride controller 42 may include a processor 58 (e.g., a general-purpose processor, a system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration) operatively coupled to a memory 60 (e.g., a tangible non-transitory computer-readable medium and/or other storage device) to execute instructions for tracking operational parameters of the ride vehicle 20 and instructions for causing movement of the ride vehicle 20. As such, the ride controller 42 may be configured to track operational parameters of the ride vehicle 20 including, but not limited to, position, yaw, pitch, roll, and velocity of the ride vehicle 20, and input control state (e.g., input provided by one or more of the riders 22 to steer and/or drive the ride vehicle 20). Additionally, the ride controller 42 may be configured to control or change physical operation of the ride vehicle 20 based on the input signals received from the user input devices 32 and/or the game controller 44 (e.g., to change the position, yaw, pitch, roll, and velocity of the ride vehicle 20). Based at least in part on the input signals received from the user input devices 32 and/or the game controller 44, the ride controller 42 may output one or more control signals to a motor 62, a brake 64, and/or a steering system 65 of the ride vehicle 20 indicative of an instruction to perform the input movement, such as turning the ride vehicle 20, changing the speed of the ride vehicle 20, rotating the ride vehicle 20, or other suitable movement.
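The following sketch loosely models how a ride controller of this kind might map input movements onto the motor 62, brake 64, and steering system 65. The command names, state variables, and API are invented for illustration and do not represent an actual PLC implementation.

```python
# Hypothetical sketch of a ride controller dispatching movement commands
# to actuators; numerals in comments follow the reference numerals above.
class RideControllerSketch:
    def __init__(self):
        # Tracked operational parameters of the ride vehicle.
        self.state = {"position": 0.0, "yaw": 0.0, "pitch": 0.0,
                      "roll": 0.0, "velocity": 0.0}

    def handle(self, command: str, value: float) -> str:
        if command == "turn":        # -> steering system 65
            self.state["yaw"] += value
            return f"steering_system: turn {value} deg"
        if command == "accelerate":  # -> motor 62
            self.state["velocity"] += value
            return f"motor: velocity +{value}"
        if command == "brake":       # -> brake 64
            self.state["velocity"] = max(0.0, self.state["velocity"] - value)
            return f"brake: velocity -{value}"
        return "no-op"

ctrl = RideControllerSketch()
print(ctrl.handle("accelerate", 2.0))
print(ctrl.handle("turn", 15.0))
```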
The game controller 44 of the ride and game control system 24 may be a programmable logic controller (PLC) or other suitable control device. The game controller 44 may include a processor 66 (e.g., a general-purpose processor, a system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration) operatively coupled to a memory 68 (e.g., a tangible non-transitory computer-readable medium and/or other storage device) to execute instructions stored in the memory 68. The game controller 44 may be configured to provide operational parameters or information relating to the ride vehicle 20 to the one or more game systems 48 and/or the ride controller 42. The operational parameters or information may include, but are not limited to, position, yaw, pitch, roll, and velocity of the ride vehicle 20. Additionally, the game controller 44 may be configured to determine and output signals to the one or more game systems 48 and/or the ride controller 42 indicative of how input signals received from the user input devices 32 should affect the movement of the ride vehicle 20. Thus, the game controller 44 may output instruction signals to the ride controller 42 indicative of particular movements associated with the input signals received via the user input devices 32 (e.g., the steering user input device 34, the marking user input device(s) 36, and the other user input devices 38). The game controller 44 may also coordinate the movement instructions output to the ride controller 42 with control signals and parameters output to the one or more game systems 48 indicative of how the ride vehicle 20 will move, in order to coordinate the AR environment 25 presented to the riders 22 via the head mounted displays 28 with the movement of the ride vehicle 20.
The monitoring system 46 may include any suitable sensors and/or computing systems disposed on or integrated with the ride vehicle 20 to track the positions, locations, orientations, presences, and so forth of the riders 22 and/or the position, location, or orientation of the ride vehicle 20. Such sensors may include orientation and position sensors (e.g., accelerometers, magnetometers, gyroscopes, Global Positioning System [GPS] receivers), motion tracking sensors (e.g., electromagnetic and solid-state motion tracking sensors), inertial measurement units (IMU), presence sensors, and others. The information obtained by the monitoring system 46 may be provided to the game controller 44, the one or more game systems 48, and/or the ride controller 42 for determining each rider's gaze direction, viewing perspective, field of view, interaction with the game, and so forth. In an embodiment, the monitoring system 46 may also receive data obtained by the head mounted display 28 indicative of the respective rider's gaze direction, viewing perspective, field of view, interaction with the game, and so forth (e.g., position and orientation data of the head mounted display 28).
The one or more game systems 48 may be central processing units (CPUs) or other suitable systems, and may generally be configured to render virtual or augmented graphics for overlay onto real-world environmental views. As such, the one or more game systems 48 may generate the AR environment 25 with which the riders 22 may interact in the gameplay setting. The one or more game systems 48 may also be responsible for game logic and for running simulations of real-world ride vehicles and stage geometry for the placement of virtual objects in real space. In certain embodiments, the one or more game systems 48 are configured to provide AR and/or gameplay experiences to the riders 22 via the head mounted displays 28. In particular, each seat or position of the ride vehicle 20 may include a dedicated game system 48. In an embodiment, the one or more game systems 48 may be communicatively coupled to one another, such that the riders 22 may engage in a shared game (e.g., a game having multiple players). The one or more game systems 48 may be communicatively coupled (directly or indirectly) to the ride controller 42, the game controller 44, the monitoring system 46, and the show effect system 30. Each of the one or more game systems 48 may include the user input device 32 or a group of multiple user input devices 32 and a computer graphics generation system 70. The user input device 32 may be communicatively coupled to the computer graphics generation system 70, and the computer graphics generation system 70 may be communicatively coupled to the respective head mounted display 28 (e.g., via the communication network 40).
The user input device(s) 32 may include one or more user input devices (e.g., handheld controllers, joysticks, push buttons) disposed on the ride vehicle 20 to enable the respective rider 22 to provide inputs to the ride controller 42, the game controller 44, and/or the one or more game systems 48 for gameplay and to control movement of the ride vehicle 20, such as to change the velocity and/or direction of travel of the ride vehicle 20. For example, as previously discussed, the steering user input device 34 may include a steering wheel or other device to allow the rider 22 to steer the ride vehicle 20. The user input device(s) 32 may also include the marking user input device 36 to allow the rider 22 to interact with the AR objects 26 of the AR environment 25. The user input device(s) 32 may include any other type of input device configured to allow the riders 22 to interact with the AR environment 25 (e.g., game environment) and/or to directly control operation of the ride vehicle 20, such as the accelerator user input device. Additionally, the user input device(s) 32 may be configured to allow different actions and/or effects to be applied in the AR environment 25. For example, the user input device(s) 32 may allow the riders 22 to control the AR objects 26 (e.g., character, object) of the AR environment 25 in different directions (e.g., up, down, left, right). In an embodiment, the user input device(s) 32 may also include a display screen and/or a touch screen to enable ride and game related information to be communicated to the riders 22, such as information related to which user input device(s) 32 are currently activated for each rider 22 and/or gameplay instructions.
The computer graphics generation system 70 may generate and transmit AR graphics (e.g., the AR environment 25 including the AR objects 26) to be displayed on the respective head mounted display 28, such that the respective rider 22 may visualize the AR environment 25 (e.g., game environment). The computer graphics generation system 70 includes processing circuitry, such as a processor 72 (e.g., general purpose processor or other processor) and a memory 74, and may process data useful in generating the AR environment 25 for the respective rider 22. The data useful in generating the AR environment 25 may include, but is not limited to, real-time data received from the respective head mounted display 28, the user input device(s) 32, and the game controller 44 (e.g., including data from the ride controller 42 and the monitoring system 46), and data stored in the memory 74.
The computer graphics generation system 70 may use such data to generate a frame of reference to register the AR environment 25 to the real-world environment of the ride attraction 12, for example, to generated real-world images or to the actual physical environment. Specifically, in certain embodiments, using the frame of reference generated based on orientation data, position data, point of view data, motion tracking data, and so forth, the computer graphics generation system 70 may render a view of the AR environment 25 in a manner that is temporally and spatially commensurate with what the respective rider 22 would perceive if not wearing the head mounted display 28. The computer graphics generation system 70 may store a model of the ride attraction 12 that is built using spatial information of the real-world physical features of the ride attraction 12, including the themed environment (e.g., physical scenery of the ride attraction 12). The model may be used, together with other inputs, such as inputs from the ride controller 42, the game controller 44, the monitoring system 46, and/or the head mounted display 28, to locate the respective rider 22 and determine the rider's gaze direction and/or field of view. The model may be used to provide display signals to the head mounted display 28 that are dynamically updated as the rider 22 travels along the track 18.
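As a simplified, hypothetical illustration of such registration, the sketch below transforms a world-space AR object into a rider's head-space frame using only a reported position and yaw; a real system would use the full orientation, point of view, and motion tracking data described above.

```python
# Toy sketch (our own construction): register a world-space AR object to the
# rider's frame of reference from tracked position and yaw.
import math

def world_to_head(point, head_pos, head_yaw_deg):
    """Rotate/translate a world-space 2-D point into head-space."""
    dx = point[0] - head_pos[0]
    dy = point[1] - head_pos[1]
    yaw = math.radians(-head_yaw_deg)  # undo the head's rotation
    return (dx * math.cos(yaw) - dy * math.sin(yaw),
            dx * math.sin(yaw) + dy * math.cos(yaw))

# An AR object 3 m north of a rider whose head is turned 90 degrees:
print(world_to_head((0.0, 3.0), (0.0, 0.0), 90.0))
```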
For example, the computer graphics generation system 70 may selectively generate AR graphics (e.g., the AR objects 26, including instruction objects) of the AR environment 25 to reflect changes in the respective rider's orientation, position, gaze direction, field of view, motion, and so forth. The computer graphics generation system 70 may selectively generate the AR environment 25 based on data indicative of the position, yaw, velocity, and/or other operational parameters of the ride vehicle 20 received from the monitoring system 46, the game controller 44, and/or the ride controller 42. The computer graphics generation system 70 may also selectively generate the AR graphics to reflect changes in inputs provided by the respective rider 22 using the user input device(s) 32. Furthermore, the computer graphics generation system 70 may generate the AR graphics based on simulated interactions that may cause the AR objects 26 to be affected according to certain predetermined or modeled responses stored by the computer graphics generation system 70 (e.g., in the memory 74). As an example, the predetermined or modeled responses may be implemented by a physics engine or similar module as a part of the computer graphics generation system 70. In certain embodiments, the computer graphics generation system 70 may track the information or data set forth above corresponding to a plurality of riders 22 in a shared game, such that the riders 22 in the shared game may see the game effects applied by other riders 22 (e.g., players) in the shared game.
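As a toy illustration of such predetermined or modeled responses, the lookup below maps interaction pairs to canned outcomes. The pairs and outcomes are invented, and a real physics engine would compute responses dynamically rather than look them up.

```python
# Hypothetical sketch of stored, predetermined responses to simulated
# interactions; all keys and outcomes are invented examples.
MODELED_RESPONSES = {
    ("projectile", "target"): "target topples over",
    ("projectile", "character"): "character dodges",
}

def simulate_interaction(source: str, ar_object: str) -> str:
    # Fall back to a default response for unmodeled interaction pairs.
    return MODELED_RESPONSES.get((source, ar_object), "no visible effect")

print(simulate_interaction("projectile", "target"))
```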
Additionally, the computer graphics generation system 70 may receive input signals from the ride controller 42 and/or the game controller 44 indicative of the movement of the ride vehicle 20, such that the computer graphics generation system 70 may generate the AR environment 25 based at least in part on how the ride vehicle 20 is moving along or about the track 18. That is, the game controller 44 may determine, based at least in part on the inputs received from the user input devices 32 (e.g., the steering user input device 34, the marking user input device 36, and the other user input devices 38) indicative of the interaction of the riders 22 with the AR environment 25, how the ride vehicle 20 should move in response to the inputs, and the game controller 44 may output signals to the computer graphics generation system 70 indicative of how the ride vehicle 20 will move in response to the rider interaction with the AR environment 25. As such, the computer graphics generation system 70 may generate the AR environment 25 based at least in part on how the ride vehicle 20 is being caused to move by the riders 22. Further, the computer graphics generation system 70 may receive signals from the ride controller 42 indicative of certain direct user inputs received from the steering user input device 34, such as steering of the ride vehicle 20, such that the computer graphics generation system 70 may further generate the AR environment 25 based at least in part on how the ride vehicle 20 is being caused to move by the riders 22.
The head mounted display 28 may include a processor 86 and a memory 88 (e.g., a tangible non-transitory computer-readable medium). The processor 86 and the memory 88 may be configured to allow the head mounted display 28 to function as a display (e.g., to receive signals from the computer graphics generation system 70 that ultimately drive the display). The processor 86 may be a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration.
The head mounted display 28 may include a tracking system 90 that may include orientation and/or position sensors, such as accelerometers, magnetometers, gyroscopes, GPS receivers, motion tracking sensors, electromagnetic and solid-state motion tracking sensors, IMUs, presence sensors, and others. The tracking system 90 may collect real-time data indicative of the rider's position, orientation, focal length, gaze direction, field of view, motion, or any combination thereof. The head mounted display 28 may include a communication interface 92 (e.g., including a wireless transceiver) that may transmit the real-time data captured via the tracking system 90 to the processor 86 and/or the computer graphics generation system 70 for processing. The communication interface 92 may also allow the head mounted display 28 to receive the display signal transmitted by the computer graphics generation system 70.
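One hypothetical shape for such a real-time tracking sample is sketched below; the field names, units, and values are assumptions for illustration, not the disclosed data format.

```python
# Hypothetical sketch of the real-time data the tracking system 90 might
# transmit via the communication interface 92; fields are invented.
from dataclasses import dataclass, asdict

@dataclass
class TrackingSample:
    position: tuple        # (x, y, z) of the head mounted display, meters
    orientation: tuple     # (yaw, pitch, roll) in degrees
    gaze_direction: tuple  # unit vector of the rider's gaze
    timestamp: float       # seconds, for temporal registration

sample = TrackingSample((1.2, 0.0, 1.6), (90.0, 0.0, 0.0), (0.0, 1.0, 0.0), 12.5)
print(asdict(sample))  # the payload the communication interface would carry
```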
The electronic eyeglasses 80 of the head mounted display 28 may include one or more displays 94. The one or more displays 94 may include a see-through display surface onto which images are projected, such as a see-through liquid crystal display (LCD), a see-through organic light emitting diode (OLED) display, or other similar display useful in displaying the real world and the AR graphical images to the rider 22. For example, the rider 22 may view the AR graphics appearing on the respective displays 94 as an overlay to the actual and physical real world environment. In accordance with the present embodiments, the head mounted display 28 may receive, via the communication interface 92, the display signal (e.g., AR graphics together with the respective overlay information, such as spatial and/or temporal information with respect to the one or more displays 94), such that the head mounted display 28 may process and overlay, via the processor 86, the AR graphics on the one or more displays 94 so that the rider 22 perceives that the AR graphics of the AR environment 25 are integrated into the real world environment. In an embodiment, the head mounted display 28 may include one or more sound devices (e.g., earphones, speakers, microphones).
To illustrate driver control of movement of the ride vehicle 20,
In the illustrated embodiment, the ride vehicle 20 is traveling along the track 18 in the direction 106. The riders 22 may be viewing the AR environment 25 (e.g., game environment) in combination with themed attractions 108 (e.g., real-world scenery of the ride attraction 12) via the head mounted displays 28. At certain points during the ride attraction 12, one rider 22 may be the driving rider 100. The driving rider 100 may be the rider 22 whose respective steering user input device 34 is currently activated. As previously discussed, the ride and game control system 24 may rotate control of the steering user input device 34 between the riders 22 throughout the duration of the ride attraction 12, such that each rider 22 of the ride vehicle 20 may have an opportunity to steer the ride vehicle 20. The steering user input device 34 may output signals to the ride and game control system 24 (e.g., to the ride controller 42, the game controller 44, and/or the respective game system 48) to directly control movement of the ride vehicle 20. While traveling along the track 18, the driving rider 100 may be presented with an AR object 26, or physical object, that is an instruction object 104 via the respective head mounted display 28. The instruction object 104 may present driving or game instructions to the driving rider 100, such as instructions to turn a certain direction, to increase or decrease speed, to follow a certain track at a track split, or to collide with or avoid certain AR objects 26 of the AR environment 25. The instruction object 104 indicating driving instructions may be presented only to the driving rider 100, or may be presented to all of the riders 22 of the ride vehicle 20. Additionally, while the instruction object 104 is shown as an arrow in the illustrated embodiment, the instruction object 104 may be text, a symbol, a graphic, a character, or any other object that may be used to indicate to the driving rider 100 a direction to steer or control the ride vehicle 20.
As an example, in the illustrated embodiment, the driving rider 100 controlling the steering user input device 34 is presented with the instruction object 104 via the respective head mounted display 28, combined with the themed attractions 108, within the field-of-view 110. The instruction object 104, in the illustrated embodiment, is presented as an arrow indicating a turning direction to follow at an upcoming track split 112. The track 18 splits at the track split 112 such that the driver may continue on a current path by staying on a track 114 or may turn the ride vehicle 20 using the steering user input device 34 to follow a track 116, as indicated by the instruction object 104. If the driving rider 100 turns the ride vehicle 20 toward the track 116 using the steering user input device 34, the ride vehicle 20 may be controlled by the ride controller 42 and/or the game controller 44 of the ride and game control system 24 to turn in the direction of the track 116. As such, the driving rider 100 is able to directly control movement of the ride vehicle 20 via the steering user input device 34.
Additionally, the driving rider 100 may control movement of the ride vehicle 20 by following the instruction objects 104. As such, correctly following the indication of the presented instruction object 104 may cause additional movement of the ride vehicle 20 and/or may add to a score of the driving rider 100, or a score of the ride vehicle 20, which may cause additional movement of the ride vehicle 20 to be triggered by the ride and game control system 24. For example, in the illustrated embodiment, if the driving rider 100 follows the instruction object 104 by using the steering user input device 34 to turn the ride vehicle 20 toward the track 116, not only will the ride vehicle 20 be caused to turn toward the track 116 based at least in part on signals received from the steering user input device 34, but additional movement of the ride vehicle 20 may be triggered by correctly following the instruction object 104 and/or increasing the driving score. The movement triggered by following the instruction object 104 may include any movement additional to the direct movement initiated by input signals from the steering user input device 34, such as a change in velocity of the ride vehicle 20 (e.g., from a first non-zero velocity to a second non-zero velocity), starting the ride vehicle 20, stopping the ride vehicle 20, or other suitable movement. For example, the movement triggered by following the instruction object 104 may be to cause the ride vehicle 20 to move faster to provide a more exciting ride or to cause the ride vehicle 20 to move slower to allow the other riders 22 more time to interact with the AR objects 26, thereby allowing the other riders 22 and/or the ride vehicle 20 to collect more points. As such, the driving rider 100 is able to indirectly control movement of the ride vehicle 20 by following the presented instruction object 104. Additionally, following the instruction object 104 may cause the AR environment 25 to change, such that the other riders 22 may be presented with more AR objects 26.
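A sketch of this score-mediated triggering, with invented point values and an invented threshold, might look as follows.

```python
# Hypothetical sketch: correctly following an instruction adds to the score
# and may trigger additional movement once a score threshold is exceeded.
SCORE_THRESHOLD = 100  # invented value

def on_instruction_followed(vehicle_score: int):
    vehicle_score += 25  # invented award for following the instruction object
    movements = ["change velocity"]  # movement triggered by the instruction
    if vehicle_score > SCORE_THRESHOLD:
        movements.append("bonus spin")  # movement triggered by the score
    return vehicle_score, movements

score, moves = on_instruction_followed(90)
print(score, moves)  # 115 ['change velocity', 'bonus spin']
```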
In an embodiment, in addition to triggering movement of the ride vehicle 20 by following the instruction object 104 and/or increasing the score of the driving rider 100, or the score of the ride vehicle 20, the driving rider 100 may cause movement of the ride vehicle 20 by not following, or incorrectly following, the instruction object 104. Such movement may include spinning of the ride vehicle 20, a decrease in velocity of the ride vehicle 20, or other suitable movement. For example, in the illustrated embodiment, if the driving rider 100 does not follow the instruction object 104 (e.g., within a time period following display of the instruction and/or prior to the track split 112) indicating to steer the ride vehicle 20 toward the track 116, the ride and game control system 24 may cause the ride vehicle 20 to spin and/or change velocity based at least in part on the missed instruction object 104. For example, the movement triggered by not following, or missing, the instruction object 104 may be to cause the ride vehicle 20 to move faster to make it more difficult for the other riders 22 to earn points or to move slower to make the game easier. As such, in an embodiment, the driving rider 100 is also able to indirectly cause movement of the ride vehicle 20 by not following the presented instruction object 104. Additionally, missing the instruction object 104 may cause the AR environment 25 to change, such that the other riders 22 may be presented with fewer or different AR objects 26. In an embodiment, a missed instruction object 104 may result in a decrease in the rider score or ride vehicle score, which may in turn trigger movement of the ride vehicle 20 as well.
In an embodiment, the game controller 44 and/or the respective game system 48 of the ride and game control system 24 may include a weighting system that may cause particular AR objects 26, including the instruction object 104, to be presented to each rider 22, or ride vehicle 20, corresponding to a skill level of the rider 22 or the ride vehicle 20 determined by the weighting system. The ride and game control system 24 may monitor driving and/or interactions with the AR environment 25, including the instruction object 104, of each rider 22 and/or ride vehicle 20 for a determined period of time at the beginning of the ride attraction 12. Based at least in part on the monitored driving and/or interactions with the AR environment 25, the ride and game control system 24 may determine a starting skill level of the rider 22 or the ride vehicle 20. A subsequent scene or predetermined interaction may be generated by the computer graphics generation system 70 and presented to the rider(s) 22 based on the determined skill level.
As such, each rider 22 or ride vehicle 20 may be presented with the AR environment 25, including the instruction objects 104, corresponding to the determined skill level, such that each rider 22 or ride vehicle 20 may be presented with different AR objects at the same interaction area along the ride attraction 12. The weighting system of the ride and game control system 24 may determine and update the skill level of the rider 22 and/or the ride vehicle 20 at each predetermined interaction area, thus allowing the AR environment 25 to correspond to the skill level of the rider 22 and/or all riders 22 of the ride vehicle 20, collectively, as the ride vehicle 20 travels through the ride attraction 12. Therefore, the instruction object 104 presented to the driving rider 100 may be based at least in part on the current determined skill level of the driving rider 100 and/or the current determined skill level of the riders 22 of the ride vehicle 20 as a whole. Additionally, the types of movements of the ride vehicle 20 triggered directly and indirectly by inputs received from the steering user input device 34 may be based at least in part on the current determined skill level of the driving rider 100 and/or the current determined skill level of the riders 22 of the ride vehicle 20 as a whole. For example, if the driving rider 100 is determined to have a relatively high skill level, the instruction object 104 may appear closer to the track split 112 and/or following the instruction correctly may cause the velocity of the ride vehicle 20 to change to a greater extent compared to lower skill levels.
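The sketch below illustrates one possible form of such a weighting process, re-estimating skill from observed performance at each interaction area and scaling both the instruction placement and the resulting motion; the blending factor, distances, and scaling values are all invented assumptions.

```python
# Hypothetical sketch of a skill-weighting process; numbers are invented.
def update_skill(skill: float, successes: int, attempts: int) -> float:
    """Re-estimate skill (0..1) at each predetermined interaction area."""
    if attempts == 0:
        return skill
    # Blend the prior estimate with performance observed at this area.
    return 0.7 * skill + 0.3 * (successes / attempts)

def interaction_parameters(skill: float) -> dict:
    return {
        # Higher skill: the instruction object appears closer to the split...
        "instruction_distance_m": 30.0 * (1.0 - skill) + 5.0,
        # ...and a correct response changes velocity to a greater extent.
        "velocity_change_scale": 1.0 + skill,
    }

skill = update_skill(0.5, successes=4, attempts=5)
print(interaction_parameters(skill))
```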
Similarly,
In the illustrated embodiment, the ride vehicle 20 is traveling along the track 18 in the direction 106. The riders 22 may be viewing the AR environment 25 in combination with the themed attractions 108 (e.g., real-world scenery of the ride attraction 12) via the head mounted displays 28. While traveling along the track 18, the marking rider(s) 130 may be presented with AR objects 26, such as objects, rewards, and/or characters, within their field-of-view 134. The marking rider 130 may mark or otherwise interact with the AR objects 26 using AR projectiles 136, as in the illustrated embodiment. Such interactions with the AR objects 26 of the AR environment 25 may indirectly affect or trigger movement of the ride vehicle 20 via the ride and game control system 24. Interactions with the AR objects 26 may also increase the rider score or ride vehicle score, which in turn may affect or trigger movement of the ride vehicle 20. For example, the marking rider 130 may mark the presented AR objects 26 with AR projectiles 136. If the marking rider 130 marks or hits the AR object 26, or wins a game with the AR object 26 (e.g., a character), the ride and game control system 24 may trigger certain movements of the ride vehicle 20, such as changing the velocity of the ride vehicle 20, spinning the ride vehicle 20, shaking the ride vehicle 20 or components of the ride vehicle 20 (e.g., seats), or other suitable movement. Additionally, achieving certain rider scores or ride vehicle scores may also trigger such movements of the ride vehicle 20. The game controller 44 and/or the respective game system 48 may receive input signals from the marking user input device 36 and, based at least in part on the signals received, may output control signals to the ride controller 42 to cause certain movement of the ride vehicle 20. As such, the marking rider(s) 130, or riders 22 not currently in control of the steering user input device 34, are able to indirectly control movement of the ride vehicle 20 by interacting with the presented AR objects 26.
As previously discussed, in an embodiment, the game controller 44 and/or the respective game system 48 of the ride and game control system 24 may include or act as a weighting system that may cause particular AR objects 26 to be presented to each rider 22, or ride vehicle 20, corresponding to a skill level of the rider 22 or the ride vehicle 20 determined by the weighting system. The ride and game control system 24 may monitor driving and/or interactions with the AR environment 25 of each rider 22 and/or ride vehicle 20 for a determined period of time at the beginning of the ride attraction 12. Based at least in part on the monitored driving and/or interactions with the AR environment 25, the ride and game control system 24 may determine a starting skill level of the rider 22 or the ride vehicle 20. A subsequent scene or predetermined interaction may be generated by the computer graphics generation system 70 and presented to the rider(s) 22 based on the determined skill level.
As such, each rider 22 or ride vehicle 20 may be presented with the AR environment 25 corresponding to the determined skill level, such that each rider 22 or ride vehicle 20 may be presented with different AR objects 26 at the same interaction area along the ride attraction 12. The weighting system of the ride and game control system 24 may determine and update the skill level of the rider 22 and/or the ride vehicle 20 at each predetermined interaction area, thus allowing the AR environment 25 to correspond to the skill level of the rider 22 and/or all riders 22 of the ride vehicle 20, collectively, as the ride vehicle 20 travels through the ride attraction 12. Therefore, the AR objects 26 presented to the marking rider 130 may be based at least in part on the current determined skill level of the marking rider 130 and/or the current determined skill level of the riders 22 of the ride vehicle 20 as a whole. Additionally, the types of movements of the ride vehicle 20 triggered indirectly by inputs received from the marking user input device 36 may be based at least in part on the current determined skill level of the marking rider 130 and/or the current determined skill level of the riders 22 of the ride vehicle 20 as a whole. For example, if the marking rider 130 is determined to have a relatively high skill level, a difficulty level of the interaction may be relatively high, such as harder-to-hit AR objects 26, and hitting or marking the AR objects 26 may cause the velocity of the ride vehicle 20 to change to a greater degree compared to lower skill levels.
It should be understood that both
As previously discussed, movement of the ride vehicle 20 may be controlled by both the driving rider 100 and the other riders 22 (e.g., the marking riders 130). Steering of the ride vehicle 20 may be controlled by the driving rider 100 via the steering user input device, and additionally, in an embodiment, velocity of the ride vehicle 20 may be controlled by the driving rider 100 via the other user input devices 38, such as the accelerator user input device or brake user input device. Movement of the ride vehicle 20, such as steering and/or velocity of the ride vehicle 20, may also be controlled based on a performance of the driving rider 100 and/or the other riders 22 (e.g., the marking riders 130) relative to the AR environment 25 or game. With that in mind,
Next, the ride and game control system 24 may trigger direct movement of the ride vehicle 20 based at least in part on the received steering input, such as turning the ride vehicle 20 in the direction of the steering input, via the ride controller 42 and/or the game controller 44 (block 156). Next, the ride and game control system 24 may determine, via the game controller 44 and/or the respective game system 48, whether the steering inputs received from the steering user input device 34 correspond to the indicated instruction presented to the driving rider 100 via the instruction object 104 (block 158). For example, the ride and game control system 24 may determine whether the driver appropriately turned the steering wheel (e.g., steering user input device 34) and/or operated other inputs, such as the accelerator user input device, within a time period following presentation of the instruction object 104. If the ride and game control system 24 determines that the received steering input corresponds to the instruction object 104, the ride and game control system 24 may trigger additional movement of the ride vehicle 20, such as a change in velocity of the ride vehicle 20 (block 160). Additionally or alternatively, movement of the ride vehicle 20 may be affected or triggered based at least in part on the rider score and/or ride vehicle score, such that movements are triggered when the rider score and/or ride vehicle score exceeds a threshold score. In such embodiments, the rider score and/or ride vehicle score may be increased in response to the driver correctly following the instruction object 104.
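A minimal sketch of the block 158 determination, assuming a simple direction match within an invented time window, is shown below.

```python
# Hypothetical sketch of block 158: did the steering input correspond to the
# presented instruction within the allowed time window? Values are invented.
def steering_matches_instruction(steering_direction: str,
                                 instruction_direction: str,
                                 input_time: float,
                                 instruction_time: float,
                                 window_s: float = 3.0) -> bool:
    in_time = (input_time - instruction_time) <= window_s
    return in_time and steering_direction == instruction_direction

# Driver turned left 1.8 s after the arrow appeared:
print(steering_matches_instruction("left", "left", 11.8, 10.0))   # True
print(steering_matches_instruction("right", "left", 11.8, 10.0))  # False
```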
If the ride and game control system 24 determines that the received steering input does not correspond to the instruction object 104, the ride and game control system 24 may trigger different movement of the ride vehicle 20, such as a change in velocity of the ride vehicle 20 and/or spinning of the ride vehicle 20 (block 162). In an embodiment, movement may not be triggered if the ride and game control system 24 determines that the received steering input does not correspond to the presented instruction object 104. It should be understood that the method 150 may be an iterative or repeating process that is performed throughout the duration of the ride attraction 12 to trigger movement of the ride vehicle 20.
Next, the ride and game control system 24 may determine, via the game controller 44 and/or the respective game system 48, whether the received signals indicative of the interaction of the marking rider(s) 130 with the AR objects 26 correspond to hits of the AR objects 26 or a win of a simulated game (block 176). If the ride and game control system 24 determines that the interaction corresponds to marks or hits of the AR objects 26 or a win of a simulated game, the ride and game control system 24 may trigger movement of the ride vehicle 20, such as a change in velocity of the ride vehicle 20 (block 178). Additionally or alternatively, movement of the ride vehicle 20 based at least in part on the interaction of the marking rider(s) 130 with the AR environment 25 may be triggered based at least in part on the rider score and/or ride vehicle score, such that additional movements are triggered when the rider score and/or ride vehicle score exceed a threshold score. In such embodiments, the rider score and/or ride vehicle score may be increased based on hits or wins. If the ride and game control system 24 determines that the interaction does not correspond to marks or hits of the AR objects 26 or a win of a simulated game, the ride and game control system 24 may trigger movement of the ride vehicle 20, such as changing the velocity of the ride vehicle 20, or may not trigger additional movement of the ride vehicle 20.
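The block 176 determination might be sketched as a simple proximity test between the marked location and the AR object, as below; the 2-D geometry and tolerance are invented simplifications of whatever hit logic the game systems would actually run.

```python
# Hypothetical sketch of blocks 176/178: count marking inputs as hits of AR
# objects and trigger a movement when any hit occurs; geometry is simplified.
import math

def is_hit(shot_xy, target_xy, tolerance=0.5):
    # Invented tolerance: a shot within 0.5 units of a target counts as a mark.
    return math.dist(shot_xy, target_xy) <= tolerance

def process_marking_inputs(shots, targets):
    hits = sum(1 for s in shots for t in targets if is_hit(s, t))
    if hits:
        return f"trigger movement: change velocity ({hits} hit(s))"
    return "no additional movement triggered"

print(process_marking_inputs([(1.0, 2.0)], [(1.2, 2.1), (5.0, 5.0)]))
```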
It should be understood that the methods 150 and 170 may be performed independently or together throughout the duration of the ride attraction 12. Performance of the methods 150 and 170 may alternate or occur at different portions of the ride attraction 12. Therefore, movement of the ride vehicle 20 may be controlled at some times by the driving rider 100, at some times by the other riders 22 (e.g., the marking riders 130), or at some times by both the driving rider 100 and the marking riders 130. Further, as each rider 22 may be in control of at least one user input device 32 at a time, the methods 150 and 170 may be continuously performed for each rider 22 and each user input device 32 throughout the duration of the ride attraction 12 to create divided control of the movement of the ride vehicle 20 by the riders 22 and their interaction with the AR environment 25 (e.g., the game environment).
While only certain features of the present embodiments have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the present disclosure. Further, it should be understood that certain elements of the disclosed embodiments may be combined or exchanged with one another.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function] . . . ” or “step for [perform]ing [a function] . . . ”, it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).
This application claims priority to and benefit of U.S. Provisional Patent Application No. 62/467,817, entitled “SYSTEMS AND METHODS FOR DIGITAL OVERLAY IN AN AMUSEMENT PARK ENVIRONMENT,” filed Mar. 6, 2017, which is hereby incorporated by reference in its entirety for all purposes.