The subject matter disclosed herein relates to amusement park attractions, and more specifically, to providing augmented experiences in amusement park attractions.
Amusement parks or theme parks may include various entertainment attractions that provide enjoyment to guests (e.g., families and/or people of all ages) of the amusement parks. For example, the attractions may include a ride attraction (e.g., closed-loop track, dark ride, thriller ride, or other similar ride), and there may be themed environments along the ride that may be traditionally established using equipment, furniture, building layouts, props, decorations, and so forth. Depending on the complexity of the themed environments, setting up and replacing a themed environment may prove very difficult and time-consuming. It may also be very difficult to set up a themed environment that is entertaining for all passengers on the ride; the same themed environment may be appealing to some passengers, but not others.
In addition, due to different motion paths and/or different view perspectives of the passengers in the ride vehicle, it may be difficult to provide the same ride experience to all passengers. For example, passengers sitting in the front row may have a better view, and thus a more immersive ride experience, than passengers in the back row. It is now recognized that it is desirable to include attractions in which attraction themes may be changed, or in which certain themed features may be included or removed, in a flexible and efficient manner relative to traditional techniques. It is also now recognized that it may be desirable to provide an immersive and more personalized or customized ride experience for all passengers.
Certain embodiments commensurate in scope with the present disclosure are summarized below. These embodiments are not intended to limit the scope of the disclosure, but rather these embodiments are intended only to provide a brief summary of possible forms of present embodiments. Indeed, present embodiments may encompass a variety of forms that may be similar to or different from the embodiments set forth below.
In one embodiment, a ride system includes a ride vehicle configured to accommodate a passenger and configured to travel along a ride path during a ride in an amusement park, a head mounted display connected to the ride vehicle and configured to be worn by the passenger, and a ride and game control system integrated with the ride vehicle and configured to coordinate a ride experience of the passenger with events occurring during the ride using at least the head mounted display and the ride vehicle. The ride and game control system includes a user interface configured to receive inputs from the passenger, a ride controller configured to control physical operation of the ride vehicle based at least on data relating to operational parameters of the ride vehicle and the inputs from the passenger, and a monitoring system configured to monitor at least a position and an orientation of the head mounted display, a position and an orientation of the ride vehicle, or a combination thereof. The ride and game control system also includes a computer graphics generation system communicatively coupled to the head mounted display and configured to selectively generate augmented reality (AR) features for display on the head mounted display based on data received from the ride controller, the monitoring system, the user interface, or any combination thereof.
In another embodiment, a method includes receiving and analyzing data via a ride and game control system integrated with a ride vehicle configured to accommodate one or more passengers, wherein the ride vehicle is configured to travel along a ride path during a ride in an amusement park, and wherein the data relates to the one or more passengers, or to the ride vehicle, or both. The method includes generating, via a computer graphics generation system of the ride and game control system, gaming effects based on the received and analyzed data, wherein the gaming effects comprise augmented reality (AR) graphics. The method also includes transmitting, via the computer graphics generation system, the AR graphics for display on a head mounted display configured to be worn by a respective passenger of the one or more passengers in the ride vehicle.
In another embodiment, a ride and game control system physically integrated with a ride vehicle configured to carry a passenger along a ride path includes a ride controller configured to receive real-time data related to an input control state from at least a user interface, and configured to receive data related to operational parameters of the ride vehicle. The ride and game control system includes a game controller communicatively coupled to the ride controller and configured to push updates relating to the input control state and at least one of the operational parameters to features of the ride vehicle. The ride and game control system also includes a computer graphics generation system communicatively coupled to the game controller and to a head mounted display configured to be worn by the passenger, wherein the computer graphics generation system is configured to selectively render augmented reality (AR) graphics for display on the head mounted display based at least on the updates from the game controller.
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
One or more specific embodiments of the present disclosure will be described below. In an effort to provide a concise description of these embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
Present embodiments relate to systems and methods of providing an enhanced experience for guests visiting a themed attraction, such as passengers on a ride (e.g., closed-loop track, dark ride, or other similar ride). The enhanced experience may be provided as the passengers travel along a ride path in an amusement park or a theme park. In particular, a ride and game control system associated with an attraction may provide gaming effects resulting in an augmented reality (AR) experience by way of head mounted displays. The ride and game control system may provide other entertainment experiences using show effect systems (e.g., projection display devices, digital display devices, sound systems, lighting systems, etc.) disposed along the ride path. To provide more personalized and immersive AR experiences, the ride and game control system may selectively provide an AR experience based on a passenger's viewing interest and/or engagement in gameplay associated with the ride. As one example, the ride and game control system may determine which real-world features (e.g., aspects of the physical amusement park attractions) the passenger is looking at, may provide the AR experience according to the passenger's interest, and may enable the passenger to interact (e.g., engage, grab, select, target, move, etc.) with AR features associated with the real-world features.
As another example, the ride and game control system may dynamically change perspectives of AR features, such that the passenger's visualization experience is not blocked or hindered by other passengers and/or real-world objects (e.g., by bringing the gaming and/or show effects right in front of the passenger's eyes). As another example, the ride and game control system may dynamically provide and/or update the AR features corresponding to game effects engaged by another passenger. Furthermore, the ride and game control system may provide a real-time adaptive AR experience based on passengers' reactions and/or engagements. For example, presence and/or content of the AR features may be updated or changed in real-time based on how many passengers are looking, whether passengers appear to be enjoying the AR experience, whether passengers are engaging in AR-based gameplay, and so forth.
While present embodiments may be implemented in a variety of settings, an example setting in which a ride and game control system 10 is used in an amusement park 12 is shown schematically in
While the passengers may find the ride 14 to be a very enjoyable experience, in certain embodiments, it may be useful to enhance their ride experience. Specifically, instead of having a physical view of only the scenery, the ride experience provided to the passengers 24, 26, 28, and 30 may be enhanced with gaming effects, including an augmented reality (AR) experience, by way of head mounted displays 32. For example, as the ride vehicle 22 travels along the tracks 20, the ride and game control system 10 may coordinate AR images or features, such as AR objects 36, to be shown to the passengers 24, 26, 28, and 30 on their respective head mounted displays 32. In addition, the ride and game control system 10 may coordinate off-board (e.g., off the ride vehicle 22) entertainment to enhance the ride experience provided to the passengers 24, 26, 28, and 30. For example, as the ride vehicle 22 travels along the tracks 20, the ride and game control system 10 may coordinate visual and/or sound presentations provided by way of a show effect system 34 that may include a projection game computer, display devices (e.g., projection display devices, digital display devices), lighting systems, and sound effect devices (e.g., speakers) disposed along the tracks 20.
In some embodiments, the ride experience provided to the passengers 24, 26, 28, and 30 may be enhanced with game play including the AR experience. For example, certain embodiments of the ride 14 may involve passenger interaction with the AR images or features, for example, simulated interaction with the AR objects 36 as the ride vehicle 22 passes by or through the AR objects 36. Certain embodiments of the ride vehicle 22 may be user-controlled, and one aspect of a game may be to interact with various AR objects 36 by directing the ride vehicle 22 toward the AR objects 36 and/or to avoid colliding with certain AR objects 36 by steering away from them. The simulated interaction may cause the AR objects 36 to be affected according to certain predetermined or modeled responses stored by the ride and game control system 10. As an example, the predetermined or modeled responses may be implemented by a physics model, engine, or similar module implemented by the ride and game control system 10.
Further, the ride and game control system 10 may coordinate these gaming effects including the on-board (e.g., on the ride vehicle 22) AR and the off-board (e.g., off the ride vehicle 22) entertainment experience set forth above to collectively enhance the ride experience of the passengers 24, 26, 28, and 30. For example, the ride and game control system 10 may be communicatively coupled to a server 38 (e.g., a remote server, an on-site server, or an outboard server) via a wireless communication network (e.g., wireless local area networks [WLAN], wireless wide area networks [WWAN], near field communication [NFC]). The server 38 may be a master game server that coordinates the on-board and off-board experiences. In some embodiments, the server 38 may store and/or process user information of the passengers 24, 26, 28, and 30, such that the ride and game control system 10 may provide personalized gaming effects to the passengers based on the user information. The user information may include any suitable information provided by or authorized by the users, such as payment information, membership information, personal information (e.g., age, height, special needs, etc.), gaming information (e.g., information about the video game associated with the themed attractions 16, information about a particular character the user is associated with in the video game, information about game history of the user), and so forth.
The ride controller 42 may be a programmable logic controller (PLC), or other suitable control device. The ride controller 42 may include a processor (e.g., a general-purpose processor, a system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration) operatively coupled to a memory (e.g., a tangible non-transitory computer-readable medium and/or other storage device) to execute instructions for tracking operational parameters or information of the ride vehicle 22. For example, the operational parameters or information of the ride vehicle 22 may include, but is not limited to, position (e.g., with precision level in the range of millimeters), yaw, pitch, roll, and velocity of the ride vehicle 22, and input control state (e.g., input provided by one or more of the passengers to drive the ride vehicle 22). In some embodiments, the ride controller 42 may also be configured to control or change physical operation of the ride vehicle 22 based on the input control state provided by one or more of the passengers (e.g., to change the position, yaw, pitch, roll, and velocity of the ride vehicle 22).
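Purely as a non-limiting sketch of the kind of state such a ride controller might track, the operational parameters and input control state described above can be represented by a small data structure. The names RideVehicleState and apply_passenger_input, the field units, and the dictionary-based input control state are illustrative assumptions and not the disclosed controller implementation.

```python
from dataclasses import dataclass, field

@dataclass
class RideVehicleState:
    """Illustrative container for operational parameters tracked by a ride controller."""
    position_mm: tuple = (0.0, 0.0, 0.0)   # position, millimeter-level precision
    yaw_deg: float = 0.0
    pitch_deg: float = 0.0
    roll_deg: float = 0.0
    velocity_mps: float = 0.0
    input_control_state: dict = field(default_factory=dict)  # e.g., {"steer": -0.3, "throttle": 0.8}

def apply_passenger_input(state: RideVehicleState, steer: float, throttle: float) -> RideVehicleState:
    """Record passenger input so physical operation of the vehicle can be changed accordingly."""
    state.input_control_state.update({"steer": steer, "throttle": throttle})
    return state
```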
The game controller 44 may be a programmable logic controller (PLC), or other suitable control device. The game controller 44 may include a processor (e.g., a general-purpose processor, a system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration) operatively coupled to a memory (e.g., a tangible non-transitory computer-readable medium and/or other storage device) to execute instructions stored in the memory. The game controller 44 may be configured to provide operational parameters or information relating to the ride vehicle 22 (e.g., a summed game state of the ride vehicle 22) to the one or more game systems 48 and/or the server 38. The operational parameters or information may include, but is not limited to, position, yaw, pitch, roll, and velocity of the ride vehicle 22. In some embodiments, the game controller 44 may transmit the operational parameters or information via user datagram protocol (UDP) to the one or more game systems 48 and/or the server 38. It may be appreciated that data transmission via UDP may incur less delay, making it an appealing choice for delay-sensitive applications, such as the presently disclosed techniques of generating the personalized immersive ride experience.
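A minimal sketch of such a UDP push, assuming a JSON-encoded summed game state and a hypothetical receiver address (neither of which is specified by the present disclosure), might take the following form.

```python
import json
import socket

# Hypothetical endpoint for a game system or master game server; illustrative only.
GAME_SYSTEM_ADDR = ("192.0.2.10", 7777)

def push_game_state(state: dict) -> None:
    """Send a summed game state as a single UDP datagram; UDP is connectionless,
    which keeps latency low at the cost of possible loss or reordering."""
    payload = json.dumps(state).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, GAME_SYSTEM_ADDR)

push_game_state({"position_mm": [12000, 450, 0], "yaw_deg": 3.5, "pitch_deg": 0.2,
                 "roll_deg": 0.0, "velocity_mps": 4.2, "input": {"steer": -0.3}})
```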
The monitoring system 46 may include any suitable sensors and/or computing systems disposed on or integrated with the ride vehicle 22 to track the positions, locations, orientations, presences, and so forth of the passengers (e.g., the passengers 24, 26, 28, and 30) and/or the position, location, or orientation of the ride vehicle 22. Such sensors may include orientation and position sensors (e.g., accelerometers, magnetometers, gyroscopes, Global Positioning System [GPS] receivers), motion tracking sensors (e.g., electromagnetic and solid-state motion tracking sensors), inertial measurement units (IMU), presence sensors, and others. The information obtained by the monitoring system 46 may be useful in determining each passenger's gaze direction, viewing perspective, field of view, viewing interest, interaction with the game, and so forth. In some embodiments, the monitoring system 46 may also receive data obtained by the head mounted display 32 indicative of the respective passenger's gaze direction, viewing perspective, field of view, viewing interest, interaction with the game, and so forth (e.g., position and orientation data of the head mounted display 32).
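As a simplified sketch of how orientation data from such sensors could be turned into a gaze direction, consider the following; the axis conventions and the function name gaze_direction are assumptions for illustration, not part of the disclosed monitoring system.

```python
import math

def gaze_direction(yaw_deg: float, pitch_deg: float) -> tuple:
    """Convert head yaw/pitch (degrees) into a unit gaze vector in a world frame where
    x points forward, y points left, and z points up (assumed axis convention)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))

# A passenger looking 30 degrees to the left and 10 degrees upward:
forward = gaze_direction(30.0, 10.0)
```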
The one or more game systems 48, generally, may be configured to render virtual or augmented graphics for overlay onto real-world environmental views. The one or more game systems 48 may also be responsible for game logic and for running simulations of real-world ride vehicles and stage geometry for the placement of virtual objects in real space. In certain embodiments, the one or more game systems 48 are configured to provide AR and/or game play experiences to the passengers (e.g., the passengers 24, 26, 28, and 30). In particular, each seat of the ride vehicle 22 may include a dedicated game system 48. In some embodiments, the one or more game systems 48 may be communicatively coupled to one another, such that the passengers may engage in a shared game (e.g., a game having multiple players). The one or more game systems 48 may be communicatively coupled (directly or indirectly) to the game controller 44, the monitoring system 46, the server 38, and the show effect system 34. Each of the one or more game systems 48 may include a user interface 50 and a computer graphics generation system 52. The user interface 50 may be communicatively coupled to the computer graphics generation system 52, and the computer graphics generation system 52 may be communicatively coupled to a respective head mounted display 32 (e.g., via the communication network 40).
The user interface 50 may include one or more user input devices (e.g., handheld controllers, joysticks, push buttons) disposed on the ride vehicle 22 to enable the passenger to provide inputs. For example, the user interface 50 may be configured to enable different actions and/or effects to be applied in the AR environment. For example, the user interface 50 may enable the passenger to control a character or an object of the AR features in different directions (e.g., up, down, left, right) in the AR environment. By way of a more specific example, the user interface 50 may enable the passenger to make selections or grab/release objects of the AR features in the AR environment. In some embodiments, the user interface 50 may enable the passenger to control operation of the ride vehicle 22, such as changing its velocity and/or direction. In some embodiments, the user interface 50 may also include a display screen and/or a touch screen to enable ride and game related information to be communicated to the passenger.
The computer graphics generation system 52 may generate and transmit AR graphics to be displayed on the respective head mounted display 32, such that the respective passenger may enjoy the immersive ride experience enhanced by the AR experience. The computer graphics generation system 52 includes processing circuitry, such as a processor 54 (e.g., general purpose processor or other processor) and a memory 56, and may process data useful in generating an AR experience for the respective passenger. The data useful in generating the AR experience may include, but is not limited to, real-time data received from the head mounted display 32, the user interface 50, and the game controller 44 (e.g., including data from the ride controller 42, the monitoring system 46, the server 38), and data stored in the memory 56.
The computer graphics generation system 52 may use such data to generate a frame of reference to register AR graphics to the real-world environment, for example to the generated real-world images or to the actual physical environment. Specifically, in certain embodiments, using the frame of reference generated based on orientation data, position data, point of view data, motion tracking data, and so forth, the computer graphics generation system 52 may render a view of the AR graphics in a manner that is temporally and spatially commensurate with what the respective passenger would perceive if not wearing the head mounted display 32. The computer graphics generation system 52 may store a model of the ride 14 that is built using spatial information of the real-world physical features of the ride 14 including the themed environment. The model is used, together with other inputs, e.g., from the ride controller 42, the game controller 44, the monitoring system 46, and/or the head mounted display 32, to locate the respective passenger and determine the passenger's gaze direction and/or field of view. The model may be used to provide display signals to the head mounted display 32 that are dynamically updated as the passenger travels along the tracks 20.
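One way to picture the frame-of-reference step, reduced to two dimensions for brevity, is to compose the ride vehicle pose reported by the ride controller with the seat and head pose reported by the monitoring system or head mounted display. The planar simplification and the helper name compose_pose are illustrative assumptions rather than the disclosed registration method.

```python
import math

def compose_pose(vehicle_xy, vehicle_yaw_deg, head_offset_xy, head_yaw_deg):
    """Compose a vehicle pose on the ride path with a seat/head offset to obtain the
    passenger's eye position and gaze heading in the world frame (planar sketch)."""
    vx, vy = vehicle_xy
    yaw = math.radians(vehicle_yaw_deg)
    ox, oy = head_offset_xy
    # Rotate the head offset by the vehicle heading, then translate by the vehicle position.
    eye_x = vx + ox * math.cos(yaw) - oy * math.sin(yaw)
    eye_y = vy + ox * math.sin(yaw) + oy * math.cos(yaw)
    world_gaze_deg = (vehicle_yaw_deg + head_yaw_deg) % 360.0
    return (eye_x, eye_y), world_gaze_deg

# Vehicle at (120.0, 4.5) heading 90 degrees; passenger seated 0.8 m to the left, looking 15 degrees right:
eye, gaze = compose_pose((120.0, 4.5), 90.0, (0.0, 0.8), -15.0)
```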
For example, the computer graphics generation system 52 may selectively generate AR graphics to reflect changes in the respective passenger's orientation, position, gaze direction, field of view, motion, and so forth. The computer graphics generation system 52 may selectively generate AR graphics based on data indicative of the position, yaw, velocity, and/or other operational parameters of the ride vehicle 22 received from the monitoring system 46. The computer graphics generation system 52 may also selectively generate the AR graphics to reflect changes in inputs provided by the respective passenger using the user interface 50. Furthermore, the computer graphics generation system 52 may generate the AR graphics based on simulated interactions that may cause the AR objects to be affected according to certain predetermined or modeled responses stored by the computer graphics generation system 52 (e.g., in the memory 56). As an example, the predetermined or modeled responses may be implemented by a physics engine or similar module, or as a part of the computer graphics generation system 52. In certain embodiments, the computer graphics generation system 52 may track the information or data set forth above corresponding to a plurality of passengers (e.g., the passengers 24, 26, 28, and 30) in a shared game, such that the passengers in the shared game may see the game effects applied by other players in the shared game.
The head mounted display 32 may include a processor 66 and a memory 68 (e.g., a tangible non-transitory computer-readable medium). The processor 66 and the memory 68 may be configured to allow the head mounted display 32 to function as a display (e.g., to receive signals from the computer graphics generation system 52 that ultimately drive the display). The processor 66 may be a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration.
The head mounted display 32 may include a tracking system 70 that may include orientation and/or position sensors, such as accelerometers, magnetometers, gyroscopes, GPS receivers, motion tracking sensors (e.g., electromagnetic and solid-state motion tracking sensors), IMUs, presence sensors, and others. The tracking system 70 may collect real-time data indicative of the passenger's position, orientation, focal length, gaze direction, field of view, motion, or any combination thereof. The head mounted display 32 may include a communication interface 72 (e.g., including a wireless transceiver) that may transmit the real-time data captured via the tracking system 70 to the processor 66 and/or the computer graphics generation system 52 for processing. The communication interface 72 may also enable the head mounted display 32 to receive the display signal transmitted by the computer graphics generation system 52.
The electronic eyeglasses 60 of the head mounted display 32 may include one or more displays 74. The one or more displays 74 may include a see-through display surface onto which images are projected, such as a see-through liquid crystal display (LCD), a see-through organic light emitting diode (OLED) display, or other similar display useful in displaying the real world and the AR graphical images to the passenger 25. For example, the passenger 25 may view the AR graphics appearing on the respective displays 74 as an overlay to the actual and physical real world environment. In accordance with the present embodiments, the head mounted display 32 may receive, via the communication interface 72, the display signal (e.g., AR graphics together with the respective overlay information, such as spatial and/or temporal information with respect to the one or more displays 74), such that the head mounted display 32 may process and overlay, via the processor 66, the AR graphics on the one or more displays 74 so that the passenger 25 perceives that the AR graphics are integrated into the real world environment. In some embodiments, the head mounted display 32 may include one or more sound devices (e.g., earphones, speakers, microphones).
By way of example, the real-time data may include operational parameters or information of the ride vehicle 22 (e.g., from the ride controller 42, the game controller 44, the user interface 50, or a combination thereof) that may include, but is not limited to, position, yaw, pitch, roll, and velocity of the ride vehicle 22. The real-time data may include inputs provided by one or more of the passengers (e.g., from the user interfaces 50) to drive the ride vehicle 22. Input provided by the passengers may include a request to change the position, yaw, pitch, roll, and/or velocity of the ride vehicle 22. The real-time data may include position/orientation data of each passenger (e.g., from the ride controller 42, the game controller 44, the monitoring system 46, the head mounted displays 32). Such data may relate to or otherwise indicate the passenger's position, orientation, gaze direction, field of view, or any combination thereof. The real-time data may include user-input data (e.g., from the user interfaces 50) that may indicate interaction with or control of AR features, such as grabbing or releasing AR objects. As may be appreciated, the real-time data may be useful in determining each passenger's current state during the ride (e.g., the current position and/or orientation of the passenger, what the passenger is looking at, the passenger's interaction with AR features in the surreal environment 64).
The data received and analyzed at block 82 also includes data related to the passengers. For example, the data related to the passengers may include, but is not limited to, the passenger's identification and game history (e.g., from the server 38). As may be appreciated, the data related to the passengers may be useful in determining appealing AR features and/or other appealing visual and/or sound presentations for a respective passenger. In some embodiments, the real-time data and/or the data related to the passengers received by the processor 54 of each game system 48 (at block 82) may be shared among one another, such that the overall ride experience of the passengers (e.g., the passengers 24, 26, 28, and 30) may be coordinated by the ride and game control system 10. As described in more detail herein, the position/orientation data from the monitoring system 46 and/or the head mounted displays 32 may serve to facilitate synchronization between certain perspectives, such as the view perspective of the passenger and the view perspective of another passenger. Such synchronization may be useful for triangulation of passenger position based on the known position of certain fixed features (e.g., certain attraction locations), for timing of certain special effects (e.g., real and/or augmented), for modeling of realistic AR effects (e.g., based on a physics model), and other effects described below.
Furthermore, the data received in accordance with block 82 may not only be useful for directly impacting a single passenger's experience, but also for facilitating and enhancing the physical layout of the real-world and/or simulated environment. For example, the ride and game control system 10 may track historical data, such as locations and views that are the most common across a relatively large sample size (e.g., more than 10, 50, or 100 passengers having corresponding head mounted displays 32), to ensure that passenger notifications are in a conspicuous location. Such data may also be used for selective application of the special effects. For example, the ride and game control system 10 may track the historical data of the user interface 50 and may generate game effects if the passengers are game-players (e.g., if the timing and/or usage of the user interface 50 correspond to the game), and may generate regular show effects if the passengers are not game-players. For example, the ride and game control system 10 may track the passengers' viewing perspectives and may generate special effects (e.g., gaming effects, AR effects) corresponding to the themed attraction if the passengers are looking in the direction of the themed attraction. Additionally, such data may be used to identify locations that are rarely viewed (e.g., based on user view perspectives) to locate areas for hiding certain real-world features (e.g., hiding unsightly cables and so forth, or virtually removing such features, as set forth herein) and/or hiding certain items that may be desirable from an experiential standpoint (e.g., hiding virtual Easter eggs or virtual scavenger hunt items).
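A rough sketch of how such historical view data could be summarized follows, assuming gaze samples have already been bucketed into named scene zones; this bucketing and the helper name summarize_view_history are illustrative simplifications, not the disclosed tracking format.

```python
from collections import Counter

def summarize_view_history(gaze_zone_log, top_n=3):
    """Tally which scene zones passengers looked at across many rides; the most common
    zones are candidates for conspicuous notifications, the rarest for hiding features."""
    counts = Counter(gaze_zone_log)
    most_viewed = counts.most_common(top_n)
    least_viewed = counts.most_common()[:-top_n - 1:-1]
    return most_viewed, least_viewed

most, least = summarize_view_history(
    ["volcano", "bird", "volcano", "tunnel", "volcano", "bird", "tunnel", "sign"])
```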
The process 80 may include generating (block 84) gaming effects based on the real-time data. In some embodiments, generating gaming effects may include generating on-board entertainment including AR graphics to be displayed on the head mounted display 32. For example, the processor 54 of the computer graphics generation system 52 may generate AR graphics and/or video to be displayed on the one or more displays 74 of the head mounted display 32 based on any one or a combination of factors. Such factors may include the passenger's gaze direction and/or field of view, the passenger's usage of the user interface 50, the passenger's game play history, the position and/or orientation of the ride vehicle 22 along the tracks 20 (or other location when the tracks 20 are not present) at any given time point during a cycle of the ride 14, a predetermined distance traveled by the ride vehicle 22 during a cycle of the ride 14, after a predetermined lapse of time, or after one or more actions have been performed by one or more passengers of the ride vehicle 22. In particular, the processor 54 of the computer graphics generation system 52 may generate overlaid AR features using projection, one or more video merging and/or optical merging techniques, and so forth, such that the passenger may view the AR features displayed on the one or more displays 74 of the head mounted display 32 as an overlay of the passenger's view of the actual and physical real-world environment.
The generation of AR graphics may be used for AR addition of objects to an environment, to create games, and to develop personalized experiences for the themed environment. For example, animated features may be virtually added to the passenger's view on the one or more displays 74 of the head mounted display 32. In some embodiments, generation of the AR graphics may be used for AR removal of objects from an environment, such as to hide unsightly objects from the passenger, or so that certain objects (e.g., special effect features) appear to be floating to enhance the realism of a desired visual effect. For example, a bird or similar animated robotic feature may be suspended by a series of poles and wires, and the poles or wires may be actively removed from the environment so that the bird appears to fly through the air. As another example, certain themed areas may be fenced off by construction fences, which are often viewed as a negative experience. The AR features may be used to virtually remove the construction fence (e.g., the fenced area may be overlaid with environmental features, or transformed into a virtual attraction while construction is underway). In still further embodiments, the computer graphics generation system 52 may generate the realistic AR effects set forth above based on a physics model for simulating and/or timing certain special effects.
In some embodiments, generating gaming effects may include generating or initiating off-board entertainment including visual and/or audible effects using the show effect system 34. For example, the processor 54 of the computer graphics generation system 52 may be programmed to control the operation of the show effect system 34 (e.g., the projection game computer, the one or more display devices 35, and the one or more lighting systems 37) to provide real-world show effects to the passenger. Based on the model of the ride 14, the passenger's location/orientation, the gaze direction, the field of view, and other data obtained at block 82, the computer graphics generation system 52 may generate a signal to display scene-specific images or video on the one or more display devices 35 and/or to change operation of the one or more lighting systems 37 (e.g., change lighting effects, such as lighting directions, timing, intensities, colors, and so forth). In some embodiments, for example, the processor 54 may generate the signals to control the display content and timing of the one or more display devices 35 and may generate signals to control the lighting effects based on any one or a combination of factors. Such factors may include the passenger's gaze direction and/or field of view, the passenger's usage of the user interface 50, the passenger's game play history, the position and/or orientation of the ride vehicle 22 along the tracks 20 (or other location when the tracks 20 are not present) at any given time point during a cycle of the ride 14, a predetermined distance traveled by the ride vehicle 22 during a cycle of the ride 14, after a predetermined lapse of time, or after one or more actions have been performed by one or more passengers of the ride vehicle 22.
In still further embodiments, the processor 54 may generate the signal to control the show effect system 34 based on the generated AR graphics, such that the display content on the one or more displays 35, the lighting effects provided by the one or more lighting systems 37, and the AR graphics shown on the respective head mounted display 32 synchronize with one another. Different aspects of generating the gaming effects will be discussed in more detail with respect to
The process 90 may also include performing (block 94) simulations based on the received real-time data. As an example, the processor 54 may use a physics model stored in the memory 56 to simulate realistic AR effects. The physics model may be implemented using suitable software or algorithms describing, simulating, and/or illustrating movements and interactions of objects in reality. The physics model may take into account any one or a combination of factors obtained from the real-time data to simulate the movements and interactions of the AR graphics. Such factors may include the passenger's gaze direction and/or field of view, the passenger's position/orientation or change in position/orientation (e.g., moving speed and direction, rotating speed and direction), among other factors that may influence how the passenger may perceive the simulated AR graphics.
The physics model may also take into account the properties of the interacted AR objects and real-world objects. For example, in some embodiments, the computer graphics generation system 52 may use a computer-aided design or similar three-dimensional representation (e.g., digital three-dimensional model) of the real-world environment surrounding the ride to enhance simulations of interactions between real-world objects and computerized effects shown on the head mounted displays 32. For example, real-world environmental features surrounding the ride may be represented from a geometric and material of construction standpoint so that interactions between the real-world features and AR objects can be accurately simulated. In this way, the position, orientation, material construction, and so forth, of the real-world objects may be stored in the memory 56. Specially-configured software of the computer graphics generation system 52 may allow association of the real-world object representations with AR show effects, AR gameplay, and the physics model (or models) that govern the manner in which AR features are generated and updated.
In this respect, the computer graphics generation system 52 may use the physics model to simulate realistic interactions between real objects and computerized objects (e.g., AR objects and real-world objects). As an example, the computer graphics generation system 52 may accurately simulate a computer-generated ball bouncing off of a real-world object, such as a wall. The computer graphics generation system 52 may have a stored representation of the wall, such as its size, shape, and material construction. The computer graphics generation system 52 would account for these variables in simulating an interaction where the ball strikes the wall, for example by simulating realistic deformation of the ball, simulating realistic rebounding of the ball off of the wall, and so forth. The change in momentum, velocity, trajectory, and so on, simulated for the ball would be governed, at least in part, by the physics model of the computer graphics generation system 52.
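The ball-and-wall interaction described above can be reduced, under simplifying assumptions, to a restitution-based reflection of the ball's velocity about the stored wall normal. The function reflect_off_wall and the scalar restitution coefficient are one possible sketch of such a physics model, not the disclosed implementation.

```python
def reflect_off_wall(velocity, wall_normal, restitution=0.8):
    """Reflect a computer-generated ball's velocity off a stored real-world wall.
    The wall is represented only by its unit normal; restitution < 1 models energy loss."""
    vx, vy, vz = velocity
    nx, ny, nz = wall_normal
    v_dot_n = vx * nx + vy * ny + vz * nz
    if v_dot_n >= 0:  # moving away from the wall; no contact response
        return velocity
    return (vx - (1 + restitution) * v_dot_n * nx,
            vy - (1 + restitution) * v_dot_n * ny,
            vz - (1 + restitution) * v_dot_n * nz)

# Ball travelling toward a wall whose normal points along +x:
new_velocity = reflect_off_wall((-3.0, 1.0, 0.0), (1.0, 0.0, 0.0))
# new_velocity is (2.4, 1.0, 0.0): the normal component rebounds, tangential components persist.
```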
Once the computer graphics generation system 52 has performed the appropriate simulations, the process 90 may include generating (block 96) gaming effects based on the simulation. In particular, the processor 54 may generate gaming effects using the simulated AR graphics in accordance with aspects of the process 80 discussed in block 84 of
It should be noted that for a ride in which there is a known motion profile, for example, a particular path or set of tracks, certain simulations may be pre-loaded into the computer graphics generation system 52 to reduce computing requirements. For example, as the ride vehicle 22 travels along the path 20, the number of orientations, positions, and velocities of the ride passengers is generally limited. In this respect, the computer graphics generation system 52 may perform comparisons between the information relating to a particular passenger and a known motion profile (e.g., a profile for which particular simulations have already been performed). In situations where the motion profile matches the known motion profile, the computer graphics generation system 52 may generate graphics according to the simulation that has already been performed.
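A minimal sketch of such a pre-loaded lookup, assuming simulations are cached against coarse bins of track position and speed (the cache format, bin sizes, and tolerances are illustrative assumptions), could look as follows.

```python
# Hypothetical cache of pre-computed simulation results, keyed by coarse bins of the
# known motion profile (track position in meters, speed in m/s).
PRECOMPUTED = {
    (120, 4): "cached_simulation_a",
    (125, 4): "cached_simulation_b",
}

def lookup_precomputed(track_pos_m, speed_mps, pos_tolerance=2.5, speed_tolerance=1.0):
    """Return a cached simulation whose binned motion profile matches the live data,
    or None if no close match exists and a fresh simulation must be run."""
    key = (round(track_pos_m / 5) * 5, round(speed_mps))
    if key in PRECOMPUTED:
        return PRECOMPUTED[key]
    for (pos, spd), sim in PRECOMPUTED.items():
        if abs(pos - track_pos_m) <= pos_tolerance and abs(spd - speed_mps) <= speed_tolerance:
            return sim
    return None

lookup_precomputed(121.7, 4.2)   # matches the (120, 4) entry
```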
Further still, in certain embodiments, the AR simulation may also be combined with a real-world effect. For example, if an AR object is simulated to strike a movable real-world object, various actuation devices may be triggered to cause the real-world object to move according to the simulated interaction.
In accordance with one embodiment of the present disclosure, the ride and game control system 10 may perform simulations to allow a passenger to experience certain portions of the ride from another perspective. In particular, based on the obtained real-time data (e.g., obtained from the ride controller 42, the game controller 44, the head mounted display 32, and/or the monitoring system 46), the ride and game control system 10 may determine that the field of view 106 of the passenger 26 is at least partially blocked. In response to this determination, the ride and game control system 10 may generate gaming effects to provide an adjusted perspective 108 and display the generated gaming effects on the respective head mounted display 32 of the passenger 26, such that the passenger 26 may have an unobstructed view of the real-world feature 102. For example, the generated gaming effects may include virtual adjustment to the position of the real-world feature 102, such that it is brought in front of the passenger's 26 eyes while the passenger 26 may still see the passenger 28. In some embodiments, the generated gaming effects in the adjusted perspective 108 may include AR features illustrating the perspective of the passenger 28 (e.g., the field of view 104).
The manner in which perspective adjustments are made may depend on a number of factors. One example of a method 110 for dynamically adjusting ride perspective is illustrated as a flow diagram in
The illustrated method 110 also includes synchronizing (block 114) the position and orientation of a particular one of the head mounted displays 32 to the stored virtual ride profile (the known profile). For example, synchronizing in accordance with the acts of block 114 may include synchronizing the timing of the identified gaze of the head mounted display 32 with the stored virtual ride profile. As another example, the synchronization may include synchronizing position and rotation as measured by sensors on the head mounted display 32 with the known ride profile. This allows the computer graphics generation system 52 to compare and identify differences between the view perspectives of the head mounted display 32 with the view perspectives of the known ride profile. Using these differences, appropriate amounts of perspective adjustments may be identified to allow the computer graphics generation system 52 to adjust the augmentations displayed on the head mounted display 32 to correspond more closely to the view perspectives of the known ride profile.
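As a simplified sketch of this synchronization and comparison, assume the known ride profile is stored as (time, yaw) samples; the storage format and the helper names profile_yaw_at and perspective_offset are assumptions, not the disclosed profile representation.

```python
from bisect import bisect_left

def profile_yaw_at(profile, ride_time_s):
    """Linearly interpolate a stored (time_s, yaw_deg) ride profile at the current ride time,
    i.e., synchronize the measured head pose with the known profile in time."""
    times = [t for t, _ in profile]
    i = max(1, min(bisect_left(times, ride_time_s), len(profile) - 1))
    (t0, y0), (t1, y1) = profile[i - 1], profile[i]
    a = 0.0 if t1 == t0 else (ride_time_s - t0) / (t1 - t0)
    return y0 + a * (y1 - y0)

def perspective_offset(measured_yaw_deg, profile_yaw_deg):
    """Signed smallest-angle difference between the measured head yaw and the profile yaw,
    used to gauge how far the live view perspective departs from the known profile."""
    return (measured_yaw_deg - profile_yaw_deg + 180.0) % 360.0 - 180.0

profile = [(0.0, 0.0), (10.0, 45.0), (20.0, 90.0)]
offset = perspective_offset(60.0, profile_yaw_at(profile, 12.5))
```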
To identify whether perspective adjustments are appropriate, the computer graphics generation system 52 may, for example, monitor the aspects of the head mounted displays 32 to identify (block 116) potential gaze obstructions. For example, the motion profile of the ride vehicle 22 and desired view perspectives (e.g., including a view of attraction features) of the passengers may be known. This allows the computer graphics generation system 52 to identify, at a certain time during the ride, that a potential gaze obstruction would prevent one of the passengers from having a full view of a particular attraction feature. As an example, the computer graphics generation system 52 may identify the potential gaze obstruction due to a combination of factors, such as seat position and the presence of other passengers, fixed features of the ride and their known position, and so forth.
In response to identifying potential gaze obstructions or similar situations in which a passenger (a particular head mounted display 32) may not have a full view of attraction features, the computer graphics generation system 52 may dynamically adjust (block 118) the perspective of at least a portion of the ride shown on the head mounted display 32. As an example, the computer graphics generation system 52 may shift a location of a particular virtual augmentation (or locations of multiple augmentations) so that it is in full view of the passenger whose perspective would otherwise be blocked by other passengers or environmental ride elements.
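One simple way to picture the obstruction test and the resulting shift is a planar geometry check; the clearance radius, the sideways offset, and the helper names are illustrative assumptions rather than the disclosed method.

```python
import math

def is_view_blocked(eye, target, obstruction, clearance=0.25):
    """Return True if an obstruction (e.g., another passenger's head position) lies near the
    straight line of sight from the eye to an attraction feature (planar sketch)."""
    ex, ey = eye
    tx, ty = target
    ox, oy = obstruction
    dx, dy = tx - ex, ty - ey
    length_sq = dx * dx + dy * dy
    if length_sq == 0.0:
        return False
    # Project the obstruction onto the sight line and require it to sit between eye and target.
    t = ((ox - ex) * dx + (oy - ey) * dy) / length_sq
    if not 0.0 < t < 1.0:
        return False
    px, py = ex + t * dx, ey + t * dy  # closest point on the sight line
    return math.hypot(ox - px, oy - py) < clearance

def adjust_augmentation(target, offset=(0.0, 0.6)):
    """Shift a virtual augmentation sideways so that it falls back into full view."""
    return (target[0] + offset[0], target[1] + offset[1])

if is_view_blocked((0.0, 0.0), (10.0, 0.0), (1.0, 0.1)):
    new_target = adjust_augmentation((10.0, 0.0))
```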
In some embodiments, the ride and game control system 10 may also change the one or more lighting systems 37 and/or other show effects (e.g., sound effects) based on the rendered AR features. For example, the ride and game control system 10, in response to determining that the passenger 28 has a gaze in the second field of view 124, may change the lighting condition (e.g., lighting intensities, directions, colors, etc.) of the one or more lighting systems 37 to accentuate the rendered AR features 130 (e.g., the flying bird). For example, the ride and game control system 10, in response to determining that the passenger 28 has a gaze in the third field of view 126, may change the lighting condition (e.g., lighting intensities, directions, colors, etc.) of the one or more lighting systems 37 and/or may change sound effects (e.g., provided via sound effect devices of the head mounted display 32 or of the show effect system 34) to reflect the specific scene of the volcano eruption.
In some embodiments, the ride and game control system 10 may render AR features in response to determining that at least a threshold number (e.g., 2, 4, 6, 10 or more) of passengers show viewing interests. For example, the ride and game control system 10 may render the AR features 132 (e.g., erupting volcanos), only in response to determining that at least two passengers (e.g., the passengers 26 and 28) are gazing in the field of view 126. It should be appreciated that the threshold number may be scene specific. For example, the threshold number for the AR features 132 (e.g., erupting volcanos) may be two, and the threshold number for the AR features 130 (e.g., flying bird) may be four.
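A compact sketch of such a scene-specific threshold check, using the illustrative threshold values from this example (the dictionary name and scene labels are assumptions), is shown below.

```python
SCENE_THRESHOLDS = {"erupting_volcano": 2, "flying_bird": 4}  # illustrative, scene-specific thresholds

def should_render(scene, passengers_gazing):
    """Render a scene's AR features only once enough passengers are gazing toward it."""
    return passengers_gazing >= SCENE_THRESHOLDS.get(scene, 1)

should_render("erupting_volcano", 2)   # True
should_render("flying_bird", 3)        # False
```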
Both of the passengers 28 and 30 may gaze in substantially the same direction at a themed attraction 140 (e.g., volcanos) that may include real-world features (e.g., real-world objects or images displayed on the one or more display devices 35) and/or AR features. For example, in response to determining that a field of view 142 of the passenger 28 overlaps a portion of a field of view 144 of the passenger 30, the ride and game control system 10 may render the same AR features to be shown on the respective head mounted displays 32 of the passengers 28 and 30, but from different perspectives. Furthermore, both of the passengers 28 and 30 may see actions 146 (e.g., actions in the AR environment) applied by either of the passengers 28 and 30 on the respective user interface 50 of the game system 48. In the illustrated example, the passenger 28 operates the respective user interface 50 to execute the actions 146, such as shooting a fire ball 148 at volcanos 150. For example, the passenger 28 may adjust operation of the respective user interface 50 to change the flying trajectory of the fire ball 148. Correspondingly, the passenger 30, from the respective head mounted display 32, may see the action 146 (e.g., shooting the fire ball 148 at the volcanos 150) applied by the passenger 28. In some embodiments, the ride and game control system 10 may determine that the passenger 30 is engaged in the same game as the passenger 28 (e.g., the passenger 30 may provide an indication using the user interface 50 to consent to joining the game with the passenger 28), and in response to this determination, the ride and game control system 10 may display the same AR features, including the results of the actions 146, on the respective head mounted displays 32 of the passengers 28 and 30.
While only certain features of the present embodiments have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the present disclosure. Further, it should be understood that certain elements of the disclosed embodiments may be combined or exchanged with one another.
This application claims priority to and benefit of U.S. Provisional Patent Application No. 62/467,817, entitled “SYSTEMS AND METHODS FOR DIGITAL OVERLAY IN AN AMUSEMENT PARK ENVIRONMENT,” filed Mar. 6, 2017, which is herein incorporated by reference in its entirety for all purposes.