The present invention generally relates to gaming systems. More specifically, the system and method of competitively gaming in a mixed reality with multiple players relates to a method for enabling cooperative and competitive drone/radio-controlled (RC) vehicle control. Controls that are normally complex are made accessible to casual fans, and an audience can watch physical drones/RC vehicles interact in real time according to player inputs.
With the growth of industries such as virtual reality gaming, eSports, and aerial drone racing, there is a growing need for the next generation of interactive gameplay. The eSports industry has grown massively, with users and audiences that increasingly want to experience live competitive events, in which users may compete against each other within video games or in hardware competitions such as aerial drone racing.
Unfortunately, each of these categories presents its own set of problems and limitations with respect to the recruitment of new users. A problem with eSports is that eSports are fully virtual and lack the true spectacle of a game like aerial drone racing. Further, eSports equipment can be expensive for an enthusiast to purchase, install, and maintain. On the other hand, aerial drone racing is generally perceived to be inaccessible due to the requirement for users to have a fairly complicated understanding of aeronautics. The barrier to entry for aerial drone racing is further heightened by potential requirements for special licensing, depending on local laws. Furthermore, aerial drone racing uses a traditional handheld controller or mobile device to control the movement of the aerial drone, which detracts from the immersive experience sought by the described invention. What is needed is an accessible combination of virtual reality gaming, eSports, and aerial drone racing. Further desirable is a method by which physical drones or remotely-controlled units may be automatically maintained.
The present invention addresses these issues. The system and method of competitively gaming in a mixed reality with multiple players generally provides a facility in which various types of drones/radio-controlled (RC) vehicles, including, but not limited to, quadcopters, submarines, tanks, and RC cars or trucks, may be stored, maintained, and deployed. Appropriate environments are provided and changed in order to provide variety to the gamers. Users enter a pod equipped with virtual reality headgear and several controls befitting the drone type or game type being played. The virtual reality headgear enables users to control the drone/RC vehicle from a first-person perspective, utilizing cameras strategically dispersed throughout the drone. Haptic feedback further increases user immersion in gaming events. Between uses, drones/RC vehicles are sent through automated maintenance, which allows for cooling, charging, and other standard maintenance for otherwise functional units, and automated part replacement as needed for damaged units. The automated maintenance, coupled with a high drone/RC vehicle inventory, allows for continuous play, as opposed to single-unit aerial drone racing, in which drones generally require maintenance between uses.
All illustrations of the drawings are for the purpose of describing selected versions of the present invention and are not intended to limit the scope of the present invention.
The present invention is a system and method of competitively gaming in a mixed reality with multiple players that provides a system that allows casual enthusiasts to use virtual reality (VR) equipment to engage in drone/radio-controlled (RC) vehicle games, such as races, battle royales, map exploration, and more, as represented in
The overall process followed by the method of the present invention allows for effective and efficient deployment of automated avatars, real-time relay of controls, and display of relevant data. A gameplay is next initialized amongst the player profiles with the central computing device (Step C). The gameplay includes activation and coordination of stimuli within the corresponding control pod with movement of the corresponding automated avatar. Real-time environment data is then continuously captured with each automated avatar during the gameplay (Step D). In this way, the positional coordinates, velocity vectors, video data, audio data, atmospheric data, and more may be utilized by a user in order to make in-game decisions. Next, the real-time environment data of the corresponding automated avatar for each player profile is continuously outputted with the corresponding pod during the gameplay (Step E). Thus, the user is presented with relevant environmental information, as well as information regarding the status of the automated avatar. Next, each player profile is prompted to enter at least one avatar instruction with the corresponding pod during the gameplay (Step F), as represented in
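The relay cycle of Steps D through F described above can be sketched as a simple per-tick loop in which captured environment data is pushed to the player's pod and pending pod inputs are collected for the avatar. The following is a minimal illustrative sketch; the class and field names are assumptions for this example, not terms defined by the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class EnvironmentData:
    position: tuple      # positional coordinates (Step D)
    velocity: tuple      # velocity vector
    video_frame: bytes   # camera feed
    audio_chunk: bytes   # microphone feed

@dataclass
class Avatar:
    avatar_id: str
    env: EnvironmentData

@dataclass
class ControlPod:
    player_id: str
    display_buffer: list = field(default_factory=list)
    pending_instructions: list = field(default_factory=list)

def relay_tick(avatar: Avatar, pod: ControlPod) -> list:
    """One cycle: output environment data to the pod (Step E) and
    collect avatar instructions entered at the pod (Step F)."""
    pod.display_buffer.append(avatar.env)          # Step E: present telemetry
    instructions = pod.pending_instructions[:]     # Step F: read player inputs
    pod.pending_instructions.clear()
    return instructions                            # relayed on to the avatar
```

In an actual deployment this loop would run at a fixed tick rate per player profile, with the central computing device mediating each relay.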
In order to allow a user to experience first person action from a variety of perspectives within the automated avatar, an automated avatar must be equipped with advanced imaging technology. To this end, at least one camera may be provided for each automated avatar, as represented in
Video data may further be supplemented with appropriate artificial visuals in order to enhance gameplay of various types. To this end, at least one piece of video augmentation may be generated in accordance to the gameplay with the central computing device, as represented in
Many game modes require that a user engage with audio data, especially in conjunction with video data. To achieve this, at least one microphone may be provided for each automated avatar, as represented in
Audio data may further be supplemented with appropriate artificial sounds in order to enhance gameplay of various types. To this end, at least one piece of audio augmentation may be generated in accordance to the gameplay with the central computing device, as represented in
Different scenarios may provide opportunities for more advanced immersive feedback from the automated avatar. To enable such feedback, at least one inertial measurement unit (IMU) may be provided with each automated avatar, as represented in
The user experience within a control pod may further be supplemented with appropriate artificial vibrations in order to enhance gameplay of various types. To this end, at least one piece of haptic augmentation may be generated in accordance to the gameplay with the central computing device, as represented in
A user experiencing immersive video, sounds, and haptic feedback requires an appropriate mechanism with which to interact with an automated avatar. To this end, at least one maneuver input device is provided for each control pod, wherein the maneuver input device is configured to receive a plurality of automated avatar-related maneuvers, as represented in
As it is common for various drones and RC units to become damaged or to require service between uses, the present invention requires a method by which to address avatar maintenance. To this end, a computerized maintenance center is provided, wherein the computerized maintenance center is positioned adjacent to the computerized arena, as represented in
Automated avatars must move to the computerized maintenance center in order to receive required maintenance between uses. To this end, each automated avatar may be automatically maneuvered from the computerized arena to the computerized maintenance center by instruction from the central computing device after Step H, as represented in
During use, it is possible for an automated avatar to become too damaged to use, necessitating an exchange of automated avatars. To achieve this, a plurality of alternate automated avatars may be provided, wherein the alternate automated avatars are communicably coupled to the central computing device, wherein each player profile is associated with a corresponding alternate automated avatar from the plurality of alternate automated avatars, and wherein each alternate automated avatar has already gone through either the regular maintenance procedure or the repair procedure, as represented in
Properly-functioning automated avatars may undergo a variety of procedures in order to ensure optimal performance during use. To facilitate this, a portable power source may be provided for each automated avatar, as represented in
The plurality of improperly-functioning automated avatars may require a variety of different maintenance repairs in order to return to functional, game-ready quality. To this end, a detailed diagnosis status is assessed for each automated avatar with the computerized maintenance center during the repair procedure, as represented in
A user of the present invention may have appropriate equipment available at their place of residence and may therefore want to engage with a control pod remotely from a dedicated facility. To this end, at least one external personal computing (PC) device may be provided, wherein the external PC device is communicably coupled to the central computing device, as represented in
It may be the case that a player requires a partner for gameplay and does not have anybody available. To address this issue, automated control of the corresponding pod of at least one specific profile may be enabled with the central computing device, wherein the specific profile is from the plurality of player profiles, as represented in
As can be seen in
In general, the method disclosed herein may be performed by one or more computing devices. For example, in some embodiments, the method may be performed by a server computer in communication with one or more client devices over a communication network such as, for example, the Internet. In some other embodiments, the method may be performed by one or more of at least one server computer, at least one client device, at least one network device, at least one sensor and at least one actuator. Examples of the one or more client devices and/or the server computer may include a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a portable electronic device, a wearable computer, a smart phone, an Internet of Things (IoT) device, a smart electrical appliance, a video game console, a rack server, a super computer, a mainframe computer, a mini-computer, a micro-computer, a storage server, an application server (e.g. a mail server, a web server, a real-time communication server, an FTP server, a virtual server, a proxy server, a DNS server etc.), a quantum computer, and so on. Further, one or more client devices and/or the server computer may be configured for executing a software application such as, for example, but not limited to, an operating system (e.g. Windows, Mac OS, Unix, Linux, Android, etc.) in order to provide a user interface (e.g. GUI, touchscreen interface, voice-based interface, gesture-based interface etc.) for use by the one or more users and/or a network interface for communicating with other devices over a communication network. Accordingly, the server computer may include a processing device configured for performing data processing tasks such as, for example, but not limited to, analyzing, identifying, determining, generating, transforming, calculating, computing, compressing, decompressing, encrypting, decrypting, scrambling, splitting, merging, interpolating, extrapolating, redacting, anonymizing, encoding and decoding.
Further, the server computer may include a communication device configured for communicating with one or more external devices. The one or more external devices may include, for example, but are not limited to, a client device, a third-party database, public database, a private database and so on. Further, the communication device may be configured for communicating with the one or more external devices over one or more communication channels. Further, the one or more communication channels may include a wireless communication channel and/or a wired communication channel. Accordingly, the communication device may be configured for performing one or more of transmitting and receiving of information in electronic form. Further, the server computer may include a storage device configured for performing data storage and/or data retrieval operations. In general, the storage device may be configured for providing reliable storage of digital information. Accordingly, in some embodiments, the storage device may be based on technologies such as, but not limited to, data compression, data backup, data redundancy, deduplication, error correction, data finger-printing, role based access control, and so on.
Further, one or more steps of the method disclosed herein may be initiated, maintained, controlled and/or terminated based on a control input received from one or more devices operated by one or more users such as, for example, but not limited to, an end user, an admin, a service provider, a service consumer, an agent, a broker and a representative thereof. Further, the user as defined herein may refer to a human, an animal or an artificially intelligent being in any state of existence, unless stated otherwise, elsewhere in the present disclosure. Further, in some embodiments, the one or more users may be required to successfully perform authentication in order for the control input to be effective. In general, a user of the one or more users may perform authentication based on the possession of secret human-readable data (e.g. username, password, passphrase, PIN, secret question, secret answer etc.) and/or possession of machine-readable secret data (e.g. encryption key, decryption key, bar codes, etc.) and/or possession of one or more embodied characteristics unique to the user (e.g. biometric variables such as, but not limited to, fingerprint, palm-print, voice characteristics, behavioral characteristics, facial features, iris pattern, heart rate variability, evoked potentials, brain waves, and so on) and/or possession of a unique device (e.g. a device with a unique physical and/or chemical and/or biological characteristic, a hardware device with a unique serial number, a network device with a unique IP/MAC address, a telephone with a unique phone number, a smartcard with an authentication token stored thereupon, etc.). Accordingly, the one or more steps of the method may include communicating (e.g. transmitting and/or receiving) with one or more sensor devices and/or one or more actuators in order to perform authentication.
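The authentication described above pairs a knowledge factor (a secret) with a possession factor (a unique device). A minimal sketch follows; the registry contents, hashing choice, and function names are assumptions for illustration only.

```python
import hashlib

# Hypothetical registries: hashed secrets and allowed device identifiers.
REGISTERED_SECRETS = {"user1": hashlib.sha256(b"1234").hexdigest()}
REGISTERED_DEVICES = {"user1": {"AA:BB:CC:DD:EE:FF"}}

def authenticate(user: str, secret: str, device_id: str) -> bool:
    """Honor a control input only if both factors check out:
    knowledge (the hashed secret matches) and possession
    (the device identifier is on the user's allow-list)."""
    digest = hashlib.sha256(secret.encode()).hexdigest()
    return (REGISTERED_SECRETS.get(user) == digest
            and device_id in REGISTERED_DEVICES.get(user, set()))
```

A real system would add the biometric factor and salted hashing; this sketch shows only the gating logic for the control input.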
For example, the one or more steps may include receiving, using the communication device, the secret human readable data from an input device such as, for example, a keyboard, a keypad, a touch-screen, a microphone, a camera and so on. Likewise, the one or more steps may include receiving, using the communication device, the one or more embodied characteristics from one or more biometric sensors.
Further, one or more steps of the method may be automatically initiated, maintained and/or terminated based on one or more predefined conditions. In an instance, the one or more predefined conditions may be based on one or more contextual variables. In general, the one or more contextual variables may represent a condition relevant to the performance of the one or more steps of the method. The one or more contextual variables may include, for example, but are not limited to, location, time, identity of a user associated with a device (e.g. the server computer, a client device etc.) corresponding to the performance of the one or more steps, environmental variables (e.g. temperature, humidity, pressure, wind speed, lighting, sound, etc.) associated with a device corresponding to the performance of the one or more steps, physical state and/or physiological state and/or psychological state of the user, physical state (e.g. motion, direction of motion, orientation, speed, velocity, acceleration, trajectory, etc.) of the device corresponding to the performance of the one or more steps and/or semantic content of data associated with the one or more users. Accordingly, the one or more steps may include communicating with one or more sensors and/or one or more actuators associated with the one or more contextual variables. For example, the one or more sensors may include, but are not limited to, a timing device (e.g. a real-time clock), a location sensor (e.g. a GPS receiver, a GLONASS receiver, an indoor location sensor etc.), a biometric sensor (e.g. a fingerprint sensor), an environmental variable sensor (e.g. temperature sensor, humidity sensor, pressure sensor, etc.) and a device state sensor (e.g. a power sensor, a voltage/current sensor, a switch-state sensor, a usage sensor, etc. associated with the device corresponding to performance of the or more steps). Further, the one or more steps of the method may be performed one or more number of times. 
Additionally, the one or more steps may be performed in any order other than as exemplarily disclosed herein, unless explicitly stated otherwise, elsewhere in the present disclosure. Further, two or more steps of the one or more steps may, in some embodiments, be simultaneously performed, at least in part. Further, in some embodiments, there may be one or more time gaps between performance of any two steps of the one or more steps.
Further, in some embodiments, the one or more predefined conditions may be specified by the one or more users. Accordingly, the one or more steps may include receiving, using the communication device, the one or more predefined conditions from one or more devices operated by the one or more users. Further, the one or more predefined conditions may be stored in the storage device. Alternatively, and/or additionally, in some embodiments, the one or more predefined conditions may be automatically determined, using the processing device, based on historical data corresponding to performance of the one or more steps. For example, the historical data may be collected, using the storage device, from a plurality of instances of performance of the method. Such historical data may include performance actions (e.g. initiating, maintaining, interrupting, terminating, etc.) of the one or more steps and/or the one or more contextual variables associated therewith. Further, machine learning may be performed on the historical data in order to determine the one or more predefined conditions. For instance, machine learning on the historical data may determine a correlation between one or more contextual variables and performance of the one or more steps of the method. Accordingly, the one or more predefined conditions may be generated, using the processing device, based on the correlation.
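A predefined condition derived from historical data, as described above, can be sketched with an intentionally simple heuristic: split historical values of one contextual variable (here, an assumed arena temperature) by whether the step was performed, and emit a threshold rule. The mean-split heuristic and variable names are illustrative assumptions standing in for the disclosed machine learning.

```python
def learn_condition(history):
    """history: list of (temperature, step_performed) pairs.
    Returns a predicate approximating when the step should run,
    based on the correlation between the contextual variable and
    past performance of the step."""
    ran = [t for t, ok in history if ok]
    skipped = [t for t, ok in history if not ok]
    mean_ran = sum(ran) / len(ran)
    mean_skipped = sum(skipped) / len(skipped)
    # Place the decision threshold halfway between the two outcome means.
    threshold = (mean_ran + mean_skipped) / 2
    if mean_ran < mean_skipped:
        return lambda t: t < threshold   # step historically ran when cooler
    return lambda t: t >= threshold      # step historically ran when hotter
```

Any real deployment would use a proper classifier over many contextual variables; the sketch only shows how a stored history can be turned into an executable predefined condition.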
Further, one or more steps of the method may be performed at one or more spatial locations. For instance, the method may be performed by a plurality of devices interconnected through a communication network. Accordingly, in an example, one or more steps of the method may be performed by a server computer. Similarly, one or more steps of the method may be performed by a client computer. Likewise, one or more steps of the method may be performed by an intermediate entity such as, for example, a proxy server. For instance, one or more steps of the method may be performed in a distributed fashion across the plurality of devices in order to meet one or more objectives. For example, one objective may be to provide load balancing between two or more devices. Another objective may be to restrict a location of one or more of an input data, an output data and any intermediate data therebetween corresponding to one or more steps of the method. For example, in a client-server environment, sensitive data corresponding to a user may not be allowed to be transmitted to the server computer. Accordingly, one or more steps of the method operating on the sensitive data and/or a derivative thereof may be performed at the client device.
As an overview, the present disclosure may describe mixed reality competitive gameplay to facilitate competition among multiple users. Further, the disclosed mixed reality competitive gameplay may use a realistic controller environment to allow one or multiple users to control a remote-controlled drone and compete against other users/teams in a course, track, or arena. Further, the mixed reality gaming concept may utilize a realistic controller cockpit or simulated environment to allow one or multiple users to control a remote-controlled vehicle/machine through a course or track to compete against others. Further, the disclosed methods, systems, and apparatuses for mixed reality gaming may include one or more cockpit/simulated control devices and one or more remotely-controlled drones, including providing game control software that connects remote-controlled drones to the various user control devices/environments. This game control software monitors user success and allows for quick changes in hardware/devices for continual gameplay, including rules for play affecting the operation of the remotely-controllable drones. This may also include virtual assets or objects displayed within the play environment that the user can use their RC drone to interact with for gameplay. Tracks and courses could be hyper-realistic environments that have moving elements that make gameplay more interactive and challenging.
Further, the disclosed methods, systems, and apparatuses relate to the mixed reality and electronic gaming industry, particularly games that use realistic controller environments that transport users to the driver's seat of remote-controlled drones for competitive gameplay. These games could be mobile and set up in a traveling unit so users could experience the games in parking lots, malls, or other public areas. These games could also be set up in dedicated entertainment centers where experiences could be built out to be higher quality, more complex, and larger in scale. Further, the disclosed mixed reality competitive gameplay may fill the gap between complex aerial drone racing and eSports by creating a mixed reality gaming experience, systems, and apparatuses that have a user truly become a part of the gaming experience. Users enter a controlled environment, such as a driver's cockpit or a life-size representation of the type of remote-controlled drone, that allows them to operate individual or multiple remote-controlled drones throughout a track or course to compete against other users in a similar control environment. This does not include aerial drones, due to the licensing and complexity involved in their operation, but remains ground- and water-operational, like a remote-controlled car, boat, or submersible. This makes the experience easy to grasp for brand-new users and reduces maintenance costs from broken pieces. These vehicles could become airborne for small periods of time from jumps and other obstacles but do not operate similarly to an aerial drone with sustained flight. Further, the track or course may be magnetized to keep the RC drone from excessive flipping/overturning during gameplay. In an embodiment, the track and RC unit(s) may be magnetized dependent on the gameplay type, so RC units that flip can turn over automatically or be righted by some other means to maintain gameplay, such as a mechanical arm or manually by a trained staff member.
The environments that the remote-controlled drones operate in could be hyper-realistic and scaled to make the user feel as though they are truly inside the environment. This creates a unique perspective within the gameplay that is not represented anywhere else in the marketplace.
All of the data from the gameplay for each remote-controlled drone feeds back into central gaming software and directly back into the control system to reduce latency in the signal, making gameplay responsive and smooth for the user. Additionally, units are able to be switched out and registered back into the gaming software using technologies like RFID to identify their unique transmission signal, improving continuous gameplay and allowing users to even enter their own custom-built remote-controlled drones for tournaments, camps, or competitions. This also allows the remote-controlled drones to be interchanged quickly so that they can remain cool in temperature and allow time for battery changes/recharging without halting the gameplay experience between rounds.
The gaming software may also utilize Artificial Intelligence (AI)-driven controls to make remote-controlled drones go back to the reset point for gameplay start, or to switch themselves out when reaching a high temperature or low battery range. This improves the gameplay experience for users so that there is no lag between rounds. These methods, systems, and apparatuses for mixed reality gaming can be adapted to different scenarios over time, such as competitive RC car racing through scaled environments or simulated tanks that battle within an arena. Hardware and controllers may be a combination of custom 3D-printed parts, third-party manufactured parts, and internally built parts. These can change and be adapted over time to give the user the best experience possible and reduce operational costs.
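The autonomous swap rule described above can be sketched as a threshold check that decides whether a unit continues playing or returns itself to the reset point. The specific limits and field names below are assumptions for illustration, not values from the disclosure.

```python
# Assumed operational limits; a real system would tune these per unit type.
MAX_TEMP_C = 70.0        # hypothetical overheating limit
MIN_BATTERY_PCT = 15.0   # hypothetical low-battery limit

def needs_swap(temperature_c: float, battery_pct: float) -> bool:
    """True when the unit is too hot or too low on charge to keep playing."""
    return temperature_c >= MAX_TEMP_C or battery_pct <= MIN_BATTERY_PCT

def control_action(temperature_c: float, battery_pct: float) -> str:
    """Autonomous action the gaming software would issue for this telemetry."""
    if needs_swap(temperature_c, battery_pct):
        return "return_to_reset_point"   # unit swaps itself out
    return "continue_gameplay"
```

The disclosed system would layer AI-driven navigation on top of this decision; the sketch captures only the trigger logic.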
The remote-controlled drones could also utilize sensors and send other feedback data to the control units to give physical/sensory feedback to the users, such as vibration or haptic movement from gameplay. Conversely, the control units and mixed reality/virtual reality headsets could send data to the remote-controlled drone controlling its movement or actions. For instance, a user's head movement could control the first-person-view camera movement on the remote-controlled drone. Sensor information from the remote-controlled drone could also provide data back to the user, such as speed, game score/standing, and any other information pertinent to gameplay rules. This data is displayed within the mixed reality/virtual reality headset as a Heads-Up Display (HUD), or it can be conveyed to the user using audio feedback. All gameplay and first-person views can be viewed by spectators watching the mixed reality competitive gameplay, either through displays and/or external mixed reality/virtual reality headsets that stream the players' views. This gameplay footage can also be streamed live through online gaming platforms and eSports providers such as Twitch™, in compliance with that third party's usage rules. Gameplay may grow and evolve as new technologies become available, such as allowing users to shoot projectiles, either real or virtual, at other players' remote-controlled drones, within that specific gameplay's rules. Gameplay software is utilized to integrate mixed reality/virtual reality headsets, sensor data from sensory devices, microcontrollers for movement of the remote-controlled drone, gameplay rules, audio output, and remote-controlled drone functions/movement so that everything is part of the same cohesive experience for users.
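The two-way data flow above can be sketched in both directions: headset orientation forwarded to the drone's camera gimbal, and drone telemetry packaged into a HUD payload for the headset. The gimbal travel limits and all field names are illustrative assumptions.

```python
def head_to_gimbal(yaw_deg: float, pitch_deg: float) -> dict:
    """Clamp headset orientation to the gimbal's assumed travel limits
    before forwarding it to the drone's camera mount."""
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    return {"gimbal_yaw": clamp(yaw_deg, -90.0, 90.0),
            "gimbal_pitch": clamp(pitch_deg, -45.0, 45.0)}

def hud_payload(speed_kmh: float, score: int, standing: int) -> dict:
    """Telemetry the headset would overlay as a Heads-Up Display."""
    return {"speed": f"{speed_kmh:.1f} km/h",
            "score": score,
            "standing": f"P{standing}"}
```

In practice these messages would be serialized over the low-latency link to the central gaming software described above.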
Gameplay rules include single races or rounds, tournament-style, and league-style play, with the champions sometimes receiving trophies, rewards, and other types of winnings. The mixed-reality competitive gaming system uses “life-size” controller containment units and conveyor methods to allow one or multiple users continuous gameplay controlling a remote-controlled drone through an interactive, controlled, and smart gameplay environment. Mixed reality gaming concepts, systems, methods, and apparatuses may bring a new generation of competitive entertainment experiences to market in the gaming industry. This mixed reality gaming concept utilizes “life-size” realistic controller containment units to reproduce the experience of a Remote-Controlled (RC) Drone (AKA RC Unit) in an interactive, complex, and scaled gameplay environment. The smart gameplay environments may include a conveyor system that allows for continuous gameplay using RC drones and solves major problems associated with using RC drones for competitive gaming. This conveyor system allows the RC drones to enter, be diagnosed, sorted, cooled, recharged, repaired, and serviced in other similar ways. They can then be replaced or reintroduced into gameplay between rounds or during gameplay, dependent on gameplay type. During exit and reintroduction, the RC unit is identified uniquely through the identification reader zone within the conveyor system, which contains RFID, Bluetooth, and/or other similar identification technologies. The gameplay engine then dynamically takes this ID to reprogram and match the radio frequency of the remote controller housed in the RC Controller Containment Unit. This can result in actions like a light color response on the RC drone to match the controller containment unit's color scheme (showing the audience who is who on the track) or a multitude of other functions dependent on game type, such as team assignment.
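The identification-reader step described above can be sketched as a lookup that binds a scanned tag to the pod's controller settings: retune the controller to the unit's radio channel and set the drone's lights to the pod's color scheme. The registry contents, channel values, and function names are assumptions for this sketch.

```python
# Hypothetical registry populated as units pass the identification reader.
RC_REGISTRY = {
    "TAG-001": {"radio_channel": 2412},   # assumed channel value, in MHz
    "TAG-002": {"radio_channel": 2437},
}

def bind_unit(tag_id: str, pod_color: str) -> dict:
    """Map a scanned RFID tag to settings for the containment unit:
    the controller is retuned to the unit's channel and the drone's
    lights are set to match the pod, showing the audience who is who."""
    unit = RC_REGISTRY[tag_id]
    return {
        "controller_channel": unit["radio_channel"],  # retune the controller
        "drone_light_color": pod_color,               # match the pod's scheme
    }
```

Swapping a unit mid-session then reduces to a single `bind_unit` call as it exits the reader zone.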
This gameplay system can also take control of the RC drones using artificial intelligence/machine learning/neural network technologies or other similar programming technologies, in between rounds for reset and conveyor entrance/exit, or during gameplay to replace an empty player position so other users can still have competition against a Central Processing Unit (CPU) opponent if there is not a full player roster in all RC controller containment units for a specific gameplay environment (track/arena/battleground/etc.). This creates little to no downtime between rounds for users to enjoy, produces continuous gameplay during working hours so that the system is profitable for the facility/owner to operate, and solves the issues that plague the use of RC drones (short battery spans, overheating, frequent service needs, etc.) for gameplay. The system dramatically and non-obviously improves upon previous concepts to create a system that is cost-effective, quality-controlled, and able to be taken to market in an unparalleled way.
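A CPU-controlled stand-in for an empty player slot, as described above, can be sketched as a waypoint follower that steers the drone around the track between or during rounds. The straight-line steering below is an illustrative placeholder for the disclosed AI/ML control, and all names are assumptions.

```python
def cpu_step(position, waypoint, speed=1.0):
    """Move one step from position toward waypoint; return the new position.
    Snaps to the waypoint once it is within one step's travel."""
    dx, dy = waypoint[0] - position[0], waypoint[1] - position[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= speed:                     # waypoint reached this step
        return waypoint
    return (position[0] + speed * dx / dist,
            position[1] + speed * dy / dist)

def cpu_lap(start, waypoints, speed=1.0, max_steps=1000):
    """Follow all waypoints in order, as a CPU player (or a unit
    returning to its reset point) would."""
    pos = start
    for wp in waypoints:
        steps = 0
        while pos != wp and steps < max_steps:
            pos = cpu_step(pos, wp, speed)
            steps += 1
    return pos
```

The same routine could drive a unit back to its reset point or to the conveyor entrance when it is swapped out.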
Game types can be built upon, or whole new game concepts developed, using this as the basis of functions for gameplay. For instance, one built-out environment could be similar to an RC racing game where users race around a track that changes or throws things at the user to avoid, while they sit in a life-size driver's cockpit seeing from a 360-degree camera (or a camera on a gimbal) what the RC unit is seeing (reacting to head motions of the user with Virtual Reality (VR)/Mixed Reality (MR)/Augmented Reality (AR) headsets), and feeling the twists of the track and actions of gameplay (haptics/sounds/motion/vibration of the controller or controller containment unit). Another variation could be to have several users within a single RC Controller Containment Unit, controlling different aspects of the same RC drone/unit, say in a tank game where one person controls the main shooter, another drives, and another is the lookout/gunner on top of the tank. These systems, methods, and apparatuses could also be adapted to create a submersible game where users are inside of a submarine-like RC Controller Containment Unit controlling an RC submarine in a controlled, smart water-tank gameplay arena.
Gameplay types can extend to many varying themes, such as water boats on an Everglades-themed map or dune buggies/motorcycles on an Egyptian sand-dune-themed gameplay map. This could also include virtual assets or objects displayed within the play environment that users see through their VR/AR/MR headsets and can use their RC drone to interact with for gameplay. An example of this would be unlocking certain gameplay elements of their RC unit, such as projectiles, or reducing other players' power by a percentage for a certain period of time. The Gameplay Arena/Environment could be hyper-realistically scaled, futuristic in design, or video-game-like in design, and have moving elements that make gameplay more interactive and challenging. Moving elements may also be manipulated by onlookers or other players during gameplay, dependent on game type.
Further, the disclosed mixed reality competitive gameplay relates to the mixed reality and electronic gaming industries, particularly games that use realistic controller environments that transport users to the driver's seat (in a racing example game) of remote-controlled drones for competitive gameplay. These games and gameplay arenas are mostly self-sustaining looped systems that could be made mobile and set up in a traveling unit so users could experience the games in parking lots, malls, or other public areas. These games could also be set up in dedicated Entertainment Centers, their principal application, where experiences could be built out to a larger scale and be of higher quality and greater complexity. Further, the disclosed mixed reality competitive gameplay is broken down into seven core functional areas:
6. RC Controller Containment Unit (“life-size” representation)
These systems may vary slightly in their internal structure based on the gameplay type but involve one, some, or all of the same basic functional areas. This could include combining or eliminating certain system areas to improve performance or to accommodate the gameplay type. Gameplay rules include single races, battles or rounds, tournament-style, and league-style play, with the champions sometimes receiving trophies, rewards, and other types of winnings. Further, the disclosed mixed reality competitive gameplay may introduce brand-new concepts in manufacturing, using 3D-printed parts at scale to build out the RC Units, RC Controllers, RC Controller Containment Units, and Gameplay Arenas. Not every piece can be 3D printed, but the majority of body parts can be, which allows for local or onsite repair that substantially lowers the costs of a system as large in scale as described. Audiences are able to view the 3D printing process through viewpoints at specific locations where this mixed reality gaming system/platform is set up.
The known prior art overall describes systems that do not match the current invention's scale, solutions, design, commercial viability, or impact. The disclosed methods, systems, and apparatuses create a competitive mixed reality gaming system that achieves mass-market viability and solves some of the most prevailing issues in the markets of mixed reality gaming and remote-controlled drone games. Further, the disclosed methods, systems, and apparatuses control the gameplay environment, creating a Gameplay Arena (which could also be referred to as a track, court, area, or stadium) with a built-in Conveyor System (which may be hidden from the view of users/audience, but accessible to facility professionals trained in the art) that enables limited lag time between sessions. This Conveyor System offers many benefits to gameplay and to creating a commercially viable competitive gaming experience, described at greater length below. One major benefit of this conveyor system is its ability to allow RC units to be switched between gameplay sessions, allowing for the same experience in each round of gameplay (i.e., no overheating of the unit, lowered speed/power, or running out of battery, as examples). This all happens in a matter of moments, using the Identification Reader method, allowing there to be little to no lag time between sessions. The disclosed methods, systems, and apparatuses create a full system that delivers a brand-new experience in competitive mixed reality gaming by taking control of the landscape of the game.
Further, the disclosed mixed reality competitive gameplay includes dynamic landscapes that are scaled to create a unique gaming experience from the vantage point of the user, such as driving through the volcano-island track mentioned before, the sand dunes of the desert, or submersibles in a great shark reef. This dynamic field changes similarly to a Disney or Universal Studios ride, with varying themes using the latest techniques in real-world or virtual special-effects creation and modeling. This controlled environment (Gameplay Arena) would go beyond just controlling the field of play to create a scalable competitive experience, and may also include programming the RC units to take commands from the central gaming server/engine. This allows them to be controlled by machine learning, artificial intelligence, a neural network, or similar programming to help dynamically control the RC unit for reset/service between rounds or for replacing an empty player seat. For instance, if an RC car is running low on battery or approaching a critical heat level, once that round is completed the gameplay engine would take control of the vehicle, guiding it into the service conveyor previously discussed, with a replacement unit entering the gameplay arena to take its place. The system of the present invention is packaged in a scalable fashion that can be taken to mass market and played by nearly anyone. The system focuses not only on gameplay but on reducing downtime, ensuring safety, and creating a scalable experience that can be dynamically changed to new game types or new facilities. This system creates the magic of the game by not making users recharge the batteries themselves or see a person swapping the RC Units in and out. It is a closed-loop system that creates an experience for players and onlookers that can be run during nearly all facility/venue open hours each day.
The present invention utilizes a conveyor that takes these RC Units, services them through automation and professional staff, and instantly replaces each unit in gameplay, reprogramming the life-size RC Controller Containment Unit and RC Controller to match the new RC unit's radio frequency (RF) or similar signal type. This system allows for scalability as RC units and gameplay types become more complicated and as the technology in the conveyor and arena becomes more advanced. For instance, the Conveyor System could refill gameplay ammunition, replace broken pieces with 3D-printed repairs automatically, or even allow a team of technicians to service the RC unit during gameplay (almost like Formula-style pit services during a race). This invention system allows for continued competitive gameplay by handling the challenges of current-day RC Unit devices with a mix of AI, machine learning, neural network, and conveyor technologies. The disclosed methods, systems, and apparatuses create a user experience that tracks a user's history, points, and gameplay ranking to create a competitive gaming experience similar to online video games. The current invention also allows users to form teams that build their own RC units (within the confines of the game rules for that league) and race them across facilities and different gameplay arenas in a league/championship style of play. This promotes innovation, education of youth, and engineering advancement in a way that is rarely seen today in gaming. Rather than focusing on at-home Gameplay Arenas or having slow gameplay with long periods of rest time between sessions due to battery recharging, overheating, or other RC unit challenges, the current invention is a system that produces rapid gameplay session rounds, which allows room to create profitability for the facility but also creates a completely unique user experience, such as what the company TopGolf™ did for golf.
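The reprogramming step described above, in which the containment unit and controller are rebound to a replacement unit's RF or similar signal, can be sketched as a simple rebinding operation. This is an illustrative sketch only; the class names, channel numbers, and station identifiers are assumptions, not the disclosed signaling scheme.

```python
# Hypothetical sketch of rebinding a controller/containment-unit pair to a
# replacement RC unit's signal channel when the conveyor swaps units between
# rounds. Channel values and names are illustrative assumptions.

class RCUnit:
    def __init__(self, unit_id, channel):
        self.unit_id = unit_id
        self.channel = channel  # RF channel or similar signal identifier

class ControllerStation:
    """An RC Controller + Containment Unit pair bound to one RC unit."""
    def __init__(self, station_id):
        self.station_id = station_id
        self.bound_unit = None

    def rebind(self, unit):
        # Reprogram the station's transmitter to the new unit's channel so
        # the player keeps the same seat while the physical unit is swapped.
        self.bound_unit = unit
        return (self.station_id, unit.channel)

station = ControllerStation("pod-1")
tired_unit = RCUnit("car-07", channel=2412)
fresh_unit = RCUnit("car-21", channel=2437)

station.rebind(tired_unit)
# ...round ends; conveyor routes car-07 to service and deploys car-21...
sid, ch = station.rebind(fresh_unit)
print(sid, ch)  # pod-1 now transmits on car-21's channel: pod-1 2437
```

Because only the station's binding changes, the swap is transparent to the player, which is the "little to no lag time" property the conveyor is meant to provide.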
The current invention provides mixed reality competitive gameplay environments, similar to theme park rides, with life-size control units that users interact with to control a remote-controlled drone through that environment, competing against others. This system could even be scaled to have remote plugin access, where users could actually be offsite and link into the track at a controlled facility not open to the general public, or into another Gameplay Arena at a normal facility. This allows users in their own at-home pods to link into the gameplay arena and join a session against other players, similar to online first-person shooter games today. This system is scalable beyond the games mentioned in the known prior art, to submersible vehicles racing and/or battling underwater, robots fighting one another, and much more. Lastly, the dynamic reprogramming to change frequencies to a new RC unit, creating fast round play in the games through a conveyor system, is a concept that is not mentioned anywhere in the known prior art and is a non-obvious solution to people within the field of study.
As can be seen in
As can be seen in
Further, the gameplay server may communicate directly with a relay station or directly with other functional areas of the system through Wi-Fi, Bluetooth, 5G, direct cable, or other data transfer means. Further, the data that may be received and transmitted by the gameplay server/engine may include sensor information, gameplay arena monitoring/control, safety system control/monitoring, conveyor system control/monitoring, Remote Controlled (RC) controller containment unit control/interaction/monitoring, RC controller monitoring/responsive input and output, information associated with the user, gameplay, records, etc. Further, upon identifying an RC unit running on low battery or approaching a critical heat level, the gameplay server/engine may take control of the RC unit, guiding it into the service conveyor, with a replacement RC unit entering the gameplay arena to take its place. This may be implemented using a robotic operating system, robotic process automation, computer vision, artificial intelligence, machine learning, convolutional neural networks or other neural network technologies, speech recognition, development platforms like UiPath, frameworks like PyTorch and TensorFlow, line- or marking-follower programming, or other similar techniques, dependent on game type. Dependent on game type, these automation functions could work with onboard sensors like laser scanners, stereo vision cameras, bump sensors, force-torque sensors, spectrometers, LIDAR, radar, other camera technologies, etc. Further, the RC units/drones may include actual devices on the gameplay arena that players (or users) or a central processing unit (CPU) controls to attempt to win that specific game type. Further, the RC units may be adapted to fit the specific game objectives of a gameplay type. For instance, in a racing game, aerodynamic design and/or 4-wheel drive may be the desired objective if facing a sand-dune theme.
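The end-of-round decision described above, in which the server detects a low battery or critical heat level and routes the unit to the service conveyor while a replacement enters the arena, can be sketched as a telemetry check plus a swap against a ready pool. This is a minimal sketch; the threshold values, field names, and unit identifiers are illustrative assumptions.

```python
# Hypothetical sketch of the gameplay server's end-of-round health check:
# units below a battery floor or above a heat ceiling are routed to the
# service conveyor and replaced from a ready pool. Thresholds are assumed.

BATTERY_FLOOR = 0.25  # assumed 25% minimum charge
HEAT_CEILING = 75.0   # assumed max temperature, degrees Celsius

def needs_service(telemetry):
    return (telemetry["battery"] < BATTERY_FLOOR
            or telemetry["temp_c"] > HEAT_CEILING)

def end_of_round_swap(active_units, ready_pool):
    """Return (still_active, to_conveyor) after swapping worn units."""
    still_active, to_conveyor = [], []
    for unit_id, telemetry in active_units.items():
        if needs_service(telemetry) and ready_pool:
            to_conveyor.append(unit_id)            # route to service conveyor
            still_active.append(ready_pool.pop(0))  # replacement enters arena
        else:
            still_active.append(unit_id)
    return still_active, to_conveyor

active = {"car-07": {"battery": 0.12, "temp_c": 61.0},
          "car-09": {"battery": 0.80, "temp_c": 44.0}}
ready = ["car-21", "car-22"]
print(end_of_round_swap(active, ready))
# (['car-21', 'car-09'], ['car-07'])
```

Running the check only between rounds, as the text specifies, means a unit is never pulled mid-session even if its telemetry crosses a threshold during play.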
In another game type, strength may be the objective, and gameplay-item fuel capacity (for example, projectile ammo load) may be the build goal. Further, gameplay type examples may include racing, battle royal, and submersible types. Further, the racing gameplay type may include car racing, buggy racing, boat/airboat (on water surface) racing, motorcycle racing, and racing of other vehicles (or group-type drones). Further, the battle royal gameplay may include tanks, battle robots, etc. Further, the submersible gameplay type may include submarine racing or battles. Further, the RC units may include varying types of design elements and actual component parts. Further, the RC Units may include one or more components.
Further, the one or more components may include a body, battery, gameplay items, action control unit(s) (for a specific game type, deploying gameplay items), motor(s), central processor(s), signal receiver(s), camera(s) (360-degree or regular), gimbal, sensors, scanners, propulsion system, dampeners (as applicable), actuators, gears and gearbox (if applicable), shock absorbers, buoyancy control (if applicable), lighting system (for visual color team/player aid and for gameplay), and other components found in RC units. Further, the RC Unit may be connected to one or more of the full system components, dependent on gameplay type and structure, which may increase the user/viewer experience. Further, the one or more system components may include the RC controller, RC controller containment unit, gameplay arena, conveyor system, relay station, and/or the gameplay server/engine. Further, the one or more components of the RC units may be designed and manufactured using 3D printing techniques to lower costs, increase speed of repair, and gain flexibility on location. Further, the RC Units may be designed and built for education/competition purposes. For instance, high school students may be instructed to design RC units to compete at a local gameplay arena, which may promote education and learning in the technical and engineering arts and drive innovation. Further, third-party organizations may provide their own branded devices upon partnering. Further, the RC controller(s) may include actual physical inputs or audio inputs that may come from the player(s) to control the action of the RC unit(s). Further, the RC controller(s) may be housed inside of an RC Controller Containment Unit. Further, the RC control inputs may vary based on game type and RC unit design but may be audio, direction (steering), power (speed/strength/current), and/or action (projectiles, speed boost, other game-type-specific pickups) based.
Further, the RC controller may provide sensory-based feedback to the player based on game action, for instance faulty steering when hit with a projectile, or a loose force reaction when slipping on virtual or real ice. Further, the sensory-based feedback may be based on the gameplay type. Further, the sensory-based feedback may be vibration-, force-, and/or movement-based. Further, the RC controller may communicate directly with the RC controller containment unit, RC unit, relay station, gameplay arena, safety systems, and/or gameplay server, dependent on the gameplay type. Further, the RC controller may combine with a mixed reality/virtual reality/augmented reality headset for specific gameplay types and may send data to the RC unit controlling movement or action. For instance, the user's head movement may control the first-person-view camera(s) movement on the RC unit or interact with the 360-degree camera's panorama view. Further, sensor information from the RC unit or information from the gameplay server may provide data back to the user through the RC controller, such as speed, game score/standing, gameplay item bonuses, and any other information pertinent to gameplay rules. Further, the data may be displayed within the mixed reality/virtual reality/augmented reality headset as a Heads-Up Display (HUD), or it can be conveyed to the user using audio feedback or other means based on the gameplay type. Further, the RC Controller Containment Unit (CCU) may include a life-size housing for the RC controller that may represent the gameplay action controls of the RC unit for the gameplay type. Further, the RC Controller Containment Units may include enough space for one or more players, seats or control interaction points with the RC Controller(s), and/or visual/sensory inputs/outputs (functions that create an experience: airflow, heat, cooling, movement, vibration, haptics).
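The two feedback paths described above, HUD data shown in the headset and sensory feedback played through the controller, can be sketched as one function that maps telemetry and game state to both outputs. This is an illustrative sketch only; the field names, effect names, and intensity values are assumptions, not a disclosed data format.

```python
# Hypothetical sketch of translating RC-unit telemetry and game state into
# a headset HUD payload plus controller haptic cues. Field names, effect
# names, and intensity scales are illustrative assumptions.

def build_feedback(telemetry, game_state):
    hud = {
        "speed": telemetry["speed_kph"],
        "standing": game_state["position"],
        "score": game_state["score"],
        "item": game_state.get("active_item"),  # None if no bonus is active
    }
    haptics = []
    if game_state.get("hit_by_projectile"):
        haptics.append(("steering_fault", 0.8))  # faulty steering on a hit
    if telemetry.get("surface") == "ice":
        haptics.append(("loose_force", 0.5))     # loose force reaction on ice
    return hud, haptics

hud, haptics = build_feedback(
    {"speed_kph": 38.5, "surface": "ice"},
    {"position": 2, "score": 1450, "hit_by_projectile": True},
)
print(hud["speed"], haptics)
```

Keeping the HUD payload separate from the haptic cue list mirrors the text's split between visual display in the headset and vibration/force/movement feedback in the controller.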
Further, the RC Controller Containment Unit may respond to actions by users, the gameplay arena, other players, and/or the RC unit. Further, the actions may include causing rotation, vibration, airflow, movement, sounds, lights, or other features to make the experience even more real to the player's senses and increase quality. Further, the RC Controller Containment Units may be life-size representations of the RC unit. Further, the RC Controller Containment Unit may vary in its appearance, function, design, and action based on the gameplay type: for instance, a racing pod for racing-type gameplay, a full-size body corresponding to a tank for battle-royal-type gameplay, and a body corresponding to a submarine for submersible-type games. Further, the body may not be an exact representation; the body may be the embodiment of the function and gameplay-type purpose. Further, the body may be padded throughout to increase safety in design and may never be a truly sealed area, to allow easy exit in the unlikely event of an emergency or safety risk. Further, the RC Controller Containment Unit may have gameplay arena components attached, where fitting for the gameplay scene and experience of players and/or viewers. Further, the gameplay arena components may include external components such as color-matching lights, displays to present Gamer(s) Tag information, and/or score ranking in gameplay-type rounds. Further, the external components may vary based on the gameplay type. Further, the RC Controller Containment Units (coupled with the RC controller) may be scaled to have remote plugin access, where the user may actually be off-site and link into the track at a controlled facility not open to the general public, or into another gameplay arena at a normal facility, filling an empty player's position.
This may allow users in at-home pods to link into the gameplay arena and join a session against other players, similar to online first-person shooter games today.
Further, the conveyor system may facilitate continuous gameplay without the downtimes traditionally associated with RC drone gaming. Further, the conveyor system may include one or more zones. Further, certain game types may integrate two or more zones together, separate them further, or have entirely new zones for a specific game type. For instance, a submersible game may include a drain zone to eliminate the water around the RC underwater drone so it can be serviced safely; however, the drain zone may not be necessary in some other game types. Further, the ordering of the one or more zones may vary based on the game type. Further, the conveyor system may include a track that controls the movement of the RC drone as it passes through, similar to a roller coaster, manufacturing line, or train on rails associated with the conveyor system. Further, the track and the hold type on the RC drone(s) may vary based on type and arena setup. In an instance, the track may be above the RC drone for it to hang from, or may run along the ground. Further, at the beginning and end of the conveyor system, the conveyor system may include a set of input/output sensors for identification reading based on game type. Further, the input/output sensors may include tools like radio frequency identification (RFID), Bluetooth, barcode, QR code, Wi-Fi, or other similar signal transmission and identification technologies. Further, the signals may be relayed to the relay station and/or directly to the gameplay server. Further, the identification reading may facilitate changing out the RC unit/drone with another fully powered and fully operational RC drone replacement. Further, the gameplay server, along with other functional areas of the system, may update based on the RC Unit's unique identification ID, match and reprogram that specific unit into the Gameplay Arena, and connect it to an RC Controller and RC Controller Containment Unit for that gameplay round.
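The identification-reading step above, in which a scan at the conveyor boundary registers a replacement unit with the gameplay server and connects it to the waiting station, can be sketched as a small registry update. This is an illustrative sketch; the class layout, IDs, and event fields are assumptions rather than the disclosed protocol.

```python
# Hypothetical sketch of the conveyor's identification-reader step: an RFID
# (or barcode/QR/Bluetooth) read at the conveyor exit registers the
# replacement unit with the gameplay server and assigns it to the round's
# open station. IDs and the registry layout are illustrative assumptions.

class GameplayServer:
    def __init__(self):
        self.registry = {}  # unit_id -> station_id for the current round

    def on_id_read(self, unit_id, waiting_station):
        # Match the scanned unit to the round's open station and record it,
        # so arena systems, scoring, and the controller all track this unit.
        self.registry[unit_id] = waiting_station
        return {"unit": unit_id, "station": waiting_station, "status": "paired"}

server = GameplayServer()
event = server.on_id_read("sub-03", waiting_station="pod-4")
print(event["status"], server.registry["sub-03"])  # paired pod-4
```

Because the pairing is driven by the scanned ID rather than by which physical unit was there before, any fully charged replacement can take over a station in one read.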
Further, the one or more zones along the conveyor system may be referred to as staging areas. Further, the staging areas may include essential functions to get the RC drone prepared to reenter the Gameplay Arena. Further, the staging areas may include functions like cooling, refueling gameplay items, replacing/recharging batteries, diagnostics, sorting, automated repair, relegation for technician support, testing, storage, final prep for gameplay, approval diagnostics, etc. based on gameplay type needs.
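The staging areas above behave like an ordered pipeline of service steps, with the zone list varying per game type (for example, adding a refuel zone for battle play). This is a minimal sketch under that assumption; the zone functions, thresholds, and unit state fields are illustrative, not a disclosed design.

```python
# Hypothetical sketch of a staging-area pipeline: each zone is a step the
# RC unit passes through on the conveyor, and the zone list can be
# reordered or extended per game type. Names and values are assumptions.

def cool(unit):
    unit["temp_c"] = 30.0  # bring unit back to a safe temperature
    return unit

def recharge(unit):
    unit["battery"] = 1.0  # replace/recharge batteries to full
    return unit

def refuel(unit):
    unit["ammo"] = unit.get("ammo_capacity", 10)  # refill gameplay items
    return unit

def diagnose(unit):
    unit["ok"] = unit["battery"] > 0.9  # approval diagnostics before reentry
    return unit

PIPELINES = {
    "racing": [cool, recharge, diagnose],
    "battle_royal": [cool, recharge, refuel, diagnose],  # extra refuel zone
}

def run_staging(unit, game_type):
    for zone in PIPELINES[game_type]:
        unit = zone(unit)
    return unit

unit = {"temp_c": 68.0, "battery": 0.1, "ammo": 0, "ammo_capacity": 12}
serviced = run_staging(unit, "battle_royal")
print(serviced["battery"], serviced["ammo"], serviced["ok"])  # 1.0 12 True
```

Modeling zones as an ordered list matches the text's note that zones may be combined, split, reordered, or added per game type: only the `PIPELINES` entry changes.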
Further, the RC unit/drone may be controlled by preprogrammed functions, artificial intelligence, machine learning, a neural network, or similar programming functions mentioned before to create consistent entry and exit conditions. Further, the gameplay arena (or environment) may include a field for gameplay that allows RC units to interact with one another and with objects (both virtual and physical), actions, and movements on the field. While RC controller(s) and RC controller containment units may be scaled to life-size, the gameplay arena may be scaled to any size to fit the space it is in, or to meet mobility requirements when traveling for a mobile setup. Further, moving elements associated with the system are a part of the gameplay arena and may depend on the gameplay type. Further, the gameplay type may include racing, battle royal, submersibles, and many other game types. Further, the racing gameplay type may include car racing, buggy racing, boat/airboat (on water surface) racing, motorcycle racing, and racing of other vehicles (or group-type drones). Further, the battle royal gameplay may include tanks, battle robots, etc. Further, the submersible gameplay type may include submarine racing or battles. In an instance, in the battle royal game type, dynamic barriers may move up and down or side to side based on predefined functions (such as timing), user/onlooker interaction, or CPU engagement. Further, moving elements (such as the dynamic barriers) may make the gameplay more exciting for participants and onlookers, such as an avalanche of rocks while racing around an erupting-volcano-themed course. Further, the gameplay arena may utilize special-effects equipment and technologies to make the gameplay environment come to life. Further, the special-effects equipment and technologies/techniques may be similar to the technology used for theme park rides and movie effects.
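The dynamic barrier described above, driven by a predefined timing function, user/onlooker interaction, or CPU engagement, can be sketched as a state function with interactive overrides. The cycle period and states here are illustrative assumptions, not a disclosed control scheme.

```python
# Hypothetical sketch of a moving arena element: a dynamic barrier whose
# position follows a predefined timing cycle unless an onlooker interaction
# or a CPU decision overrides it. Timing values are assumptions.

def barrier_state(t_seconds, period=8.0, onlooker_trigger=False, cpu_raise=False):
    """Return 'up' or 'down' for a dynamic barrier at time t."""
    if onlooker_trigger or cpu_raise:
        return "up"  # interactive and CPU triggers override the schedule
    phase = (t_seconds % period) / period
    return "up" if phase < 0.5 else "down"  # predefined timing cycle

print(barrier_state(2.0))                         # first half of cycle: up
print(barrier_state(6.0))                         # second half: down
print(barrier_state(6.0, onlooker_trigger=True))  # onlooker forces it up
```

Layering the overrides above the timing cycle reflects the text's three trigger sources: predefined functions, onlooker interaction, and CPU engagement.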
Further, the special-effects equipment and technologies may include the usage of sensors such as motion, vibration, temperature, position, cameras, timers, etc. that may improve gameplay and/or the experience. Further, the gameplay arena may include element markers. Further, the element markers may include a finish line, scoreboard, live video display of gameplay/players, or time/position markers, etc. to increase gameplay fun, competition, excitement, and viewing ability. Further, these position markers may be used by the preprogrammed RC units' control for navigation purposes and feedback to the gameplay engine. Further, the gameplay arena may be integrated with the RC Controller Containment Units (Game Pods) using a display system that allows onlookers to see the game pods and information like gamer tags/scores through displays, glass, and staging around the actual field of play. Further, the display system may include features like winner programming that activates fogged glass to display the winner at the end of match play. Further, the gameplay arena may tie into the Viewer Monitoring and Access Systems in conjunction with the Gameplay Server, Relay Station, and all or some of the other gameplay systems. Further, the viewer monitoring and access systems give the system the ability to use cameras, microphones, and other inputs like voiceover to display to viewers. Further, the viewer monitoring and access systems may be onsite and offsite of the gameplay location. Further, onsite viewing may include a viewer headset to see from the view of players (or users) in the gameplay match, displays around the arena, and even replays/score/ranking boards shown through displays. Further, offsite monitoring may allow viewers on online platforms like YouTube™, Twitch™, or other social media sites to watch gameplay, highlights, and replays.
Further, the gameplay arena may be thoroughly integrated with the conveyor system so as to hide it from the view of the players and onlookers, which keeps the magic alive and, as much as possible, does not let the players and onlookers see the inner mechanics. Further, the conveyor system may be associated with offshoots to and from the gameplay arena for the RC Units to be monitored, refueled, and deployed, among other actions. Further, safety is a vital aspect of the overall design of the system. Further, the safety system may be required to make sure viewers, technical staff, operators, and players are safe and to minimize the risk of injury. Further, the safety system may include critical safety failsafes and is integrated into the major functional areas of the gameplay system (or platform), such as the gameplay server, RC controller containment unit, RC unit, and gameplay arena. These systems could be integrated into other systems (such as the gameplay arena) and include power override shutoffs, water/flame-retardant sprinkler/dispenser systems, cooled battery storage areas, temperature-control fans, circuitry testing/monitoring, professional emergency wash/first aid stations, and protection screening for viewers and players. Further, the safety systems may vary depending on the gameplay type and system configuration but may maintain and exceed safety standards set by location-related regulations and operator/owner safety policies. Further, the safety systems may include external access points for professional technicians trained in the art and in safety procedures. Further, the one or more components that feature the highest risk (although still low overall), such as the RC Units, gameplay arena special effects, and batteries, may only be accessible by technicians trained in the art and safety procedures or, in special circumstances, under direct supervision.
Further, operational staff may be provided safety training, procedural documentation on operations such as checklists, and equipment for protection such as safety glasses and protective gloves.
As can be seen in
As can be seen in
As can be seen in
As can be seen in
As can be seen in
As can be seen in
As can be seen in
As can be seen in
As can be seen in
The computing device 1200 may have additional features or functionality. For example, computing device 1200 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, memory discs, SD cards or tape. Such additional storage may be a removable storage 1209 and a non-removable storage 1210. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. System memory 1204, removable storage 1209, and non-removable storage 1210 are all computer storage media examples (i.e., memory storage.) Computer storage media may include, but is not limited to, RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store information and which can be accessed by computing device 1200. Any such computer storage media may be part of the device 1200. The computing device 1200 may also have input device(s) 1212 such as a keyboard, a mouse, a pen, a sound input device, a touch input device, a location sensor, a camera, a biometric sensor, etc. Output device(s) 1214 such as a display, speakers, a printer, etc. may also be included. The aforementioned devices are examples and others may be used.
The computing device 1200 may also contain a communication connection 1216 that may allow device 1200 to communicate with other computing devices, such as over a network in a distributed computing environment, for example, an intranet or the Internet. Communication connection 1216 is one example of communication media.
Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media. The term computer readable media as used herein may include both storage media and communication media.
As stated above, a number of program modules and data files may be stored in the system memory 1204, including the operating system 1205. While executing on the processing unit 1202, programming modules 1206 (e.g., application 1220 such as a media player) may perform processes including, for example, one or more stages of methods, algorithms, systems, applications, servers, databases as described above. The aforementioned process is an example, and processing unit 1202 may perform other processes. Generally, consistent with embodiments of the disclosure, program modules may include routines, programs, components, data structures, and other types of structures that may perform particular tasks or that may implement particular abstract data types.
Moreover, embodiments of the disclosure may be practiced with other computer system configurations, including hand-held devices, general purpose graphics processor-based systems, multiprocessor systems, microprocessor-based or programmable consumer electronics, application specific integrated circuit-based electronics, minicomputers, mainframe computers, and the like. Embodiments of the disclosure may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Furthermore, embodiments of the disclosure may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. Embodiments of the disclosure may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, embodiments of the disclosure may be practiced within a general-purpose computer or in any other circuits or systems.
Embodiments of the disclosure, for example, may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process. The computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process. Accordingly, the present disclosure may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.). In other words, embodiments of the present disclosure may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. A computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. As more specific computer-readable medium examples (a non-exhaustive list), the computer-readable medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, and a portable compact disc read-only memory (CD-ROM). Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
Embodiments of the present disclosure, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the disclosure. The functions/acts noted in the blocks may occur out of the order as shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
While certain embodiments of the disclosure have been described, other embodiments may exist. Furthermore, although embodiments of the present disclosure have been described as being associated with data stored in memory and other storage mediums, data can also be stored on or read from other types of computer-readable media, such as secondary storage devices, like hard disks, solid state storage (e.g., USB drive), or a CD-ROM, a carrier wave from the Internet, or other forms of RAM or ROM. Further, the disclosed methods' stages may be modified in any manner, including by reordering stages and/or inserting or deleting stages, without departing from the disclosure.
Although the invention has been explained in relation to its preferred embodiment, it is to be understood that many other possible modifications and variations can be made without departing from the spirit and scope of the invention.
The current application claims priority to U.S. Provisional Patent Application Ser. No. 62/961,015, filed on Jan. 14, 2020.
Number | Date | Country
---|---|---
62961015 | Jan 2020 | US