The described embodiments relate to systems for coordinating and synchronizing interactive experiences shared between participants located at one or more locations. Some of the described embodiments relate to user interfaces for interactive experiences.
Gaming, educational and other shared experiences are increasingly delivered to people through networked computer systems. Some existing systems allow participants in shared experiences at different locations to simultaneously observe common information and other graphical elements. Other systems allow the delivery of survey questions and other simple interactive elements in a shared experience. However, these elements are typically delivered to all participants identically. In some systems, participants may be able to make simple inputs to the system based on the common display shown to all participants. The individual inputs from different participants are processed by the system and some rudimentary confirmation or response to the individual inputs may be provided, typically on the shared common display. However, these systems do not provide a customized experience for individual participants incorporating personalized displays and information for different participants. Furthermore, these systems typically allow only a small number of participants to use the system at a location, typically in the range of 10 or fewer participants.
Accordingly, there is a need for systems and methods that allow an interactive experience to be shared among participants located in one or more places, while allowing the participants to participate in a personalized or customized manner. For example, there is a need for gaming systems that allow players to access a customized display of personal or private information or use personal input devices to participate in the otherwise shared experience. In addition, there is a need for systems and methods that provide a customized or individualized experience for the participants as they participate in the interactive experience.
In a first aspect, some embodiments according to the invention provide a system with a plurality of nodes. The system includes a coordination node and a plurality of interactive nodes. Each interactive node is at a venue, which may be a public venue, a private venue or an individual venue. At each node, participants in interactive experiences provided by the system are able to view a main or shared display and a personal or private display. The main display at each interactive node contains information that is shared between some or all of the participants at the various interactive nodes. Each participant's personal display includes information that is specific to the participant and may also include other information, including information that is also displayed on a main display or on other participants' personal displays.
Some of the interactive nodes may include a local controller that communicates with the coordination node and one or more participant devices that communicate with the local controller. The local controller controls the main display at each such node. The local controller provides an interface between the participant devices and the coordination node.
Some interactive nodes may include a special purpose local controller that is intended primarily or solely for use within the system. For example, an interactive node at a public venue or location may include a purpose built local controller designed to communicate with a plurality of differing participant devices that include a screen on which the personal display may be shown. The participant devices may communicate with the local controller using a proprietary or non-proprietary protocol, or both.
Other interactive nodes may use a multi-purpose local controller, such as a gaming console, television adapter, television or satellite set-top-box, computer or any other processing device. Such a local controller may communicate with participant devices, including differing participant devices and potentially including purpose built participant devices, that communicate with the local controller using a proprietary or non-proprietary protocol, or both.
Some interactive nodes are individual nodes in which a participant uses a single personal device that acts as both a local controller and as a participant device. A main display and a personal display are shown to the participant. In various embodiments, the main display and the personal display may be shown simultaneously or alternately.
In some embodiments, some or all of the local controllers may be virtual local controllers that are instantiated at an interactive node or at a different location that is accessible to participant devices at the interactive node through a communication network. For example, the virtual local controller of an interactive node may be an instance of a software object, computer program or a computer program product that is installed and operated on a computing device that is accessible to participant devices at the interactive node. The virtual local controller may operate on a computing device that is at a location remote from the venue of the interactive node, but which is accessible to participant devices at the interactive node through a network. In some embodiments, the virtual local controller may operate on a computing device at the location of the central coordination node. In some embodiments, the virtual local controller may operate on the same computing device or computing system as the central coordination node of the system. In some embodiments, the virtual local controller may effectively be integrated with the coordination node such that there is no independent local controller, but rather a coordination node that communicates with a plurality of participant devices and also coordinates and synchronizes an interactive experience shared by participants using the participant devices.
Any particular embodiment may include one or more interactive nodes. The various interactive nodes may have the same configuration or may have different configurations.
The participants participate in a shared interactive experience that is coordinated for the participants by the system. The participant devices, local controllers and coordination node communicate through the exchange of messages. The messages include program update messages that provide information relating to participant inputs and updates describing changes in the state of the interactive experience. The messages synchronize the interactive experience allowing the actions of one participant to affect the experience of other participants.
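By way of non-limiting illustration, the following Java sketch shows one possible shape for these two message types; the class and field names are hypothetical assumptions and are not prescribed by the described embodiments:

```java
import java.io.Serializable;
import java.util.Map;

// Hypothetical shape of a program update message: sent by the coordination
// node toward interactive nodes to describe a change in the state of the
// shared interactive experience.
class ProgramUpdateMessage implements Serializable {
    final String experienceId;               // which interactive experience is updated
    final long sequenceNumber;               // ordering, used for synchronization
    final Map<String, Object> stateChanges;  // e.g. car positions in a racing game

    ProgramUpdateMessage(String experienceId, long sequenceNumber,
                         Map<String, Object> stateChanges) {
        this.experienceId = experienceId;
        this.sequenceNumber = sequenceNumber;
        this.stateChanges = stateChanges;
    }
}

// Hypothetical shape of a participant input message: sent from a participant
// device toward the coordination node to report an input entered by a participant.
class ParticipantInputMessage implements Serializable {
    final String experienceId;
    final String participantId;
    final String inputName;   // e.g. "steer", "accelerate", "answer"
    final Object inputValue;  // e.g. a steering angle or an answer choice

    ParticipantInputMessage(String experienceId, String participantId,
                            String inputName, Object inputValue) {
        this.experienceId = experienceId;
        this.participantId = participantId;
        this.inputName = inputName;
        this.inputValue = inputValue;
    }
}
```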
In some embodiments, the actions of a participant may not affect the experience directly, but may be taken into account by the system in delivering a personalized experience to each participant.
In another aspect, one or more configurable controllers are provided that may be used for interactive experiences. Each controller includes one or more controller interfaces that may be suitable for use with a variety of participant devices. Each controller interface may be adapted for use with the particular input devices, sensors and other features and characteristics of a particular type of device. The controller also includes one or more configuration files that may be used to configure a controller interface to operate in a particular manner, which may be suitable for use with one or more interactive experiences. Some configuration files may include a plurality of configurations that may be used during different parts of an interactive experience. Some controllers may be configured to allow a participant to personalize or customize a controller interface for the participant's use during an interactive experience.
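As a non-limiting sketch of how such a configuration might be represented in Java, assuming hypothetical names and fields throughout:

```java
import java.util.List;
import java.util.Map;

// Hypothetical representation of one entry in a controller configuration
// file. A configuration selects a controller interface (e.g. "button",
// "toss" or "gyroscope") and supplies the parameters that adapt it to a
// particular interactive experience or part of an experience.
class ControllerConfiguration {
    final String controllerInterface;      // which controller interface to configure
    final Map<String, String> parameters;  // e.g. button labels, sensitivity

    ControllerConfiguration(String controllerInterface,
                            Map<String, String> parameters) {
        this.controllerInterface = controllerInterface;
        this.parameters = parameters;
    }
}

// A configuration file may hold several configurations to be used during
// different parts of an interactive experience.
class ConfigurationFile {
    final List<ControllerConfiguration> phases;

    ConfigurationFile(List<ControllerConfiguration> phases) {
        this.phases = phases;
    }
}
```

For example, a racing experience might use new ControllerConfiguration("gyroscope", Map.of("sensitivity", "high")) during a race and a button configuration between races.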
In some embodiments, multiple controllers may be operable on a participant device simultaneously and a participant may be provided with inputs to select between controllers.
These and other aspects are further identified and described below.
Various embodiments of the present invention will now be described with reference to the drawings, in which:
a illustrates a personal display corresponding to the main display of
b illustrates a personal display corresponding to the main display of
a illustrates a screenshot of a button controller configured according to an example embodiment;
b illustrates a screenshot of a button controller configured according to another example embodiment;
a illustrates a screenshot of a button controller configured according to another example embodiment;
b illustrates a screenshot of a button controller configured according to another example embodiment;
a illustrates a screenshot of a toss controller configured according to an example embodiment;
b illustrates a screenshot of a toss controller configured according to another example embodiment;
c illustrates a screenshot of a toss controller configured according to another example embodiment;
d illustrates a screenshot of a toss controller configured according to another example embodiment;
a illustrates a screenshot of a toss controller according to a different example embodiment;
b illustrates a screenshot of a toss controller according to another example embodiment;
c illustrates a screenshot of a toss controller according to another example embodiment;
a illustrates a gyroscope controller according to an example embodiment;
b illustrates a gyroscope controller according to another example embodiment;
a illustrates an interaction system comprising an augmented reality controller according to an example embodiment;
b illustrates an interaction system comprising an augmented reality controller according to another example embodiment.
It will be appreciated that numerous specific details are set forth in order to provide an understanding of the exemplary embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practiced without these specific details. In some instances, well-known methods, procedures and components have not been described in detail so as not to obscure the embodiments described herein. Furthermore, this description is not to be considered as limiting the scope of the embodiments described herein in any way, but rather as merely describing the implementation of several example embodiments.
The embodiments of the systems and methods described herein, and their component nodes, devices and systems, may be implemented in hardware or software, or a combination of both. However, preferably, these embodiments are implemented in computer programs executing on programmable computers, each comprising at least one processor, a data storage system (including volatile and non-volatile memory and/or storage elements), at least one input device and at least one output device.
For example and without limitation, the various programmable computers may be personal computers, laptops, tablets, personal digital assistants, cellular telephones, smartphones, UMPC tablets, wireless hypermedia devices or any other data processing or computing devices. Program code is applied to input data to perform the functions described herein and generate output information. The output information is applied to one or more output devices, in known fashion.
Each program is preferably implemented in a high level procedural or object oriented programming and/or scripting language such as Flash or Java, for example, to communicate with a computer system. However, the programs can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language. Each such computer program is preferably stored on a non-transitory storage media or a device (e.g. ROM or magnetic diskette) readable by or accessible to a general or special purpose programmable computer, for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein. In various embodiments, the computer program may be stored locally or at a location distant from the computer in non-transitory storage media. In some embodiments, the computer program may be stored on a device accessible through a local area network (LAN) or a wide area network such as the Internet. The subject system may also be considered to be implemented as a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner to perform the functions described herein.
Furthermore, the system, processes and methods of the described embodiments are capable of being distributed in a computer program product comprising a computer readable medium that bears computer usable instructions for one or more processors. The medium may be provided in various forms, including one or more diskettes, compact disks, tapes, chips, wireline transmissions, satellite transmissions, Internet transmissions or downloads, magnetic and electronic storage media, digital and analog signals, network based storage and the like. The computer usable instructions may also be in various forms, including compiled and non-compiled code.
Reference is first made to
Reference is next made to
Typically, each participant device 126 will be a portable wireless computing device. Each participant device 126 includes a secondary display screen 127 and one or more input devices 128 such as a keypad, keyboard, touchscreen, button, scroll wheel, scroll ball, gyroscope, accelerometer, compass, level, orientation sensor, voice controller or a combination of such devices. Each participant device 126 is coupled to local controller 122 through local network 129. The participant devices 126 may be differing devices, such as multi-purpose devices including smartphones, cell phones or other portable computing devices, which are typically coupled to the local controller through wireless communication components of local network 129.
In other embodiments, the participant devices may be wired devices that are physically coupled to the local controller 122 through wired communication components of local network 129. Some participant devices may be mounted in a fixed position or fastened to a fixed location in the public location. For example, some participant devices may be secured to a seat or table to prevent theft of the participant devices. Such physically anchored or tethered participant devices may be coupled to the local controller through wired or wireless communication components of local network 129.
Primary display screen 124 is also coupled to local controller 122, which controls the display of data on the primary display screen 124. The primary display screen 124 is used to present a main display of information to all participants and to observers present in the public location. Local controller 122 is configured to control the display of information on the primary display screen 124 and on each of the participant devices 126. In some embodiments, there may be two or more primary screens positioned to allow participants and other persons in the venue to view one or more of the primary screens. Identical or similar main displays will typically be shown on all of the primary displays.
As used herein, the term “coupled” means that two or more devices are able to communicate such that data and other information can be transmitted between them. The coupling may be a physical coupling through cables, communications networks and devices or other devices. The coupling may also be a wireless coupling through a wireless communication protocol or a network. The coupling may also incorporate both physical and wireless couplings.
Public location 112 may be any location in which a plurality of members of the public may be present and view the primary display screen 124, such as a movie theatre, sporting facility, bar, restaurant or any other location in which a primary display may be visible to members of the public. Local controller 122 may be part of one or more public nodes 104 at a public location 112. For example, if the public location is a movie theatre having multiple auditoriums, some or all of the individual auditoriums may have a public node. The movie screen at the front of each such auditorium is used as a primary display screen and individual movie viewers may use participant devices to view individual information on a secondary screen and to provide inputs. A public node is provided in each auditorium. A local controller may be shared between two or more of the public nodes in the various auditoriums.
Reference is next made to
Local controller 132 is coupled to coordination node 102 through a local private location network 140, which, in this embodiment, is a wireless network, and through an ISP network 142 and network 110. ISP network 142 provides Internet access to devices such as the local controller 132 located at the private location 130. In other embodiments, the local controller 132 may be coupled to coordination node 102 through a wired coupling or through any other means for coupling computing devices.
Local controller 132 is also coupled to the participant devices 136. In a private node 106, the local controller 132 and the participant devices 136 may be designed specifically to interoperate with one another. For example, the local controller 132 may be a gaming console and the participant devices may be game controllers for use with the gaming console. For example, the local controller may be a Sony Playstation 3™, a Nintendo Wii™, a Microsoft XBOX 360™ or another such device or console such as a set-top television or satellite communication box or a computer. In other embodiments, the controller may be integrated into a display device such as a television or monitor or into another type of device capable of communicating with the coordination node and with the participant devices. For example, in some embodiments, the local controller may be an Internet television or video service device such as an Apple TV™ and the participant devices may be devices capable of communicating with the television or video service devices such as Apple iPhones™, iPods™ or iPads™.
Each of the gaming consoles or devices is capable of communicating with and receiving inputs from participant devices, which may be game controllers, designed for communication with the respective console or device. Each participant device 136, according to this embodiment, has a secondary display screen 144 and one or more input devices 146. In some embodiments, the participant devices 136 in a particular private node 106 may be essentially identical in construction. That is, the participant devices may have the same physical structure and controls, although the local controller 132 is able to independently communicate bi-directionally with each of the participant devices. In other embodiments, the participant devices may be of different physical structures, configurations or arrangements.
Local controller 132 controls the display of a main display on the primary display screen 134 and of personal displays on the secondary display screens 144 of the participant devices.
In some embodiments, the local controller 132 may be a virtual component that resides in a network or a device that may be coupled to the coordination node 102 and to the participant devices 136. For example, the local controller 132 may be a virtual component operating on a computer at the same location as the coordination node or at another location. In some embodiments, a virtual controller may be shared between different interactive nodes that are in different locations.
Reference is next made to
Each individual interactive node 108 is configured to operate as both a main display and as a participant device. In this specification, the term “participant device” includes an individual interactive node, unless specified otherwise, or unless dictated otherwise by the context.
The display screen 150 of an individual node 108 is used as both a primary display screen and as a secondary display screen. For some individual nodes or in some interactive experiences, this may be done by selecting a portion of the display screen 150 in which to display a main display (corresponding to the main display shown on primary display screens at public and private nodes) and a portion of the display screen 150 in which to display a private display (corresponding to the secondary display screens of participant devices used in public and private nodes). In some individual nodes or some interactive experiences, this may be done by displaying a main display on the display screen 150 at some times and a personal display on the display screen 150 at other times. A participant may be able to select between the main and personal displays. The two techniques may be combined in some individual nodes or some interactive experiences.
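A minimal sketch of these two techniques follows, assuming hypothetical type and method names; the rendering and layout placeholders stand in for device-specific logic:

```java
// Illustrative display modes for an individual node: the single screen 150
// may show the main display, the personal display, or both at once.
enum DisplayMode { MAIN, PERSONAL, SPLIT }

class IndividualNodeDisplay {
    private DisplayMode mode = DisplayMode.SPLIT;

    // A participant input may switch between the main and personal displays.
    void select(DisplayMode requested) { this.mode = requested; }

    // Render the appropriate display(s) on the single screen.
    void render() {
        switch (mode) {
            case MAIN:     drawMainDisplay(fullScreen());        break;
            case PERSONAL: drawPersonalDisplay(fullScreen());    break;
            case SPLIT:    drawMainDisplay(topPortion());
                           drawPersonalDisplay(bottomPortion()); break;
        }
    }

    // Placeholders standing in for actual layout and rendering logic.
    private Object fullScreen()    { return "full"; }
    private Object topPortion()    { return "top"; }
    private Object bottomPortion() { return "bottom"; }
    private void drawMainDisplay(Object region)     { /* draw shared content */ }
    private void drawPersonalDisplay(Object region) { /* draw private content */ }
}
```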
Individual node 108a has a variety of input devices 152 including a keypad, a control wheel, a control ball and various other buttons. Individual node 108b has several input devices 152 including a button and a touchscreen. Individual node 108b also has an orientation or tilt sensor that allows a participant to provide inputs by tilting or rotating the device and accelerometers that allow a participant to provide inputs by moving the device.
Each individual node 108 is coupled to the coordination node 102. In
The public nodes 104, private nodes 106 and individual nodes 108 may be referred to collectively as interactive nodes. In system 100, each interactive node is coupled to coordination node 102, although the communication networks and modes through which the interactive nodes are coupled to the coordination node 102 may vary.
System 100 allows participants using a variety of participant devices 126, 136, 108 to interactively participate in a shared experience. For example, system 100 may be used to allow participants to engage in a shared gaming, presentation, marketing, training, surveying or other interactive experience.
In some embodiments, system 100 is configured as a gaming system. In such configurations, a game is played by participants in at least two locations. At each location, each participant can view at least two displays: a main display that displays shared information and a personal display that includes information that is personal to the corresponding participant.
For example, the game may be a car racing game. An overhead view of a race track may be shown on the main display. Each participant controls one car that moves along the track. The participant can also view information specific to that participant's car or performance in the race on a personal device. For example, a participant's personal display may show the participant car and the track from the perspective of a driver inside the car. The participant's display is shown on a participant device, which also allows the participant to steer the car and to provide other inputs for the car racing game.
In some embodiments, system 100 may be configured as a betting or wagering system. The main display at each interactive node is used to display a video presentation such as a sporting event, a roulette wheel or a card dealer. Participants may view a variety of betting options on the personal display on their personal participant devices and may make bets on events in the video presentation. For example, participants may be able to bet on the outcome of the sporting event or on events that occur during the sporting event (such as the next team to score, the next penalty, the outcome of the next play, etc.), on the next number to be drawn at the roulette table, or on a card or hand to be dealt by the card dealer. Each participant is able to independently and privately access information about possible bets, make such bets and receive the results of such bets. Individual betting may be reflected in updates to the odds for some bets or in displays of the bets placed by participants or their outcomes.
In some embodiments, system 100 may be configured as an educational system or training system. Information may be presented to a group of participants at several locations. Each participant may view shared information presented on a main display and may also view private information on a personal display.
For example, in a training system, a series of slides may be presented on the main display that is shown to all participants. Some or all of the participants may also be presented with content specific to each respective participant on the personal display, such as a series of questions that each participant must answer. The personal display may allow participants to view and answer questions at the participant's own pace, or may display different questions to different participants. The personal display is shown on a participant device to each participant, who may use input devices on the participant device to answer questions or otherwise interactively participate in the training session.
Reference is next made to
A plurality of interactive programs are recorded in the program database 610. The interactive gaming and educational experiences described above are examples of experiences that may be provided by the interactive programs recorded in the program database 610. Each interactive program includes participant components that operate at the participant devices 126, 136 and 108 and may include central or core components that operate at the coordination node. In addition, some interactive programs may include local controller components that operate at some or all of the local controllers of the public and private nodes. The participant components, central components and local controller components are software objects or components that are executable on the respective devices of system 100.
Program control modules 614 operate within the coordination node 102 to coordinate a shared experience between participants located at various interactive nodes. Typically, each program control module 614 is a software object or component that executes on a processor within the coordination node. The processor has access to a non-transitory memory in which the program database 610, participant database 612 and system access applications 616 are recorded. One or more program control modules 614 may be active at any time to manage the operation of one or more interactive experiences.
System access applications 616 are software objects or components that are installed and operate on different participant devices. Each system access application allows a participant to use the respective participant device to view a personal display and to provide inputs using input devices on the participant device. In some embodiments, different system access applications may be provided for different participant devices or for the use of participant devices in different interactive nodes. For example, system access applications that operate on a Blackberry™ smartphone may differ from system access applications that operate on an Apple™ iPhone™ smartphone. Different system access applications may be provided for use of a particular smartphone (or other participant device) in different modes. For example, a different system access application may be operated on a participant device when the participant device is used as part of a public node 104, as part of a private node 106 or as an individual node 108. In some embodiments, a single system access application 616 may include modules and components that allow the system access application to operate in more than one mode.
A system access application 616 for use on an individual node 108 may include separate local controller software components that operate the individual node as a local controller and separate participant software components that operate the individual node as a participant device. The two distinct groups of software components may operate simultaneously and communicate with one another in the manner described herein in relation to local controller and participant devices at other interactive nodes. In other embodiments, a system access application for use at an individual node may include integrated software components that operate the individual node such that it communicates with the coordination node as a local controller and allows a participant to use the device as a participant device in an integrated manner.
The system access application 616 at an individual node 108 may produce a main display that is displayed as an alternative to or in conjunction with a personal display. The system access application may also provide control and communication services between the individual node 108 and the coordination node 102.
A plurality of participant records are stored in the participant database 612. In some embodiments, each participant that participates in an interactive experience using system 100 may be required to create an account or profile that is stored in a participant record. The participant records may include identification and authentication information; demographic and personal information about the participant; and program experience information for recording a participant's past success or progress in one or more programs.
Identification and authentication information may be used to allow a participant to securely access the participant's record.
Demographic and personal information may be used to provide personalized information to a participant. A participant may receive information on the participant's personal display based on the participant's previous performance in an interactive experience or based on the demographic or status information about the participant. For example, in an educational interactive experience directed to teaching employees about a new company initiative, employees may participate at various company and other locations. At each location, employees view common information on a main display. Each employee may receive customized information about the initiative in a personal display, based on the department in which the employee works.
Each program control module 614 manages one ongoing interactive experience at a time. Interactive nodes 104, 106 and 108 communicate with a program control module 614 to participate in the interactive experience. In other embodiments, a single program control module may manage more than one simultaneous ongoing interactive experience.
The operation of system 100 will now be explained with reference to an example gaming configuration of the system. The particular example is a car racing game in which individual participants at various public nodes, private node and individual nodes each control a virtual car as it moves around a track. Different cars controlled by different participants race around a track and the first participant to manoeuvre his or her virtual car around the track is the winner of the race.
Reference is made to
For example, in some embodiments, participants or other persons may be able to participate in a text chat, video chat or other interaction using system 100. Some components of the interaction may be displayed on the main displays shown at the interactive nodes. For example, text chat or instant messages sent by participants or other persons may be displayed. In some embodiments, text chatting or other services may be provided as a second interactive program contemporaneously with a first interactive program and components of both programs may be displayed on some or all of the main displays in the system. Participants in the respective interactive programs use their respective participant devices to participate in the respective interactive experiences.
At the same time, a main display on a private node 106 (
At each individual node 108 in the system, the respective participant may also view a main display and a personal display. At some individual nodes, the participant may switch the individual node device 108 between a primary display mode in which a main display is shown and a secondary display mode in which a personal display is shown. At some individual nodes, a composite display showing both a primary display and a personal display is shown.
Reference is next made to
b shows a different personal display 830 for a second participant in the example car racing game. Personal display 830 includes an image of the second participant's car 832 from an in-car perspective. Personal display 830 also includes the track 814, the second participant's position 836 in the race, speed 838 and options 840 that the second participant has during the race.
During a multi-player interactive experience, a main display is available for viewing by all participants. The specific main display shown to a particular participant may depend on the participant's location. In the case of a participant using an individual node, the main display available to the participant may depend on the participant's device or on the participant's preferences. Such options may be provided by the participant components of an interactive program. For example, some participant components may display a main display together with a personal display on the screen of a participant device. Other participant components may provide several configurations of a main display that may be displayed based on the participant's preferences. Similarly, local controller components at a private node may provide various alternative formats for a main display at the private node or a public node.
Reference is made to
Method 1000 begins in step 1002, in which a plurality of participants located at two or more locations are enrolled to participate in an interactive experience. To enroll, each participant activates a system access application 616. Participants located at a public or private node may be able to access the respective local controller 122 or 132 for the node using a participant device to download a system access application 616. For example, at a public node 104, instructions for accessing the respective local controller 122 may be displayed on the primary screen 124 of the public node. Participants may use a participant device 126 to access the local controller 122 and then download a system access application suitable for operations on the participant device.
At a private node 106, a system access application 616 suitable for use with the participant devices 136 may be pre-installed in the participant devices prior to their delivery to a retail customer. In some embodiments, a system access application 616 may be downloaded to the local controller 132 of the private node 106, and may then be installed on the participant devices from the local controller 132.
At an individual node 108, a system access application 616 may be installed on the individual node by downloading the system access application 616 from an application store or application service or from a computer or other device to which the individual node device may be coupled.
Each system access application allows a participant to communicate with the coordination node 102.
In a public node 104, the system access application 616 communicates with the coordination node 102 through the local controller 122 of the public node 104.
In this embodiment, in a private node 106, a participant device may not communicate directly with the coordination node. Instead, the participant device may communicate only with the local controller 132 of the private node, which then communicates with the coordination node. In some embodiments, a public node 104 may also have this configuration.
An individual node 108 is also a participant device which communicates with coordination node 102 directly (although typically through various communication network elements).
The coordination node 102 maintains a list of currently available interactive experiences during operation of the system 100. Some interactive experiences may be available to all participants, while others are available only to participants located at certain interactive nodes or certain types of interactive nodes. For example, some interactive experiences may be designed to last a relatively long time, exceeding the short period of time that a participant in a movie theatre may wait before the start of a movie. Such interactive experiences may not be available to participants accessing system 100 from a public node such as a movie theatre. Other participants at public nodes where patrons tend to participate in a shared experience for a longer period, such as participants accessing system 100 from a bar or other social establishment, may be permitted to participate in such an interactive experience. At some interactive nodes, all participants may be required to participate in the same interactive experience. For example, at a public or private node that has only a single primary display that is used to show a main display for a single interactive experience, all participants must participate in that interactive experience. In some embodiments, a primary display may be used to show a main display for two different interactive experiences on different parts of the primary screen.
Each participant activates the respective system access application on the participant's device 126, 136 or 108. The system access application obtains a list of currently available interactive experiences from the coordination node 102, based on the interactive node from which the participant has accessed the system 100. The list of interactive experiences available to the participant is displayed on the participant's device and the participant selects one of the experiences, thereby enrolling to participate in the selected interactive experience.
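The following Java sketch illustrates this enrollment step under stated assumptions; the interface and method names are hypothetical, and the actual protocol between a system access application and the coordination node is not limited to this form:

```java
import java.util.List;
import java.util.Scanner;

// Hypothetical client-side view of the coordination node's enrollment services.
interface CoordinationClient {
    List<String> availableExperiences(String interactiveNodeId);
    void enroll(String experienceId, String participantId);
}

class EnrollmentFlow {
    private final CoordinationClient coordinationNode;

    EnrollmentFlow(CoordinationClient coordinationNode) {
        this.coordinationNode = coordinationNode;
    }

    // Fetch the experiences available at this interactive node, present
    // them to the participant, and enroll the participant in the selection.
    void run(String nodeId, String participantId, Scanner input) {
        List<String> experiences = coordinationNode.availableExperiences(nodeId);
        for (int i = 0; i < experiences.size(); i++)
            System.out.println(i + ": " + experiences.get(i));
        int choice = input.nextInt();   // participant selects an experience
        coordinationNode.enroll(experiences.get(choice), participantId);
    }
}
```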
In other embodiments, participants may select an interactive experience directly under the control of their respective local controllers. Interactive experiences available at each interactive node may be recorded (in real time or in advance) in the respective local controller. Participant devices communicate with the local controller to present a list of interactive experiences available to a participant, who may then choose from the list.
Method 1000 proceeds to step 1004, in which any participant components required for the interactive experience are installed on the enrolled participant's device. If the participant's device has not previously been used for the interactive experience, then any participant components necessary for the participant's device to provide the interactive experience are transmitted and installed on the participant's device. If the participant components have previously been installed on the device, then outdated components may be updated with current participant components. The particular participant components installed on a particular participant device may be dependent on the features of the participant device, the particular interactive experience for which the participant has enrolled or both. For example, if a participant device has a touchscreen, an orientation sensor, an accelerometer or other input device, then the participant components installed on the participant device may be designed to allow a participant to use such input devices. The participant components may be transmitted from the coordination node, a local controller or from an asset server coupled to the interaction system.
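A simplified sketch of this capability-driven selection in step 1004, assuming a hypothetical component naming scheme:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Sketch of choosing which participant components to install based on the
// input devices a participant device reports. The component names are
// hypothetical; the specification does not define a component catalogue.
class ComponentInstaller {
    List<String> componentsFor(Set<String> deviceFeatures, String experienceId) {
        List<String> components = new ArrayList<>();
        components.add(experienceId + "-core");   // always required
        if (deviceFeatures.contains("touchscreen"))
            components.add(experienceId + "-touch-controls");
        if (deviceFeatures.contains("accelerometer"))
            components.add(experienceId + "-motion-controls");
        // Components are then transmitted from the coordination node,
        // a local controller or an asset server, as described above.
        return components;
    }
}
```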
Method 1000 then proceeds to step 1006, in which the local controller for the interactive node at which an enrolled player will participate in an interactive experience is updated, if necessary. Some interactive programs may include local controller components that operate on the local controllers at the interactive nodes 104, 106 and 108. Typically, although not necessarily, such local controller components may differ depending on the specific type of interactive node in which they will operate. For example, local controller components for a local controller 122 in a public node 104 may be configured differently than local controller components for a local controller 132 such as a gaming console in a private node 106. Similarly, local controller components for an individual node 108 may act as both a local controller and as a participant device and are typically configured for the specific type of participant device on which they will be used.
If the local controller components have not previously been installed on the respective local controller of the interactive node from which the newly enrolled participant has accessed the system 100, then the local controller components are installed. If the local controller components have previously been installed, they may be updated to reflect any changes in the local controller components.
The local controller components for different interactive programs may vary depending on the nature of the interactive program. For example, in the car racing game described above, the local controller components may include information about the virtual tracks and virtual cars in the game. For example, the program components for the racing game may include various core components relating to the control, display and interaction of vehicles that may be used by a participant in a race. Specific details of each vehicle, including characteristics that may be used by the core components to determine how the specific vehicle is controlled and displayed and how it interacts with other vehicles and other elements of the car racing program, may be provided as vehicle specific components. If a new vehicle is added to the program, then local controller components relating to the new vehicle may be uploaded to the local controller in this step. The core components use the new vehicle specific components to display and otherwise use the new vehicle in an interactive car racing experience. The local controller components may also include rules of the game and details of the information messages that will be exchanged between the coordination node, the local controller and the participant devices. In the case of an educational or survey interactive experience, the local controller components may include questions, slides or other information to be displayed on the main display of the interactive node or to be transmitted to and displayed on the participant devices at a local node.
Steps 1004 and 1006 allow program components for an interactive program to be updated at the local controller and participant devices. These steps are optional and may not be performed in some embodiments. For example, in some embodiments, a participant device may be updated independently of method 1000 in which a participant is able to participate in an interactive experience. Similarly, in some embodiments, local controllers may be updated during periodic updates (such as nightly or weekly updates) to add new components. In other embodiments, a limited number of interactive program components may be transmitted to a participant device during method 1000. For example, if a particular interactive program requires a graphic, computation or other asset or component, the asset may be transmitted to and installed on a participant device.
Different interactive experiences may permit or require a different number of participants to be enrolled. When an appropriate number of participants have enrolled in an interactive experience, method 1000 then proceeds to step 1008, in which the interactive experience is provided to the enrolled participants.
Reference is made to
The program control module 614 transmits program update messages 902 to each of the interactive nodes 104, 106 and 108 at which a participant in the shared experience is enrolled. The program update messages 902 may include a variety of messages including:
The program control module 614 also receives participant input messages 904 from the participant devices 126, 136 and 108. The participant input messages are generated based on inputs entered by a participant using input devices at the participant's device.
The participant components provide an interface for the participant to participate in the interactive experience. Depending on the interactive experience, the participant components may permit a participant to change the personal display shown on the secondary screen of the participant's device or to change input controls to those preferred by a participant.
For example, in the car racing game example, the participant components may provide various display perspectives or views from within, behind or ahead of the participant's car in the race. The participant may also be able to look ahead of or behind the participant's car. Other views may include an overhead view of the participant's car. Such inputs may be processed entirely by the participant components, which may be configured to generate and provide various personal displays on the secondary display of the participant's device.
Other participant inputs may affect the shared interactive experience for other participants. For example, some participant inputs may relate to the direction (i.e. a steering input) or speed (i.e. an accelerator input or a braking input) of the participant's car. Such inputs affect the position of the participant's car in the race. The participant components may process such inputs to modify the personal display on the participant's device. For example, the speed of the virtual car may be updated on the personal display by the participant components. Such inputs, or a variant of such inputs, are transmitted in participant input messages 904 to the local controller 122 or 132. The local controller may also process the participant inputs. For example, the local controller may modify the main display shown on the primary display at the interactive node. The local controller then transmits the participant input message 904 (or a copy or variant of it) to the corresponding program control module 614 in the coordination node 102.
At the coordination node, the program control module 614 receives the participant input message 904, determines the effect of the participant input on the shared interactive experience and takes one or more responsive actions. Such actions may include updating a player profile of the participant from whose participant device the participant input message originated, updating interactive experience information recorded by the program control module to record the state of the interactive experience or generating one or more program update messages 902 that are then sent to local controllers, or a combination of these actions. If the participant input message 904 is not relevant to the interactive experience (for example, where the message is received after the interactive experience has terminated), program control module may discard the participant input message 904.
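A simplified Java sketch of these responsive actions follows; all class and method names are hypothetical placeholders for experience-specific logic, not a definitive implementation:

```java
import java.util.List;

// Sketch of how a program control module might react to a participant
// input: discard stale inputs, update recorded experience state, then
// notify affected nodes and participants.
class ProgramControlModuleSketch {
    private final List<String> interactiveNodeIds;

    ProgramControlModuleSketch(List<String> interactiveNodeIds) {
        this.interactiveNodeIds = interactiveNodeIds;
    }

    void onParticipantInput(String participantId, String inputName, Object value) {
        if (!isRelevant(inputName)) return;   // e.g. the experience already ended
        updateExperienceState(participantId, inputName, value);

        // If the input changes the shared main display, notify every node.
        if (affectsMainDisplay(inputName))
            for (String nodeId : interactiveNodeIds)
                sendMainDisplayControlMessage(nodeId, inputName, value);

        // If the input affects other participants, notify their devices.
        for (String other : affectedParticipants(participantId, inputName))
            sendParticipantDeviceMessage(other, inputName, value);
    }

    // Placeholders for experience-specific logic and messaging.
    private boolean isRelevant(String inputName) { return true; }
    private void updateExperienceState(String p, String n, Object v) {}
    private boolean affectsMainDisplay(String inputName) { return true; }
    private List<String> affectedParticipants(String p, String n) { return List.of(); }
    private void sendMainDisplayControlMessage(String nodeId, String n, Object v) {}
    private void sendParticipantDeviceMessage(String participantId, String n, Object v) {}
}
```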
The program control module may process and react to a participant input message 904 in various manners, including the following:

If the participant input affects the main display of the interactive experience at various interactive nodes, the program control module 614 determines the modification required to the main display and transmits a main display control message to the local controller at each interactive node identifying the modification. In various embodiments, the main display control message may identify all of the content of the main display, may identify only components that are to be changed in the main display, or may provide information that allows the local controller at the respective interactive nodes to generate a main display.

If the participant input affects another participant's interactive experience, the program control module 614 transmits a participant device message to the local controller of the interactive node at which the other participant is accessing system 100. The local controller passes the participant device message to the appropriate participant device. The participant device message may provide various types of information to a participant device:
An interactive experience is provided to participants primarily in step 1008. Typically, an interactive program ends if certain end-of-experience conditions are met. For example, in a gaming interactive experience, the game may end if a participant or team of participants wins the game, if a selected time period expires or if another end-of-experience condition is met. In the case of a survey, educational or other interactive experience in which different participants are viewing a common main screen and independently answering questions on a personal display, the experience may end when the participants have answered all of the questions, at the end of a program displayed on the main screen, after a selected time period, or when a selected percentage or number of participants have completed a selected percentage or number of questions or other activities. In the case of a betting interactive experience in which the participants are viewing a video program on the common main display and concurrently placing bets based on events shown in the video program, the interactive experience may end when the video program ends.
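A minimal sketch of such end-of-experience checking, using hypothetical condition fields that mirror the examples above:

```java
// Illustrative end-of-experience check; which conditions apply depends on
// the interactive program, and these fields are assumptions for the sketch.
class EndOfExperienceChecker {
    final long startMillis = System.currentTimeMillis();
    final long timeLimitMillis;       // selected time period, if any
    final int totalQuestions;         // for survey/educational experiences
    int questionsAnswered;            // updated as participants answer
    boolean winnerDeclared;           // set when a participant or team wins

    EndOfExperienceChecker(long timeLimitMillis, int totalQuestions) {
        this.timeLimitMillis = timeLimitMillis;
        this.totalQuestions = totalQuestions;
    }

    boolean experienceEnded() {
        boolean timeExpired = System.currentTimeMillis() - startMillis > timeLimitMillis;
        boolean allAnswered = totalQuestions > 0 && questionsAnswered >= totalQuestions;
        return winnerDeclared || timeExpired || allAnswered;
    }
}
```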
When the end-of-experience conditions are met, method 1000 proceeds to step 1010. In step 1010, program control module 614 transmits a program update message to all local controllers and to each individual node indicating that the interactive experience is ended. The local controllers transmit a corresponding program update message to each participant device at each public and private node.
The local controller may update the main display to reflect an outcome of the interactive experience. For example, the main display may be updated to identify the winner of a gaming interactive experience, to display a summary of an interactive experience or simply to indicate that the interactive experience has ended.
Similarly, participant components of the interactive program may display the outcome of an interactive experience for the participant, such as a message indicating the end of an interactive experience on the personal display shown on a secondary display screen of a participant device.
During step 1008, some interactive experience control messages may be transmitted only within an interactive node. For example, if an interactive experience control message indicates a change in the state of a game that is relevant only to one participant or only to participants at the interactive node from which the message originates, it may not be transmitted by the local controller of that node to the coordination node. In some embodiments, a local controller may transmit only information that is relevant to the coordination node or to participants at other interactive nodes in an interactive experience control message.
In some embodiments, the local controllers or the coordination node or both may modify interactive program control messages such that only information that is relevant to participants at an interactive node is sent to that node. This may reduce the number and size of interactive program control messages, allowing an interactive experience to be synchronized more quickly or with the use of less communication bandwidth or both.
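For illustration, such filtering might be sketched as follows, assuming (hypothetically) that state entries are keyed by participant identifier:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Sketch of relevance filtering: before a control message is sent to an
// interactive node, entries that do not concern participants at that node
// are dropped, reducing message size and bandwidth use.
class MessageFilter {
    Map<String, Object> filterForNode(Map<String, Object> fullState,
                                      Set<String> participantsAtNode) {
        Map<String, Object> relevant = new HashMap<>();
        for (Map.Entry<String, Object> entry : fullState.entrySet())
            if (participantsAtNode.contains(entry.getKey()))
                relevant.put(entry.getKey(), entry.getValue());
        return relevant;   // only this subset is transmitted to the node
    }
}
```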
Method 1000 then ends.
Optionally, method 1000 may be performed repetitively, allowing the interactive experience to be repeated.
Method 1000 provides an interactive experience to a plurality of participants located in disparate locations. Each participant shares the same interactive experience and views common information on a main display. Simultaneously, each participant has a personal display shown on the participant's personal device that provides a rich graphical experience that is personal to the individual participant.
Some interactive experiences may permit participants to join or leave an interactive experience while the experience is ongoing. For example, in some betting interactive experiences, such as some poker experiences, participants may be able to join and leave the interactive experience individually, with the interactive experience continuing before and after a particular participant participates in the interactive experience.
In an interactive experience in which a participant may join after the interactive experience has started, a participant may complete steps 1002 and 1004 independently. Step 1006 may not be required in such a situation, particularly if the local controller used by the newly enrolled participant is also in use by other participants.
In an interactive experience in which a participant may leave or be removed from the experience before it ends for other participants, a departing participant may move to step 1010 while other participants continue in the interactive experience in step 1008.
In some interactive experiences, a participant device may not require updates in step 1008. For example, in some interactive experiences, all components required for a participant to participate in the experience may be delivered in step 1004 and it may not be necessary to transmit update messages to the participant devices during step 1008. In such experiences, update messages are transmitted to the coordination node based on inputs from participants. The coordination node then transmits corresponding update messages to the interactive nodes allowing the local controllers to update the respective main displays.
Reference is made to
In some embodiments, each interactive node may be a public node 104. In other embodiments, each interactive node may be a private node 106. In other embodiments, different combinations of public, private and individual nodes may be permitted.
Reference is made to
System 1100 includes a coordination node 1102, one or more public nodes 1104 (only one of which is illustrated), one or more private nodes 1106 (only one of which is illustrated) and one or more individual nodes 1108 (only one of which is illustrated).
System 1100 includes a coordination framework that includes central coordination components 1150, local coordination components 1154 and participant coordination components 1156.
The interactive programs stored in the coordination node 1102 include central components 1162, local controller components 1164 and participant components 1166.
When system 1100 is used to provide an interactive experience using a particular interactive program, the components of system 1100 operate as follows.
At the coordination node 1102, the central components operate with a program control module 1114. The program control module 1114 operates with the central coordination components 1150. The central components of the interactive program provide functions and services that are specific to the interactive experience or to the interactive program. The program control module manages the coordination of the interactive experience for all participants in the interactive experience at the various participant nodes, including management of the main display at each interactive node, the personal display at each participant device and the processing of participant inputs received from each participant device. The central coordination components 1150 may provide communication and other services to the program control module 1114 and the central components. In some embodiments, the program control module 1114 may be combined with the central coordination components 1150 such that an integrated program control module provides the functions of both a program control module and the central coordination components.
At each local controller 1122, 1132, local controller components 1164 operate with the local coordination components 1154. The local controller components 1164 provide services and functions that are specific to the interactive experience or the interactive program. The local coordination components 1154 may provide communication and other services. The local coordination components also manage the main display shown on the primary screen in a public or private node.
At each participant device 1126 or 1136, participant components 1166 operate with participant coordination components 1156. The participant components 1166 provide services or functions that are specific to the interactive experience or interactive program. The participant coordination components 1156 may provide communication and other services to the participant components 1166.
Typically, the coordination framework provides coordination services that are common to a plurality of interactive programs. In such embodiments, the interactive programs may rely on the coordination framework for coordination services, allowing developers of the interactive programs to limit interactive programs and their respective components to software, data and other content that is specific to the interactive experience provided by the interactive program. Coordination services that are required by a plurality of interactive programs are provided by the coordination framework. This may reduce the size of the local and participant components that must be installed respectively on local controller and participant devices before an interactive experience can be provided. It may also serve to make interactive experiences more uniform, allowing participants to more easily participate in new interactive experiences using previously acquired knowledge and skills.
A coordination framework may provide various services.
In some embodiments, the coordination framework may provide internode communication services. For example, coordination components 1150, 1154 and 1156 may provide a message or data passing service that allows interactive program components 1162, 1164 and 1166 to communicate with one another. The coordination components communicate with one another. The interactive program components communicate with the respective coordination components installed at the same nodes, and communicate indirectly with one another through the coordination components.
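By way of illustration only, such a message passing service might resemble the following sketch; the JSON envelope, class name and method names are assumptions introduced for illustration and do not appear in the described embodiments:

```python
import json
import queue

class CoordinationComponent:
    """Illustrative sketch of a framework coordination component that relays
    messages between interactive program components at different nodes.
    Names, envelope format and routing scheme are assumptions."""

    def __init__(self, node_id: str, transport):
        self.node_id = node_id
        self.transport = transport   # injected socket-like object with write()
        self.inbox = queue.Queue()   # messages destined for local program components

    def send(self, dest_node: str, payload: dict) -> None:
        # Program components call send(); they never touch the network directly.
        envelope = {"src": self.node_id, "dst": dest_node, "body": payload}
        self.transport.write(json.dumps(envelope).encode())

    def on_receive(self, raw: bytes) -> None:
        # The transport delivers raw bytes; the framework unwraps the envelope
        # and hands the body to the local interactive program component.
        envelope = json.loads(raw.decode())
        if envelope["dst"] == self.node_id:
            self.inbox.put(envelope["body"])
```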
In some embodiments, the coordination framework may provide participant account services. For example, the central coordination components may interface with a participant database stored in the coordination node. The central coordination components may provide details from a participant's account to an interactive application, either directly to a central component or through other coordination framework components to a local controller component or to a participant device component of an interactive program. The interactive program component may use the information from the participant's account to personalize or modify the participant's experience. In addition, the interactive application may provide updated information for a participant's account to the central coordination component to be stored in the participant's account. Such updated account information may be recorded in the participant database.
The coordination framework may also provide account creation services. Participant coordination components installed on the participant devices may include an account creation function. When a participant accesses system 1100 using either a system access application or a participant component of an interactive application, the participant may wish to create an account. The participant coordination components may include an account creation module that collects the information required for a participant account, and then forwards such information to the central coordination components. The central coordination components may then create a new account for the participant in the participant database.
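The account creation flow might be sketched as follows. The use of SQLite, the table layout and the function name are illustrative assumptions only and are not part of the described embodiments:

```python
import sqlite3

def create_participant_account(db: sqlite3.Connection, name: str, email: str) -> int:
    """Sketch of the central coordination components creating a new account
    in the participant database from details collected on a participant device."""
    cur = db.execute(
        "INSERT INTO participants (name, email) VALUES (?, ?)", (name, email)
    )
    db.commit()
    return cur.lastrowid  # account id that could be returned to the device

# Usage sketch: the participant coordination components would collect the
# details on the device and forward them; here we simulate that locally.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE participants (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
account_id = create_participant_account(db, "Alice", "alice@example.com")
```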
In some embodiments, the coordination framework may provide device interface services. For example, participant coordination components may interface with input devices built into or attached to a participant device. The participant coordination components may convert various types of inputs received from various types of input devices into a consistent set of inputs that are then provided to the participant components, local controller components and central components of an interactive application. This allows the same or similar participant components to be installed on participant devices regardless of their different input devices. Other differences in the participant devices may still require different participant components to be installed on different participant devices.
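A minimal sketch of such input normalization follows; the event kinds, field names and coordinate convention are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class NormalizedInput:
    """Device-independent input event handed to interactive program components."""
    kind: str   # e.g. "select", "move", "cancel" (assumed vocabulary)
    x: float    # normalized 0..1 coordinates, where applicable
    y: float

def normalize_touch(px: int, py: int, width: int, height: int) -> NormalizedInput:
    # A touchscreen tap becomes a generic "select" at normalized coordinates.
    return NormalizedInput("select", px / width, py / height)

def normalize_trackball_click(cursor_x: float, cursor_y: float) -> NormalizedInput:
    # A trackball click at a cursor position maps to the same "select" event,
    # so the same participant components work on both kinds of device.
    return NormalizedInput("select", cursor_x, cursor_y)
```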
In some embodiments, the coordination framework may provide content delivery services that allow content for an interactive experience to be pushed from the coordination node to local controllers and participant devices at interactive nodes. For example, an interactive program may use the coordination framework to push media components for an interactive experience to the interactive nodes at the start of or during an interactive experience.
In some embodiments, the coordination framework may provide participant interaction services. For example, the coordination framework may provide video chat, voice chat, multimedia messaging and social media interfaces (such as an interface to automatically transmit information to or using Facebook™ or Twitter™). In various cases, the participant devices may be configured to access third party assets that are not part of the original interactive experiences. For example, the participant device may be configured to access assets, such as images or pictures, from social media websites. The participant device may also be configured to access assets from the local memory of the participant device. The coordination framework may enable the participants to access third party assets and add them to the interactive experience. In some cases, the participants may select the assets and toss them, for example by using a toss controller as discussed below, onto the secondary display. The coordination framework may provide these inputs to the local controller and the coordination node so that they become part of the interactive experience.
In some embodiments, the coordination framework may provide a reward system. For example, the coordination framework or an interactive application may reward participants for participating or succeeding in various interactive experiences. A participant's interactive experience may be varied based on the rewards earned by the participant. Typically, the participant's earned rewards will be recorded in the participant's account record in the coordination node. The participant's reward status may be provided to an interactive application as described above in relation to account services.
In some embodiments, the reward system may provide coupons, incentives or other information to participants. In some embodiments, participant preferences may be recorded with a participant's account. A participant's preferences may be used to provide a more customized experience to the participant, including the provision of in-game and other advertising, coupons and other information.
In some embodiments, the coordination framework may provide graphical and physics processing services. For example, the coordination framework may provide mathematical algorithms and routines that calculate outcomes for events such as collisions, scene management, graphic layering and other processing intensive activities, eliminating the need for the components of an interactive program to include such algorithms and routines. Like other services provided by the coordination framework, these services may be invoked by components of the interactive applications, reducing the need to include them in the interactive application components.
In some embodiments, the coordination framework may provide positioning services. For example, the participant coordination components in a coordination framework may use positioning devices such as global positioning system (GPS) sensors, Wi-Fi (802.11) antennas and other devices built into a participant device to estimate the location of the participant device. The position may be provided to an interactive program to allow a participant's experience to be customized based on the participant's location.
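One deliberately simplified way to combine such position sources is sketched below; the fallback ordering and the averaging of known Wi-Fi access point locations are illustrative assumptions, not described techniques:

```python
def estimate_position(gps_fix, wifi_fixes):
    """Sketch: prefer a GPS fix when available, otherwise average the known
    locations of visible Wi-Fi access points. Inputs are (lat, lon) tuples."""
    if gps_fix is not None:
        return gps_fix
    if wifi_fixes:
        lat = sum(p[0] for p in wifi_fixes) / len(wifi_fixes)
        lon = sum(p[1] for p in wifi_fixes) / len(wifi_fixes)
        return (lat, lon)
    return None  # no estimate; the interactive program falls back to defaults
```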
In some embodiments, various participants may be organized into teams. For example, in the car racing example, participants may be organized into a first team and a second team such that one team wins if a specified condition is met. The program control module in such embodiments tracks the membership of participants in each team. The personal displays shown to members of each team may include information that is relevant to the entire team. In this way, the participants on one team are able to share information that is not provided to the other team. In some embodiments, all participants at a particular node may be on the same team. In such embodiments, the main display shown at the node may include information to be shown to the team.
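The program control module's team bookkeeping might be sketched as follows; the class, method names and visibility test are illustrative assumptions:

```python
class TeamRegistry:
    """Sketch of team tracking: which team each participant belongs to, and
    which team-private information a given participant's display may show."""

    def __init__(self):
        self.team_of = {}  # participant_id -> team name

    def assign(self, participant_id: str, team: str) -> None:
        self.team_of[participant_id] = team

    def teammates(self, participant_id: str):
        team = self.team_of.get(participant_id)
        return [p for p, t in self.team_of.items() if t == team]

    def visible_to(self, participant_id: str, info_team: str) -> bool:
        # Team-private information appears only on members' personal displays.
        return self.team_of.get(participant_id) == info_team

registry = TeamRegistry()
registry.assign("p1", "red")
registry.assign("p2", "red")
registry.assign("p3", "blue")
assert registry.visible_to("p1", "red") and not registry.visible_to("p3", "red")
```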
In systems 100 and 1100, three types of interactive nodes are described: public nodes, private nodes and individual nodes. In some embodiments, only public nodes may be provided. In other embodiments, only private nodes may be provided. In other embodiments, only public and private nodes may be provided. In other embodiments, only individual nodes may be provided. In some embodiments, only public and individual nodes may be provided. In some embodiments, only private and individual nodes may be provided. In each case, a participant at any node is able to see a main display that contains information that is also shown on other main displays and a personal display that contains information specific to that participant.
Reference is next made to FIG. 12, which illustrates a system 1200.
System 1200 includes a coordination node 1202, one or more public nodes 1204 (only one of which is illustrated), one or more private nodes 1206 (only one of which is illustrated) and one or more individual nodes 1208 (only one of which is illustrated).
Public node 1204a does not include a local controller. Coordination node 1202 includes an interactive node control module 1222. Interactive node control module 1222 includes interactive node control components 1264. The interactive node control components 1264 communicate with a primary display screen 1234a at public node 1204a and also with one or more participant devices 1226. The interactive node control components 1264 provide, for public node 1204a, the functions described above in relation to the local controllers of public nodes 104 and 1104.
Similarly, private node 1206a does not have a local controller. Instead, the interactive node control components 1264 in the interactive node control module 1222 provide the functions of a local controller of private nodes 106 and 1106.
Individual node 1208a also does not have a local controller. Instead, the interactive node control components 1264 in the interactive node control module 1222 provide the functions of a local controller of individual nodes 108 and 1108.
In system 1200, the interactive node control components 1264 in the coordination node 1202 operate as a virtual local controller for some or all of the interactive nodes in the system. For interactive nodes that utilize the interactive node control components 1264, the interactive node control components control a main display at each interactive node and communicate with and control each participant device at the interactive node.
In some embodiments, the interactive node control module 1222 may be integrated with other components in the coordination node. For example, interactive node control module 1222 may be integrated with a program control module 1214. In an embodiment that includes a coordination framework, the interactive node control module 1222 may alternatively or additionally be integrated with the central coordination components. In such embodiments, control of the main display and participant devices at each interactive node is provided by the integrated module.
In various embodiments, the interactive node control module 1222 may operate in the same or a different location, or on the same or a different computing device, than the coordination node. For example, in some embodiments, the interactive node control module may operate at a node within network 1210 and may communicate with the coordination node and with interactive nodes through the network. Some embodiments may include more than one interactive node control module, with each interactive node control module controlling the operation of one or more interactive nodes.
In some embodiments, it may be desirable to provide one or more controller configuration modules that allow a participant device to be configured to operate in a particular manner to receive inputs from a participant. For example, it may be desirable to provide a configurable controller at a participant device that can be configured to provide different input controls, such as buttons, for use during an interactive experience. The controller can be configured to display a set of buttons and other controls that operate in a particular manner to allow a participant to enter information or otherwise provide inputs for an interactive experience.
Reference is next made to FIG. 14. FIG. 14a illustrates an example configuration of button controller 1470 into five regions 1474 and 1476a-d. Grid elements in region 1474, which includes a group of non-contiguous portions of the secondary display screen 1427, are identified by a value of 0 in each grid element. Grid elements in regions 1476a-d are identified by corresponding values of 1, 2, 3 or 4 in each grid element of each region.
When no grid element in any of regions 1476 is pressed, a graphic under the heading “Button_Up” is displayed in each region. In this example, the secondary display on the participant device displays a bright red graphic overlying the grid elements in region 1, a bright green graphic overlying the grid elements in region 2, a bright blue graphic overlying the grid elements in region 3 and a bright yellow graphic overlying the grid elements in region 4. When any one of the grid elements in a region is touched by a participant, a corresponding “Button_Pressed” graphic is displayed in that region. For example, if any grid element in region 2 is touched, a graphic in the file Green_Dark.gif is displayed.
When a grid element in a region is touched, an action under the heading On_Click is triggered. In this example, if a grid element in region 3 is touched by a participant, an action titled Click_Blue is triggered. In this example, no actions are triggered when a participant stops touching a button. In other cases, actions may be defined for additional aspects of the operation of a button. For example, display and action properties for a button may be defined for a click-and-hold gesture, in which a participant touches and holds a button for a defined time. Other gestures for which display and action properties may be defined include double-click, swiping, touch-and-hold, tap-and-then-hold, multifinger gestures and any other gestures or actions that the participant device is capable of sensing.
For region 1474, no display images or operations are defined, essentially making region 1474 an inactive or null region.
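The configuration described above might be expressed in a configuration file resembling the following sketch. Aside from the Green_Dark.gif file named above, the file names, field names and dispatch logic are illustrative assumptions:

```python
# Hypothetical button controller configuration; only Green_Dark.gif appears
# in the description above, the remaining names are assumed for illustration.
BUTTON_CONFIG = {
    "grid": {"rows": 10, "cols": 8},  # grid elements mapped onto regions
    "regions": {
        0: {"role": "null"},          # region 1474: inactive, nothing defined
        1: {"button_up": "Red_Bright.gif", "button_pressed": "Red_Dark.gif",
            "on_click": "Click_Red"},
        2: {"button_up": "Green_Bright.gif", "button_pressed": "Green_Dark.gif",
            "on_click": "Click_Green"},
        3: {"button_up": "Blue_Bright.gif", "button_pressed": "Blue_Dark.gif",
            "on_click": "Click_Blue"},
        4: {"button_up": "Yellow_Bright.gif", "button_pressed": "Yellow_Dark.gif",
            "on_click": "Click_Yellow"},
    },
}

def handle_touch(region_id: int) -> None:
    """Dispatch the On_Click action for a touched grid element."""
    region = BUTTON_CONFIG["regions"].get(region_id, {})
    action = region.get("on_click")
    if action:  # null regions define no action and are ignored
        print(f"display {region['button_pressed']}; trigger {action}")

handle_touch(3)  # touching any grid element in region 3 triggers Click_Blue
```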
Some regions or portions of the secondary display screen may be provided in a button controller for specific purposes, such as providing instructions to a participant or providing information such as a score.
Reference is next made to FIG. 15, which illustrates a configuration of button controller 1470 as a musical instrument.
In this configuration, the appearance of each region remains constant when the corresponding grid elements are touched by a participant. When a grid element is touched a corresponding sound file based on a selected instrument is played until the participant stops touching the grid element. Regions 10, 11 and 12 trigger actions that change the particular sound files that will be played when any of regions 1 to 9 are touched by a participant. Region 13 provides a sustain function that results in a music file continuing to play even after the key that triggered the file is released. In effect, the sustain function suspends the “End Sound” action specified for the release of regions 1 to 9. Region 14 is configured as a text region to encourage a participant to play music using the controller. In some embodiments, the configuration file may include various options that affect the way in which a sound is played. For example, if a greater number of grid elements corresponding to a region are touched, then the corresponding sound file may be played at a louder volume. The pitch, timbre, attack, sustain, decay and other characteristics for each region may be defined in the configuration file and may vary depending on how a participant touches the various grid elements using various gestures.
Button controller 1470 may be configured in many other ways to provide different combinations and arrangements of regions, which may correspond to buttons when viewed by a participant. A participant may interact with the buttons with various gestures such as touching, holding, sliding, releasing and other gestures, in order to trigger corresponding actions.
Reference is next made to FIG. 16, which illustrates a system 1600.
Button controller 1470 includes one or more button controller interface components 1480. In some embodiments, each button controller interface component is part of a system access application 1616 that is recorded in a non-transitory memory at the coordination node. Each button controller interface component 1480 is configured to operate on one or more particular types of participant devices. For example, if a particular system access application is configured to operate on an Apple iPhone 4, then the button controller interface component 1480 in that system access application is correspondingly configured to operate on an Apple iPhone 4. This will typically require that the system access application and the button controller interface component are consistent with software, interface and other standards for the Apple iPhone 4. As described above, system 1600 may include a variety of system access applications corresponding to a variety of types of participant devices. Some or all of the system access applications may include a button controller interface component that is configured to operate on the corresponding type of participant device.
Button controller 1470 further includes one or more button controller configuration files 1482. A button controller configuration file 1482 configures a button controller interface component 1480 to operate in a specific manner, as described above. In various embodiments, a button controller configuration file 1482 may be adapted to configure one or more button controller interface components. For example, a common button controller configuration file may be used to configure a group of button controller interface components, provided for different types of participant devices, to appear and operate in the same or a similar manner.
During operation of system 1600 to provide certain interactive experiences to a group of participants using participant devices, a button controller interface component corresponding to each participant device is installed on the participant device. Each button controller interface component 1480 is configured to operate in a desired manner using a corresponding button controller configuration file 1482.
In some embodiments, a suitable button controller interface component may be installed at each participant device 1626, 1636, 1608 as part of a system access application 1616, as described above in relation to step 1002 of method 1000 (FIG. 10).
The specific button controller configuration file required to appropriately configure a button controller interface component may be specified in an interactive application. During step 1004 of method 1000 (FIG. 10), the specified configuration file may be transmitted to the participant device and used to configure the button controller interface component installed there.
In some embodiments, the participant device may discard a controller configuration file when an interactive experience ends. In such embodiments, the controller configuration file is transmitted to the participant device at the start of each corresponding interactive experience.
In this manner, the button controller may be installed on a variety of participant devices as part of a system access application in the form of an unconfigured button controller interface component, which is then configured as desired for a variety of interactive experiences. Creators of the interactive experiences may utilize existing configurations for the button controller or may provide a button controller configuration file that is specific to their interactive experiences. In some embodiments, such button controller configuration files may be provided in or with an interactive experience program. In some interactive programs, the configuration of a button controller may be varied during the program, or a participant may be permitted to choose between a variety of predetermined configurations or to design a personal configuration. Such options and selections made by a user may be recorded in a button controller configuration file stored on the participant's device. In some embodiments, multiple configuration files may be used simultaneously to configure a button controller. For example, some aspects of a button controller may be configured based on a system provided configuration file that is specified in an interactive program while other aspects of the configuration are provided in a user specific configuration file stored on the participant device.
In some embodiments, a controller may be reconfigured dynamically during an interactive experience. The controller configuration file may include multiple configurations that can be interchanged during an interactive experience. In some embodiments, multiple configuration files may be transmitted to a participant device and an interactive program may specify which controller configuration file, and which part of a controller configuration file is to be used to configure a controller at any particular time. In some embodiments, controller configuration files and associated assets may be delivered to a participant device during an interactive experience, thereby adding to the number of configurations in which a controller may be used during an interactive experience.
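Such dynamic reconfiguration might be sketched as follows; the configuration names, structure and rendering stub are illustrative assumptions:

```python
class ButtonControllerInterface:
    """Sketch of a controller interface that can be re-configured during an
    interactive experience. All names and structure are assumptions."""

    def __init__(self, configurations: dict):
        self.configurations = configurations  # name -> configuration dict
        self.active = None

    def activate(self, name: str) -> None:
        # The interactive program specifies which configuration to apply;
        # switching redraws the regions on the secondary display.
        self.active = self.configurations[name]
        self.redraw()

    def add_configuration(self, name: str, config: dict) -> None:
        # New configurations (and their assets) may arrive mid-experience.
        self.configurations[name] = config

    def redraw(self) -> None:
        print(f"rendering {len(self.active['regions'])} regions")

controller = ButtonControllerInterface({"lobby": {"regions": {1: {}}}})
controller.activate("lobby")
controller.add_configuration("race", {"regions": {1: {}, 2: {}, 3: {}}})
controller.activate("race")  # dynamic switch during the interactive experience
```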
Reference is next made to FIG. 17, which illustrates an example toss controller.
The toss controller has a variety of configurable characteristics, which can be controlled by different gestures.
In response to a participant's use of the toss controller, various data are transmitted to a coordination node from the participant's device, depending on the configuration of the toss controller. For example, a participant may be able to use a gesture to toss or throw an object with greater or lesser speed, or impart spin to the tossed object by using a gesture or by touching the object in a particular manner or position. The toss controller configuration file may be used to define output data or parameters from the toss controller interface. Output data may be statically determined based on the participant's use of the configured toss controller interface displayed on the participant's device or may be dynamically determined. For example, the energy with which a paintball or other object is thrown may be dynamically determined by the length of time a participant holds the object before releasing, the speed with which the participant swipes the object across or along the secondary display screen on the participant's device or in another manner.
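A dynamically determined toss output might be computed along the following illustrative lines; the specific formula, clamping and field names are assumptions rather than part of the described embodiments:

```python
import math

def toss_output(touch_down, touch_up):
    """Sketch of dynamically determined toss output data. Each argument is
    (x, y, timestamp); energy grows with hold time and swipe speed."""
    x0, y0, t0 = touch_down
    x1, y1, t1 = touch_up
    hold = max(t1 - t0, 1e-3)                # seconds the object was held
    distance = math.hypot(x1 - x0, y1 - y0)  # swipe length on the screen
    speed = distance / hold
    return {
        "angle": math.atan2(y1 - y0, x1 - x0),          # toss direction
        "energy": min(1.0, 0.3 * hold + 0.7 * speed),   # clamped to 0..1
    }

# A fast upward swipe held briefly produces a high-energy toss.
print(toss_output((0.5, 0.9, 0.0), (0.5, 0.2, 0.25)))
```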
The output data from the controller corresponds to the participant's inputs to the controller. The central components of an interactive program receive the output data and determine the resulting action or outcome in the interactive program. In some embodiments, the central components may record the output data or a version of the output data.
Output data from a controller may be transmitted from a participant device to a local controller or to a coordination node in the same manner as other information that is transmitted during the operation of an interactive program.
The toss controller, like the button controller, can be configured or skinned to appear and to operate differently.
Reference is next made to FIG. 18.
The toss controller may be used for both the soccer ball kicking configuration shown in FIG. 18 and the paintball tossing configuration described above.
Reference is next made to FIG. 19, which illustrates a gyroscope controller 1910.
Gyroscope controller 1910 is an example of a controller that uses input devices in a participant device other than a touchscreen or a button or cursor (i.e. trackball or control wheel) interface. Various embodiments of controllers may allow any type of input device or sensor in a participant device to be configured for use with an interactive experience. For example, temperature sensors, humidity sensors, light sensors, proximity sensors, external sensors coupled to a participant device through a wired or wireless coupling and any other type of sensor may be configured for use with an interactive experience.
In some embodiments, several controllers may be operative at a participant device simultaneously. For example, in some embodiments, a gyroscope controller may provide for sensing and reporting of data only from a gyroscope sensor (or other orientation detection sensor). A button controller may be operative at the same time as a gyroscope controller at a participant device to provide buttons and sliders to allow a participant to provide inputs using virtual buttons on a touchscreen or using physical buttons on the participant device. Output data may be combined by participant components and transmitted to central components of an interactive program or output data from the different controllers may be independently transmitted to the central components.
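The combined-transmission variant might be sketched as follows, with the field names and sample values being illustrative assumptions:

```python
def combine_controller_outputs(gyro_data: dict, button_data: dict) -> dict:
    """Sketch: participant components merge output from two simultaneously
    active controllers into one message for the central components."""
    return {
        "orientation": gyro_data,  # e.g. {"pitch": ..., "roll": ..., "yaw": ...}
        "buttons": button_data,    # e.g. {"fire": True, "throttle": 0.6}
    }

message = combine_controller_outputs(
    {"pitch": 3.0, "roll": -12.5, "yaw": 0.4},
    {"fire": True, "throttle": 0.6},
)
# Alternatively, each controller could transmit independently; this sketch
# shows only the combined-transmission variant described in the text.
```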
Reference is next made to FIG. 20, which illustrates a system 2002.
Participant device 2020b illustrates a trace controller 2030. Trace controller configuration files are used to configure trace controller interfaces installed at various participant devices, as with the other controllers described herein.
The trace controller may be used in a variety of interactive experiences. In system 2002, the trace controller is used to generate drawings for the Graffiti interactive experience. The trace controller may also be configured for other interactive experiences. For example, the trace controller could be configured for a document markup interactive experience. Trace controllers on various participant devices at one or more interactive nodes may be configured to display a text or other document underlying a drawing region, and the various participants may be able to view and mark up the document to suggest changes or for other reasons. The participants may simultaneously be engaged in a text chat, voice chat or other live communication that allows them to discuss various markups and thus simultaneously and interactively mark up the document.
A trace controller configuration file is used to configure the trace controller for various interactive experiences. In an unconfigured form, the trace controller may have a defined trace drawing region and may provide various palette, navigation or control tools, all of which may be configured in the trace controller configuration file.
Participant device 2020c illustrates a word controller 2050. Word controller configuration files are used to configure word controller interfaces installed at various participant devices, as with the other controllers described herein.
The word controller may be used in a variety of interactive experiences. For example, the word controller could be configured for a document editing interactive experience. Word controllers on various participant devices at one or more interactive nodes may be configured to edit a text document. Various participants may be able to view and edit the document to suggest changes or for other reasons. The participants may simultaneously be engaged in a text chat, voice chat or other live communication that allows them to discuss various edits and thus simultaneously and interactively edit the document. In some embodiments, the trace and word controllers may be combined or may be used simultaneously by various participants to simultaneously mark up and edit a text document.
A word controller configuration file is used to configure the word controller for various interactive experiences. In an unconfigured form, the word controller may have a text editing region and may provide various palette, navigation or control tools, all of which may be configured in the word controller configuration file.
System 2002 illustrates the simultaneous use of the splat, trace and word controllers in a common Graffiti interactive experience. In some embodiments, the participant components of the interactive experience may provide one or more controls to allow a participant to select different controllers for use during an interactive experience. For example, a participant may wish to switch between adding drawings to the virtual wall and throwing paintballs onto the virtual wall to obscure drawings added by other participants.
Reference is next made to FIG. 21, which illustrates a system 2102.
Participant device 2126 illustrates an augmented reality controller 2120. Most participant devices include a photo/video camera and a viewfinder that allows the participant device to display what is in front of the camera. In some other cases, the participant device may include other ways of detecting or capturing what is in front of the participant device. The augmented reality controller 2120 may use images from the camera to determine the position and orientation of the participant device relative to a main display 2110. For example, the main display may include registration marks or elements that can be detected in an image taken by the camera of a participant device. The augmented reality controller may identify the registration marks or elements in the image to determine the position and orientation of the participant device relative to the main display. In other embodiments, the augmented reality controller may use any portion of a main display to determine the position and orientation of a participant device.
Augmented reality controller 2120 allows for superimposition of virtual content on top of the content displayed on the main display 2110 when the participant device is held up to view the main display 2110. For example, during an interactive experience, if a participant holds up their participant device configured with augmented reality controller 2120 to view the main display 2110, the secondary display of the participant device may include some or all of the content seen on the main display 2110 as well as additional content customized for the participant device.
The augmented reality controller 2120 includes one or more augmented reality controller interface components that are configured to operate on one or more particular types of participant devices.
The augmented reality controller 2120 further includes one or more augmented reality controller configuration files. The configuration files configure the augmented reality controller interface components to operate in a specific manner. During operation of system 2102 to provide certain interactive experiences to a group of participants using participant devices, an augmented reality controller interface component corresponding to each participant device is installed on the participant device. Each augmented reality interface component is configured to operate in a desired manner using a corresponding augmented reality controller configuration file.
The configuration file may configure the augmented reality controller to detect when a participant device 2126 is held up to view the main display 2110. This may be based on factors such as, for example, the spatial coordinates and orientation of the participant device with respect to the main display, which may be determined in various manners, including the use of elements of the main display as described above. In some embodiments, the participant device may detect that it is held up to view the main display, and communicate a request to enter augmented reality to the local controller. In some other embodiments, the spatial coordinates and the orientation of the participant device may be communicated to the local controller, where the local controller determines whether or not the participant device has been held up in the acceptable range of spatial coordinates and orientation to view the main display and whether augmented reality can be entered.
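An acceptable-range test of the kind described might be sketched as follows; the pitch threshold and the required number of visible registration marks are illustrative assumptions:

```python
def is_held_up(pitch_deg: float, markers_found: int,
               pitch_min: float = 60.0, markers_needed: int = 2) -> bool:
    """Sketch of an acceptable-range test for entering augmented reality.
    The device must be tilted toward vertical and enough registration marks
    of the main display must be visible in the camera image; the thresholds
    are assumptions and would come from the controller configuration file."""
    return pitch_deg >= pitch_min and markers_found >= markers_needed

# The participant device (or the local controller, if raw coordinates are
# forwarded) evaluates the test and, if it passes, enters augmented reality.
if is_held_up(pitch_deg=72.0, markers_found=3):
    print("request to enter augmented reality")
```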
When augmented reality is entered, the display of the participant device displays additional content superimposed on top of the content seen on the main display 2110. In various cases, the additional content is customized for the participant device.
FIG. 21a illustrates a main display showing an extra-terrestrial spaceship that is part of a shoot-out interactive experience.
FIG. 21b illustrates the simultaneous use of an augmented reality controller 2120 and a toss controller 2130. The participant can toss bullets from a gun or missile onto the superimposed alien targets. The bullets may be moved by the participant by touching gun or missile options, and tossing them towards the alien targets. In some other embodiments, other controllers, such as button controllers, may be used along with the augmented reality controllers to shoot at the alien targets. In some further embodiments, a voice controller (not shown) may be simultaneously used with the augmented reality controller 2120. The voice controller may enable the participants to fire weapons by speaking into the corresponding participant device. For example, the participant may say "Fire Gun" or "Fire Missile" to cause the participant device to fire bullets from a gun or a missile at the alien targets.
Typically, the central components of the interactive program, which coordinate the interactions between participants in an interactive experience and the respective displays on the main displays and secondary displays on participant devices, will provide instructions to the participant components on each participant device to control and allow for the participant's participation in the interactive experience. In the present example, the central components receive data about the shooting of the alien targets. The central components communicate with the participant components at each participant node to show the outcome of the respective player's movements.
The controllers described above have been described in the context of multi-location interaction systems. In various embodiments, the controllers may be used in an interaction system that is operable at a single interactive location only, such as a movie theater or sporting venue where all participants are in a single location sharing a common main display (or multiple main displays that are positioned to allow participants in different locations in the venue to see one of the main displays, such as main displays on different sides of a scoreboard in a sporting venue).
Various controllers may permit control of the frequency at which output data from a controller interface at a participant device is transmitted to central coordination components for an interactive experience at a coordination node. For example, in system 2002, drawings and text from the trace controller and the word controller, respectively, are transmitted when the respective Submit buttons are touched. In other embodiments, the components of a drawing or a text object may be transmitted as they are created, or periodically (such as every 100 or 500 ms or every few seconds), such that viewers of the main display can observe drawings and text objects as they are created or modified. Various actions may be used to trigger the transmission of output data relating to some or all of the inputs provided at a controller interface, including the use of a submit button, the expiry of a time period (such as every few seconds or minutes or any other time period) or the provision of any input (such as a change in the position of a participant device when the gyroscope controller is used). In some embodiments, the central components of an interactive program may query a controller interface at a participant device to obtain updated output data from the participant device. In some embodiments, some or all of these update triggers may be combined.
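The combination of update triggers might be sketched as follows; the class structure, the 500 ms default period and the method names are illustrative assumptions:

```python
import time

class OutputBuffer:
    """Sketch of combined update triggers: output data is flushed upstream
    on submit, on a periodic timer, or on an explicit query."""

    def __init__(self, send, period_s: float = 0.5):
        self.send = send          # callable that transmits a batch upstream
        self.period_s = period_s  # periodic trigger, e.g. every 500 ms
        self.pending = []
        self.last_flush = time.monotonic()

    def add(self, event: dict) -> None:
        self.pending.append(event)
        if time.monotonic() - self.last_flush >= self.period_s:
            self.flush()          # periodic trigger

    def on_submit(self) -> None:
        self.flush()              # explicit trigger: the Submit button

    def on_query(self) -> None:
        self.flush()              # central components query for updates

    def flush(self) -> None:
        if self.pending:
            self.send(self.pending)
            self.pending = []
        self.last_flush = time.monotonic()

buffer = OutputBuffer(send=print)
buffer.add({"stroke": [(0.1, 0.2), (0.3, 0.4)]})
buffer.on_submit()  # transmits the pending drawing components
```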
Each of the controllers described above may be installed on a participant device and configured as described in relation to the button controller and the other controllers. The features of the various controllers may be combined to form new controllers or hybrid controllers. In some cases, an interactive program may allow multiple controllers to be used to participate in an interactive experience. For example, a participant that prefers to use a touchscreen may use a controller that is configured to provide buttons and other inputs on a screen while a participant who prefers to use physical buttons or physical movement of a participant device may use a suitably configured controller for the same interactive experience. All of these controllers could be provided for the same participant device, allowing a participant to use a controller that the participant prefers for a particular interactive experience. The use of controllers at the participant devices, under the control of interactive programs, allows producers of interactive programs to make use of one or more pre-designed controllers that can be configured to provide specific input and output functions required for the interactive programs.
Reference is next made to FIG. 22, which illustrates an operator console 2220.
The operator console may be further configured to determine the course of the interactive experience, i.e. the manner in which the interactive experience progresses or evolves. For example, the operator console may be configured to start, interrupt or end the interactive experience based on participant feedback.
The operator console may be further configured to determine which interactive experience to initiate, such as, for example, which game to launch for playing, or which advertisement to launch for viewing etc.
The operator console may also determine when to poll the participants to receive feedback on the interactive experiences. The participant feedback may be displayed in real-time, or the poll results may be aggregated and displayed at a later time. The operator console may also select certain participant feedback for display to some or all participants.
Operator console 2220 is coupled to the coordination node 2202 directly or indirectly through network 2210. As previously mentioned, the term “coupled” means that two or more devices are able to communicate such that data and other information can be transmitted between them. The operator console 2220 is configured to make determinations regarding the interactive experience and communicate them to the coordination node 2202. Based on the determinations, the course of the interactive experience may be interrupted, altered or allowed to continue.
For example, in a movie theater venue, the operator console 2220 may determine the popularity of the ongoing interactive experience. The popularity of the ongoing experience may be determined based on certain factors, such as, for example, the number of new participants joining the experience, the number of participants leaving the experience and the type of feedback received from the participants. The operator console 2220 may determine that the current interactive experience is not very popular with the participants. In response, the operator console 2220 may cause the experience to change by, for example, shortening the interactive experience, introducing opportunities within the experience to win rewards or switching to the scoreboard to motivate the participants. The operator console 2220 communicates decisions regarding the selected course of the interactive experience to the coordination node 2202. The coordination node 2202 coordinates and synchronizes the interactive experience shared by the interactive nodes.
The operator console 2220 may be deployed in the cloud, such as, for example, a public cloud, a private cloud or a hybrid cloud, and configured to control all downstream interactive experiences.
Reference is next made to FIG. 23, which illustrates an operator console 2320 deployed at a public node.
The public node includes a local controller 2322, an operator console 2320, a primary screen 2324 and a plurality of participant devices 2304a, 2304b and 2304c. Local controller 2322 is coupled to coordination node 2302, directly or indirectly, through network 2310. Local controller 2322 is coupled to the participant devices via a local network 2309 available at the public venue. The local network 2309 may be a wireless network such as a Wi-Fi network, a Bluetooth network or any other type of communication network or system. The operator console 2320 is coupled to the local controller 2322 either directly or indirectly through the local network 2309.
The operator console 2320 may determine the course of the interactive experience by making determinations specific to the particular node in which it is deployed. For example, in a movie theater venue, the operator console 2320 may determine the direction of the interactive experience by determining which game to initiate. This may be determined based on factors, such as, for example, gender distribution of the participants, age group of the participants etc.
The operator console 2320 communicates the decided course of the interactive experience to the local controller 2322. The local controller 2322 may synchronize the interactive experience with the coordination node 2302, and control the display of the primary screen 2324 and participant devices 2304a, 2304b and 2304c.
Reference is next made to FIG. 24, which illustrates a further deployment of an operator console 2420.
Although not shown, the operator console 2420 can be similarly deployed in other types of interactive nodes, such as, for example, a private or individual node.
The operator console 2420 may be configured to determine the course of the interactive experience by determining when the participant devices 2404a, 2404b and 2404c and/or the primary screen 2424 display the scoreboard or when the participants are polled for feedback. As previously mentioned, the operator console 2420 may also decide when to stop the interactive experience, which interactive experience to start and when to interrupt the interactive experience. In some embodiments, an interaction system comprises more than one operator console. The multiple operator consoles may be deployed at the same location or at different locations within the interaction system. For example, in some cases, one operator console may be coupled to the coordination node, as described above, while another operator console is coupled to a local controller at an interactive node.
The present invention has been described here by way of example only. Various modifications and variations may be made to these exemplary embodiments without departing from the spirit and scope of the invention.
Related application data: provisional application No. 61545984, filed October 2011 (US); parent application No. PCT/CA2012/000938, filed October 2012 (US); child application No. 14249980 (US).