There are many ways in which a user can interact with software. Typically a user controls software via a keyboard and mouse or a touch screen, and for computer games a user may use a games controller (which may be handheld or may detect body movement). The user input device used depends upon the platform on which the game is being played (e.g. computer, games console or handheld device). A number of computer games have been developed in which gameplay is enabled (or unlocked) through the use of physical character toys which are placed on a custom base connected via a USB lead to a games console. Placing different toys on the custom base enables different gameplay.
The embodiments described below are not limited to implementations which solve any or all of the disadvantages of known interactive software experiences and apparatus for interacting with interactive software experiences.
The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements or delineate the scope of the specification. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
Methods of interacting with a story in a virtual world through manipulation of physical play pieces are described. An interactive software experience presents an interactive story to a user where the direction (and/or progression) of the story depends on user actions with physical play pieces. In an embodiment these actions are sensed by the physical play pieces themselves and sensed input data is communicated to the interactive software experience from the play pieces. The interactive story comprises one or more branching points at which there are a number of possible outcomes and one of the possible outcomes is selected at a branching point based on the sensed input data. The interactive story is presented to the user, for example using sounds and/or images.
Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
Like reference numerals are used to designate like parts in the accompanying drawings.
The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
An interactive software experience is described below which comprises an interactive story (e.g. an interactive adventure). The interactive story comprises one or more pre-defined branching points, i.e. points in the story where the story line can take one of a number of different paths, and both the position of the branching points along the story line and the possible outcomes (i.e. the different paths) may be pre-defined. The term ‘interactive story’ is used herein to refer to an interactive software experience which provides limited user interaction at pre-defined branching points and which, in between the branching points, provides segments of audio and/or images (e.g. video or still images) during which there is no user interaction (e.g. where the user hears and/or views the story). An interactive story is therefore different from a computer game, which allows (and requires) much more interaction and is not limited to a small number of interaction points (the pre-defined branching points).
The software receives sensed inputs which correspond to a user's action(s) with a physical play piece (which may also be referred to as a game piece), such as lifting up the piece or moving a part of the play piece (e.g. where the play piece has movable limbs). At a pre-defined branching point (and in various examples, at each pre-defined branching point), the received sensed inputs are used to determine the path that is taken (i.e. to select a path from the possible paths at the pre-defined branching point). At a branching point, only one of the possible paths going forward can be selected and therefore can be taken; although in some examples more than one branching point may occur at the same time (e.g. a branching point relating to the action of a first character and a branching point associated with a second character), with one outcome being taken from each branching point and the combination of outcomes defining the subsequent direction of the interactive story. The story is presented to the user (e.g. using sound and/or images) via a computing device which may be separate from the physical play pieces or integrated into one of the physical play pieces.
For example, in an interactive story about a battle between two knights, a user may have a play piece that represents the first knight and a play piece that represents a second knight. At a pre-defined branching point in the interactive story, the winner of a battle may be determined based on which knight the user is currently (or was most recently) holding, manipulating, or otherwise interacting with (as detected based on received sensed input data) and the story may progress based on this determination. For example, the knight which is lying down (e.g. horizontal) may be the loser and the knight which remains upright (e.g. vertical) may be the winner. In various examples, a weapon that is held by the winning knight (e.g. a toy weapon) may be represented visually in the continuation of the story (to maintain continuity between the physical play and the interactive story).
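By way of illustration only, the selection of an outcome at a pre-defined branching point (such as in the knight battle just described) may be sketched as follows. This is a minimal Python sketch, not a definitive implementation; the names (Outcome, BranchingPoint, select_outcome) and the example actions and segment names are hypothetical.

from dataclasses import dataclass

@dataclass
class Outcome:
    # One possible path onward from a branching point.
    action: str    # the sensed user action that selects this path
    segment: str   # the story segment presented if this path is selected

@dataclass
class BranchingPoint:
    # A pre-defined point at which the story can take one of several paths.
    outcomes: list

def select_outcome(branch, sensed_action):
    # Select the single outcome whose expected action matches the sensed
    # input; only one path going forward is taken from a branching point.
    for outcome in branch.outcomes:
        if outcome.action == sensed_action:
            return outcome
    return branch.outcomes[0]   # no matching input: take a default path

# Example: the battle between two knights described above.
battle = BranchingPoint(outcomes=[
    Outcome(action="first_knight_upright", segment="first_knight_wins"),
    Outcome(action="second_knight_upright", segment="second_knight_wins"),
])
print(select_outcome(battle, "second_knight_upright").segment)

In this sketch the sensed action is a simple string; a real system would carry richer sensed input data (as described below).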
As the inputs to the interactive story are by way of user actions with a physical play piece, the play pieces and system offer a new user input device and method. Such devices/methods make it easier for a user to interact with the interactive software and may be particularly suited to less dexterous users (e.g. younger or more elderly users) who may find a mouse, keyboard or games controller difficult to manipulate or control. Furthermore, through use of a combination of an interactive story and physical play pieces (which may be shaped to represent characters, objects or environments in the interactive story), the overall user experience is enhanced, with the resulting experience spanning both the virtual world and the real world.
As the direction taken by the interactive story (and hence the progression of the story) is dependent upon user actions (with a physical play piece), when a user plays the same interactive story again the outcome of the story is likely to be different. This increases the re-usability of the interactive story (e.g. compared to television programs where the outcome within an episode is always the same).
The individual physical play pieces which form the sets 103-104 may be shaped to represent a character, object or environment within the interactive story 106, such that the play piece looks the same as (or similar to) the corresponding entity in the interactive story (where this is presented in graphical form to the user).
In various examples, the physical play pieces may be arranged to act both as physical play pieces and beads which fit onto a connecting element (e.g. to form a bracelet, necklace or other fashion or wearable item). Such play pieces may comprise a hole through the piece to enable them to be threaded onto the connecting element or other means (e.g. a clip) to attach them onto the connecting element.
In the first example play system 101 shown in FIG. 1, the physical play pieces are active pieces: each piece senses user actions itself (e.g. using one or more integrated sensors) and communicates the sensed input data to the interactive story 106.
In the second example play system 102 shown in FIG. 1, the physical play pieces are passive pieces and user actions with the pieces are sensed by a sensing device 116 within (or connected to) the computing device on which the interactive story 106 runs.
In a further example play system, the set of physical play pieces may comprise one or more active pieces and one or more passive pieces. In such an example, the active play pieces may detect their own motion and communicate with the interactive story and the motion of the passive pieces may be sensed by the sensing device 116 within the computing device or by a sensing device in a proximate active piece (which then communicates the sensed action to the interactive story 106).
In the story shown in FIG. 2, the story line runs from a pre-defined start point to an end point via a number of pre-defined branching points 202, at each of which the story can take one of a number of different paths.
Although the first representation 200 in FIG. 2 shows a story with a single start point and a single end point, an interactive story may alternatively have more than one pre-defined end point (as described below).
In a first implementation, the method proceeds as indicated by arrow 408, with the next segment of the interactive story being presented to the user following the selection of the outcome at a pre-defined branching point. For example, referring back to FIG. 2, the segment corresponding to the outcome selected at one branching point 202 is presented to the user before sensed inputs are used to select an outcome at the next branching point 202.
In a second implementation, the method proceeds as indicated by dotted arrow 410. In this second implementation, all the sensed inputs are received (in block 402) and the outcomes determined (in block 404) prior to presenting any of the interactive story segments to the user (in block 406). In the second implementation, the story may be influenced by another game or activity that was played previously. There may also be a time delay and/or location change between the receiving of the sensed inputs (in block 402) and the presenting of the interactive story segments (in block 406). In various examples, the audience may also change: for example, two children may play together (with the sensed inputs being received at that time) and subsequently watch the resulting interactive story, and also share it with a relative or friend who is remote from them (e.g. living in another house, town, country, etc.).
Further implementations may comprise a combination of the first and second implementations described above, with some of the outcomes being determined (in block 404) before starting to present the interactive story segments (as in the second implementation) and other outcomes being determined later (as in the first implementation).
In various examples, if no suitable input is received (e.g. no input is received or none of the inputs corresponds to any of the available outcomes) to enable selection of an outcome (in block 404), an outcome may be selected automatically. The automatically selected outcome may be chosen in any way, including for example a fixed (default) outcome, a random outcome, or a cyclical selection of one of the available outcomes (cycling over the course of subsequent executions of the interactive story).
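A minimal sketch of such automatic selection is given below (in Python; the strategy labels and function names are hypothetical and used purely for illustration):

import random
from itertools import cycle

def make_fallback_selector(outcomes, strategy="default"):
    # Returns a function which selects an outcome automatically when no
    # suitable input is received at a branching point.
    cyclic = cycle(outcomes)
    def select():
        if strategy == "default":
            return outcomes[0]          # fixed (default) outcome
        if strategy == "random":
            return random.choice(outcomes)
        return next(cyclic)             # cyclical selection
    return select

fallback = make_fallback_selector(["castle", "beach", "snowy"], strategy="cyclic")
print(fallback(), fallback(), fallback())   # castle beach snowy

In this sketch the cycle position is held in memory; to cycle over subsequent executions of the interactive story (as described above), a real implementation would persist the cycle position between executions.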
Where the first implementation is used (in its entirety or in part), there is more user engagement during the interactive story playback than where the second implementation is used. Depending upon the type of interactive story, this may make the first implementation more educational. For example, a user may be asked a question at a pre-defined branching point and then, depending upon their reaction (e.g. during a pause), an outcome may be selected. This enables the user to engage with the portions of the story (e.g. video) that they are watching.
The segments of the interactive story which are presented to the user may comprise sound and/or images, i.e. audio and/or visual effects, and in various examples other effects such as haptic feedback, smells, etc. For example, a segment may be an audio clip or a video clip (which may include a sound track or may be silent). In various examples the segments comprise pre-recorded (or pre-created) audio/video clips and in such examples a user may not be able to interact with the interactive story except at the pre-defined branching points 202. In other examples, however, the segments may not be pre-recorded but may be generated dynamically (e.g. dependent upon the particular play pieces within a user's play set) and again a user may not be able to interact with the interactive story except at the pre-defined branching points. In various examples, although a segment may not be pre-recorded/pre-created, it may be generated based on a pre-defined story section (e.g. a pre-defined description of what happens in the segment) and a characteristic of one or more play pieces, where the characteristic may be pre-defined (e.g. an environment which corresponds to a base play piece 122 or a character which corresponds to a figurine play piece 120) or linked to an external source (e.g. the user, a real world place, a TV show, etc. as described in more detail below). In such examples, a user may also not be able to interact with the interactive story except at the pre-defined branching points.
When making a selection (in block 404) based on a sensed input, the selection may be made based on inputs sensed (or sensed inputs received) during presentation of the previous segment (e.g. as described above with reference to the first implementation example) or based on sensed inputs which were received prior to presenting any of the story to the user (e.g. as described above with reference to the second implementation example). In various examples, the interactive story may store some or all of the sensed inputs (block 412) such that future interactive stories presented to the user are based, in part, on sensed inputs from previous interactive stories. This enables stories to develop over a period of time based on a longer history of user behavior (e.g. in the form of user actions with play pieces).
In addition to or instead of storing sensed inputs for use in future stories (as described above), sensed inputs or presented segments for a story may be stored (in block 412) to enable an interactive story to be replayed subsequently in response to a user input (block 414). When replaying an interactive story (in block 414) there may be no user interaction with the story (i.e. any sensed inputs received would not affect the interactive story which is being replayed). This replay feature may, for example, enable a user to rewind through a story and replay a part again. In various examples, a user may be able to rewind through a story to a previous branching point and then start to interact with the story once more (as indicated by dotted arrow 416). In such an example, a user may be able to explore different possible outcomes for an interactive story (e.g. by interacting differently with the play pieces, so that subsequent selections in block 404 differ from those made originally).
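One possible arrangement for storing such a history to support replay and rewind is sketched below (a Python sketch; the StoryHistory class and its methods are hypothetical):

class StoryHistory:
    # Stores presented segments (block 412) so a story can be replayed
    # (block 414) or rewound to a previous branching point (arrow 416).
    def __init__(self):
        self.segments = []       # segments in presentation order
        self.branch_marks = []   # index into segments at each branching point

    def record(self, segment, at_branching_point=False):
        if at_branching_point:
            self.branch_marks.append(len(self.segments))
        self.segments.append(segment)

    def replay(self):
        # Replay ignores any sensed inputs: the stored story is returned as-is.
        return list(self.segments)

    def rewind_to_branch(self, n):
        # Discard everything from the n-th branching point onwards so the
        # user can interact again and explore a different outcome.
        cut = self.branch_marks[n]
        self.segments = self.segments[:cut]
        self.branch_marks = self.branch_marks[:n]
        return self.segments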
Although FIG. 4 shows the method blocks in a particular order, the blocks may be performed in other orders (e.g. as described above with reference to the first and second implementations) and the method may be repeated for each pre-defined branching point in the interactive story.
As described above, the physical play pieces (e.g. in the sets 103-104 shown in FIG. 1) may be shaped to represent a character, object or environment within the interactive story and, in various examples, the particular play pieces being used by a user may affect the selection of an outcome (in block 404), e.g. by restricting the set of candidate outcomes from which the selection is made.
In an example, groups of three possible outcomes may each relate to a different environment (e.g. three ‘castle’ outcomes, three ‘beach’ outcomes, three ‘snowy’ outcomes) and the candidate outcomes may be restricted to those which correspond to an environment piece used by the user (e.g. castle landscape, beach landscape and/or snowy landscape). In this way, the selection of an outcome (in block 404) may be described as being dependent upon both a sensed input and one or more play pieces being used by the user. For example, if a user has interacted with the ‘castle’ base piece most recently of all base pieces (i.e. all available pieces that correspond to an environment), the candidate set of outcomes from which a selection is made (in block 404) comprises the three ‘castle’ outcomes.
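This restriction of the candidate set may be sketched as follows (a Python sketch; the outcome records and environment tags are hypothetical):

OUTCOMES = [
    {"id": "castle_siege", "environment": "castle"},
    {"id": "castle_feast", "environment": "castle"},
    {"id": "castle_joust", "environment": "castle"},
    {"id": "beach_race",   "environment": "beach"},
    {"id": "snowy_chase",  "environment": "snowy"},
]

def candidate_outcomes(outcomes, most_recent_environment_piece):
    # Restrict candidates to the outcomes corresponding to the environment
    # piece the user interacted with most recently (e.g. the castle base piece).
    return [o for o in outcomes
            if o["environment"] == most_recent_environment_piece]

print([o["id"] for o in candidate_outcomes(OUTCOMES, "castle")])
# ['castle_siege', 'castle_feast', 'castle_joust']

The selection of an outcome (in block 404) would then be made from this reduced candidate set based on the sensed input.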
In various examples, as described above, a user may be able to change the story by both interacting with play pieces which are characters and objects (e.g. moving them around) and assembling an environment from one or more environment (or base) play pieces.
In various examples, a physical play piece may be linked to a real world person or environment. For example, a play piece may be linked to the user or to a real world place. In various examples, the subset of candidate outcomes (from which a selection is made in block 404) may be limited based on a characteristic of the linked person/place. For example, based on the name of the person/place or the current weather at the place. In other examples, in addition to or instead of modifying the subset of candidate outcomes, the story itself (e.g. the segments of the interactive story) may be modified to reflect a characteristic of the linked person/place. For example, the interactive story may be modified to include a character with the same name as the user or a friend/relative of the user and/or the interactive story may be modified such that the weather reflects the current weather at the place (e.g. if it is raining at the user's location, it is raining in the interactive story). In various examples where a physical play piece is linked to a real world person, such as the user themselves, the user's real-world activity (e.g. their activity over a day/week/longer) may influence the interactive story (e.g. be used as a sensed input to the interactive story), e.g. eating healthily, exercising, attending an event or social gathering, clothing/fashion choices, etc.
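Purely as an illustration, such modification may be sketched as follows (a minimal Python sketch; current_weather is a hypothetical stand-in for a query to a real weather service and all other names are illustrative):

def current_weather(place):
    # Hypothetical stand-in for a real weather service query.
    return "raining"

def personalize_section(template, linked_name, linked_place):
    # Modify a story section to reflect characteristics of the linked
    # person/place: a character takes the user's name and the story
    # weather mirrors the real weather at the linked place.
    return template.format(hero=linked_name,
                           weather=current_weather(linked_place))

print(personalize_section("{hero} sets out while it is {weather}.",
                          linked_name="Sam", linked_place="Cambridge"))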
In various examples even where a physical play piece is not linked to a real world place, the subset of candidate outcomes may be limited by a characteristic which mimics the real world. For example, the subset of candidate outcomes may be outcomes for a particular time of day/year (e.g. month, season) and the characteristic may change as the interactive story progresses to mimic the real world progression of time.
In various examples, a physical play piece may be linked to something other than a real world person or environment, such as to a fictional character/place in a television program. In such an example, the subset of candidate outcomes (from which a selection is made in block 404) may be limited based on a characteristic of the linked person/place. For example, based on the name of the fictional person/place or the weather at the fictional place in a recently broadcast episode. In other examples, in addition to or instead of modifying the subset of candidate outcomes, the story itself (e.g. the segments of the interactive story) may be modified to reflect a characteristic of the linked person/place (e.g. based on a recently broadcast episode of the television program). For example, the interactive story may be modified to include a character with the same name as the fictional character or another character in the same TV program and/or the interactive story may be modified such that the weather reflects the weather at the fictional place in a recently broadcast episode (e.g. if it was raining in the last broadcast episode, it is raining in the interactive story) or the weather at the location of the fictional character in a recently broadcast episode.
In the examples described above where a play piece is linked to an external source (e.g. the user, a real world place, a TV show, etc.) the segments used in presenting the interactive story (in block 406) may not be pre-created but instead may be created dynamically (in block 404 or following block 404) based on a pre-defined story section and a characteristic of the external source (e.g. a name, the weather, etc.). In such an example, an outcome may be selected (in block 404) from a set of candidate outcomes, where each candidate outcome corresponds to a pre-defined story section (e.g. an outline of what happens in a segment) and then the segment may be generated dynamically based on the selected section and the characteristic so that it can be presented to the user (in block 406).
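In such an example, the dynamic generation of a segment from a pre-defined story section and a play piece characteristic may be sketched as follows (a minimal Python sketch; the names StorySection and generate_segment are hypothetical):

from dataclasses import dataclass

@dataclass
class StorySection:
    # Pre-defined outline of what happens in a segment (one per possible outcome).
    outline: str

def generate_segment(section, piece_characteristic):
    # Dynamically create the presentable segment from the selected section
    # and a characteristic of a play piece (e.g. the environment which
    # corresponds to a base play piece).
    return section.outline.format(environment=piece_characteristic)

section = StorySection("The two knights meet at the {environment}.")
print(generate_segment(section, piece_characteristic="castle"))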
As described above, the selection of a possible outcome (in block 404) is based on a sensed input (received in block 402), where the sensed input corresponds to a user action with a physical play piece. The user action may, for example, be: picking up or putting down a play piece; moving all or part of a play piece (e.g. a movable limb); placing one play piece proximate to another play piece; or touching a play piece.
In various examples, the sensed input may relate to one or more of: the motion and/or orientation of a play piece; the proximity of one play piece to another play piece; and the position at which a user is touching a play piece.
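Purely by way of example, a sensed input message carrying such data might be structured as follows (a Python sketch; the SensedInput fields are hypothetical and not prescribed by the examples above):

import time
from dataclasses import dataclass

@dataclass
class SensedInput:
    # One sensed input message from a play piece to the interactive story.
    piece_id: str              # which play piece reported the action
    action: str                # e.g. 'picked_up', 'limb_moved', 'put_down'
    orientation: str = ""      # e.g. 'upright' or 'horizontal'
    proximate_piece: str = ""  # ID of a nearby piece, if sensed
    timestamp: float = 0.0

msg = SensedInput(piece_id="first_knight", action="picked_up",
                  orientation="upright", timestamp=time.time())
print(msg)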
As described above, in some examples, a set of play pieces may also comprise one or more passive pieces and user actions with a passive piece may be sensed by the computing device running the interactive story or by a proximate play piece. In examples where the active piece 500 is configured to detect user actions with a proximate play piece, the active play piece may detect a user action with another play piece using the sensor(s) 506 (block 514) and transmit this data to the interactive story (in block 512).
The transmitter 504 in a play piece 500 may be a wireless device and may, for example, use Bluetooth® Low Energy (BLE) or another short-range wireless protocol.
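As an illustration of this arrangement, the sensing-and-transmitting loop of an active play piece 500 might be sketched as follows (a Python sketch; read_accelerometer and transmit are stubs, because real sensor access and BLE transmission are hardware- and platform-specific):

import json
import time

def read_accelerometer():
    # Stub: a real piece would query its accelerometer (sensor(s) 506).
    return (0.0, 0.0, 9.8)

def transmit(payload):
    # Stub for the wireless transmitter 504 (e.g. a BLE notification).
    print("TX", payload)

def run_active_piece(piece_id, poll_hz=10.0):
    # Main loop of an active play piece: sense motion, report user actions.
    last = read_accelerometer()
    while True:
        now = read_accelerometer()
        # Report only when the sensed motion changes noticeably, keeping
        # radio traffic (and battery use) low.
        if max(abs(a - b) for a, b in zip(now, last)) > 1.0:
            transmit(json.dumps({"piece": piece_id, "accel": now}).encode())
            last = now
        time.sleep(1.0 / poll_hz)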
In various examples, the play pieces may themselves be modular and be formed from two or more modules.
The processor 710 within the core play piece 702 is arranged to detect user actions using the sensor(s) 709. In various examples, the processor may also collect the IDs (which may be a unique ID or an ID shared with other identical-looking modules, e.g. an ID for a particular shape or type of module) of each of the modules connected together to form a coherent physical whole play piece. The processor 710 may be a microprocessor, controller or any other suitable type of processor for processing computer executable instructions to control the operation of the core play piece in order to detect user actions on the core module and, in some examples, also on connected peripheral modules (e.g. where a peripheral module does not comprise sensor(s) and/or a processor and wireless module). Core and peripheral modules may be connected together in any way.
In various examples, the processor 710 may also collect the IDs of connected modules. The module IDs may be collected from each of the connected modules directly (e.g. via a bus) or each module may collect information on its neighbors, with the core module aggregating the data provided by its direct neighbor play pieces. In various examples, these module IDs may be collected via the data connection provided by the connectors 712 and in other examples, another means may be used (e.g. NFC, QR codes or computer vision). Where other means are used, the core module 702 may comprise additional hardware/software such as an NFC reader module or a camera or other image sensor to collect the module IDs of all the connected play pieces. In addition to collecting the module IDs of the connected modules (e.g. to generate a set or list of connected modules), the core module may detect the topology of the arrangement of play pieces.
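One way in which module IDs might be aggregated from direct neighbors is sketched below (a Python sketch; the Module class and collect_module_ids are hypothetical, and a real implementation would communicate over the bus provided by the connectors 712 rather than over in-memory references):

from dataclasses import dataclass, field

@dataclass
class Module:
    module_id: str
    neighbours: list = field(default_factory=list)

def collect_module_ids(module, seen=None):
    # Each module reports its own ID and asks its direct neighbours to do
    # the same, so the core module ends up with the full set of connected IDs.
    seen = set() if seen is None else seen
    seen.add(module.module_id)
    for neighbour in module.neighbours:
        if neighbour.module_id not in seen:
            collect_module_ids(neighbour, seen)
    return seen

core = Module("core")
arm = Module("arm")
head = Module("head")
core.neighbours = [arm]
arm.neighbours = [core, head]
head.neighbours = [arm]
print(sorted(collect_module_ids(core)))   # ['arm', 'core', 'head']

Recording which neighbour reported each ID would additionally give the topology of the arrangement, as described above.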
Each peripheral module 704 comprises one or more connectors 712, 714 to physically attach the module to another module to form a coherent physical whole play piece. The peripheral module 704 may also comprise one or more sensors 709 for detecting user actions. The peripheral module 704 may further comprise electrical connections 724 (e.g. in the form of a bus comprising 2 wires, data and ground) between the two connectors 712, 714. In the example shown in FIG. 7, these electrical connections 724 enable data (e.g. module IDs and sensed data) to be passed between modules, e.g. along a chain of connected peripheral modules to the core play piece 702.
Although not shown in
Although not shown in
Examples of sensors 709 that may be used in modules include: temperature sensors, vibration sensors, accelerometers, tilt sensors, gyroscopic sensors, rotation sensors, magnetometers, proximity sensors (active/passive infrared or ultrasonic), sound sensors, light sensors, etc.
It will be appreciated that the modules 702, 704 shown in FIG. 7 may comprise additional elements not shown in the figure and that the arrangement of elements shown is provided by way of example only.
Computing-based device 800 comprises one or more processors 802 which may be microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device in order to perform the methods described herein (e.g. generate an interactive story by selecting paths at pre-defined branching points and present the story to the user). In some examples, for example where a system on a chip architecture is used, the processors 802 may include one or more fixed function blocks (also referred to as accelerators) which implement a part of the method of generating and presenting an interactive story in hardware (rather than software or firmware).
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs) and Complex Programmable Logic Devices (CPLDs).
Platform software comprising an operating system 804 or any other suitable platform software may be provided at the computing-based device to enable application software, such as an interactive software experience comprising an interactive story 106, to be executed on the device. As shown in FIG. 8, the interactive story 106 may comprise an outcome selection engine (operative to select an outcome at each pre-defined branching point based on received sensed input data) and a presentation engine (operative to present the interactive story to the user).
The computer executable instructions may be provided using any computer-readable media that is accessible by computing-based device 800. Computer-readable media may include, for example, computer storage media such as memory 812 and communications media. Computer storage media, such as memory 812, includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Propagated signals may be present in a computer storage media, but propagated signals per se are not examples of computer storage media. Although the computer storage media (memory 812) is shown within the computing-based device 800, it will be appreciated that the storage may be distributed or located remotely and accessed via a network or other communication link (e.g. using communication interface 814).
The communication interface 814 may be arranged to receive data from one or more physical play pieces and may comprise a wireless receiver. In various examples the communication interface 814 receives data from the physical play pieces directly and in other examples, the communication interface 814 may receive data from the play pieces via an intermediary device.
In examples where the computing-based device 800 is integrated within a play piece (e.g. as shown in FIG. 1), the communication interface 814 may be used to receive sensed input data from other play pieces in the set.
The computing-based device 800 may also comprise an input/output controller 816. The input/output controller may be arranged to output presentation information for use in presenting the interactive story to the user (e.g. in block 406) to a presentation device 818 (e.g. a display or speakers) which may be separate from or integral to the computing-based device 800. The input/output controller 816 may also be arranged to receive and process input from one or more devices, such as a sensing module 822 (which may be internal or external to the computing-based device 800) or a user input device 824 (e.g. a mouse, keyboard, camera, microphone or other sensor). The sensing module 822 may, for example, be used to detect user actions with passive pieces (as described above). In some examples the user input device 824 may detect voice input, user gestures or other user actions and may provide a natural user interface (NUI). This user input may be used to further control the interactive story. In an embodiment the presentation device 818 may also act as the user input device 824 if it is a touch sensitive display device. The input/output controller 816 may also output data to devices other than the display device, e.g. a locally connected printing device (not shown in FIG. 8).
Any of the input/output controller 816, presentation device 818, sensing module 822 and the user input device 824 may comprise NUI technology which enables a user to interact with the computing-based device in a natural manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls and the like. Examples of NUI technology that may be provided include but are not limited to those relying on voice and/or speech recognition, touch and/or stylus recognition (touch sensitive displays), gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. Other examples of NUI technology that may be used include intention and goal understanding systems, motion gesture detection systems using depth cameras (such as stereoscopic camera systems, infrared camera systems, RGB camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, 3D displays, head, eye and gaze tracking, immersive augmented reality and virtual reality systems and technologies for sensing brain activity using electric field sensing electrodes (EEG and related methods).
Although the present examples are described and illustrated herein as being implemented in a play system (comprising a set of physical play pieces and an associated interactive story) as shown in FIG. 1, the system described is provided as an example and not a limitation. As those skilled in the art will appreciate, the present examples are suitable for application in a variety of different types of play systems and interactive software experiences.
An aspect provides a method comprising: receiving sensed input data corresponding to a user action with a physical play piece; selecting an outcome at a pre-defined branching point in an interactive story based on the received sensed input data; and presenting the interactive story to the user via a presentation device. In such examples, the interactive story comprises one or more pre-defined branching points and a pre-defined branching point has two or more possible outcomes from which the outcome is selected.
In various examples, presenting the interactive story to the user comprises: presenting an interactive story segment corresponding to the selected outcome to the user.
In various examples the interactive story segment is pre-created. The use of pre-created story segments reduces the processing power required to implement the method (as processing power is not required to generate the sound/images dynamically) and this may make it particularly suited to computing devices which are resource constrained (e.g. handheld computing devices or computing devices which are integrated within a physical play piece).
In other examples, the interactive story segment is generated based on a pre-defined story section corresponding to the selected outcome and a characteristic of a physical play piece. In various examples, the characteristic of a physical play piece comprises a link to an external data source and the method further comprises: accessing the external data source. This combines use of stored data (which reduces computational effort) with the ability to personalize the story for a user based on their personal characteristics (e.g. their friends, family, location, interests, favorite TV shows, etc.).
In examples where the interactive story segment is generated based at least in part on stored data (e.g. an entire stored segment or a stored story outline), a user may not be able to interact with the story except at the pre-defined branching points.
In various examples, the method may further comprise: storing a history of sensed input data or presented interactive story segments; and, in response to a user input, replaying a part of the interactive story to the user. This enables a user to rewind the story and play some or all of it again, with the replayed part being the same as it was the first time that it was played (unlike if decisions were made afresh at each branching point). This is unlike a user's interaction with a computer game, which is transitory and cannot easily be reviewed subsequently.
In various examples, the interactive story comprises a pre-defined start point and one or more pre-defined end points and has a duration which is fixed prior to presenting the interactive story to the user. This provides predictability to the user, which is unlike a typical computer game where the game may last a variable amount of time dependent upon how well the user plays the game.
In various examples, the user action comprises motion of the physical play piece.
Another aspect provides a system comprising an active physical play piece, the active physical play piece comprising: a sensor operative to detect a user interaction with the active physical play piece; and an output arranged to transmit data describing the detected user interaction to an associated interactive software experience, the associated interactive software experience comprising an interactive story and the interactive story comprising one or more pre-defined branching points.
In various examples, the sensor is operative to detect one or more of: a proximate physical play piece, an orientation of the physical play piece and a position where the user is touching the physical play piece.
In various examples, the output is a wireless transmitter and the active physical play piece further comprises a sensor operative to detect a user interaction with a proximate physical play piece, and the wireless transmitter is further arranged to transmit data describing the detected user interaction with the proximate physical play piece to the associated interactive software experience.
In various examples, the active physical play piece has a shape and/or appearance which corresponds to a character, object or environment in the interactive story. This enhances the user experience by making the real world and the virtual world activities correspond more closely (i.e. the user motion of play pieces and the interactive story which is presented are more similar).
In various examples, the active physical play piece further comprises: a presentation device; an outcome selection engine operative to select an outcome at a pre-defined branching point in the interactive story based on a detected user interaction; and a presentation engine operative to present the interactive story to the user via the presentation device. By integrating the computing-based device which presents the interactive story into a physical play piece, a separate computing-based device is not required. This may further enhance the user experience and make the experience more suited to younger users (e.g. children) who may not be able to operate a handheld or desktop computer or games console or who may not have access to such a device.
In various examples, the active physical play piece further comprises: a memory arranged to store a plurality of pre-created interactive story segments, each segment corresponding to a possible outcome at a pre-defined branching point in the interactive story.
A further aspect provides a computing-based device comprising: a processor; a presentation device; a memory comprising device-executable instructions which when executed cause the processor to: select an outcome at a pre-defined branching point in an interactive story based on received sensed input data, the received sensed input data corresponding to a user action with a physical play piece and the interactive story comprising one or more pre-defined branching points and a pre-defined branching point having two or more possible outcomes from which the outcome is selected; and present the interactive story to the user via the presentation device.
In various examples, the memory is further arranged to store a plurality of pre-created interactive story segments, each segment corresponding to a possible outcome at a pre-defined branching point in the interactive story. In such examples, presenting the interactive story to the user comprises: presenting a pre-created interactive story segment to the user, the pre-created interactive story segment corresponding to the selected outcome.
In various examples, the memory is further arranged to store a plurality of pre-defined interactive story sections, each section corresponding to a possible outcome at a pre-defined branching point in the interactive story. In such examples, presenting the interactive story to the user comprises: generating an interactive story segment based on a characteristic of a physical play piece and a pre-defined interactive story section corresponding to the selected outcome; and presenting the interactive story segment to the user.
In various examples, the characteristic of the physical play piece is linked to an external data source. In such examples, presenting the interactive story to the user further comprises accessing the external data source to obtain information which is used in generating the interactive story segment.
In various examples, the computing-based device further comprises a communication interface operative to receive sensed input data from a physical play piece.
In various examples, the computing-based device further comprises a sensing module operative to detect a user action with a physical play piece and generate the sensed input data. This enables the computing-based device to sense user actions with passive play pieces (e.g. play pieces which do not comprise sensors to detect when a user interacts with them).
Another aspect provides a system comprising an active physical play piece, the active physical play piece comprising: a means for detecting a user interaction with the active physical play piece; and a means for communicating data describing the detected user interaction to an associated interactive software experience, the associated interactive software experience comprising an interactive story and the interactive story comprising one or more pre-defined branching points.
A yet further aspect provides a computing-based device comprising: means for selecting an outcome at a pre-defined branching point in an interactive story based on received sensed input data; and means for presenting the interactive story to the user via a presentation device. In such examples, the received sensed input data corresponds to a user action with a physical play piece, the interactive story comprises one or more pre-defined branching points and a pre-defined branching point has two or more possible outcomes from which the outcome is selected.
The term ‘computer’ or ‘computing-based device’ is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms ‘computer’ and ‘computing-based device’ each include PCs, servers, mobile telephones (including smart phones), tablet computers, set-top boxes, media players, games consoles, personal digital assistants and many other devices.
The methods described herein may be performed by software in machine readable form on a tangible storage medium e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. Examples of tangible storage media include computer storage devices comprising computer-readable media such as disks, thumb drives, memory etc. and do not include propagated signals. Propagated signals may be present in a tangible storage media, but propagated signals per se are not examples of tangible storage media. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
This acknowledges that software can be a valuable, separately tradable commodity. It is intended to encompass software, which runs on or controls “dumb” or standard hardware, to carry out the desired functions. It is also intended to encompass software which “describes” or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
Those skilled in the art will realize that storage devices utilized to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that by utilizing conventional techniques known to those skilled in the art that all, or a portion of the software instructions may be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.
Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to ‘an’ item refers to one or more of those items.
The steps of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the spirit and scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
The term ‘comprising’ is used herein to mean including the method blocks or elements identified, but that such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
The term ‘subset’ is used herein to refer to a proper subset such that a subset of a set does not comprise all the elements of the set (i.e. at least one of the elements of the set is missing from the subset).
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this specification.