There are many ways in which a user can interact with a computer game. Typically a user controls the game via a keyboard and mouse, a games controller (which may be handheld or may detect body movement) or a touch screen, depending upon the platform on which the game is being played (e.g. computer, games console or handheld device). A number of games have also been developed in which gameplay is enabled (or unlocked) through the use of physical character toys which are placed on a custom base connected to a games console. By placing different toys on the custom base, different gameplay is enabled.
The embodiments described below are not limited to implementations which solve any or all of the disadvantages of known apparatus for interacting with interactive entertainment experiences, such as computer games.
The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements or delineate the scope of the specification. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
An interactive play set is described which comprises one or more active physical play pieces. In an embodiment, each active physical play piece receives data from one or more other play pieces in the play set and uses this data to generate proximity data. The proximity data describes, to three or more proximity levels, the distances between two or more of the play pieces from which data is received. The proximity data is then transmitted to an interactive entertainment experience (such as a game) which reflects changes in proximity level. For example, the changes may be visible in a graphical user interface or in another play piece, or may be audible to a user.
Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
Like reference numerals are used to designate like parts in the accompanying drawings.
The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
As described in more detail below, the physical play pieces (and hence the play set) provide an intuitive user input means for the associated interactive entertainment experience. This may be particularly useful for younger or less dexterous users for whom traditional input devices (e.g. mouse, keyboard, joystick) may be more difficult to manipulate. This may also provide a more enjoyable or natural experience than interacting solely through a traditional computer input device such as a keyboard, game controller or touchscreen.
In various examples (e.g. as shown in the accompanying drawings), the play set comprises a plurality of physical play pieces, one or more of which are active play pieces that generate proximity data and transmit it to the associated interactive entertainment experience.
By reflecting changes in proximity data within an interactive entertainment experience, the user experience of playing with the play set is enhanced (e.g. through visual or audio effects triggered by the interactive entertainment experience). In various examples, the interactive entertainment experience may enable a multi-user experience, including in some examples where the users are remote from each other (e.g. not in the same room) and where one or more of the users have a play set as described herein.
The associated interactive entertainment experience 106, which may comprise a computer game, virtual world or media entertainment experience (e.g. an interactive movie), runs on a computing device 108. The computing device 108 may be a desktop or laptop computer, a games console, a handheld computing device (e.g. a smart phone, tablet computer or handheld games console), a cloud-based computing device, a dedicated game hardware device (i.e. a device dedicated to running the interactive entertainment experience 106, rather than a console that can run many different games, and which in various examples may be part of the play set, e.g. an active play piece) or any other computing device. In various examples, the computing device 108 may be integrated within one of the active physical play pieces.
In various examples all the play pieces in a play set are active play pieces. In other examples, however, a play set may comprise one or more active play pieces and one or more other play pieces which are not active play pieces. These other play pieces may be semi-active play pieces (which transmit signals to other play pieces but do not generate proximity data) or passive play pieces (which do not transmit any signals). Consequently, in generating proximity data for two or more play pieces in the set (using signals received by the receiver), an active play piece generates proximity data describing the distance between itself and an active or semi-active play piece, or between two other play pieces each of which is either active or semi-active. In various examples, an active play piece may also be arranged to sense the position of a passive play piece, in which case it may also be able to generate proximity data describing the distance between an active or semi-active play piece and the passive play piece.
In various examples, an active play piece may also be the dedicated game hardware device which runs the interactive entertainment experience 106. In such an example, the transmitter 206 may be omitted from that particular active play piece.
In various examples, an active play piece 200 may also be arranged to receive signals (e.g. commands) from the interactive entertainment experience 106 (block 229). These signals may be received via a receiver 202 which is also arranged to receive signals from other play pieces, or a separate receiver may be provided (not shown). In response to receiving such a command, the active play piece may, for example, activate a feedback device which is apparent (e.g. visible or audible) to the user.
The distance module 204 is arranged to generate proximity data describing distances between two or more play pieces sending signals (i.e. active or semi-active play pieces) based on signals received via the receiver 202 (block 226). There are many different ways in which distances may be determined and various examples are described in more detail below. The distance module 204 may generate proximity data to at least three discrete, non-overlapping levels of proximity (which may also be referred to as ‘distance states’), where these levels (or states) may be: a first level indicating a lack of proximity between play pieces, a second level indicating distant proximity between play pieces and a third level indicating close proximity between play pieces.
In various examples, the distance module 204 may generate proximity data to more than three discrete levels of proximity and in an example a further (e.g. fourth) proximity level may be a level indicating that two play pieces are connected together and/or a level indicating that one play piece is inside another play piece.
It will be appreciated that in generating proximity data to three or more discrete, non-overlapping levels of proximity, the distance module 204 may determine a distance between the two play pieces based on the signals received (block 232, e.g. distance=32 cm) and then determine which proximity level the distance falls within (block 234, e.g. 32 cm corresponds to the level ‘lack of proximity’), e.g. where the proximity levels are defined using distance thresholds. Alternatively, the distance module 204 may determine a proximity level without first determining a distance between the play pieces: instead it may determine a signal parameter or characteristic (block 236, e.g. −45 dBm) and then quantize that signal parameter (block 238, e.g. −45 dBm corresponds to the level ‘distant proximity’), e.g. where the proximity levels are defined using signal characteristic thresholds.
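As a purely illustrative sketch of the two approaches (blocks 232/234 and blocks 236/238), the Python below quantizes either a determined distance or a raw signal strength reading into the three proximity levels; the threshold values are assumptions chosen for illustration and are not specified in this description:

```python
# Illustrative quantization of proximity data into three discrete,
# non-overlapping levels. All threshold values are assumptions.

CLOSE_RSSI_DBM = -40     # readings at or above this => 'close proximity'
DISTANT_RSSI_DBM = -60   # readings at or above this => 'distant proximity'

def level_from_rssi(rssi_dbm: float) -> str:
    """Blocks 236/238: quantize a signal characteristic (RSSI)."""
    if rssi_dbm >= CLOSE_RSSI_DBM:
        return "close proximity"
    if rssi_dbm >= DISTANT_RSSI_DBM:
        return "distant proximity"
    return "lack of proximity"

CLOSE_CM = 5     # distances at or below this => 'close proximity'
DISTANT_CM = 20  # distances at or below this => 'distant proximity'

def level_from_distance(distance_cm: float) -> str:
    """Blocks 232/234: quantize a determined distance."""
    if distance_cm <= CLOSE_CM:
        return "close proximity"
    if distance_cm <= DISTANT_CM:
        return "distant proximity"
    return "lack of proximity"

# The worked examples from the text hold under these assumed thresholds:
assert level_from_rssi(-45) == "distant proximity"
assert level_from_distance(32) == "lack of proximity"
```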
In various examples, the proximity data may comprise the actual determined distance between play pieces (from block 232) instead of using a predefined finite set of quantized proximity levels (i.e. block 234 is omitted).
In various examples, the proximity data (generated in block 226) may comprise additional information as well as a proximity level or actual distance. The additional information may, for example, comprise a relative location of the two pieces (e.g. in terms of angles, bearings, etc.) and/or information about the surroundings of the active play piece (e.g. light sources or proximate obstacles such as walls). In order to detect information about the surroundings, an active play piece 200 may further comprise one or more sensors 210 (e.g. a camera, light sensor, microphone, etc.).
The transmitter 206 is arranged to transmit proximity data (e.g. the proximity levels or actual distances) to the interactive entertainment experience 106 (block 228) either directly or via one or more intermediaries (which may include another active play piece). In various examples, the transmitter 206 may also be arranged to transmit signals used for distance determination by other active play pieces (block 222) and in some examples, the signals which comprise proximity data (transmitted in block 228) may also be used for distance determination by other active play pieces (i.e. blocks 222 and 228 may be combined). In various examples the signals used for distance determination may use a different technology or protocol to the signals used to transmit proximity data and in such cases the active play piece may comprise two transmitters 206: one for transmitting signals used for distance determination by other active play pieces (in block 222) and one for transmitting proximity data to other active pieces or to the interactive entertainment experience (in block 228). In various examples, audio or ultrasound “pings” and responses may be used to determine distances between active play pieces; for example, an active device may use an array microphone to sense the angle of arrival of sounds generated by another play piece.
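As one hedged illustration of how such pings might yield a distance, the sketch below estimates separation from the one-way time of flight of a sound; it assumes the two pieces share a synchronized clock and that the send time is known, neither of which is mandated by this description:

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 °C

def distance_from_ping(t_sent_s: float, t_received_s: float) -> float:
    """Estimate the separation of two play pieces from the one-way
    flight time of an audio/ultrasound ping (assumes synced clocks)."""
    return SPEED_OF_SOUND_M_PER_S * (t_received_s - t_sent_s)

# A ping arriving ~1 ms after it was sent implies ~0.34 m of separation.
print(distance_from_ping(0.0, 0.001))  # 0.343
```

A round-trip ping-and-response scheme would avoid the clock synchronization requirement, with the originating piece halving the measured round-trip time instead.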
There are many different technologies which may be used to transmit signals between play pieces and from an active play piece to the interactive entertainment experience. In various examples, Bluetooth® Low Energy (BLE) or another short range wireless protocol may be used by an active play piece 200 to transmit proximity data to other active play pieces and/or to the interactive entertainment experience 106 (in block 228). Consequently, transmitter 206 (or one of the transmitters 206 where there is more than one) may be a wireless transmitter and in particular may be a BLE transmitter. Where an active play piece 200 also comprises a receiver 202 which is arranged to receive proximity data from another active play piece, this receiver 202 (which may be the sole receiver 202 or one of the receivers 202 in the active play piece) may be a wireless receiver and in particular may be a BLE receiver.
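The description does not define a wire format for the proximity data, but as a minimal sketch, a compact payload such as the hypothetical one below could be carried over BLE or another short range protocol; the field layout and ID width are assumptions:

```python
import struct

# Hypothetical payload: two 16-bit play piece IDs plus a 1-byte
# proximity level, little-endian. This layout is an assumption.
LEVELS = {"lack of proximity": 0, "distant proximity": 1, "close proximity": 2}
NAMES = {v: k for k, v in LEVELS.items()}

def encode_proximity(piece_a: int, piece_b: int, level: str) -> bytes:
    return struct.pack("<HHB", piece_a, piece_b, LEVELS[level])

def decode_proximity(payload: bytes):
    a, b, lvl = struct.unpack("<HHB", payload)
    return a, b, NAMES[lvl]

payload = encode_proximity(1, 4, "close proximity")  # e.g. pieces A and D
assert decode_proximity(payload) == (1, 4, "close proximity")
```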
The same technology (and hence the same transmitter and/or receiver) may be used to transmit/receive signals which are used in distance determination; in various examples RF field strength may be used, which may be capable of centimeter levels of accuracy. Other technologies which may be used to transmit/receive signals for distance determination include, but are not limited to: infra-red signals, audible or ultrasonic audio signals and magnetic fields.
The interactive entertainment experience 106 receives proximity data from an active play piece (block 402). As described above, the proximity data describes distances between two or more play pieces sending signals and may be defined in terms of three or more discrete (i.e. non-overlapping) proximity levels (e.g. lack of proximity, distant proximity and close proximity). The examples which follow refer to an example arrangement of four pieces, denoted A, B, C and D.
The interactive entertainment experience 106 modifies the user experience provided to reflect the proximity data received (block 404). This modification (in block 404) may be based on the absolute proximity data received (e.g. pieces A and D are in close proximity) and/or based on changes in proximity data (e.g. changes in proximity levels, such as when piece D is moved closer to or further from another piece).
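One way the experience could react to changes (as opposed to absolute values) is to remember the last reported level for each pair of pieces and act only when it differs; the sketch below is an illustrative assumption, not an implementation prescribed by this description:

```python
# Sketch of change detection for block 404: track the last known
# proximity level per (unordered) pair of pieces.
class ProximityTracker:
    def __init__(self):
        self._levels = {}

    def update(self, piece_a: str, piece_b: str, level: str):
        """Return (old_level, new_level) on a change, else None."""
        key = frozenset((piece_a, piece_b))
        old = self._levels.get(key)
        self._levels[key] = level
        return (old, level) if old != level else None

tracker = ProximityTracker()
tracker.update("A", "D", "distant proximity")
change = tracker.update("A", "D", "close proximity")
if change:  # e.g. trigger a sound effect or GUI update here
    print(f"pieces A and D: {change[0]} -> {change[1]}")
```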
In various examples, the user's progress within a game or online world may be updated based on the proximity data (block 441). For example, a user's progress may be updated when particular combinations of pieces are in close proximity to each other (e.g. pieces A, B and C).
In various examples, the proximity data may be reflected within the graphical user interface (GUI) of the interactive entertainment experience (block 442). In an example, the interactive entertainment experience may display within the GUI a map or plan of the arrangement of play pieces and their relative separation.
In various examples, changes in proximity data (received in block 402) may result in changes to sound effects within the interactive entertainment experience (block 443). For example, where play piece A is associated with a sound effect within the interactive entertainment experience, that sound effect may get louder as play piece A moves into closer proximity with another play piece and quieter as the pieces move apart.
In various examples, changes in proximity data (received in block 402) may result in commands being sent from one play piece to another (block 444), e.g. to cause an effect within the receiving play piece which is apparent (e.g. visible/audible) to a user. For example, if play piece A is a command center and play piece B is a vehicle, when play piece B approaches play piece A, the interactive entertainment experience may send a command to play piece A (block 444, received by the play piece in block 229 as described above), e.g. a command which causes play piece A to produce an effect visible or audible to the user, such as flashing lights or a sound announcing the vehicle's arrival.
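A rule for this command-center example might be expressed as below; the piece types, level name and command string are illustrative assumptions:

```python
def on_proximity_change(type_a: str, type_b: str, new_level: str, send_command):
    """Block 444 sketch: when a vehicle reaches a command center,
    command the command center to produce a visible/audible effect."""
    if {type_a, type_b} == {"vehicle", "command center"} \
            and new_level == "close proximity":
        send_command(target="command center", command="flash_lights")

# Stub transport standing in for the transmission to the play piece,
# which the piece would receive in block 229.
on_proximity_change("vehicle", "command center", "close proximity",
                    lambda target, command: print(f"send {command!r} to {target}"))
```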
In various examples, changes in proximity data may cause a change in the ambient experience provided by the interactive entertainment experience. The ambient experience may comprise sound effects (which may include inaudible low frequency rumbles generated by a bass speaker) and/or visual effects within the GUI (e.g. lighting effects; in one example, if the whole screen has greens and browns fading in and out, the ambient experience induced may correspond to a jungle). For example, where the user is associated with play piece A (a figurine) and play piece B represents a beach location, if play piece A approaches play piece B the interactive entertainment experience may provide a beach ambience, e.g. the sound of waves which gets louder as the proximity level between the two pieces increases.
In various examples, the interactive entertainment experience may be a single user experience and/or it may be associated with a single play set (where one or more users may play with the play set at the same time). In various examples, however, it may be a multi-user experience. There are different ways in which a multi-user experience may be implemented. In various examples, users may play with different play sets and multiple play sets may be represented within the GUI to all the users. The multiple users (and hence play sets) may be co-located or they may be geographically distributed and their game play may only be connected via the interactive entertainment experience.
In other examples of a multi-user experience, a remote user, who may not have a play set, may be able to interact with a virtual representation of another user's play set. In such an example, a first user (in a first location) may create a particular arrangement of play pieces in the real world using the physical play pieces in a play set and this arrangement may be represented within the interactive entertainment experience (e.g. within a game or virtual world). Then a second user (in a second location which may be remote from the first location) may interact with that arrangement in the virtual world. In an example, the first user may set out an arrangement of toy soldiers (with each toy soldier being a play piece) or build a fortified castle (with parts of the castle being play pieces) and the second user may attack the virtual representation of the toy soldiers or castle. The results of the attack may be displayed to the first user within the interactive entertainment experience.
As described above, in various examples the interactive entertainment experience 106 is a computer game. As well as responding to the user's arrangement of play pieces (by reflecting the proximity of the play pieces within the interactive entertainment experience in block 404) in a style of game play which may be described as non-directed (because the game does not force or suggest any particular interaction with the play pieces), the game may also provide directed game play, as described below.
In the directed game play, the game presents goals or objectives to the user (who may also be referred to as a player) within the game (block 604), where those goals/objectives require the player to interact with the physical play pieces in order to progress further within the game 106, i.e. the user cannot achieve the goal/objective without interacting with the physical pieces and changing the proximity of two or more pieces. For example, a user may need to bring a particular combination of play pieces into close proximity (e.g. pieces A, B and D) or move the pieces such that a piece is not proximate (e.g. pieces A and B in close proximity and a lack of proximity between pieces A and C and pieces B and C), etc. In order to determine whether the objective has been met, the game receives (updated) proximity data from one or more physical pieces (block 402).
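An objective of this kind might be represented as a set of required proximity levels over pairs of pieces and checked against the latest proximity data; the representation below is an assumption made for illustration:

```python
# Sketch of an objective check for directed game play. Both maps use
# frozenset pairs of piece names -> proximity level.
def objective_met(required: dict, observed: dict) -> bool:
    return all(observed.get(pair) == level for pair, level in required.items())

# Objective from the text: A and B in close proximity, with a lack of
# proximity between A and C and between B and C.
required = {
    frozenset({"A", "B"}): "close proximity",
    frozenset({"A", "C"}): "lack of proximity",
    frozenset({"B", "C"}): "lack of proximity",
}
observed = dict(required)  # pretend the latest proximity data matches
print(objective_met(required, observed))  # True -> modify game play (block 606)
```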
The game then modifies the game play (block 606) dependent upon whether the objective (set in block 604) has been met or not. By meeting the objective, the user may be able to progress to a new level, achieve a higher score, win a contest, unlock additional features (e.g. hidden features, mini-games, new levels, etc.) within the game, get an “achievement” awarded to them, assist other players in cooperative multiplayer scenarios, play against other players in competitive multiplayer scenarios, etc.
The progression which is achieved through the interaction with physical play pieces, and hence by achieving the objective set, may be linear progression (e.g. progression to the next level) or may be non-linear progression which results in an enhancement to the game play. For example, the interaction may unlock some optional content e.g. a new avatar for the virtual vehicle which is not required to complete the main storyline of the game.
The directed game play may be explicit, in that the goals/objectives and the corresponding need to move the physical play pieces are clearly communicated to the user (e.g. through messages within the GUI). Alternatively, the goals/objectives and/or the need to interact with the physical play pieces may be implicit, in that the goals/objectives or required arrangement of physical play pieces are known to the game but are not communicated to the user and must be discovered by the user. The use of implicit directed game play adds further challenges to the user and enhances the user experience.
The objectives which are presented to the user (in block 604) may be pre-defined and stored within the game software. Alternatively, they may be generated dynamically (block 602). In various examples, they may be generated based at least in part on the information received from the physical play pieces (in block 402), e.g. they may be dependent on the current arrangement of physical play pieces. In various examples, the objective which is set may be generated based on the user's history (e.g. past performance) within the game or based on any other characteristics of the user or information about the user. Data detailing the user's history may, for example, be stored by the game itself or alternatively may be stored on a remote server and accessed by the game. Tailoring the objectives to a specific user enhances the overall user experience within the game. In examples where the objectives are dynamically generated (in block 602), this may comprise one or more of: choosing an objective or goal from a pre-existing list of possible objectives/goals (e.g. based on a characteristic of the user or another factor described above), creating an objective/goal based on random factors, and using existing gameplay to date to influence the choice/creation of the objective/goal.
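A minimal sketch of such dynamic generation (block 602), combining a pre-existing list with the user's history; the objective records and history format are illustrative assumptions:

```python
import random

OBJECTIVES = [
    {"name": "bring pieces A, B and D into close proximity", "min_level": 1},
    {"name": "separate piece C from pieces A and B", "min_level": 2},
    {"name": "connect two pieces together", "min_level": 3},
]

def generate_objective(user_level: int, completed: set) -> dict:
    """Prefer objectives the user has not yet completed and is
    experienced enough for; fall back to any objective otherwise."""
    candidates = [o for o in OBJECTIVES
                  if o["min_level"] <= user_level and o["name"] not in completed]
    return random.choice(candidates or OBJECTIVES)

print(generate_objective(user_level=2, completed={"connect two pieces together"}))
```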
The accompanying drawings also show further examples: a second example play set 702 and a third example play set 703.
In a further example, any of the play pieces described herein (e.g. any of the play pieces in the examples above) may be modular, being formed from a core play piece 802 and one or more peripheral play pieces 804 which can be connected together to form a single object play piece.
Where the object play piece is operative as an active play piece, the processor 810 within the core play piece 802 may be arranged to generate the proximity data (e.g. as in block 226 described above).
In various examples each peripheral play piece 804 may comprise a storage element 816 which stores an identifier (ID) for the peripheral play piece (which may be referred to as the play piece ID) and may comprise additional data. The storage element 816 may comprise memory or any other form of storage device.
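As an illustration of how a core play piece might read the stored IDs of attached peripheral play pieces, the sketch below models the storage element as a simple dictionary; the connection mechanism and field names are assumptions:

```python
# Hypothetical model of a modular object play piece: the core piece
# enumerates the play piece ID (and any additional data) held in each
# attached peripheral's storage element 816.
class PeripheralPiece:
    def __init__(self, piece_id, extra=None):
        self.storage = {"id": piece_id, **(extra or {})}

class CorePiece:
    def __init__(self):
        self.attached = []

    def enumerate_ids(self):
        return [p.storage["id"] for p in self.attached]

core = CorePiece()
core.attached.append(PeripheralPiece("wheel-01"))
core.attached.append(PeripheralPiece("wing-07", {"color": "red"}))
print(core.enumerate_ids())  # ['wheel-01', 'wing-07']
```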
It will be appreciated that the play pieces 802, 804 described above may comprise additional elements not explicitly described herein.
In various examples, a play piece (which may be a peripheral play piece 804 or a core play piece 802) may comprise one or more sensors, actuators and/or displays that are controlled by and/or provide data to the processor 810 within the core play piece 802. Examples of sensors that may be used include: temperature sensors, vibration sensors, accelerometers, tilt sensors, gyroscopic sensors, rotation sensors, magnetometers, proximity sensors (active/passive infrared or ultrasonic), sound sensors, light sensors, etc. Examples of actuators that may be used include: electromagnets, motors, servos, vibration units, solenoids, speakers, etc. Examples of displays that may be used include one or more LEDs, a small LCD display, an e-ink display, etc. Where a play piece comprises a sensor, the sensor data may be communicated by the core play piece 802 to the game 106.
The play pieces and associated interactive entertainment experience described above may enable many different types of play.
Computing-based device 1000 comprises one or more processors 112 which may be microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device in order to perform the methods described herein (e.g. to receive proximity data and reflect it within an interactive entertainment experience). In some examples, for example where a system on a chip architecture is used, the processors 112 may include one or more fixed function blocks (also referred to as accelerators) which implement a part of the methods described herein in hardware (rather than software or firmware).
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs) and Complex Programmable Logic Devices (CPLDs).
Platform software comprising an operating system 1004 or any other suitable platform software may be provided at the computing-based device to enable application software, such as an interactive entertainment experience 106, to be executed on the device.
The computer executable instructions may be provided using any computer-readable media that is accessible by computing-based device 1000. Computer-readable media may include, for example, computer storage media such as memory 110 and communications media. Computer storage media, such as memory 110, includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Propagated signals may be present in computer storage media, but propagated signals per se are not examples of computer storage media. Although the computer storage media (memory 110) is shown within the computing-based device 1000 it will be appreciated that the storage may be distributed or located remotely and accessed via a network or other communication link (e.g. using communication interface 113).
The communication interface 113 may be arranged to receive data from one or more physical play pieces and may comprise a wireless transmitter and/or wireless receiver. In various examples the communication interface 113 receives data from the physical play pieces directly and in other examples, the communication interface 113 may receive data from the play pieces via an intermediary device (e.g. where the computing-based device 1000 is a cloud-based device, the data may be received via another computing-based device which is within range of the transmitters in the physical play pieces, and the two computing devices may be connected via a network such as the internet). In examples where the play pieces comprise a feedback mechanism (e.g. LED, speaker, etc.) the communication interface 113 may also be arranged to transmit data (e.g. commands) to one or more physical play pieces (again these may be transmitted directly to the play pieces or via an intermediary device).
The computing-based device 1000 may also comprise an input/output controller 1016. The input/output controller may be arranged to output display information (e.g. the GUI) to a display device 1018 which may be separate from or integral to the computing-based device 1000. The input/output controller 1016 may also be arranged to receive and process input from one or more devices, such as a user input device 1024 (e.g. a mouse, keyboard, camera, microphone or other sensor). In some examples the user input device 1024 may detect voice input, user gestures or other user actions and may provide a natural user interface (NUI). This user input may be used to further control game play. In an embodiment the display device 1018 may also act as the user input device 1024 if it is a touch sensitive display device. The input/output controller 1016 may also output data to devices other than the display device, e.g. a locally connected printing device (not shown).
Any of the input/output controller 1016, display device 1018 and the user input device 1024 may comprise NUI technology which enables a user to interact with the computing-based device in a natural manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls and the like. Examples of NUI technology that may be provided include but are not limited to those relying on voice and/or speech recognition, touch and/or stylus recognition (touch sensitive displays), gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. Other examples of NUI technology that may be used include intention and goal understanding systems, motion gesture detection systems using depth cameras (such as stereoscopic camera systems, infrared camera systems, RGB camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, 3D displays, head, eye and gaze tracking, immersive augmented reality and virtual reality systems and technologies for sensing brain activity using electric field sensing electrodes (EEG and related methods).
Although the present examples are described and illustrated herein as being implemented in a play system (comprising a set of physical play pieces and an associated game), the system described is provided as an example and not a limitation. As those skilled in the art will appreciate, the present examples are suitable for application in a variety of different types of interactive systems.
An aspect provides a system comprising a play set which in turn comprises one or more active physical play pieces. An active physical play piece comprises: a receiver operative to receive signals from one or more other physical play pieces; a distance module operative to generate proximity data using signals received by the receiver; and a transmitter operative to transmit proximity data to an interactive entertainment experience associated with the play set. The proximity data describes, to three or more discrete levels of proximity, distances between two or more physical play pieces that are transmitting signals, and changes in proximity levels between play pieces are reflected in the interactive entertainment experience.
In various examples, the play set comprises two or more active physical play pieces. The play set may further comprise one or more semi-active play pieces, a semi-active play piece comprising: a transmitter operative to transmit signals to an active play piece. As described above, a semi-active piece does not comprise a distance module.
In various examples, the three or more discrete levels of proximity comprise a first level indicating a lack of proximity between play pieces, a second level indicating distant proximity between play pieces and a third level indicating close proximity between play pieces. The three or more discrete levels of proximity may further comprise one or more of: a proximity level indicating that two play pieces are connected together; and a proximity level indicating that one play piece is inside another play piece. In other examples, however, the proximity data comprises actual distances between play pieces rather than discrete levels.
In various examples, an active physical play piece further comprises a feedback device, wherein the feedback device is activated in response to a command received by the active physical play piece from the associated interactive entertainment experience.
In various examples, the receiver is a wireless receiver and the transmitter is a wireless transmitter.
In various examples, the active physical play piece represents an entity in the interactive entertainment experience and in some examples, the active physical play piece may have a shape and appearance which corresponds to the entity in the interactive entertainment experience. In various examples, some of the play pieces in a play set may be capable of connecting to each other and in other examples each of the play pieces in a play set may be a separate piece which does not comprise any mechanism for connecting it to another piece.
In various examples, some of the play pieces in a play set may comprise base boards and represent environments and others of the play pieces in the play set may represent objects or characters.
In various examples, the system further comprises a computing device arranged to run the interactive entertainment experience. In various examples, the interactive entertainment experience comprises device-executable instructions which when executed cause the computing device to: receive proximity data from a physical play piece; and reflect the proximity data in the interactive entertainment experience. Reflecting the proximity data in the interactive entertainment experience may comprise one or more of: updating a user's progress based on the proximity data; reflecting the proximity data in a graphical user interface; playing sound effects based on the proximity data; and sending a command to another physical play piece based on the proximity data.
In various examples, the interactive entertainment experience further comprises device-executable instructions which when executed cause the computing device to: present an objective (which may be dynamically generated) to the user within the interactive entertainment experience; determine whether the objective is met based on proximity data received; and modify the interactive entertainment experience in response to determining that the objective has been met.
A further aspect provides a method comprising: receiving signals at an active physical play piece from one or more other physical play pieces; generating proximity data in the active physical play piece based on the signals received, the proximity data describing, to three or more discrete levels of proximity, distances between two or more physical play pieces that are transmitting signals; and transmitting the proximity data from the active physical play piece to an interactive entertainment experience running on a computing device.
In various examples, the method further comprises transmitting signals from the active play piece to one or more other physical play pieces. These signals may comprise the proximity data.
In various examples, the method further comprises: receiving a signal from the interactive entertainment experience at an active physical play piece; and in response to receiving the signal, controlling a feedback mechanism within the active physical play piece.
A yet further aspect provides a method comprising: receiving proximity data from an active physical play piece at an interactive entertainment experience, the proximity data describing, to three or more discrete levels of proximity, distances between two or more physical play pieces that are transmitting signals; and reflecting the proximity data in the interactive entertainment experience. Reflecting the proximity data in the interactive entertainment experience may comprise one or more of: updating a user's progress based on the proximity data; reflecting the proximity data in a graphical user interface; playing sound effects based on the proximity data; and sending a command to another physical play piece based on the proximity data.
In various examples, the interactive entertainment experience is a game and the method further comprises: presenting an objective (which may be dynamically generated) to a user within the game; and modifying game play dependent upon whether the user meets the objective.
Another aspect provides a system comprising a play set which in turn comprises one or more active physical play pieces. An active physical play piece comprises: a means for receiving signals from one or more other physical play pieces; a means for generating proximity data using signals received by the means for receiving; and a means for transmitting proximity data to an interactive entertainment experience associated with the play set. The proximity data describes, to three or more discrete levels of proximity, distances between two or more physical play pieces that are transmitting signals, and changes in proximity levels between play pieces are reflected in the interactive entertainment experience.
In various examples, the means for transmitting proximity data may use one of: infra-red, radio-frequency signals, ultra-sonic signals, magnetic fields, etc.
The term ‘computer’ or ‘computing-based device’ is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms ‘computer’ and ‘computing-based device’ each include PCs, servers, mobile telephones (including smart phones), tablet computers, set-top boxes, media players, games consoles, personal digital assistants and many other devices.
The methods described herein may be performed by software in machine readable form on a tangible storage medium, e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. Examples of tangible storage media include computer storage devices comprising computer-readable media such as disks, thumb drives, memory etc. and do not include propagated signals. Propagated signals may be present in tangible storage media, but propagated signals per se are not examples of tangible storage media. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
This acknowledges that software can be a valuable, separately tradable commodity. It is intended to encompass software, which runs on or controls “dumb” or standard hardware, to carry out the desired functions. It is also intended to encompass software which “describes” or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
Those skilled in the art will realize that storage devices utilized to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that by utilizing conventional techniques known to those skilled in the art that all, or a portion of the software instructions may be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.
Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to ‘an’ item refers to one or more of those items.
The steps of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the spirit and scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
The term ‘comprising’ is used herein to mean including the method blocks or elements identified, but that such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
The term ‘subset’ is used herein to refer to a proper subset such that a subset of a set does not comprise all the elements of the set (i.e. at least one of the elements of the set is missing from the subset).
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this specification.