There are many ways in which a user can interact with a computer game. Typically a user controls the game via a keyboard and mouse, a games controller (which may be handheld or may detect body movement) or a touch screen, dependent upon the platform on which the game is being played (e.g. computer, games console or handheld device). A number of games have also been developed in which gameplay is enabled (or unlocked) through the use of physical character toys which are placed on a custom base connected to a games console. By placing different toys on the custom base, different gameplay is enabled.
The embodiments described below are not limited to implementations which solve any or all of the disadvantages of known apparatus for interacting with computer games.
The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements or delineate the scope of the specification. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
Track based play systems are described which comprise a set of physical play pieces and an associated computer game. In an embodiment, a user may arrange some or all of the play pieces in the set to form a path. The computer game is arranged to infer a virtual model of the path defined by the user-created arrangement of the play pieces. The inference may be based on data communicated by one or more of the play pieces to the game or based on data from a local sensing device, such as a camera, which views the relative positions of the play pieces. Having inferred the path, the game constrains a virtual or physical object to the path within the game play and renders a graphical user interface showing at least a portion of the path.
Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
Like reference numerals are used to designate like parts in the accompanying drawings.
The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
As described in more detail below, the individual physical play pieces which form the set may all be the same shape or may have different shapes (as shown in
In the first example play system 101 shown in
In the second example play system 102 shown in
In a further example play system, the set of physical play pieces may comprise one or more active pieces and one or more passive pieces. In such an example, the active play pieces may detect their own arrangement (or configuration, e.g. orientation, which other play pieces they are connected or adjacent to, etc.) and also the arrangement of any proximate passive pieces and then communicate with other pieces and/or the game 106 to assist the game 106 to infer a shape of a path defined by the arrangement of all of the pieces.
The operation of the game 106, which may be referred to as an electronic game, a video game or computer game (because it runs on a computing device 108, 114 and comprises device-executable instructions even though that computing device may take any form), can be described with reference to
The virtual model of the path can be inferred (in block 206) from the physical pieces in a number of ways depending on whether the pieces are active or passive, how they are sensed, and whether the pieces themselves constitute the path or whether the pieces constitute boundaries or waypoints that define the path implicitly (as described in more detail below with reference to
Having inferred a virtual model of a path (in block 206), the game constrains the position of a virtual and/or physical object to that path within the game play (blocks 208-210). In various examples, the configuration of at least a portion of the path (e.g. at least the part of the path where the object is currently located) and/or the object which is constrained to the path are reflected within a graphical user interface (GUI) of the game (block 212). In various examples, this comprises displaying a portion of the path, the object and/or a view from the path within the GUI. In addition, or instead, the reflection of the path within the GUI may comprise enabling user inputs to affect movement of an object along the path as part of the game play. This may be directly (e.g. by linking a user input to a particular motion along the path within the gameplay such as “left button=go left along path”) or indirectly (where the object's actual motion is controlled by the game engine but the user inputs higher-level commands such as “stay put”, “follow another object”, “search for another object”, etc.).
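The two control styles described above (direct and indirect) can be sketched in software. The following Python fragment is a hypothetical illustration, not the game's actual implementation: the path is modelled as an ordered list of waypoints, `move_direct` maps a user input straight onto motion along the path, and `move_towards` stands in for a higher-level command executed by the game engine. All names are invented.

```python
# Hypothetical sketch (not the game's actual implementation): a path modelled
# as an ordered list of waypoints, with direct and indirect control styles.

class PathFollower:
    """Constrains an object to a path given as an ordered list of waypoints."""

    def __init__(self, path):
        self.path = path    # e.g. [(0, 0), (1, 0), (2, 0)]
        self.index = 0      # index of the waypoint the object currently occupies

    def move_direct(self, command):
        # Direct control: a user input maps straight onto motion along the path
        # (e.g. "left button = go left along path").
        if command == "forward" and self.index < len(self.path) - 1:
            self.index += 1
        elif command == "back" and self.index > 0:
            self.index -= 1

    def move_towards(self, target_index):
        # Indirect control: the game engine advances the object one waypoint
        # per tick towards a higher-level goal (e.g. "follow another object").
        if target_index > self.index:
            self.index += 1
        elif target_index < self.index:
            self.index -= 1

    @property
    def position(self):
        return self.path[self.index]
```

In a real system the waypoint list would be derived from the inferred virtual model of the path (block 206).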
In examples where a virtual object is constrained to the path (in block 208), the edges of the virtual path may be treated, within the game play, like barriers or walls that the virtual object cannot pass through and various example game scenarios are described below. The motion of the virtual object along the path may be controlled by a physics engine within the game 106 and in various examples, the physics engine may also control the effect on the virtual object of hitting the edge of the virtual path (e.g. the object may bounce back off the wall). In various examples, there may be a feedback mechanism on the physical play pieces themselves (e.g. in the form of one or more LEDs) to indicate the current position (and hence also the motion) of the virtual object on the virtual path relative to the actual arrangement of the physical pieces. Any feedback mechanism may, in addition or instead be used to display the position (or motion) of a virtual object which is controlled by the game (e.g. to show its position relative to a virtual object controlled by the user or relative to a physical object constrained to the path and controlled by the user).
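As a hedged sketch of the edge-as-wall behaviour, the fragment below clamps an object's lateral offset across the path and reverses (and damps) its lateral velocity on contact with an edge. The half-width and damping constants are invented for illustration; a physics engine within the game would be considerably more elaborate.

```python
# Hedged sketch of the edge-as-wall behaviour: the constants PATH_HALF_WIDTH
# and BOUNCE_DAMPING are invented for illustration only.

PATH_HALF_WIDTH = 1.0   # how far the object may drift from the path centre line
BOUNCE_DAMPING = 0.5    # fraction of lateral speed kept after hitting an edge

def step_lateral(offset, velocity, dt=0.1):
    """Advance the object's lateral offset; bounce it back at the path edges."""
    offset += velocity * dt
    if offset > PATH_HALF_WIDTH:        # hit one edge of the path
        offset = PATH_HALF_WIDTH
        velocity = -velocity * BOUNCE_DAMPING
    elif offset < -PATH_HALF_WIDTH:     # hit the opposite edge
        offset = -PATH_HALF_WIDTH
        velocity = -velocity * BOUNCE_DAMPING
    return offset, velocity
```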
In examples where the physical object is constrained to the path (in block 210) this may be a result of the physical shape of the play pieces themselves (e.g. the play pieces may have one or more grooves or indentations in their surface along which the physical object moves), a constraining mechanism within the play pieces that is controlled by the game (e.g. electromagnets within the pieces that attract the object and are switched on and off by the game) and/or virtual in-game effects (e.g. such that if the physical object is removed from the path it disappears from within the virtual world depicted by the game). The physical object may be moved directly by the user (e.g. the user may push the object along the track) or indirectly by the user (e.g. via the game or other remote control device). Alternatively, the physical object may be controlled by the game without user input (e.g. it may be an autonomous object). When the game controls the physical object's movement, the object can be constrained to follow the path by restricting the control sequences to stay on the path. In various examples more subtle constraints may be used, whereby the game UI allows the user to control the movement of the object and leave the path, but if they do so, then in-game penalties occur (e.g. in a racing game, the object moves slower), or modifications to the control system are made (e.g. the steering is altered slightly to tend to move the object back towards the path).
There are many ways in which the GUI may reflect the configuration of at least a portion of the inferred path to the user (in block 212) and in various examples also display the object. In various examples the GUI may display a plan view (e.g. similar to a map) of the inferred path and/or a view of a part of the path as if seen from the position of the object (e.g. where the object is a vehicle, the view in the GUI may be as would be seen by an occupant of the vehicle). This may, for example, enable a user to appear (within the computer game) to be driving a vehicle around a path they have created in their own living room, where the path is defined by active pieces or passive pieces. Where the path is inferred based on passive pieces, the passive pieces may include items of furniture, cushions, etc. in the living room.
As described above, in various examples a user may re-arrange the physical pieces to modify the arrangement during game play. This re-arrangement may result in the method shown in
Although the arrangement of pieces 105 shown in
The one or more sensors 306 detect the arrangement of the play piece 300 (in block 310) and the arrangement which is detected may be an absolute position and/or orientation of the play piece (e.g. “at coordinate (x,y)” and/or “face B upwards”) or a relative position of the pieces (e.g. “next to piece X” or “between pieces A and B”) or a combination of the two (e.g. “next to piece X and face A upwards”). Examples of sensors which may be used include, but are not limited to, accelerometers, magnetometers, infra-red transceivers, color sensors, etc. Two play pieces 300 may also communicate with each other using a wired (e.g. 1-Wire) or proximity-based wireless networking technology (e.g. RFID) to determine their relative orientation. As described above, the one or more sensors 306 may also detect the arrangement of a proximate passive play piece (in block 316), e.g. using RFID, magnetometers, color sensors or other sensing technologies.
In some examples, the active piece 300 may further comprise a feedback mechanism 307 (e.g. one or more LEDs) and a receiver 308. The feedback mechanism 307 may be used to indicate the position of a virtual object on the path defined by the arrangement of physical pieces. Consequently, an active piece 300 may receive a command (e.g. data) from the game 106 via the receiver 308 (block 318) indicating a position of a virtual object and may activate the feedback mechanism 307 to display the position (block 320). The virtual object, the position of which is indicated by the feedback mechanism 307, may be a virtual object which is controlled by the user within the game 106 or an autonomous object controlled by the game 106. In other examples, the feedback mechanism 307 may be arranged to move the physical object along the track (e.g. using motors and/or servos and/or by providing control signals to a control mechanism within the physical object).
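The command-and-feedback exchange in blocks 318-320 might look something like the following sketch, in which an active piece checks that a received command is addressed to it and lights the LED corresponding to the virtual object's position. The command format and the LED representation are assumptions made for illustration, not an actual protocol.

```python
# Illustrative sketch of an active piece handling a position command from the
# game (blocks 318-320). The command format and the LED representation are
# assumptions made for illustration, not an actual protocol.

class ActivePiece:
    def __init__(self, piece_id, num_leds):
        self.piece_id = piece_id
        self.leds = [False] * num_leds   # software stand-in for physical LEDs

    def on_command(self, command):
        """Light the LED matching the virtual object's position on this piece."""
        if command.get("piece_id") != self.piece_id:
            return False                 # command addressed to another piece
        self.leds = [False] * len(self.leds)
        self.leds[command["led_index"]] = True
        return True
```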
The transmitter 304 and/or receiver 308 in a play piece 300 may be a wireless device and may, for example, use Bluetooth® Low Energy (BLE) or another short range wireless protocol (e.g. IEEE 802.15.4 or ANT+). In other examples, the transmitter 304 and/or receiver 308 in a play piece 300 may use a wired connection to the computing device 108. Where a wired connection is used, the connection to the computing device 108 may be provided by only one of the active pieces in the arrangement, which may collect data from all the other active pieces in the arrangement for transmission to the computing device 108. Use of a wired connection may be useful where one or more active pieces have a relatively high power consumption (which would result in a short battery life if battery powered) because the wired connection can also be used to provide power to the play piece and, in various examples, to other play pieces in the arrangement, for example where the physical play pieces move the physical object along the path using motors or servos, or where pieces incorporate displays or vibration motors to show where the virtual object is.
In various examples the active piece 400 may further comprise a feedback mechanism 307 which is controlled by the game 106 and the game may transmit signals to other active pieces (via communication interface 113) to control feedback mechanisms in other active pieces in the arrangement.
The game may also generate a GUI showing the virtual model of the path (or at least a part of the path) and the object constrained to the path and this GUI may be rendered on a separate computing device. In an example, a separate computing device may render the GUI within a web browser and this separate computing device may communicate with the active piece 400 directly via the communication interface 113 or may receive the GUI data via an intermediary (e.g. from a web server which communicates with the active piece 400). For example, the active piece 400 may be connected to a cloud service which performs the graphics processing to render complex game content which can be viewed in a web browser on the separate computing device. This means that the play pieces only have to sense and report low-level data about their relative placement. Alternatively, the play pieces may incorporate one active piece that acts as a web server and serves the GUI to the separate computing device. Access to the web server (by the separate computing device) may be via an 802.11 wireless access point embedded in the active piece or the active piece may connect to a shared network (e.g. a home WiFi™ network, via transmitter 304). Discovery of the active play piece may use techniques such as UPnP or the active piece may have an NFC interface which allows it to broadcast its current location on the home network to nearby devices (e.g. by broadcasting a URL such as http://192.168.0.107/ where the home router has assigned that IP address to the master play piece).
There are many different ways in which the plurality of physical play pieces (whether active or passive) can define the shape of a path (as inferred by the associated game). A first example arrangement of pieces 105 is shown in
In the next two examples 502-503 in
In the final example 504 in
In various examples, a user may be free to place the physical play pieces where they want and also to re-arrange the play pieces. In other examples, however, there may be restrictions which limit the positioning and/or order of placement of the play pieces. These restrictions may be a result of the physical shape of the play pieces (e.g. so that only certain play pieces can connect together or be placed abutting each other) and/or may be a result of rules within the game (e.g. the game may specify that a blue edge to a piece may only be placed abutting a blue edge on another tile).
The sets of physical pieces and associated game may be used for many different types of games and in various examples, the play pieces may be shaped according to the type of game. Various examples include:
As well as responding to the user's arrangement of play pieces (by inferring the path in block 206 and representing it within the game in block 212), and in various examples also to interactions with the physical object and/or play pieces, in a style of game play which may be described as non-directed (because the game does not force or suggest any particular interaction with the play pieces or with any physical object constrained to the path), the game 106 may also provide directed game play, as shown in
In the directed game play, the game 106 presents goals or objectives to the user (who might also be referred to as a player) within the game 106 (block 604) where those goals/objectives require the player to interact with the physical pieces and/or a physical object (where there is one) in order to further progress within the game 106, i.e. the user cannot achieve the goal/objective without interacting with the physical pieces and/or a physical object. For example, a user may need to rearrange the physical play pieces to create a different shape of path (e.g. to reach a target location which may be a real or virtual location), move the physical object along the path in some way (e.g. in the form of a race, to beat other virtual objects which are autonomously controlled by the game), etc. In order to determine whether the objective has been met, the game 106 may receive configuration data from one or more physical pieces (block 202), sense an arrangement (or change in arrangement) of a plurality of physical play pieces (block 204), receive data from a physical object constrained to the path (block 606) or sense a location (or position along the path) of a physical object (block 608).
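An objective check of the kind described above might be sketched as follows. The objective encodings (`reach_location`, `path_shape`) are invented for illustration; real objectives could be considerably richer.

```python
# Hypothetical sketch of an objective check: the encodings "reach_location"
# and "path_shape" are invented for illustration, not taken from any real game.

def objective_met(objective, inferred_path=None, object_position=None):
    """Return True if the sensed game state satisfies the objective."""
    if objective["kind"] == "reach_location":
        # e.g. the object must reach a target location (blocks 606-608)
        return object_position == objective["target"]
    if objective["kind"] == "path_shape":
        # e.g. the re-arranged path must visit every required waypoint (block 206)
        return all(wp in inferred_path for wp in objective["waypoints"])
    return False
```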
In examples where the game 106 receives configuration data from one or more physical pieces (in block 202) or senses an arrangement (or change in arrangement) of a plurality of physical play pieces (in block 204), the game 106 then infers a new (or updated) version of a path from this data (block 206), as described above with reference to
The game 106 then modifies the game play (block 610) dependent upon whether the objective (set in block 604) has been met or not. By meeting the objective, the user may be able to progress to a new level, achieve a higher score, win a contest, unlock additional features (e.g. hidden features, mini-games, new levels, etc.) within the game 106, get an “achievement” awarded to them, assist other players in cooperative multiplayer scenarios, play against other players in competitive multiplayer scenarios, etc.
The progression which is achieved through the interaction with physical play pieces and/or physical object (and hence achieving the objective set) may be linear progression (e.g. progression to the next level) or may be non-linear progression which results in an enhancement to the game play. For example, the interaction may unlock some optional content e.g. a new avatar for the virtual vehicle which is not required to complete the main storyline of the game.
The directed game play may be explicit, in that the goals/objectives and the corresponding need to interact with the physical play pieces and/or physical object are clearly communicated to the user (e.g. through messages within the graphical user interface, GUI). Alternatively, the goals/objectives and/or the need to interact with the physical play pieces and/or physical object may be implicit, in that the goals/objectives or required arrangement of physical play pieces and/or physical object are known to the game but are not communicated to the user and must be discovered by the user. The use of implicit directed game play adds further challenges to the user and enhances the user experience.
The objectives which are presented to the user (in block 604) may be pre-defined and stored within the game software. Alternatively they may be generated dynamically (block 602). In various examples, they may be generated based at least in part on the information received from the physical play pieces (in block 202 or 204 of
In various examples, the objective presented to the user (in block 604) may be time-based (e.g. complete the path within a defined time period), location based (e.g. create and/or traverse a path to a defined location, which may be real or virtual) and/or competitive (e.g. against other users or autonomous objects within the game). In an example, the objective may require the real/virtual object to traverse all parts of the path and stay ahead of an autonomous virtual object. In another example, the objective may require a user to modify the arrangement of physical pieces so that the object which is constrained to the path can reach a particular location in the real world (e.g. to reach the sofa in the living room where the user is playing the game) or the virtual world.
Although
The core play piece 702 comprises a battery 706, a wireless communications module 708, a processor 710 and one or more connectors 712. The battery 706 provides power to components within the core (such as processor 710 and wireless communications module 708) and also to some/all of the peripheral play pieces 704 via the connectors 712. The wireless communications module 708 enables the core play piece 702 to communicate with a computing device running the game 106. Any suitable wireless technology may be used (e.g. Bluetooth®, BLE, WiFi™ or WiFi™ Direct, Near Field Communication (NFC), 802.15.4, etc.). The wireless communications module 708 may communicate directly with the computing device 108 (as shown in
The processor 710 within the core play piece 702 is arranged to collect the IDs (which may be a unique ID or an ID shared with other identical-looking play pieces, e.g. an ID for a particular shape or type of play piece) of each of the play pieces connected together (and hence which form the path that will be inferred by the game 106). The processor 710 may be a microprocessor, controller or any other suitable type of processor for processing computer executable instructions to control the operation of the core play piece in order to collect the IDs of connected play pieces. Core and peripheral play pieces may be connected together in any way. The play piece IDs (which may just identify a piece type, rather than uniquely identifying a play piece) may be collected from each of the connected play pieces directly (e.g. via a bus) or each play piece may collect information on its neighbors with the core play piece aggregating the data provided by its direct neighbor play pieces. In various examples, these play piece IDs may be collected via the data connection provided by the connectors 712 and in other examples, another means may be used (e.g. NFC, QR codes or computer vision). Where other means are used, the core play piece 702 may comprise additional hardware/software such as an NFC reader module or a camera or other image sensor to collect the play piece IDs of all the connected play pieces. In addition to collecting the play piece IDs of the connected play pieces (e.g. to generate a set or list of connected play pieces), the core play piece may detect the topology of the arrangement of play pieces.
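The neighbour-aggregation variant described above can be sketched as a simple tree walk: each piece reports its own ID together with the aggregated reports of its direct neighbours, so the core ends up with the full set of connected pieces. The nested data structure below is a stand-in invented for illustration.

```python
# Sketch of the neighbour-aggregation pattern: each piece reports its own ID
# plus the aggregated reports of its direct neighbours. The nested-dict
# structure is invented for illustration.

def collect_ids(piece):
    """Depth-first aggregation of play-piece IDs from a tree of connections."""
    reports = [piece["id"]]
    for neighbour in piece.get("neighbours", []):
        reports.extend(collect_ids(neighbour))
    return reports
```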
Each peripheral play piece 704 comprises one or more connectors 712, 714 to physically attach the play piece to another play piece to form the path. The peripheral play piece 704 further comprises electrical connections 724 (e.g. in the form of a bus comprising 2 wires, data and ground) between the two connectors 712, 714.
Each peripheral play piece 704 also comprises a storage element 716 which stores an identifier (ID) for the peripheral play piece (which may be referred to as the play piece ID) and which may identify the type (e.g. shape) of the piece or may uniquely identify the play piece. The storage element 716 may comprise additional data, such as the shape and/or appearance of the play piece, locations of any connection points, mechanical compatibility details for connection points (e.g. detailing which other piece types can be connected to), other information used to help sense topology (e.g. color pattern), game play information (e.g. is the virtual/physical object on the piece, so that this can be used in subsequent game play) etc. This additional data may be used by the game 106 when inferring the path formed by an arrangement of physical pieces (e.g. in block 206 of
Although not shown in
It will be appreciated that the play pieces 702, 704 shown in
In various examples, a play piece (which may be a peripheral play piece 704 or a core play piece 702) may comprise one or more sensors, actuators and/or displays that are controlled by and/or provide data to the processor 710 within the core play piece 702. Examples of sensors that may be used include: temperature sensors, vibration sensors, accelerometers, tilt sensors, gyroscopic sensors, rotation sensors, magnetometers, proximity sensors (active/passive infrared or ultrasonic), sound sensors, light sensors, etc. Examples of actuators that may be used include: electromagnets, motors, servos, vibration units, solenoids, speakers, etc. Examples of displays that may be used include one or more LEDs, a small LCD display, an e-ink display, etc. Where a play piece comprises a sensor, the sensor data may be communicated by the core play piece 702 to the game 106.
The topology determination (in block 806) may use any suitable method. In various examples, each connector 712, 714 in a play piece 702, 704 may comprise hardware logic (such as an electronic switch) to enable the processor 710 within the core play piece 702 to dissect the bus (i.e. the electrical connections connecting all the play pieces) programmatically. This can be described with reference to
In the example shown in
In order that the core play piece knows when it has identified the relative position of all the connected play pieces, the core may first (prior to causing the bus to be dissected) detect the IDs of all the connected play pieces (block 91, e.g. when the bus is fully connected) and then proceed with the iterative discovery process until all detected IDs have been discovered. An example method of operation of the core play piece which uses this is described below.
In a first detection step (block 91) the core play piece detects all the connected play pieces, which in the example of
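The iterative discovery process can be simulated in software. In the sketch below the core first reads the full ID set with the bus connected (the equivalent of block 91), the bus is then dissected (all switches open), and switches are re-closed one hop at a time so that the order in which IDs become visible reveals the chain topology. The `Bus` class is a pure software stand-in for the electrical bus and its connector switches; all names are invented.

```python
# Software simulation of the bus-dissection idea: switches are re-closed one
# hop at a time and the order in which IDs become visible reveals the chain
# topology. The Bus class is a stand-in for the electrical bus; names invented.

class Bus:
    def __init__(self, chain):
        self.chain = chain       # true ordered chain of piece IDs, core first
        self.closed = 0          # number of connector switches currently closed

    def visible_ids(self):
        # Only pieces up to the last closed switch can be read by the core.
        return set(self.chain[: self.closed + 1])

    def close_next_switch(self):
        self.closed += 1

def discover_topology(bus, all_ids):
    """Close switches one at a time; record each newly visible ID in order."""
    known = bus.visible_ids()    # with all switches open, only the core is seen
    order = list(known)
    while known != all_ids:
        bus.close_next_switch()
        newly_visible = bus.visible_ids() - known
        order.extend(newly_visible)   # exactly one new piece per closed switch
        known |= newly_visible
    return order
```

Here `all_ids` plays the role of the ID set detected in the first step, so the loop knows when every connected piece has been placed in the chain.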
Referring back to
Some or all of the methods shown in
When a user re-arranges the play pieces (e.g. by removing or adding a new play piece), it may not be necessary to perform a full topology analysis (e.g. as shown in
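One way the partial re-analysis might work is sketched below: the core diffs the newly detected ID set against its cached chain and keeps the prefix that still matches, so only the pieces beyond the first change need to be re-probed. The data shapes are illustrative assumptions.

```python
# Hypothetical sketch of a partial re-analysis after a re-arrangement: keep
# the cached prefix that still matches and re-probe only beyond the change.
# The data shapes are illustrative assumptions.

def incremental_update(cached_chain, detected_ids):
    """Return (kept prefix of the chain, IDs that still need re-discovery)."""
    kept = []
    for piece_id in cached_chain:
        if piece_id not in detected_ids:
            break            # first missing piece invalidates the rest
        kept.append(piece_id)
    to_probe = detected_ids - set(kept)
    return kept, to_probe
```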
In addition to collecting the play piece IDs and communicating them to the game (in blocks 804-808), the core play piece may additionally perform one or more additional functions. As shown in
Where a peripheral play piece 704 or the core play piece 702 comprises one or more sensors, the core play piece 702 collects the sensor data (block 810) and communicates this data to the game 106 (block 812). As described above with reference to the IDs, the data which is communicated to the game 106 (e.g. via wireless module 708) may be the raw sensor data or an aggregated or processed form of the sensor data.
In various examples, the core play piece 702 may receive commands from the game (block 814), for example where a play piece (core/peripheral) comprises an actuator or display. In response to receiving such a command, it may be processed within the core play piece (e.g. where the core play piece comprises an actuator/display) or may be passed to a connected play piece (block 816), e.g. to a play piece identified by its ID within the received command. In various examples, actuators may be used to constrain a physical object to a path formed by the play pieces and inferred by the game 106.
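The process-locally-or-forward decision in block 816 might be sketched as follows, with an invented command format and a forwarding callback standing in for the connector bus.

```python
# Illustrative routing sketch for block 816: the core processes commands
# addressed to itself and forwards the rest down the chain. The command
# format and the forwarding callback are assumptions made for illustration.

def route_command(command, core_id, forward):
    """Handle a game command in the core piece or pass it to the named piece."""
    target = command.get("target_id", core_id)
    if target == core_id:
        return ("handled", command["action"])   # e.g. drive a local actuator
    forward(target, command)                    # hand off towards the peripheral
    return ("forwarded", target)
```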
In various examples, such as the example shown in
Computing-based device 1000 comprises one or more processors 1002 which may be microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device in order to perform the methods described herein (e.g. infer a path and present at least a part of the path in a GUI). In some examples, for example where a system on a chip architecture is used, the processors 1002 may include one or more fixed function blocks (also referred to as accelerators) which implement a part of the method of path inference in hardware (rather than software or firmware).
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs) and Complex Programmable Logic Devices (CPLDs).
Platform software comprising an operating system 1004 or any other suitable platform software may be provided at the computing-based device to enable application software, such as a game 106 to be executed on the device. As shown in
The computer executable instructions may be provided using any computer-readable media that is accessible by computing based device 1000. Computer-readable media may include, for example, computer storage media such as memory 1012 and communications media. Computer storage media, such as memory 1012, includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Propagated signals may be present in a computer storage media, but propagated signals per se are not examples of computer storage media. Although the computer storage media (memory 1012) is shown within the computing-based device 1000 it will be appreciated that the storage may be distributed or located remotely and accessed via a network or other communication link (e.g. using communication interface 1014).
The communication interface 1014 may be arranged to receive data from one or more physical play pieces and may comprise a wireless transmitter and/or wireless receiver. In various examples the communication interface 1014 receives data from the physical play pieces directly and in other examples, the communication interface 1014 may receive data from the play pieces via an intermediary device. In examples where the play pieces comprise a feedback mechanism (e.g. LEDs arranged to show a location of a virtual object constrained to the inferred path) the communication interface 1014 may also be arranged to transmit data (e.g. commands) to one or more physical play pieces.
The computing-based device 1000 may also comprise an input/output controller 1016. The input/output controller may be arranged to output display information (e.g. the GUI) to a display device 1018 which may be separate from or integral to the computing-based device 1000. The input/output controller 1016 may also be arranged to receive and process input from one or more devices, such as sensors 1020 or a sensing module 1022 (which may be internal or external to the computing based device 1000) or a user input device 1024 (e.g. a mouse, keyboard, camera, microphone or other sensor). In some examples the user input device 1024 may detect voice input, user gestures or other user actions and may provide a natural user interface (NUI). This user input may be used to further control game play. In an embodiment the display device 1018 may also act as the user input device 1024 if it is a touch sensitive display device. The input/output controller 1016 may also output data to devices other than the display device, e.g. a locally connected printing device (not shown in
Any of the input/output controller 1016, display device 1018 and the user input device 1024 may comprise NUI technology which enables a user to interact with the computing-based device in a natural manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls and the like. Examples of NUI technology that may be provided include but are not limited to those relying on voice and/or speech recognition, touch and/or stylus recognition (touch sensitive displays), gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. Other examples of NUI technology that may be used include intention and goal understanding systems, motion gesture detection systems using depth cameras (such as stereoscopic camera systems, infrared camera systems, RGB camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, 3D displays, head, eye and gaze tracking, immersive augmented reality and virtual reality systems and technologies for sensing brain activity using electric field sensing electrodes (EEG and related methods).
Although the present examples are described and illustrated herein as being implemented in a play system (comprising a set of physical play pieces and an associated game) as shown in
Many of the examples described above involve physical game play by a user with a physical object which is constrained to the inferred path. It will be appreciated, however, that game play may involve a combination of some physical game play and some virtual game play.
The term ‘computer’ or ‘computing-based device’ is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms ‘computer’ and ‘computing-based device’ each include PCs, servers, mobile telephones (including smart phones), tablet computers, set-top boxes, media players, games consoles, personal digital assistants and many other devices.
The methods described herein may be performed by software in machine-readable form on a tangible storage medium, e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the methods described herein when the program is run on a computer, and where the computer program may be embodied on a computer-readable medium. Examples of tangible storage media include computer storage devices comprising computer-readable media such as disks, thumb drives, memory etc. and do not include propagated signals. Propagated signals may be present in a tangible storage medium, but propagated signals per se are not examples of tangible storage media. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
This acknowledges that software can be a valuable, separately tradable commodity. It is intended to encompass software, which runs on or controls “dumb” or standard hardware, to carry out the desired functions. It is also intended to encompass software which “describes” or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
Those skilled in the art will realize that storage devices utilized to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that, by utilizing conventional techniques, all or a portion of the software instructions may be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.
Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to ‘an’ item refers to one or more of those items.
The steps of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the spirit and scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
The term ‘comprising’ is used herein to mean including the method blocks or elements identified, but such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
The term ‘subset’ is used herein to refer to a proper subset such that a subset of a set does not comprise all the elements of the set (i.e. at least one of the elements of the set is missing from the subset).
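The proper-subset definition above can be illustrated with a short check; the function name and example set contents are illustrative only.

```python
# Illustration of the 'subset' definition above: a proper subset must
# omit at least one element of the original set.

def is_proper_subset(candidate, full_set):
    """True when every element of candidate is in full_set and at least
    one element of full_set is missing from candidate."""
    # Equivalent to Python's built-in proper-subset operator: candidate < full_set
    return candidate <= full_set and candidate != full_set

# Hypothetical set of physical play pieces
pieces = {"straight", "curve", "ramp", "junction"}

assert is_proper_subset({"straight", "curve"}, pieces)  # misses some pieces
assert not is_proper_subset(pieces, pieces)             # the whole set is not a proper subset
```

Under this definition, selecting every play piece in the set would not constitute a ‘subset’ as the term is used herein.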
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this specification.
Number | Name | Date | Kind |
---|---|---|---|
6031549 | Hayes-Roth | Feb 2000 | A |
6149490 | Hampton et al. | Nov 2000 | A |
6159101 | Simpson | Dec 2000 | A |
6290565 | Galyean, III et al. | Sep 2001 | B1 |
6305688 | Waroway | Oct 2001 | B1 |
6454624 | Duff et al. | Sep 2002 | B1 |
6572431 | Maa | Jun 2003 | B1 |
6575802 | Yim et al. | Jun 2003 | B2 |
6629591 | Griswold et al. | Oct 2003 | B1 |
6682392 | Chan | Jan 2004 | B2 |
6773322 | Gabai et al. | Aug 2004 | B2 |
6773344 | Gabai et al. | Aug 2004 | B1 |
6877096 | Chung et al. | Apr 2005 | B1 |
6923717 | Mayer et al. | Aug 2005 | B2 |
6954659 | Tushinsky et al. | Oct 2005 | B2 |
7154363 | Hunts | Dec 2006 | B2 |
7439972 | Timcenko | Oct 2008 | B2 |
7641476 | Didur et al. | Jan 2010 | B2 |
7695338 | Dooley et al. | Apr 2010 | B2 |
7749089 | Briggs et al. | Jun 2010 | B1 |
8058837 | Beers et al. | Nov 2011 | B2 |
8079846 | Cookson | Dec 2011 | B1 |
8087939 | Rohrbach et al. | Jan 2012 | B2 |
8157611 | Zheng | Apr 2012 | B2 |
8228202 | Buchner et al. | Jul 2012 | B2 |
8257157 | Polchin | Sep 2012 | B2 |
8317566 | Ganz | Nov 2012 | B2 |
8332544 | Ralls et al. | Dec 2012 | B1 |
8475275 | Weston et al. | Jul 2013 | B2 |
8548819 | Chan et al. | Oct 2013 | B2 |
8585476 | Mullen | Nov 2013 | B2 |
8628414 | Walker et al. | Jan 2014 | B2 |
8825187 | Hamrick et al. | Sep 2014 | B1 |
8854925 | Lee et al. | Oct 2014 | B1 |
9696757 | Scott et al. | Jul 2017 | B2 |
9919226 | Scott et al. | Mar 2018 | B2 |
20020196250 | Anderson et al. | Dec 2002 | A1 |
20030013524 | Cochran | Jan 2003 | A1 |
20030030595 | Radley-Smith | Feb 2003 | A1 |
20040053690 | Fogel et al. | Mar 2004 | A1 |
20050132290 | Buchner et al. | Jun 2005 | A1 |
20050227811 | Shum et al. | Oct 2005 | A1 |
20050255916 | Chen | Nov 2005 | A1 |
20060058018 | Toulis et al. | Mar 2006 | A1 |
20070097832 | Koivisto et al. | May 2007 | A1 |
20070155505 | Huomo | Jul 2007 | A1 |
20070188444 | Vale et al. | Aug 2007 | A1 |
20070198117 | Wajihuddin | Aug 2007 | A1 |
20070218988 | Lucich | Sep 2007 | A1 |
20070279852 | Daniel et al. | Dec 2007 | A1 |
20070293319 | Stamper et al. | Dec 2007 | A1 |
20080014835 | Weston et al. | Jan 2008 | A1 |
20080045283 | Stamper et al. | Feb 2008 | A1 |
20080076519 | Chim | Mar 2008 | A1 |
20080153559 | De Weerd | Jun 2008 | A1 |
20080280684 | McBride et al. | Nov 2008 | A1 |
20090008875 | Wu et al. | Jan 2009 | A1 |
20090029771 | Donahue | Jan 2009 | A1 |
20090047865 | Nakano | Feb 2009 | A1 |
20090048009 | Brekelmans et al. | Feb 2009 | A1 |
20090053970 | Borge | Feb 2009 | A1 |
20090081923 | Dooley et al. | Mar 2009 | A1 |
20090082879 | Dooley et al. | Mar 2009 | A1 |
20090094287 | Johnson et al. | Apr 2009 | A1 |
20090104988 | Enge et al. | Apr 2009 | A1 |
20090197658 | Polchin | Aug 2009 | A1 |
20090206548 | Hawkins et al. | Aug 2009 | A1 |
20090251419 | Radley-Smith | Oct 2009 | A1 |
20090265642 | Carter et al. | Oct 2009 | A1 |
20090291764 | Kirkman et al. | Nov 2009 | A1 |
20090307592 | Kalanithi et al. | Dec 2009 | A1 |
20100009747 | Reville et al. | Jan 2010 | A1 |
20100026698 | Reville et al. | Feb 2010 | A1 |
20100035726 | Fisher et al. | Feb 2010 | A1 |
20100103075 | Kalaboukis et al. | Apr 2010 | A1 |
20100113148 | Haltovsky et al. | May 2010 | A1 |
20100144436 | Marks et al. | Jun 2010 | A1 |
20100167623 | Eyzaguirre et al. | Jul 2010 | A1 |
20100274902 | Penman et al. | Oct 2010 | A1 |
20100279823 | Waters | Nov 2010 | A1 |
20100331083 | Maharbiz | Dec 2010 | A1 |
20110021109 | Le et al. | Jan 2011 | A1 |
20110172015 | Ikeda et al. | Jul 2011 | A1 |
20110215998 | Fitzgerald | Sep 2011 | A1 |
20110239143 | Ye et al. | Sep 2011 | A1 |
20120007817 | Heatherly et al. | Jan 2012 | A1 |
20120050198 | Cannon | Mar 2012 | A1 |
20120052931 | Jaqua et al. | Mar 2012 | A1 |
20120052934 | Maharbiz et al. | Mar 2012 | A1 |
20120122059 | Schweikardt et al. | May 2012 | A1 |
20120190453 | Skaff et al. | Jul 2012 | A1 |
20120268360 | Mikhailov | Oct 2012 | A1 |
20120286629 | Johnson et al. | Nov 2012 | A1 |
20120295700 | Reiche | Nov 2012 | A1 |
20120295704 | Reiche et al. | Nov 2012 | A1 |
20130109267 | Schweikardt et al. | May 2013 | A1 |
20130109272 | Rindlisbacher | May 2013 | A1 |
20130122753 | Blakborn | May 2013 | A1 |
20130165223 | Leyland et al. | Jun 2013 | A1 |
20130173658 | Adelman et al. | Jul 2013 | A1 |
20130196766 | Leyland et al. | Aug 2013 | A1 |
20130196770 | Barney et al. | Aug 2013 | A1 |
20130231193 | Heatherly et al. | Sep 2013 | A1 |
20130271390 | Lyons et al. | Oct 2013 | A1 |
20130288563 | Zheng et al. | Oct 2013 | A1 |
20130324239 | Ur et al. | Dec 2013 | A1 |
20140002580 | Bear et al. | Jan 2014 | A1 |
20140011595 | Muller | Jan 2014 | A1 |
20140055352 | Davis et al. | Feb 2014 | A1 |
20140141865 | Tropper et al. | May 2014 | A1 |
20140213357 | Claffey | Jul 2014 | A1 |
20140235198 | Lee et al. | Aug 2014 | A1 |
20140235353 | Witchey | Aug 2014 | A1 |
20150258435 | Zhang et al. | Sep 2015 | A1 |
20160101361 | Scott et al. | Apr 2016 | A1 |
20160101364 | Scott et al. | Apr 2016 | A1 |
20160104321 | Scott et al. | Apr 2016 | A1 |
20170232347 | Scott et al. | Aug 2017 | A1 |
Number | Date | Country |
---|---|---|
103096987 | May 2013 | CN |
103236720 | Aug 2013 | CN |
203434701 | Feb 2014 | CN |
103999012 | Aug 2014 | CN |
H0920533 | Aug 1997 | JP |
2003038842 | Feb 2003 | JP |
2011036418 | Feb 2011 | JP |
2013135374 | Jul 2013 | JP |
2001012285 | Feb 2001 | WO |
2001069799 | Sep 2001 | WO |
2001069829 | Sep 2001 | WO |
2009037679 | Mar 2009 | WO |
2011112498 | Sep 2011 | WO |
2012160055 | Nov 2012 | WO |
2015138267 | Sep 2015 | WO |
Entry |
---|
“International Search Report & Written Opinion for PCT Patent Application No. PCT/US2015/038217”, dated Sep. 30, 2015, Filed Date: Jun. 29, 2015, 12 Pages. |
Gilpin, et al., “Robot pebbles: One centimeter modules for programmable matter through self-disassembly”, 2010 IEEE International Conference on Robotics and Automation, May 3, 2010. |
Schweikardt, Eric, “Designing Modular Robots”, Nov. 19, 2013, Available at: http://www.cmu.edu/architecture/research/grad_work/2009_phdcd_schweikardt_eric.pdf. |
“Skylanders Swapforce”, Sep. 11, 2013, Available at: http://www.skylanders.com/swapforce. |
“Disney Infinity”, Nov. 19, 2013, Available at: https://infinity.disney.com/en-gb. |
“Cubelets”, Sep. 11, 2013, Available at: http://www.modrobotics.com/. |
“Shapeways”, Nov. 19, 2013, Available at: http://shapeways.com/. |
Lampe, et al., “The Augmented Knight's Castle—Integrating Mobile and Pervasive Computing Technologies into Traditional Toy Environments”, Nov. 21, 2013, Available at: http://www.vs.inf.ethz.ch/publ/papers/mlampe-pg07-akc.pdf. |
Kikin-Gil, Ruth, “BuddyBeads”, Published on: Oct. 10, 2006, Available at: http://www.ruthkikin.com/Images/r.kikin-gil_thesis2005.pdf. |
Fortmann, et al., “Illumee: Aesthetic Light Bracelet as a Wearable Information Display for Everyday Life”, In Proceedings of ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, Sep. 8, 2013, 4 pages. |
Labrune, et al., “Telebeads: Social Network Mnemonics for Teenagers”, In Proceedings of Conference on Interaction Design and Children, Jun. 7, 2006, 8 pages. |
Ahde, et al., “Hello—Bracelets Communicating Nearby Presence of Friends”, In Proceedings of the Tenth Anniversary Conference on Participatory Design, Sep. 30, 2008, 3 pages. |
Kuniavsky, Mike, “Smart Things: Ubiquitous Computing User Experience Design”, Published on: Sep. 2010, Available at: http://books.google.co.in/books?id=-WLyUCBBUVAC&pg=PA89&Ipg=PA89&dq=Interactive+Smart+Beads+and+Bracelet&source=bl&ots=HA6ZA1Bssz&sig=x1s2X1pGZle-5oVqX3uZA0jZ1ks&hl=en&sa=X&ei=BxWLUqSGI4X3rQfh9oDYCg&ved=0CFAQ6AEwBg#v=onepage&q=Interactive%20Smart%20Beads%20and%20Bracelet&f=false. |
Robertson, Judy, “Encouraging Girls to Study Geeky Subjects (Part 2): Programmable Bracelets”, Published on: Apr. 12, 2010, Available at: http://cacm.acm.org/blogs/blog-cacm/85132-encouraging-girls-to-study-geeky-subjects-part-2-programmable-bracelets/fulltext. |
Lampe, et al., “Integrating Interactive Learning Experiences into Augmented Toy Environments”, In Proceedings of the Pervasive Learning Workshop at the Pervasive Conference, May 2007, 8 pages. |
“Seebo Platform”, Published on: Jun. 22, 2013, Available at: http://www.seebo.com/. |
Raffle, et al., “Topobo: A Constructive Assembly System with Kinetic Memory”, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Apr. 24, 2004. |
Schmid, et al., “Networking Smart Toys with Wireless ToyBridge and ToyTalk”, In IEEE International Conference on Computer Communications, Apr. 10, 2011, 2 pages. |
Patrizia, et al., “A Robotic Toy for Children with special needs: From requirements to Design”, In IEEE 11th International Conference on Rehabilitation Robotics, Nov. 20, 2013, 6 pages. |
Zaino, Jennifer, “JNFC Technology Brings New Life to Games”, In Journal of RFID, Oct. 1, 2012, 10 pages. |
U.S. Appl. No. 14/203,991, Zhang, et al., “Generation of Custom Modular Objects”, filed Mar. 11, 2014. |
U.S. Appl. No. 14/204,239, Zhang, et al., “Gaming System for Modular Toys”, filed Mar. 11, 2014. |
U.S. Appl. No. 14/204,483, Saul, et al., “Interactive Smart Beads”, filed Mar. 11, 2014. |
U.S. Appl. No. 14/204,740, Saul, et al., “A Modular Construction for Interacting with Software”, filed Mar. 11, 2014. |
U.S. Appl. No. 14/204,929, Zhang, et al., “Storing State for Physical Modular Toys”, filed Mar. 11, 2014. |
U.S. Appl. No. 14/205,077, Zhang, et al., “Data Store for a Modular Assembly System”, filed Mar. 11, 2014. |
“Disney Infinity”, Published on: Aug. 25, 2013, Available at: http://www.essentialkids.com.au/entertaining-kids/games-and-technology/disney-infinity-20130823-2sgg0.html. |
Marshall, Rick, “Skylanders: Swap Force Review” Published on: Nov. 1, 2013, Available at: http://www.digitaltrends.com/game-reviews/skylanders-swap-force-review/. |
Jennings, et al., “CONSTRUCT/VizM: A Framework for Rendering Tangible constructions”, In Proceedings of the 14th Congress of the Iberoamerican Society of Digital Graphics, Nov. 17, 2010, 4 pages. |
Kitamura, et al., “Real-time 3D Interaction with ActiveCube”, In Proceedings of Extended Abstracts on Human Factors in Computing Systems, Mar. 31, 2001, 2 pages. |
“Siftables are Changing the Shape of Computing”, Published on: May 9, 2010, Available at: http://singularityhub.com/2010/05/05/siftables-are-changing-the-shape-of-computing/. |
“Cuff—fashion wearable bracelets”, 2014, Available at: http://www.elle.com/_mobile/news/fashion-accessories/cufflinc-wearable-techsrc=spr_TWITTER&spr_id=1448_51714286&linkId=7882609. |
“Prodigy—Kickstarter”, 2014, Available at: https://www.kickstarter.com/projects/121511007/prodigy-the-game. |
“CONSTRUKTS—Part time UI/UX and Engineer Positions”, 2014, Available at: http://www.construkts.com. |
“International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2015/038217”, dated Oct. 18, 2016, 10 Pages. |
“Second Written Opinion Issued in PCT Application No. PCT/US2015/038217”, dated Jun. 8, 2016, 8 Pages. |
Betters, “LeapFrog LeapBand is an activity band for kids with virtual pet capabilities,” downloaded from https://uk.news.yahoo.com/leapfrog-leapband-activity-band-kids-virtual-pet-capabilities-231500937.html?guccounter=1, 1 p. (May 2014). |
Final Office Action dated Jul. 9, 2018, from U.S. Appl. No. 15/582,146, 25 pp. |
International Preliminary Report on Patentability dated Jul. 21, 2016, from International Patent Application No. PCT/US2015/054103, 5 pp. |
International Search Report and Written Opinion dated Feb. 1, 2016, from International Patent Application No. PCT/US2015/054103, 12 pp. |
Murphy, “The Reinvented Tamagotchi: Bright, Flashy and Just as Needy,” downloaded from: http://mashable.com/2014/02/20/tamagotchi-friends/, 22 pp. (Feb. 2014). |
Office Action dated Oct. 6, 2016, from U.S. Appl. No. 14/509,940, 26 pp. |
Office Action dated Apr. 6, 2017, from U.S. Appl. No. 14/509,862, 15 pp. |
Office Action dated Oct. 5, 2017, from U.S. Appl. No. 15/582,146, 35 pp. |
Office Action dated Jun. 13, 2018, from U.S. Appl. No. 14/509,919, 18 pp. |
Office Action dated Nov. 19, 2018, from U.S. Appl. No. 15/582,146, 22 pp. |
Persson, “Minecraft,” downloaded from: https://web.archive.org/web/20140531165512/https://minecraft.net/game, 3 pp. (May 2014). |
“Proxi In-Device Charging Solution”, Retrieved From: http://powerbyproxi.com/consumer-electronics/industrial/proxi-in-device-charging-solution/, 8 pp. (May 19, 2013). |
Webster, “Nex Band is a smart, modular charm bracelet for gaming on your wrist,” downloaded from: http://www.theverge.com/2014/2/13/5289404/nex-band-is-a-smart-modular-charm-bracelet, 4 pp. (Feb. 2014). |
“World of Warcraft Crafting Skills”, Retrieved From: https://web.archive.org/web/20140527091410/http://us.battle.net/wow/en/, May 27, 2014, 3 Pages. |
Wu, “Customizable Wristband Sensor for Healthcare Monitoring 24/7,” downloaded from: http://marblar.com/idea/493o7, 4 pp. (Nov. 2013). |
Communication pursuant to Article 94(3) EPC dated Feb. 14, 2019, from European Patent Application No. 15782188.5, 4 pp. |
Final Office Action dated Jan. 2, 2019, from U.S. Appl. No. 14/509,919, 18 pp. |
“First office action and Search Report Issued in Chinese Patent Application No. 201580054924.2”, dated May 24, 2019, 13 Pages. |
“Final Office Action Issued in U.S. Appl. No. 15/582,146”, dated Apr. 4, 2019, 25 Pages. |
“Second Office Action Issued in Chinese Patent Application No. 201580054924.2”, dated Aug. 9, 2019, 9 Pages. |
“Third Office Action Issued in Chinese Patent Application No. 201580054924.2”, dated Sep. 20, 2019, 7 Pages. |
“Office Action Issued in Japanese Patent Application No. 2017-518972”, dated Sep. 24, 2019, 9 Pages. |
Number | Date | Country | |
---|---|---|
20150375114 A1 | Dec 2015 | US |