There are many ways in which a user can interact with a computer game. Typically, a user controls the game via a keyboard and mouse, a games controller (which may be handheld or may detect body movement) or a touch screen, depending upon the platform on which the game is being played (e.g. computer, games console or handheld device). A number of games have also been developed in which gameplay is enabled (or unlocked) through the use of physical character toys which are placed on a custom base connected to a games console. Placing different toys on the custom base enables different gameplay.
The embodiments described below are not limited to implementations which solve any or all of the disadvantages of known apparatus for interacting with interactive software experiences, such as games.
The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements or delineate the scope of the specification. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
A modular assembly system is described which enables interaction with an interactive software experience such as a game. The system enables a coherent physical whole object to be assembled from a core module and one or more peripheral modules. The core module includes a battery, processor and a wireless module which is able to communicate with the interactive software experience which runs on a separate computing device such as a smartphone, tablet or games console. Each of the peripheral modules stores a module ID and these IDs are collected by the core module and communicated to the interactive software experience. The user experience within the interactive software experience changes dependent upon the set of modules which are connected to form the coherent physical whole object and may also be altered as a result of manipulation of the coherent physical whole object or individual modules.
Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
Like reference numerals are used to designate like parts in the accompanying drawings.
The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
A system is described below which comprises a plurality of hardware modules which are each a sub-component of a coherent physical whole object, such as a toy. The modules may be connected together and re-arranged by users as part of interaction with an interactive software experience (e.g. a game) and in some examples, the act of assembly of a plurality of modules by a user into a complete object unlocks or enables the interactive software experience or parts thereof (e.g. particular features, mini-games, levels, etc.). User interaction with the coherent physical whole object (or modules that form part of the object), e.g. user manipulation of the object, may also affect the operation of the interactive software experience. The coherent physical whole object therefore acts as a user input device for the interactive software experience.
Once assembled, the coherent physical whole object is physically attached together to form a single object, i.e. detaching a module requires a deliberate action by the user (e.g. an applied force to overcome a mechanical friction fit or a magnetic attachment holding the modules together, or an unclasping or threading action so that a module can be removed or disconnected from an adjacent module). This is in contrast to systems in which a module or modules are sensed to be near or touching one another, but no mechanical element holds them together (with the exception of gravity, if the whole assembly is only lifted vertically from underneath). The coherent physical whole object is connected wirelessly to the interactive software experience and is not connected by a wire or other physical link. The coherent physical whole object 100 is freely moveable (e.g. in three dimensions) by a user and is capable of communicating with the interactive software experience while it is in motion. The coherent physical whole object 100 (and/or the modules from which it is formed) may comprise mechanical articulation or movement affordances, e.g. it may have joints such as hinges, or some elements may be mobile relative to other elements, e.g. sliding or rotating with respect to one another.
Each coherent physical whole object comprises at least one core module and one or more peripheral modules. A core module comprises additional processing capability compared to a peripheral module (further differences are described below) and generally a core module acts as a master while the peripheral modules act as slave modules; however a core module may in various examples be configured to act as a slave (e.g. where there is more than one core module).
The modules can, for example, represent parts (e.g. head, body, limbs) of a humanoid/animal/mystical character (e.g. a human, animal or robot), vehicles or parts thereof (e.g. chassis, wheels, roof, etc.), accessories for a vehicle or character (e.g. weapons, clothing, armor, or other objects which the character may wear/carry/hold), tracks (e.g. for a car, train, human, animal or other mobile object), bricks (e.g. as part of a construction set), baseboards or levels (e.g. where the tracks/bricks/baseboards/levels may form part of a playset such as a train set, model village, tower block, dolls house or other construction), parts of an educational toy (e.g. parts of a molecular model, skeleton or body, etc.) or fashion items or parts thereof. Where the interactive software experience is a game, the modules may comprise sub-components of a game piece (which may also be referred to as a toy or modular toy) and that game piece may be a character, vehicle, etc.
The processor 110 within the core module 102 is arranged to collect the IDs (which may be a unique ID or an ID shared with other identical-looking modules, e.g. an ID for a type of module) of each of the modules connected to form the coherent physical whole 100. The processor 110 may be a microprocessor, controller or any other suitable type of processor for processing computer executable instructions to control the operation of the core module in order to collect the IDs of connected modules. In the examples shown in
Each peripheral module 104 comprises one or more connectors 114 to physically attach the module to another module to form the coherent physical whole. Although
Each peripheral module 104 also comprises a storage element 116 which stores an identifier (ID) for the peripheral module (which may be referred to as the module ID). The storage element 116 may comprise memory or any other form of storage device. In the example shown in
Although not shown in
It will be appreciated that the modules 102, 104, 104′ shown in
In various examples, a module (which may be a peripheral module 104 or a core module 102) may comprise one or more sensors, actuators and/or displays that are controlled by and/or provide data to the processor 110 within the core module 102. Examples of sensors that may be used include: temperature sensors, vibration sensors, accelerometers, tilt sensors, gyroscopic sensors, rotation sensors, magnetometers, proximity sensors (active/passive infrared or ultrasonic), sound sensors, light sensors, etc. Examples of actuators that may be used include: motors, servos, vibration units, solenoids, speakers, etc. Examples of displays that may be used include one or more LEDs, a small LCD display, an e-ink display, etc. Where a module comprises a sensor, the sensor data may be communicated by the core module 102 to the interactive software experience.
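For illustration only, the collection of sensor data from connected modules by the core module (for onward communication to the interactive software experience) might be modelled as follows. The module records and field names below are assumptions introduced for this sketch and are not part of the described system.

```python
# Minimal sketch of the core module gathering sensor readings from its
# connected modules before forwarding them to the interactive software
# experience. Modules without sensors simply contribute nothing.

def collect_sensor_data(modules):
    """Return {module_id: readings} for every module that has sensors."""
    return {m["id"]: m["sensors"] for m in modules if m.get("sensors")}

modules = [
    {"id": "core-01"},                                  # no sensors fitted
    {"id": "arm-17", "sensors": {"tilt_deg": 12.5}},    # tilt sensor
    {"id": "leg-03", "sensors": {"vibration": 0.02}},   # vibration sensor
]
payload = collect_sensor_data(modules)
print(payload)  # only modules that carry sensors appear in the payload
```

In this sketch the payload maps module IDs to their readings, so the interactive software experience can attribute each reading to the physical module that produced it.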
The topology determination (in block 206) may use any suitable method. In various examples, each connector 112, 114 in a module 102, 104 may comprise hardware logic (such as an electronic switch) to enable the processor 110 within the core module 102 to dissect the bus (i.e. the electrical connections connecting all the modules) programmatically. This can be described with reference to
In the example shown in
In order that the core module knows when it has identified the relative position of all the connected modules, the core may first (prior to causing the bus to be dissected) detect the IDs of all the connected modules (block 31, e.g. when the bus is fully connected) and then proceed with the iterative discovery process until all detected IDs have been discovered. An example method of operation of the core module which uses this is described below.
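This detect-then-discover approach can be sketched as follows. The daisy-chain arrangement, the switch behaviour and the assumption that module IDs are unique within the object are all simplifications made for illustration; the described system may determine topology by any suitable method.

```python
# Illustrative sketch of iterative topology discovery: the core first
# enumerates every connected module ID on the fully connected bus, then
# programmatically "dissects" the bus one hop at a time, so the order in
# which IDs become visible reveals each module's relative position.
# Assumes a single daisy chain with unique module IDs.

def discover_topology(chain):
    """chain: list of module IDs in their (unknown) physical order.

    Returns the order of discovery, which for a daisy chain equals the
    physical order of attachment to the core."""
    all_ids = set(chain)             # first detection step: bus fully connected
    discovered = []
    visible_hops = 0
    while set(discovered) != all_ids:        # iterate until every detected ID is placed
        visible_hops += 1
        visible = set(chain[:visible_hops])  # close one more switch along the bus
        new_ids = visible - set(discovered)  # the newly visible module is the
        discovered.extend(sorted(new_ids))   # next one along the chain
    return discovered

# A chain arm -> hand -> sword is discovered in physical order.
print(discover_topology(["arm", "hand", "sword"]))
```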
In a first detection step (block 31) the core module detects all the connected modules, which in the example of
Referring back to
Some or all of the methods shown in
When a user re-arranges the modules (e.g. by removing or adding a new module), it may not be necessary to perform a full topology analysis (e.g. as shown in
In addition to collecting the module IDs and communicating them to the interactive software experience (in blocks 204-208), the core module may perform one or more further functions. As shown in
Where a peripheral module 104 or the core module 102 comprises one or more sensors, the core module 102 collects the sensor data (block 210) and communicates this data to the interactive software experience (block 212). As described above with reference to the IDs, the data which is communicated to the interactive software experience (e.g. via wireless module 108) may be the raw sensor data or an aggregated or processed form of the sensor data.
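As one example of a "processed form" of sensor data, the core might average a short burst of raw accelerometer samples before transmission rather than sending every sample. The sample values and window size below are illustrative assumptions.

```python
# Sketch of a simple aggregation step: reduce a burst of raw 3-axis
# accelerometer samples to a single mean value per axis, so the core
# transmits one compact reading instead of the whole burst.

def aggregate(samples):
    """Return the per-axis mean of a list of (x, y, z) samples."""
    axes = zip(*samples)  # regroup samples into per-axis sequences
    return tuple(sum(axis) / len(axis) for axis in axes)

raw = [(0.0, 9.8, 0.1), (0.2, 9.7, 0.1), (0.1, 9.9, 0.1)]
print(aggregate(raw))  # one averaged (x, y, z) tuple for the whole burst
```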
In various examples, the core module 102 may receive commands from the interactive software experience (block 214), for example where a module (core/peripheral) comprises an actuator or display. In response to receiving such a command, it may be processed within the core module (e.g. where the core module comprises an actuator/display) or may be passed to a connected module (block 216), e.g. to a module identified by its ID within the received command.
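The command handling just described — process locally if addressed to the core, otherwise pass to the connected module named in the command — can be sketched as below. The command format, IDs and handler behaviour are assumptions made for this illustration.

```python
# Sketch of the core module dispatching a command received from the
# interactive software experience: commands targeting the core are handled
# locally, all others are forwarded to the addressed peripheral module.

CORE_ID = "core-01"

def route_command(command, forward):
    """Handle a command locally or forward it to the addressed module."""
    if command["target"] == CORE_ID:
        return f"core handled {command['action']}"
    return forward(command["target"], command["action"])

sent = []  # record of commands passed on to peripheral modules
def fake_forward(target, action):
    sent.append((target, action))
    return f"forwarded {action} to {target}"

print(route_command({"target": "core-01", "action": "blink_led"}, fake_forward))
print(route_command({"target": "arm-17", "action": "vibrate"}, fake_forward))
```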
In various examples, such as the example shown in
In various examples, the central core (connection 412) may be free to move independently of the outer ring (connection 414). This may be achieved, for example, by forming the central core on a tab or tongue 422 which is only connected to the outer portion in one place 424, thereby forming an articulated arrangement, as shown in the second plan view 420 in
In order to physically connect the modules together, such that they do not separate when pushed gently or picked up, the connections 412, 414 may be formed from magnetic material, with each of the connectors 402, 404 being of opposite polarity so that the connections in different connectors are attracted to each other and the modules are held together by the magnetic attraction. By appropriate selection of magnets and dimensions, the attractive forces may be sufficient to hold the modules together when picked up by a user, but not so strong that they cannot be separated by the user when they wish to re-arrange the modules. Different strengths of magnets may be used for different applications (e.g. less strong attraction for toys for young children).
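The magnet selection trade-off can be illustrated with a rough calculation: the attractive force must exceed the weight of the hanging modules (with some safety margin) yet remain low enough for the user to separate them. All of the numbers below are illustrative assumptions, not values from the described system.

```python
# Rough feasibility check for a magnetic connector: strong enough to hold
# the hanging modules when the object is picked up, weak enough that a
# user (e.g. a young child) can still pull the modules apart.

G = 9.81  # gravitational acceleration, m/s^2

def magnet_ok(hold_force_n, hanging_mass_kg, max_pull_n, safety=2.0):
    """True if the magnet holds the hanging modules with a safety margin
    but can still be separated by a pull of at most max_pull_n."""
    weight = hanging_mass_kg * G
    return safety * weight <= hold_force_n <= max_pull_n

# A 3 N magnet comfortably holds a 50 g module and is easy to separate.
print(magnet_ok(hold_force_n=3.0, hanging_mass_kg=0.05, max_pull_n=10.0))  # True
# A 0.5 N magnet fails the safety margin: the module would drop off.
print(magnet_ok(hold_force_n=0.5, hanging_mass_kg=0.05, max_pull_n=10.0))  # False
```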
In an alternative configuration of magnetic connector, instead of using magnetic material to form the electrical connections, a magnet 432 may be provided behind a PCB 434 (which may be a flexible PCB) providing the contact face 436 (and contacts 412, 414), as shown in the third cross section 430 in
Where magnets are used in the connectors to physically hold them together, the magnets (e.g. magnet 432 or the magnets providing connections 412, 414 in the alternative implementation) may be permanent magnets or may be electromagnets. Where electromagnets are used, the magnets may be programmable such that their polarity may be altered under the control of the processor 110 within the core module (and/or any processors within the peripheral modules, where provided). Use of programmable electromagnets enables a processor (e.g. processor 110 within the core module) to control whether a particular module can connect to another module or not. For example, although a module may initially be able to connect (as the electromagnets have opposite polarities) the processor 110 may subsequently change the polarity of one or more of the magnets so that they no longer attract but instead repel each other, forcibly ejecting the module. This may, for example, be used as part of the interactive software experience (e.g. within game play, for example during a battle) and/or to restrict the interoperability of modules. Examples of limiting interoperability include, but are not limited to, limiting which modules can connect to which connections on a core module (e.g. only “head modules” may be allowed to connect to the “head connection” on the core, and where a non-head module is detected, e.g. in block 204 of
The interoperability of modules may also be restricted using magnetic polarity in examples where permanent magnets are used. For example, a "head" connector on a core module may have a central connector 412 of a first polarity and an outer connector 414 of a second polarity, where the first and second polarities may be the same or different. In contrast, a "limb" connector on a core module may have a central connector 412 of the second polarity and an outer connector 414 of the first polarity. This restricts limb peripheral modules to connecting only to the limb connector on the core module, and head peripheral modules to connecting only to the head connector on the core module.
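The polarity-based restriction can be modelled as a simple compatibility check: a connector pair only mates when each contact faces a contact of opposite polarity. The particular polarity assignments below are illustrative assumptions.

```python
# Sketch of interoperability restriction with permanent magnets: a socket
# and plug mate only if both the central and outer contacts face opposite
# polarities, so a "limb" plug physically cannot attach to the "head"
# socket (like poles repel).

CONNECTORS = {
    "head_socket": {"central": "N", "outer": "S"},
    "limb_socket": {"central": "S", "outer": "N"},
    "head_plug":   {"central": "S", "outer": "N"},  # mates with head_socket
    "limb_plug":   {"central": "N", "outer": "S"},  # mates with limb_socket
}

def can_mate(socket, plug):
    s, p = CONNECTORS[socket], CONNECTORS[plug]
    return s["central"] != p["central"] and s["outer"] != p["outer"]

print(can_mate("head_socket", "head_plug"))  # True: opposite poles attract
print(can_mate("head_socket", "limb_plug"))  # False: like poles repel
```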
Although
The polarity of magnets may also be exploited for purposes other than (or in addition to) rejecting/ejecting modules. In various examples, the polarity of magnets within the connectors may be used to limit the angle of movement (or angle of attachment) of a module relative to another module, as shown in the example in
Referring back to
There are many ways in which the angle of attachment may be determined by the core module 102. In one example, the core module may comprise a color sensor 602 (e.g. within the connector or adjacent to the connector 604 as shown in
Although
As described above, the core module 102 communicates with a computing device which runs an interactive software experience to provide details of the modules connected together to form a coherent physical whole object and in various examples to provide additional information such as sensor data, data on topology of modules, angle of attachment information, etc. The inputs received from the core module 102 provide inputs to the interactive software experience and affect the way that the experience works. In various examples, the coherent physical whole object 100 is shown within the GUI of the interactive software experience. Interaction of a user with the object 100 may therefore result in changes in the GUI and/or changes in the game play and assembly of particular combinations (or sets) of modules may enable or unlock parts of the interactive software experience. Where a user disconnects and then reconnects the same module (e.g. as detected by the core module and communicated to the interactive software experience) this may result in particular actions/effects within the interactive software experience (e.g. reloading a weapon where the module is a weapon in a game). As also described above, the interactive software experience may in some examples provide commands to the core module (either for execution by the core module or a peripheral module).
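The behaviour described above — particular module combinations unlocking content, and a disconnect/reconnect of the same module triggering an in-game action such as reloading a weapon — can be sketched as follows. The unlock table, module names and event representation are all illustrative assumptions.

```python
# Sketch of an interactive software experience reacting to the reported
# module set: a specific combination unlocks content, and a "removed"
# event immediately followed by an "added" event for the same module ID
# is treated as a reconnect (e.g. reloading a weapon in a game).

UNLOCKS = {frozenset({"core", "wings", "flame"}): "dragon-level"}

def check_unlocks(module_ids):
    """Return the content unlocked by this exact module set, if any."""
    return UNLOCKS.get(frozenset(module_ids))

def on_module_events(events):
    """Map remove-then-add pairs of the same module ID to game actions."""
    actions = []
    for (kind1, mid1), (kind2, mid2) in zip(events, events[1:]):
        if kind1 == "removed" and kind2 == "added" and mid1 == mid2:
            actions.append(f"reload {mid1}")
    return actions

print(check_unlocks(["core", "wings", "flame"]))                 # unlocks content
print(on_module_events([("removed", "gun"), ("added", "gun")]))  # reconnect action
```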
Computing-based device 800 comprises one or more processors 804 which may be microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device in order to run the interactive software experience. In some examples, for example where a system on a chip architecture is used, the processors 804 may include one or more fixed function blocks (also referred to as accelerators) which implement a part of the functionality in hardware (rather than software or firmware). Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs) and Complex Programmable Logic Devices (CPLDs).
Platform software comprising an operating system 806 or any other suitable platform software may be provided at the computing-based device to enable application software, such as the interactive software experiences 802 to be executed on the device.
The computer executable instructions may be provided using any computer-readable media that is accessible by computing-based device 800. Computer-readable media may include, for example, computer storage media such as memory 808 and communications media. Computer storage media, such as memory 808, includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Propagated signals may be present in computer storage media, but propagated signals per se are not examples of computer storage media. Although the computer storage media (memory 808) is shown within the computing-based device 800, it will be appreciated that the storage may be distributed or located remotely and accessed via a network or other communication link (e.g. using communication interface 810).
The communication interface 810 enables the computing-based device 800 to communicate with core modules 102. Where the computing-based device 800 communicates directly with a core module 102, the communication interface 810 comprises a wireless interface. In other examples, where the computing-based device 800 communicates with a core module via a network or intermediary device, the communication interface may use wired or wireless technology.
The computing-based device 800 also comprises an input/output controller 812 arranged to output display information to a display device 814 which may be separate from or integral to the computing-based device 800. The display information may provide a graphical user interface. The input/output controller 812 is also arranged to receive and process input from one or more devices, such as a user input device 816 (e.g. a mouse, keyboard, camera, microphone or other sensor). In some examples the user input device 816 may detect voice input, user gestures or other user actions and may provide a natural user interface (NUI). This user input may be used to control the interactive software experience 802. In various embodiments the display device 814 may also act as the user input device 816 if it is a touch sensitive display device. The input/output controller 812 may also output data to devices other than the display device, e.g. a locally connected printing device (not shown in
Any of the input/output controller 812, display device 814 and the user input device 816 may comprise NUI technology which enables a user to interact with the computing-based device in a natural manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls and the like. Examples of NUI technology that may be provided include but are not limited to those relying on voice and/or speech recognition, touch and/or stylus recognition (touch sensitive displays), gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. Other examples of NUI technology that may be used include intention and goal understanding systems, motion gesture detection systems using depth cameras (such as stereoscopic camera systems, infrared camera systems, RGB camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, 3D displays, head, eye and gaze tracking, immersive augmented reality and virtual reality systems and technologies for sensing brain activity using electric field sensing electrodes (EEG and related methods).
Each core module 102 in the super-object may identify the peripheral modules connected to it (e.g. three peripheral modules 104 and the base module 902), as described above. Based on the ID of the base module 902, each core module 102 knows that it is a base module and hence that there may be other core modules connected to it. Each core module 102 may discover the other one through electrical connections within the base module 902, through other means (e.g. using RFID), or alternatively this discovery may be performed by the interactive software experience, and each of these is depicted in the message flows in
In the first message flow 1001, cores A and B each detect the ID of the base module 902 and the other core, e.g. using the methods described above. The arrows 1011 denote the ID collection (e.g. as in block 204 of
In the second message flow 1002, cores A and B each detect the ID of the base module 902 (arrows 1021, 1022) and then detect the presence of the other core (arrow 1023) via alternative means, e.g. by using RFID tags embedded in each core and an RFID reader in each core.
Having detected the presence of the other core (but not the peripheral modules connected to that other core) using either of the methods described above (e.g. arrows 1011-1012 or 1021-1023), each core then communicates the topology of its connected modules (peripheral modules 104 and base module 902) to an interactive software experience along with the identity of the other core identified as being also connected to the base module 902, as shown in the third message flow 1003. Where both cores 1004, 1005 communicate with the same interactive software experience 1006, the interactive software experience will receive details of the entire super-object 900 (arrows 1024, 1025); however where each core module communicates with a different interactive software experience 1007, 1008 (dotted arrows 1026, 1027), an interactive software experience will not have information about the modules connected to the other core. In this example, a central server 1009 may be accessed by the interactive software experience to request details about the other core (arrows 1028, 1029). The interactive software experience may also provide details about the core module and connected modules that it has received so that this data can be shared, by the central server, with the other interactive software experience. In this way, both the interactive software experiences 1007, 1008 receive details of all the modules within the super-object 900.
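The central-server exchange in the third message flow can be sketched as follows: each interactive software experience uploads the partial topology reported by its own core and requests the other core's details, so that both eventually hold the full super-object. The function names and the in-memory "server" are assumptions made for this illustration.

```python
# Sketch of the central-server mediation: each experience shares the
# module topology reported by its core and queries the server for the
# modules connected to the other core of the same super-object.

server = {}  # central server's view: core ID -> connected module IDs

def report_and_query(core_id, modules, other_core_id):
    """Share this core's topology; return the other core's if known."""
    server[core_id] = modules
    return server.get(other_core_id)

# Experience A reports core A first; core B's details are not yet known.
print(report_and_query("core-A", ["base", "arm", "arm"], "core-B"))  # None
# Experience B reports core B and immediately learns core A's modules.
print(report_and_query("core-B", ["base", "wheel"], "core-A"))
```

In this model the server also retains each report, so a later query from the first experience would return core B's modules, completing the exchange in both directions.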
Once an interactive software experience receives details of all the modules within the super-object 900, it may allow two users to control the super-object within the same interactive software experience, with both users playing the same type of game or, alternatively, with each player playing a different game type (e.g. one playing a driving game and controlling a virtual representation of a vehicle part of the super-object, and the other playing a shooting game and controlling weapons within the super-object). Alternatively, each user may independently interact with the super-object via a different interactive software experience, and the user experiences may be entirely separate or may be linked through communication between the two interactive software experiences.
The principle of a super-object, as described above, provides the ability to create objects in a hierarchical manner. A user can assemble one or more coherent physical whole objects (each of which comprises a core module) and then connect them together to form a super-object. The connection may use a base module 902, such as the one shown in
Although the present examples are described and illustrated herein as being implemented in a gaming system, the system described is provided as an example and not a limitation. As those skilled in the art will appreciate, the present examples are suitable for application in a variety of different types of interactive software experiences/systems.
The term ‘computer’ or ‘computing-based device’ is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms ‘computer’ and ‘computing-based device’ each include PCs, servers, mobile telephones (including smart phones), tablet computers, set-top boxes, media players, games consoles, personal digital assistants and many other devices.
The methods described herein may be performed by software in machine readable form on a tangible storage medium e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. Examples of tangible storage media include computer storage devices comprising computer-readable media such as disks, thumb drives, memory etc. and do not include propagated signals. Propagated signals may be present in a tangible storage media, but propagated signals per se are not examples of tangible storage media. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
This acknowledges that software can be a valuable, separately tradable commodity. It is intended to encompass software, which runs on or controls “dumb” or standard hardware, to carry out the desired functions. It is also intended to encompass software which “describes” or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
Those skilled in the art will realize that storage devices utilized to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that, by utilizing conventional techniques known to those skilled in the art, all or a portion of the software instructions may be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.
Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to ‘an’ item refers to one or more of those items.
The steps of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the spirit and scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
The term ‘comprising’ is used herein to mean including the method blocks or elements identified, but that such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this specification.
1883194 | Jan 2008 | EP |
2311539 | Apr 2011 | EP |
2001012285 | Feb 2001 | WO |
2001069799 | Sep 2001 | WO |
2001069829 | Sep 2001 | WO |
2005083546 | Sep 2005 | WO |
2009037679 | Mar 2009 | WO |
2012160055 | Nov 2012 | WO |
Entry |
---|
Gilpin, et al., “Robot Pebbles: One Centimeter Modules for Programmable Matter through Self-disassembly”, In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, May 3, 2010, pp. 2485-2492. |
Gorbet, et al., “Triangles: Tangible Interface for Manipulation and Exploration of Digital Information Topography”, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Jan. 1, 1998, pp. 49-56. |
“International Search Report & Written Opinion Issued in PCT Application No. PCT/US2015/018562”, dated Sep. 2, 2015, 12 Pages. |
“International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2015/018562”, dated Jun. 13, 2016, 6 Pages. |
Schweikardt, Eric, “Designing Modular Robots”, Nov. 19, 2013, Available at: http://www.cmu.edu/architecture/research/grad_work/2009_phdcd_schweikardt_eric.pdf. |
“Skylanders Swapforce”, Sep. 11, 2013, Available at: http://www.skylanders.com/swapforce. |
“Disney Infinity”, Nov. 19, 2013, Available at: https://infinity.disney.com/en-gb. |
“Cubelets”, Sep. 11, 2013, Available at: http://www.modrobotics.com/. |
“Shapeways”, Nov. 19, 2013, Available at: http://shapeways.com/. |
Lampe, et al., “The Augmented Knight's Castle—Integrating Mobile and Pervasive Computing Technologies into Traditional Toy Environments”, Nov. 21, 2013, Available at: http://www.vs.inf.ethz.ch/publ/papers/mlampe-pg07-akc.pdf. |
Kikin-Gil, Ruth, “BuddyBeads”, Published on: Oct. 10, 2006, Available at: http://www.ruthkikin.com/Images/r.kikin-gil_thesis2005.pdf. |
Fortmann, et al., “Illumee: Aesthetic Light Bracelet as a Wearable Information Display for Everyday Life”, In Proceedings of ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, Sep. 8, 2013, 4 Pages. |
Labrune, et al., “Telebeads: Social Network Mnemonics for Teenagers”, In Proceedings of Conference on Interaction Design and Children, Jun. 7, 2006, 8 Pages. |
Kuniavsky, Mike, “Smart Things: Ubiquitous Computing User Experience Design”, Published on: Sep. 2010, Available at: http://books.google.co.in/books?id=-WLyUCBBUVAC&pg=PA89&lpg=PA89&dq=Interactive+Smart+Beads+and+Bracelet&source=bl&ots=HA6ZA1Bssz&sig=x1s2X1pGZIe-5oVqX3uZA0jZ1ks&h1=en&sa=X&ei=BxWLUqSGI4X3rQfh9oDYCg&ved=0CFAQ6AEwBg#v=onepage&q=Interactive%20Smart%20Beads%20and%20Bracelet&f=false. |
Robertson, Judy, “Encouraging Girls to Study Geeky Subjects (Part 2): Programmable Bracelets”, Published on: Apr. 12, 2010, Available at: http://cacm.acm.org/blogs/blog-cacm/85132-encouraging-girls-to-study-geeky-subjects-part-2-programmable-bracelets/fulltext. |
Lampe, et al., “Integrating Interactive Learning Experiences into Augmented Toy Environments”, In Proceedings of the Pervasive Learning Workshop at the Pervasive Conference, May 2007, 8 Pages. |
“Seebo Platform”, Published on: Jun. 22, 2013, Available at: http://www.seebo.com/. |
Raffle, et al., “Topobo: A Constructive Assembly System with Kinetic Memory”, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Apr. 24, 2004. |
Schmid, et al., “Networking Smart Toys with Wireless ToyBridge and ToyTalk”, In IEEE International Conference on Computer Communications, Apr. 10, 2011, 2 Pages. |
Patrizia, et al., “A Robotic Toy for Children with Special Needs: From Requirements to Design”, In IEEE 11th International Conference on Rehabilitation Robotics, Nov. 20, 2013, 6 Pages. |
Zaino, Jennifer, “NFC Technology Brings New Life to Games”, In RFID Journal, Oct. 1, 2012, 10 Pages. |
“The NEX band”, Jan. 3, 2014, Available at: http://www.mightycast.com. |
PCT Search Report and Written Opinion dated Mar. 22, 2016 for PCT Application No. PCT/US15/018562, 5 Pages. |
“Final Office Action Issued in U.S. Appl. No. 14/203,991”, dated Mar. 4, 2016, 5 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 14/203,991”, dated Aug. 16, 2016, 6 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 14/203,991”, dated Sep. 29, 2015, 11 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 14/203,991”, dated Mar. 6, 2017, 7 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 14/204,239”, dated Mar. 21, 2016, 10 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 14/204,239”, dated Sep. 12, 2016, 8 Pages. |
“Final Office Action Issued in U.S. Appl. No. 14/204,483”, dated Nov. 30, 2017, 19 Pages. |
“Final Office Action Issued in U.S. Appl. No. 14/204,483”, dated Sep. 13, 2016, 17 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 14/204,483”, dated Apr. 15, 2016, 22 Pages. |
“Second Written Opinion Issued in PCT Application No. PCT/US2015/019341”, dated Jan. 26, 2016, 6 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 14/204,483”, dated Jun. 2, 2017, 18 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 14/205,077”, dated Oct. 29, 2015, 11 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 14/205,077”, dated Jun. 28, 2016, 7 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 14/205,077”, dated Oct. 31, 2016, 7 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 14/204,483”, dated Apr. 2, 2018, 7 Pages. |
“Office Action Issued in U.S. Appl. No. 15/453,375”, dated Nov. 8, 2017, 9 Pages. |
“Office Action Issued in European Patent Application No. 15716200.9”, dated Jan. 3, 2018, 4 Pages. |
“Supplementary Search Report Issued in European Patent Application No. 15762353.9”, dated Oct. 6, 2017, 7 Pages. |
“Office Action Issued in Chinese Patent Application No. 201580013171.0”, dated Apr. 4, 2014, 7 Pages. |
Ahde, et al., “Hello—Bracelets Communicating Nearby Presence of Friends”, In Proceedings of the Tenth Anniversary Conference on Participatory Design, Sep. 30, 2008, 3 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2015/019341”, dated Jun. 15, 2015, 11 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2015/018560”, dated May 20, 2015, 11 Pages. |
“International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2015/018561”, dated Mar. 10, 2016, 8 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2015/018561”, dated May 20, 2015, 11 Pages. |
“Second Written Opinion Issued in PCT Application No. PCT/US2015/018561”, dated Dec. 14, 2015, 6 Pages. |
“International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2015/019341”, dated Apr. 26, 2016, 7 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 15/646,004”, dated Jun. 22, 2018, 12 Pages. |
“Office Action Issued in European Patent Application No. 15718279”, dated Jun. 13, 2018, 5 Pages. |
“Office Action Issued in Chinese Patent Application No. 201580013135.4”, dated Sep. 30, 2018, 9 Pages. |
“Office Action Issued in Chinese Patent Application No. 201580013167.4”, dated Aug. 28, 2018, 20 Pages. |
Number | Date | Country | |
---|---|---|---|
20150258435 A1 | Sep 2015 | US |