Embodiments of the present invention relate generally to the field of animatronic devices. More particularly, embodiments of the invention relate to computing toys that can interact with the user by connecting the user to the internet using a Smartphone, mini-computer, or any other data processing device.
Plush toys, also known as companion toys, such as teddy bears, are sometimes referred to as comfort toys because they can provide stress relief to the holder of such a toy. However, such toys are inanimate or provide limited functionality, and interact with the user only within their built-in programming capacity.
Although, with advances in technology, such comfort toys can connect to the internet, their functionality remains deeply limited. Thus, what is needed are computing-based interactive animatronic devices that are versatile and can provide a user a wide variety of functionalities while still serving as a comfort toy.
In one embodiment, an animatronic device/toy is configured as a stuffed toy and comprises a plurality of sensors, including at least an audio sensor, a visual sensor, a proximity sensor, an olfactory sensor, or a touch sensor. In another embodiment, the animatronic device can be ‘network-enabled’, that is, capable of connecting to the internet or a remote computer network via a wireless module. In one embodiment, the animatronic device also comprises an audio transmitting module and a moveable appendage located on a face portion of the stuffed toy, the moveable appendage being configured to imitate a lip-like structure. The animatronic device further comprises at least one electro-mechanical device/actuator, such as a servomotor or a magnetic actuator, coupled to the moveable appendage; the actuator can control mechanical movements of the moveable appendage. Further, the animatronic device comprises a data processing unit or system that includes at least one processor. The data processing system of the animatronic device is configured to receive a signal from the sensors of the animatronic device, the signal being generated by a verbal, visual, proximity-based, touch-based, or odor-based command, depending on the corresponding sensor transmitting the signal. The processing system then processes the signal to generate an instruction that is transmitted to a second data processing unit or system (e.g., Smartphone, tablet, desktop Personal Computer (PC), cloud computing unit, remotely networked server, etc.). In one embodiment, the second data processing system can be internally located within the animatronic device or it can be externally located. The data processing unit/system of the animatronic device then receives a response (in the form of an audible transmission) from the other (internal or external) processing unit/system. In another embodiment, the response can also be a non-verbal communication that translates into the movement of an actuator of the animatronic device.
Based on the response, the data processing system of the animatronic device controls an actuator of the animatronic device to generate mechanical movements in any moveable joint or appendage of the animatronic device (e.g., jaw, lips, eyes, neck, arms, legs, etc.). In yet another embodiment, the actuator located near the movable lip-like appendage is a servomotor. In one embodiment, an actuator or servomotor can be configured to control head movement, eye movement, arm movement, or leg movement of the animatronic device.
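The command-and-response flow described above can be outlined, by way of example and not limitation, in the following Python sketch. The class, method, and attribute names (e.g., `sensors.read()`, `link.request()`, `response.movements`) are hypothetical placeholders standing in for whatever sensor, communication, and actuator interfaces a given embodiment provides; they are not part of the disclosure.

```python
# Illustrative sketch of the sense -> process -> respond loop described above.
# All class, method, and attribute names are hypothetical placeholders.

class AnimatronicController:
    def __init__(self, sensors, link, actuators):
        self.sensors = sensors      # audio/visual/proximity/olfactory/touch
        self.link = link            # channel to the second data processing system
        self.actuators = actuators  # servomotor(s) / magnetic actuator(s)

    def step(self):
        signal = self.sensors.read()               # verbal, visual, touch, etc.
        if signal is None:
            return
        instruction = self.interpret(signal)       # local pre-processing
        response = self.link.request(instruction)  # query the second system
        if response.audio is not None:             # audible transmission
            self.actuators.play_audio(response.audio)
        for movement in response.movements:        # non-verbal responses map to
            self.actuators.move(movement)          # jaw, lips, eyes, neck, arms...

    def interpret(self, signal):
        # Minimal placeholder: tag the raw signal with its originating sensor.
        return {"type": signal.source, "payload": signal.data}
```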
In one embodiment, the Smartphone is enclosed within the animatronic device, and the sensors within the Smartphone are used by the animatronic device. In another embodiment, the data processing system of the animatronic device is capable of synchronizing an audio transmission with the mechanical movements of the moveable appendage using an actuator. In another embodiment, the synchronization commands for the actuator are transmitted by the other data processing system (whether internally or externally located) along with the audible commands (if any). In this manner, the internal processing power required by the animatronic device can be managed. In one embodiment, the second data processing unit/system communicates with the data processing unit of the animatronic device using a micro-Universal Serial Bus (micro-USB) interface located on the second data processing unit/system.
In yet another embodiment, the sensors of the animatronic device further include gyroscope(s), air pressure sensor(s), orientation sensor(s), geo-location or GPS sensor(s), and temperature sensor(s). In one embodiment, the proximity sensors include infrared or ultrasound sensor(s). In yet another embodiment, the second data processing unit/system can be a computing device within the same network as the animatronic device, and can communicate with an external web-based (or cloud-based) computing system. Thus, in such an embodiment, based on the signal received from the data processing unit of the animatronic device, the second data processing system can transmit an instruction, over a computer network, to a cloud-based or internet-based remote server, receive a response, and transmit the response to the processing system of the animatronic device. In another embodiment, the processing system of the animatronic device and the second data processing unit can communicate using a wireless communication protocol. In another embodiment, the second data processing unit is located within the same network as the animatronic device. In one embodiment, the second data processing unit is located on a remote network.
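By way of illustration, the relay role of the second data processing system can be sketched as below. The endpoint URL and the JSON message shape are assumptions made for the sketch only; the disclosure does not prescribe a particular protocol.

```python
# Hypothetical relay running on the second data processing system (e.g., a
# Smartphone or PC on the same network as the toy). The endpoint URL and the
# JSON message shape are assumptions for illustration only.
import json
import urllib.request

CLOUD_ENDPOINT = "https://example.com/animatronic/query"  # placeholder URL

def relay(instruction: dict) -> dict:
    """Forward an instruction from the toy to a remote server; return its response."""
    body = json.dumps(instruction).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # e.g., {"audio": ..., "movements": [...]}
```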
The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements.
Various embodiments and aspects of the inventions will be described with reference to details discussed below, and the accompanying drawings will illustrate the various embodiments. The following description and drawings are illustrative of the invention and are not to be construed as limiting the invention. Numerous specific details are described to provide a thorough understanding of various embodiments of the present invention. However, in certain instances, well-known or conventional details are not described in order to provide a concise discussion of embodiments of the present invention.
Reference in the specification to “one embodiment” or “an embodiment” or “another embodiment” means that a particular feature, structure, or characteristic described in conjunction with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification do not necessarily all refer to the same embodiment. The processes depicted in the figures that follow are performed by processing logic that comprises hardware (e.g., circuitry, dedicated logic, etc.), software, or a combination of both. Although the processes are described below in terms of some sequential operations, it should be appreciated that some of the operations described can be performed in a different order. Moreover, some operations can be performed in parallel rather than sequentially.
Disclosed herein are systems, methods, and devices that can use a Smartphone/tablet or any general computing device with an animatronic device (electronic toy) in the form of any user-familiar object or toy, including a cute, cuddly, and appealing stuffed/plush toy, a doll, or an action figure. In various embodiments, the computing device associated with the animatronic device can be located internally (inside the animatronic device) or externally (outside of it). In one embodiment, the computing device provides processing power (local or external) and also makes use of voice recognition and synthesis. In one embodiment, an external microphone and speaker(s) can form the audio interface. In another embodiment, the computing device's speakers/microphone are used.
In one embodiment, verbal communication is the primary user interface for the user-familiar animatronic device. The animatronic device can also provide text-to-speech synthesis and facilitate use of several relevant software applications (e.g., ‘mobile apps’, etc.). The animatronic device can access web resources on the internet to provide requested information and also to substitute for or augment the limited local processing power of the computing device (e.g., Smartphone/tablet).
In one embodiment, the widespread availability of fast wireless broadband access and powerful cloud computing resources can be utilized to accomplish tasks that would be beyond the capability of any local computing (e.g., an onboard microprocessor). This enhanced processing capability can allow several processing-intensive tasks, including voice and facial recognition.
In an embodiment, the animatronic device can be a personal digital assistant that can take and play messages, set up alarms and reminders, and greet and interact with people through its voice and face recognition capabilities. In one embodiment, the user interacts with the animatronic device through voice commands and queries. It can enable verbal web queries in place of typing the same on a laptop or a phone. In another embodiment, the animatronic device can have speech synthesis and text-to-speech capabilities for providing query results, holding a verbal dialog, or otherwise interacting with people. In yet another embodiment, the animatronic device can play internet radio on command, tell stories, play quizzes, play music, relay weather forecasts, find out which movies are playing where (using its Global Positioning System (GPS) feature for location), or suggest a restaurant. In yet another embodiment, the animatronic device can place voice calls using Voice over Internet Protocol (VoIP) functionality. In another embodiment, the animatronic device disclosed herein can be used as a baby monitor.
In one embodiment, the animatronic device disclosed herein has electronic sensors such as touch, ultrasound proximity, infrared proximity/motion, or smell (olfaction) sensors. Such sensors can be located on the animatronic device itself or be available in the computing device (e.g., Smartphone/tablet). When the sensors are located on the animatronic device, the built-in sensors of the computing device (e.g., Smartphone/tablet sensors) can supplement the existing sensors of the toy. Thus, sensors that are already located on a Smartphone/tablet, such as accelerometers, gyroscopes, and orientation, pressure, temperature, sound, and light sensors, can be used to supplement the sensors of the animatronic device. In one embodiment, sensor interfaces can be analog or digital (e.g., an I2C bus). In one embodiment, any required mechanical motion is provided by electro-mechanical devices/actuators such as servo(s) or magnetic actuators.
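As one hedged example of a digital (I2C) sensor interface, the following sketch polls a proximity sensor over an I2C bus using the third-party `smbus2` Python package. The bus number, device address, and register are made-up values; actual values depend on the parts chosen during construction.

```python
# Sketch of polling a digital (I2C) proximity sensor with the third-party
# smbus2 package (pip install smbus2). The bus number, device address, and
# register below are made-up example values.
from smbus2 import SMBus

I2C_BUS = 1
PROXIMITY_ADDR = 0x39  # hypothetical I2C address of the proximity sensor
PROXIMITY_REG = 0x00   # hypothetical data register

def read_proximity() -> int:
    """Return one raw proximity reading from the I2C bus."""
    with SMBus(I2C_BUS) as bus:
        return bus.read_byte_data(PROXIMITY_ADDR, PROXIMITY_REG)
```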
In one embodiment, an electronic device access port hatch or compartment 104 can be configured at the front of animatronic device 100. Compartment 104 is configured to have an outside cover or door 204 and a cavity section 202. As illustrated, electronic device compartment 104 is located on body 102 and is shown with compartment door 204 opened. Cavity section 202, in one embodiment, can accommodate an electronic device (e.g., Smartphone, tablet, small computer, etc.). For example, door 204 can be opened for Smartphone insertion and removal, or just to access the Smartphone for setup or rebooting. In one embodiment, compartment door 204 is coupled to cavity section 202 using hinges 212, as illustrated. In one embodiment, to close or lock compartment 104, compartment door 204 can be secured to cavity section 202 with latch(es) or screw(s) (not shown). In one embodiment, such latch(es) or screw(s) can be configured to be placed on the side opposite hinges 212.
In one embodiment, door 204 comprises adjustable side guiding rails 206 that can be loosened so that an electronic device can be inserted between guiding rails 206. In one embodiment, guide rails are provided on the side and bottom portions of door 204, as illustrated. In one embodiment, the electronic device (e.g., Smartphone) is oriented such that a camera of the electronic device faces the user, with the camera lens pointing toward the user via camera port 106. Thus, in the case of a Smartphone having a built-in camera lens at the rear or backside of the device, the Smartphone screen will face inwards towards cavity 202. Guiding rails 206 can be adjusted to fit snugly against the Smartphone and optionally secured with screws (not shown) to hold the Smartphone in place. Compartment 104 can also have an open or transparent camera port 106 to give the Smartphone rear camera an unobstructed outside view. In yet another embodiment, camera port 106 is configured to give an unobstructed outside view to an external camera (that is, not the Smartphone camera).
In one embodiment, the electronic device communicates with the electronic components of animatronic device 100 using wires. In such an embodiment, clips 208 can be introduced to support such wires. Thus, clips 208 can be used to create a clutter-free environment. Wiring panel 210 can also be introduced, in one embodiment, where the wiring panel acts as an interface that connects/routes/guides the wires between the electronic device and the electronic components/systems of animatronic device 100.
As illustrated, mechanical frame 402 can extend vertically from the body portion to the head portion of the animatronic device. Mechanical frame 402 can extend horizontally (as illustrated) to provide actuators for the arms. Further, mechanical frame 402 can have horizontal structures to provide support to battery compartment 410. In one embodiment, the battery compartment has a built-in non-removable battery pack. In another embodiment, the battery pack can be rechargeable. In yet another embodiment, the battery pack is removable. Battery compartment 410 can have two horizontal structures to secure battery compartment 410 to mechanical frame 402.
Camera 514 can be a standalone camera of the animatronic device, or can be the built-in camera of the electronic device (e.g., rear camera of a Smartphone or tablet, etc.). In one embodiment, camera 514 can use camera port 106 of compartment 104. In one embodiment, touch sensor(s) 516 can be placed at a back body portion (back side of the body portion), at a front body portion (front side of the body portion), or a combination thereof, of the animatronic device. It should be noted that the sensor placements discussed herein are for illustrative purposes only, and a person of ordinary skill in the art would appreciate that the sensors can be placed at any appropriate location of the animatronic device, as found necessary during construction of the animatronic device.
In one embodiment, sensors 602 can be located on the animatronic device. Sensors 602 are interfaced to Smartphone 302 either through a separate interface board or as part of a magnetic actuator controller or servo controller 608, as illustrated. Sensors 602 can be connected to servo controller 608 using interface 604, which, depending on the setup, can be an analog interface or an I2C bus.
In one embodiment, servo controller 608 acts as a slave processor to the Smartphone 302 processor. In another embodiment, servo controller 608 provides analog interfacing for the onboard external (not on Smartphone 302) sensors as well as digital outputs for any required tasks (such as eye blinking). In one embodiment, the external sensors 602 used with servo controller 608 have an analog interface only, and no digital-interface sensors are used.
A person of ordinary skill would appreciate that there can be one or more servos for the desired mechanical movements, depending on the size, form, and model of the animatronic device. In one embodiment, an optional level translator 606 can be used to transmit the signal generated by sensors 602. Signal voltage level translator 606 can be needed if there is a signal-level mismatch between sensors 602 and servo controller system 608. The level translator interacts with interface 604 to receive the sensor signal and transmits the signal to servo controller 608. In one embodiment, servo controller 608 provides the necessary instructions to servos 614 using servo control interface 612, which transmits the instructions from servo controller 608 to servos 614.
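By way of example, the final leg of this chain, from an abstract position command to a servo pulse, might look like the following sketch. The 500-2500 microsecond pulse range is a common hobby-servo convention assumed here for illustration, and the `set_pulse` method is a hypothetical stand-in for whatever servo control interface 612 exposes.

```python
# Sketch of the controller-to-servo path: an abstract position command is
# converted into a pulse width sent over the servo control interface. The
# 500-2500 microsecond range is a common hobby-servo convention, assumed here.

def position_to_pulse_us(position: float) -> int:
    """Map a normalized position in [-1.0, 1.0] to a pulse width in microseconds."""
    position = max(-1.0, min(1.0, position))
    center_us, half_range_us = 1500, 1000  # yields 500..2500 microseconds
    return int(center_us + position * half_range_us)

def drive_servo(interface, channel: int, position: float) -> None:
    # 'interface' stands in for servo control interface 612; its set_pulse
    # method is a hypothetical placeholder.
    interface.set_pulse(channel, position_to_pulse_us(position))
```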
In one embodiment, the processing system of Smartphone 302 handles all the system management tasks, as needed. It interprets the command and can take into account any sensory data from the sensors for decision making. The processing system of Smartphone 302 can also determine whether it can act on its own or whether the particular instance requires assistance from the web, or more specifically from web-based cloud computing servers. If such ‘cloud assist’ functionality is utilized, the processing unit gets the required data and suggested response from the servers. In either case, the local processing unit of Smartphone 302 ultimately decides on the audio and mechanical responses. A response may be verbal or some other audio (e.g., sound), or selected magnetic actuator/servomotor instigation to execute the required movements, or a combination thereof.
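The local-versus-'cloud assist' decision described above can be illustrated with a minimal sketch. The command whitelist, the confidence threshold, and the `cloud.assist()` call are assumptions made for the example, not features required by any embodiment.

```python
# Minimal sketch of the 'cloud assist' decision. The command whitelist, the
# confidence threshold, and the cloud.assist() call are illustrative
# assumptions, not requirements of any embodiment.

LOCAL_COMMANDS = {"wave", "blink", "play_alarm"}  # safely handled on-device

def decide(command: str, confidence: float, cloud) -> dict:
    if command in LOCAL_COMMANDS and confidence >= 0.8:
        return {"source": "local", "action": command}
    # Otherwise defer to the web-based servers for data and a suggested response.
    return {"source": "cloud", "action": cloud.assist(command)}
```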
In one embodiment, servo controller system 608 can synchronize servo 614 movements with the audio generated by Smartphone 302. For example, for lip synchronization of servomotor 614 placed near the movable appendage, servo controller system 608 can transmit control signals to the respective servo control interface 612 in such a way that the audio transmitted from Smartphone 302 is synchronized with the movement of moveable appendage 405. Similarly, the audio response can be synchronized with the movements of any servo actuator (e.g., those located on the neck, arms, eyes, etc.).
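One simple, illustrative way to derive lip-sync control signals is to map the short-term amplitude of the outgoing audio to a jaw opening, so the mouth opens on loud frames. The sketch below, which assumes 16-bit PCM audio whose length is a multiple of the sample width, is an example technique and not the claimed synchronization method.

```python
# Sketch: derive one normalized jaw opening per 20 ms audio frame from the
# RMS amplitude of 16-bit PCM audio. Assumes len(pcm) is a multiple of the
# sample width. Note: the stdlib audioop module was removed in Python 3.13.
import audioop

FRAME_MS = 20

def jaw_positions(pcm: bytes, sample_rate: int, width: int = 2):
    """Yield a jaw opening in [0.0, 1.0] for each 20 ms frame of audio."""
    frame_bytes = int(sample_rate * FRAME_MS / 1000) * width
    frames = [pcm[i:i + frame_bytes] for i in range(0, len(pcm), frame_bytes)]
    peak = max((audioop.rms(f, width) for f in frames), default=1) or 1
    for f in frames:
        yield audioop.rms(f, width) / peak  # louder frame -> wider mouth
```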
In one embodiment, all decision making and response generation of the animatronic device is implemented in software (not hardwired), is entirely under software control, and can easily be deleted, added, or modified as needed. In one embodiment, power supply/battery 610 can be in the form of single-use batteries or rechargeable batteries. In case rechargeable batteries are used, an external AC/DC converter (wall unit) (not shown) can be used along with voltage regulators and a battery charge monitoring circuit. In one embodiment, a system DC (Direct Current) bus delivers appropriate power to the various devices. In any embodiment of the animatronic device discussed herein, an additional (optional) compartment to access the battery compartment can be configured, similar to compartment 104.
In one embodiment, external computing system 706 can be a Smartphone, tablet, personal computer, or laptop with a corresponding wireless protocol (e.g., the Bluetooth protocol) that is able to control the animatronic device using supporting mobile applications or software, as well as a supported speech recognition software package.
In one embodiment, sensor placement is similar to the embodiment with the electronic device located inside the animatronic device, except for an external camera, which would be required in this embodiment. In another embodiment, the camera can be connected to onboard microcontroller 702. In another embodiment, the camera is directly interfaced with external computing system 706.
In yet another embodiment, external computing system 706 can be located on a remote server (e.g., on the internet). In one embodiment, external computing system 706 can be a dedicated web-based server. In such an embodiment, onboard microcontroller system 702 is connected to a selected cloud computing unit via a Wi-Fi (IEEE 802.11) wireless link, and the animatronic device is managed from the external cloud computing unit. In this embodiment, the microcontroller acts as a slave processor to the external (cloud) computing unit 706. The selected cloud computing unit 706 controls the animatronic device using software applications or mobile applications. In one embodiment, the software applications controlled by the external (cloud) computing system 706 include face recognition as well as speech recognition and speech-to-text/text-to-speech assistance. Other aspects of the embodiment using such cloud computing remain similar to those discussed in the various other embodiments herein. Further, any embodiment of the animatronic device can use a combination of any technology/techniques disclosed herein.
As illustrated, in one embodiment, an external audio board with an amplifier 620 and a CODEC (encoder and decoder) supports the device's audio communication. The encoded data (audio) can stream through microcontroller 802, over the wireless link, to external processing system 706 (e.g., Smartphone, cloud computing unit, etc.).
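For illustration, the encoded audio can be broken into length-prefixed chunks before streaming over the wireless link; the 2-byte big-endian length prefix below is an assumed framing convention, not part of the disclosure.

```python
# Assumed framing for streaming CODEC output over the wireless link: each
# chunk is prefixed with a 2-byte big-endian length.
import struct

def frame_audio(encoded: bytes, chunk_size: int = 256):
    """Split encoded audio into length-prefixed chunks for streaming."""
    for i in range(0, len(encoded), chunk_size):
        payload = encoded[i:i + chunk_size]
        yield struct.pack(">H", len(payload)) + payload
```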
In one embodiment, the external processing system, aided by the microcontroller, handles all the system management tasks. It interprets the command and also takes into consideration any sensory data from the sensors for decision making. The external processing system decides whether it can act on its own or whether the particular instance requires assistance from the web, or more specifically from web-based cloud computing servers. If such a ‘cloud assist’ feature is utilized, the external processing system gets the required data and suggested response from the servers. In either case, the external processing system ultimately decides on the audio and mechanical responses that are transmitted to the built-in microcontroller system 802. In one embodiment, commands for the mechanical movements from the external processing system are transmitted to the microcontroller system. In another embodiment, the external processing system only transmits the audio responses, and the mechanical movements of the animatronic device are controlled by the microcontroller system. In one embodiment, responses can be verbal, some other sort of audio (e.g., sound), selected magnetic actuator/servomotor instigation to execute the required movements, or a combination thereof.
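A hypothetical wire format for such responses is sketched below; the JSON field names and the movement-tuple layout are invented for the example.

```python
# Hypothetical response message from the external processing system to the
# built-in microcontroller system 802. Field names are invented for the example.
import json

def make_response(audio_id=None, movements=()):
    return json.dumps({
        "audio": audio_id,              # e.g., an audio clip identifier, or None
        "movements": [                  # actuator cues, each timed in milliseconds
            {"actuator": name, "position": pos, "at_ms": t}
            for (name, pos, t) in movements
        ],
    }).encode("utf-8")

# Example: speak clip 7 while nodding the head at t = 0 and t = 400 ms.
message = make_response(7, [("neck", 0.3, 0), ("neck", -0.3, 400)])
```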
The techniques shown in the figures can be implemented using computer program instructions (computer code) and data stored and executed on one or more electronic systems (e.g., computer systems, etc.). Such electronic systems store and communicate (internally and/or with other electronic systems over a network) code and data using machine-readable media, such as machine-readable non-transitory storage media (e.g., magnetic disks; optical disks; random access memory; dynamic random access memory; read only memory; flash memory devices; phase-change memory). In addition, such electronic systems typically include a set of one or more processors coupled to one or more other components, such as one or more storage devices, user input/output devices (e.g., a keyboard, a touchscreen, and/or a display), and network connections. The coupling of the set of processors and other components is typically through one or more busses and bridges (also termed as bus controllers). The storage device and signals carrying the network traffic respectively represent one or more machine-readable storage media and machine-readable communication media. Thus, the storage device of a given electronic device typically stores code and/or data for execution on the set of one or more processors of that electronic device.
It should be apparent from this description that aspects of the present invention may be embodied, at least in part, in software. That is, the techniques may be carried out in a computer system or other data processing system in response to its processor, such as a microprocessor, executing sequences of instructions contained in memory, such as a ROM, DRAM, mass storage, or a remote storage device. In various embodiments, hardware circuitry may be used in combination with software instructions to implement the present invention. Thus, the techniques are not limited to any specific combination of hardware circuitry and software, nor to any particular source for the instructions executed by the computer system. In addition, throughout this description, various functions and operations are described as being performed by or caused by software code to simplify description. However, those skilled in the art will recognize that what is meant by such expressions is that the functions result from execution of the code by a processor.
For example, computing system 900 may represent any of the data processing systems described above performing any of the processes or methods described above. System 900 can include many different components. These components can be implemented as integrated circuits (ICs), portions thereof, discrete electronic devices, or other modules adapted to a circuit board such as a motherboard or add-in card of the computer system, or as components otherwise incorporated within a chassis of the computer system. Note also that system 900 is intended to show a high level view of many components of the computer system. However, it is to be understood that additional or fewer components may be present in certain implementations and, furthermore, a different arrangement of the components shown may occur in other implementations. System 900 may represent a desktop, a laptop, a tablet, a server, a mobile phone, a programmable logic controller, a personal digital assistant (PDA), a personal communicator, a network router or hub, a wireless access point (AP) or repeater, a set-top box, or a combination thereof.
In one embodiment, system 900 includes processor 901, memory 903, and devices 905-908 coupled via a bus or an interconnect 922. Processor 901 may represent a single processor or multiple processors with a single processor core or multiple processor cores included therein. Processor 901 may represent one or more general-purpose processors such as a microprocessor, a central processing unit (CPU), or the like. More particularly, processor 901 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or processor implementing other instruction sets, or processors implementing a combination of instruction sets. Processor 901 may also be one or more special-purpose processors such as an application specific integrated circuit (ASIC), a cellular or baseband processor, a field programmable gate array (FPGA), a digital signal processor (DSP), a network processor, a graphics processor, a communications processor, a cryptographic processor, a co-processor, an embedded processor, or any other type of logic capable of processing instructions.
Processor 901, which may be a low power multi-core processor socket such as an ultra low voltage processor, may act as a main processing unit and central hub for communication with the various components of the system. Such processor can be implemented as a system on chip (SoC). In one embodiment, processor 901 may be an Intel® Architecture Core™-based processor such as an i3, i5, i7 or another such processor available from Intel Corporation, Santa Clara, Calif. However, other low power processors such as available from Advanced Micro Devices, Inc. (AMD) of Sunnyvale, Calif., an ARM-based design from ARM Holdings, Ltd. or a MIPS-based design from MIPS Technologies, Inc. of Sunnyvale, Calif., or their licensees or adopters may instead be present in other embodiments.
Processor 901 is configured to execute instructions for performing the operations and methods discussed herein. System 900 further includes a graphics interface that communicates with graphics subsystem 904, which may include a display controller and/or a display device.
Processor 901 may communicate with memory 903, which in an embodiment can be implemented via multiple memory devices to provide for a given amount of system memory. As examples, the memory can be in accordance with a Joint Electron Devices Engineering Council (JEDEC) low power double data rate (LPDDR)-based design such as the current LPDDR2 standard according to JEDEC JESD 209-2E, or a next generation LPDDR standard to be referred to as LPDDR3 that will offer extensions to LPDDR2 to increase bandwidth. As examples, 2/4/8 gigabytes (GB) of system memory may be present and can be coupled to processor 901 via one or more memory interconnects. In various implementations, the individual memory devices can be of different package types such as single die package (SDP), dual die package (DDP) or quad die package (QDP). These devices can in some embodiments be directly soldered onto a motherboard to provide a lower profile solution, while in other embodiments the devices can be configured as one or more memory modules that in turn can couple to the motherboard by a given connector.
Memory 903 can be a machine readable non-transitory storage medium such as one or more volatile storage (or memory) devices such as random access memory (RAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), static RAM (SRAM), or other types of storage devices such as hard drives and flash memory. Memory 903 may store information including sequences of executable program instructions that are executed by processor 901, or any other device. For example, executable code and/or data of a variety of operating systems, device drivers, firmware (e.g., basic input output system or BIOS), and/or applications can be loaded in memory 903 and executed by processor 901. An operating system can be any kind of operating system, such as, for example, the Windows® operating system from Microsoft®, Mac OS®/iOS® from Apple, Android® from Google®, Linux®, Unix®, or other real-time or embedded operating systems such as VxWorks.
System 900 may further include IO devices such as devices 905-908, including wireless transceiver(s) 905, input device(s) 906, audio IO device(s) 907, and other IO devices 908. Wireless transceiver 905 may be a WiFi transceiver, an infrared transceiver, a Bluetooth transceiver, a WiMax transceiver, a wireless cellular telephony transceiver, a satellite transceiver (e.g., a global positioning system (GPS) transceiver), or other radio frequency (RF) transceivers, network interfaces (e.g., Ethernet interfaces) or a combination thereof.
Input device(s) 906 may include a mouse, a touch pad, a touch sensitive screen (which may be integrated with display device 904), a pointer device such as a stylus, and/or a keyboard (e.g., physical keyboard or a virtual keyboard displayed as part of a touch sensitive screen). For example, input device 906 may include a touch screen controller coupled to a touch screen. The touch screen and touch screen controller can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen.
Audio IO device 907 may include a speaker and/or a microphone and/or a codec to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and/or telephony functions. In hardware, in one embodiment, an audio codec can refer to a device(s) that encodes analog audio as digital signals and decodes digital signals back into analog. In one embodiment, audio IO device 907 comprises an analog-to-digital converter (ADC), a digital-to-analog converter (DAC), or a combination thereof, running off the same clock. Other optional devices 908 may include a storage device (e.g., a hard drive, a flash memory device), universal serial bus (USB) port(s), parallel port(s), serial port(s), diagnostic port(s), a printer, a network interface, a bus bridge (e.g., a PCI-PCI bridge), sensor(s) (e.g., a motion sensor such as an accelerometer, a gyroscope, a magnetometer, a light sensor, a compass, a proximity sensor, etc.), or a combination thereof. Optional devices 908 may further include an imaging processing subsystem (e.g., a camera), which may include an optical sensor, such as a charged coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, utilized to facilitate camera functions, such as recording photographs and video clips. Certain sensors may be coupled to interconnect 922 via a sensor hub (not shown), while other devices such as a keyboard or thermal sensor may be controlled by an embedded controller (not shown), dependent upon the specific configuration or design of system 900.
To provide for persistent storage of information such as data, applications, one or more operating systems and so forth, a mass storage (not shown) may also couple to processor 901. In various embodiments, to enable a thinner and lighter system design as well as to improve system responsiveness, this mass storage may be implemented via a solid state device (SSD). However, in other embodiments, the mass storage may primarily be implemented using a hard disk drive (HDD) with a smaller amount of SSD storage to act as an SSD cache to enable non-volatile storage of context state and other such information during power down events so that a fast power up can occur on re-initiation of system activities. Also, a flash device may be coupled to processor 901, e.g., via a serial peripheral interface (SPI). This flash device may provide for non-volatile storage of system software, including a basic input/output software (BIOS) as well as other firmware of the system.
Note that while system 900 is illustrated with various components of a data processing system, it is not intended to represent any particular architecture or manner of interconnecting the components, as such details are not germane to embodiments of the present invention. It will also be appreciated that network computers, handheld computers, mobile phones, and other data processing systems which have fewer components or perhaps more components may also be used with embodiments of the invention.
Thus, methods, devices, and computer readable medium to control and interact with an electronic (animatronic) toy are disclosed. Although the present invention has been described with reference to specific exemplary embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the invention as set forth in the claims. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
The present application claims priority from U.S. Provisional Patent Application No. 62/088,669, filed on Dec. 7, 2014, and titled “CLOUD COMPUTING EMPOWERED STUFFED/PLUSH TOY,” and U.S. Provisional Patent Application No. 62/089,841, filed on Dec. 10, 2014, and titled “SMARTPHONE OR ELECTRONIC TABLET BASED STUFFED PLUSH TOY,” the contents of which are incorporated herein by reference.