Not Applicable
Not Applicable
The present invention relates to toys in general, and in particular to an interactive robotic toy.
Toys that cling to fingers and finger puppets are known in the art. Such toys provide entertainment to children and adults. U.S. Pat. No. 7,029,361 to Seibert et al., entitled “Finger puppets with sounds,” is directed to a toy held on or by a finger, which includes a body having a top end and a bottom end, and means for mounting the toy on a finger coupled to the body. The toy also includes a computer chip and a speaker, within the body, for generating sound. The toy further includes a switch electrically connected to the computer chip and a finger tapper movably mounted to the bottom end of the body. When the finger tapper is depressed, the switch is actuated to cause the generation of the sound.
However, there is no interactive robotic toy which can cling to a finger of a person and which exhibits a plurality of physical animations in response to user actions, such as kissing, cradling, hanging upside down, petting and the like. Nor is there an interactive robotic toy wherein the physical animation is a combination of sound and motion and may include head motion, eyes blinking or sound animations (e.g., sound of laughing, sound of sneezing, sound of a kiss and the like).
It is, therefore, a prime object of the present invention to provide an interactive robotic toy which can cling to a finger of a person.
It is another object of the present invention to provide a robotic toy which exhibits a plurality of physical animations in response to user actions.
It is another object of the present invention to provide a robotic toy which exhibits a plurality of physical animations in response to user actions including kissing, cradling, hanging upside down, petting and the like.
It is another object of the present invention to provide a robotic toy which exhibits a plurality of physical animations, each being a combination of sound and motion, which may include head motion, eyes blinking or sound animations (e.g., sound of laughing, sound of sneezing, sound of a kiss and the like).
In accordance with the present invention, an interactive robotic toy is provided. The toy includes a body section including flexible limbs and a head section, rotatably coupled with the body section. The head section includes eyes and respective eye lids, operable to cover and uncover the eyes. The toy also includes a speaker, operative to produce sounds, and a motor, operative to rotate the head section relative to the body section. The toy also includes an eyes blink actuator, operable to move the eye lids such that the eye lids cover and uncover the eyes.
The toy further includes at least one touch sensor for detecting touch and at least one sound sensor for detecting sound in the vicinity of the toy.
At least one rotational motion sensor is provided for detecting rotational motion of the toy about selected axes. A memory is provided for storing a plurality of physical animations.
Further, a processor, coupled with the speaker, the motor, the eyes blink actuator, the touch sensor, the sound sensor, the rotational motion sensor and the memory, is provided for selecting at least one physical animation corresponding to signals received from the touch sensor, the sound sensor and the rotational motion sensor. The processor produces signals corresponding to the selected physical animation for the speaker, the motor and/or the eyes blink actuator.
To these and to such other objects that may hereinafter appear, the present invention relates to an interactive robotic toy as described in detail in the following specification and recited in the annexed claims, taken together with the accompanying drawings, in which like numerals refer to like parts and in which:
The disclosed technique overcomes the disadvantages of the prior art by providing an interactive robotic toy which can cling to a finger of a person. The robotic toy exhibits a plurality of physical animations in response to user actions. Such actions may include kissing, cradling, hanging upside down, petting and the like. The physical animation is a combination of sound and motion and may include head motion, eyes blinking or sound animations (e.g., sound of laughing, sound of sneezing, sound of a kiss and the like).
Reference is now made to
Interactive robotic toy 100 includes a body section 102 and a head section 104 rotatably coupled with body section 102. Interactive robotic toy 100 further includes flexible left and right arms 106L and 106R respectively, flexible left and right legs 108L and 108R respectively and a flexible tail 110. Flexible left and right arms 106L and 106R, flexible left and right legs 108L and 108R and flexible tail 110 are all coupled with body section 102. Body section 102 further includes a battery cavity in which batteries are located, covered by a battery cover 120.
Head section 104 includes left and right eyes 112L and 112R, a mouth opening 114, an on-off switch 116 and loudspeaker holes such as hole 118. Left and right eyes 112L and 112R may be embodied as spheres rotating about an axis perpendicular to axis 115. Half of each sphere is of a color similar to the body color of interactive robotic toy 100 (i.e., emulating an eye lid). This half is referred to herein as the “lids side”. The other half of each sphere is of a dark color (e.g., black), thus emulating the eye, and is referred to as the “eyes side”.
When left and right eyes 112L and 112R are rotated such that the eyes side thereof are facing the user, left and right eyes 112L and 112R appear to be open. When left and right eyes 112L and 112R are rotated such that the lids side thereof are facing the user, left and right eyes 112L and 112R appear to be closed. Alternatively, left and right eyes 112L and 112R include respective left and right eye lids 113L and 113R operable to cover or uncover the respective left and right eyes 112L and 112R (i.e., close or open left and right eyes 112L and 112R).
Interactive robotic toy 100 may cling to a finger of a user via the flexible limbs thereof (i.e., left and right arms 106L and 106R, left and right legs 108L and 108R or tail 110). A cross sectional view of interactive robotic toy 100 is depicted in
As mentioned above, interactive robotic toy 100 exhibits a plurality of physical animations in response to various actions by the user. For example, when interactive robotic toy 100 is turned on, interactive robotic toy 100 may sound a laugh and blink. When hung upside down by its tail, interactive robotic toy 100 may produce sounds associated with excitement. When cradled, interactive robotic toy 100 may produce sounds associated with contentment and close eyes 112L and 112R.
With reference to
Reference is now made to
Orientation sensor 208 is, for example, at least one ball switch, a gyroscope or an accelerometer, detecting information relating to the orientation of interactive robotic toy 200 about selected axes. Memory 216 stores a plurality of physical animations for interactive robotic toy 200. A physical animation is defined as a combination of a sound animation and a motion animation. A motion animation is, for example, the motion of the head and the blinking of the eyes of the interactive robotic toy.
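By way of illustration only, a stored physical animation as defined above might be represented as a minimal record pairing a sound animation with a motion animation; the field names and types below are assumptions for this sketch and are not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PhysicalAnimation:
    # Illustrative fields only; the disclosure does not fix a storage format.
    sound: Optional[str]        # identifier of a stored sound clip, e.g. "laugh"
    head_motion: Optional[str]  # motion animation, e.g. "side_to_side"
    blink: bool = False         # whether the eyes close/blink during the animation

# e.g. the animation played when the toy is cradled
cradled = PhysicalAnimation(sound="snore", head_motion=None, blink=True)
print(cradled.sound)  # snore
```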
Touch sensor 204 detects touch, for example, on the head section of the interactive robotic toy, produces a signal indicating that the head section was touched and provides that signal to processor 202. Orientation sensor 208 detects information relating to the orientation of interactive robotic toy 200 and produces a signal or signals respective of this information. Sound sensor 206 detects sound in the vicinity of the interactive toy, produces a signal indicative of that sound and provides this signal to processor 202. As mentioned above, the interactive robotic toy may include two or more sound sensors, which define an array of microphones.
Processor 202 receives the signals produced by touch sensor 204, sound sensor 206 and orientation sensor 208. Processor 202 determines when interactive robotic toy 200 was touched according to the signal received from touch sensor 204. Processor 202 determines when a sound was made in the vicinity of interactive robotic toy 200 and the nature of this sound (e.g., the detected sound is a sound of a kiss). For example, processor 202 compares the time signature or the frequency signature (e.g., a Fourier transform of the time signal) of the detected sound, or both, to stored signatures. When an array of microphones is employed, processor 202 may further determine the direction from which the sound arrived at interactive robotic toy 200, for example, by employing interferometry techniques or correlation based techniques (e.g., Multiple Signal Classification—MUSIC).
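The frequency-signature comparison mentioned above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the band count, the normalized-energy signature and the example "kiss"/"whistle" tones are all assumptions made for the sketch.

```python
import numpy as np

def frequency_signature(signal, n_bins=8):
    """Coarse frequency signature: normalized energy in n_bins bands of the FFT magnitude."""
    spectrum = np.abs(np.fft.rfft(signal))
    sig = np.array([band.sum() for band in np.array_split(spectrum, n_bins)])
    total = sig.sum()
    return sig / total if total > 0 else sig

def classify_sound(signal, stored_signatures):
    """Return the label of the stored signature closest to the signal's signature."""
    sig = frequency_signature(signal)
    return min(stored_signatures,
               key=lambda label: np.linalg.norm(sig - stored_signatures[label]))

# Hypothetical stored signatures: a low-frequency "kiss" and a high-frequency "whistle".
t = np.linspace(0, 1, 1000, endpoint=False)
stored = {
    "kiss": frequency_signature(np.sin(2 * np.pi * 30 * t)),
    "whistle": frequency_signature(np.sin(2 * np.pi * 400 * t)),
}
print(classify_sound(np.sin(2 * np.pi * 35 * t), stored))  # kiss
```

A detected tone near 30 Hz lands in the same frequency band as the stored "kiss" signature and is classified accordingly.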
Processor 202 selects a physical animation or animations associated with the received signals and the information derived therefrom (e.g., the nature of the received sound, the direction of arrival of the received sound or the orientation of interactive robotic toy 200). Once processor 202 selects the physical animation or animations, processor 202 produces corresponding signals to eyes blink actuator 210, motor 212 and speaker 214 to produce the selected animation.
For example, when the interactive robotic toy is held upright and touched on the head, motor 212 moves the head from side to side and speaker 214 produces a laughing sound. As a further example, when the interactive robotic toy is held horizontally (e.g., cradled), eyes blink actuator 210 rotates the eyes or the eye lids such that the eyes of the interactive robotic toy appear closed and speaker 214 produces a snoring sound. As another example, when the interactive toy is held upside down, eyes blink actuator 210 rotates the eyes or the eye lids such that the eyes of the interactive robotic toy appear open and speaker 214 produces a sound associated with excitement (e.g., a “Yehh” cry). As yet another example, when a user kisses the interactive robotic toy (i.e., sound sensor 206 detects the sound of a kiss), speaker 214 produces the sound of a kiss. Furthermore, when an array of microphones is employed and the direction of arrival of the sound is determined, motor 212 rotates the head of the interactive robotic toy to turn toward the direction from which the sound arrived.
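One straightforward, hypothetical way to encode the state-to-animation examples above is a lookup table keyed by the detected orientation and event; the disclosure describes the behavior, not this particular data structure, and all names below are illustrative:

```python
# Hypothetical mapping from (orientation, event) to a (sound, motion) pair.
ANIMATIONS = {
    ("upright", "head_touch"): ("laugh", "head_side_to_side"),
    ("horizontal", None): ("snore", "close_eyes"),
    ("upside_down", None): ("excited_cry", "open_eyes"),
    ("upright", "kiss"): ("kiss", None),
}

def select_animation(orientation, event=None):
    """Return the (sound, motion) pair for the current sensor state."""
    return ANIMATIONS.get((orientation, event), (None, None))

print(select_animation("upright", "head_touch"))  # ('laugh', 'head_side_to_side')
```

An unrecognized state falls back to `(None, None)`, i.e., no animation is played.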
While only a single preferred embodiment of the present invention has been disclosed for purposes of illustration, it is obvious that many modifications and variations could be made thereto. It is intended to cover all of those modifications and variations which fall within the scope of the present invention, as defined by the following claim:
Priority is claimed on U.S. Provisional Patent Application No. 62/503,363, filed on May 9, 2017, the entire contents of which are incorporated herein by reference.