The present invention relates to an interactive robotic apparatus and, more particularly, to a personal interactive robotic apparatus, which detects user interactions and performs responsive motion animations.
Various interactive robots are well known. Personal robots that display predetermined movements are also known. Conventional personal robots typically move in predictable ways and do not actively interact with the user or exhibit a personality. This limits their use and utility.
The present invention provides an interactive robotic apparatus that interacts with a user, especially an elderly individual, to provide companionship and comfort. The apparatus receives inputs from the user and responds interactively. The interactive robotic apparatus includes a microphone and a phototransistor to detect sounds and movement. The interactive robotic apparatus also includes a speaker to generate sounds responsive to the interaction with the user, and exhibits a breathing animation and a simulated heartbeat.
As required, detailed embodiments of the present invention are disclosed herein. However, it is to be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for the claims and/or as a representative basis for teaching one skilled in the art to variously employ the present invention.
Moreover, except where otherwise expressly indicated, all numerical quantities in this description and in the claims are to be understood as modified by the word “about” in describing the broader scope of this invention. Practice within the numerical limits stated is generally preferred. Also, unless expressly stated to the contrary, the description of a group or class of materials as suitable or preferred for a given purpose in connection with the invention implies that mixtures or combinations of any two or more members of the group or class may be equally suitable or preferred.
Referring to the figures, an interactive robotic apparatus of the present invention is generally indicated by reference numeral 20. The interactive robotic apparatus 20 generally includes a head assembly 22, a body assembly 24, left 26 and right 28 front legs, left 27 and right 29 back legs, and a plush covering 30 such as fur.
The head assembly 22 includes a face plate 32 with eye sockets 34 and 36, a nose 38, and a mouth 40. The eye sockets 34 and 36 receive eyes 42 and 44, respectively, which are covered by lenses 46 and 48, respectively, and held in place with retaining rings 50 and 52, respectively. The eyes 42 and 44 include eyelids 54 and 56, respectively. A microphone 55 is mounted to the face 32 to pick up sounds and voice signals so that the apparatus can respond interactively. A phototransistor 57 is also mounted to the nose 38 to detect movement.
An eye actuating mechanism 58 includes left 60 and right 62 eyelid actuators, each mounted to an eye carriage 64 and 66, respectively. Each eyelid actuator 60 and 62 includes a rubber cylinder 68 and 70, which impinges upon the back of the eyelids 54 and 56, to actuate the eyelids. As the eyelid actuators rotate in one direction or the other, the rubber cylinders 68 and 70 cause the eyelids 54 and 56 to rotate about an axis of rotation of the eyes 42 and 44.
The eye actuating mechanism 58 also includes an eye actuator 72, which drives an eye movement gear 74 coupled to the left eye carriage 64. The left eye carriage 64 is pivotably coupled to the right eye carriage 66 via arcuate gears 76 and 78, respectively. Rotation of the eye actuator 72 in a first direction and then in the opposite direction causes the eyes 42 and 44 to move back and forth. The eye actuating mechanism 58 and the face 32 are fastened to a face plate 80.
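By way of illustration only, the following minimal firmware sketch shows one way the blink and back-and-forth eye motions described above might be commanded. The servo identifiers and the set_servo_angle() helper are hypothetical stand-ins; the disclosure does not specify any firmware.

/* Hypothetical sketch of the blink and eye-scan motions. The servo
 * names and set_servo_angle() helper are illustrative assumptions. */
#include <stdio.h>

enum servo { EYELID_LEFT, EYELID_RIGHT, EYE_CARRIAGE };

/* Stub: a real build would drive a PWM channel for the named servo. */
static void set_servo_angle(enum servo s, int degrees)
{
    printf("servo %d -> %d degrees\n", s, degrees);
}

/* Blink: rotate both eyelid actuators so the rubber cylinders push the
 * eyelids shut, then reverse rotation to reopen them. */
static void blink(void)
{
    set_servo_angle(EYELID_LEFT, 90);
    set_servo_angle(EYELID_RIGHT, 90);
    set_servo_angle(EYELID_LEFT, 0);
    set_servo_angle(EYELID_RIGHT, 0);
}

/* Scan: drive the eye movement gear one way, then the other, so the
 * coupled left and right eye carriages sweep the eyes back and forth. */
static void scan_eyes(void)
{
    set_servo_angle(EYE_CARRIAGE, 30);   /* first direction */
    set_servo_angle(EYE_CARRIAGE, -30);  /* opposite direction */
    set_servo_angle(EYE_CARRIAGE, 0);    /* return to center */
}

int main(void)
{
    blink();
    scan_eyes();
    return 0;
}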
An RFID sensor 81 is secured to the face plate 80 in the area near the mouth 40.
An ear actuating mechanism 82 is also fastened to the face plate 80 and includes left 84 and right 86 ears, and a servo actuator 88 coupled to the left 84 and right 86 ears to move the ears up and down or back and forth, for example.
A nose actuating mechanism 90 includes a nose servo actuator 92 coupled to a rod 94, which extends through articulated nose disks 96 and is capped by the nose 38. Activation of the nose servo actuator 92 moves the nose 38 up and down or side to side, for example. The back of the head plate 98 is coupled to the face plate 80 to enclose the components of the head assembly 22.
The body assembly 24 includes a neck actuating mechanism 100, which includes a head rotation servo actuator 102 to rotate the head assembly 22 to the left and right, and a head nod actuator 104 to move the head 22 up and down. The head assembly 22 is pivotally attached to the body assembly 24 at a neck 106.
The body assembly 24 includes a belly actuating mechanism 110, which includes a belly actuator 112 coupled to a lobed cam 114 rotated by the belly actuator 112. The lobed cam 114 impinges upon a breast plate 116, which is hingedly secured to a front body plate 118. As the lobed cam 114 is rotated by the belly actuator 112, the breast plate 116 moves in and out simulating a breathing motion. A battery pack 120 is mounted in the body 24 to power the actuators and control circuit 150, discussed herein below. A speaker 122 is mounted to the front body plate 118 behind a speaker grill 124. A heartbeat simulator 126 is mounted within the body assembly 24 to simulate a heartbeat. The front body plate 118 is fastened to a back body plate 128 enclosing the body 24.
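For illustration, the breathing cam and heartbeat simulator might be paced as two independent periodic pulses, as in the sketch below. The chosen rates (about 15 breaths and 70 beats per minute) and the helper function names are assumptions, not taken from the disclosure.

/* Illustrative pacing of the belly actuator 112 (breathing cam) and
 * heartbeat simulator 126. Rates and helper names are assumptions. */
#include <stdint.h>
#include <stdio.h>

#define BREATH_PERIOD_MS 4000u  /* ~15 breaths per minute */
#define HEART_PERIOD_MS   857u  /* ~70 beats per minute */

/* Stubs: a real build would toggle motor driver outputs. */
static void pulse_belly_actuator(void) { puts("belly cam: one revolution"); }
static void pulse_heartbeat(void)      { puts("heartbeat: thump"); }

/* Called once per millisecond, e.g. from a timer interrupt. */
static void animate_tick(uint32_t now_ms)
{
    if (now_ms % BREATH_PERIOD_MS == 0)
        pulse_belly_actuator();  /* cam lifts and releases breast plate */
    if (now_ms % HEART_PERIOD_MS == 0)
        pulse_heartbeat();
}

int main(void)
{
    for (uint32_t t = 0; t < 10000; ++t)  /* simulate 10 s of ticks */
        animate_tick(t);
    return 0;
}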
Referring to the figures, a control circuit 150 includes a microcontroller unit (MCU) 152. The MCU 152 controls the rotation of the eyes 42 and 44 and blinking of the eyelids 54 and 56. In response to sounds received via the microphone 55 and inputs from touch sensors 156, the MCU 152 may actuate the nose servo actuator 92 to move the nose 38 up and down, and actuate the ear servo actuator 88 to move the ears 84 and 86. The MCU 152 also controls rotation of the head assembly 22 and the associated servo actuators. The MCU 152 sends signals to the heartbeat simulator 126 and the belly actuator 112 to simulate a heartbeat and breathing, respectively.
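A minimal sketch of the control loop implied above follows: the MCU polls the microphone and touch inputs and nudges the nose and ear actuators in response. All of the I/O helpers are hypothetical stubs; the sensing pattern is simulated for demonstration.

/* Sketch of the sensor-polling loop. All helpers are illustrative. */
#include <stdbool.h>
#include <stdio.h>

/* Stubs: a real build would sample microphone 55 and touch sensors 156;
 * here the inputs fire on a fixed pattern so the demo produces output. */
static bool sound_detected(int tick) { return tick % 40 == 0; }
static bool touch_detected(int tick) { return tick % 70 == 0; }

static void wiggle_nose(void) { puts("nose servo 92: up/down"); }
static void move_ears(void)   { puts("ear servo 88: flap"); }

int main(void)
{
    for (int tick = 0; tick < 100; ++tick) {  /* a real loop runs forever */
        if (sound_detected(tick) || touch_detected(tick)) {
            wiggle_nose();
            move_ears();
        }
    }
    return 0;
}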
Operationally, the MCU 152 produces various moods such as happy, unhappy, and sleepy, for example. A happy expression may include moving the head 22 and nose 38 up, blinking the eyes 42 and 44 by actuating the eyelids 54 and 56, and outputting a happy sound via the speaker 122. When unhappy, the MCU 152 may move the head 22 down and output an unhappy sound, for example. A sleepy expression may include moving the head 22 down, closing the eyes 42 and 44 by actuating the eyelids 54 and 56, and outputting a snoring sound via the speaker 122.
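One natural way to organize these moods is a lookup table pairing each mood with a pose and a sound clip, as in the sketch below. The pose angles, clip names, and helper functions are illustrative assumptions only.

/* Sketch of mood expressions as a lookup table. Poses, clip names, and
 * helpers are assumptions; the disclosure names only the behaviors. */
#include <stdio.h>

enum mood { HAPPY, UNHAPPY, SLEEPY };

struct expression {
    int head_angle;          /* positive = head up, negative = head down */
    int eyelids_closed;      /* 1 = lids held shut */
    const char *sound_clip;  /* played via speaker 122 */
};

static const struct expression moods[] = {
    [HAPPY]   = {  20, 0, "happy.wav"   },  /* head and nose up, blinking */
    [UNHAPPY] = { -20, 0, "unhappy.wav" },  /* head down */
    [SLEEPY]  = { -20, 1, "snore.wav"   },  /* head down, eyes closed */
};

static void express(enum mood m)
{
    const struct expression *e = &moods[m];
    printf("head %d deg, lids %s, play %s\n",
           e->head_angle, e->eyelids_closed ? "closed" : "blinking",
           e->sound_clip);
}

int main(void)
{
    express(HAPPY);
    express(SLEEPY);
    return 0;
}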
When the apparatus 20 is touched or petted, as detected by the MCU 152 via input from the touch sensors 156, the MCU 152 may output a happy expression. If a food accessory such as a dog bone or treat containing an RFID tag is placed near the mouth 40, the RFID sensor 81 will sense the presence of the food accessory, which will be detected by the MCU 152. The MCU 152 may generate a happy response such as moving the head 22 and nose 38 up, blinking the eyes 42 and 44 by actuating the eyelids 54 and 56, and outputting a happy sound via the speaker 122, for example. Other RFID accessories may be used to elicit other responses. If the g/position sensor 158 or the touch sensors 156 detect a sudden movement such as a strike or drop, the MCU 152 may move the head 22 down and output an unhappy sound via the speaker 122. If the g/position sensor 158 detects that the apparatus 20 is being held upside-down, the MCU 152 may move the head 22 side to side quickly and output an angry or unhappy sound via the speaker 122, for example.
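The stimulus-to-response mapping above can be summarized as a simple dispatch over detected events, as sketched below. Event detection is stubbed out, and the event names and responses are illustrative assumptions drawn from the behaviors described.

/* Sketch of the stimulus-to-response mapping. Event detection is
 * stubbed; names and groupings are illustrative assumptions. */
#include <stdio.h>

enum event { EV_PETTED, EV_FED_RFID, EV_STRUCK_OR_DROPPED, EV_UPSIDE_DOWN };

static void respond(enum event ev)
{
    switch (ev) {
    case EV_PETTED:             /* touch sensors 156 */
    case EV_FED_RFID:           /* RFID sensor 81 reads a tagged treat */
        puts("head/nose up, blink, play happy sound");
        break;
    case EV_STRUCK_OR_DROPPED:  /* g/position sensor 158 spike */
        puts("head down, play unhappy sound");
        break;
    case EV_UPSIDE_DOWN:        /* g/position sensor 158 inversion */
        puts("shake head quickly, play angry sound");
        break;
    }
}

int main(void)
{
    respond(EV_FED_RFID);
    respond(EV_UPSIDE_DOWN);
    return 0;
}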
It is to be understood that while certain forms of this invention have been illustrated and described, it is not limited thereto, except in so far as such limitations are included in the following claims and allowable equivalents thereof.
This application claims the benefit of co-pending application Ser. No. 61/583,999, filed Jan. 6, 2012, entitled INTERACTIVE PERSONAL ROBOTIC APPARATUS.