1. Technical Field
The present invention relates to lifelike electronic apparatuses, and particularly to a lifelike electronic apparatus with a lifelike covering.
2. General Background
Recently, quadruped-walking pet robots have been developed and are widely sold. These pet robots resemble dogs or cats and are kept as pets. Such a pet robot is equipped with software that emulates a real animal's emotions. Emotions such as “joy” and “anger” are programmed in the software and can be made to respond to a user's inputs. The robot may respond to inputs such as “patting” and “striking,” as well as to input from environmental conditions.
Such a pet robot generally includes a housing for accommodating various components, such as sensors, actuators, mechanical movement units, etc. The sensors are configured for sensing surrounding conditions and generating sensing signals that activate corresponding components to perform actions. However, when the pet robot is burdened with a large number of sensors, the layout and arrangement of the sensors and other components in the housing may become complicated, and as a result, assembling the components may take an inordinate amount of time and labor.
What is needed, therefore, is a lifelike electronic apparatus with an improved component configuration that reduces the time and labor consumed in assembling the lifelike electronic apparatus.
A lifelike electronic apparatus is provided. The apparatus includes a lifelike covering and a housing covered by the lifelike covering. The housing is configured with a power source, a central processing unit (CPU), a plurality of actuators, and a plurality of mechanical movement units. The lifelike covering includes a flexible covering body and a flexible circuit board covered by the flexible covering body. The flexible circuit board is configured with a plurality of sensors and at least one interface. Each of the sensors is configured for sensing an external input and generating a corresponding sensing signal. The interface is configured for transferring power from the power source to the sensors, and for transferring the sensing signals to the CPU, so as to activate the CPU to generate an action control signal for the corresponding actuators, thereby driving the corresponding mechanical movement units to perform a corresponding action.
Other advantages and novel features will be drawn from the following detailed description with reference to the attached drawings.
The components of the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the lifelike electronic apparatus. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
Referring to
Each of the areas can be equipped with particular sensors 16 for performing a particular application. For example, the head area 110 is configured with one or more light sensors 16a in an eye part of the head area 110 for sensing external light and generating a light sensing signal; the head area 110 is further configured with one or more touch sensitive sensors 16b for sensing a user's touch thereon and generating a touch sensing signal; the body area 112 is configured with one or more pressure sensors 16c for sensing a user's tap or blow thereon and generating a pressure sensing signal; the tail area 113 is configured with one or more infrared sensors 16d for sensing infrared rays emitted from a user and generating an infrared sensing signal. However, it should be noted that the number of sensors 16, the types of sensors 16, and the arrangement of the sensors 16 are not limited to the embodiments described herein.
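The sensor arrangement described above can be sketched as a simple lookup structure. This is an illustrative sketch only; the area and sensor names are assumptions echoing the reference characters in the description, not part of the apparatus itself.

```python
# Illustrative layout of the lifelike covering: each area of the covering
# carries particular sensors (names follow the reference characters above).
SENSOR_LAYOUT = {
    "head_area_110": ["light_sensor_16a", "touch_sensor_16b"],
    "body_area_112": ["pressure_sensor_16c"],
    "tail_area_113": ["infrared_sensor_16d"],
}

def sensors_in(area):
    """Return the sensors configured in a given area of the covering."""
    return SENSOR_LAYOUT.get(area, [])
```

A layout kept in one structure like this reflects the point of the embodiment: the sensors belong to the covering, not to the housing, so they can be enumerated and wired independently of the housing's components.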
The flexible circuit board 11/11′ is further configured with at least one interface 17. For simplicity, in the embodiment as shown in
Referring to
The CPU 21 receives and processes signals from the interface 17, including identified signals and sensing signals that have not been processed by the processing unit 18. The CPU 21 generates action control signals based on the processed signals and an input-output comparison table, and transmits the action control signals to corresponding actuators 23, so as to activate the actuators 23 to drive corresponding mechanical movement units 24 to perform an action.
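The signal path from the CPU 21 through the actuators 23 to the mechanical movement units 24 can be sketched minimally as follows. All class and attribute names are illustrative assumptions; the sketch only shows the direction of control flow described above.

```python
# Minimal sketch: the CPU transmits an action control signal to the
# corresponding actuator, which drives its mechanical movement unit.

class MechanicalMovementUnit:
    def __init__(self):
        self.last_action = None

    def perform(self, action):
        # Perform (here: record) the requested action.
        self.last_action = action

class Actuator:
    def __init__(self, movement_unit):
        self.movement_unit = movement_unit

    def drive(self, action):
        # Driving the actuator moves its mechanical movement unit.
        self.movement_unit.perform(action)

class ApparatusCPU:
    def __init__(self, actuators):
        # actuators: mapping from control object name to Actuator.
        self.actuators = actuators

    def dispatch(self, control_signal):
        # control_signal: (control_object, action), as determined
        # from the input-output comparison table.
        control_object, action = control_signal
        self.actuators[control_object].drive(action)
```

A call such as `dispatch(("eye_actuators", "narrow_eyes"))` would then drive the movement unit registered under `"eye_actuators"`.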
The input-output comparison table is configured for recording a relationship between the external inputs and corresponding outputs. That is, the input-output comparison table records the sources and/or values of the sensing signals, together with the action control signals, which consist of control objects and actions. For example, if the sensing signal is from the light sensor 16a in the eye part of the head area 110 and the light value is greater than a predetermined light value, indicating that the apparatus may be in a bright environment, the corresponding control objects are the actuators 23 in an eye part of the apparatus and the corresponding action is narrowing the eyes of the apparatus. If the sensing signal is from the pressure sensor 16c in the body area 112 and the pressure value is greater than a predetermined pressure value, indicating that the user of the apparatus may be angry, the control objects are the actuators 23 in a mouth part of the apparatus and the corresponding action is opening a mouth of the apparatus and outputting the speech “ouch.” If the sensing signal is from the infrared sensor 16d in the tail area 113, the control objects are the actuators 23 in a neck part of the apparatus and the corresponding action is turning a head of the apparatus and outputting the speech “who is standing behind me.”
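The examples above can be sketched as a table lookup. This is a minimal illustration under assumed names and threshold values (the patent specifies only that predetermined values exist, not what they are).

```python
# Hypothetical input-output comparison table: each entry maps a sensing
# signal source to an optional predetermined threshold, the control objects
# (actuators), and the action to perform. Thresholds and names are assumed.
INPUT_OUTPUT_TABLE = {
    "light_sensor_eye": {
        "threshold": 500,   # assumed predetermined light value
        "control_objects": "eye_actuators",
        "action": "narrow_eyes",
    },
    "pressure_sensor_body": {
        "threshold": 10.0,  # assumed predetermined pressure value
        "control_objects": "mouth_actuators",
        "action": "open_mouth_and_say_ouch",
    },
    "infrared_sensor_tail": {
        "threshold": None,  # any detection triggers the action
        "control_objects": "neck_actuators",
        "action": "turn_head_and_ask_who_is_there",
    },
}

def lookup_action(source, value):
    """Return (control_objects, action) for a sensing signal, or None if
    the signal's value does not exceed its predetermined threshold."""
    rule = INPUT_OUTPUT_TABLE.get(source)
    if rule is None:
        return None
    if rule["threshold"] is not None and value <= rule["threshold"]:
        return None
    return rule["control_objects"], rule["action"]
```

For instance, a light value above the assumed threshold yields the eye-narrowing action, while a value below it yields no action.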
When the CPU 21 receives, from the interface 17, sensing signals that have not been processed by the processing unit 18, the CPU 21 identifies the sensing signals according to their coordinates, that is, it determines the sources of the sensing signals and the values of the external inputs, and then generates corresponding action control signals based on the identified signals and the input-output comparison table. Alternatively, when the CPU 21 receives already-identified signals from the interface 17, the CPU 21 directly generates corresponding action control signals based on the identified signals and the input-output comparison table.
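The two paths just described can be sketched as follows, assuming a coordinate map standing in for the CPU's knowledge of the sensor layout on the flexible circuit board; the coordinates, field names, and sensor names are illustrative assumptions.

```python
# Assumed mapping from flexible-circuit-board coordinates to signal sources.
COORDINATE_MAP = {
    (0, 1): "light_sensor_eye",
    (3, 2): "pressure_sensor_body",
}

def handle_signal(signal):
    """Return (source, value) for a signal from the interface.

    Signals already identified by the processing unit pass through
    directly; raw sensing signals are identified by their coordinates.
    """
    if signal.get("identified"):
        # Already processed by the processing unit: use it as-is.
        return signal["source"], signal["value"]
    # Raw sensing signal: determine the source from its coordinates.
    source = COORDINATE_MAP[signal["coords"]]
    return source, signal["value"]
```

Either path yields the same (source, value) pair, which is then matched against the input-output comparison table.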
By configuring the lifelike covering 1 as described above, the sensors 16 are installed directly in the lifelike covering 1 during assembly of the apparatus, separately from the assembly of the components of the housing 2. When the lifelike electronic apparatus employs a large number of sensors 16, this configuration of the lifelike covering 1 effectively improves assembly speed as compared with current lifelike electronic apparatuses, in which the sensors are installed in the housing together with the other components.
Although the present invention has been specifically described on the basis of a preferred embodiment thereof, the invention is not to be construed as being limited thereto. Various changes or modifications may be made to the embodiment without departing from the scope and spirit of the invention.
Number | Date | Country | Kind |
---|---|---|---|
200710200035.X | Jan 2007 | CN | national |