1. Technical Field
The disclosure relates to a robot and, more particularly, to a robot and a control method adapted for the robot.
2. Description of the Related Art
There are a variety of robots on the market today, such as electronic toys, electronic pets, and the like. Some robots can simulate biological functions. For example, a robot may close its eyes to simulate sleeping. What is needed, though, is a robot that can respond to user instructions related to the simulated biological functions.
The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the robot. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
The robot 1 includes a storage unit 10, a clock unit 20, an input unit 30, an output unit 40, and a processing unit 50. The storage unit 10 stores an action database 110, a biological function table 120 of the robot 1, and an output adjustment table 130 of the robot 1. The action database 110 stores a list of normal biological actions and predetermined actions to be performed by the robot 1. The robot 1 performs the normal biological actions during the biological functions. When the robot 1 is not performing a biological function and receives an instruction from the user, the robot 1 performs one of the predetermined actions.
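As an illustrative (non-normative) sketch, the three stores described above might be represented as simple lookup tables. All field and entry names below are hypothetical, drawn only from the examples given in this description:

```python
# Hypothetical sketch of the action database 110, biological function
# table 120, and output adjustment table 130; names are illustrative.

ACTION_DATABASE = {
    # normal biological actions performed during biological functions
    "close_eyes": "close eyes and stay still",
    "chew": "chew slowly",
    # predetermined action performed outside biological functions
    "walk": "walk at an average speed",
}

# Each biological function has a time period (beginning, ending)
# and a normal biological action defined in the action database.
BIOLOGICAL_FUNCTION_TABLE = {
    "sleeping": {"period": ("23:00", "08:00"), "normal_action": "close_eyes"},
    "eating":   {"period": ("12:00", "12:30"), "normal_action": "chew"},
}

# Action adjustment parameter of each biological function.
OUTPUT_ADJUSTMENT_TABLE = {
    "sleeping": "action/sound slowly",
    "eating":   "action slowly and sound lower",
}
```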
Each of the biological functions is assigned a time period. The time period column records the time period of each biological function of the robot 1. Each time period includes a beginning time and an ending time. For example, 23:00 is the beginning time and 8:00 is the ending time of the “sleeping” biological function: at 23:00 the robot 1 begins the “sleeping” biological function, and at 8:00 the robot 1 ends it. The normal biological action column records the normal biological action of each biological function performed by the robot 1. Once the robot 1 enters a biological function, it performs the normal biological action of that function until the ending time of the function is reached.
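The beginning-time/ending-time check described above must handle periods, such as sleeping (23:00 to 8:00), that span midnight. A minimal sketch, assuming times are compared on a 24-hour clock:

```python
from datetime import time

def in_period(now, begin, end):
    """Return True if `now` falls within [begin, end), handling periods
    such as sleeping (23:00-08:00) that wrap past midnight."""
    if begin <= end:
        return begin <= now < end
    return now >= begin or now < end  # period wraps past midnight

# The "sleeping" biological function runs from 23:00 to 08:00.
SLEEP_BEGIN, SLEEP_END = time(23, 0), time(8, 0)

print(in_period(time(2, 30), SLEEP_BEGIN, SLEEP_END))   # True: 02:30 is during sleep
print(in_period(time(12, 0), SLEEP_BEGIN, SLEEP_END))   # False: noon is not
```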
The action adjustment parameter column records an action adjustment parameter for each biological function. For example, the action adjustment parameter of the “sleeping” biological function is “action/sound slowly,” and that of the “eating” biological function is “action slowly and sound lower.” The output column records a plurality of outputs to be performed by the robot 1. When the robot 1 receives an instruction from the input unit 30 during a biological function, the processing unit 50 pauses the normal biological action of the biological function and performs an output according to the action adjustment parameter of the biological function. For example, when the robot 1 receives the “touch head” instruction from the input unit 30 during the “sleeping” biological function, the processing unit 50 controls the robot 1 to open its eyes and walk slowly. When the robot 1 receives the “how are you” sound instruction from the input unit 30 during the “eating” biological function, the processing unit 50 controls the robot 1 to stand up and say “not bad” quietly. If the robot 1 is not performing a biological function and receives the “touch head” instruction from the input unit 30, the processing unit 50 simply controls the robot 1 to walk at an average speed.
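The adjusted-output selection described above can be sketched as a lookup keyed by the instruction and the active biological function, falling back to the predetermined action when no biological function is active. The table entries follow the examples in the text; the function and key names are hypothetical:

```python
# Hypothetical output table: (instruction, active biological function)
# -> output. A key with active function None holds the predetermined
# action used outside the biological functions.
OUTPUTS = {
    ("touch head", "sleeping"): "open eyes and walk slowly",
    ("how are you", "eating"):  "stand up and say 'not bad' quietly",
    ("touch head", None):       "walk at an average speed",
}

def select_output(instruction, active_function):
    """Pick the adjusted output for an instruction; fall back to the
    predetermined action when no biological function is active."""
    return OUTPUTS.get((instruction, active_function),
                       OUTPUTS.get((instruction, None)))

print(select_output("touch head", "sleeping"))  # open eyes and walk slowly
print(select_output("touch head", None))        # walk at an average speed
```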
The clock unit 20 is configured for measuring time. The input unit 30 is configured for generating the instructions in response to user input. The output unit 40 is configured for outputting an action. The processing unit 50 further includes a management module 510, a judgment module 520, and a performing module 530. The management module 510, connected to the clock unit 20, is configured for managing each biological function of the robot 1. When the time of the clock unit 20 reaches a beginning time of a biological function, the management module 510 fetches a normal biological action of the biological function from the biological function table 120 and controls the performing module 530 to begin performing the normal biological action defined in the action database 110, and the output unit 40 begins outputting the normal biological action. When the time of the clock unit 20 reaches an ending time of the biological function, the management module 510 finishes the biological function and controls the performing module 530 to stop the normal biological action.
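The begin/stop behavior of the management module 510 might be sketched as follows; the class, method, and table names are illustrative, not from this description:

```python
from datetime import time

class ManagementModule:
    """Minimal sketch of the scheduling behaviour of the management
    module: begin the normal biological action at a period's beginning
    time and stop it at the ending time. Names are hypothetical."""

    def __init__(self, function_table):
        # name -> (begin, end, normal biological action)
        self.function_table = function_table
        self.active = None  # currently running biological function

    @staticmethod
    def _in_period(now, begin, end):
        # Handle periods such as sleeping (23:00-08:00) that wrap midnight.
        if begin <= end:
            return begin <= now < end
        return now >= begin or now < end

    def tick(self, now):
        """Called whenever the clock advances: returns the normal
        biological action to perform, or None outside all periods."""
        for name, (begin, end, normal_action) in self.function_table.items():
            if self._in_period(now, begin, end):
                self.active = name          # begin normal biological action
                return normal_action
        self.active = None                  # ending time reached: stop action
        return None

mm = ManagementModule({"sleeping": (time(23, 0), time(8, 0), "close eyes")})
print(mm.tick(time(23, 30)))  # close eyes
print(mm.tick(time(9, 0)))    # None
```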
The judgment module 520 is configured for judging whether the input unit 30 generates the instructions in response to user input during the biological functions. If the input unit 30 generates an instruction during a biological function, the judgment module 520 controls the performing module 530 to pause the normal biological action of the biological function, fetches the action adjustment parameter of the biological function from the output adjustment table 130, and controls the performing module 530 to perform an output according to the action adjustment parameter; the output unit 40 then outputs the result.
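The pause-and-adjust behavior of the judgment module 520 might be sketched as follows; the adjustment table and all method names are illustrative:

```python
class JudgmentModule:
    """Sketch of the judgment module: pause the normal biological
    action, perform an adjusted output, then resume. Hypothetical names."""

    def __init__(self, adjustment_table):
        self.adjustment_table = adjustment_table
        self.paused_action = None

    def on_instruction(self, instruction, active_function, normal_action):
        """Pause the normal biological action and return the output,
        adjusted by the active function's action adjustment parameter."""
        if active_function is None:
            return instruction                  # predetermined action, no adjustment
        self.paused_action = normal_action      # pause normal biological action
        adjustment = self.adjustment_table[active_function]
        return f"{instruction} ({adjustment})"  # adjusted output

    def on_output_finished(self):
        """Resume the paused normal biological action, if any."""
        action, self.paused_action = self.paused_action, None
        return action
```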
If the instruction is received while during the biological function, in step S440, the judgment module 520 fetches the action adjustment parameter of the biological function from the output adjustment table 130 and controls the performing module 530 to perform the output according to the action adjustment parameter. In step S450, after the output is finished, the judgment module 520 controls the performing module 530 to resume performing the normal biological action of the biological function. In step S460, when the time of the clock unit 20 reaches the ending time of the biological function, the management module 510 controls the performing module 530 to stop the normal biological action.
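Steps S440 through S460 above can be summarized as a short sequence; the function and parameter names below are hypothetical:

```python
def run_steps(instruction, active_function, normal_action, adjustment_table):
    """Trace of steps S440-S460 for one instruction received during a
    biological function. Names are illustrative, not from the text."""
    log = []
    # S440: fetch the action adjustment parameter and perform the output
    adjustment = adjustment_table[active_function]
    log.append(f"output: {instruction} ({adjustment})")
    # S450: output finished, resume the normal biological action
    log.append(f"resume: {normal_action}")
    # S460: ending time reached, stop the normal biological action
    log.append(f"stop: {normal_action}")
    return log

steps = run_steps("touch head", "sleeping", "close eyes",
                  {"sleeping": "action/sound slowly"})
print(steps)
```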
It is understood that the invention may be embodied in other forms without departing from the spirit thereof. Thus, the present examples and embodiments are to be considered in all respects as illustrative and not restrictive, and the invention is not to be limited to the details given herein.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
2008 1 0305169 | Oct 2008 | CN | national
References Cited: U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
5966526 | Yokoi | Oct 1999 | A
6227966 | Yokoi | May 2001 | B1
6442450 | Inoue et al. | Aug 2002 | B1
6505098 | Sakamoto et al. | Jan 2003 | B1
6711469 | Sakamoto et al. | Mar 2004 | B2
6718232 | Fujita et al. | Apr 2004 | B2
7967657 | Ganz | Jun 2011 | B2
20030060930 | Fujita et al. | Mar 2003 | A1
20030093280 | Oudeyer | May 2003 | A1
20030109959 | Tajima et al. | Jun 2003 | A1
20030216160 | Yokoi | Nov 2003 | A1
20050119037 | Yokoi | Jun 2005 | A1
20050148390 | Murase et al. | Jul 2005 | A1
20050246063 | Oonaka | Nov 2005 | A1
20060287032 | Yokoi | Dec 2006 | A1
20070168942 | Kaplan | Jul 2007 | A1
20080192580 | Larian | Aug 2008 | A1
20080195566 | Lee et al. | Aug 2008 | A1
20080208776 | Lee et al. | Aug 2008 | A1
20080215182 | Lee et al. | Sep 2008 | A1
Publication

Number | Date | Country
---|---|---
20100106296 A1 | Apr 2010 | US