The present invention relates to robots, and more particularly, to a robot apparatus and an output control method adapted for the robot apparatus.
There are many robotic designs on the market today. Robots may be designed to perform tedious manufacturing tasks or for entertainment, and some are designed for use in home settings. Family robots are equipped with various external sensors, such as a microphone, a charge-coupled device (CCD) camera, and the like. A family robot can be programmed to respond in some manner when it recognizes the voice or appearance of a family member using voice recognition and/or image recognition software. However, analyzing external stimuli with such software is a complex procedure for a robot, and mistakes are common. As a result, the family robot may perform an incorrect output.
Accordingly, what is needed in the art is a robot system that overcomes the deficiencies of the prior art.
A robot system is provided. The robot system includes a robot apparatus and several wireless communication devices. The wireless communication devices are configured to send radio frequency (RF) signals of identification (ID) codes. The robot apparatus includes a communicating unit, a sensing unit, a buffer unit, a storage unit, a processing unit, and an output unit. The communicating unit is for receiving the RF signals of ID codes from the wireless communication devices within a predetermined area and time period. The sensing unit is for sensing people and obtaining the number of people within the predetermined area and time period. The buffer unit is for storing previous and current condition data, wherein the previous data, which are initialized to null, comprise the ID codes and the number of people updated and stored at a previous time, and the current data comprise the current ID codes and the number of people in the predetermined area as determined by the communicating unit and the sensing unit. The storage unit is for storing an output table, which respectively associates a plurality of outputs with various combinations and/or changes in the ID codes and the number of people in the predetermined area.
The processing unit includes an ID presence determining (IDPD) module, an updating module, and an output decision module. The IDPD module is for comparing the current ID codes and the number of people with the previous data stored in the buffer unit, and generating an update signal when the two sets of data are not equal. The updating module is for replacing the previous data with the current data based on the update signal. The output decision module is configured for acquiring output data in the storage unit associated with any differences between the previous data and the current data in the output table. The output unit is for performing an output according to the output data.
Other advantages and novel features will be drawn from the following detailed description with reference to the attached drawings.
The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of a robot system. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
The communicating unit 11 is configured for receiving RF signals of ID codes from the RFID cards 8 within a predetermined area and time period. The sensing unit 12 is configured for sensing people and obtaining the number of people within the predetermined area and time period. The sensing unit 12 can be configured at any predetermined position on the robot 1. The sensing unit 12 may be a microphone to pick up ambient sound in the predetermined area, a charge-coupled device (CCD) camera to capture images of people in the predetermined area, or other sensing unit, such as an infrared sensing unit, an ultrasonic sensing unit, and the like.
The buffer unit 50 includes a previous data buffer 501 and a current data buffer 502. The current data buffer 502 stores current RF and sensory data of the robot 1. The current RF and sensory data include the ID codes received by the communicating unit 11 and the number of people sensed by the sensing unit 12. The previous data buffer 501 stores the same kinds of previously recorded data. By default, the previous data are initialized to null. When the current data does not match the previous data, the processing unit 20 replaces the previous data with the current data. When the previous data and the current data are the same, no update to the previous data takes place in the previous data buffer 501.
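The update rule of the buffer unit 50 described above may be sketched as follows. This is a minimal Python illustration, not part of the specification; the class and field names are assumptions, and the condition data are modeled here as a set of ID codes plus a head count.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ConditionData:
    id_codes: frozenset   # ID codes received by the communicating unit 11
    people_count: int     # number of people sensed by the sensing unit 12

@dataclass
class BufferUnit:
    # Previous data buffer 501 is initialized to null (None) by default.
    previous: Optional[ConditionData] = None
    # Current data buffer 502 holds the latest RF and sensory data.
    current: Optional[ConditionData] = None

    def update_if_changed(self) -> bool:
        """Replace the previous data with the current data only when they
        differ; return True when such an update took place."""
        if self.current != self.previous:
            self.previous = self.current
            return True
        return False
```

A buffer that starts at null and receives the data "ID codes R1 and R3, two persons" would report one update, after which repeated identical readings leave the previous data buffer untouched.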
The processing unit 20 includes an ID presence determining (IDPD) module 21, an output decision module 22, and an updating module 23. The IDPD module 21 is configured for comparing the current ID codes and the number of people in the predetermined area, as stored in the current data buffer 502, with the ID codes and the number of people determined previously and stored in the previous data buffer 501, and generating an update signal when the two sets of data are not equal.
The output decision module 22, electrically coupled to the IDPD module 21, includes an action decision module 221, a light decision module 222, a sound decision module 223, and a communication decision module 224. The output decision module 22 is configured for acquiring output data (i.e. the sound data 41, the light data 42, the communication data 43, and the action data 44) in the storage unit 40 associated with any differences between previous data and current data in the output table 45 and controlling the output unit 30 to perform an output.
The output unit 30 includes an action control module 31, a light module 32, a sound module 33, and a communication module 34. The light module 32, electrically coupled to the light decision module 222, is configured for emitting light. The sound module 33, electrically coupled to the sound decision module 223, is configured for outputting voice warnings. The communication module 34, electrically coupled to the communication decision module 224, is configured for providing a communicative output. The communication module 34 may communicate with an external communication apparatus (not shown) and send the communicative output to the external communication apparatus. The action control module 31, electrically coupled to the action decision module 221, is configured for performing actions. The action control module 31 includes a head control module 311 for controlling a head of the robot 1, a tail control module 312 for controlling a tail of the robot 1, and a limb control module 313 for controlling limbs of the robot 1.
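The dispatch of output data to the modules of the output unit 30 can be sketched as follows. This is an illustrative Python sketch only; the method names, the dictionary keys, and the internal log are assumptions, not interfaces defined by the specification.

```python
class OutputUnit:
    """A stand-in for output unit 30; each method represents one module."""

    def __init__(self):
        self.log = []  # records performed outputs, for illustration only

    def emit_light(self, pattern):   # light module 32
        self.log.append(("light", pattern))

    def play_sound(self, message):   # sound module 33
        self.log.append(("sound", message))

    def communicate(self, payload):  # communication module 34
        self.log.append(("comm", payload))

    def act(self, action):           # action control module 31
        self.log.append(("action", action))

def perform_output(unit, output_data):
    """Dispatch whichever kinds of output data are present; an output
    need not involve every module."""
    if "light" in output_data:
        unit.emit_light(output_data["light"])
    if "sound" in output_data:
        unit.play_sound(output_data["sound"])
    if "comm" in output_data:
        unit.communicate(output_data["comm"])
    for action in output_data.get("actions", []):
        unit.act(action)
```

Because each kind of output data is optional, an output that specifies only a light pattern and an action exercises only the light module and the action control module.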
Taking row No. 1 as an example, when the previous data are “two communicated ID codes of R1 and R3 and two sensed persons” and the current data are “three communicated ID codes and three sensed persons”, the processing unit 20 controls the output unit 30 to perform a corresponding output according to the previous data and the current data; that is, for example, the light decision module 222 controls the light module 32 to emit a slowly flashing blue light, the sound decision module 223 controls the sound module 33 to output the voice warning “mother is back”, the head control module 311 controls the robot 1 to raise its head, and the limb control module 313 controls the robot 1 to walk towards R2 (mother).
When the previous data are “three communicated ID codes and three sensed persons” and the current data are “three communicated ID codes and five sensed persons”, as shown in row No. 2, the processing unit 20 controls the output unit 30 to perform corresponding output according to the previous data and the current data, that is, for example, the light decision module 222 controls the light module 32 to emit a slowly flashing yellow light, the sound decision module 223 controls the sound module 33 to output voice warning “guests come”, and the limb control module 313 controls the robot 1 to walk towards the guests and the tail control module 312 controls the robot 1 to swing the tail.
As shown in row No. 3, when the previous data are “nothing communicated and nobody sensed” and the current data are “nothing communicated and one sensed person”, the processing unit 20 controls the output unit 30 to perform corresponding output according to the previous data and the current data, that is, for example, the light decision module 222 controls the light module 32 to emit a quickly flashing red light, the sound decision module 223 controls the sound module 33 to output warning voice, the communication decision module 224 controls the communication module 34 to send out the communication data of “a stranger is in the room”, and the head control module 311 controls the robot 1 to face the stranger and the limb control module 313 controls the robot 1 to retreat.
When the previous data are “three communicated ID codes and three sensed persons” and the current data are “two communicated ID codes of R1 and R2 and two sensed persons”, as shown in row No. 4, the processing unit 20 controls the output unit 30 to perform corresponding output according to the previous data and the current data, that is, for example, the light decision module 222 controls the light module 32 to emit a slowly flashing green light, the sound decision module 223 controls the sound module 33 to output voice warning “the child goes out”, and the head control module 311 controls the robot 1 to shake the head.
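The four example rows above can be modeled as a lookup table keyed on the (previous data, current data) transition. The keys and output fields below are an illustrative encoding of those rows, not the literal format of output table 45; the assignment of R2 to the mother and R3 to the child follows the examples but is an assumption.

```python
# Each key is (previous, current), where a state is (frozenset of ID codes,
# number of sensed persons); each value is the associated output data.
OUTPUT_TABLE = {
    # Row 1: R2 (mother) comes back.
    ((frozenset({"R1", "R3"}), 2), (frozenset({"R1", "R2", "R3"}), 3)): {
        "light": "slowly flashing blue",
        "sound": "mother is back",
        "actions": ["raise head", "walk towards R2"],
    },
    # Row 2: two guests without ID cards arrive.
    ((frozenset({"R1", "R2", "R3"}), 3), (frozenset({"R1", "R2", "R3"}), 5)): {
        "light": "slowly flashing yellow",
        "sound": "guests come",
        "actions": ["walk towards guests", "swing tail"],
    },
    # Row 3: a person without an ID card appears in an empty room.
    ((frozenset(), 0), (frozenset(), 1)): {
        "light": "quickly flashing red",
        "sound": "warning",
        "comm": "a stranger is in the room",
        "actions": ["face the stranger", "retreat"],
    },
    # Row 4: R3 (the child) goes out.
    ((frozenset({"R1", "R2", "R3"}), 3), (frozenset({"R1", "R2"}), 2)): {
        "light": "slowly flashing green",
        "sound": "the child goes out",
        "actions": ["shake head"],
    },
}
```

Using frozensets of ID codes makes the keys hashable and makes "three communicated ID codes and five sensed persons" (row No. 2) naturally distinct from the three-person state, since the head count is part of the key.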
If the comparison is not equal, that is, when the current data does not match the previous data, in step S105, the IDPD module 21 further generates the update signal to the updating module 23. In step S106, the updating module 23 replaces the previous data with the current data. In step S107, the output decision module 22 acquires the output data based on the associated output found in the output table 45. In step S108, the output unit 30 performs the output based on the output data.
It is understood that the output decision module 22 does not have to include all three of the light decision module 222, the sound decision module 223, and the communication decision module 224; accordingly, the output unit 30 does not have to include all of the light module 32, the sound module 33, and the communication module 34. Furthermore, the action control module 31 does not have to include all of the head control module 311, the tail control module 312, and the limb control module 313.
In addition to monitoring changes in the composition of groups of people within a predetermined area centered on the system, and performing actions associated with those changes, the robot system may be employed to monitor other kinds of changes as well. For example, if used in a parking garage, the system could track vehicles, alert to the presence of unauthorized vehicles, and warn people in the area of unauthorized vehicles or persons whose presence might mean an act of theft or assault is imminent.
It is understood that the invention may be embodied in other forms without departing from the spirit thereof. Thus, the present examples and embodiments are to be considered in all respects as illustrative and not restrictive, and the invention is not to be limited to the details given herein.
Number | Date | Country | Kind |
---|---|---|---|
2007 1 0074768 | Jun 2007 | CN | national |
Number | Name | Date | Kind |
---|---|---|---|
6429016 | McNeil | Aug 2002 | B1 |
7099745 | Ebert | Aug 2006 | B2 |
7228203 | Koselka et al. | Jun 2007 | B2 |
7245216 | Burkley et al. | Jul 2007 | B2 |
7456596 | Goodall et al. | Nov 2008 | B2 |
7720572 | Ziegler et al. | May 2010 | B2 |
7739534 | Cheng et al. | Jun 2010 | B2 |
7814355 | Cheng et al. | Oct 2010 | B2 |
7873913 | Lian et al. | Jan 2011 | B2 |
7932809 | Nair et al. | Apr 2011 | B2 |
7949899 | Chen et al. | May 2011 | B2 |
7996111 | Cheng et al. | Aug 2011 | B2 |
8001426 | Cheng et al. | Aug 2011 | B2 |
8065622 | Li et al. | Nov 2011 | B2 |
20020011367 | Kolesnik | Jan 2002 | A1 |
20020060542 | Song et al. | May 2002 | A1 |
20030025472 | Jones et al. | Feb 2003 | A1 |
20030176986 | Dietsch et al. | Sep 2003 | A1 |
20040134337 | Solomon | Jul 2004 | A1 |
20040207355 | Jones et al. | Oct 2004 | A1 |
20040211444 | Taylor et al. | Oct 2004 | A1 |
20050000543 | Taylor et al. | Jan 2005 | A1 |
20080167751 | Cheng et al. | Jul 2008 | A1 |
20080177421 | Cheng et al. | Jul 2008 | A1 |
20080306629 | Chiang et al. | Dec 2008 | A1 |
20080306741 | Wang et al. | Dec 2008 | A1 |
20090063155 | Chiang et al. | Mar 2009 | A1 |
20090083039 | Chiang et al. | Mar 2009 | A1 |
20090132250 | Chiang et al. | May 2009 | A1 |
Number | Date | Country |
---|---|---|
2004160630 | Jun 2004 | JP |
M269107 | Jul 2005 | TW |
Number | Date | Country |
---|---|---|
20080306629 A1 | Dec 2008 | US |