This application claims the priority benefit of Taiwan application serial no. 100144852, filed on Dec. 6, 2011. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
1. Field of the Invention
The present invention relates to a control device and a robot using the control device, and more particularly, to a facial expression control device and a robot head using the facial expression control device.
2. Description of Related Art
The technique of an emulation robot head having a humanoid appearance was proposed by Hara and Kobayashi of the Tokyo Institute of Technology. They used pneumatic actuators to control artificial facial skin fabricated from silicone rubber. The artificial facial skin is soft and flexible, and thus the facial skin can express six basic facial expressions (surprise, fear, sadness, anger, happiness and disgust) by pulling 19 control points disposed behind the facial skin. The selection of the control points is based on the facial expression coding system proposed by Ekman, from which 14 expression control units sufficient to compose the 6 basic facial expressions are selected and used. According to that definition, the expressions of the human face can be composed from 44 groups of expression control units, i.e., the number of the expression control units determines the number of expression variations. Accordingly, many researchers regard this work as a basis, and techniques related to the robot head have been published one after another, including techniques of controlling facial skin variation by means of shape memory alloys, motors, electroactive polymers (EAP) and the like, wherein the method of using motors to control the facial skin for expression variation is the most commonly used. The main reason for using motors to control facial expressions is that a motor has a faster response rate and uses electric power as the power source, so that other auxiliary devices (e.g., pneumatic compressors) are not necessary.
Well-known companies that develop related products of the emulation robot head include Kokoro of Japan, Hanson Robotics and Wow Wee of the US, and Xi An Superman of China. The robot heads of each of the above-mentioned companies have different degrees of freedom (DOF) (i.e., different expression variations) according to their different purposes. However, the expression variation mainly depends upon how many actuators are used. In addition, patents related to the robot head include U.S. Pat. No. 7,113,848 and Japan Patent Publication No. 2002-035440. U.S. Pat. No. 7,113,848 discloses a humanoid face capable of facial expression including a plurality of actuators disposed in a casing, a linkage connected to the actuators and an outer skin connected to the linkage. Japan Patent Publication No. 2002-035440 discloses a humanoid face capable of facial expression including multiple flexible latching rings disposed at particular locations on an inner side of the skin, and the latching rings are connected to the skin by a special connecting adhesive.
Reviewing the currently known patents, references and products, it can be seen that, regardless of the way the facial expressions are achieved, a robot head capable of facial expression generally has to use a large number of motors (e.g., 10 to 20 motors), pneumatic actuators, electrically driven shape memory alloys and the like to move the control points of the facial skin to achieve different facial expressions. Each of the large number of actuators (motors, pneumatic actuators, electrically driven shape memory alloys) used in the conventional robot head can only vary the position of one control point (a single degree of freedom). Thus, in order for the robot head to have sufficient facial expressions (joy, anger, sadness, happiness and the like), at least 12 motors are necessary to respectively drive different control points, so that the manufacturing cost of the robot head remains high and the difficulty of mechanism design and repair increases. Most importantly, the reliability of the product may be decreased. These reasons may be the main obstacles preventing the robot head capable of facial expressions from becoming a widespread product.
According to the currently known patents, references and products, it can be seen that the robot head capable of facial expression generally has to use a large number of actuators to drive the control points of the facial skin to show a variety of facial expressions. The more actuators are used, the more facial expressions the robot head has, which results in a high manufacturing cost of the robot head and complex fabricating processes. Accordingly, the invention provides a simplified device which shows various facial expressions with fewer actuators.
The present invention provides a facial expression control device including a frame, a rotating element, a plurality of pushing bars, an actuator and a linking assembly. The actuator drives the rotating element to rotate, so that the pushing bars, which have the same length, correspondingly prop against the facial expression control structures of the rotating element via the relative movement between the rotating element and the pushing bars. Each of the facial expression control structures may be an indentation or a protrusion relative to the surface of the rotating element, and thereby each of the facial expression control structures has a shifting distance relative to the surface. Thus, when the pushing bars prop against the facial expression control structures, the lengths by which the pushing bars protrude from the surface of the rotating element vary, and the control bars of the linking assembly are further driven to rotate. Accordingly, the control points of the facial skin linked with the control bars are driven, so that the facial skin shows the expression variations.
In light of the above, the facial expression control device of the present invention has rows of facial expression control structures with different heights or depths disposed on the rotating element of the expression selecting assembly to provide a plurality of shifting distances. By means of the cooperation between the facial expression control structures and the pushing bars, together with the pushing or pulling of the facial expression control device, the robot head using the facial expression control device can represent various facial expressions with a smaller number of actuators.
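For purposes of illustration, the relationship among the facial expression control structures, the pushing bars and the control points described above may be modeled conceptually as in the following Python sketch. The expression names, the number of control points and the shifting distances are illustrative assumptions and are not limiting: each row on the rotating element is represented as a list of signed shifting distances (positive for a protrusion, negative for an indentation), and selecting a row determines how far each pushing bar protrudes and therefore how far each control point of the facial skin is pulled.

```python
# Illustrative conceptual model only; names and values are assumed, not limiting.
# Each row of facial expression control structures is a list of signed shifting
# distances relative to the surface of the rotating element:
#   positive value -> protrusion above the surface
#   negative value -> indentation below the surface
#   zero           -> flush with the surface
EXPRESSION_ROWS = {
    "neutral":   [0.0,  0.0, 0.0,  0.0,  0.0],
    "happy":     [2.0,  1.5, 0.0, -1.0,  2.0],   # assumed values in millimetres
    "sad":       [-1.5, 0.0, 1.0,  2.0, -0.5],
    "surprised": [3.0,  2.5, 2.0,  2.5,  3.0],
}

BASE_LENGTH_MM = 5.0  # assumed protrusion of a pushing bar resting on the bare surface


def pushing_bar_lengths(expression: str) -> list[float]:
    """Length each pushing bar protrudes after the single actuator rotates the
    element so that the bars prop against the row for the given expression."""
    return [BASE_LENGTH_MM + shift for shift in EXPRESSION_ROWS[expression]]


def control_point_pulls(expression: str, lever_ratio: float = 1.0) -> list[float]:
    """Displacement handed to each control point of the facial skin through the
    control bars of the linking assembly, modeled as a simple lever ratio."""
    return [lever_ratio * (length - BASE_LENGTH_MM)
            for length in pushing_bar_lengths(expression)]


if __name__ == "__main__":
    for name in EXPRESSION_ROWS:
        print(name, control_point_pulls(name))
```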
In order to make the aforementioned features and advantages of the disclosure more comprehensible, embodiments with reference to accompanying drawings are described in detail below.
The accompanying drawings constituting a part of this specification are incorporated herein to provide a further understanding of the invention. Here, the drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
According to the related art, based on the current technology, if the robot is required to have sufficient facial expressions, a plurality of control points have to be disposed on the facial skin, and thus the required quantity of actuators which respectively drive the control points cannot be effectively reduced. Limited by the current technology, the manufacturing cost cannot be effectively reduced, and this may prevent robots capable of various facial expressions from being extensively used. On the other hand, if the quantity of actuators is reduced, the robot may have less expression variation and may look inflexible.
Accordingly, the present invention provides a facial expression control device. A plurality of rows of facial expression control structures with different heights or depths are disposed on the rotating element of the facial expression control device to provide shifting distances. With the cooperation of the pushing bars, the facial expression control device can show various expressions by using only one rotating element to drive the plurality of control points of the facial skin. In other words, a plurality of control points are controlled by comparatively fewer actuators in the present invention, so that the robot head has plenty of facial expression variations with good emulation. The following describes the configuration of the facial expression control device of the present invention and its applications.
The facial expression control device 130 includes a frame 1342, a rotating element 1344, a plurality of pushing bars 1346 and an actuator 1348. The frame 1342 includes a pair of third sidewalls 1342a and a fourth sidewall 1342b, wherein the two third sidewalls 1342a are substantially parallel to each other, the fourth sidewall 1342b is opposite to the second sidewall 132b and connected between the two third sidewalls 1342a, and the fourth sidewall 1342b has a plurality of holes 1342c. The rotating element 1344 is pivoted to the third sidewalls 1342a of the frame 1342 and located in the space surrounded by the third sidewalls 1342a, the fourth sidewall 1342b and the second sidewall 132b. The facial expression control device 130 is pushed by the pushing assembly 136 and moves relative to the main base 132.
The rotating element 1344 has at least one surface 1344a and a plurality of facial expression control structures 1344b arranged in rows on the surface 1344a, and each of the facial expression control structures 1344b has a shifting distance relative to the surface 1344a. More specifically, the rotating element 1344 includes a rotating shaft 1344c, a sleeve 1344d, a transmission element 1344e and a pair of sliding elements 1344f, wherein the sleeve 1344d is disposed around and fixed to the rotating shaft 1344c, and the sleeve 1344d can be a cylinder or a polyhedral prism as required. In the present embodiment, the sleeve 1344d is a cylinder, and the surface 1344a and the facial expression control structures 1344b are disposed on the sleeve 1344d. In other embodiments not shown in the figures, the sleeve 1344d can be a polyhedral prism, and thus the rotating element 1344 may have a plurality of surfaces connected one after another, and multiple rows of the facial expression control structures 1344b can be disposed on each surface according to the requirements. Moreover, the facial expression control structures 1344b of
As described above, the transmission element 1344e is disposed around the rotating shaft 1344c and located beside the sleeve 1344d, wherein the transmission element 1344e contacts the actuator 1348 fixed on the frame 1342, so that when the actuator 1348 is driven, the transmission element 1344e is rotated by the actuator 1348 and thereby drives the rotating shaft 1344c and the sleeve 1344d to rotate. The sliding elements 1344f are disposed around the rotating shaft 1344c and located at the two sides of the sleeve 1344d. The main base 132 further has a pair of sliding slots 132d overlapped with a portion of the assembling slot 132c. Each of the sliding elements 1344f has a protruding portion 1344g, and the protruding portions 1344g are respectively located in the sliding slots 132d. The sliding slots 132d can be formed on the assembling plate 150, wherein the location of the assembling plate 150 corresponds to the location of the assembling slot 132c, and the assembling plate 150 is assembled on the first sidewall 132a of the main base 132.
In addition, the pushing bars 1346 are arranged in a row, respectively pass through the holes 1342c of the fourth sidewall 1342b of the frame 1342, and respectively prop against one row of the facial expression control structures 1344b disposed on the surface 1344a of the rotating element 1344. The quantity of the facial expression control structures 1344b may be the same or different in each row; in this embodiment, the quantity of the facial expression control structures 1344b is the same in every row. The number of the pushing bars 1346 can be less than or equal to the number of the facial expression control structures 1344b of each row according to actual requirements. In other words, the sleeve 1344d of the rotating element 1344 is modularized in fabrication, and thus the quantity of facial expression control structures 1344b of each row is predetermined. In order to meet the demand for the number of expressions of every robot head, the quantity of pushing bars 1346 can be changed. For example, if a robot head having diverse and varied facial expressions is needed, the largest number of pushing bars 1346, equal to the number of facial expression control structures 1344b of each row, is used; and if the robot head is required to have comparatively fewer expressions, the control points which drive the facial skin (not shown) can be reduced, and thus the number of pushing bars 1346 can be less than that of the facial expression control structures 1344b of each row.
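As a rough illustration of the quantity relationships described above, the following sketch (with assumed, non-limiting numbers) checks that the chosen number of pushing bars does not exceed the number of facial expression control structures in each row of a modularized sleeve, and reports how many control points and how many selectable expressions such a configuration would provide.

```python
# Assumed, non-limiting configuration check for a modularized sleeve.
STRUCTURES_PER_ROW = 8   # predetermined by the modularized sleeve (assumed value)
ROWS_ON_SLEEVE = 6       # one selectable row per expression (assumed value)
PUSHING_BARS = 6         # chosen per robot head; must not exceed STRUCTURES_PER_ROW


def check_configuration(bars: int, structures_per_row: int, rows: int) -> dict:
    """Validate a configuration and report what it drives."""
    if bars > structures_per_row:
        raise ValueError("the number of pushing bars cannot exceed the number of "
                         "facial expression control structures in each row")
    return {
        "driven_control_points": bars,   # one control point per pushing bar
        "selectable_expressions": rows,  # one expression per row, single actuator
    }


print(check_configuration(PUSHING_BARS, STRUCTURES_PER_ROW, ROWS_ON_SLEEVE))
```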
In addition, regardless of whether the facial expression control structures 1344b are indentations recessed into the surface 1344a or protrusions protruding from the surface 1344a, the distances of the ends of any two adjacent facial expression control structures 1344b relative to the surface 1344a may vary, so that the facial skin (not shown) can show many more facial expressions. In more detail, any two adjacent facial expression control structures 1344b can both be protrusions (or indentations), and the distances of the ends of the two adjacent protrusions (or indentations) relative to the surface 1344a can be the same or different. Moreover, any two adjacent facial expression control structures 1344b can be a protrusion and an indentation, and the distance between the top end of the protrusion and the surface 1344a and that between the bottom of the indentation and the surface 1344a can also be the same or different.
The following describes in detail how the facial expression control device 130 drives the facial skin (not shown) to show expressions.
When the pushing assembly 136 pulls the facial expression control device 130 back to the original position, the springs 138c, which are compressed as the distance between the limiting rings 138b and the first assembling plates 138d changes due to the movement of the pushing bars 1346, drive the limiting rings 138b back to the original position by their own resilience.
As described above, in the facial expression control device 130 of this embodiment, only one actuator 1348 is necessary to drive the rotating element 1344 to rotate, and the pushing bars 1346 respectively prop against the facial expression control structures 1344b in cooperation with the pushing assembly 136 pushing the facial expression control device 130, so as to drive the linking assembly 138 to pull the facial skin (not shown), wherein the number of control points that influence the facial expressions is determined according to the number of pushing bars 1346 and the number of facial expression control structures 1344b disposed on the rotating element 1344. Furthermore, the number of facial expression control structures 1344b in each row can be changed according to the requirements, and thus the number of the facial expression control structures 1344b can be increased to facilitate the facial skin (not shown) showing many more varied expressions with good emulation.
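As a simple, assumed comparison of actuator counts (the numbers below are examples only and do not limit the embodiment), a conventional robot head uses roughly one actuator per control point, whereas in this embodiment the rotating actuator and the drive of the pushing assembly together serve all of the control points:

```python
# Assumed, illustrative comparison; numbers are examples only.
CONTROL_POINTS = 12  # example number of control points on the facial skin

# Conventional approach: roughly one actuator per control point.
conventional_actuators = CONTROL_POINTS

# This embodiment: one actuator rotating the element plus one drive for the
# pushing assembly (assumed) serve every control point at once.
embodiment_actuators = 1 + 1

print(f"conventional: {conventional_actuators} actuators; "
      f"this embodiment: {embodiment_actuators} actuators "
      f"for the same {CONTROL_POINTS} control points")
```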
In contrast, the facial skin of the conventional robot head needs a large quantity of actuators to drive the control points, so the manufacturing cost of the conventional robot head is rather high. Since fewer actuators are used in the robot head of the present invention to control the plurality of control points of the facial skin, more facial expressions with diversity and good emulation are achieved, and the manufacturing cost is also effectively reduced.
In addition, although one row of the pushing bars and one row of the facial expression control structures are used in the description of the first embodiment, based on this teaching, people of ordinary skill in the art may derive other modifications according to the actual requirements. For instance, more rows of the pushing bars 1346 and more rows of the facial expression control structures 1344b can be disposed.
In such a configuration, the quantity of combinations of expression variations can be increased, and thus the robot head can show many more expressions with diversity and good emulation.
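As a counting sketch under assumed, non-limiting numbers, the variations available from a single actuator can be estimated by multiplying the number of surfaces of the sleeve by the rows of structures on each surface, while additional rows of pushing bars multiply the number of control points driven at once:

```python
# Assumed counting sketch; all numbers are illustrative only.
def selectable_rows(surfaces: int, rows_per_surface: int) -> int:
    """Rows a single rotating element can present to the pushing bars; each row
    corresponds to one combination of control point positions."""
    return surfaces * rows_per_surface


def driven_control_points(bar_rows: int, bars_per_row: int) -> int:
    """Control points driven simultaneously when several rows of pushing bars are used."""
    return bar_rows * bars_per_row


# Cylindrical sleeve with 6 rows versus a polyhedral prism with 4 surfaces and
# 3 rows on each surface; two rows of 6 pushing bars in the second case.
print(selectable_rows(1, 6), driven_control_points(1, 6))   # 6 variations, 6 points
print(selectable_rows(4, 3), driven_control_points(2, 6))   # 12 variations, 12 points
```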
Additionally, wires are used in the connecting structure between the pushing bars and the control bars described in the first, second and third embodiments, but the connection between the pushing bars and the control bars can be modified in other embodiments within the spirit of driving the pushing bars and the control bars of the present invention. The following describes three other possible embodiments.
In this way, when the pushing bars 1346 move relative to the second assembling plate 138h, through the cooperation of the first latching structure 338c and the second latching structure 1346a, the control bars 338a may rotate by taking the rotating shaft 138e as a rotating center and further drive the facial skin (not shown) to show expressions.
In light of the foregoing, in the facial expression control device and the robot head using the same of the present invention, only one actuator is used to drive the rotating element to rotate; with the cooperation of the heights or depths engaged by the pushing bars respectively propping against the facial expression control structures, and with the pushing assembly pushing the facial expression control device so as to drive the linking assembly to drive the facial skin, the robot head shows varied facial expressions. Compared to the conventional robot head, since the control points are controlled by fewer actuators while the robot head shows plenty of facial expressions with good emulation, the number of actuators is reduced and the manufacturing cost of the robot head is also effectively reduced. The robot heads can further be produced in a modularized manner, so that the whole fabricating cost of the robot head can be reduced, which facilitates the popularization of the robot.
Furthermore, the quantity of facial expression control structures in each row can be changed according to the requirements, and the configuration of the facial expression control structures and the pushing bars, as well as the connection between the pushing bars and the control bars, can also be changed according to the requirements. Thus, the facial expressions are sufficient with good emulation, the facial skin has more expression variations without changing the quantity of actuators, and the design of the facial expressions is more flexible.
Although the invention has been described with reference to the above embodiments, it will be apparent to one of ordinary skill in the art that modifications to the described embodiments may be made without departing from the spirit of the invention. Accordingly, the scope of the invention will be defined by the attached claims and not by the above detailed descriptions.
Number | Date | Country | Kind |
---|---|---|---|
100144852 A | Dec 2011 | TW | national |
Number | Name | Date | Kind |
---|---|---|---|
961262 | Slough | Jun 1910 | A |
1496406 | Bertsch | Jun 1924 | A |
1547183 | Steele | Jul 1925 | A |
1821243 | Springer | Sep 1931 | A |
2175311 | Preston | Oct 1939 | A |
2246381 | Paull | Jun 1941 | A |
2285472 | Tenenbaum | Jun 1942 | A |
2633669 | Churus | Apr 1953 | A |
2641866 | Schiller | Jun 1953 | A |
2720053 | Knott | Oct 1955 | A |
2938302 | Walss | May 1960 | A |
2954639 | Walss | Oct 1960 | A |
2969610 | Weiner | Jan 1961 | A |
3719118 | Colburn | Mar 1973 | A |
3738055 | Marble | Jun 1973 | A |
3757465 | Barlow | Sep 1973 | A |
4272918 | Inoue | Jun 1981 | A |
4537300 | Facchini | Aug 1985 | A |
5021878 | Lang | Jun 1991 | A |
5142803 | Lang | Sep 1992 | A |
5924969 | Waluda | Jul 1999 | A |
6068536 | Madland et al. | May 2000 | A |
6352464 | Madland et al. | Mar 2002 | B1 |
6652349 | Wichter | Nov 2003 | B1 |
6758717 | Park et al. | Jul 2004 | B1 |
6905390 | Fukui et al. | Jun 2005 | B2 |
7021988 | Patton | Apr 2006 | B2 |
7113848 | Hanson | Sep 2006 | B2 |
7234988 | Patton | Jun 2007 | B2 |
7738997 | Lin | Jun 2010 | B2 |
7904204 | Lin | Mar 2011 | B2 |
8596152 | Piacenza et al. | Dec 2013 | B2 |
8662955 | Fai et al. | Mar 2014 | B1 |
20020019193 | Maggiore et al. | Feb 2002 | A1 |
20040249510 | Hanson | Dec 2004 | A1 |
20110179893 | Elliott et al. | Jul 2011 | A1 |
Number | Date | Country |
---|---|---|
101940843 | Jan 2011 | CN |
2134303 | Aug 1984 | GB |
1982-11279 | Jan 1982 | JP |
2002-035440 | Feb 2002 | JP |
3110260 | Apr 2005 | JP |
201010783 | Mar 2010 | TW |
M406453 | Jul 2011 | TW |
Entry |
---|
“Search Report of China Counterpart Application”, issued on Mar. 8, 2013, p. 1-p. 6. |
“http://hansonrobotics.wordpress.com/”, retrieved on Apr. 2, 2012, Hanson Robotics. |
“http://www.wowwee.com/en/support/elvis”, retrieved on Apr. 2, 2012, Wow Wee Astonishing Imagination: Elvis. |
“Office Action of Taiwan Counterpart Application” , issued on Jan. 20, 2014, p. 1-p. 4. |
Hiroshi Kobayashi, et al., “Study on face robot for active human interface-mechanisms of face robot and expression of 6 basic facial expressions”, Proceedings of IEEE International Workshop on Robot and Human Communication, Nov. 3-5, 1993, pp. 276-281. |
H. Kobayashi, et al., “Study on Face Robot Platform as a KANSEI Medium”, IEEE International Conference on Industrial Electronics, Control and Instrumentation, Oct. 2000, pp. 481-486. |
Takashi Minato, et al., “Development of an android robot for studying human-robot interaction”, Proc. 17th International Conference on Industrial and Engineering Applications of Artificial Intelligence and Expert Systems, Ottawa, Canada, 2004, pp. 424-434. |
Jun-Ho Oh, et al., “Design of Android type Humanoid Robot Albert HUBO”, Intelligent Robots and Systems, IEEE/RSJ International Conference, Oct. 9-15, 2006, pp. 1428-1433. |
Dong-Wook Lee, et al., “Development of an Android for Emotional Expression and Human Interaction”, Proceedings of International Federation of Automatic Control, Jul. 6-11, 2008, pp. 4336-4337. |
Shin'Ichiro Nakaoka, et al., “Creating facial motions of Cybernetic Human HRP-4C”, 9th IEEE-RAS International Conference on Humanoid Robots, Dec. 7-11, 2009, pp. 561-567. |
Chyi-Yeu Lin, et al., “The realization of robot theater: humanoid robots and theatric performance”, International Conference on Advanced Robotics (ICAR), Jun. 22-26, 2009, pp. 1-6. |
D. Hanson, et al. “Identity Emulation (IE): Bio-inspired Facial Expression Interfaces for Emotive Robots”, Proc. AAAI National Conference in Edmonton, 2002, pp. 1-11. |
Takuya Hashimoto, et al., “Development of the Face Robot SAYA for Rich Facial Expressions”, SICE-ICASE International Joint Conference 2006, Oct. 18-21, 2006, pp. 5423-5428, Bexco, Busan, Korea. |
Hiroshi Ishiguro, “Android Science: Toward a new cross-interdisciplinary framework”, The 12th International Symposium of Robotics Research, 2005, pp. 1-6. |
Minoru Hashimoto, et al., “Development and Control of a Face Robot Imitating Human Muscular Structures”, Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Oct. 9-15, 2006, pp. 1855-1860. |
Karsten Berns, et al., “The Mechatronic Design of a Human-like Robot Head”, 16th CISM-IFToMM Symposium on Robot Design, Dynamics, and Control (ROMANSY), 2006, pp. 1-8. |
“Office Action of China Counterpart Application”, issued on May 20, 2014, p. 1-p. 7. |
Number | Date | Country | |
---|---|---|---|
20130139631 A1 | Jun 2013 | US |