The present invention relates to a technique of controlling a robot.
A technique of judging the ease of movement of a robot by regarding an area in which the robot may interfere with a structure or the like as a warning area has been proposed (see Japanese Patent No. 4276624). A technique of causing a robot to travel while contacting a structure has been proposed (see International Patent Publication No. WO2015/087504).
In the case where a robot grasps an object with its hand having a plurality of finger mechanisms, however, the object grasping state is uncertain even though an output from a six-axis force sensor disposed on the hand may show a state of equilibrium. When an output from the six-axis force sensor indicates the equilibrium state, a control device which has received the output recognizes that the hand is grasping the object in a desired state.
In contrast, the robot hand 13 may actually be grasping the object W in a state deviating from the desired state.
In view of the foregoing, it is an object of the present invention to provide a robot and a control device of the robot that can improve the recognition accuracy of the position/posture of a functional body such as a hand with respect to an object, to attain improved stability of the operation involving a change in position/posture of the body.
The present invention relates to a robot and its control device, the robot including a body, a first limb movably connected to the body, a functional body movably connected to the first limb, a second limb movably connected to the body, and an image capturing device having a fixed position and posture relative to the functional body.
The control device includes: a storage device that stores environmental information representing a manner in which a structure is arranged in a world coordinate system; an operation generating element that generates an operation of the robot, the operation including operations of the first limb and the functional body that cause the functional body to interact with the structure, the arrangement manner of which is represented by the environmental information, and an operation of the second limb that causes at least one of a position and a posture of the body to be changed; a state detecting element that detects a state of interaction between the functional body and the structure; a calculating element that calculates, based on a captured image acquired by the image capturing device, a first reference vector and a second reference vector that represent a manner in which the structure extends in an image capturing device coordinate system and in a functional body coordinate system, respectively; and a correcting element that calculates an amount of correction for correcting a posture of the functional body coordinate system in the world coordinate system estimated from the result of detection by the state detecting element of the state of interaction between the functional body and the structure, on a condition that a magnitude of a deviation of an angle made by the first reference vector and the second reference vector with respect to a reference angle is greater than a threshold value.
According to the control device and the robot including the control device of the present invention, in a case where the magnitude of a deviation of the angle made by the two reference vectors with respect to the reference angle is greater than the threshold value, it is highly probable that the state of interaction between the functional body and the structure is divergent from a desired state. Thus, in such a case, the posture of the functional body coordinate system in the world coordinate system, which is estimated from the detection result of the state of interaction between the functional body and the structure, is corrected such that the magnitude of the deviation becomes not greater than the threshold value. This avoids a situation in which an operation of the robot that changes the relative position/posture of the body coordinate system with respect to the functional body coordinate system is generated on the basis of a false recognition that the state of interaction between the functional body and the structure agrees with the desired state; accordingly, improved stability of the operation of the robot is achieved.
(Configuration of Robot)
A robot 1 as an embodiment of a mobile device of the present invention is a humanoid robot that includes a body 10, a head 11 disposed above the body 10, right and left arms 12 (first limbs) each movably connected to the body 10, a hand 13 (functional body) movably connected to a distal end of each arm 12, right and left legs 14 (second limbs) each movably connected to the body 10, and a foot 15 connected to a distal end of each leg 14.
An arm 12 includes a first arm link, connected to the body 10 through the intermediary of the shoulder joint mechanism, and a second arm link, having one end connected to an end of the first arm link through the intermediary of the elbow joint mechanism and the other end connected to a base of the hand 13 through the intermediary of the wrist joint mechanism. The shoulder joint mechanism has two degrees of freedom of rotation about the yaw and pitch axes. The elbow joint mechanism has one degree of freedom of rotation about the pitch axis. The wrist joint mechanism has two degrees of freedom of rotation about the roll and pitch axes.
A camera C is fixed to a distal end of each arm 12. A hand 13 includes a palm and one or more finger mechanisms (movable members) which are movable with respect to the palm. The hand 13 is configured to be able to grasp an object in between the palm and one or more finger mechanisms, or in between two or more finger mechanisms, by operations of the finger mechanisms.
A leg 14 includes a first leg link, connected to the body 10 through the intermediary of the hip joint mechanism, and a second leg link, having one end connected to an end of the first leg link through the intermediary of the knee joint mechanism and the other end connected to the foot 15 through the intermediary of the ankle joint mechanism. The hip joint mechanism has three degrees of freedom of rotation about the yaw, pitch, and roll axes. The knee joint mechanism has one degree of freedom of rotation about the pitch axis. The ankle joint mechanism has two degrees of freedom of rotation about the pitch and roll axes. The robot 1 is capable of traveling autonomously with a movement involving the repeated floor-leaving and floor-landing operations of the respective right and left legs 14.
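By way of a non-limiting illustration, the rotational degrees of freedom recited above may be summarized in a configuration table such as the following sketch (the Python representation and the identifier names are illustrative assumptions; only the joint names, axes, and degree-of-freedom counts come from the description above).

```python
# Rotational degrees of freedom of each joint mechanism, as recited above.
# Keys and axis labels are illustrative; the DoF counts follow the text.
JOINT_DOF = {
    "shoulder": ("yaw", "pitch"),          # 2 DoF (arm 12)
    "elbow":    ("pitch",),                # 1 DoF (arm 12)
    "wrist":    ("roll", "pitch"),         # 2 DoF (arm 12)
    "hip":      ("yaw", "pitch", "roll"),  # 3 DoF (leg 14)
    "knee":     ("pitch",),                # 1 DoF (leg 14)
    "ankle":    ("pitch", "roll"),         # 2 DoF (leg 14)
}

# Rotational DoF totals per arm and per leg implied by the description.
ARM_DOF = sum(len(JOINT_DOF[j]) for j in ("shoulder", "elbow", "wrist"))  # 5
LEG_DOF = sum(len(JOINT_DOF[j]) for j in ("hip", "knee", "ankle"))        # 6
```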
(Configuration of Control Device)
The control device 2 is mounted on the robot 1 and controls the operation of the robot 1 on the basis of output signals from an internal sensor group S1 and an external sensor group S2.
The internal sensor group S1 includes a GPS measurement device or an acceleration sensor for measuring the position (of the center of gravity) of the robot 1, a gyro sensor for measuring a posture of the body 10, rotary encoders that measure the joint angles about the axes of the joint mechanisms, and six-axis force sensors that measure external forces acting on the hands 13.
The external sensor group S2 includes the camera C as well as a motion capture system (not shown) independent of the robot 1, a stereo image sensor mounted on the head 11 for measuring the trajectory of the position of an object such as a ball related to execution of a task, and an active sensor mounted on the body 10 and using infrared light.
The control device 2 includes: a storage device 20, a state detecting element 22, a calculating element 24, a correcting element 26, and an operation generating element 28. The storage device 20 stores information necessary for generating an operation of the robot 1, such as “environmental information” representing a manner in which a structure is arranged in a world coordinate system. The state detecting element 22 detects a state of interaction between a hand 13 and a structure W on the basis of output signals from six-axis force sensors or contact sensors (disposed on the finger mechanisms) included in the internal sensor group S1. The calculating element 24 calculates a first reference vector U1 and a second reference vector U2 which represent how a structure W extends in a camera coordinate system and in a hand coordinate system, respectively, on the basis of a captured image acquired by the camera C. The correcting element 26 calculates an amount of correction for correcting a posture of the hand coordinate system in the world coordinate system, on a condition that the magnitude of a deviation Δϕ of an angle ϕ made by the first reference vector U1 and the second reference vector U2 with respect to a reference angle ϕ0 is greater than a threshold value ε. The operation generating element 28 generates an operation or a gait of the robot 1, which includes operations of an arm 12 (first limb) and a hand 13 (functional body) that cause the hand 13 to interact with a structure W and an operation of a leg 14 (second limb) that causes the position/posture of the body 10 to be changed.
A single processor (arithmetic processing unit) may function as the plurality of elements 22 to 28, or a plurality of processors (multicore processors) may cooperate through mutual communications to function as the plurality of elements 22 to 28.
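By way of a non-limiting illustration, the cooperation of the elements 20 to 28 on a single processor may be sketched as follows (all class, method, and attribute names are hypothetical and do not appear in the specification).

```python
class ControlDevice:
    """Minimal sketch of one control cycle across elements 20-28."""

    def __init__(self, storage, detector, calculator, corrector, generator):
        self.storage = storage        # storage device 20 (environmental information)
        self.detector = detector      # state detecting element 22
        self.calculator = calculator  # calculating element 24
        self.corrector = corrector    # correcting element 26
        self.generator = generator    # operation generating element 28

    def control_cycle(self, captured_image, sensor_signals):
        # Detect the state of interaction between the hand 13 and a structure W.
        state = self.detector.detect(sensor_signals)
        # Calculate the first and second reference vectors from the image.
        u1, u2 = self.calculator.reference_vectors(captured_image)
        # Correct the estimated posture of the hand coordinate system if needed.
        posture = self.corrector.correct(state, u1, u2)
        # Generate an operation of the robot 1 from the (corrected) estimate.
        return self.generator.generate(self.storage.environmental_info(), posture)
```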
(Functions)
An operation of grasping a structure W such as a handrail with a hand 13 is generated by the operation generating element 28, and the operation of the robot 1 is controlled in accordance with the generated operation. Then, when the hand 13 grasps the structure W, the state of grasping the structure W with the hand 13 (the state of interaction between them) is detected on the basis of the output signals from the six-axis force sensors or contact sensors (disposed on the finger mechanisms F1, F2) included in the internal sensor group S1 (STEP 01).
In the case where the output signals indicate a state of equilibrium (when the maximum, average, or accumulated temporal change amount of the output signals is not greater than a judgment value), it is estimated that the state of grasping the structure W with the hand 13 is a desired state.
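By way of a non-limiting illustration, the equilibrium judgment may be realized as in the following sketch (the array layout and the selection among the maximum, average, and accumulated statistics are assumptions of this sketch).

```python
import numpy as np

def is_equilibrium(signals: np.ndarray, judgment_value: float,
                   statistic: str = "max") -> bool:
    """Judge an equilibrium state from force-sensor outputs.

    signals: array of shape (T, n) holding n output channels over T samples.
    Equilibrium is judged when the chosen statistic of the temporal change
    amount of the outputs is not greater than the judgment value.
    """
    change = np.abs(np.diff(signals, axis=0))   # per-sample temporal change
    if statistic == "max":
        amount = change.max()
    elif statistic == "average":
        amount = change.mean()
    else:                                       # "accumulated"
        amount = change.sum()
    return amount <= judgment_value
```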
A captured image is acquired by the camera C included in the external sensor group S2 (STEP 02).
Edge points are extracted from the captured image (or a gray-scale image generated as the captured image is gray-scaled). Then, from the captured image, one or more linear components are extracted, which are each composed of a group of edge points arranged in a straight line (STEP 04).
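By way of a non-limiting illustration, this step may be realized with a conventional edge detector followed by a Hough transform, which yields lines directly in the (ρ, θ) parameterization used in the relational expression (01) below (the use of OpenCV and the parameter values are assumptions of this sketch).

```python
import cv2
import numpy as np

def extract_linear_components(image_bgr, min_votes: int = 120):
    """Extract linear components as (rho, theta) pairs from a captured image."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)  # gray-scale image
    edges = cv2.Canny(gray, 50, 150)                    # edge points
    # The standard Hough transform groups edge points arranged in a straight
    # line and returns each line as (rho, theta), matching
    # rho = x*cos(theta) + y*sin(theta).
    lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=min_votes)
    return [] if lines is None else [tuple(line[0]) for line in lines]
```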
In the case where two or more linear components are extracted, the presence or absence of linear component pair(s) is judged (STEP 06).
If no linear component pair is extracted (NO in STEP 06), the subsequent calculation of the reference vectors is not performed.
On the other hand, if any linear component pair is extracted (YES in STEP 06), a vanishing point Q is defined from that pair and the first reference vector U1 is calculated.
Specifically, the position of the point of intersection between the pair of linear components (ρ1 = x cos θ1 + y sin θ1 and ρ2 = x cos θ2 + y sin θ2) in the image coordinate system is represented as the coordinate values (xQ, yQ) of the vanishing point Q by the relational expression (01).

xQ = (ρ2 sin θ1 − ρ1 sin θ2)/(sin θ1 cos θ2 − cos θ1 sin θ2),
yQ = (ρ1 − xQ cos θ1)/sin θ1 (01).
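By way of a non-limiting illustration, the relational expression (01) may be transcribed as follows (the handling of nearly parallel linear components, for which the denominator vanishes, is an assumption of this sketch).

```python
import math

def vanishing_point(rho1, theta1, rho2, theta2, eps=1e-9):
    """Point of intersection of two lines rho_i = x*cos(theta_i) + y*sin(theta_i)."""
    denom = math.sin(theta1) * math.cos(theta2) - math.cos(theta1) * math.sin(theta2)
    if abs(denom) < eps:
        return None  # (nearly) parallel pair: no finite point of intersection
    x_q = (rho2 * math.sin(theta1) - rho1 * math.sin(theta2)) / denom
    # Substitute x_q back into whichever line equation has a usable sin term.
    if abs(math.sin(theta1)) > eps:
        y_q = (rho1 - x_q * math.cos(theta1)) / math.sin(theta1)
    else:
        y_q = (rho2 - x_q * math.cos(theta2)) / math.sin(theta2)
    return x_q, y_q
```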
With this, the point Q of intersection of the pair of linear components L1 and L2 is defined as the vanishing point. Where f is the focal length of the camera C and (u1X, u1Y, u1Z) are the components, in the camera coordinate system, of a vector representing the direction in which the structure W extends, the coordinate values of the vanishing point Q satisfy the relational expression (02).

xQ = f(u1X/u1Z), yQ = f(u1Y/u1Z) (02)
Thus, the first reference vector U1 which represents the position of the vanishing point Q in the image coordinate system with reference to the origin OC of the camera coordinate system is calculated in accordance with the relational expression (03).
U1 = t(xQ, yQ, f) (03)
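By way of a non-limiting illustration, the relational expressions (02) and (03) may be transcribed as follows (treating f as the focal length of the camera C in pixel units is an assumption of this sketch).

```python
import numpy as np

def first_reference_vector(x_q: float, y_q: float, f: float) -> np.ndarray:
    """U1 = t(xQ, yQ, f) per the relational expression (03).

    By expression (02), xQ = f*(u1X/u1Z) and yQ = f*(u1Y/u1Z), so U1 is
    parallel to the direction in which the structure W extends, expressed
    in the camera coordinate system with reference to the origin OC.
    """
    return np.array([x_q, y_q, f], dtype=float)
```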
Subsequently, the second reference vector U2, which represents in the hand coordinate system (X1, Y1, Z1) the position of the vanishing point Q in the image coordinate system, is calculated (STEP 10). Since the position and posture of the camera C are fixed relative to the hand 13, a position PC in the camera coordinate system and a position P1 in the hand coordinate system are related through a translation vector T and a rotation matrix R in accordance with the relational expression (04).

PC = T + R·P1 (04).
The second reference vector U2 is calculated in accordance with the relational expression (05).
U2 = T + R·P1 + U1 (05).
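By way of a non-limiting illustration, the relational expressions (04) and (05) may be transcribed as follows (representing the fixed camera-to-hand relation by a 3×3 rotation matrix R and a translation vector T is implied by the expressions; the function interface is an assumption of this sketch).

```python
import numpy as np

def second_reference_vector(T: np.ndarray, R: np.ndarray,
                            P1: np.ndarray, U1: np.ndarray) -> np.ndarray:
    """U2 = T + R·P1 + U1 per the relational expression (05).

    T (shape (3,)) and R (shape (3, 3)) encode the fixed position and posture
    of the camera C relative to the hand 13, so that PC = T + R·P1
    (expression (04)) maps a hand-coordinate position P1 into the camera
    coordinate system.
    """
    return T + R @ P1 + U1
```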
Further, an angle ϕ made by the first reference vector U1 and the second reference vector U2 is calculated in accordance with the relational expression (06) (STEP 12).
ϕ = arccos((U1·U2)/(|U1||U2|)) (06).
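By way of a non-limiting illustration, the relational expression (06) may be transcribed as follows (clipping the cosine to [−1, 1] to guard against floating-point rounding is an implementation detail of this sketch).

```python
import numpy as np

def angle_between(u1: np.ndarray, u2: np.ndarray) -> float:
    """phi = arccos((U1·U2)/(|U1||U2|)) per expression (06), in radians."""
    cos_phi = np.dot(u1, u2) / (np.linalg.norm(u1) * np.linalg.norm(u2))
    return float(np.arccos(np.clip(cos_phi, -1.0, 1.0)))
```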
It is judged whether the magnitude of a deviation Δϕ (= ϕ − ϕ0) of the angle ϕ with respect to a reference angle ϕ0 is greater than a threshold value ε (STEP 14).
If it is judged that the magnitude of the deviation Δϕ is not greater than the threshold value ε (NO in STEP 14), the posture of the hand coordinate system in the world coordinate system is used as estimated, without correction.
If it is judged that the magnitude of the deviation Δϕ is greater than the threshold value ε (YES in STEP 14), the correcting element 26 calculates an amount of correction for correcting the posture of the hand coordinate system in the world coordinate system such that the magnitude of the deviation Δϕ becomes not greater than the threshold value ε.
The posture of the hand coordinate system in the world coordinate system is corrected by that amount of correction. This correction is a concept that encompasses both of the following: that the posture of the hand coordinate system in the world coordinate system is corrected computationally, not accompanied by a change in posture of the hand 13 in the real space; and that the posture of the hand coordinate system in the world coordinate system is corrected computationally and the posture of the hand 13 in the real space is changed so as to conform to the corrected posture of the hand coordinate system in the world coordinate system. Then, with reference to the position and posture of the hand coordinate system in the world coordinate system, an operation of the robot 1, including the desired temporal change manners of the position and posture of the body coordinate system, is generated in accordance with the kinematic model of the robot 1, on the basis of a joint angle of each joint mechanism.
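By way of a non-limiting illustration, STEP 14 and the subsequent correction may be sketched as follows (realizing the amount of correction as a rotation about the normal common to U1 and U2, via Rodrigues' formula, is an assumption of this sketch; the specification only requires that the corrected posture bring the magnitude of the deviation Δϕ to not greater than ε).

```python
import numpy as np

def needs_correction(phi: float, phi0: float, threshold: float) -> bool:
    """STEP 14: true when |delta_phi| = |phi - phi0| exceeds the threshold."""
    return abs(phi - phi0) > threshold

def correction_rotation(u1: np.ndarray, u2: np.ndarray,
                        phi: float, phi0: float) -> np.ndarray:
    """Rotation matrix that reduces the deviation delta_phi toward zero.

    The rotation axis (normal to U1 and U2) and the use of Rodrigues'
    formula are assumptions of this sketch, not prescribed by the
    specification.
    """
    delta = phi - phi0
    axis = np.cross(u1, u2)
    axis = axis / np.linalg.norm(axis)
    k = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    # Rodrigues' formula: R = I + sin(a)*K + (1 - cos(a))*K^2, with a = -delta.
    return np.eye(3) + np.sin(-delta) * k + (1.0 - np.cos(-delta)) * (k @ k)
```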
In a state where a relation between the robot 1 and its environment is unknown or in a state where the recognition accuracy thereof is low, when the state of grasping a structure W with a hand 13 (the state of interaction between them) is detected, the “environmental information” representing the position/posture of the structure in the world coordinate system may be corrected by the above-described amount of correction, with reference to the position/posture of the hand coordinate system.
(Effects)
According to the control device 2 of the robot 1 as an embodiment of the present invention, in a case where the magnitude of the deviation Δϕ of the angle ϕ (see the relational expression (06)) made by the two reference vectors U1 and U2 with respect to the reference angle ϕ0 is greater than the threshold value ε, there is a high probability that the state of interaction between the hand 13 (functional body) and the structure W, detected by the state detecting element 22, is divergent from a desired state. In such a case, the posture of the hand coordinate system in the world coordinate system, estimated from the result of detection of the state of interaction, is corrected such that the magnitude of the deviation Δϕ becomes not greater than the threshold value ε. This avoids generating an operation of the robot 1 on the basis of a false recognition that the grasping state agrees with the desired state, and accordingly improves the stability of the operation of the robot 1.
In the embodiment described above, the vanishing point Q was defined as the point of intersection of a linear component pair extracted from the captured image.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
2017-064459 | Mar 2017 | JP | national
Foreign Patent Documents

Number | Date | Country
---|---|---
4276624 | Mar 2009 | JP
2004052597 | Jun 2004 | WO
2015087504 | Jun 2015 | WO
Publication Data

Number | Date | Country
---|---|---
20180281881 A1 | Oct 2018 | US