The present invention relates to a body orientation estimation device and a body orientation estimation program capable of detecting the orientation of a target even in an environment in which the positional relation between a sensor and the target changes in various ways.
In a face orientation detection device shown in Patent Literature 1, distances to a head 4 are detected by a plurality of distance sensors 11a to 11e arranged horizontally on a head rest 3, and an approximate curve 5 is prepared based on these head distances. A part of the prepared approximate curve 5 is approximated by an ellipse 6, and a yaw angle of the head 4 is estimated based on the long axis of the ellipse 6.
Further, in a human body orientation estimation device shown in Patent Literature 2, a distance sensor 50 is provided, the target person is approximated by an ellipse in accordance with the measurement result of the distance sensor 50, and the position of the center of gravity thereof is calculated. The estimation device tracks changes in the position of the center of gravity, calculates the movement direction of the target, and estimates this movement direction as the orientation of the target. Similarly, a position detection device shown in Patent Literature 3 measures the target by a plurality of distance sensors 11 to 14, approximates an outline of the target by an ellipse from the point group of measured distances, tracks changes in the center of gravity, calculates the movement direction of the target, and estimates this as the orientation of the target.
[Patent Literature 1] Japanese Laid-Open Patent Publication No. 2016-175509
[Patent Literature 2] Japanese Laid-Open Patent Publication No. 2011-081634
[Patent Literature 3] Japanese Laid-Open Patent Publication No. 2010-169521
As shown in Patent Literature 1, since the head 4 leans on the head rest 3, the positional relation between each of the sensors 11a to 11e and the head 4 is almost fixed. Therefore, the yaw angle of the head 4 can be estimated from the ellipse calculated on the basis of the distances to the head 4.
However, as shown in Patent Literatures 2 and 3, in an environment in which the positional relation between the sensor and the target (body) changes in various ways, the way the distance measurement data are obtained depends on that positional relation, and the approximate curve may not fit the target well. In such a case, the orientation of the target cannot be detected.
The present invention has been made to solve the above-mentioned problems, and an object thereof is to provide a body orientation estimation device and a body orientation estimation program capable of detecting the orientation of a target even in an environment in which the positional relation between a sensor and the target changes in various ways.
To achieve the object, the body orientation estimation device according to the present invention includes: a distance measurement unit for measuring a distance to a target; a quadratic function calculation unit for calculating, based on a plurality of coordinate systems, quadratic functions approximating a plurality of distance measurement data measured by the distance measurement unit; a quadratic function selection unit for selecting, among the quadratic functions calculated by the quadratic function calculation unit, the quadratic function having the smallest approximation error to the plurality of distance measurement data; and an orientation estimation unit for estimating an orientation of the target based on the quadratic function selected by the quadratic function selection unit.
A body orientation estimation program according to the present invention causes a computer to execute: a distance measurement obtainment function for obtaining distance measurement data obtained by measuring a target; a quadratic function calculation function for calculating, based on a plurality of coordinate systems, quadratic functions approximating a plurality of the distance measurement data obtained by the distance measurement obtainment function; a quadratic function selection function for selecting, among the plurality of quadratic functions calculated by the quadratic function calculation function, the quadratic function having the smallest approximation error to the plurality of the distance measurement data; and an orientation estimation function for estimating an orientation of the target based on the quadratic function selected by the quadratic function selection function.
According to the body orientation estimation device and the body orientation estimation program of the present invention, the quadratic functions approximating a plurality of distance measurement data obtained by distance measurement of the target are calculated based on a plurality of coordinate systems. Among the plurality of calculated quadratic functions, the quadratic function having the smallest approximation error to the plurality of the distance measurement data is selected, and the orientation of the target is estimated based on the selected quadratic function. That is, in a case where the quadratic function calculated based on one coordinate system cannot fit the target well, the orientation of the target is estimated based on the quadratic function calculated based on another coordinate system. Therefore, even in a circumstance where the positional relation between the distance measurement unit (sensor) and the target changes in various ways, there is an effect that the orientation of the target can be detected.
Hereinafter, preferred embodiments of the present invention will be described with reference to the accompanying drawings. First, with reference to
As shown in
The wheels 17 are provided at both left and right ends of the bottom part of the outer case 2. A motor (not shown) is connected to each of the left and right wheels 17, and the moving body 1 is moved by driving the motors based on a control signal from a drive unit 18 (see
Next, with reference to
The CPU 11 is an arithmetic device that controls the respective sections mutually connected by the bus line 14. The flash ROM 12 is a non-volatile rewritable memory device that stores programs executed by the CPU 11 and fixed-value data, and a control program 12a is stored therein. Upon execution of the control program 12a by the CPU 11, a main processing shown in
The RAM 13 is a memory for rewritably storing various work data, flags, and the like during execution of the control program 12a by the CPU 11, and includes a distance measurement data memory 13a in which distance measurement data MP measured by the distance measurement sensor 16 are stored, an ellipse position orientation memory 13b, a quadratic function position orientation memory 13c, a Kalman filter predicted position orientation memory 13d (hereinafter abbreviated as "KF predicted position orientation memory 13d"), a correlation result position orientation memory 13e, an estimated current position orientation memory 13f, and a previous value memory 13g.
The ellipse position orientation memory 13b is a memory for storing position and orientation of the user H estimated from approximate ellipse C (see
The KF predicted position orientation memory 13d is a memory for storing the current position and orientation of the user H predicted by the Kalman filter prediction step (Formula 9) from the value of the previous value memory 13g described later. The correlation result position orientation memory 13e is a memory for storing the position and orientation estimated to be most likely those of the user H among the plurality of positions and orientations of the user H stored in the ellipse position orientation memory 13b and the quadratic function position orientation memory 13c.
The estimated current position orientation memory 13f is a memory for storing position and orientation of the user H estimated by the Kalman filter based on the value of the correlation result position orientation memory 13e. The previous value memory 13g is a memory for storing the value of the estimated current position orientation memory 13f and the estimated velocity and angular velocity of the user H.
The drive unit 18 is a device to move and operate the moving body 1, and is constituted from the wheels 17 (see
Next, with reference to
Referring back to
Therefore, in the processing of S3 shown in
In the present embodiment, "0.6 m" is exemplified as the shoulder width W1 and "0.3 m" as the thickness W2. That is, in a case where double the length of the long axis a of the approximate ellipse C lies within ±10% of the shoulder width W1 of the user H and double the length of the short axis b lies within ±10% of the thickness W2 of the user H, the approximate ellipse C is determined to be close to the size of the user H. In a case where the approximate ellipse C is thus determined to be close to the body size of the user H, the center CP and the orientations DC1, DC2 of the approximate ellipse C are stored in the ellipse position orientation memory 13b.
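The size check described above can be sketched as follows. This is a minimal illustration, not the claimed implementation; the function name and the use of half-axis inputs are choices made here, while the example values (W1 = 0.6 m, W2 = 0.3 m, ±10% tolerance) come from the text.

```python
# Sketch of the body-size check for the approximate ellipse C.
# Assumes a and b are the half-lengths of the long and short axes,
# so 2a and 2b are compared with the shoulder width W1 and thickness W2.

SHOULDER_WIDTH_W1 = 0.6   # m, exemplified value from the embodiment
THICKNESS_W2 = 0.3        # m, exemplified value from the embodiment
TOLERANCE = 0.10          # +/-10%

def ellipse_matches_body(a: float, b: float) -> bool:
    """Return True if the approximate ellipse C is close to the body size."""
    width_ok = abs(2 * a - SHOULDER_WIDTH_W1) <= TOLERANCE * SHOULDER_WIDTH_W1
    thickness_ok = abs(2 * b - THICKNESS_W2) <= TOLERANCE * THICKNESS_W2
    return width_ok and thickness_ok
```

Only when this check passes are the center CP and the orientations DC1, DC2 stored.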
Referring back to
y = a1x^2 + b1x + c1 (Formula 2)

x = a2y^2 + b2y + c2 (Formula 3)
In Formula 2, a1, b1 and c1 are coefficients, determined by applying the least squares method to Formula 2 and the distance measurement data MP. In Formula 3, a2, b2 and c2 are likewise coefficients, determined by applying the least squares method to Formula 3 and the distance measurement data MP. That is, in the present embodiment, the quadratic function Qxy has a shape upright along the x-axis, and the quadratic function Qyx has a shape upright along the y-axis.
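The two least-squares fits and the later selection by approximation error can be sketched as below. This is an assumption-laden illustration, not the claimed implementation: the residuals of Qxy and Qyx are measured in different variables (y versus x), and directly comparing them is one plausible reading of "smallest approximation error".

```python
import numpy as np

def fit_quadratics(points):
    """Fit Qxy: y = a1*x^2 + b1*x + c1 and Qyx: x = a2*y^2 + b2*y + c2
    to the distance measurement data MP (an (N, 2) array of x, y points)
    and return the fit with the smaller sum of squared residuals."""
    x, y = points[:, 0], points[:, 1]
    coef_xy = np.polyfit(x, y, 2)   # highest-degree coefficient first
    coef_yx = np.polyfit(y, x, 2)
    err_xy = np.sum((np.polyval(coef_xy, x) - y) ** 2)
    err_yx = np.sum((np.polyval(coef_yx, y) - x) ** 2)
    if err_xy <= err_yx:
        return "Qxy", coef_xy, err_xy
    return "Qyx", coef_yx, err_yx
```

For data lying on an upright parabola, Qxy fits almost exactly while Qyx does not, so Qxy is selected; for data on a sideways parabola, the roles reverse.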
Since the distance measurement sensor 16 obtains the distance to the user H by irradiating laser light in all directions, only a part of the body of the user H is obtained as the distance measurement data MP of the user H. That is, the distance measurement data MP obtained from the distance measurement sensor 16 are biased in their distribution. Further, as shown in
That is, since the quadratic function Qxy and the quadratic function Qyx have different shapes, depending on the distribution of the distance measurement data MP, some distributions fit the quadratic function Qxy and others fit the quadratic function Qyx. In the present embodiment, by calculating the two differently shaped quadratic functions Qxy and Qyx from the distance measurement data MP, candidates of the quadratic function Q that are more likely to fit the distribution of the distance measurement data MP can be provided.
Referring back to
According to the selected quadratic function Q, the position and orientation of the user H are calculated. In the present embodiment, the position and orientation of the user H are calculated according to the positional relation between the vertex T of the quadratic function Q and the distance measurement data MP. A calculation method for the position and orientation of the user H by the quadratic function Q will be described with reference to
In
In the present embodiment, in a case where the vertex T of the quadratic function Q is located outside the cluster edges E1 and E2, the position and orientation of the user H are calculated based on the cluster edges E1 and E2, which are reliably measured. Specifically, the midpoint Ec between the cluster edges E1 and E2 is defined as the position of the user H, and the orientations DQ1, DQ2 perpendicular to the straight line connecting the cluster edges E1 and E2 are defined as the orientation of the user H. With the coordinates of the cluster edge E1 in the x-axis and y-axis directions defined as XE1 and YE1, and those of the cluster edge E2 defined as XE2 and YE2, the orientation DQ1 of the user H is calculated by Formula 4 and the orientation DQ2 of the user H by Formula 5.
Thus, even in a case where only a part (front or back) of the body of the user H is measured as the distance measurement data MP, the position and orientation of the user H are calculated based on the cluster edges E1 and E2, which are reliably measured, so that the error in the position and orientation of the user H can be minimized. Two orientations DQ1 and DQ2 are calculated because, as with the approximate ellipse C, the quadratic function Q cannot distinguish which direction (front or back) the user H faces. Which of the orientations DQ1, DQ2 is finally adopted as the orientation of the user H is determined in the processing of S7 described later.
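Formulas 4 and 5 are not reproduced in this excerpt; the sketch below assumes they give the two directions normal to the segment joining the cluster edges E1 and E2, which matches the description "perpendicular to the straight line connecting the cluster edges". The function name and the atan2 form are illustrative choices.

```python
import math

def orientations_from_cluster_edges(xe1, ye1, xe2, ye2):
    """Return the position (midpoint Ec) and the two candidate
    orientations DQ1, DQ2 in radians, taken perpendicular to the line
    joining the cluster edges E1 (XE1, YE1) and E2 (XE2, YE2)."""
    xc, yc = (xe1 + xe2) / 2.0, (ye1 + ye2) / 2.0   # midpoint Ec
    line_angle = math.atan2(ye2 - ye1, xe2 - xe1)    # direction of E1 -> E2
    dq1 = line_angle + math.pi / 2                   # one normal direction
    dq2 = line_angle - math.pi / 2                   # opposite direction
    return (xc, yc), dq1, dq2
```

The front/back ambiguity appears here as the two opposite normals DQ1 and DQ2; the choice between them is deferred, as in the text, to the correlation processing.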
Next, with reference to
In a case where the vertex T exists between the cluster edges E1 and E2, first, a distance relation between the cluster edge E2 and the vertex T is determined. Specifically, as shown in
That is, in the present embodiment, in a case where the distance between the midpoint Ec and the cluster edge E2 is larger than 0.9 times the distance between the midpoint Ec and the vertex T, it is determined that the cluster edge E2 and the vertex T are distant from each other.
First, with reference to
Thus, in the present embodiment, in a case where the cluster edge E2 and the vertex T are distant from each other, the position and orientation of the user H are calculated based on the cluster edges E1 and E2, which are estimated as the positions of both shoulders. Specifically, the midpoint Ec is defined as the position of the user H, and the orientations DQ1, DQ2 perpendicular to the line connecting the cluster edges E1 and E2 are defined as the orientations of the user H; that is, the orientations of the user H are calculated from Formula 4 and Formula 5. As a result, since the position and orientation of the user H are calculated based on the cluster edges E1 and E2 estimated as the positions of both shoulders, they can be determined with greater accuracy.
Next, the calculation method of the position and orientation of the user H in a case where the cluster edge E2 and the vertex T are close, that is, where the above Formula 6 is not satisfied, will be described with reference to
Since the cluster edge E2 is adjacent to the outside of the vertex T, the cluster edge E2 is located "behind" the vertex T when viewed from the distance measurement sensor 16 (moving body 1). That is, the vertex T becomes an obstruction (barrier) to the laser light from the distance measurement sensor 16, so the laser light does not reach the cluster edge E2 and the distance to it cannot be measured. As a result, since the whole of the corresponding left or right shoulder cannot be measured, the cluster edge E2 may not be accurately measured.
In the present embodiment, the orientations DQ1, DQ2 of the user H are calculated based on the vertex T in addition to the cluster edges E1 and E2. The vertex T is located within the range where the distance measurement data MP are accurately obtained; that is, the vertex T is the position regarded as a left or right shoulder.
Specifically, in a case where the coordinate in the x-axis direction of the midpoint Ec is defined as Xc, the coordinate in the y-axis direction of the midpoint Ec is defined as Yc, the coordinate in the x-axis direction of the vertex T is defined as XT and the coordinate in the y-axis direction of the vertex T is defined as YT, the orientation DQ1 of the user H is calculated by the following Formula 7 and the orientation DQ2 of the user H is calculated by the following Formula 8.
That is, in a case where the cluster edge E2 and the vertex T are close, the orientations DQ1, DQ2 are calculated as the direction of a line perpendicular to the line connecting the midpoint Ec and the vertex T. The orientations DQ1, DQ2 thus take into account both the vertex T, which is located in a range where the distance measurement data MP are accurately measured and is determined as the position of the left or right shoulder on the approximated quadratic function Q, and the cluster edge E2, which is actually measured by the distance measurement sensor 16 but may not be measured accurately because of its proximity to the vertex T. As a result, the error in the orientations DQ1, DQ2 can be reduced. The orientations DQ1, DQ2 calculated in this manner are stored as the orientation of the user H, and the midpoint Ec is stored as the position of the user H, in the quadratic function position orientation memory 13c.
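Formulas 7 and 8 are likewise not reproduced in this excerpt; the sketch below assumes they give the two directions normal to the segment joining the midpoint Ec (Xc, Yc) and the vertex T (XT, YT), matching the description above. The atan2 form is an illustrative choice.

```python
import math

def orientations_from_midpoint_and_vertex(xc, yc, xt, yt):
    """When the cluster edge E2 is close to the vertex T, return the
    candidate orientations DQ1, DQ2 (radians), taken perpendicular to
    the line joining the midpoint Ec (xc, yc) and the vertex T (xt, yt)."""
    line_angle = math.atan2(yt - yc, xt - xc)   # direction of Ec -> T
    return line_angle + math.pi / 2, line_angle - math.pi / 2
```

Compared with the distant case, only the reference segment changes: Ec–T instead of E1–E2.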
Referring back to
Assume that, in the previous value memory 13g, the position of the user H in the x-axis direction is Px,k-1, the position in the y-axis direction is Py,k-1, the orientation is Pθ,k-1, the velocity in the x-axis direction is Vx,k-1, the velocity in the y-axis direction is Vy,k-1, and the angular velocity is Vθ,k-1; that the elapsed time from the previous processing of S9 or S11 is Δt; and that the system noise components of the user H due to acceleration disturbance and the like are nxs, nys, nθs. Then the predicted position of the user H in the x-axis direction Pxp,k, the position in the y-axis direction Pyp,k, the orientation Pθp,k, the velocity in the x-axis direction Vxp,k, the velocity in the y-axis direction Vyp,k, and the angular velocity Vθp,k are calculated by Formula 9, which is publicly known as the Kalman filter prediction step.
The position of the user H in the x-axis direction Pxp,k, the position of the user H in the y-axis direction Pyp,k, the orientation of the user H Pθp,k, the velocity of the user H in the x-axis direction Vxp,k, the velocity of the user H in the y-axis direction Vyp,k, the angular velocity of the user H Vθp,k, which are predicted by Formula 9, are stored in the KF predicted position orientation memory 13d.
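Formula 9 itself is not reproduced in this excerpt. The sketch below assumes the standard constant-velocity (and constant-angular-velocity) prediction model that the description implies, with the system noise terms nxs, nys, nθs omitted; in a full Kalman filter they would enter through the process-noise covariance rather than the state transition.

```python
import numpy as np

# State vector: [Px, Py, Ptheta, Vx, Vy, Vtheta].
def predict_state(prev, dt):
    """Kalman filter prediction step under a constant-velocity model:
    positions and orientation advance by velocity * dt; velocities
    and angular velocity carry over unchanged."""
    F = np.eye(6)
    F[0, 3] = F[1, 4] = F[2, 5] = dt   # position += velocity * dt
    return F @ prev
```

The predicted state corresponds to the values stored in the KF predicted position orientation memory 13d.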
After the processing of S6, a correlation processing is conducted based on the position and orientation of the user H stored in the ellipse position orientation memory 13b, the quadratic function position orientation memory 13c, and the KF predicted position orientation memory 13d, and the corresponding position and orientation of the user H are stored in the correlation result position orientation memory 13e (S7).
Specifically, first, the positions and orientations of the user H stored in the ellipse position orientation memory 13b and the quadratic function position orientation memory 13c are narrowed down by a "gate" based on the previous value stored in the KF predicted position orientation memory 13d. This eliminates those positions and orientations in the ellipse position orientation memory 13b and the quadratic function position orientation memory 13c that differ significantly from the position and orientation predicted from the previous values.
Further, among the positions and orientations of the user H narrowed down by the gate, those estimated to be the most likely position and orientation of the user H are selected by the nearest neighbor algorithm and stored in the correlation result position orientation memory 13e. When at least one of the position and orientation of the user H is not selected by the narrowing-down by the gate and the nearest neighbor algorithm, a value indicating that the position and orientation of the user H are not selected is stored in the correlation result position orientation memory 13e. Since the gate and the nearest neighbor algorithm are well-known techniques, a detailed explanation thereof will be omitted.
Among the positions and orientations of the user H calculated from the approximate ellipse C and the quadratic function Q, those estimated to be most likely of the user H are selected by the narrowing-down by the gate and the nearest neighbor algorithm.
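The gate and nearest neighbor selection can be sketched as follows. This is an illustrative reading, not the claimed implementation: the gate is shown here as a simple distance threshold on position only, and the gate radius is an assumed tuning parameter (a full implementation would also gate on orientation and typically use a statistical, covariance-based gate).

```python
import math

def select_by_gate_and_nearest_neighbor(candidates, predicted, gate_radius):
    """candidates: list of (x, y) positions from the ellipse C and the
    quadratic function Q estimates; predicted: (x, y) from the Kalman
    prediction step. Candidates outside the gate are discarded; among
    the rest, the nearest to the prediction is returned, or None when
    no candidate survives (the S8: No branch of the flow)."""
    px, py = predicted
    scored = [(math.hypot(x - px, y - py), (x, y)) for x, y in candidates]
    inside = [(d, c) for d, c in scored if d <= gate_radius]
    if not inside:
        return None   # nothing correlated; fall back to predicted value
    return min(inside)[1]
```

A None result corresponds to storing the "not selected" value in the correlation result position orientation memory 13e.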
After the processing of S7, it is checked whether the position and orientation of the user H are stored in the correlation result position orientation memory 13e (S8). If they are stored (S8: Yes), the current position and orientation of the user H are estimated from the value of the correlation result position orientation memory 13e by the known Kalman filter and stored in the estimated current position orientation memory 13f (S9). Specifically, for the values stored in the correlation result position orientation memory 13e, assume that the position of the user H in the x-axis direction is Pxc, the position in the y-axis direction is Pyc, the orientation is Pθc, the velocity in the x-axis direction is Vxc, the velocity in the y-axis direction is Vyc, and the angular velocity is Vθc, and that the observation error components due to a detection error of the distance measurement sensor 16, movement of the user H, and the like are nxp, nyp, nθp. Then the position of the user H in the x-axis direction Px,k, the position in the y-axis direction Py,k, and the orientation Pθ,k are calculated by Formula 10.
The velocity Vxc of the user H in the x-axis direction is calculated based on the position of the user H in the x-axis direction stored in the previous value memory 13g and the position of the user H in the x-axis direction stored in the correlation result position orientation memory 13e. Similarly, the velocity Vyc of the user H in the y-axis direction is calculated based on the position of the user H in the y-axis direction of the previous value memory 13g and the position of the user H in the y-axis direction of the correlation result position orientation memory 13e, and the angular velocity Vθc of the user H is calculated based on the orientation of the user H of the previous value memory 13g and the orientation of the user H of the correlation result position orientation memory 13e.
Since the position and orientation of the user H stored in the correlation result position orientation memory 13e in the processing of S7 include observation error components, i.e., errors that can actually occur, such as a detection error of the distance measurement sensor and movement of the user H, a position and orientation whose error with respect to the actually observed position and orientation of the user H is small can be estimated.
On the other hand, if the position and orientation of the user H are not stored in the correlation result position orientation memory 13e (S8: No), that is, if the value indicating that the position and orientation of the user H were not selected in the processing of S7 is stored, the value of the KF predicted position orientation memory 13d is stored in the estimated current position orientation memory 13f (S10). That is, even when it is determined in the processing of S7 that the position and orientation of the user H calculated in the processing of S2 to S5 are not likely to be those of the user H, the current position and orientation of the user H based on the value of the previous value memory 13g, albeit predicted values, are stored in the estimated current position orientation memory 13f. As a result, a situation in which the position and orientation of the user H are not updated can be prevented.
After the processing of S9 or S10, a control signal based on the position and orientation of the user H in the estimated current position orientation memory 13f is output to the drive unit 18 to operate the drive unit 18 (S11). As a result, the moving body 1 is moved and operated based on the estimated position and orientation of the user H.
After the processing of S11, the position and orientation of the user H stored in the estimated current position orientation memory 13f, together with the velocity and angular velocity of the user H, are stored in the previous value memory 13g (S12). Specifically, the velocity and angular velocity of the user H are first calculated based on the position and orientation of the user H in the estimated current position orientation memory 13f and the position and orientation of the user H stored in the previous value memory 13g, and then these values are stored in the previous value memory 13g together with the position and orientation of the user H in the estimated current position orientation memory 13f. After the processing of S12, the processing from S1 onward is repeated.
As described above, according to the moving body 1 of the present embodiment, based on the distance measurement data MP obtained by the distance measurement sensor 16, the approximate ellipse C, the quadratic function Qxy in the xy coordinate system, and the quadratic function Qyx in the yx coordinate system are calculated, and the position and orientation of the user H are calculated for each. Among the calculated positions and orientations, the position and orientation estimated to be most likely those of the user H are selected by the gate and the nearest neighbor algorithm, and the current position and orientation of the user H are estimated from the selected values.
The positional relation between the moving body 1 and the user H changes from moment to moment. Thus, a plurality of shapes such as the approximate ellipse C, and approximate curves in a plurality of coordinate systems such as the quadratic functions Qxy, Qyx, are calculated from the distance measurement data MP; if the approximate curve based on one shape or coordinate system does not fit the user H well, the position and orientation of the user H are estimated based on a better-fitting approximate curve from another shape or coordinate system. Thereby, even if the positional relation between the distance measurement sensor 16 and the user H changes in various ways, the position and orientation of the user H can be detected.
Although the present invention has been described based on embodiments, the present invention is not limited to the above-described embodiments in any way, and it can be easily understood that various improvements and modifications are possible within the spirit of the present invention.
In the above embodiments, the case where two types of quadratic function, the quadratic function Qxy in the xy coordinate system and the quadratic function Qyx in the yx coordinate system, are calculated from the distance measurement data MP has been described. However, the present invention is not necessarily limited thereto. A quadratic function in the xz coordinate system or the yz coordinate system may be calculated, or the quadratic function Q may be calculated in any other coordinate system.
In the above embodiments, the case where the distance measurement data MP are approximated by a quadratic function has been described. However, the present invention is not necessarily limited thereto. The distance measurement data MP may be approximated by a polynomial function of degree 3 or more, such as a cubic or quartic function.
In the above embodiments, the case where the quadratic function Qxy has a shape upright along the x-axis and the axis line of the quadratic function Qyx has a shape upright along the y-axis has been described. However, the present invention is not necessarily limited thereto. The quadratic functions Qxy, Qyx may have shapes tilted with respect to the x-axis or the y-axis.
In the above embodiments, the case where the body around the shoulders is measured by the distance measurement sensor 16 has been described. However, the present invention is not necessarily limited thereto. As long as the body of the user H is measured, other parts, such as around the abdomen or waist, may be measured.
In the above embodiments, the case where the position and orientation of the user H are estimated based on both the approximate ellipse C and the quadratic function Q has been described. However, the present invention is not necessarily limited thereto. The position and orientation of the user H may be estimated based on only the quadratic function Q, or based on only the approximate ellipse C.
In the above embodiment, the case where the center CP and the orientations DC1, DC2 of the approximate ellipse C are stored in the ellipse position orientation memory 13b when the long axis a and the short axis b of the approximate ellipse C approximate the shoulder width W1 and the thickness W2 of the user H has been described. However, the present invention is not necessarily limited thereto. The center CP and the orientations DC1, DC2 of the approximate ellipse C may be stored in the ellipse position orientation memory 13b when the circumference of the approximate ellipse C approximates the perimeter around the shoulders of the user H, or when the area of the approximate ellipse C approximates the cross-sectional area around the shoulders of the user H.
In the above embodiments, the case where the cluster edges E1, E2 on the quadratic function Q are defined as the distance measurement data MP at both ends of the distance measurement data MP has been described. However, the present invention is not necessarily limited thereto. Among the points on the quadratic function Q, the point closest to the distance measurement data MP at one end may be defined as the cluster edge E1, and the point closest to the distance measurement data MP at the other end may be defined as the cluster edge E2.
In the above embodiment, the case where the position of the user H for the approximate ellipse C is defined as the center CP of the approximate ellipse and the position of the user H for the quadratic functions Qxy, Qyx is defined as the midpoint Ec between the cluster edges E1 and E2 has been described. However, the present invention is not necessarily limited thereto. The position of the user H, for either the approximate ellipse C or the quadratic function Q, may be defined as the center CP of the approximate ellipse C or as the midpoint Ec between the cluster edges E1 and E2. A position of the user H calculated by another method, such as the least squares method, may also be applied.
In the above embodiment, the case where the orientations DQ1, DQ2 are calculated as the direction of a line perpendicular to the line connecting the midpoint Ec and the vertex T, when the vertex T exists between the cluster edges E1 and E2 and the cluster edge E2 is close to the vertex T, has been described. However, the present invention is not necessarily limited thereto. In this case, the orientations DQ1, DQ2 may be calculated as the direction of a line perpendicular to the line connecting the cluster edges E1 and E2, as in the case where the cluster edge E2 and the vertex T are distant.
The numerical values listed in the above embodiments are merely examples, and as a matter of course, other numerical values may be adopted.
Number | Date | Country | Kind |
---|---|---|---|
2018-057835 | Mar 2018 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2019/012299 | 3/25/2019 | WO | 00 |