The present application claims priority from Japanese application JP2021-050639 filed on Mar. 24, 2021, the contents of which are hereby incorporated by reference into this application.
The present disclosure relates to a posture recognition system, a posture recognition method, and a recording medium.
A posture recognition system that recognizes a user's posture has drawn attention in recent years. Such a system is used, for example, to identify physical burdens on workers engaged in industrial or agricultural activities, to ensure the safety of the workers, or to improve work efficiency, and can send an instruction corresponding to a worker's posture to that worker.
A motion capture system that photographs the user with photographing equipment such as a camera and a close-contact type wearable sensor system that keeps a sensor capable of detecting the user's motion, such as an acceleration sensor, in close contact with the user's body are known as posture recognition systems. With the motion capture system, however, the user must remain within the photographing range of the photographing equipment, which limits the user's actions. With the close-contact type wearable sensor system, the sensor itself limits the user's actions, among other problems.
In contrast, JP-2009-18158-A discloses a technique that attaches a plurality of sensors such as temperature sensors and a plurality of pieces of wireless equipment to clothing and recognizes the user's posture from signals obtained by the respective sensors. This technique makes it possible to reduce limitations on the user's action.
"Shinya Namikawa, Yu Enokibori, Kenji Mase, "A study of preventive measures against misalignment of clothing type sensor by arranging a plurality of sensors of the same type in proximity," Multimedia, Distributed, Cooperative, and Mobile (DICOMO2016) Symposium, July 2016, pp. 1183-1189" discloses a technique for a clothing type sensor system in which, to address the posture recognition error that occurs when a sensor shifts in position due to slipping of the clothing, a plurality of sensors of the same type are arranged in proximity and an optimal sensor is selected from among them.
The technique described in JP-2009-18158-A gives no consideration to errors that arise because the sensors are attached to clothing. The technique described in the Namikawa et al. paper cited above does consider the error that occurs when a sensor shifts in position due to slipping of the clothing, but it merely selects, from among the plurality of sensors arranged in proximity, the sensor that yields the smallest error for a specific motion. It has therefore been difficult to properly respond to an error whose cause and magnitude change according to the user's motion.
It is an object of the present disclosure to provide a posture recognition system, a posture recognition method, and a recording medium that can properly respond to an error that changes according to a user's motion.
A posture recognition system according to an aspect of the present disclosure includes a posture estimation section and an error estimation section. The posture estimation section calculates, on the basis of sensor data measured by a posture sensor attached to clothing, posture data indicating a posture of a user wearing the clothing. The error estimation section calculates error data, which is an estimate of an error that is occurring in the posture data, on the basis of the sensor data and the posture data.
According to the present disclosure, it is possible to properly respond to an error that changes according to a user's motion.
A description will be given below of embodiments of the present disclosure with reference to drawings. It should be noted, however, that the following embodiments are illustrative for description of the present disclosure and are not intended to limit the scope of the present disclosure to these embodiments. A person skilled in the art can carry out the present disclosure in various other manners without departing from the scope of the present disclosure.
In the configuration of the invention described below, the same portions or portions having the same function may be denoted by the same reference signs in common in different drawings for omission of redundant description. In the case where there are a plurality of elements having the same or similar functions, such elements may be described by adding different suffixes to the same reference sign. It should be noted, however, that in the case where there is no need to distinguish between the plurality of elements, such elements may be described by omitting suffixes.
The positions, sizes, shapes, regions, and the like of the components illustrated in the drawings may not represent their actual positions, sizes, shapes, regions, and the like, in order to facilitate understanding of the present disclosure. Accordingly, the present disclosure is not limited to the positions, sizes, shapes, and regions disclosed in the drawings. Even if a component is illustrated in singular form in the present specification, there may be a plurality of such components unless otherwise indicated by the context.
<1. Overall Configuration of the System>
The sensor section 110 is a wearable unit attached to clothing 111 of the user 210. The sensor section 110 may be, for example, pasted to the clothing 111 with adhesive tape or embedded in the clothing 111 in advance. The clothing 111 is, for example, common clothing, such as work clothing, that is not in close contact with the body of the user 210.
The sensor section 110 includes a posture sensor 112 and a communication section 113. The posture sensor 112 is a motion measurement sensor for measuring the motions of the user 210. The motions refer to motions in general including not only motions associated with industrial and agricultural activities but also motions for specific purposes such as dancing, gymnastics, and performance with musical instruments.
In the illustrated example, the posture sensors 112 are attached to the clothing 111 in such a manner as to be arranged on respective body parts of the user 210.
Only one type or a plurality of types of the posture sensors 112 may be used, and the type of the posture sensors 112 may be selected as appropriate according to the target posture to be estimated. For example, in the case where the posture of the user 210 is estimated directly, it is preferred to use sensors that can measure the positions or motions of the respective body parts of the user 210, such as acceleration sensors or position sensors. The type of the posture sensors 112 is not limited to this example, and a gyro sensor, a geomagnetic sensor, a myoelectric sensor, an acceleration sensor, a video sensor, or an audio sensor, for example, may also be used.
The sensor hub 220 sends the motion data 140 from the respective posture sensors 112 to the information processing apparatus 120 via the communication section 113.
It should be noted that the information processing apparatus 120 may be a single server or another computer system in which any of the input apparatus, the output apparatus, the processing apparatus, and the storage apparatus are connected to each other by a network. Functions equivalent to those realized by the above program may be realized by hardware such as an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit). A neural network, which will be described later, may be realized by an FPGA.
The information processing apparatus 120 includes a communication section 121, a posture estimation section 122, an error estimation section 123, an information generation section 124, and a control section 125 as functional components.
The communication section 121 communicates with external apparatuses such as the sensor section 110. For example, the communication section 121 receives the motion data 140 from the communication section 113 of the sensor section 110. The communication section 121 sends posture information 150 indicating the posture of the user 210 estimated on the basis of the motion data 140. Although destinations of the posture information 150 are not specifically limited, the posture information 150 is sent, for example, to the storage apparatus that stores the posture information 150, a display apparatus that displays the posture information 150, or an analysis apparatus that analyzes the posture information 150.
The posture estimation section 122 estimates the posture of the user 210 on the basis of the motion data 140 received by the communication section 121 and calculates posture data indicating the estimated posture. The error estimation section 123 estimates an error that is occurring in the posture data calculated by the posture estimation section 122 and calculates error data indicating the estimated error. The posture estimation section 122 and the error estimation section 123 can each be configured, for example, by using a neural network.
The information generation section 124 generates, on the basis of the posture data calculated by the posture estimation section 122 and the error data calculated by the error estimation section 123, additional information which is an evaluation of the error data. The control section 125 controls the information processing apparatus 120 as a whole.
<2. Posture Estimation Section>
In the present embodiment, the posture estimation section 122 calculates, as the posture data, a feature quantity representing a feature of the estimated posture. A feature quantity corresponds to the state of a movable part (joint) connecting body parts and is, for example, an elbow bending angle, a lower back bending angle, a head orientation, or a leg opening angle.
The type and arrangement of the posture sensors 112 are selected according to the target posture to be estimated. Unless otherwise specified, a description will be given below of an example in which the lower back bending angle is calculated as the feature quantity, i.e., the posture data, by using acceleration sensors as the posture sensors 112. In this case, the posture estimation section 122 includes a lower back state estimation section 301 that calculates the lower back bending angle.
The lower back state estimation section 301 calculates, for example, a rotation angle of the lower back in the fore-and-aft direction as the lower back bending angle, which is the feature quantity, on the basis of the motion data 140 from the four posture sensors 112 arranged on the shoulders, chest, and lower back of the user 210.
The lower back state estimation section 301 includes, for example, a learning model built by known supervised learning by using a deep neural network (hereinafter referred to as a DNN) that has the motion data 140 as an explanatory variable and the feature quantity as an objective variable. It should be noted that the lower back state estimation section 301 may calculate the feature quantity from the motion data 140 of the posture sensors 112 without using a DNN. For example, the lower back state estimation section 301 can estimate the lower back bending angle without a DNN by acquiring, in advance, the motion data 140 (acceleration) in a given posture (e.g., an upright posture) of the user as an initial state, or by acquiring the motion data 140 of another type of sensor, such as a position sensor, different from the acceleration sensor.
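By way of illustration, the following is a minimal sketch in Python of the non-DNN variant described above: the lower back bending angle is taken as the tilt of the gravity direction measured by an acceleration sensor near the lower back, relative to a reference measured once while the user is upright. The function name, the single-sensor simplification, and the example values are illustrative assumptions rather than part of the disclosure.

```python
import numpy as np

def lower_back_angle(accel_now, accel_upright):
    """Estimate the lower back bending angle (degrees) in the fore-and-aft
    direction from acceleration data, without using a DNN.

    accel_now     : (3,) gravity-dominated acceleration measured now
    accel_upright : (3,) acceleration recorded once in an upright posture
                    (the "initial state" calibration described above)
    """
    a = accel_now / np.linalg.norm(accel_now)
    b = accel_upright / np.linalg.norm(accel_upright)
    # Angle between the current gravity direction and the upright reference.
    cos_theta = np.clip(np.dot(a, b), -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

# Example: the sensor is tilted about 45 degrees forward from the reference.
print(lower_back_angle(np.array([0.0, 0.7, 0.7]), np.array([0.0, 0.0, 1.0])))
```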
It should be noted that in the case where a feature quantity other than the lower back bending angle is calculated, it is only necessary to add, to the posture estimation section 122, a functional section that calculates that feature quantity, such as an elbow rotation angle estimation section or a head orientation estimation section.
<3. Error Estimation Section>
Normally, the posture sensors 112 are attached to the clothing 111 such that their positions and orientations with respect to the body of the user 210 are as expected in advance.
For example, in the case where there is a slight slack in the clothing 111 on the abdomen, the position or orientation of the posture sensor 112 attached there may deviate from the position or orientation expected in advance.
In the case where the posture sensors 112 are attached to the clothing 111 as described above, an error may occur in the posture data calculated by the posture estimation section 122 due to deformation of the clothing 111. The error estimation section 123 calculates error data, which is an estimate of the error that is occurring in the posture data, on the basis of the motion data 140 and the posture data. It should be noted that, since the error changes over time, the error estimation section 123 acquires the error as chronological data. The acquired chronological data may be stored, for example, in the storage apparatus within the information processing apparatus 120 or in an external storage apparatus.
The error estimation section 123 includes, for example, a learning model built by known supervised learning by using a DNN that has the motion data 140 and posture data as explanatory variables and an error as an objective variable. A learning model can be built, for example, by causing the user 210 wearing the clothing 111 to which the posture sensors 112 are attached to perform various tasks, acquiring an error between a user's actual posture and an estimated posture (posture data), and using known supervised learning in which the motion data 140, posture data, and error are used as training data.
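One possible realization of such a learning model is sketched below in PyTorch. The network architecture, layer sizes, input dimensions (four tri-axial posture sensors plus one feature quantity), variable names, and the random stand-in training batch are all assumptions made for illustration; they are not specified by the disclosure.

```python
import torch
from torch import nn

N_SENSORS, POSTURE_DIM = 4, 1  # assumed: four tri-axial sensors, one feature quantity

class ErrorEstimator(nn.Module):
    """DNN regressor: (motion data, posture data) -> estimated error."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_SENSORS * 3 + POSTURE_DIM, 64),
            nn.ReLU(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, 1),  # estimated error in the feature quantity
        )

    def forward(self, motion, posture):
        return self.net(torch.cat([motion, posture], dim=-1))

# One supervised training step; labels are measured errors between the
# actual posture and the estimated posture (posture data).
model, loss_fn = ErrorEstimator(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
motion = torch.randn(8, N_SENSORS * 3)   # stand-in batch of motion data 140
posture = torch.randn(8, POSTURE_DIM)    # stand-in estimated feature quantities
error_label = torch.randn(8, 1)          # stand-in measured errors
opt.zero_grad()
loss = loss_fn(model(motion, posture), error_label)
loss.backward()
opt.step()
```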
It should be noted that the motion data 140 used as training data can be selected as appropriate according to the target's posture to be estimated and the body part. For example, in the case where the lower back bending angle is estimated, the motion data 140 used as training data may be motion data from the posture sensors 112 arranged on the abdomen and chest or data obtained by adding motion data from the posture sensors 112 arranged on the shoulders and upper arms to the motion data from the posture sensors arranged on the abdomen and chest. In the case where the lower back bending angle is estimated, posture data used as training data may be the lower back bending angle or data obtained by adding feature quantities associated with the upper arms to the lower back bending angle.
As a means of acquiring the user's actual posture, the actual posture may be determined visually from a video image acquired with imaging equipment such as a camera. Alternatively, a known motion capture system or a close-contact type wearable sensor system may be used.
The error estimation section 123 may estimate an error by classifying the error into a plurality of classes.
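For instance, if the error is classified rather than regressed, the output layer of the sketch above can be replaced by a classification head trained with cross-entropy loss. The class labels and tensor shapes below are illustrative assumptions.

```python
import torch
from torch import nn

ERROR_CLASSES = ["small", "medium", "large"]  # assumed class definitions

# Classification head: one logit per error class, trained with cross-entropy
# loss instead of mean squared error.
head = nn.Linear(32, len(ERROR_CLASSES))
loss_fn = nn.CrossEntropyLoss()

hidden = torch.randn(8, 32)                        # stand-in hidden features
labels = torch.randint(len(ERROR_CLASSES), (8,))   # stand-in class labels
loss = loss_fn(head(hidden), labels)
predicted = [ERROR_CLASSES[int(i)] for i in torch.argmax(head(hidden), dim=1)]
```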
<4. Information Generation Section>
The information generation section 124 generates additional information, which is an evaluation of the error data, on the basis of the posture data calculated by the posture estimation section 122 and the error data calculated by the error estimation section 123, and generates the posture information 150 that includes the posture data, the error data, and the additional information.
The additional information is, for example, evaluation information indicating the availability or reliability of the estimated posture, or notification information corresponding to an error. The evaluation information is, specifically, information indicating the extent to which the estimated posture is reliable, and may be calculated by using the error itself or a statistical value such as the time average of the error. The notification information is, for example, information for notifying the user 210 or an administrator of the posture recognition system 1 that the error is large in the case where the error or its statistical value is equal to or larger than a threshold. The notification information may be output, for example, as an alert tone from an information processing apparatus held by the user 210 or the administrator of the posture recognition system 1.
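As one concrete illustration of how such additional information might be derived, the sketch below computes a time-averaged error, a simple reliability score, and an alert flag. The threshold, the reliability scale, and the dictionary keys are assumptions made for illustration and are not values given in the disclosure.

```python
from statistics import mean

def make_additional_info(errors_deg, threshold_deg=10.0):
    """Derive evaluation/notification information from chronological error data.

    errors_deg    : recent estimated errors (degrees) for a feature quantity
    threshold_deg : assumed alert threshold
    """
    avg = mean(errors_deg)  # statistical value (time average) of the error
    return {
        "mean_error_deg": avg,
        "reliability": max(0.0, 1.0 - avg / threshold_deg),  # assumed scoring rule
        "alert": avg >= threshold_deg,  # notification condition
    }

print(make_additional_info([2.0, 3.5, 4.0]))
```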
<5. Operation>
First, each of the posture sensors 112 of the sensor section 110 regularly or continuously measures the motion data 140 and outputs the motion data 140 to the sensor hub 220. The sensor hub 220 sends the motion data 140 sent from each of the posture sensors 112 to the information processing apparatus 120 via the communication section 113 (step S101).
When the communication section 121 of the information processing apparatus 120 receives the motion data 140, the posture estimation section 122 estimates the posture of the user 210 on the basis of the motion data 140 and calculates a feature quantity of the estimated posture as posture data indicating the estimated posture (step S102).
The error estimation section 123 estimates an error that is occurring in the feature quantity calculated by the posture estimation section 122 on the basis of the motion data 140 and the feature quantity and calculates error data that indicates the estimated error (step S103).
The information generation section 124 generates additional information which is an evaluation of the error data on the basis of the feature quantity and the error data (step S104).
The information generation section 124 generates the posture information 150 that includes the feature quantity, the error data, and the additional information. The communication section 121 sends the posture information 150 (step S105), and the process is terminated.
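The flow of steps S102 to S105 can be summarized by the following sketch; the callable interfaces and the dictionary layout of the posture information 150 are illustrative assumptions.

```python
def process_motion_data(motion_data, posture_estimator, error_estimator, info_generator):
    """Process one batch of received motion data 140 (steps S102 to S105).

    The three callables stand in for the posture estimation section 122,
    the error estimation section 123, and the information generation
    section 124.
    """
    posture_data = posture_estimator(motion_data)                 # S102
    error_data = error_estimator(motion_data, posture_data)       # S103
    additional_info = info_generator(posture_data, error_data)    # S104
    return {                                                      # S105: posture information 150
        "posture": posture_data,
        "error": error_data,
        "additional": additional_info,
    }
```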
<6. Advantageous Effect of the Embodiments>
According to the embodiment described above, the posture estimation section 122 calculates posture data indicating the posture of the user 210 wearing the clothing 111 on the basis of the motion data 140 measured by the posture sensors 112 attached to the clothing 111. The error estimation section 123 calculates error data, which is an estimate of an error that is occurring in the posture data, on the basis of the motion data 140 and the posture data. Accordingly, it becomes possible to recognize the cause and magnitude of the error that change according to the motion of the user 210 and that arise because the posture sensors 112 are attached to the clothing 111, which makes it possible to properly respond to the error. For example, an application program that performs a variety of processes on the basis of the posture data can perform those processes in consideration of the error, which makes the processes based on the posture data more accurate.
In the present embodiment, posture data and error data are chronological data. Accordingly, it becomes possible to find the change in error over time, which makes it possible to more properly respond to the error.
In the present embodiment, the information generation section 124 generates additional information which is an evaluation of error data. Accordingly, it becomes possible to issue notification about reliability of posture data, which makes it possible to properly respond to an error with ease.
In the present embodiment, a description will be given of an example of the posture recognition system 1 that can reduce an error by generating posture data and error data corresponding to each of the plurality of posture sensors 112 attached to different positions. The posture sensors 112 are specifically attached to the clothing 111 in such a manner as to be arranged on a plurality of different body parts of the user 210.
In the present embodiment, the posture estimation section 122 calculates posture data indicating the same posture of the user 210 for each posture sensor 112 on the basis of the motion data 140 measured by each of the plurality of posture sensors 112 (e.g., the posture sensors 112b and 112c). The error estimation section 123 then calculates error data for each posture sensor 112.
It should be noted that although the plurality of posture sensors 112 are attached to different positions in the present embodiment, the posture sensors 112 may be attached in different manners in addition to or in place of the positions where the posture sensors 112 are attached. The manners in which the posture sensors 112 are attached include, for example, pasting to the clothing 111 with adhesive tape and embedding in the clothing 111 in advance.
According to the embodiment described above, it is possible to confirm how the error data changes according to the position and manner in which the posture sensors 112 are attached. Accordingly, it is possible to find an attachment position and manner that provide a small error, which makes it possible to realize an arrangement of the posture sensors 112 with a smaller error.
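A minimal sketch of such a comparison follows; the attachment conditions and error values are purely illustrative.

```python
from statistics import mean

# Chronological error data per attachment position and manner (illustrative).
errors_by_attachment = {
    ("chest", "adhesive_tape"): [2.1, 2.6, 2.3],
    ("chest", "embedded"):      [1.4, 1.7, 1.5],
    ("abdomen", "embedded"):    [4.8, 5.1, 4.6],
}

# Attachment condition whose time-averaged error is smallest.
best = min(errors_by_attachment, key=lambda k: mean(errors_by_attachment[k]))
print(best)  # -> ("chest", "embedded")
```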
In the present embodiment, a description will be given of an example of the posture recognition system 1 that can reduce an error by generating posture data and error data corresponding to each of the plurality of posture sensors 112 attached to different pieces of clothing. The posture sensors 112 are specifically attached to corresponding positions of respective pieces of the clothing 111 of the same type and different sizes. We assume here that the sizes are S, M, and L in the order from small to large. It should be noted, however, that the posture sensors 112 may be attached to different types of clothing.
The posture estimation section 122 calculates, for each posture sensor 112 (i.e., for each piece of the clothing 111), posture data indicating the same posture of the user 210 on the basis of the motion data 140 measured by the posture sensor 112 attached to that piece of the clothing 111. Here, the same user 210 puts on the plurality of pieces of clothing 111 of different sizes in sequence and makes a given motion, and the posture estimation section 122 calculates the feature quantities in sequence by calculating the posture data corresponding to the respective pieces of the clothing 111 in sequence. It should be noted, however, that the posture estimation section 122 may calculate the posture data in parallel for the plurality of pieces of clothing 111 put on by different users 210. The given motion may be a single motion or a series of motions such as radio calisthenics.
The error estimation section 123 calculates, for each posture sensor 112, error data indicating an error that is occurring in that posture sensor 112. The information generation section 124 may generate additional information for each posture sensor 112 or may generate, as additional information, information obtained by comparing pieces of error data corresponding to the respective posture sensors 112 with each other.
According to the embodiment described above, it is possible to generate error data for each of the plurality of pieces of clothing 111. Accordingly, it becomes possible to select the piece of clothing 111 that suits the user 210 from among the plurality of pieces of clothing 111, for example, the piece of clothing 111 that provides the smallest error for the body shape of the user 210.
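For example, the comparison can be reduced to a ranking of the candidate sizes, which may also serve as the additional information described above; the sizes and error values below are illustrative assumptions.

```python
# Time-averaged error per clothing size for one user (illustrative values).
mean_error_by_size = {"S": 6.2, "M": 2.9, "L": 4.4}

# Additional information: sizes ranked by error, smallest first.
ranking = sorted(mean_error_by_size, key=mean_error_by_size.get)
print(f"recommended size: {ranking[0]}, ranking: {ranking}")
```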
In the present embodiment, a description will be given of an example of the posture recognition system 1 that can reduce an error by generating posture data and error data corresponding to each of the plurality of posture sensors 112 attached to different positions as in embodiment 2. It should be noted, however, that, in the present embodiment, an example will be described in which the posture sensors 112 are attached to the clothing 111 in such a manner as to be arranged on the same body part of the user 210.
In this example, the posture sensors 112d to 112f are attached to the clothing 111 in such a manner as to be arranged at different positions corresponding to a forearm of the user 210.
In the present embodiment, the posture estimation section 122 calculates posture data indicating the same posture of the user 210 for each posture sensor 112 on the basis of the motion data 140 measured by each of the plurality of posture sensors 112 (e.g., the posture sensors 112d to 112f). The error estimation section 123 likewise calculates error data for each posture sensor 112.
It should be noted that although the plurality of posture sensors 112 are attached to the positions corresponding to the forearm in the present embodiment, the body part to which the posture sensors are attached is not limited to the forearm, and the posture sensors may be attached to a body part that can be considered a single rigid body (e.g., upper arm, forearm, shin).
As described above, it is possible, in the present embodiment, to realize more accurate posture recognition such as recognizing in what manner the clothing 111 is deformed.
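One way to picture this is sketched below: feature quantities estimated from the sensors arranged on the same rigid body part are compared with one another, and a sensor whose value deviates beyond a tolerance is taken to indicate local deformation of the clothing at that sensor. The sensor labels, angle values, and tolerance are illustrative assumptions.

```python
from statistics import mean

def clothing_deformation(angles_per_sensor, tolerance_deg=5.0):
    """Flag sensors on one rigid body part whose estimated angle deviates
    from the group average by more than an assumed tolerance."""
    avg = mean(angles_per_sensor.values())
    return {sensor: angle - avg
            for sensor, angle in angles_per_sensor.items()
            if abs(angle - avg) > tolerance_deg}

# Sensors 112d to 112f should roughly agree if the forearm moves as one rigid body.
print(clothing_deformation({"112d": 31.0, "112e": 30.0, "112f": 44.0}))  # -> {"112f": 9.0}
```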
The posture recognition system 1 in each of the embodiments described above is applicable, for example, to a work support system or an education system. The work support system can be used, for example, for training of maintenance tasks. The education system can be used, for example, for practicing dancing or yoga poses.