This application is based upon and claims the benefit of priority from Japanese patent application No. 2020-138238, filed on Aug. 18, 2020, the disclosure of which is incorporated herein in its entirety by reference.
The present disclosure relates to a motion state monitoring system, a training support system, a method for controlling the motion state monitoring system, and a control program.
The motion detection apparatus disclosed in Japanese Unexamined Patent Application Publication No. 2020-81413 includes: a posture detection unit that detects, by using measurement data of a set of sensors (an acceleration sensor and an angular velocity sensor) attached to a body part of a body of a user (a subject), a posture of the body part; a time acquisition unit that acquires a time elapsed from when measurement of a motion is started; and a motion state detection unit that detects a motion state of the user by using the posture detected by the posture detection unit and the elapsed time acquired by the time acquisition unit.
However, the motion detection apparatus disclosed in Japanese Unexamined Patent Application Publication No. 2020-81413 has a problem in that, since the motion state of the user is detected using measurement data from only a single set of sensors attached to one body part of the body of the user (the subject), a more complicated motion state of the user cannot be effectively monitored.
The present disclosure has been made in view of the aforementioned circumstances and an object thereof is to provide a motion state monitoring system, a training support system, a method for controlling the motion state monitoring system, and a control program that are capable of effectively monitoring a complicated motion state of a subject by monitoring a motion state of the subject using a result of detection performed by one or a plurality of sensors that are selected from among a plurality of sensors based on a motion to be monitored.
A first exemplary aspect is a motion state monitoring system including: a selection unit configured to select one or a plurality of sensors from among a plurality of sensors associated with a plurality of respective body parts of a body of a subject based on one or a plurality of specified motions to be monitored; a calibration result determination unit configured to determine whether or not a calibration of each of at least the one or plurality of sensors selected by the selection unit has been completed; a calculation processing unit configured to generate a result of a calculation indicating a motion state of the subject based on a result of detection performed by each of the one or plurality of sensors selected by the selection unit when the calibration result determination unit determines that the calibration has been completed; and an output unit configured to output the result of the calculation performed by the calculation processing unit. By using a result of detection performed by one or a plurality of sensors selected from among a plurality of sensors associated with a plurality of respective body parts based on a motion to be monitored, this motion state monitoring system can output a more accurate result of a calculation indicating a motion state of the subject than when a result of detection performed by a set of sensors attached to one body part is used. As a result, a user can effectively monitor a complicated motion state of the subject. Further, this motion state monitoring system can output a more accurate result of a calculation by using a result of detection performed by the sensor for which a calibration has been completed. As a result, a user can more accurately monitor the motion state of the subject.
The calibration result determination unit is configured to determine that the calibration has been completed when an output value of each of at least the one or plurality of sensors selected by the selection unit falls within a predetermined range after a predetermined period of time has elapsed from when the calibration of each of at least the one or plurality of sensors is started.
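As a non-limiting illustration of the determination described above, the following minimal Python sketch checks whether a sensor's output value stays within a tolerance band once a settling period has elapsed from the start of the calibration; the function name, the 20-second settling time, and the tolerance value are hypothetical design parameters and are not values fixed by the present disclosure.

```python
import time

# Hypothetical thresholds: the "predetermined range" and "predetermined period
# of time" are design parameters that the disclosure does not fix numerically
# (the text only notes that the output settles within roughly 20 seconds).
SETTLING_TIME_S = 20.0     # seconds to wait after the calibration is started
OUTPUT_TOLERANCE = 0.05    # allowed magnitude of the standstill output value


def calibration_completed(read_output, start_time, now=None):
    """Return True when the sensor's output value falls within the
    predetermined range after the predetermined time has elapsed from
    the start of its calibration."""
    now = time.time() if now is None else now
    if now - start_time < SETTLING_TIME_S:
        return False                      # settling period not yet over
    return abs(read_output()) <= OUTPUT_TOLERANCE


# Usage: a sensor at standstill whose residual output is small enough.
if __name__ == "__main__":
    started_at = time.time() - 25.0       # calibration began 25 s ago
    print(calibration_completed(lambda: 0.01, started_at))   # True
```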
A time at which the calibration is started may be a time at which an instruction that the calibration is to be started has been given in a state in which each of at least the one or plurality of sensors selected by the selection unit has been brought to a standstill. Further, the time at which the calibration is started may be a time at which power of each of at least the one or plurality of sensors selected by the selection unit is turned on in a state in which each of at least the one or plurality of sensors has been brought to a standstill.
It is desirable that the output unit be further configured to output, when the calibration of one of at least the one or plurality of sensors selected by the selection unit is not completed, information prompting a user to bring the sensor for which the calibration is not completed to a standstill. By this configuration, a user can determine for which sensor the calibration is not completed and bring that sensor to a standstill.
The calibration result determination unit is further configured to determine whether or not all the calibrations of the plurality of sensors associated with the plurality of respective body parts of the body of the subject have been completed. By this configuration, it is possible to complete the calibration before pairing is performed.
Another exemplary aspect is a training support system including: a plurality of measuring instruments each including one of the plurality of sensors associated with a plurality of respective body parts of a body of a subject; and the motion state monitoring system according to any one of the above-described aspects. By using a result of detection performed by one or a plurality of sensors selected from among a plurality of sensors associated with a plurality of respective body parts based on a motion to be monitored, this training support system can output a more accurate result of a calculation indicating a motion state of the subject than when a result of detection performed by a set of sensors attached to one body part is used. As a result, a user can effectively monitor a complicated motion state of the subject. Further, this training support system can output a more accurate result of a calculation by using a result of detection performed by the sensor for which a calibration has been completed. As a result, a user can more accurately monitor the motion state of the subject.
Another exemplary aspect is a method for controlling a motion state monitoring system, the method including: selecting one or a plurality of sensors from among a plurality of sensors associated with a plurality of respective body parts of a body of a subject based on one or a plurality of specified motions to be monitored; determining whether or not a calibration of each of at least the one or plurality of selected sensors has been completed; generating a result of a calculation indicating a motion state of the subject based on a result of detection performed by each of the one or plurality of selected sensors when it is determined that the calibration has been completed; and outputting the result of the calculation. In this method for controlling a motion state monitoring system, by using a result of detection performed by one or a plurality of sensors selected from among a plurality of sensors associated with a plurality of respective body parts based on a motion to be monitored, it is possible to output a more accurate result of a calculation indicating a motion state of the subject than when a result of detection performed by a set of sensors attached to one body part is used. As a result, a user can effectively monitor a complicated motion state of the subject. Further, in this method for controlling a motion state monitoring system, it is possible to output a more accurate result of a calculation by using a result of detection performed by the sensor for which a calibration has been completed. As a result, a user can more accurately monitor the motion state of the subject.
Another exemplary aspect is a control program for causing a computer to: select one or a plurality of sensors from among a plurality of sensors associated with a plurality of respective body parts of a body of a subject based on one or a plurality of specified motions to be monitored; determine whether or not a calibration of each of at least the one or plurality of selected sensors has been completed; generate a result of a calculation indicating a motion state of the subject based on a result of detection performed by each of the one or plurality of selected sensors when it is determined that the calibration has been completed; and output the result of the calculation. By using a result of detection performed by one or a plurality of sensors selected from among a plurality of sensors associated with a plurality of respective body parts based on a motion to be monitored, this control program can output a more accurate result of a calculation indicating a motion state of the subject than when a result of detection performed by a set of sensors attached to one body part is used. As a result, a user can effectively monitor a complicated motion state of the subject. Further, this control program can output a more accurate result of a calculation by using a result of detection performed by the sensor for which a calibration has been completed. As a result, a user can more accurately monitor the motion state of the subject.
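As a non-limiting illustration of the overall control flow recited in the above aspects (selection, calibration check, calculation, and output), the following Python sketch strings the four steps together; the motion-to-body-part table, the function names, and the simple difference used in the usage example are hypothetical and chosen only for this illustration.

```python
from typing import Callable, Dict, Iterable, List

# Hypothetical motion-to-body-part table; the body-part names follow the
# elbow/shoulder examples given later in the description.
MOTION_TO_BODY_PARTS: Dict[str, List[str]] = {
    "bending and stretching of the right elbow": ["right upper arm", "right forearm"],
    "internal and external rotation of the right shoulder": ["right upper arm", "right forearm"],
}


def monitor(motions: Iterable[str],
            read_sensor: Dict[str, Callable[[], float]],
            is_calibrated: Dict[str, bool],
            calculate: Callable[[str, List[float]], float],
            output: Callable[[str, float], None]) -> None:
    """Select the sensors for each specified motion, verify their calibration,
    generate a calculation result, and output it."""
    for motion in motions:
        parts = MOTION_TO_BODY_PARTS[motion]          # selection step
        if not all(is_calibrated[p] for p in parts):  # calibration check
            raise RuntimeError(f"calibration not completed for: {motion}")
        readings = [read_sensor[p]() for p in parts]  # detection results
        output(motion, calculate(motion, readings))   # calculation and output


if __name__ == "__main__":
    sensors = {"right upper arm": lambda: 30.0, "right forearm": lambda: -60.0}
    calibrated = {"right upper arm": True, "right forearm": True}
    monitor(["bending and stretching of the right elbow"],
            sensors, calibrated,
            calculate=lambda motion, r: r[0] - r[1],   # e.g. a simple difference
            output=lambda motion, value: print(motion, value))
```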
According to the present disclosure, it is possible to provide a motion state monitoring system, a training support system, a method for controlling the motion state monitoring system, and a control program that are capable of effectively monitoring a complicated motion state of a subject by monitoring a motion state of the subject using a result of detection performed by one or a plurality of sensors that are selected from among a plurality of sensors based on a motion to be monitored.
The above and other objects, features and advantages of the present disclosure will become more fully understood from the detailed description given hereinbelow and the accompanying drawings which are given by way of illustration only, and thus are not to be considered as limiting the present disclosure.
Although the present disclosure will be described hereinafter with reference to an embodiment thereof, the disclosure according to the claims is not limited to the following embodiment. In order to clarify the explanation, the following descriptions and the drawings are partially omitted and simplified as appropriate. Further, the same symbols are assigned to the same elements throughout the drawings, and redundant descriptions are omitted as necessary.
As shown in
The measuring instruments 11_1 to 11_11 are respectively attached to body parts 20_1 to 20_11 from which motions are to be detected among various body parts of the body of a subject P, and detect the motions of the respective body parts 20_1 to 20_11 by using motion sensors (hereinafter simply referred to as sensors) 111_1 to 111_11 such as gyro sensors. Note that the measuring instruments 11_1 to 11_11 are associated with the respective body parts 20_1 to 20_11 by pairing processing performed with the motion state monitoring apparatus 12.
As shown in
Referring back to
The motion state monitoring apparatus 12 is an apparatus that outputs a result of a calculation indicating a motion state of the subject P based on results (sensing values) of detection performed by the sensors 111_1 to 111_11. The motion state monitoring apparatus 12 is, for example, one of a Personal Computer (PC), a mobile phone terminal, a smartphone, and a tablet terminal, and is configured so that it can communicate with the sensors 111_1 to 111_11 via a network (not shown). The motion state monitoring apparatus 12 can also be referred to as a motion state monitoring system.
Specifically, the motion state monitoring apparatus 12 includes at least a selection unit 121, a calculation processing unit 122, an output unit 123, and a calibration result determination unit 124.
The selection unit 121 selects, from among the sensors 111_1 to 111_11 associated with the respective body parts 20_1 to 20_11 of the body of the subject P, one or a plurality of sensors used to measure a motion (a motion such as bending and stretching the right elbow and internally and externally rotating the left shoulder) to be monitored which is specified by a user such as an assistant.
The calibration result determination unit 124 determines whether or not calibrations of at least the one or plurality of sensors selected by the selection unit 121 have been completed.
A calibration is, for example, processing for measuring an output value (an error component) of a sensor in a standstill state, the sensor being used to measure a motion to be monitored, and subtracting the error component from a measured value. It should be noted that the output value of the sensor is stabilized within a predetermined range after about 20 seconds has elapsed from when the sensor is brought to a standstill (see
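As a non-limiting illustration of this calibration processing, the following Python sketch measures the mean output of a sensor at a standstill as its error component and subtracts that component from subsequent measured values; the sample count and the bias value used in the example are hypothetical.

```python
import statistics
from typing import Callable, List


def measure_error_component(read_output: Callable[[], float],
                            samples: int = 200) -> float:
    """Average the sensor's output while it is at a standstill; this mean
    is taken as the error component (bias) of the sensor."""
    values: List[float] = [read_output() for _ in range(samples)]
    return statistics.fmean(values)


def calibrated_value(raw: float, error: float) -> float:
    """Subtract the previously measured error component from a raw value."""
    return raw - error


if __name__ == "__main__":
    # A hypothetical gyro that reports a constant 0.02 rad/s bias at standstill.
    bias = measure_error_component(lambda: 0.02)
    print(calibrated_value(0.52, bias))   # ~0.50 after the bias is removed
```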
The calculation processing unit 122 performs calculation processing based on a result of detection performed by each of the one or plurality of sensors selected by the selection unit 121, and generates a result of the calculation indicating a motion state of the motion to be monitored. It should be noted that the calculation processing unit 122 performs the aforementioned calculation processing when the calibration result determination unit 124 determines that the calibration has been completed. By doing so, the calculation processing unit 122 can prevent the result of detection performed by the sensor that has not been calibrated from being used erroneously.
The output unit 123 outputs a result of the calculation performed by the calculation processing unit 122. The output unit 123 is, for example, a display apparatus, and displays a result of a calculation performed by the calculation processing unit 122 on a monitor, for example, by graphing the result. In this embodiment, an example in which the output unit 123 is a display apparatus will be described. However, the output unit 123 is not limited to being a display apparatus, and may instead be a speaker for outputting by voice a result of a calculation performed by the calculation processing unit 122, or a transmission apparatus that transmits a result of a calculation performed by the calculation processing unit 122 to an external display apparatus or the like.
Further, the output unit 123 may be configured to output a result of determination performed by the calibration result determination unit 124. For example, the output unit 123 may output information indicating that the calibration has been completed, and when the calibration is not completed even after a predetermined period of time has elapsed, it may output information prompting a user to bring the sensor for which the calibration is not completed to a standstill.
In the training support system 1, pairing processing is first performed between the measuring instruments 11_1 to 11_11 and the motion state monitoring apparatus 12, whereby the measuring instruments 11_1 to 11_11 and the body parts 20_1 to 20_11 are respectively associated with each other (Step S101). Note that the pairing processing can also be performed in advance by registering the respective measuring instruments and body parts beforehand.
After that, a user specifies a motion of the subject P to be monitored (Step S102). The output unit 123, which is a display apparatus, then displays the body part to which the sensor used to measure the specified motion to be monitored is to be attached (Step S103). A method by which a user specifies a motion to be monitored will be described below with reference to
As shown in
After that, as shown in
This selection list 303 includes, for example, motions such as bending and stretching of the right shoulder, adduction and abduction of the right shoulder, internal and external rotation of the right shoulder, bending and stretching of the right elbow, pronation and supination of the right forearm, bending and stretching of the head, rotation of the head, bending and stretching of the chest and the waist, rotation of the chest and the waist, lateral bending of the chest and the waist, bending and stretching of the left shoulder, adduction and abduction of the left shoulder, internal and external rotation of the left shoulder, bending and stretching of the left elbow, and pronation and supination of the left forearm. The user selects more detailed motions to be monitored from this selection list 303. By doing so, among the body parts “1” to “11” (the body parts 20_1 to 20_11) to which the sensors are to be attached shown in the human body schematic diagram 301, the body part to which the sensor used to measure the motions to be monitored specified by the user is to be attached is highlighted.
In the example shown in
Note that, in the example of
In the example shown in
Here, it is possible to measure the bending and stretching motion of the right elbow based on a result of the detection performed by each of the sensor (111_1) attached to the right upper arm (the body part 20_1) and the sensor (111_2) attached to the right forearm (the body part 20_2). Similarly, it is possible to measure the internal and external rotation motion of the right shoulder based on the result of the detection performed by each of the sensor (111_1) attached to the right upper arm (the body part 20_1) and the sensor (111_2) attached to the right forearm (the body part 20_2).
Further, it is possible to measure the bending and stretching motion of the left elbow based on a result of the detection performed by each of the sensor (111_6) attached to the left upper arm (the body part 20_6) and the sensor (111_7) attached to the left forearm (the body part 20_7). Similarly, it is possible to measure the internal and external rotation motion of the left shoulder based on the result of the detection performed by each of the sensor (111_6) attached to the left upper arm (the body part 20_6) and the sensor (111_7) attached to the left forearm (the body part 20_7).
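As a non-limiting illustration of how the selection unit 121 could map the specified motions to the required sensors, the following Python sketch takes the union of the sensors needed for each motion, so that sensors shared between motions (as in the four motions above) are selected only once; the lookup table and the function name are hypothetical.

```python
from typing import Dict, Iterable, List, Set

# Hypothetical table reflecting the example: each of the four motions is
# measured with the sensor on the upper arm and the sensor on the forearm
# of the corresponding side.
MOTION_TO_SENSORS: Dict[str, List[str]] = {
    "bending and stretching of the right elbow":             ["111_1", "111_2"],
    "internal and external rotation of the right shoulder":  ["111_1", "111_2"],
    "bending and stretching of the left elbow":              ["111_6", "111_7"],
    "internal and external rotation of the left shoulder":   ["111_6", "111_7"],
}


def select_sensors(motions: Iterable[str]) -> Set[str]:
    """Return the set of sensors needed to measure all specified motions;
    sensors shared between motions are selected only once."""
    selected: Set[str] = set()
    for motion in motions:
        selected.update(MOTION_TO_SENSORS[motion])
    return selected


if __name__ == "__main__":
    print(sorted(select_sensors(MOTION_TO_SENSORS)))   # ['111_1', '111_2', '111_6', '111_7']
```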
Therefore, in the example shown in
Note that, when the power of one of the sensors used to measure the motions to be monitored is off, that sensor (more specifically, the body part to which that sensor is to be attached) may be highlighted.
Specifically, in the example shown in
After the motion to be monitored is specified (Step S102) and the body part to which the sensor used to measure the motion to be monitored is to be attached is displayed (Step S103), a calibration of the sensor used to measure the motion to be monitored is subsequently performed (Step S104).
During the calibration, the monitor 300 displays, for example, the information that “Calibration is in progress. Place the sensor on the desk and do not move it” as shown in
In this example, a calibration is performed on the sensors 111_1, 111_2, 111_6, and 111_7 used to measure the motions to be monitored. However, the calibration is not limited to being performed on the sensors used to measure the motions to be monitored, and may instead be performed on all the sensors 111_1 to 111_11, for example, before the pairing processing. Note that it is only required that the calibration be completed before the start of measurement of the motion to be monitored.
After the calibration has been completed, the sensor is attached to the subject P (Step S105). In this example, the sensors 111_1, 111_2, 111_6, and 111_7 are attached to the body parts 20_1, 20_2, 20_6, and 20_7 of the subject P, respectively.
After that, the motion to be monitored is measured based on a result of detection performed by each of the sensors 111_1, 111_2, 111_6, and 111_7 (Step S106).
As shown in
In the examples shown in
Note that the monitor 300 may display all the graphs showing the respective results of detection performed by the four sensors 111_1, 111_2, 111_6, and 111_7. Further, the monitor 300 may display all the graphs showing the results of the calculations indicating the motion states of the four motions to be monitored.
Further, the graphs 308_1 and 308_2 showing the motion states of the motions to be monitored may each be displayed in a size larger than that of the information about the sensors (e.g., the startup status 306 of each sensor, the remaining battery power 307 of each sensor, and the graphs 305_1 and 305_2 showing the results of detection performed by the sensors). This makes it easier to visually recognize the motion state of the subject P.
Note that the result of the calculation indicating the motion state of the “bending and stretching of the right elbow” can be determined by, for example, a difference between the result of detection performed by the sensor 111_1 attached to the right upper arm and the result of detection performed by the sensor 111_2 attached to the right forearm. Therefore, the calculation processing unit 122 generates a result of the calculation indicating the motion state of the “bending and stretching of the right elbow” based on the result of detection performed by each of the sensors 111_1 and 111_2 selected by the selection unit 121. Then the output unit 123, which is the display apparatus, graphs and displays the result of the calculation generated by the calculation processing unit 122 on the monitor 300.
Further, the result of the calculation indicating the motion state of the “bending and stretching of the left elbow” can be determined by, for example, a difference between the result of detection performed by the sensor 111_6 attached to the left upper arm and the result of detection performed by the sensor 111_7 attached to the left forearm. Therefore, the calculation processing unit 122 generates a result of the calculation indicating the motion state of the “bending and stretching of the left elbow” based on the result of detection performed by each of the sensors 111_6 and 111_7 selected by the selection unit 121. Then the output unit 123, which is the display apparatus, graphs and displays the result of the calculation generated by the calculation processing unit 122 on the monitor 300.
Similarly, the result of the calculation indicating the motion state of the “internal and external rotation of the right shoulder” can be determined by, for example, a difference between the result of detection performed by the sensor 111_1 attached to the right upper arm and the result of detection performed by the sensor 111_2 attached to the right forearm. Therefore, the calculation processing unit 122 generates a result of the calculation indicating the motion state of the “internal and external rotation of the right shoulder” based on the result of detection performed by each of the sensors 111_1 and 111_2 selected by the selection unit 121. Then the output unit 123, which is the display apparatus, can graph and display the result of the calculation generated by the calculation processing unit 122 on the monitor 300.
Similarly, the result of the calculation indicating the motion state of the “internal and external rotation of the left shoulder” can be determined by, for example, a difference between the result of detection performed by the sensor 111_6 attached to the left upper arm and the result of detection performed by the sensor 111_7 attached to the left forearm. Therefore, the calculation processing unit 122 generates a result of the calculation indicating the motion state of the “internal and external rotation of the left shoulder” based on the result of detection performed by each of the sensors 111_6 and 111_7 selected by the selection unit 121. Then the output unit 123, which is the display apparatus, can graph and display the result of the calculation generated by the calculation processing unit 122 on the monitor 300.
As described above, the motion state monitoring apparatus 12 according to this embodiment and the training support system 1 which includes this motion state monitoring apparatus 12 output a result of a calculation indicating a motion state of the subject based on a result of detection performed by each of one or a plurality of sensors corresponding to the motions to be monitored among a plurality of sensors associated with a plurality of respective body parts. By this configuration, the motion state monitoring apparatus 12 according to this embodiment and the training support system 1 which includes this motion state monitoring apparatus 12 can output a more accurate result of a calculation indicating a motion state of the subject than when a result of detection performed by a set of sensors attached to one body part is used. As a result, a user can effectively monitor a complicated motion state of the subject. Further, the motion state monitoring apparatus 12 according to this embodiment and the training support system 1 which includes this motion state monitoring apparatus 12 can output a more accurate result of a calculation by using a result of detection performed by the sensor for which a calibration has been completed. As a result, a user can more accurately monitor the motion state of the subject.
Note that the order of the processes performed in the training support system 1 is not limited to the order of the processes shown in
As shown in
The attaching direction detection unit 125 is configured so that it can detect information about the attaching directions of the sensors 111_1 to 111_11 with respect to their respective reference attaching directions. The output unit 123 outputs, together with the result of detection performed by a sensor, the information about the attaching direction of that sensor with respect to its reference attaching direction detected by the attaching direction detection unit 125, that is, it outputs a result of detection in which the attaching direction of the sensor has been taken into account. By doing so, a user can more accurately grasp the result of detection performed by the sensor.
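As a non-limiting illustration only, the following Python sketch corrects a detection result using attaching-direction information; it assumes, purely for the example, that this information can be reduced to a flip flag and an axis sign, whereas an actual implementation would likely use a richer representation such as a rotation.

```python
from typing import Tuple


def apply_attaching_direction(raw_value: float,
                              attaching_direction: Tuple[bool, int]) -> float:
    """Correct a detection result using the detected attaching direction
    relative to the reference attaching direction (hypothetical encoding:
    a flip flag and an axis sign)."""
    flipped, axis_sign = attaching_direction
    corrected = -raw_value if flipped else raw_value
    return corrected * axis_sign


if __name__ == "__main__":
    # A sensor attached upside-down relative to its reference direction.
    print(apply_attaching_direction(15.0, (True, 1)))   # -15.0
```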
As described above, the motion state monitoring apparatus according to the aforementioned embodiment and the training support system which includes this motion state monitoring apparatus output a result of a calculation indicating a motion state of the subject based on a result of detection performed by each of one or a plurality of sensors corresponding to the motions to be monitored among a plurality of sensors associated with a plurality of respective body parts. By this configuration, the motion state monitoring apparatus according to the aforementioned embodiment and the training support system which includes this motion state monitoring apparatus can output a more accurate result of a calculation indicating a motion state of the subject than when a result of detection performed by a set of sensors attached to one body part is used. As a result, a user can effectively monitor a complicated motion state of the subject. Further, the motion state monitoring apparatus according to the aforementioned embodiment and the training support system which includes this motion state monitoring apparatus can output a more accurate result of a calculation by using a result of detection performed by the sensor for which a calibration has been completed. As a result, a user can more accurately monitor the motion state of the subject.
Further, although the present disclosure has been described as a hardware configuration in the aforementioned embodiment, the present disclosure is not limited thereto. In the present disclosure, control processing of the motion state monitoring apparatus can be implemented by causing a Central Processing Unit (CPU) to execute a computer program.
Further, the above-described program can be stored and provided to a computer using any type of non-transitory computer readable media. Non-transitory computer readable media include any type of tangible storage media. Examples of non-transitory computer readable media include magnetic storage media (such as floppy disks, magnetic tapes, hard disk drives, etc.), optical magnetic storage media (e.g., magneto-optical disks), CD-ROM (compact disc read only memory), CD-R (compact disc recordable), CD-R/W (compact disc rewritable), and semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory), etc.). The program may be provided to a computer using any type of transitory computer readable media. Examples of transitory computer readable media include electric signals, optical signals, and electromagnetic waves. Transitory computer readable media can provide the program to a computer via a wired communication line (e.g., electric wires, and optical fibers) or a wireless communication line.
From the disclosure thus described, it will be obvious that the embodiments of the disclosure may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the disclosure, and all such modifications as would be obvious to one skilled in the art are intended for inclusion within the scope of the following claims.