This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2013-138281, filed on Jul. 1, 2013; and Japanese Patent Application No. 2013-138303, filed on Jul. 1, 2013, the entire contents of all of which are incorporated herein by reference.
Embodiments described herein relate generally to a motion information processing apparatus and a method.
Conventionally, in rehabilitation, a number of experts provide cooperative support to enable persons who have mental or physical disabilities, whether caused by various reasons such as diseases, injuries, and aging or present from birth, to live better lives. In rehabilitation, for example, cooperative support is provided by a number of experts, such as rehabilitation specialists, rehabilitation nurses, physical therapists, occupational therapists, speech-language-hearing therapists, clinical psychologists, prosthetists, and social workers.
In recent years, motion capture technologies for digitally recording a motion of a person or an object have been developed. Examples of motion capture systems include optical, mechanical, magnetic, and camera systems. Widely known is the camera system, which digitally records a motion of a person by attaching markers to the person, detecting the markers with a tracker, such as a camera, and processing the detected markers, for example. Examples of systems using no markers or trackers include a system that digitally records a motion of a person by using an infrared sensor to measure the distance from the sensor to the person and to detect the size of the person and various motions of the skeletal structure. Examples of sensors provided with such a system include Kinect (registered trademark).
According to an embodiment, a motion information processing apparatus comprises an obtaining unit, a judging unit, and a controlling unit. The obtaining unit is configured to obtain motion information of a subject who performs a motion of at least one of a squat and a jump. The judging unit is configured to judge, on the basis of predetermined conditions for the motion of said at least one of the squat and the jump, whether the motion of the subject indicated by the motion information obtained by the obtaining unit satisfies a certain condition included in the predetermined conditions. The controlling unit is configured to exercise control so that a judgment result obtained by the judging unit is provided as a notification.
Exemplary embodiments of a motion information processing apparatus and a method are described below with reference to the accompanying drawings. Motion information processing apparatuses described below may be used alone or in a manner incorporated in a system, such as a medical chart system and a rehabilitation section system.
As illustrated in
The motion information acquiring unit 10 detects a motion of a person, an object, or the like in a space where rehabilitation is performed, thereby acquiring motion information indicating the motion of the person, the object, or the like. The motion information will be described in detail in an explanation of processing of a motion information generating unit 14, which will be described later. The motion information acquiring unit 10 is Kinect (registered trademark), for example.
As illustrated in
The color image acquiring unit 11 captures a photographic subject, such as a person or an object, in a space where rehabilitation is performed, thereby acquiring color image information. The color image acquiring unit 11, for example, detects light reflected by the surface of the photographic subject with a light receiving element and converts the visible light into an electrical signal. The color image acquiring unit 11 then converts the electrical signal into digital data, thereby generating color image information of one frame corresponding to a capturing range. The color image information of one frame includes capturing time information and information in which each pixel contained in the frame is associated with an RGB (red, green, and blue) value, for example. The color image acquiring unit 11 generates color image information of a plurality of consecutive frames from visible light sequentially detected, thereby capturing the capturing range as video. The color image information generated by the color image acquiring unit 11 may be output as a color image in which the RGB values of the respective pixels are arranged on a bit map. The color image acquiring unit 11 includes a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device (CCD) as the light receiving element, for example.
The distance image acquiring unit 12 captures a photographic subject, such as a person or an object, in a space where rehabilitation is performed, thereby acquiring distance image information. The distance image acquiring unit 12, for example, irradiates the surroundings with infrared rays and detects the reflected waves, which are the irradiation waves reflected by the surface of the photographic subject, with a light receiving element. The distance image acquiring unit 12 then derives the distance between the photographic subject and the distance image acquiring unit 12 based on the phase difference between the irradiation waves and the reflected waves and the time from the irradiation to the detection. The distance image acquiring unit 12 thus generates distance image information of one frame corresponding to the capturing range. The distance image information of one frame includes capturing time information and information in which each pixel contained in the capturing range is associated with the distance between the photographic subject corresponding to the pixel and the distance image acquiring unit 12, for example. The distance image acquiring unit 12 generates distance image information of a plurality of consecutive frames from reflected waves sequentially detected, thereby capturing the capturing range as video. The distance image information generated by the distance image acquiring unit 12 may be output as a distance image in which gray scales corresponding to the distances of the respective pixels are arranged on a bit map. The distance image acquiring unit 12 includes a CMOS sensor or a CCD as the light receiving element, for example. The light receiving element may be shared with the color image acquiring unit 11. The unit of distance calculated by the distance image acquiring unit 12 is the meter (m), for example.
The speech recognition unit 13 collects speech from the surroundings, identifies the direction of the sound source, and recognizes the speech. The speech recognition unit 13 includes a microphone array provided with a plurality of microphones and performs beam forming, a technology for selectively collecting speech travelling in a specific direction. The speech recognition unit 13, for example, performs beam forming with the microphone array, thereby identifying the direction of the sound source. The speech recognition unit 13 also recognizes words from the collected speech by using a known speech recognition technology. In other words, the speech recognition unit 13 generates, as a speech recognition result, information in which a word recognized by the speech recognition technology, the direction from which the word was spoken, and the time at which the word was recognized are associated with one another, for example.
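The beam forming mentioned above can be illustrated with a minimal delay-and-sum sketch. The uniform linear array geometry, the sampling rate, and the integer-sample delays below are simplifying assumptions for illustration and do not describe the actual implementation of the speech recognition unit 13:

```python
import math

def delay_and_sum(signals, mic_spacing, angle_rad, fs=16000, c=343.0):
    """Steer a uniform linear microphone array toward angle_rad by delaying
    and summing the per-microphone signals (integer-sample delays).

    signals: one sample list per microphone; mic_spacing in meters;
    angle_rad measured from the array's broadside direction.
    """
    out_len = min(len(s) for s in signals)
    out = [0.0] * out_len
    for m, sig in enumerate(signals):
        # Extra path length to microphone m for a plane wave from angle_rad,
        # converted to a whole number of samples.
        delay = int(round(m * mic_spacing * math.sin(angle_rad) / c * fs))
        for n in range(out_len):
            k = n - delay
            if 0 <= k < len(sig):
                out[n] += sig[k]
    return [v / len(signals) for v in out]
```

When the steering angle matches the identified sound-source direction, the delayed copies add coherently, reinforcing speech from that direction while speech from other directions partially cancels.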
The motion information generating unit 14 generates motion information indicating a motion of a person, an object, or the like. The motion information is generated by considering a motion (gesture) of a person as a plurality of successive postures (poses), for example. Specifically, the motion information generating unit 14 performs pattern matching using a human body pattern. The motion information generating unit 14 acquires coordinates of respective joints forming a skeletal structure of a human body from the distance image information generated by the distance image acquiring unit 12. The coordinates of respective joints obtained from the distance image information are values represented by a coordinate system of a distance image (hereinafter, referred to as a “distance image coordinate system”). The motion information generating unit 14 then converts the coordinates of respective joints in the distance image coordinate system into values represented by a coordinate system of a three-dimensional space in which rehabilitation is performed (hereinafter, referred to as a “world coordinate system”). The coordinates of respective joints represented by the world coordinate system correspond to skeletal information of one frame. Skeletal information of a plurality of frames corresponds to motion information. The processing of the motion information generating unit 14 according to the first embodiment will be specifically described.
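As a concrete illustration, the skeletal information and motion information described above may be pictured as the following per-frame data structure. The joint names, field names, and coordinate values are hypothetical and do not reflect the apparatus's actual storage format:

```python
# Hypothetical skeletal information for one frame: each joint identifier is
# mapped to its (x, y, z) coordinates in the world coordinate system (meters).
skeleton_frame = {
    "capturing_time": 0.0,  # seconds from the start of capture
    "joints": {
        "head": (0.02, 1.65, 2.10),
        "right_shoulder": (0.20, 1.40, 2.12),
        "right_knee": (0.12, 0.50, 2.08),
        "right_tarsus": (0.13, 0.08, 2.11),
    },
}

# Motion information corresponds to the sequence of such frames,
# one entry per captured frame.
motion_information = [skeleton_frame]
```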
In the first embodiment, the motion information generating unit 14 stores therein in advance a human body pattern corresponding to various postures by learning, for example. Every time the distance image acquiring unit 12 generates distance image information, the motion information generating unit 14 acquires the generated distance image information of each frame. The motion information generating unit 14 then performs pattern matching of the human body pattern with the acquired distance image information of each frame.
The human body pattern will now be described.
In the example of
While the explanation has been made of the case where the body pattern has the information on 20 joints in
The motion information generating unit 14 performs pattern matching of the human body pattern with the distance image information of each frame. The motion information generating unit 14, for example, performs pattern matching of the human body surface of the human body pattern illustrated in
In the pattern matching, the motion information generating unit 14 may supplementarily use information indicating the positional relation of the joints. The information indicating the positional relation of the joints includes connection relations between joints (e.g., “the joint 2a and the joint 2b are connected”) and ranges of motion of the respective joints, for example. A joint is a part connecting two or more bones. An angle formed by bones changes in association with a change in posture, and the range of motion varies depending on the joints. The range of motion is represented by the maximum value and the minimum value of the angle formed by bones connected by a joint, for example. The motion information generating unit 14 also learns the ranges of motion of the respective joints in the learning of the human body pattern, for example. The motion information generating unit 14 stores therein the ranges of motion in association with the respective joints.
The motion information generating unit 14 converts the coordinates of the respective joints in the distance image coordinate system into values represented by the world coordinate system. The world coordinate system is a coordinate system of a three-dimensional space where rehabilitation is performed. In the world coordinate system, the position of the motion information acquiring unit 10 is set as an origin, the horizontal direction corresponds to an x-axis, the vertical direction corresponds to a y-axis, and a direction orthogonal to the xy-plane corresponds to a z-axis, for example. The value of the coordinates in the z-axis direction may be referred to as a “depth”.
The following describes the conversion processing from the distance image coordinate system to the world coordinate system. In the first embodiment, the motion information generating unit 14 stores therein in advance a conversion equation used for conversion from the distance image coordinate system to the world coordinate system. The conversion equation receives coordinates in the distance image coordinate system and an incident angle of reflected light corresponding to the coordinates and outputs coordinates in the world coordinate system, for example. The motion information generating unit 14, for example, inputs coordinates (X1, Y1, Z1) of a certain joint and an incident angle of reflected light corresponding to the coordinates to the conversion equation, thereby converting the coordinates (X1, Y1, Z1) of the certain joint into coordinates (x1, y1, z1) in the world coordinate system. Because the correspondence relation between the coordinates in the distance image coordinate system and the incident angle of reflected light is known, the motion information generating unit 14 can input the incident angle corresponding to the coordinates (X1, Y1, Z1) to the conversion equation. The explanation has been made of the case where the motion information generating unit 14 converts the coordinates in the distance image coordinate system into the coordinates in the world coordinate system. Alternatively, the motion information generating unit 14 can convert the coordinates in the world coordinate system into the coordinates in the distance image coordinate system.
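A minimal sketch of such a conversion equation is shown below, assuming a pinhole-style model in which the incident angle of the ray through each pixel is derived from the sensor's field of view. The resolution and field-of-view values are illustrative, not those of any particular sensor:

```python
import math

def depth_to_world(px, py, depth_m, width=512, height=424,
                   fov_h=math.radians(70.6), fov_v=math.radians(60.0)):
    """Convert distance-image pixel coordinates (px, py) and a measured
    depth (m) into world coordinates (x, y, z), sensor at the origin."""
    # Incident angle of the ray through this pixel, relative to the optical
    # axis (known from the pixel position, as noted in the text).
    ax = (px / (width - 1) - 0.5) * fov_h
    ay = (0.5 - py / (height - 1)) * fov_v
    # Depth is taken along the z-axis; offset laterally along the ray.
    return depth_m * math.tan(ax), depth_m * math.tan(ay), depth_m
```

The inverse mapping (world coordinates back to distance-image coordinates) follows by solving the same relations for px and py, consistent with the remark that the conversion can be performed in either direction.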
The motion information generating unit 14 generates skeletal information from the coordinates of the respective joints represented by the world coordinate system.
In the first row of
Every time the motion information generating unit 14 receives the distance image information of each frame from the distance image acquiring unit 12, the motion information generating unit 14 performs pattern matching on the distance image information of each frame. The motion information generating unit 14 thus performs conversion from the distance image coordinate system to the world coordinate system, thereby generating the skeletal information of each frame. The motion information generating unit 14 then outputs the generated skeletal information of each frame to the motion information processing apparatus 100 and stores the skeletal information in a motion information storage unit, which will be described later.
The processing of the motion information generating unit 14 is not necessarily performed by the method described above. While the explanation has been made of the method in which the motion information generating unit 14 uses a human body pattern to perform pattern matching, the embodiment is not limited thereto. Instead of the human body pattern or in addition to the human body pattern, the motion information generating unit 14 may use a pattern of each part to perform pattern matching.
While the explanation has been made of the method in which the motion information generating unit 14 obtains the coordinates of the respective joints from the distance image information in the description above, for example, the present embodiment is not limited thereto. The motion information generating unit 14 may obtain the coordinates of respective joints using color image information in addition to the distance image information, for example. In this case, the motion information generating unit 14, for example, performs pattern matching of a human body pattern represented by a color image coordinate system with the color image information, thereby obtaining the coordinates of the human body surface from the color image information. The color image coordinate system has no information on “distance Z” included in the distance image coordinate system. The motion information generating unit 14 acquires the information on “distance Z” from the distance image information, for example. The motion information generating unit 14 then performs arithmetic processing using the two pieces of information, thereby obtaining the coordinates of the respective joints in the world coordinate system.
The motion information generating unit 14 outputs the color image information generated by the color image acquiring unit 11, the distance image information generated by the distance image acquiring unit 12, and the speech recognition result output from the speech recognition unit 13 to the motion information processing apparatus 100 as needed. The motion information generating unit 14 then stores the pieces of information in the motion information storage unit, which will be described later. Pixel positions in the color image information can be associated with pixel positions in the distance image information in advance based on the positions of the color image acquiring unit 11 and the distance image acquiring unit 12 and the capturing direction. As a result, the pixel positions in the color image information and the pixel positions in the distance image information can also be associated with the world coordinate system derived by the motion information generating unit 14. The association processing and the use of the distance (m) calculated by the distance image acquiring unit 12 make it possible to calculate the height and the length of each part of the body (e.g., the length of the arm and the length of the abdomen) and to calculate the distance between two pixels specified on a color image. Similarly, the capturing time information of the color image information can be associated with the capturing time information of the distance image information in advance. The motion information generating unit 14 refers to the speech recognition result and the distance image information. If the joint 2a is present near the direction in which a word recognized as speech at a certain time is spoken, the motion information generating unit 14 can output the word as a word spoken by the person having the joint 2a.
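For instance, once two pixels or two joints have been associated with world coordinates, the distance between them, and hence lengths such as that of an arm, can be computed directly. The following is a small sketch of that calculation; the segment chain in the example is chosen purely for illustration:

```python
import math

def joint_distance(p1, p2):
    """Euclidean distance (m) between two points in the world coordinate system."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def limb_length(points):
    """Length of a chain of segments, e.g. an arm as shoulder -> elbow -> wrist."""
    return sum(joint_distance(points[i], points[i + 1])
               for i in range(len(points) - 1))
```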
The motion information generating unit 14 outputs the information indicating the positional relation of the joints to the motion information processing apparatus 100 as needed and stores the information in the motion information storage unit, which will be described later.
While the explanation has been made of the case where the motion information acquiring unit 10 detects a motion of one person, the embodiment is not limited thereto. If a plurality of persons are included in the capturing range of the motion information acquiring unit 10, the motion information acquiring unit 10 may detect motions of the persons. If a plurality of persons are captured in the distance image information of a single frame, the motion information acquiring unit 10 associates pieces of skeletal information on the persons generated from the distance image information of the single frame with one another. The motion information acquiring unit 10 then outputs the skeletal information to the motion information processing apparatus 100 as the motion information.
The configuration of the motion information acquiring unit 10 is not limited to the configuration described above. In the case where the motion information is generated by detecting a motion of a person with another motion capture, such as an optical, a mechanical, or a magnetic motion capture, for example, the motion information acquiring unit 10 does not necessarily include the distance image acquiring unit 12. In this case, the motion information acquiring unit 10 includes markers attached to the human body so as to detect a motion of the person and a sensor that detects the markers as a motion sensor. The motion information acquiring unit 10 detects a motion of the person with the motion sensor, thereby generating the motion information. The motion information acquiring unit 10 uses the positions of the markers included in an image captured by the color image acquiring unit 11 to associate the pixel positions in the color image information with the coordinates in the motion information. The motion information acquiring unit 10 outputs the motion information to the motion information processing apparatus 100 as needed. In the case where the motion information acquiring unit 10 outputs no speech recognition result to the motion information processing apparatus 100, for example, the motion information acquiring unit 10 does not necessarily include the speech recognition unit 13.
While the motion information acquiring unit 10 outputs the coordinates in the world coordinate system as the skeletal information in the embodiment, the embodiment is not limited thereto. The motion information acquiring unit 10 may output the coordinates in the distance image coordinate system yet to be converted, for example. The conversion from the distance image coordinate system to the world coordinate system may be performed by the motion information processing apparatus 100 as needed.
Returning to the description of
As explained above, various types of training are conventionally performed as part of rehabilitation functional training. For example, from the standpoint of preventive medicine and sports medicine, squat training and jump training are performed as rehabilitation functional training. For example, during squat training, it is important to train while keeping a correct posture, and it is desirable that the subject performs the training while a caregiver checks the subject's posture. Further, for example, during jump training, the subject practices landing with a correct posture so as to learn to perform a jump that is less likely to damage his/her ligaments. During jump training also, it is desirable that the subject performs the training while a caregiver watches and checks the subject's posture during the actual jump. As additional information, for example, it is known that, while playing sports such as skiing, basketball, soccer, and volleyball, ligaments are likely to be damaged by a motion of twisting a knee (i.e., a motion in which the knee turns inward relative to the orientation of the toes) when landing from a jump.
In the current situation, the number of people who undergo rehabilitation is expected to increase in the future from the standpoint of preventive medicine and sports medicine, whereas the number of caregivers who assist with the rehabilitation is significantly insufficient. To cope with this situation, the motion information processing apparatus 100 according to the first embodiment is configured to make it possible to easily and conveniently evaluate, for example, a gradual motion such as a motion of squat training or a quick motion such as a motion of jump training.
For example, the motion information processing apparatus 100 may be an information processing apparatus such as a computer, a workstation, or the like. As illustrated in
The output unit 110 is configured to output various types of information used for evaluating a gradual motion such as a squat motion or a quick motion such as a jump motion. For example, the output unit 110 displays a Graphical User Interface (GUI) used by an operator who operates the motion information processing apparatus 100 when inputting various types of requests through the input unit 120, displays display information generated by the motion information processing apparatus 100, or outputs a warning sound. For example, the output unit 110 may be configured by using a monitor, a speaker, headphones, or the headphone portion of a headset. Further, the output unit 110 may be configured by using a display device that is designed to be attached to the body of the user, e.g., a goggle-type display device or a head-mount display device.
The input unit 120 is configured to receive an input of the various types of information used for evaluating a gradual motion such as a squat motion or a quick motion such as a jump motion. For example, the input unit 120 receives inputs of various types of requests (e.g., a request indicating that a predetermined threshold value used for evaluating the gradual motion or the quick motion should be set; a request indicating that an evaluating process should be started; a request indicating that a selection should be made from various types of information; and a request indicating that a measuring process should be performed using the GUI) from the operator of the motion information processing apparatus 100 and transfers the received various types of requests to the motion information processing apparatus 100. For example, the input unit 120 may be configured by using a mouse, a keyboard, a touch command screen, a trackball, a microphone, or the microphone portion of a headset. Alternatively, the input unit 120 may be a sensor configured to obtain biological information such as a blood pressure monitor, a heart rate monitor, a clinical thermometer, or the like.
The storage unit 130 is a semiconductor memory element, such as a random access memory (RAM) or a flash memory, or a storage device, such as a hard disk device or an optical disk device, for example. The control unit 140 is provided by an integrated circuit, such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA), or by a central processing unit (CPU) executing a predetermined computer program.
A configuration of the motion information processing apparatus 100 according to the first embodiment has thus been explained. The motion information processing apparatus 100 according to the first embodiment configured as described above easily and conveniently evaluates the motion of at least one of a squat and a jump, by using a configuration explained in detail below. More specifically, the motion information processing apparatus 100 of the present disclosure includes an obtaining unit, a judging unit, and a controlling unit. The obtaining unit is configured to obtain motion information of a subject who performs a motion of at least one of a squat and a jump. The judging unit is configured to, on the basis of predetermined conditions for the motion of at least one of the squat and the jump, judge whether the motion of the subject indicated by the motion information obtained by the obtaining unit satisfies a certain condition included in the predetermined conditions. The controlling unit is configured to exercise control so that a judgment result obtained by the judging unit is provided as a notification. With this configuration, the motion information processing apparatus 100 easily and conveniently evaluates a gradual motion such as a squat or a quick motion such as a jump. In the description of exemplary embodiments below, the first to the third embodiments will be explained by using examples where squat training is performed as a gradual motion, whereas the fourth to the sixth embodiments will be explained by using examples where jump training is performed as a quick motion.
The motion information storage unit 131 is configured to store therein the various types of information acquired by the motion information acquiring unit 10. More specifically, the motion information storage unit 131 stores therein the motion information generated by the motion information generating unit 14. Even more specifically, the motion information storage unit 131 stores therein the skeleton information corresponding to each of the frames generated by the motion information generating unit 14. In this situation, the motion information storage unit 131 may further store therein the color image information, the distance image information, and the speech recognition result that are output by the motion information generating unit 14, while keeping these pieces of information in correspondence with each of the frames.
For example, as illustrated in
In this situation, in the motion information illustrated in
Further, as illustrated in
Further, as illustrated in
Further, the “color image information” and the “distance image information” included in the motion information contain image data in a binary format such as Bitmap, a Joint Photographic Experts Group (JPEG) format, or the like, or contain a link to such image data or the like. Further, instead of the recognition information described above, the “speech recognition result” included in the motion information may be audio data itself or a link to the recognition information or the audio data.
The setting information storage unit 132 is configured to store therein setting information used by the controlling unit 140 (explained later). More specifically, the setting information storage unit 132 stores therein the predetermined conditions used by the controlling unit 140 (explained later) for evaluating the gradual motion performed by a rehabilitation subject. For example, the setting information storage unit 132 stores therein a condition used for judging whether a squat performed by a subject is performed with a correct posture. Next, a correct posture for the rehabilitation squat training will be explained, with reference to
In other words, as illustrated in
Accordingly, the setting information storage unit 132 is configured to store therein various types of setting information used for evaluating a correct squat posture such as that illustrated in
Next, details of the controlling unit 140 included in the motion information processing apparatus 100 will be explained. As illustrated in
The obtaining unit 141 is configured to obtain motion information of a subject who performs a gradual motion. More specifically, the obtaining unit 141 obtains the motion information acquired by the motion information acquiring unit 10 and stored in the motion information storage unit 131. For example, the obtaining unit 141 obtains the color image information, the distance image information, the speech recognition result, the skeleton information, and the like that are stored in the motion information storage unit 131 for each of the frames. In one example, the obtaining unit 141 obtains all the pieces of color image information, distance image information, and skeleton information that are related to a series of motions during the squat training of the subject.
On the basis of the predetermined conditions for the gradual motion, the judging unit 142 is configured to judge whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141 satisfies a certain condition included in the predetermined conditions. More specifically, the judging unit 142 judges whether the skeleton information of the subject for each of the frames obtained by the obtaining unit 141 satisfies conditions of the setting information stored in the setting information storage unit 132. For example, on the basis of the predetermined conditions used when squat training is performed, the judging unit 142 judges whether the squat performed by the subject indicated by the skeleton information obtained by the obtaining unit 141 satisfies a certain condition included in the predetermined conditions stored in the setting information storage unit 132.
In one example, as the predetermined conditions used when squat training is performed, the judging unit 142 uses at least one of the following in a series of motions of the squat: how much the knees are projecting with respect to the positions of the ankles; how much the heels are off the ground; how much the arms are projecting rearward; and how much the upper body is leaning forward.
For example, as illustrated in
The calculation using the difference between the z-axis coordinates of the two points described above is merely an example, and possible embodiments are not limited to this example. For instance, when the subject is positioned facing straight to the motion information acquiring unit 10, it is also acceptable to calculate the depth distance “d1” from a joint corresponding to a knee and a joint corresponding to a tarsus. Alternatively, it is also acceptable to calculate the depth distance “d1” from a joint corresponding to a knee and another position of a foot. Further, for example, depending on the orientation of the subject with respect to the motion information acquiring unit 10, the judging unit 142 may judge how much the knees are projecting, by using not only z-axis coordinates, but also x-axis coordinates and/or y-axis coordinates.
In this situation, the length of “d1” may be set arbitrarily by the user (e.g., the subject, a caregiver, or the like). In one example, the length of “d1” may be set according to the age and the gender of the subject, and according to whether the rehabilitation is performed after injury treatment or from the standpoint of preventive medicine or sports medicine. In this situation, the length of “d1” may be set as an absolute position or a relative position. In other words, “d1” may simply be set as a length from the position of an ankle to the position of a knee or may be set as a length to a position relative to a position of correct squat training. The various types of information used in the judging process described above are stored as the setting information in the setting information storage unit 132. In other words, the judging unit 142 reads the setting information from the setting information storage unit 132 and performs the judging process.
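The judging process for how much the knees project can be sketched as follows, for a subject facing the motion information acquiring unit 10 straight on. The joint coordinates in the usage and the default threshold are hypothetical examples, not prescribed settings:

```python
def knee_projection_ok(knee, tarsus, d1=0.10):
    """Judge the knee-projection condition from the z-axis (depth) coordinates
    of the knee joint and the tarsus joint: the depth distance between the two
    must not exceed the threshold d1 (m).

    knee, tarsus: (x, y, z) world coordinates; a knee projecting toward the
    sensor has a smaller z than the tarsus.
    """
    projection = tarsus[2] - knee[2]
    return projection <= d1
```

A judging unit could apply such a check to the skeleton information of every frame and flag the frames in which the condition is not satisfied.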
Further, the judging unit 142 also judges whether the subject is performing the gradual motion. If it has been determined that the subject is performing the gradual motion, the judging unit 142 judges whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141 satisfies the certain condition included in the predetermined conditions. For example, the judging unit 142 first judges whether the subject is performing squat training and, if it has been determined that the subject is performing squat training, the judging unit 142 judges whether the squat training is being performed with a correct posture.
In addition, by using the motion information of the subject who performs the gradual motion obtained by the obtaining unit 141, the judging unit 142 is capable of calculating other various types of information besides the distance between the joints. More specifically, the judging unit 142 is capable of calculating an angle of joints, a speed, acceleration, and the like. After that, the judging unit 142 judges whether the gradual motion (e.g., the squat training) is correct or not by using the calculated various types of information. Further, the judging unit 142 is capable of performing the judging process not only in a real-time manner while the subject is performing the squat training, but also by reading the motion information from the past, for example.
In that situation, for example, the subject himself/herself or an operator (e.g., a medical doctor or a physiotherapist) inputs an instruction request for a judging process via the input unit 120. The operator causes the obtaining unit 141 to obtain a desired piece of motion information by inputting the name of a subject, a name number, a date of training, and the like. The obtaining unit 141 obtains a corresponding piece of motion information of the subject for which the request was received via the input unit 120, from the motion information storage unit 131. When the process is performed in a real-time manner while the squat training is being performed, it is also acceptable to configure the motion information to be automatically obtained without receiving an operation from the operator.
Returning to the description of
Next, examples of displayed contents displayed by the display controlling unit 143 will be explained, with reference to
In this situation, as illustrated in the display region R1 in
Next, an example of a series of processes during a squat training evaluation will be explained, with reference to
Further, as illustrated in
In other words, when the subject presses the “ON” button illustrated in
In other words, on the basis of the motion information acquired by the motion information acquiring unit 10, the judging unit 142 judges whether the subject is performing squat training, by performing the judging process illustrated in
In this situation, the predetermined threshold value related to the conditions of the setting information used for evaluating the squat training is arbitrarily set. For example, as illustrated in
In other words, the judging unit 142 calculates the depth distance between the left ankle and the left knee and the depth distance between the right ankle and the right knee for each of all the frames acquired by the motion information acquiring unit 10 and judges whether each of the distances exceeds “8 cm” or not. For example, as illustrated in
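The frame-by-frame judging described above can be sketched as follows. This is an illustrative sketch, assuming each frame is represented as a dictionary mapping joint names to (x, y, z) coordinates; the joint keys and the 8 cm (0.08 m) threshold default are assumptions for illustration.

```python
def judge_squat_frames(frames, threshold=0.08):
    """For each frame, compare the depth (z-axis) distance between each
    ankle and the knee above it with a threshold (8 cm here, expressed
    in meters). Returns the indices of frames in which either knee
    exceeds the threshold, i.e., frames that would trigger a warning."""
    bad_frames = []
    for i, joints in enumerate(frames):
        for knee, ankle in (("left_knee", "left_ankle"),
                            ("right_knee", "right_ankle")):
            d = abs(joints[knee][2] - joints[ankle][2])
            if d > threshold:
                bad_frames.append(i)
                break  # one warning per frame is enough
    return bad_frames
```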
In contrast, as illustrated in
As explained above, the judging unit 142 may perform the judging process on the basis of the absolute positions of the sites of the subject, by using the directly-input value as the threshold value. However, the judging unit 142 may also perform a judging process on the basis of relative positions. For example, the judging unit 142 may perform a judging process by setting a state of the subject satisfying a predetermined condition as a reference state and comparing the difference from the reference state with a threshold value. In one example, while the subject is performing a squat (e.g., bending his/her knees) that satisfies the condition, the threshold value setting button included in the display region R2 is pressed. With this arrangement, the motion information processing apparatus 100 evaluates the squat training by using the state as a reference state and judging how much the squats performed thereafter deviate from the reference state. The tolerance amount for deviations from the reference state may arbitrarily be set.
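The relative-position judging described above can be sketched as follows, assuming the knee-ankle depth distance captured when the threshold value setting button was pressed is stored as a reference value; the function name and parameter names are illustrative assumptions.

```python
def deviates_from_reference(current_d1, reference_d1, tolerance):
    """Relative-position judging: `reference_d1` is the knee-ankle depth
    distance recorded while the subject performed a squat that satisfies
    the condition (the reference state). Later squats are judged by how
    much they deviate from that reference; True means the deviation
    exceeds the user-set tolerance and a warning would be issued."""
    return abs(current_d1 - reference_d1) > tolerance
```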
Further,
The display examples illustrated in
Further,
The examples of the displayed contents displayed under the control of the display controlling unit 143 have thus been explained. The examples described above are merely examples. The display controlling unit 143 is capable of causing various types of displayed contents to be displayed. For instance, in the description above, the example in which the subject is a single person is explained; however, there may be two or more subjects.
For example, as illustrated in
In this situation, the evaluation start button illustrated in
The obtaining unit 141 obtains the motion information for each of all the frames of both of the subjects and sends the obtained motion information to the judging unit 142. On the basis of the motion information obtained by the obtaining unit 141, the judging unit 142 judges the squat training of the squat subject 1 and the squat subject 2. The display controlling unit 143 displays judgment results of the squat training of the squat subject 1 and the squat subject 2 obtained by the judging unit 142. In this situation, as illustrated in
In the exemplary embodiment described above, the example is explained in which the squat training is evaluated in a real-time manner, while the squat training is being performed. However, possible embodiments are not limited to this example. For instance, squat training may be evaluated by using the motion information of squat training that was performed in the past.
Next, a process performed by the motion information processing apparatus 100 according to the first embodiment will be explained, with reference to
As illustrated in
If the evaluation function is on (step S102: Yes), the judging unit 142 judges whether the subject is performing a squat (step S103). If the subject is performing a squat (step S103: Yes), the judging unit 142 extracts coordinate information of relevant sites (e.g., the ankles and the knees) (step S104), calculates the distances (step S105), and judges whether the threshold value is exceeded (step S106).
If at least one of the distances exceeds the threshold value (step S106: Yes), the display controlling unit 143 displays a warning (step S107), and judges whether an instruction to turn off the evaluation function is received (step S108). On the contrary, if the threshold value is not exceeded (step S106: No), the display controlling unit 143 judges whether an instruction to turn off the evaluation function is received (step S108).
If an instruction to turn off the evaluation function is received (step S108: Yes), the motion information processing apparatus 100 resets the count values of the number of times of warning and the total count to indicate “0” times (step S109), and the process is ended. On the contrary, if no instruction to turn off the evaluation function is received (step S108: No), the process returns to step S101 so that the motion information processing apparatus 100 continues to obtain motion information of the subject. Also, when the evaluation function is off (step S102: No) or when the subject is not performing a squat (step S103: No), the process returns to step S101, so that the motion information processing apparatus 100 continues to obtain motion information of the subject.
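The control flow of steps S101 through S109 described above can be sketched as follows. This is a simplified illustrative sketch: the evaluation-function check of step S102 is folded into the turn-off handling, and the judging unit is stood in for by caller-supplied predicates; all names are assumptions for illustration.

```python
def run_evaluation(frames, is_squatting, exceeds_threshold,
                   turn_off_after=None):
    """Sketch of the evaluation loop: obtain a frame (S101), skip it
    unless the subject is performing a squat (S103), count a warning
    when the distance threshold is exceeded (S104-S107), and reset the
    warning count and the total count to zero when the evaluation
    function is turned off (S108-S109). Returns (warnings, total)."""
    warnings = 0
    total = 0
    for i, frame in enumerate(frames):
        if turn_off_after is not None and i >= turn_off_after:
            warnings = total = 0        # S109: reset the count values
            break
        if not is_squatting(frame):     # S103: not performing a squat
            continue
        total += 1
        if exceeds_threshold(frame):    # S106: threshold exceeded
            warnings += 1               # S107: display a warning
    return warnings, total
```

For example, feeding per-frame distances with a 0.08 m threshold counts one warning per frame that exceeds it, and turning the function off resets both counts to zero.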
The procedure in the process is described above using the example in which it is judged whether the subject is performing a squat (when the squat judging function is on). However, the motion information processing apparatus 100 according to the first embodiment may also perform an evaluation even if the squat judging function is off. In that situation, during the procedure in the process illustrated in
As explained above, according to the first embodiment, the obtaining unit 141 obtains the motion information of the subject who performs the squat motion. On the basis of the predetermined conditions for the squat motion, the judging unit 142 judges whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141 satisfies the certain condition included in the predetermined conditions. The controlling unit 140 exercises control so that the judgment result obtained by the judging unit 142 is provided as the notification. Accordingly, the motion information processing apparatus 100 according to the first embodiment is able to evaluate the squat motion, by only having the subject perform the squat motion in front of the motion information acquiring unit 10. The motion information processing apparatus 100 thus makes it possible to easily and conveniently evaluate the squat motion.
As a result, for example, even when the subject undergoes rehabilitation by himself/herself, the motion information processing apparatus 100 is able to prompt the subject to undergo the rehabilitation with a correct posture. Thus, the motion information processing apparatus 100 is able to compensate for a shortage of caregivers and makes it possible for the subject to undergo rehabilitation that is better than a certain level, without being conditioned by the level of skills of the caregiver.
Further, according to the first embodiment, the controlling unit 140 exercises control so that a notification indicating that the motion is normal is provided when the judging unit 142 has determined that the squat performed by the subject satisfies the certain condition included in the predetermined conditions and so that the warning is issued when the judging unit 142 has determined that the squat performed by the subject does not satisfy the certain condition included in the predetermined conditions. Consequently, the motion information processing apparatus 100 according to the first embodiment makes it possible to easily and conveniently evaluate the squat training, which is considered important from the standpoint of preventive medicine and sports medicine.
Further, according to the first embodiment, the judging unit 142 uses how much the knee is projecting with respect to the position of the ankle in the series of motions of a squat, as the predetermined condition used when a squat is performed. Consequently, the motion information processing apparatus 100 according to the first embodiment is able to perform the evaluation on the basis of the important motion among the series of motions of a squat and thus makes it possible to prompt the subject to have a more correct posture.
Further, according to the first embodiment, the judging unit 142 further judges whether the subject is performing a squat motion and, if it has been determined that the subject is performing a squat motion, the judging unit 142 judges whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141 satisfies the certain condition included in the predetermined conditions. Consequently, the motion information processing apparatus 100 according to the first embodiment prevents the judging process from being performed on motions that are irrelevant to the motion being the target of the evaluation. The motion information processing apparatus 100 thus makes it possible to provide clearer judgment results.
Further, according to the first embodiment, with respect to the predetermined conditions used by the judging unit 142, the input unit 120 receives the input operation for setting the predetermined threshold value used for judging whether the certain condition is satisfied or not. Consequently, the motion information processing apparatus 100 according to the first embodiment is able to set a fine-tuned threshold value for each subject and thus makes it possible to easily and conveniently perform the evaluation properly.
Further, according to the first embodiment, the input unit 120 receives at least one of the threshold value for the absolute position and the threshold value for the relative position with respect to the predetermined conditions. Consequently, the motion information processing apparatus 100 according to the first embodiment makes it possible to easily and conveniently perform the evaluation in a flexible manner.
Further, according to the first embodiment, the input unit 120 further receives the input operation for starting the judging process performed by the judging unit 142. Consequently, the motion information processing apparatus 100 according to the first embodiment is able to prevent the evaluation from being automatically performed on motions irrelevant to the evaluation and is able to perform the evaluation only on the motion being the target of the evaluation. Thus, the motion information processing apparatus 100 makes it possible to provide clearer evaluation results.
In the first embodiment described above, the example was explained in which the squat training is evaluated on the basis of the depth distance between the ankle and the knee. However, possible embodiments are not limited to this example. The squat training may be evaluated on the basis of other sites of the body. In a second embodiment, various judging processes performed on the basis of other sites of the body will be explained. The judging processes explained below may additionally be used together with the judging process that uses the depth distance between the ankle and the knee described in the first embodiment or may be used alone. The second embodiment is different from the first embodiment in the contents of the process performed by the judging unit 142. The second embodiment will be explained below while a focus is placed on the difference.
The judging unit 142 according to the second embodiment uses at least one of the following in a series of motions of a squat: how much the heels are off the ground; how much the arms are projecting rearward; and how much the upper body is leaning forward.
For example, as illustrated in
In this situation, it is possible to calculate “d2=|y15−y16|”, when the height-direction distance “d2” is the distance between “2o (x15,y15,z15)” and “2p (x16,y16,z16)”. Alternatively, it is possible to calculate “d2=|y19−y20|”, when the height-direction distance “d2” is the distance between “2s (x19,y19,z19)” and “2t (x20,y20,z20)”.
Further, for example, as illustrated in
In this situation, because the subject is positioned facing straight to the motion information acquiring unit 10, it is possible to calculate “d3=|z9−z11|”, when the depth distance “d3” is the distance between “2i (x9,y9,z9)” and “2k (x11,y11,z11)”. Alternatively, it is possible to calculate “d3=|z5−z7|”, when the depth distance “d3” is the distance between “2e (x5,y5,z5)” and “2g (x7,y7,z7)”. In other words, when the subject is positioned facing straight to the motion information acquiring unit 10, it is possible to calculate how much the arms are projecting rearward, by calculating the difference between the z-axis coordinates of the two points.
Further, for example, as illustrated in
In this situation, because the subject is positioned facing straight to the motion information acquiring unit 10, it is possible to calculate “d4=|z15−z4|”, when the depth distance “d4” is the distance between “2d (x4,y4,z4)” and “2o (x15,y15,z15)”. Alternatively, it is possible to calculate “d4=|z19−z4|”, when the depth distance “d4” is the distance between “2d (x4,y4,z4)” and “2s (x19,y19,z19)”. Further, the height-direction distance of the center of the buttocks is “y4”.
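The distances “d2”, “d3”, and “d4” described above are all single-axis distances between two joints and can be sketched together as follows, assuming the subject faces the sensor so that index 1 selects the height (y) axis and index 2 the depth (z) axis of an (x, y, z) tuple. The coordinate values and variable names below are illustrative assumptions only.

```python
def axis_distance(a, b, axis):
    """Distance between two joints along one axis: axis 1 (y) gives a
    height-direction distance such as d2 (heel off the ground); axis 2
    (z) gives depth-direction distances such as d3 (arm projecting
    rearward) and d4 (upper body leaning forward)."""
    return abs(a[axis] - b[axis])

# Illustrative (x, y, z) coordinates in meters:
ankle, tarsus  = (0.10, 0.12, 2.00), (0.10, 0.02, 2.05)
shoulder, hand = (0.30, 1.30, 2.00), (0.35, 0.90, 2.15)
buttocks       = (0.00, 0.80, 2.20)

d2 = axis_distance(ankle, tarsus, 1)     # heel off the ground (height)
d3 = axis_distance(shoulder, hand, 2)    # arm projecting rearward (depth)
d4 = axis_distance(buttocks, ankle, 2)   # upper body leaning forward (depth)
```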
The lengths of the distances described above may arbitrarily be set by the user (e.g., the subject, a caregiver, or the like). In one example, the lengths of “d2”, “d3”, and “d4” and the length of the height-direction distance of the center of the buttocks may arbitrarily be set according to the age and the gender of the subject and according to whether the rehabilitation is performed after an injury treatment or from the standpoint of preventive medicine or sports medicine. In this situation, the length of each of the distances may be set as an absolute position or a relative position, like in the first embodiment. The various types of information used in the judging process described above are stored as the setting information in the setting information storage unit 132. In other words, the judging unit 142 reads the setting information from the setting information storage unit 132 and performs the judging process.
The examples described above are merely examples, and possible embodiments are not limited to these examples. In other words, as the setting information used for evaluating the squat training, information other than the information illustrated in
The judging unit 142 according to the second embodiment evaluates the squat training by performing the various types of judging processes described above either each type alone or in combination as appropriate. Further, the display controlling unit 143 according to the second embodiment exercises control so as to notify that the motion is normal or so that the warning is issued, in accordance with the judgment result from the judging process.
As explained above, according to the second embodiment, as the predetermined conditions used when a squat is performed, the judging unit 142 uses at least one of the following in the series of motions of the squat: how much the heels are off the ground; how much the arms are projecting rearward; and how much the upper body is leaning forward. Consequently, the motion information processing apparatus 100 according to the second embodiment is able to evaluate the squat training on the basis of the entire motions of the subject who performs the squat training. The motion information processing apparatus 100 thus makes it possible to easily and conveniently evaluate the squat training with a high level of precision.
The first and the second embodiments have thus been explained. The present disclosure may be carried out in other various modes besides the first and the second embodiments described above.
In the first and the second embodiments described above, the examples are explained in which the subject who performs the squat training is positioned facing straight to the motion information acquiring unit 10. However, possible embodiments are not limited to these examples. For instance, the present disclosure is applicable to situations where the subject is not positioned facing straight to the motion information acquiring unit 10 (i.e., the subject is not oriented in the depth direction).
In this situation, in the example illustrated in
In this situation, the length of “d5” may arbitrarily be set by the user (e.g., the subject, a caregiver, or the like). In one example, the length of “d5” may arbitrarily be set according to the age and the gender of the subject and according to whether the rehabilitation is performed after an injury treatment or from the standpoint of preventive medicine or sports medicine. In this situation, the length of “d5” may be set as an absolute position or a relative position, like in the examples described above. The various types of information used in the judging process described above are stored as the setting information in the setting information storage unit 132. In other words, the judging unit 142 reads the setting information from the setting information storage unit 132 and performs the judging process.
The above example is explained by using the situation where the depth distance “d5” is calculated on the basis of the three-dimensional distance between the two points when the subject is not positioned facing straight to the motion information acquiring unit 10. However, possible embodiments are not limited to this example. For instance, it is also acceptable to correct the coordinate information of joints as if the subject were positioned facing straight, by using a sagittal plane or a coronal plane of the subject and to calculate the depth distance on the basis of the corrected coordinate information. In that situation, for example, the judging unit 142 calculates a coronal plane of the subject from the coordinates of the joints corresponding to the head, the waist, and the two shoulders of the subject. After that, the judging unit 142 calculates the angle between the calculated coronal plane and the sensor surface of the motion information acquiring unit 10 and corrects the coordinate information of the joints by using the calculated angle. In other words, the judging unit 142 corrects the coordinate information of the joints so that the coronal plane and the sensor surface become parallel to each other and calculates the depth information by using the values of the z-axis coordinates of the corrected coordinate information. Further, the judging unit 142 may further calculate a sagittal plane by using information of the coronal plane and perform the judging process by using information of the calculated sagittal plane.
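The coordinate correction described above can be sketched as follows. This is a simplified illustrative sketch that estimates the body orientation from the shoulder line alone and rotates all joints about the vertical (y) axis so that line becomes parallel to the sensor surface; the disclosure derives the coronal plane from the head, the waist, and the two shoulders, and all names below are assumptions.

```python
import math

def correct_to_frontal(joints, left_shoulder, right_shoulder):
    """Rotate joint coordinates about the vertical (y) axis so that the
    line between the two shoulders becomes parallel to the sensor
    surface (the x-axis). After correction, depth distances can be
    computed from the z-axis coordinates as if the subject were
    positioned facing straight to the sensor."""
    dx = right_shoulder[0] - left_shoulder[0]
    dz = right_shoulder[2] - left_shoulder[2]
    yaw = math.atan2(dz, dx)   # orientation of the shoulder line
    c, s = math.cos(yaw), math.sin(yaw)
    return {name: (x * c + z * s, y, -x * s + z * c)
            for name, (x, y, z) in joints.items()}
```

For a subject turned 45 degrees away, the corrected right-shoulder depth coordinate coincides with the left one, so z-axis differences between joints again measure pure depth.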
The above example is explained by using the situation where the squat training is evaluated by using the distance between the two points. However, possible embodiments are not limited to this example. For instance, the evaluation may be performed on the basis of the gravity point, angles, torsion of the body axis, and the position of the head of the subject, as well as the speed at which the training is performed, and the like.
For example, as illustrated in
Alternatively, the judging unit 142 judges whether the waist is lowered properly, by judging whether the joint “2d” corresponding to the center of the buttocks alternates between a value smaller than a predetermined value “c” and a value larger than the predetermined value “c” during the squat training. The example illustrated in
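The alternation check described above can be sketched as follows, assuming the per-frame height (y-axis) coordinate of the joint corresponding to the center of the buttocks is available as a sequence; the function name and threshold are illustrative assumptions.

```python
def count_squat_reps(waist_heights, c):
    """Count squat repetitions by detecting how often the height of the
    joint corresponding to the center of the buttocks drops below the
    predetermined value c and then rises back above it. One complete
    down-and-up crossing counts as one repetition."""
    reps = 0
    below = False
    for y in waist_heights:
        if not below and y < c:
            below = True        # the waist has been lowered past c
        elif below and y > c:
            below = False
            reps += 1           # the waist rose back above c: one squat
    return reps
```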
In the first and the second embodiments described above, the example is explained in which the judging unit 142 judges whether the gradual motion (e.g., the squat training) is properly performed. However, possible embodiments are not limited to this example. For instance, it is acceptable to provide a prevention function to prevent a fall. For example, the judging unit 142 may judge whether the subject is likely to fall during a squat training judging process and may alert the user when the subject is likely to fall.
Next, an example of the fall prevention function will be explained. For example, the judging unit 142 judges how much the upper body is tilting rearward by judging whether the angle “θ2” illustrated in
The example described above is merely an example, and the fall prevention function may be realized in any other embodiments. For example, the judging unit 142 may judge how much the upper body is tilting rearward by judging whether the frequency with which the height-direction coordinates of the right and left tarsi become higher than the height-direction coordinates of the ankles exceeds a predetermined level of frequency.
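One possible form of the fall-prevention check can be sketched as follows, assuming the rearward tilt of the upper body is measured as the angle of the waist-to-head line from the vertical in the y-z plane, with larger z taken as the subject's rear when the subject faces the sensor. The joint names, the sign convention, and the 20-degree limit are illustrative assumptions, not values from the disclosure.

```python
import math

def rearward_tilt_deg(waist, head):
    """Angle (in degrees) of the waist-to-head line from vertical in the
    y-z plane; positive when the head is behind the waist (larger z),
    i.e., when the upper body is tilting rearward."""
    dy = head[1] - waist[1]
    dz = head[2] - waist[2]
    return math.degrees(math.atan2(dz, dy))

def likely_to_fall(waist, head, limit_deg=20.0):
    """Alert when the rearward tilt exceeds the (illustrative) limit."""
    return rearward_tilt_deg(waist, head) > limit_deg
```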
In the first and the second embodiments described above, the example is explained in which the prescribed joints (e.g., the tarsi, the ankles, the knees, and the like) are used as the coordinates used for evaluating the gradual motion (e.g., the squat training). However, possible embodiments are not limited to this example. For instance, it is acceptable to evaluate a gradual motion (e.g., squat training) by using coordinates of a position that is set between predetermined joints.
In the first and the second embodiments described above, the squat training is used as a targeted gradual motion. However, possible embodiments are not limited to this example. For instance, shoulder flexion (e.g., raising arms forward) in joint range-of-motion training or push-up or sit-up exercise in sports medicine may be used as a targeted motion. In that situation, setting information is stored for each targeted motion, so that each motion is evaluated by using various types of threshold values.
As explained above, the motion information processing apparatus 100 of the present disclosure makes it possible to easily and conveniently evaluate the gradual motion. Consequently, the subject is able to use the motion information processing apparatus 100 safely, easily, and conveniently, even in a clinic or an assembly hall where few caregivers are available. Further, for example, it is also possible to improve motivation of subjects by saving data of each of the subjects in the motion information processing apparatus 100 and making the data public in a ranking format.
In that situation, for example, the storage unit 130 stores therein the number of times a warning was issued for each of the subjects so that the results are displayed in a ranking format in ascending order of the number of times of warning, either regularly or in response to a publication request from a subject. Further, the subjects may be listed in a ranking format in smaller categories according to the gender, the age, and the like of the subjects, for each of different types of motions.
In the first and the second embodiments described above, the example is explained in which the squat training is performed as the rehabilitation functional training. However, possible embodiments are not limited to this example. For instance, the present disclosure is applicable to a situation where a sports athlete or the like performs a squat as part of his/her training. In that situation, for example, the motion information processing apparatus 100 is installed in a training gym or the like so that the subject who performs a squat performs the squat training while using the motion information processing apparatus 100.
The first and the second embodiments above are explained using the example in which the operator switches on and off the judging function via the input unit 120. However, possible embodiments are not limited to this example. For instance, the judging function may be switched on and off as being triggered by a predetermined motion of the subject. In that situation, for example, the judging unit 142 starts and ends the process of judging whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141 satisfies the certain condition included in the predetermined conditions, on the basis of the predetermined motion of the subject. With this arrangement, for example, the subject is able to switch on and off at a start and an end of the judging process by simply performing the predetermined motion in front of the motion information acquiring unit 10. Thus, even if the subject is by himself/herself, he/she is able to have the squat training evaluated easily.
In the first to the third embodiments above, the examples in which the squat training is evaluated have been explained. Next, examples in which jump training is evaluated will be explained in fourth to sixth embodiments below. Like the motion information processing apparatus 100 explained above with reference to
The motion information processing apparatus 100a according to the fourth embodiment is configured as explained with reference to
The motion information storage unit 131a is configured to store therein the various types of information acquired by the motion information acquiring unit 10. More specifically, the motion information storage unit 131a stores therein the motion information generated by the motion information generating unit 14. Even more specifically, the motion information storage unit 131a stores therein the skeleton information corresponding to each of the frames generated by the motion information generating unit 14. In this situation, the motion information storage unit 131a may further store therein the color image information, the distance image information, and the speech recognition result that are output by the motion information generating unit 14, while keeping these pieces of information in correspondence with each of the frames.
Like the motion information storage unit 131 according to the first to the third embodiments, the motion information storage unit 131a stores therein, as illustrated in
For example, to continue the explanation with reference to
In this situation, in the motion information illustrated in
Further, as illustrated in
Further, as illustrated in
Further, the “color image information” and the “distance image information” included in the motion information contain image data in a binary format such as Bitmap, JPEG, or the like, or contain a link to such image data or the like. Further, instead of the recognition information described above, the “speech recognition result” included in the motion information may be audio data itself or a link to the recognition information or the audio data.
The setting information storage unit 132a is configured to store therein setting information used by a controlling unit 140a (explained later). More specifically, the setting information storage unit 132a stores therein the predetermined conditions used by the controlling unit 140a (explained later) for evaluating the quick motion performed by a rehabilitation subject. For example, the setting information storage unit 132a stores therein a condition used for judging whether the landing of a jump performed by a subject is performed with a correct posture. Next, a correct posture for the rehabilitation jump training will be explained, with reference to
In other words, as illustrated in
Accordingly, the setting information storage unit 132a is configured to store therein various types of setting information used for evaluating a correct jump landing posture such as that illustrated in
Next, details of the controlling unit 140a included in the motion information processing apparatus 100a will be explained. As illustrated in
The obtaining unit 141a is configured to obtain motion information of a subject who performs a quick motion. More specifically, the obtaining unit 141a obtains the motion information acquired by the motion information acquiring unit 10 and stored in the motion information storage unit 131a. For example, the obtaining unit 141a obtains the color image information, the distance image information, the speech recognition result, the skeleton information, and the like that are stored in the motion information storage unit 131a for each of the frames. In one example, the obtaining unit 141a obtains all the pieces of color image information, distance image information, and skeleton information that are related to a series of motions during the jump training of the subject.
On the basis of the predetermined conditions for the quick motion, the judging unit 142a is configured to judge whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141a satisfies a certain condition included in the predetermined conditions. More specifically, the judging unit 142a judges whether the skeleton information of the subject for each of the frames obtained by the obtaining unit 141a satisfies conditions of the setting information stored in the setting information storage unit 132a. For example, on the basis of the predetermined conditions used when jump training is performed, the judging unit 142a judges whether the jump performed by the subject indicated by the skeleton information obtained by the obtaining unit 141a satisfies a certain condition included in the predetermined conditions stored in the setting information storage unit 132a.
In one example, as the predetermined conditions used when jump training is performed, the judging unit 142a uses at least one of the degree of knock-kneed state and the degree of bow-legged state at the time of the landing of the jump. For example, as the predetermined condition used when a jump is performed, the judging unit 142a uses an opening amount of the legs at the time of the landing of the jump.
For example, as illustrated in
In this situation, the threshold value used for judging the distance "d6" may arbitrarily be set by the user (e.g., the subject, a caregiver, or the like). In one example, the threshold value for "d6" may arbitrarily be set according to the age and the gender of the subject and according to whether the rehabilitation is performed after injury treatment or from the standpoint of preventive medicine or sports medicine. Further, the threshold value for "d6" may be set as an absolute position or a relative position. In other words, the threshold value may simply be set as a length from the line between the buttock and the ankle to the position of the knee or may be set as a length relative to a position observed during correct jump training. The various types of information used in the judging process described above are stored as the setting information in the setting information storage unit 132a. In other words, the judging unit 142a reads the setting information from the setting information storage unit 132a and performs the judging process.
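The judgment using "d6" can be sketched as follows. This is a minimal illustration, not the apparatus's actual implementation: the function names, the representation of each joint as an (x, y, z) tuple in metres, and the 5 cm default threshold are assumptions.

```python
import math

def knee_offset(buttock, knee, ankle):
    # "d6": perpendicular distance from the knee joint to the line
    # joining the buttock and the ankle (joint names are illustrative).
    ux, uy, uz = (ankle[i] - buttock[i] for i in range(3))   # line direction
    vx, vy, vz = (knee[i] - buttock[i] for i in range(3))    # buttock -> knee
    # |u x v| / |u| gives the distance from the knee to the line.
    cx = uy * vz - uz * vy
    cy = uz * vx - ux * vz
    cz = ux * vy - uy * vx
    return math.sqrt(cx * cx + cy * cy + cz * cz) / math.sqrt(ux * ux + uy * uy + uz * uz)

def exceeds_threshold(buttock, knee, ankle, threshold=0.05):
    # A warning would be issued when "d6" exceeds the threshold (e.g. 5 cm).
    return knee_offset(buttock, knee, ankle) > threshold
```

In this sketch only the magnitude of the displacement is computed; distinguishing a medial (knock-kneed) from a lateral (bow-legged) displacement would additionally require the sign of the displacement relative to the body midline.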
The judging unit 142a further judges whether the subject is performing the quick motion. If it has been determined that the subject is performing the quick motion, the judging unit 142a judges whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141a satisfies the certain condition included in the predetermined conditions. For example, the judging unit 142a first judges whether the subject is performing jump training and, if so, judges whether the jump training is performed with a correct posture. In one example, the judging unit 142a judges whether the subject is performing jump training on the basis of changes in the y-axis coordinate values of predetermined joints of the subject. Next, an example of the judging process to judge whether jump training is being performed will be explained.
In other words, when the subject jumps and goes up, as illustrated in
In this situation, as illustrated in
After that, if it has been determined that the subject performed a jump, the judging unit 142a performs the jump training evaluation as illustrated in
In addition, by using the motion information of the subject who performs the quick motion obtained by the obtaining unit 141a, the judging unit 142a is capable of calculating other various types of information besides the distance between the joints. More specifically, the judging unit 142a is capable of calculating an angle of joints, a speed, acceleration, and the like. After that, the judging unit 142a judges whether the quick motion (e.g., the jump training) is correct or not by using the calculated various types of information. Further, the judging unit 142a is capable of performing the judging process not only in a real-time manner while the subject is performing the jump training, but also by reading the motion information from the past, for example.
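The speed and acceleration mentioned above can be derived from the per-frame joint coordinates by finite differences, as the following sketch illustrates (the function names and the 30 fps frame interval are assumptions, not features stated in the source):

```python
def joint_speed(positions, dt=1.0 / 30):
    # Frame-to-frame speed of one joint from successive (x, y, z)
    # coordinates; dt is the assumed frame interval in seconds.
    speeds = []
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        d = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        speeds.append(d / dt)
    return speeds

def joint_acceleration(positions, dt=1.0 / 30):
    # Acceleration as the change in speed between successive frames.
    v = joint_speed(positions, dt)
    return [(b - a) / dt for a, b in zip(v, v[1:])]
```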
In that situation, for example, the subject himself/herself or an operator (e.g., a medical doctor or a physical therapist) inputs an instruction request for a judging process via the input unit 120. The operator then causes the obtaining unit 141a to obtain a desired piece of motion information by inputting the name of a subject, a name number, a date of training, and the like. The obtaining unit 141a obtains the corresponding piece of motion information of the subject for which the request was received via the input unit 120, from the motion information storage unit 131a. When the process is performed in a real-time manner while the jump training is being performed, the motion information may also be configured to be obtained automatically, without receiving an operation from the operator.
Returning to the description of
Next, examples of displayed contents displayed by the display controlling unit 143a will be explained, with reference to
In this situation, as illustrated in the display region R3 in
Next, an example of a series of processes during a jump training evaluation will be explained, with reference to
Further, as illustrated in
In other words, when the subject presses the “ON” button illustrated in
In other words, on the basis of the motion information acquired by the motion information acquiring unit 10, the judging unit 142a judges whether the subject is performing jump training, by performing the judging process illustrated in
In this situation, the predetermined threshold value related to the conditions of the setting information used for evaluating the jump training is arbitrarily set. For example, as illustrated in
In other words, the judging unit 142a calculates the degree of knock-kneed state of the left knee or the degree of knock-kneed state of the right knee for each of all the frames acquired by the motion information acquiring unit 10 and judges whether each of the distances exceeds “5 cm” or not. For example, as illustrated in
In contrast, as illustrated in
The example in
As explained above, the judging unit 142a may perform the judging process on the basis of the absolute positions of the sites of the subject, by using the directly-input value as the threshold value. However, the judging unit 142a may also perform the judging process on the basis of relative positions. For example, the judging unit 142a may perform the judging process by setting a state of the subject satisfying a predetermined condition as a reference state and comparing the difference from the reference state with a threshold value. In one example, when the subject has performed a jump that satisfies the condition, or when it is presumed that the subject has performed such a jump, the threshold value setting button included in the display region R4 is pressed. In other words, for example, the subject sets a value corresponding to the state where the degrees of knock-kneed state and bow-legged state are low as the threshold value. Alternatively, the controlling unit 140a may automatically set information about a relative position as a reference position. In one example, the controlling unit 140a sets the degrees of knock-kneed state and bow-legged state obtained when the judging unit 142a determined that the subject performed a jump satisfying the condition, as reference degrees. Alternatively, when the judging unit 142a presumes that the subject performed a jump satisfying the condition (i.e., when the subject is in a state where the jump training condition is satisfied while the jump judging function is off), the controlling unit 140a sets the degrees of knock-kneed state and bow-legged state corresponding to that state as reference degrees. With any of these arrangements, the motion information processing apparatus 100a evaluates the jump training by using that state as a reference state and judging how much the jumps performed thereafter deviate from the reference state.
The tolerance amount for deviations from the reference state may arbitrarily be set.
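The relative-position mode described above can be sketched as follows: each measured degree is compared against the previously captured reference degree, and the jump passes only while every deviation stays within the tolerance amount (a minimal sketch; the function name and the representation of the degrees as plain numbers are assumptions):

```python
def within_tolerance(current, reference, tolerance):
    # Relative-position judging: compare each measured degree
    # (e.g. knock-kneed and bow-legged degrees) against the value
    # captured in the reference state; the jump is acceptable only
    # while every deviation stays within the tolerance amount.
    return all(abs(c - r) <= tolerance for c, r in zip(current, reference))
```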
Further,
The display examples illustrated in
Further,
The examples of the displayed contents displayed under the control of the display controlling unit 143a have thus been explained. The examples described above are merely examples. The display controlling unit 143a is capable of causing various types of displayed contents to be displayed. For instance, in the description above, the example in which the subject is a single person is explained; however, there may be two or more subjects.
For example, as illustrated in
In this situation, the evaluation start button illustrated in
The obtaining unit 141a obtains the motion information for each of all the frames of both of the subjects and sends the obtained motion information to the judging unit 142a. On the basis of the motion information obtained by the obtaining unit 141a, the judging unit 142a judges the jump training of the jump subject 1 and the jump subject 2. The display controlling unit 143a displays judgment results of the jump training of the jump subject 1 and the jump subject 2 obtained by the judging unit 142a. In this situation, as illustrated in
In the exemplary embodiment described above, the example is explained in which the jump training is evaluated in a real-time manner, while the jump training is being performed. However, possible embodiments are not limited to this example. For instance, jump training may be evaluated by using the motion information of jump training that was performed in the past.
Next, a process performed by the motion information processing apparatus 100a according to the fourth embodiment will be explained, with reference to
As illustrated in
If the evaluation function is on (step S202: Yes), the judging unit 142a judges whether the subject has just performed a jump (step S203). If the subject has just performed a jump (step S203: Yes), the judging unit 142a extracts coordinate information of relevant sites (e.g., the buttocks, the knees, and the ankles) (step S204), calculates the distances (step S205), and judges whether the threshold value is exceeded (step S206).
If at least one of the distances exceeds the threshold value (step S206: Yes), the display controlling unit 143a displays a warning (step S207), and judges whether an instruction to turn off the evaluation function is received (step S208). On the contrary, if the threshold value is not exceeded (step S206: No), the display controlling unit 143a judges whether an instruction to turn off the evaluation function is received (step S208).
If an instruction to turn off the evaluation function is received (step S208: Yes), the motion information processing apparatus 100a resets the count values of the number of times of warning and the total count to indicate “0” times (step S209), and the process is ended. On the contrary, if no instruction to turn off the evaluation function is received (step S208: No), the process returns to step S201 so that the motion information processing apparatus 100a continues to obtain motion information of the subject. Also, when the evaluation function is off (step S202: No) or when the subject is not performing a jump (step S203: No), the process returns to step S201, so that the motion information processing apparatus 100a continues to obtain motion information of the subject.
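The evaluation loop of steps S201 through S209 can be paraphrased in code as follows. This is a sketch only: the callables `has_landed` and `measure_offsets` are hypothetical stand-ins for the judging unit's jump detection and distance calculation, and the frame representation as a dictionary is an assumption.

```python
def evaluate_jump_training(frames, has_landed, measure_offsets, threshold):
    # One pass over the obtained frames (S201). Returns the number of
    # warnings issued and the total number of evaluated jumps; in the
    # apparatus both counts are reset to zero when the evaluation
    # function is switched off (S209).
    warnings, total = 0, 0
    for frame in frames:                       # S201: obtain motion information
        if not frame.get("evaluation_on"):     # S202: evaluation function off
            continue
        if not has_landed(frame):              # S203: no jump just performed
            continue
        total += 1
        # S204-S206: extract relevant joints, compute distances, compare.
        if any(d > threshold for d in measure_offsets(frame)):
            warnings += 1                      # S207: a warning is displayed
    return warnings, total
```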
The procedure in the process is described above using the example in which it is judged whether the subject is performing a jump (when the jump judging function is on). However, the motion information processing apparatus 100a according to the fourth embodiment may also perform an evaluation even if the jump judging function is off. In that situation, during the procedure in the process illustrated in
Next, details of the process at step S203 will be explained. As illustrated in
If the difference is equal to or larger than the threshold value A (step S303: Yes), the judging unit 142a obtains the coordinate information of the predetermined joint (step S304). Subsequently, the judging unit 142a calculates the difference between the height-direction coordinate in the current frame and the height-direction coordinate in the immediately-preceding frame, with respect to the predetermined joint (step S305), and judges whether the difference is equal to or smaller than the negative threshold value B (step S306).
If the difference is equal to or smaller than the threshold value B (step S306: Yes), the judging unit 142a determines that the subject has just performed a jump (step S307) and performs the process at step S204. On the contrary, if the difference at step S303 is not equal to or larger than the threshold value A (step S303: No), the judging unit 142a returns to step S301. If the difference at step S306 is not equal to or smaller than the threshold value B (step S306: No), the judging unit 142a returns to step S304.
During the process at steps S304 through S306, there is a possibility that the process enters a loop of these steps if the condition is not satisfied in the judging process at step S306. To cope with this situation, it is also acceptable to exercise control in such a manner that the process returns to step S301 if a predetermined period of time has elapsed. For example, it is acceptable to exercise control in such a manner that the process returns to step S301 if the condition at step S306 is not satisfied within two seconds after passing through step S304.
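The jump-detection procedure of steps S301 through S307, including the timeout guard just described, can be sketched as follows (the concrete threshold values and the two-second timeout expressed as 60 frames at 30 fps are assumptions):

```python
def detect_landing(heights, up_threshold=0.05, down_threshold=0.05,
                   timeout_frames=60):
    # heights: height-direction (y-axis) coordinate of a predetermined
    # joint in each frame. A rise of at least `up_threshold` between
    # consecutive frames arms the detector (S301-S303); a subsequent
    # drop of at least `down_threshold` confirms the jump (S304-S307).
    # Returns the frame index of the detected landing, or None.
    armed_at = None
    for i in range(1, len(heights)):
        diff = heights[i] - heights[i - 1]
        if armed_at is None:
            if diff >= up_threshold:           # S303: subject going up
                armed_at = i
        else:
            if diff <= -down_threshold:        # S306: subject coming down
                return i                       # S307: jump detected
            if i - armed_at >= timeout_frames:
                armed_at = None                # timeout: return to S301
    return None
```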
As explained above, according to the fourth embodiment, the obtaining unit 141a obtains the motion information of the subject who performs the jump motion. On the basis of the predetermined conditions for the jump motion, the judging unit 142a judges whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141a satisfies the certain condition included in the predetermined conditions. The controlling unit 140a exercises control so that the judgment result obtained by the judging unit 142a is provided as a notification. Accordingly, the motion information processing apparatus 100a according to the fourth embodiment is able to evaluate the jump motion, by only having the subject perform the jump motion in front of the motion information acquiring unit 10. The motion information processing apparatus 100a thus makes it possible to easily and conveniently evaluate the jump motion.
As a result, for example, even when the subject undergoes rehabilitation by himself/herself, the motion information processing apparatus 100a is able to prompt the subject to undergo the rehabilitation with a correct posture. The motion information processing apparatus 100a is thus able to compensate for a shortage of caregivers and makes it possible for the subject to undergo rehabilitation of at least a certain level of quality, regardless of the skill level of the caregiver.
Further, according to the fourth embodiment, the controlling unit 140a exercises control so as to notify that the motion is normal when the judging unit 142a has determined that the jump performed by the subject satisfies the certain condition included in the predetermined conditions and so that the warning is issued when the judging unit 142a has determined that the jump performed by the subject does not satisfy the certain condition included in the predetermined conditions. Consequently, the motion information processing apparatus 100a according to the fourth embodiment makes it possible to easily and conveniently evaluate the jump training, which is considered important from the standpoint of preventive medicine and sports medicine.
Further, according to the fourth embodiment, the judging unit 142a uses at least one of the degree of knock-kneed state and the degree of bow-legged state at the time of the landing of the jump, as the predetermined condition used when a jump is performed. Consequently, the motion information processing apparatus 100a according to the fourth embodiment is able to perform the evaluation on the basis of the important motion at the time of the landing of the jump and thus makes it possible to prompt the subject to perform a jump with a more correct posture.
Further, according to the fourth embodiment, the judging unit 142a uses the opening amount of the legs at the time of the landing of the jump, as the predetermined condition used when a jump is performed. Consequently, the motion information processing apparatus 100a according to the fourth embodiment is able to easily and conveniently judge the degree of knock-kneed state and the degree of bow-legged state at the time of the landing of the jump of the subject and thus makes it possible to easily and conveniently evaluate the jump training.
Further, according to the fourth embodiment, the judging unit 142a further judges whether the subject is performing a jump motion, and if it has been determined that the subject is performing a jump motion, the judging unit 142a judges whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141a satisfies the certain condition included in the predetermined conditions. Consequently, the motion information processing apparatus 100a according to the fourth embodiment prevents the judging process from being performed on motions that are irrelevant to the motion being the target of the evaluation. The motion information processing apparatus 100a thus makes it possible to provide clearer judgment results.
Further, according to the fourth embodiment, with respect to the predetermined conditions used by the judging unit 142a, the input unit 120 receives the input operation for setting the predetermined threshold value used for judging whether the certain condition is satisfied or not. Consequently, the motion information processing apparatus 100a according to the fourth embodiment is able to set a fine-tuned threshold value for each subject and thus makes it possible to easily and conveniently perform the evaluation properly.
Further, according to the fourth embodiment, the input unit 120 receives at least one of the threshold value for the absolute position and the threshold value for the relative position with respect to the predetermined conditions. Consequently, the motion information processing apparatus 100a according to the fourth embodiment makes it possible to easily and conveniently perform the evaluation in a flexible manner.
Further, according to the fourth embodiment, the input unit 120 further receives the input operation for starting the judging process performed by the judging unit 142a. Consequently, the motion information processing apparatus 100a according to the fourth embodiment is able to prevent the evaluation from being automatically performed on motions irrelevant to the evaluation and is able to perform the evaluation only on the motion being the target of the evaluation. Thus, the motion information processing apparatus 100a makes it possible to provide clearer evaluation results.
In the fourth embodiment described above, the example in which the jump training is evaluated on the basis of the degree of knock-kneed state at the time of the landing of the jump was explained. However, possible embodiments are not limited to this example. For instance, it is acceptable to evaluate jump training on the basis of the degree of bow-legged state at the time of the landing of a jump. As a fifth embodiment, an example in which jump training is evaluated on the basis of a degree of bow-legged state at the time of the landing will be explained. The judging processes explained below may be used together with the jump training judging process that uses the degree of knock-kneed state at the time of the landing explained in the fourth embodiment or may be used alone. The fifth embodiment is different from the fourth embodiment in the contents of the process performed by the judging unit 142a. The fifth embodiment will be explained below while a focus is placed on the difference.
The judging unit 142a according to the fifth embodiment performs a judging process by using a degree of bow-legged state at the time of the landing of a jump.
In this situation, the threshold value used for judging the distance "d7" may arbitrarily be set by the user (e.g., the subject, a caregiver, or the like). In one example, the threshold value for "d7" may arbitrarily be set according to the age and the gender of the subject and according to whether the rehabilitation is performed after injury treatment or from the standpoint of preventive medicine or sports medicine. Further, the threshold value for "d7" may be set as an absolute position or a relative position. In other words, the threshold value may simply be set as a length from the line between the buttock and the ankle to the position of the knee or may be set as a length relative to a position observed during correct jump training. The various types of information used in the judging process described above are stored as the setting information in the setting information storage unit 132a. In other words, the judging unit 142a reads the setting information from the setting information storage unit 132a and performs the judging process.
The judging unit 142a further judges whether the subject is performing the quick motion. If it has been determined that the subject is performing the quick motion, the judging unit 142a judges whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141a satisfies the certain condition included in the predetermined conditions. For example, the judging unit 142a first judges whether the subject is performing jump training and, if so, judges whether the jump training is performed with a correct posture.
The judging unit 142a according to the fifth embodiment evaluates the jump training by performing the various types of judging processes described above either each type alone or in combination as appropriate. Further, the display controlling unit 143a according to the fifth embodiment exercises control so as to notify that the motion is normal or so that the warning is issued, in accordance with the judgment result from the judging process.
As explained above, according to the fifth embodiment, the judging unit 142a performs the judging process by using the degree of bow-legged state at the time of the landing of the jump. Consequently, the motion information processing apparatus 100a according to the fifth embodiment is able to evaluate the jump training on the basis of whether the subject is in a bow-legged state, in addition to whether the subject is in a knock-kneed state, at the time of the landing. The motion information processing apparatus 100a thus makes it possible to easily and conveniently evaluate the jump training with a high level of precision.
The fourth and the fifth embodiments have thus been explained. The present disclosure may be carried out in other various modes besides the fourth and the fifth embodiments described above.
In the fourth and the fifth embodiments described above, the examples are explained in which the subject who performs the jump training is positioned facing straight to the motion information acquiring unit 10. However, possible embodiments are not limited to these examples. For instance, the present disclosure is applicable to situations where the subject is not positioned facing straight to the motion information acquiring unit 10 (i.e., the subject is not oriented in the depth direction).
For example, when the subject is standing vertically straight, the angle θ formed by a vector CR, expressed as an outer product of a vector AR connecting "2m" to "2n" and a vector BR connecting "2n" to "2o", and a vector CL, expressed as an outer product of a vector AL connecting "2q" to "2r" and a vector BL connecting "2r" to "2s", is "approximately 180 degrees" as illustrated in
Accordingly, the judging unit 142a calculates these vectors from the motion information acquired by the obtaining unit 141a and calculates the angle θ from the calculated vectors. After that, on the basis of the calculated angle θ, the judging unit 142a judges whether the jump training of the subject was performed with a correct posture.
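The calculation of the angle θ from the outer products can be sketched as follows. The joint argument names are assumptions standing in for the joints the patent labels "2m" through "2s", and the sketch assumes the knees are at least slightly bent (for perfectly collinear joints the outer products vanish and the angle is undefined):

```python
import math

def _sub(p, q):
    # Vector from point p to point q.
    return (q[0] - p[0], q[1] - p[1], q[2] - p[2])

def _cross(u, v):
    # Outer (cross) product u x v.
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def leg_plane_angle(r_hip, r_knee, r_ankle, l_hip, l_knee, l_ankle):
    # Angle theta (in degrees) between CR = AR x BR (right leg) and
    # CL = AL x BL (left leg); approximately 180 degrees when the
    # subject stands straight with slightly bent, symmetric knees.
    cr = _cross(_sub(r_hip, r_knee), _sub(r_knee, r_ankle))
    cl = _cross(_sub(l_hip, l_knee), _sub(l_knee, l_ankle))
    dot = sum(a * b for a, b in zip(cr, cl))
    norm = (math.sqrt(sum(a * a for a in cr))
            * math.sqrt(sum(a * a for a in cl)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```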
Further, it is also acceptable to judge the degree of knock-kneed state or the degree of bow-legged state by calculating how much the knees are positioned apart from each other on the basis of a distance (an inter-knee distance) between the right knee and the left knee.
The judging unit 142a calculates the inter-knee distance “d8” in each of the frames and judges the degree of knock-kneed state and the degree of bow-legged state at the time of the landing of the jump, by comparing the values of “d8” with a predetermined threshold value. For example, if “d8” is smaller than the predetermined threshold value, the judging unit 142a determines that the subject is in a knock-kneed state. On the contrary, if “d8” is larger than the predetermined threshold value, the judging unit 142a determines that the subject is in a bow-legged state.
In this situation, the threshold value used for judging a knock-kneed state and a bow-legged state by using "d8" may arbitrarily be set by the user (e.g., the subject, a caregiver, or the like). In one example, the threshold value for "d8" may arbitrarily be set for a knock-kneed state and for a bow-legged state, according to the age and the gender of the subject and according to whether the rehabilitation is performed after injury treatment or from the standpoint of preventive medicine or sports medicine. Further, the threshold value for "d8" may be set for a knock-kneed state and for a bow-legged state, as an absolute position or a relative position, like in the examples described above. The various types of information used in the judging process described above are stored as the setting information in the setting information storage unit 132a. In other words, the judging unit 142a reads the setting information from the setting information storage unit 132a and performs the judging process.
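The inter-knee-distance judgment can be sketched as follows, with separate threshold values for the knock-kneed state and the bow-legged state as described above (the function name and the concrete threshold values are illustrative assumptions):

```python
def classify_landing(right_knee, left_knee, lower=0.15, upper=0.35):
    # "d8": three-dimensional distance between the right and left knee
    # joints at the time of the landing. A small d8 suggests a
    # knock-kneed state; a large d8 suggests a bow-legged state.
    d8 = sum((a - b) ** 2 for a, b in zip(right_knee, left_knee)) ** 0.5
    if d8 < lower:
        return "knock-kneed"
    if d8 > upper:
        return "bow-legged"
    return "normal"
```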
The above example is explained by using the situations where it is judged whether the subject is in a knock-kneed state or a bow-legged state on the basis of the vectors and the angles and where the inter-knee distance “d8” is calculated on the basis of the three-dimensional distance between the two points, when the subject is not positioned facing straight to the motion information acquiring unit 10. However, possible embodiments are not limited to this example. For instance, it is also acceptable to correct the coordinate information of joints as if the subject was positioned facing straight, by using a sagittal plane or a coronal plane of the subject and to calculate an inter-knee distance on the basis of the corrected coordinate information. In that situation, for example, the judging unit 142a calculates a coronal plane of the subject from the coordinates of the joints corresponding to the head, the waist, and the two shoulders of the subject. After that, the judging unit 142a calculates the angle between the calculated coronal plane and the sensor surface of the motion information acquiring unit 10 and corrects the coordinate information of the joints by using the calculated angle. In other words, the judging unit 142a corrects the coordinate information of the joints so that the coronal plane and the sensor surface become parallel to each other and calculates inter-knee information by using values of the corrected coordinate information. Further, the judging unit 142a may further calculate a sagittal plane by using information of the coronal plane and perform the judging process by using information of the calculated sagittal plane.
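The coordinate correction described above amounts to rotating the joint coordinates about the vertical axis until the coronal plane becomes parallel to the sensor surface. A minimal sketch follows; in the apparatus the facing angle would be derived from the head, waist, and shoulder joints, whereas here it is simply passed in as a parameter (an assumption made for brevity):

```python
import math

def face_sensor(joints, facing_angle):
    # Rotate every joint about the vertical (y) axis by -facing_angle,
    # so that the subject's coronal plane becomes parallel to the
    # sensor surface. joints: {name: (x, y, z)}; facing_angle is the
    # angle (radians) between the coronal plane and the sensor surface.
    c, s = math.cos(-facing_angle), math.sin(-facing_angle)
    return {name: (c * x + s * z, y, -s * x + c * z)
            for name, (x, y, z) in joints.items()}
```

Distances such as the inter-knee distance could then be evaluated on the corrected coordinates as if the subject were facing the sensor straight on.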
The above examples are explained by using the situation where the jump training is evaluated by using the distances. However, possible embodiments are not limited to these examples. For instance, the evaluation may be performed on the basis of the center of gravity, angles, torsion of the body axis, and the position of the head of the subject, as well as the speed at which the training is performed, and the like. Further, for example, the evaluation may be performed on the basis of positions of the arms, swings of the arms, or the like.
In the fourth and the fifth embodiments described above, the example is explained in which the prescribed joints (e.g., the tarsi, the ankles, and the knees) are used as the coordinates used for evaluating the quick motion (e.g., the jump training). However, possible embodiments are not limited to this example. For instance, it is acceptable to evaluate a quick motion (e.g., jump training) by using the coordinates of a position that is set between predetermined joints.
In the fourth and the fifth embodiments described above, the example is explained in which the jump training is used as a targeted quick motion. However, possible embodiments are not limited to this example. For instance, a pivot motion may be used as a targeted motion. In that situation, setting information is stored for each targeted motion, so that each motion is evaluated by using various types of threshold values.
As explained above, the motion information processing apparatus 100a of the present disclosure makes it possible to easily and conveniently evaluate the quick motion. Consequently, the subject is able to use the motion information processing apparatus 100a safely, easily, and conveniently, even in a clinic or an assembly hall where few caregivers are available. Further, for example, it is also possible to improve motivation of subjects by saving data of each of the subjects in the motion information processing apparatus 100a and making the data public in a ranking format.
In that situation, for example, the storage unit 130a stores therein the number of times a warning was issued for each of the subjects so that the results are displayed in a ranking format in ascending order of the number of times of warning, either regularly or in response to a publication request from a subject. Further, the subjects may be listed in a ranking format in smaller categories according to the gender, the age, and the like of the subjects, for each of different types of motions.
In the fourth and the fifth embodiments described above, the example is explained in which the jump training is performed as the rehabilitation functional training. However, possible embodiments are not limited to this example. For instance, the present disclosure is applicable to a situation where a sports athlete or the like performs a jump as part of his/her training. In that situation, for example, the motion information processing apparatus 100a is installed in a training gym or the like so that the subject who performs a jump performs the jump training while using the motion information processing apparatus 100a.
The fourth and the fifth embodiments above are explained using the example in which the operator switches the judging function on and off via the input unit 120. However, possible embodiments are not limited to this example. For instance, the judging function may be switched on and off, triggered by a predetermined motion of the subject. In that situation, for example, the judging unit 142a starts and ends the process of judging whether the motion of the subject indicated by the motion information obtained by the obtaining unit 141a satisfies the certain condition included in the predetermined conditions, on the basis of the predetermined motion of the subject. With this arrangement, for example, the subject is able to start and end the judging process by simply performing the predetermined motion in front of the motion information acquiring unit 10. Thus, even if the subject is by himself/herself, he/she is able to have the jump training evaluated easily.
The first to the sixth embodiments have thus been explained. The present disclosure may be carried out in other various modes besides the first to the sixth embodiments described above.
In the first to the sixth embodiments above, the motion information processing apparatus 100 configured to evaluate the squat training is described in the first to the third embodiments, whereas the motion information processing apparatus 100a configured to evaluate the jump training is described in the fourth to the sixth embodiments. These embodiments described above may be performed by mutually-different motion information processing apparatuses. Alternatively, a single motion information processing apparatus may evaluate both the squat training and the jump training.
In that situation, for example, in the motion information processing apparatus, the judging unit judges whether the motion of the subject is a squat motion or a jump motion on the basis of the motion information obtained by the obtaining unit. After that, if it has been determined that the motion is a squat motion, for example, the judging unit judges whether the squat performed by the subject satisfies the certain condition included in the predetermined conditions, on the basis of the predetermined conditions that are used when squat training is performed and are stored in the setting information storage unit. Alternatively, if it has been determined that the motion is a jump motion, for example, the judging unit judges whether the jump performed by the subject satisfies the certain condition included in the predetermined conditions, on the basis of the predetermined conditions that are used when jump training is performed and are stored in the setting information storage unit. After that, the controlling unit provides a judgment result as a notification.
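The classification step described above might be sketched as follows. The joint names, coordinate convention (y-axis pointing upward), and thresholds are illustrative assumptions only; the actual judging unit would apply the predetermined conditions stored in the setting information storage unit.

```python
# Illustrative sketch: classifying a motion as a squat or a jump before
# applying the corresponding stored conditions. Each frame is assumed to
# map joint names to (x, y, z) tuples, with the y-axis pointing upward.
# The threshold values are assumptions, not values from the embodiments.

def classify_motion(frames, rise_threshold=0.15, drop_threshold=0.15):
    """Return 'jump', 'squat', or 'unknown' for a sequence of frames."""
    foot_ys = [min(f["foot_left"][1], f["foot_right"][1]) for f in frames]
    hip_ys = [f["hip_center"][1] for f in frames]

    # Jump: both feet rise clearly above their starting height.
    if max(foot_ys) - foot_ys[0] > rise_threshold:
        return "jump"
    # Squat: the hips descend noticeably while the feet stay grounded.
    if hip_ys[0] - min(hip_ys) > drop_threshold:
        return "squat"
    return "unknown"
```

Once the motion is classified, the judging unit can select the squat conditions or the jump conditions accordingly and provide the judgment result as a notification.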
In the first to the third embodiments described above, the examples are explained in which the motion information processing apparatus 100 obtains the motion information of the subject who performs the squat training and displays the evaluation result. In the fourth to the sixth embodiments described above, the examples are explained in which the motion information processing apparatus 100a obtains the motion information of the subject who performs the jump training and displays the evaluation result. However, possible embodiments are not limited to these examples. For instance, these processes may be performed by a service providing apparatus that is provided in a network.
For example, the service providing apparatus 200 is configured to provide each of the terminal devices 300 with the same processes as those performed by the motion information processing apparatus 100 illustrated in
Further, for example, the service providing apparatus 200 is configured to provide each of the terminal devices 300 with the same processes as those performed by the motion information processing apparatus 100a, as a service. In other words, the service providing apparatus 200 includes functional units equivalent to the obtaining unit 141a, the judging unit 142a, and the display controlling unit 143a. The functional unit equivalent to the obtaining unit 141a is configured to obtain the motion information of the subject who performs a quick motion. On the basis of the predetermined conditions for the quick motion, the functional unit equivalent to the judging unit 142a is configured to judge whether the motion of the subject indicated by the motion information obtained by the functional unit equivalent to the obtaining unit 141a satisfies a certain condition included in the predetermined conditions. Further, the functional unit equivalent to the display controlling unit 143a is configured to exercise control so that a judgment result obtained by the functional unit equivalent to the judging unit 142a is provided as a notification. The network 5 may be wireless or wired and may be configured by using any type of communication network, such as the Internet or a Wide Area Network (WAN).
When the service is provided by the service providing apparatus 200 as illustrated in
Further, the motion information processing apparatus 100 according to the first to the third embodiments described above and the motion information processing apparatus 100a according to the fourth to the sixth embodiments described above are merely examples, and the constituent elements thereof may be integrated and separated as necessary. For example, it is acceptable to integrate the motion information storage unit 131 and the setting information storage unit 132 together or to separate the judging unit 142 into a calculating unit configured to calculate the distances and the like and a comparing unit configured to compare the calculated values with the threshold values. Further, for example, it is also acceptable to integrate the motion information storage unit 131a and the setting information storage unit 132a together or to separate the judging unit 142a into a calculating unit configured to calculate the distances and the like and a comparing unit configured to compare the calculated values with the threshold values.
Rule information and a recommended caregiving state for the rehabilitation described in the first to the seventh embodiments above do not necessarily have to be those that are defined by the Japanese Orthopaedic Association, but may be those that are defined by other various organizations. For example, it is acceptable to use various types of regulations and rules that are defined by the “International Society of Orthopaedic Surgery and Traumatology (SICOT)”, “the American Academy of Orthopaedic Surgeons (AAOS)”, “the European Orthopaedic Research Society (EORS)”, “the International Society of Physical and Rehabilitation Medicine (ISPRM)”, or “the American Academy of Physical Medicine and Rehabilitation (AAPM&R)”.
As an example, the AAOS defines various rules for knee exercises using a wall squat. The motion information processing apparatus of the present disclosure may use these rules. In other words, the motion information processing apparatus may use any of the following as a rule for a wall squat performed by a subject: “ensure that the buttocks do not slide lower than the knees”; and “ensure that the knees do not move forward over the toes”.
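As a minimal sketch, the two wall-squat rules quoted above could be checked per frame as below. The joint names and the coordinate convention (y-axis pointing up, z-axis pointing toward the subject's front) are assumptions for illustration; the actual rule representation stored in the setting information storage unit may differ.

```python
# Hypothetical sketch of checking the two AAOS wall-squat rules quoted above.
# Each frame is assumed to map joint names to (x, y, z) tuples, with the
# y-axis pointing up and the z-axis pointing toward the subject's front.

def check_wall_squat(frame):
    """Return a dict of rule name -> bool for one frame of motion data."""
    results = {}
    # Rule 1: the buttocks must not slide lower than the knees.
    hip_y = frame["hip_center"][1]
    knee_y = min(frame["knee_left"][1], frame["knee_right"][1])
    results["buttocks_not_below_knees"] = hip_y >= knee_y
    # Rule 2: the knees must not move forward over the toes.
    knee_z = max(frame["knee_left"][2], frame["knee_right"][2])
    toe_z = max(frame["foot_left"][2], frame["foot_right"][2])
    results["knees_not_over_toes"] = knee_z <= toe_z
    return results
```

A judging unit could evaluate such checks on every frame of the obtained motion information and notify the subject as soon as either rule is violated.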
The motion information processing apparatuses of the present disclosure are able to use various types of rules and regulations defined by various organizations not only for the wall squat mentioned above, but also for other exercises and training programs related to squats and other exercises and training programs related to jumps.
As explained above, according to the first to the seventh embodiments, the motion information processing apparatuses and methods of the present disclosure make it possible to easily and conveniently evaluate the motion of at least one of the squat and the jump.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---|
2013-138281 | Jul. 1, 2013 | JP | national |
2013-138303 | Jul. 1, 2013 | JP | national |