DETECTION APPARATUS, DETECTION SYSTEM, MOTION ANALYSIS SYSTEM, RECORDING MEDIUM, AND ANALYSIS METHOD

Abstract
An apparatus which accurately analyzes a motion state of an exercise appliance and performs a notification of the analyzed motion state. A sensor unit includes a sensor section that is attached to a golf club and detects a swing motion of the golf club, and an imaging portion that captures an image of a part of interest corresponding to the golf club.
Description
TECHNICAL FIELD

The present invention relates to a detection apparatus, a detection system, a motion analysis system, a recording medium recording an analysis program, and an analysis method.


BACKGROUND ART

In recent years, there has been a need in various fields for apparatuses that analyze motion of a subject (human). For example, a motion form of a subject (athlete), such as a swing trajectory of an exercise appliance such as a golf club, a tennis racket, or a baseball bat, is analyzed, and, on the basis of the analysis results, an exercise appliance suitable for the athlete is selected or the motion form is improved so that the athletic ability of the athlete can be increased.


Regarding such a motion analysis apparatus and a motion analysis method, for example, PTL 1 discloses a motion detection apparatus and a motion analysis method using an optical motion capture device. This apparatus captures an image of a measurement target object (a subject or an exercise appliance) attached with a marker by using an infrared camera or the like, and analyzes motion by calculating a movement trajectory of the marker by using the captured image.


Regarding a motion analysis apparatus and a motion analysis method, for example, PTL 2 discloses a motion detection apparatus and a motion analysis method in which motion of a subject due to a swing of an exercise appliance is detected by an inertial sensor attached to the subject, and the motion is analyzed on the basis of motion data of the subject which is output from the inertial sensor. Such an apparatus has the advantage that motion capture means such as an infrared camera is unnecessary and handling is easy.


However, when motion of a subject is detected by using an angular velocity sensor as an inertial sensor, and the motion is analyzed, it is necessary to remove a bias of the inertial sensor. In other words, it is necessary to determine the origin of motion of a subject.


The bias collectively indicates a zero bias in an initial state in which angular velocity is zero before a subject starts motion, and a drift caused by an external factor such as a power source fluctuation or a temperature change.


In order to remove the bias, it is necessary to obtain a bias value in an initial state. For example, in swing analysis of a golf club, a standing still period in which the subject stands still is set before a swing is started. A bias value in the initial state is determined on the basis of a signal output from an angular velocity sensor or the like over a predetermined period, with detection of the standing still state as a trigger. In other words, the origin of motion of the subject is determined.
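For illustration only, the following minimal Python sketch shows one way such a bias value could be obtained from the still period; the function names, sample rate, and window length are assumptions and are not taken from the document.

```python
# Minimal sketch (illustrative assumptions): estimate a gyro bias from
# samples collected while the subject stands still, then subtract it from
# later measurements.
from statistics import mean

def estimate_bias(gyro_samples, sample_rate_hz=100, still_seconds=3.0):
    """gyro_samples: list of (wx, wy, wz) angular velocities, oldest first."""
    n = int(sample_rate_hz * still_seconds)
    window = gyro_samples[-n:]
    if len(window) < n:
        raise ValueError("not enough still-period samples")
    return (mean(s[0] for s in window),
            mean(s[1] for s in window),
            mean(s[2] for s in window))

def remove_bias(sample, bias):
    """Subtract the estimated bias from one (wx, wy, wz) sample."""
    return tuple(v - b for v, b in zip(sample, bias))
```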


In the motion analysis apparatus and the motion analysis method disclosed in PTL 2, an angular velocity sensor or the like in which a detection direction is defined according to an assumed coordinate system is used as the inertial sensor acquiring motion data. Therefore, when the inertial sensor is attached to a subject, it is necessary to match a motion direction of the subject with the movement direction assumed by the inertial sensor in order to analyze the motion of the subject with high accuracy. For example, a mark indicating the movement direction assumed by the inertial sensor is stuck to the inertial sensor, and the direction indicated by the mark is matched with the motion direction of the subject with the naked eye when the inertial sensor is attached to the subject.


CITATION LIST
Patent Literature

PTL 1: JP-A-2010-110382


PTL 2: JP-A-2008-73210


SUMMARY OF INVENTION
Technical Problem

However, in a case where a standing still state of a subject is detected on the basis of a signal output from an angular velocity sensor or the like, if the subject moves very slowly, a standing still state of the subject may be erroneously detected, and thus the origin of motion is erroneously determined. Therefore, a bias value cannot be accurately determined in the standing still state, and, as a result, there is a problem in that a motion state of the subject cannot be accurately analyzed and a notification of a wrong motion state is performed. In addition, since a distance between a ball hitting surface of a golf club and a golf ball is not constant in a standing still state of a subject, there is a problem in that a motion state cannot be analyzed with high accuracy.


When the inertial sensor is attached to a subject by the method in which a direction indicated by a mark stuck to the inertial sensor is matched with a motion direction of the subject with the naked eye, in a case where the motion direction of the subject is not clearly shown, or in a case where the motion direction, even if clearly shown, is separated from the mark on the inertial sensor, it is hard to match the direction indicated by the mark with the motion direction of the subject with high accuracy through adjustment by the naked eye, and thus there is a problem in that a lot of time is required for attachment and adjustment.


Solution to Problem

The invention has been made in order to solve at least some of the above-described problems, and the invention can be realized in the following aspects or application examples.


Application Example 1

A detection apparatus according to this application example includes a sensor portion that is attached to an exercise appliance, and detects a swing motion of the exercise appliance; and an image capturing portion that captures an image of a part of interest.


According to this configuration, since a motion state is analyzed on the basis of a captured image of the part of interest obtained by the image capturing portion and an output signal detected by the sensor portion, it is possible to analyze the motion state of the exercise appliance with higher accuracy than in a case where analysis is performed by using only the sensor portion.


Application Example 2

The detection apparatus according to the application example may further include a notification portion that performs a notification of a motion state of the exercise appliance which is analyzed on the basis of at least one of an output signal from the sensor portion and a captured image obtained by the image capturing portion.


Application Example 3

In the detection apparatus according to the application example, it is preferable that the part of interest is a ball hitting portion for hitting a ball through the swing motion of the exercise appliance.


According to this configuration, it is possible to analyze a motion state of the ball hitting portion by imaging the ball hitting portion.


Application Example 4

In the detection apparatus according to the application example, it is preferable that the motion state includes a standing still state of the exercise appliance which is determined on the basis of a plurality of captured images obtained by capturing images of the ball hitting portion with the passage of time.


According to this configuration, it is possible to determine a standing still state of the exercise appliance on the basis of a plurality of captured images obtained by imaging the ball hitting portion with the passage of time.


Application Example 5

In the detection apparatus according to the application example, the standing still state may be a state in which the exercise appliance stands still before the swing motion is started with the exercise appliance.


Application Example 6

In the detection apparatus according to the application example, the exercise appliance may be a golf club, and the sensor portion and the image capturing portion may be attached to a shaft or a grip of the golf club.


Application Example 7

In the detection apparatus according to the application example, the sensor portion and the image capturing portion may be accommodated in the same casing.


Application Example 8

In the detection apparatus according to the application example, the exercise appliance includes the part of interest, and the detection apparatus further includes a determination portion that extracts a predetermined reference image from a captured image of the part of interest obtained by the image capturing portion, and determines the quality of an attachment position where the sensor portion is attached to the exercise appliance on the basis of the extracted predetermined reference image.


According to this configuration, an image of the part of interest of the exercise appliance is captured, a predetermined reference image is extracted from the captured image of the part of interest, and the quality of an attachment position of the sensor portion attached to the exercise appliance is determined on the basis of the extracted predetermined reference image. Therefore, since the quality of the attachment position of the sensor portion is determined on the basis of the predetermined reference image extracted from the captured image, it is possible to adjust the attachment position of the sensor portion more quickly and with higher accuracy than in a case where adjustment is performed with the naked eye.


Application Example 9

A motion analysis system according to this application example includes an analysis unit that analyzes the motion state of the exercise appliance on the basis of a captured image obtained by the image capturing portion and an output signal from the sensor portion.


According to this configuration, since a motion state is analyzed on the basis of a captured image of the part of interest obtained by the image capturing portion and an output signal detected by the sensor portion, it is possible to analyze the motion state of the exercise appliance with higher accuracy than in a case where analysis is performed by using only the sensor portion.


Application Example 10

In the motion analysis system according to the application example, it is preferable that the detection apparatus transmits the captured image and the output signal to the analysis unit, and the analysis unit performs analysis on the basis of the captured image and the output signal so as to output a trigger signal indicating the motion state of the exercise appliance.


According to this configuration, in a case where the detection apparatus receives the trigger signal indicating the motion state of the exercise appliance from the analysis unit, the detection apparatus can perform an output corresponding to the motion state analyzed by the analysis unit on the basis of the trigger signal.


Application Example 11

In the motion analysis system according to the application example, it is preferable that the analysis unit determines standing-still of the exercise appliance on the basis of the captured image, and analyzes the swing motion on the basis of the output signal.


According to this configuration, standing-still of the exercise appliance is determined on the basis of a captured image, and thus it is possible to accurately determine the standing-still of the exercise appliance.


Application Example 12

A detection system according to this application example includes an image capturing portion that captures an image of a part of interest of an exercise appliance to which a sensor portion detecting a swing motion is attached; and a determination portion that extracts a predetermined reference image from a captured image of the part of interest obtained by the image capturing portion, and determines the quality of an attachment position where the sensor portion is attached to the exercise appliance on the basis of the extracted predetermined reference image.


According to this configuration, an image of the part of interest of the exercise appliance is captured, a predetermined reference image is extracted from the captured image of the part of interest, and the quality of an attachment position of the sensor portion attached to the exercise appliance is determined on the basis of the extracted predetermined reference image. Therefore, since the quality of the attachment position of the sensor portion is determined on the basis of the predetermined reference image extracted from the captured image, it is possible to adjust the attachment position of the sensor portion more quickly and with higher accuracy than in a case where adjustment is performed with the naked eye.


Application Example 13

In the detection system according to the application example, it is preferable that the image capturing portion captures an image of the part of interest from a direction of viewing the part of interest from the sensor portion attached to the exercise appliance.


According to this configuration, an image of the part of interest is captured from the direction of the sensor portion attached to the exercise appliance, and thus it is possible to accurately acquire the attachment situation of the sensor portion.


Application Example 14

In the detection system according to the application example, it is preferable that the determination portion determines the quality of the attachment position on the basis of a difference between a direction specified on the basis of the predetermined reference image and a preset reference direction.


According to this configuration, it is possible to determine the quality of an attachment position of the sensor portion with high accuracy on the basis of a difference between a direction of the predetermined reference image and a reference direction.
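As an illustration of this kind of check (not the apparatus's actual algorithm), the sketch below compares a direction recovered from the reference image with a preset reference direction and accepts the attachment when the difference is within a tolerance; the angle extraction, reference direction, and tolerance value are assumptions.

```python
# Minimal sketch: judge the quality of the attachment position from the
# angular difference between the detected reference-mark direction and a
# preset reference direction. All names and values are illustrative.
import math

def direction_from_mark(mark_vector):
    """mark_vector: (dx, dy) of the detected reference mark in image
    coordinates; returns its direction in degrees."""
    return math.degrees(math.atan2(mark_vector[1], mark_vector[0]))

def attachment_is_acceptable(mark_vector, reference_deg=0.0, tolerance_deg=2.0):
    diff = direction_from_mark(mark_vector) - reference_deg
    # Wrap the difference into [-180, 180) before comparing.
    diff = (diff + 180.0) % 360.0 - 180.0
    return abs(diff) <= tolerance_deg
```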


Application Example 15

It is preferable that the detection system according to the application example further includes a notification portion that performs a notification of a result determined by the determination portion.


According to this configuration, it is possible to perform a notification of a determination result of the quality of an attachment position of the sensor portion.


Application Example 16

In the detection system according to the application example, the predetermined reference image may be an image of a reference mark provided on the exercise appliance.


Application Example 17

It is preferable that the detection system according to the application example further includes a projection portion that projects the predetermined reference image onto the part of interest.


According to this configuration, a predetermined reference image is projected onto a part of interest, and thus the predetermined reference image can be set regardless of the part of interest.


Application Example 18

It is preferable that the detection system according to the application example further includes a sensor unit and an analysis unit that is connected to the sensor unit through communication with the sensor unit, the sensor unit including the sensor portion and the image capturing portion, and the analysis unit including the determination portion.


According to this configuration, since the sensor unit detects a swing motion of the exercise appliance and captures an image of the part of interest corresponding to the exercise appliance, and the analysis unit determines the quality of the attachment position of the sensor portion, it is possible to miniaturize the sensor unit.


Application Example 19

An analysis system according to this application example includes a sensor portion that is attached to an exercise appliance and detects motion information of the exercise appliance; an image capturing portion that captures an image of a location including a part of interest of the exercise appliance; an image processing portion that acquires distance information regarding a distance between the part of interest and a predetermined target object on the basis of a captured image obtained by the image capturing portion; an analysis portion that analyzes a swing motion of the exercise appliance on the basis of the motion information; and a correction portion that corrects at least one of the motion information and an analysis result in the analysis portion by using the distance information.


According to this configuration, distance information between the part of interest and the predetermined target object is acquired on the basis of a captured image of the part of interest obtained by the image capturing portion, a swing motion of the exercise appliance is analyzed on the basis of motion information acquired by the sensor portion, and at least one of the motion information and an analysis result is corrected by using the distance information. Therefore, an error in the motion information is reduced, and thus a motion state of the exercise appliance can be analyzed with high accuracy.


Application Example 20

In the analysis system according to the application example, it is preferable that the image processing portion analyzes an image having undergone image processing, and calculates the distance information by counting the number of pixels forming the image.


According to this configuration, the distance information is calculated by counting the number of pixels forming the image having undergone image processing, and thus it is possible to accurately calculate the distance information.
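The following sketch illustrates one way distance information could be derived from pixel counting, under the assumption that the pixel coordinates of the face edge and the ball have already been located by image processing and that a pixel-to-millimetre scale is known from calibration; none of these names or values come from the document.

```python
# Minimal sketch: convert a pixel count between the face edge and the ball
# centre into a physical distance. The mm-per-pixel scale is an assumed
# calibration value, not one defined by the apparatus.
def pixel_distance(face_edge_px, ball_centre_px):
    """Both arguments are (x, y) pixel coordinates in the processed image."""
    dx = ball_centre_px[0] - face_edge_px[0]
    dy = ball_centre_px[1] - face_edge_px[1]
    return (dx * dx + dy * dy) ** 0.5

def distance_mm(face_edge_px, ball_centre_px, mm_per_pixel=0.8):
    return pixel_distance(face_edge_px, ball_centre_px) * mm_per_pixel
```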


Application Example 21

In the analysis system according to the application example, it is preferable that the analysis portion analyzes trajectory information regarding a trajectory along which the exercise appliance is moved on the basis of the motion information, and the correction portion corrects the trajectory information by using the distance information.


According to this configuration, since the trajectory information is corrected on the basis of the distance information, it is possible to increase reliability of a trajectory of the exercise appliance.
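A minimal sketch of such a correction is shown below, assuming the trajectory is a list of head positions and that the correction is a simple shift along the target-line axis by the difference between the image-derived distance and the distance assumed when the inertial data were integrated; the coordinate convention (x as the target-line direction) is an assumption.

```python
# Minimal sketch: shift the head trajectory so that the face-to-ball
# distance at address matches the distance measured from the captured
# image. Units and axes are illustrative assumptions.
def correct_start_position(trajectory, measured_distance, assumed_distance):
    """trajectory: list of (x, y, z) head positions; distances in the same
    unit (e.g. mm). Returns a trajectory shifted by the distance error."""
    offset = measured_distance - assumed_distance
    return [(x + offset, y, z) for (x, y, z) in trajectory]
```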


Application Example 22

In the analysis system according to the application example, it is preferable that the analysis portion determines whether or not the exercise appliance stands still on the basis of the distance information in each of a plurality of captured images obtained with the passage of time.


According to this configuration, it is possible to accurately determine a standing still state of the exercise appliance on the basis of a plurality of captured images obtained with the passage of time.


Application Example 23

In the analysis system according to the application example, the part of interest may be a ball hitting portion for hitting the target object through a swing motion of the exercise appliance.


Application Example 24

It is preferable that the analysis system according to the application example further includes a sensor unit that is attached to the exercise appliance and an analysis unit that performs communication with the sensor unit, the sensor unit including the sensor portion and the image capturing portion, the analysis unit including the image processing portion, the correction portion, and the analysis portion, and the motion information and the captured images being transmitted from the sensor unit to the analysis unit through the communication.


According to this configuration, since the sensor unit detects a swing motion of the exercise appliance and captures an image of the part of interest corresponding to the exercise appliance, and the analysis unit analyzes the swing motion of the exercise appliance, it is possible to miniaturize the sensor unit attached to the exercise appliance.


Application Example 25

A recording medium according to this application example records an analysis program causing a computer to execute an image processing function of acquiring distance information regarding a distance between a part of interest of an exercise appliance and a predetermined target object on the basis of a captured image of the part of interest obtained by an image capturing portion, the exercise appliance being attached with a sensor portion detecting motion information; an analysis function of analyzing a swing motion of the exercise appliance on the basis of the motion information; and a correction function of correcting at least one of the motion information and an analysis result of the swing motion by using the distance information.


According to this configuration, distance information between the part of interest and the predetermined target object is acquired on the basis of a captured image of the part of interest obtained by the image capturing portion, a swing motion of the exercise appliance is analyzed on the basis of motion information acquired by the sensor portion, and at least one of the motion information and an analysis result is corrected by using the distance information. Therefore, an error in the motion information is reduced, and thus a motion state of the exercise appliance can be analyzed with high accuracy.


Application Example 26

An analysis method according to this application example includes an image capturing process of capturing an image of a part of interest of an exercise appliance, the exercise appliance being attached with a sensor portion detecting motion information; an image processing process of acquiring distance information regarding a distance between the part of interest and a predetermined target object on the basis of a captured image of the part of interest; an analysis process of analyzing a swing motion of the exercise appliance on the basis of the motion information; and a correction process of correcting at least one of the motion information and an analysis result of the swing motion by using the distance information.


According to this method, distance information between the part of interest and the predetermined target object is acquired on the basis of a captured image of the part of interest obtained by the image capturing portion, a swing motion of the exercise appliance is analyzed on the basis of motion information acquired by the sensor portion, and at least one of the motion information and an analysis result is corrected by using the distance information. Therefore, an error in the motion information is reduced, and thus a motion state of the exercise appliance can be analyzed with high accuracy.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a block diagram schematically illustrating a sensor unit of a motion detection apparatus according to Embodiment 1 of the invention.



FIG. 2 is a block diagram schematically illustrating an analysis unit of the motion detection apparatus according to Embodiment 1.



FIG. 3A is a schematic diagram in which the motion detection apparatus according to Embodiment 1 is applied to a golf club.



FIG. 3B is a schematic diagram in which the motion detection apparatus according to Embodiment 1 is applied to the golf club.



FIG. 3C is a schematic diagram in which the motion detection apparatus according to Embodiment 1 is applied to the golf club.



FIG. 4 is a schematic diagram illustrating a relationship between the golf club to which the motion detection apparatus according to Embodiment 1 is applied and a subject.



FIG. 5 is a flowchart illustrating a flow of a process in a motion analysis method.



FIG. 6 is a diagram illustrating an example of an evaluation index for determining a standing still state.



FIG. 7 is a schematic diagram in which a motion detection apparatus according to Embodiment 2 of the invention is applied to a golf club.



FIG. 8 is a block diagram schematically illustrating the motion detection apparatus according to Embodiment 2.



FIG. 9A is a diagram for explaining correction of a head position when a swing is started and a golf ball is hit in a motion detection apparatus according to Embodiment 3 of the invention.



FIG. 9B is a diagram for explaining correction of a head position when a swing is started and a golf ball is hit in the motion detection apparatus according to Embodiment 3 of the invention.



FIG. 10A is a diagram for explaining adjustment of a position of a sensor unit attached to a golf club in Embodiment 3 of the invention.



FIG. 10B is a diagram for explaining adjustment of a position of the sensor unit attached to the golf club in Embodiment 3.



FIG. 10C is a diagram for explaining adjustment of a position of the sensor unit attached to the golf club in Embodiment 3.



FIG. 11 is a diagram schematically illustrating a sensor unit according to Embodiment 4 of the invention.



FIG. 12A is a diagram for explaining adjustment of a position of a sensor unit attached to a golf club in Embodiment 4.



FIG. 12B is a diagram for explaining adjustment of a position of the sensor unit attached to the golf club in Embodiment 4.



FIG. 13 is a block diagram schematically illustrating a motion detection apparatus according to Embodiment 5 of the invention.



FIG. 14 is a block diagram illustrating details of a processing section of an analysis unit according to Embodiment 5.



FIG. 15 is a block diagram illustrating details of a storage section of the analysis unit.



FIG. 16A is a diagram for explaining correction of a head position when a swing is started.



FIG. 16B is a diagram for explaining correction of a head position when a golf ball is hit.



FIG. 17 is a flowchart illustrating a flow of a process in a motion analysis method.





DESCRIPTION OF EMBODIMENTS

Hereinafter, embodiments of the invention will be described with reference to the drawings. In the following respective drawings, a dimension or a scale of each constituent element may be illustrated to be different from that of an actual constituent element so that a size of each constituent element can be recognized in the drawings.


Embodiment 1

A motion detection apparatus 1 including a sensor unit 10 according to Embodiment 1 is an apparatus detecting motion of a subject M as illustrated in FIG. 4, and uses, for motion analysis, a motion form of the subject M, such as a swing trajectory of an exercise appliance, for example, a golf club 500 as illustrated in FIG. 3B, a tennis racket, or a baseball bat. The motion detection apparatus 1 corresponds to a motion detection system.


Hereinafter, a description will be made of a case where the motion detection apparatus 1 is applied to the golf club 500 as an example of the embodiment.



FIG. 1 is a block diagram schematically illustrating the motion detection apparatus 1 according to Embodiment 1, and is a schematic diagram mainly illustrating a sensor section 100 of the sensor unit 10. FIG. 2 is a block diagram schematically illustrating the motion detection apparatus 1 according to the present embodiment, and is a schematic diagram mainly illustrating an analysis unit 50. FIGS. 3A to 3C are schematic diagrams illustrating an example in which the motion detection apparatus 1 is applied to the golf club 500, and do not illustrate the analysis unit 50. FIG. 4 is a schematic diagram illustrating a relationship with the subject M in a case where the motion detection apparatus 1 is applied to the golf club 500. FIG. 5 is a flowchart illustrating a flow of a process in a motion analysis method using the motion detection apparatus 1. FIG. 6 is a diagram illustrating an example of an evaluation index for determining a standing still state.


[Configuration of Motion Detection Apparatus 1]


The motion detection apparatus 1 illustrated in FIG. 1 to FIGS. 3A to 3C is configured to include the sensor unit 10 and the analysis unit 50.


<Configuration of Sensor Unit 10>


The sensor unit 10 is configured to include the sensor section 100, a casing 130 accommodating the sensor section 100, and a holder 200.


<Configuration of Sensor Section 100>


The sensor section 100 is configured to include a notification portion 30, a sensor 110, an imaging portion 150, and a controller 120, and these constituent elements are provided in the same casing 130.


<Sensor 110>


The sensor 110 can detect predetermined physical quantities associated with motion, and can output signals corresponding to detected physical quantities such as an acceleration, an angular velocity, a velocity, and an angular acceleration.


The sensor 110 is provided with three-axis detection type acceleration sensors 112x, 112y and 112z (hereinafter, collectively referred to as “three-axis acceleration sensors”) detecting accelerations in X axis, Y axis and Z axis directions. The sensor 110 is provided with three-axis detection type gyro sensors (angular velocity sensors) 114x, 114y and 114z (hereinafter, collectively referred to as “three-axis gyro sensors”) detecting angular velocities in the X axis, Y axis and Z axis directions. The sensor 110 is provided as a six-axis detection type motion sensor including the three-axis acceleration sensors and the three-axis gyro sensors.


Here, each of the three-axis gyro sensors (angular velocity sensors) 114x to 114z may employ a vibration type angular velocity sensor. The vibration type angular velocity sensor causes a vibrator to vibrate at a constant frequency. If an angular velocity is applied to the vibrator, a Coriolis force is generated, and the vibrator also vibrates in a different direction due to the Coriolis force. By detecting the displacement caused by the Coriolis force, the angular velocity is obtained, and thus it is possible to detect a physical quantity associated with motion.
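For reference, the standard relation behind this principle (the symbols are not taken from the document) is that a vibrator element of mass m moving with velocity v while the sensor rotates at angular velocity Ω experiences the Coriolis force

```latex
% Coriolis force on a vibrating element of mass m with velocity v under
% rotation at angular velocity \Omega:
\boldsymbol{F}_c \;=\; 2m\,\boldsymbol{v} \times \boldsymbol{\Omega}
```

so the resulting displacement is proportional to the applied angular velocity and appears in a direction orthogonal to both the vibration direction and the rotation axis.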


In the motion detection apparatus 1 of the present embodiment, a configuration of the sensor 110 is not particularly limited, and may be changed as appropriate according to a measurement target object whose motion is detected.


<Imaging Portion 150>


The imaging portion 150 corresponds to an image capturing portion, captures an image of an imaging target, and outputs captured image data to the controller 120. In Embodiment 1, the imaging portion 150 is assumed to be a digital camera including an imaging element which outputs an electric signal according to an image formed by an optical component. In Embodiment 1, as illustrated in FIG. 3A, the imaging portion 150 is accommodated on a first side surface side of the casing 130. In other words, as illustrated in FIG. 3B, in a case where the motion detection apparatus 1 is applied to the golf club 500, the imaging portion 150 is attached to the casing 130 so as to image the vicinity of a head 500h of the golf club 500.


<Controller 120>


The controller 120 is configured to include a data processing portion 120A, a power source portion 120B, and a communication portion 120C. The controller 120 is connected to the sensors 112x to 112z and 114x to 114z, the imaging portion 150, the notification portion 30, and the analysis unit 50.


The data processing portion 120A converts an output signal from each of the sensors 112x to 112z and 114x to 114z into packet data along with, for example, time information (time base).


The data processing portion 120A transmits the signal having undergone the packet data conversion to the communication portion 120C.


The data processing portion 120A also converts an image signal of an image captured by the imaging portion 150 into packet data along with time information (time base). The data processing portion 120A transmits the image signal having undergone the packet data conversion to the communication portion 120C.


In the following description, a signal obtained by converting each of an output signal from each of the sensors 112x to 112z and 114x to 114z and an image signal from the imaging portion 150 into packet data is referred to as a “motion signal 70”.
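As an illustration of this packetization (the field layout below is an assumption, not the apparatus's actual packet format), a sensor reading or an image frame could be bundled with its time information as follows:

```python
# Minimal sketch: wrap a sensor reading or an image frame together with
# time information before handing it to the communication portion.
import json
import time

def to_sensor_packet(accel_xyz, gyro_xyz, timestamp=None):
    return json.dumps({
        "type": "sensor",
        "t": timestamp if timestamp is not None else time.time(),
        "accel": list(accel_xyz),   # (ax, ay, az)
        "gyro": list(gyro_xyz),     # (wx, wy, wz)
    }).encode("utf-8")

def to_image_packet(jpeg_bytes, timestamp=None):
    header = json.dumps({
        "type": "image",
        "t": timestamp if timestamp is not None else time.time(),
        "length": len(jpeg_bytes),
    }).encode("utf-8")
    return header + b"\n" + jpeg_bytes
```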


The communication portion 120C performs a process of transmitting the motion signal 70 (packet data) transmitted from the data processing portion 120A to the analysis unit 50. A transmission method between the sensor unit 10 and the analysis unit 50 is not particularly limited, and may use wireless communication such as WiFi (registered trademark).


The controller 120 is provided with the power source portion 120B, and supplies power which is required for operations of the sensor 110, the imaging portion 150, the controller 120, and the like. A configuration of the power source portion 120B is not particularly limited, and a primary battery (for example, a dry battery or a lithium battery) or a secondary battery (a nickel hydrogen battery or a lithium ion battery) may be used. The power source portion 120B may be provided in the analysis unit 50 so as to supply power to the sensor section 100.


<Configuration of Holder 200>


The holder 200 is an attachment which attaches the sensor section 100 to an exercise appliance in order to detect a swing trajectory of the exercise appliance which is a detection target of the motion detection apparatus 1.


As illustrated in FIG. 3B, in a case where the motion detection apparatus 1 is applied to the golf club 500, the holder 200 is an attachment which attaches the sensor section 100 to an exercise appliance such as the golf club 500. A shape of the holder 200 is not particularly limited, but, in a case where the motion detection apparatus is applied to the golf club 500, the holder may be attached so that the sensor section 100 is provided on a shaft 500s or a grip 500g and the sensor section 100 is attachable to and detachable from the holder. The sensor section 100 is preferably attached to the golf club 500 so that the notification portion 30 which will be described later is directed toward the end side of the grip 500g. The holder 200 may be changed as appropriate according to the type of exercise appliance.


<Configuration of Notification Portion 30>


The notification portion 30 is provided in the sensor section 100 as illustrated in FIGS. 1 and 2. The notification portion 30 is configured to include a light emitter 132 as illustrated in FIGS. 3A to 3C. The notification portion 30 is provided to visually notify the subject M of a state of an output signal from the sensor section 100 or various states of the motion detection apparatus 1. The notification portion 30 notifies the subject M of a state of an output signal from the sensor section 100 or various states of the motion detection apparatus 1 through blinking of the light emitter 132. The notification portion 30 of the motion detection apparatus 1 of Embodiment 1 is configured to include, as an example, a first light emitter 132a and a second light emitter 132b. The first light emitter 132a and the second light emitter 132b can emit light with a plurality of colors (for example, red and green) by using light emitting elements such as light emitting diodes. Therefore, the notification portion 30 may notify the subject of a state of the motion signal 70 or various states detected by the motion detection apparatus 1 depending on a difference between light emission colors of the light emitter 132.
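For illustration, notification states could be mapped to light emission colours and blinking patterns with a small table such as the one below; the concrete colours, patterns, and state names are assumptions, not the patterns defined for the apparatus.

```python
# Minimal sketch: map notification states to an (LED colour, blink pattern)
# pair for the two light emitters. All entries are illustrative.
NOTIFICATION_TABLE = {
    "standing_still_detected": ("green", "solid"),
    "standing_still_error":    ("red",   "slow_blink"),
    "favorable_measurement":   ("green", "double_blink"),
    "measurement_error":       ("red",   "fast_blink"),
    "transmission_completion": ("green", "slow_blink"),
    "transmission_error":      ("red",   "double_blink"),
    "analysis_completion":     ("green", "triple_blink"),
    "analysis_error":          ("red",   "triple_blink"),
}

def notify(state):
    colour, pattern = NOTIFICATION_TABLE[state]
    # In the real unit this would drive the light emitters; here we report it.
    print(f"LED: colour={colour}, pattern={pattern} ({state})")
```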


The notification portion 30 is preferably provided on a second side surface opposing the first side surface of the casing 130 of the sensor section 100, that is, an upper surface side in a case of being attached to the golf club 500.


For example, in a case where the sensor unit 10 is attached to a backside of the shaft 500s of the golf club 500 which will be described later, if the notification portion 30 is provided on only one surface of the casing 130, the subject M may be hindered from visually recognizing (perceiving) light emission of the notification portion 30. Therefore, the notification portion 30 is provided on another side surface of the casing 130, and thus the subject M can visually recognize (perceive) light emission of the notification portion 30 regardless of an attachment method of the sensor unit 10. The notification portion 30 is preferably provided at both ends of the casing 130 of the sensor unit 10 in a width direction (for example, a direction intersecting a direction in which the shaft 500s extends). Consequently, the subject can visually recognize (perceive) light emission of the notification portion 30 regardless of the dominant arm used in a swing of the golf club 500.


<Configuration of Analysis Unit 50>


Referring to FIG. 2 again, a configuration of the analysis unit 50 will be described.


As illustrated in FIG. 2, the analysis unit 50 is configured to include a processing section (CPU) 201, a communication section 210, an operation section 220, a ROM 230, a RAM 240, a nonvolatile memory 250, and a display section 260.


The communication section 210 performs a process of receiving the motion signal 70 (packet data) transmitted from the sensor unit 10, and transmitting the motion signal to the processing section 201. The operation section 220 performs a process of acquiring operation data from the subject M or an assistant (not illustrated), and transmitting the operation data to the processing section 201. The ROM 230 stores programs for the processing section 201 performing various computation processes or control processes, or various programs or data, etc., for realizing application functions.


The RAM 240 is a storage section which is used as a work region of the processing section 201, and temporarily stores a program or data read from the ROM 230, data which is input from the operation section 220, results of calculation which is performed by the processing section 201 according to various programs or application functions, and the like.


The display section 260 displays a process result of the processing section 201 as text, a graph, or another image. The display section 260 is, for example, a CRT, an LCD, or a touch panel display. A single touch panel display may realize the functions of both the operation section 220 and the display section 260.


The processing section 201 is configured to include a calculation portion 202, a determination portion 204, and an analysis portion 206. The processing section 201 performs various processes such as a computation process, an analysis process, and a determination process on the motion signal 70 which is received from the sensor unit 10 via the communication section 210 according to the program stored in the ROM 230.


In the processing section 201, the calculation portion 202 performs a calculation process on the motion signal 70 transmitted from the sensor unit 10. The determination portion 204 determines, on the basis of a result of the calculation process, whether or not the subject M is in a standing still state, that is, whether or not the golf club 500 attached with the sensor unit 10 is in a swing origin state. The determination portion 204 stores a bias value in the RAM 240 in a case where the standing still state is determined.


In Embodiment 1, it is assumed that the standing still state is determined on the basis of an image signal included in the motion signal 70, but a determination of the standing still state based on a calculation result of an output signal which is output from the sensor 110 may also be performed in addition to the determination based on the image signal. For example, in a case where the golf club 500 is moved at a higher speed than a predetermined reference, the standing still state is determined on the basis of a calculation result of an output signal from the sensor 110, and, in a case where the speed of the golf club 500 transitions to a low speed, the standing still state may be determined on the basis of an image signal. There may be an aspect in which a user such as the subject M can select a determination method.


The calculation portion 202 performs a calculation process on the motion signal 70 transmitted from the sensor unit 10. The analysis portion 206 performs motion analysis on a measurement target on the basis of a result of the calculation process. The determination portion 204 performs a determination of whether or not motion is detected or a determination of completion of a motion analysis result on the basis of the motion analysis result, and the like.


The processing section 201 transmits a trigger (result) signal 80 regarding a determination of a standing still state, a determination of whether or not motion is detected, and completion of a motion analysis result, to the sensor unit 10, and transmits the trigger signal to the notification portion 30.


The analysis unit 50 may employ a personal computer, a high function mobile phone (smart phone), a multi-function portable terminal (tablet terminal) or the like having the above-described functions.


<Aspect in Which Motion Detection Apparatus 1 is Applied to Golf Club 500>


A description will be made of an aspect in which the above-described motion detection apparatus 1 is applied to the golf club 500.



FIG. 3A is a schematic diagram illustrating an exterior of the sensor section 100 forming the sensor unit 10. In the sensor section 100, the sensor 110 and the controller 120 forming the sensor section 100 are accommodated in the casing 130. The first light emitter 132a and the second light emitter 132b forming the notification portion 30 are provided on the second side surface of the casing 130.



FIGS. 3B and 3C are diagrams illustrating a state in which the sensor unit 10 is attached to the golf club 500 as an example of an embodiment of the motion detection apparatus 1. As illustrated in FIG. 3B, the sensor section 100 is attached to the golf club 500 by using the holder 200. Specifically, as illustrated in FIG. 3C, the sensor section 100 is attached to be fitted to the holder 200 attached to the shaft 500s or the grip 500g of the golf club 500. The sensor unit 10 is attached to the golf club 500 so that the light emitter 132 (132a and 132b) of the notification portion 30 is directed toward the end side of the grip 500g. This is so that the subject M can easily visually recognize (perceive) light emission.



FIG. 4 schematically illustrates a situation in which the subject M holds the golf club 500.


As illustrated in FIG. 4, in a case where the motion detection apparatus 1 detects a swing motion of the golf club 500 performed by the subject M, and analyzes the swing according to a motion analysis method which will be described later, the subject M can perceive a state of the motion detection apparatus 1 by visually recognizing light emission of the notification portion 30. Therefore, the subject M can perform a swing without averting a visual line e.


If the casing 130 of the sensor section 100 enters a visual field of the subject M, a swing which is different from a normal swing may be performed since the subject is concerned about the casing during a swing. Therefore, the casing 130 of the sensor section 100 is preferably attached to the backside of the shaft 500s when viewed from the subject M in a standing still state (at address) before swinging the golf club 500. In this case, as described above, the notification portion 30 is provided on another side surface of the casing 130 of the sensor section 100, and thus the subject M can easily visually recognize (perceive) whether or not the notification portion 30 emits light.


In a case where the casing 130 is attached to the backside of the shaft 500s, the imaging portion 150 is set to image a part of interest in the vicinity of the head 500h of the golf club 500, specifically, the vicinity of a face which is a ball hitting portion hitting a golf ball (not illustrated) through a swing motion.


There may be an aspect in which an imaging direction of the imaging portion 150 can be changed through an operation of the subject M. For example, there maybe an aspect in which an imaging direction is set to a target line direction, a place where a swing is performed is imaged, and a captured landscape image is linked with swing data, and is also displayed on the display section 260 of the analysis unit 50. Consequently, it is possible to save the time and effort to manually input place information or the like.


<Motion Analysis Method>


A motion analysis method of the present embodiment includes a measurement preparation process, a motion measurement process, a transmission process of transmitting the motion signal 70 obtained in the motion measurement process to the analysis unit 50, and an analysis process of analyzing the motion signal 70 transmitted in the transmission process. The motion analysis method includes a standing still state notification process of performing a notification of completion of the measurement preparation process, a measurement completion notification process of performing a notification of completion of the motion measurement process, and a transmission completion notification process of performing a notification of completion of transmission of the motion signal 70 from the sensor unit 10 to the analysis unit 50.


Each process of the motion analysis method using the motion detection apparatus 1 will be described for each step with reference to the flowchart illustrated in FIG. 5. Regarding the description of the motion analysis method, the motion analysis method in which the motion detection apparatus 1 is applied to the golf club 500 will be described.


<Measurement Preparation Process>


The measurement preparation process is a process of preparing for measurement of motion, and is a process of measuring a bias of the sensor 110 before starting motion (swing).


Here, the bias collectively indicates a zero bias in an initial state in which angular velocity is zero before the subject M starts motion, and a drift caused by an external factor such as a power source fluctuation or a temperature change.


In the measurement preparation process, in step S10, the analysis unit 50 acquires the motion signal 70 in a case of a standing still state (so-called address state) in which the subject M holds the golf club 500 and stands still.


In the measurement preparation process, in step S20, the calculation portion 202 performs a calculation process on the motion signal 70 acquired by the analysis unit 50.


In the measurement preparation process, in step S25, an image signal is extracted from the motion signal 70 calculated by the calculation portion 202, the extracted image signal is processed, and thus a captured image of a region including the head 500h of the golf club 500 and a golf ball is extracted at a predetermined time interval. In the measurement preparation process, image processing is performed on the extracted captured image so that a distance between a predetermined portion (for example, a face surface) of the head 500h and the golf ball is calculated, and the calculated distance information is stored in the nonvolatile memory 250.


In the measurement preparation process, the distance information stored in the nonvolatile memory 250 is read, and a change in the distance information associated with a temporal change is examined as illustrated in FIG. 6. For example, in a case where a distance L between the face surface and the golf ball changes to L1, L2, and L3, if a change amount of the distance information is equal to or less than a predetermined reference value as in L2 and L3, and a state in which the change amount is equal to or less than the predetermined reference value lasts for a predetermined time (for example, 3 seconds), the determination portion 204 determines that the golf club 500 is in a standing still state.
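A minimal sketch of this determination is shown below, assuming the face-to-ball distances extracted from successive captured images are available as a list; the frame interval and the reference value are illustrative assumptions.

```python
# Minimal sketch: judge a standing still state when the frame-to-frame
# change of the face-to-ball distance stays at or below a reference value
# for a set duration (3 seconds in the text).
def is_standing_still(distances, frame_interval_s=0.1,
                      reference_change=1.0, required_s=3.0):
    """distances: list of face-to-ball distances (e.g. in mm), oldest first."""
    needed_frames = int(required_s / frame_interval_s)
    still_frames = 0
    for prev, curr in zip(distances, distances[1:]):
        if abs(curr - prev) <= reference_change:
            still_frames += 1
            if still_frames >= needed_frames:
                return True
        else:
            still_frames = 0
    return False
```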


In Embodiment 1, a standing still state is determined on the basis of a change in a distance between the face surface and the golf ball, but the determination is not limited thereto. For example, in a case where a golf ball is not set, a standing still state may be determined on the basis of a change in a distance between the face surface and a golf tee. Instead of the golf tee, a predetermined mark or the like drawn on the ground may be used.


In the measurement preparation process, in step S30, in a case where it is determined that the golf club 500 is included in the range of a standing still state (YES), the process proceeds to a notification of “standing still state detection” in step S41, and an output signal from the sensor 110 at that time is stored in the RAM 240 as a bias value. In a case where it is determined that the motion signal 70 is not included in the range of a standing still state in step S30 (NO), a notification of a “standing still state detection error” is performed in step S42, and the process returns to step S10 so as to be performed again from acquisition of the motion signal 70 in a standing still state.


Detection of a standing still state in the measurement preparation process is not limited to the method of determining a standing still state by processing an image captured by the imaging portion 150, and there may be an aspect of analyzing an output signal from the sensor 110. In this case, in the measurement preparation process, the calculation portion 202 performs a calculation process on the motion signal 70 acquired by the analysis unit in step S20. In step S30, the motion signal 70 calculated by the calculation portion 202 is compared with the value of the motion signal 70 in a standing still state, which is a first threshold value recorded in the ROM 230 in advance, and the determination portion 204 performs a first determination of whether or not the motion signal 70 is lower than the first threshold value over a predetermined period.


The predetermined period for a determination is set as appropriate depending on a measurement target, and the period in the present embodiment is 3 seconds. In a case where it is determined that the motion signal 70 is included in the range of a standing still state in step S30 (YES), the process proceeds to a notification of “standing still state detection” in step S41, and the motion signal 70 at that time is stored in the RAM 240 as a bias value.
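The sensor-signal variant described above could look like the following sketch, in which the angular velocity magnitude must stay below the first threshold for the whole predetermined period; the threshold value and sample rate are assumptions.

```python
# Minimal sketch: judge a standing still state when every gyro sample in
# the predetermined period (3 seconds here) stays below the first
# threshold. Threshold and sample rate are illustrative.
def standing_still_from_gyro(gyro_samples, sample_rate_hz=100,
                             first_threshold=0.5, period_s=3.0):
    """gyro_samples: list of (wx, wy, wz) angular velocities, oldest first."""
    n = int(sample_rate_hz * period_s)
    window = gyro_samples[-n:]
    if len(window) < n:
        return False
    return all((wx * wx + wy * wy + wz * wz) ** 0.5 < first_threshold
               for wx, wy, wz in window)
```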


<Standing Still State Notification Process>


The standing still state notification process is a process of performing a notification of a determination result of whether or not the golf club 500 and the subject M holding the golf club 500 are in a standing still (address) state on the basis of the motion signal 70 in the above-described measurement preparation process.


In the standing still state notification process, in a case where it is determined that the golf club 500 and the subject M holding the golf club 500 are in a standing still state on the basis of the motion signal 70 in step S30, a notification of “standing still state detection” is performed in step S41. The notification is also a notification that the measurement target has started motion.


In the standing still state notification process, in a case where it is determined that the golf club 500 and the subject M holding the golf club 500 are not in a standing still state on the basis of the motion signal 70 in step S30, a notification of a “standing still state detection error” is performed in step S42.


The standing still state notification process is performed by the light emitter 132 provided in the notification portion 30. Here, the notification of standing still state detection in the notification portion 30 is performed by using blinking and light emission colors of the first light emitter 132a and the second light emitter 132b. The notification portion 30 may change a light emission color and a blinking pattern according to the information to be notified to the subject M.


The notification of “standing still state detection” in step S41 is performed by using a light emission color and a notification (blinking) pattern of the light emitter 132. A light emission color and a notification pattern corresponding to “standing still state detection” are set in advance.


The notification of “standing still state detection error” in step S42 is performed through light emission of the light emitter 132 so as to be different from the notification of “standing still state detection”. A light emission color and a notification pattern corresponding to “standing still state detection error” are set in advance. Consequently, the subject M is notified of “standing still state detection error” and is also prompted to maintain a standing still (address) state.


<Motion Measurement Process>


The motion measurement process is a process of measuring motion (swing) of the subject M holding the golf club 500. The motion measurement process is a process in which the sensor 110 mounted in the sensor unit 10 measures motion (swing) of the subject M.


In the motion measurement process, in step S50, an acceleration or the like associated with the motion of the subject M is acquired from the sensor unit 10 as the motion signal 70.


<Transmission Process>


The transmission process is a process of transmitting, to the analysis unit 50, the motion signal 70 which is acquired in the motion measurement process and is based on the motion (swing) of the subject M holding the golf club 500.


In the transmission process, the motion signal 70 acquired in step S50 is transmitted from the sensor unit 10 to the analysis unit 50.


In the transmission process, in step S70, the determination portion 204 performs a second determination of whether or not an error (for example, an over-range or missing data) is included in the motion signal 70 transmitted to the analysis unit 50 in step S60. Also in step S70, the determination portion 204 determines whether or not the acceleration or the like associated with the motion exceeds a preset value on the basis of the motion signal 70 transmitted to the analysis unit 50 in step S60.


The error determination is performed through comparison with the normal motion signal 70 recorded in the ROM 230 in advance as a second threshold value. The determination of the acceleration or the like associated with the motion is performed through comparison with the motion signal 70 recorded in the ROM 230 in advance as the second threshold value. The determination of the acceleration or the like associated with the motion may be performed by using any value of the motion signal 70 such as the maximum value or the minimum value of the motion signal 70 of the subject M as the second threshold value.
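As an illustration of the second determination (the range limit and threshold values below are assumptions), the transmitted motion signal could be checked as follows:

```python
# Minimal sketch: check the transmitted motion signal for missing samples
# and over-range values, and check that the peak acceleration exceeds the
# second threshold so that an actual swing, not noise, was measured.
def check_motion_signal(accel_samples, sensor_range=16.0, second_threshold=2.0):
    """accel_samples: list of acceleration magnitudes in g, or None for a
    missing sample. Returns (ok, reason)."""
    if not accel_samples:
        return False, "no data"
    if any(a is None for a in accel_samples):
        return False, "missing sample"
    if any(abs(a) >= sensor_range for a in accel_samples):
        return False, "over-range"
    if max(abs(a) for a in accel_samples) <= second_threshold:
        return False, "below second threshold (no swing detected)"
    return True, "favorable measurement"
```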


In a case where it is determined that an error is not included in the motion signal 70 in step S70, or the motion signal 70 exceeds the preset second threshold value (satisfies the condition of the threshold value) (YES), the process proceeds to a notification of “favorable measurement” in step S81. In a case where it is determined that an error is included in the motion signal 70 in step S70 (NO), the process proceeds to a notification of “measurement error” in step S82, and also returns to step S10 so that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state.


The notification of “favorable measurement” in step S81 is performed by using a light emission color and a notification (blinking) pattern of the light emitter 132. A light emission color and a notification pattern corresponding to “favorable measurement” are set in advance.


The notification of “measurement error” in step S82 is performed by using a light emission color and a notification (blinking) pattern of the light emitter 132. A light emission color and a notification pattern corresponding to “measurement error” are set in advance. Consequently, the subject M is notified of “measurement error”, and it is also prompted that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state in step S10.


In the transmission process, in step S90, the determination portion 204 determines whether or not transmission of the motion signal 70 transmitted to the analysis unit 50 in step S60 is completed. The determination of transmission completion is performed by receiving the start parity and the end parity which are added to the motion signal 70 (packet data) to be transmitted by the data processing portion 120A of the sensor unit 10. In a case where the end parity is received before a predetermined time stored in the ROM 230 in advance elapses after the start parity is received in step S90, transmission completion is determined, and thus the process proceeds to a notification of “transmission completion” in step S101. In a case where the end parity is not received before the predetermined time elapses, the process proceeds to a notification of “transmission error” in step S102, and also returns to step S10 so that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state.
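A minimal sketch of this completion check is shown below; the marker bytes, the timeout, and the receive interface are assumptions made for illustration.

```python
# Minimal sketch: after the start parity is received, the end parity must
# arrive before a predetermined time elapses, otherwise a transmission
# error is reported. Marker bytes and timeout are illustrative.
import time

START_PARITY = b"\x02"   # assumed start marker
END_PARITY = b"\x03"     # assumed end marker

def wait_for_completion(receive_chunk, timeout_s=5.0):
    """receive_chunk: callable returning the next received bytes (possibly
    empty). Returns 'transmission completion' or 'transmission error'."""
    deadline = time.monotonic() + timeout_s
    started = False
    while time.monotonic() <= deadline:
        chunk = receive_chunk()
        if not started and START_PARITY in chunk:
            started = True
            # Restart the clock: the end parity must follow within timeout_s.
            deadline = time.monotonic() + timeout_s
        if started and END_PARITY in chunk:
            return "transmission completion"
    return "transmission error"
```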


The notification of “transmission completion” in step S101 is performed by using a light emission color and a notification (blinking) pattern. A light emission color and a notification pattern corresponding to “transmission completion” are set in advance. Consequently, the subject M is notified of “transmission completion”.


The notification of “transmission error” in step S102 is performed by using a light emission color and a notification pattern corresponding to “transmission error” and set in advance. Consequently, the subject M is notified of “transmission error”, and it is also prompted that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state in step S10.


<Analysis Process>


The analysis process is a process of analyzing the motion signal 70 transmitted to the analysis unit 50, acquired in the motion measurement process, and based on the motion (swing) of the subject M holding the golf club 500.


In the analysis process, the motion signal 70 transmitted to the analysis unit 50 in step S110 and based on the motion (swing) of the subject M, is analyzed according to a predetermined analysis program stored in the ROM 230. In the analysis process, an analysis result is displayed (output) on the display section 260.


A technique of analyzing a swing on the basis of output signals from the sensor 110, included in the motion signal 70, may employ a technique disclosed in a patent publication (for example, JP-A-2014-90773).


In the analysis process, in step S120, the determination portion 204 determines the analysis result in step S110. The determination of the analysis result is performed on the basis of an analysis result stored in the ROM 230 in advance.


In the analysis process, in step S120, the determination portion 204 compares the analysis result of the motion signal 70 analyzed in step S110 with an analysis result (hereinafter, referred to as a “standard analysis result”) within a predetermined range recorded in the ROM 230 in advance, so as to determine whether or not the analysis result is included in the range of the standard analysis result.


In a case where it is determined that the analysis result is included in the range of the standard analysis result in step S120 (YES), the process proceeds to a notification of “analysis completion” in step S131. In a case where it is determined that the analysis result is not included in the range of the standard analysis result in step S120 (NO), the process proceeds to a notification of “analysis error” in step S132, and also returns to step S10 so that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state.


The notification of “analysis completion” in step S131 is performed by using a light emission color and a notification (blinking) pattern. A light emission color and a notification pattern corresponding to “analysis completion” are set in advance. Consequently, the subject M is notified of “analysis completion”.


The notification of “analysis error” in step S132 is performed by using a light emission color and a notification (blinking) pattern. A light emission color and a notification pattern corresponding to “analysis error” are set in advance. Consequently, the subject M is notified of “analysis error”, and it is also prompted that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state in step S10.


In the motion analysis method, a series of processes are finished when the notification of “analysis completion” in step S131 is performed.


In the above-described motion analysis method, the respective steps after the notification of “standing still state detection” in step S41 are continuously performed. In the above-described motion analysis method, the notifications in the respective steps after the notification of “standing still state detection” in step S41 may be omitted or added as appropriate.


According to Embodiment 1, the following effects are achieved.


According to the motion detection apparatus 1, the subject M holding an exercise appliance can visually perceive a state of the motion detection apparatus 1 through light emission of the notification portion 30 without changing an attitude of the subject.


Therefore, in the motion detection apparatus 1, the subject M can perform motion (swing) while holding the exercise appliance without averting the visual line e or losing attention. Thus, it is possible to detect a natural motion (swing) attitude and thus to increase reliability of motion analysis.


Embodiment 2

Next, a description will be made of Embodiment 2 of the invention with reference to FIGS. 7 and 8. In the following description, the same part as the part having already been described is given the same reference numeral, and a description thereof will be omitted.



FIG. 7 is a schematic diagram illustrating an example in which the motion detection apparatus 1 according to Embodiment 2 is applied to the golf club 500, and FIG. 8 is a block diagram schematically illustrating the motion detection apparatus 1 according to Embodiment 2.


Embodiment 1 employs an aspect in which the imaging portion 150 is integrally incorporated into the casing 130, but Embodiment 2 employs an aspect in which the imaging portion 150 is separate from the casing 130 without being incorporated thereinto, and is attached to the shaft 500s of the golf club 500. In other words, as illustrated in FIG. 7, the imaging portion 150 is attached, by using a holder 160, to a part of the shaft 500s of the golf club 500 that is closer to the head 500h than the casing 130. In this case, when a casing 152 of a camera 156 is attached to the holder 160, a lens portion 154 of the camera 156 is directed toward the head 500h and can thus image the head 500h and a golf ball in detail.


As illustrated in FIG. 8, the imaging portion 150 includes a communication device 158, the camera 156, and a controller 157 controlling functions of the communication device 158 and the camera 156.


The communication device 158 can perform short-range radio communication with the communication portion 120C. In this case, a transmission method is not particularly limited, and may employ a protocol of a short-range radio communication standard such as Bluetooth (registered trademark).


According to the above-described Embodiment 2, it is possible to calculate a distance between the face surface of the golf club 500 and a golf ball with high accuracy, and thus to detect a standing still state in detail, in addition to the effects described in Embodiment 1.


Embodiment 3

<Correction of Position of Head 500h>



FIGS. 9A and 9B are diagrams for explaining correction of a position of the head 500h when a swing is started and a golf ball is hit in a motion detection apparatus according to Embodiment 3 of the invention.


A description will be made of a motion detection apparatus and a detection system according to the present embodiment with reference to the drawings. The same constituent elements as in the above-described embodiments are given the same reference numerals, and repeated description will be omitted.



FIG. 9A illustrates trajectories (trajectories of the head 500h and the grip 500g) of the golf club 500 drawn by using a position of the head 500h before being corrected, obtained through computation, and FIG. 9B illustrates trajectories of the golf club 500 drawn by using a position of the head 500h after being corrected. In Embodiment 1, an XYZ coordinate system (global coordinate system) having a target line indicating a target hit ball direction as an X axis, an axis on a horizontal plane perpendicular to the X axis as a Y axis, and a vertically upward direction (a direction opposite to the gravitational direction) as a Z axis is defined, and FIGS. 9A and 9B illustrate the X axis, the Y axis, and the Z axis.


In FIGS. 9A and 9B, the reference signs S1, HP1, and GP1 respectively indicate positions of the shaft 500s, the head 500h, and the grip 500g during swing starting, and the reference signs S2, HP2, and GP2 respectively indicate positions of the shaft 500s, the head 500h, and the grip 500g at impact.


In FIGS. 9A and 9B, the position HP1 of the head 500h during swing starting matches the origin (0,0,0) of the XYZ coordinate system. A dashed line HL1 and a solid line HL2 respectively indicate trajectories of the head 500h during a backswing and a downswing, and a dashed line GL1 and a solid line GL2 respectively indicate trajectories of the grip 500g during a backswing and a downswing. A connection point between the dashed line HL1 and the solid line HL2 and a connection point between the dashed line GL1 and the solid line GL2 respectively correspond to a position of the head 500h and a position of the grip 500g at a top of the swing (when a swing direction is changed).


Since the head 500h is located slightly in front of a ball during swing starting, and comes into contact with the ball at impact, positions of the head 500h during swing starting and at impact may be substantially the same as each other in an actual swing. However, as illustrated in FIG. 9A, the position HP2 of the head 500h at impact, obtained through computation, is located at a position slightly deviated from the position HP1 of the head 500h during swing starting due to the influence of an integration error or the like of an acceleration or an angular velocity. In other words, the trajectory illustrated in FIG. 9A is slightly different from a trajectory of an actual swing.


Therefore, for example, if a position of the head 500h at one of the time of swing starting and the time of impact is corrected to match the other position under the premise that positions of the head 500h during swing starting and at impact are substantially the same as each other in an actual swing in FIG. 9A, as illustrated in FIG. 9B, the positions of the head 500h during swing starting and at impact are the same as each other, and thus trajectories closer to the actual swing than the trajectories illustrated in FIG. 9A can be obtained. If a position of the head 500h right before impact is used, it is possible to correct an error with higher accuracy than in the above description.
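As a purely illustrative sketch of the kind of correction described above (the embodiment does not specify how the deviation is distributed over the trajectory), the computed deviation between the impact position and the swing-start position could be removed as follows; the function correct_head_positions and the linear distribution of the error are assumptions.

```python
import numpy as np

# Hypothetical sketch of the correction of FIG. 9B: the computed impact
# position HP2 is pulled back onto the swing-start position HP1, and the
# accumulated deviation is distributed linearly over the samples up to impact.
def correct_head_positions(head_positions, impact_index):
    """head_positions: (N, 3) array of XYZ head positions; index 0 is swing starting."""
    positions = np.asarray(head_positions, dtype=float).copy()
    if impact_index <= 0:
        return positions
    deviation = positions[impact_index] - positions[0]   # HP2 - HP1
    for i in range(1, impact_index + 1):
        positions[i] -= deviation * (i / impact_index)   # linearly distribute the error
    return positions
```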


<Detailed Attachment Method of Sensor Section 100>


In a case where the above-described motion analysis process or correction of a position of the head 500h is performed, a calculation process is performed on the premise that a target line expressed by the X axis matches a normal direction of the head 500h. However, even in a case where a direction of a reference mark 190 engraved on the holder 200 is substantially matched with a direction of the head 500h of the golf club 500, a fine deviation occurs between the target line and the normal direction of the head 500h. Therefore, the deviation is corrected in detail, and thus it is possible to perform motion analysis or position correction with higher accuracy.



FIGS. 10A to 10C are diagrams illustrating a method of adjusting a target line and a normal direction of the head 500h in detail on the basis of an image signal obtained through imaging in the imaging portion 150. In other words, the processing section 201 of the analysis unit 50 extracts an image signal obtained through imaging in the imaging portion 150 from the motion signal 70. The calculation portion 202 performs image processing on the basis of the extracted image signal, so as to extract an image of a marker of the head 500h of the golf club 500, corresponding to a predetermined reference image.


The analysis portion 206 determines a normal direction PX to a face 500f on the basis of the marker image, and calculates an angle R formed between a target line direction (X direction) recognized by the sensor section 100 and the normal direction PX. In a case where a mark 500m is engraved on the head 500h, the mark 500m may be used as the marker.


In a case where the mark 500m or the like is not engraved on the head 500h, the calculation portion 202 may perform image processing on the basis of the extracted image signal by using the head 500h of the golf club 500 as a marker, so as to extract the face 500f. In this case, the analysis portion 206 calculates the angle R formed between a target line direction (X direction) recognized by the sensor section 100 and the normal direction PX to the face 500f of the golf club 500.
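For illustration, the angle R could be computed from the extracted normal direction PX as in the following sketch; the function attachment_angle and the two-component representation of PX on the horizontal plane are hypothetical simplifications and not the disclosed implementation.

```python
import math

# Hypothetical sketch of computing the angle R between the target line
# direction (the X axis recognized by the sensor section 100) and the face
# normal PX extracted from the image; a signed angle distinguishes the two
# deviation directions of FIGS. 10A and 10C.
def attachment_angle(normal_px):
    """normal_px: (x, y) components of the normal PX projected onto the horizontal plane."""
    x, y = normal_px
    return math.degrees(math.atan2(y, x))   # 0 degrees when PX matches the target line
```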


The processing section 201 transmits notification information corresponding to the calculated angle R to the notification portion 30. The notification portion 30 notifies a user of a direction of deviation between the target line and the normal direction of the head 500h and the extent of the deviation on the basis of the notification information.


For example, FIG. 10A illustrates that an angle R1 is formed between the target line direction and the normal direction PX in one direction, that is, an attachment angle of the holder 200 is deviated by almost the angle R1. FIG. 10B illustrates that the target line direction substantially matches the normal direction PX, that is, the holder 200 is accurately mounted. FIG. 10C illustrates that an angle R2 is formed between the target line direction and the normal direction PX in the other direction opposite to one direction, that is, an attachment angle of the holder 200 is deviated by almost the angle R2.


In any case of FIGS. 10A to 10C, the light emitter 132 notifies the user by changing a light emitting part in accordance with a deviation direction of an attachment angle and the extent of deviation.
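A minimal sketch of how the light emitting part might be selected from the signed angle is given below; the segment count, the angular step, and the function led_segment are assumptions, since the embodiment only states that the light emitting part is changed in accordance with the deviation direction and its extent.

```python
# Hypothetical sketch: the signed attachment angle is mapped to one of several
# light-emitting segments, the centre segment meaning "accurately mounted"
# (FIG. 10B) and the outer segments indicating the direction and extent of the
# deviation (FIGS. 10A and 10C).
def led_segment(angle_deg, step_deg=2.0, num_segments=5):
    centre = num_segments // 2
    offset = int(round(angle_deg / step_deg))
    return max(0, min(num_segments - 1, centre + offset))
```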


According to Embodiment 3, the following effects are achieved.


In a case where the sensor section 100 is attached to the golf club 500 via the holder 200, the notification portion 30 can guide a state of positioning. Therefore, the user can easily attach the sensor section 100 to the golf club 500 with high accuracy in accordance with the guide of the notification portion 30. Consequently, it is possible to increase reliability of motion analysis for a swing, performed by the motion detection apparatus 1.


Embodiment 4

Next, Embodiment 4 of the invention will be described with reference to FIGS. 11, 12A and 12B. FIG. 11 is a schematic diagram illustrating the sensor unit 10, and FIGS. 12A and 12B are diagrams for explaining adjustment of a position of the sensor unit 10 attached to the golf club 500. In the following description, the same part as the part having already been described is given the same reference numeral, and a description thereof will be omitted.


In Embodiment 4, the sensor section 100 is configured to further include a projection portion 153. The projection portion 153 is driven in a state in which the holder 200 is attached to the golf club 500, and the sensor section 100 is mounted in the holder 200.


The projection portion 153 projects a pattern image 155 indicating a target line direction recognized by the sensor section 100 onto an upper surface of the head 500h, that is, the same surface as the surface on which the mark 500m is engraved. In Embodiment 4, the pattern image 155 employs a cross-line image indicating a target line direction and an orthogonal direction which is orthogonal thereto, but is not limited thereto. For example, the pattern image 155 may be a line image generated by a laser light source.


The user visually recognizes the pattern image 155 projected onto the upper surface of the head 500h, and adjusts the holder 200 so that the orthogonal direction of the pattern image 155 is parallel to the surface direction of the face 500f.


According to the above-described Embodiment 4, since a light emission state of the light emitter 132 is not required to be visually recognized when attachment of the golf club 500 and the holder 200 is adjusted, the adjustment work is easily performed, in addition to the same effects as in Embodiment 3.


The embodiments of the invention have been described with reference to the drawings, but a specific configuration is not limited to the embodiments, and design change and the like may occur within the scope without departing from the spirit of the invention. For example, the notification portion 30 performs a notification by using light blinking of the light emitter 132, but is not limited thereto. For example, a notification may be performed by using a sound or vibration. There may also be an aspect in which a notification is performed by using display on the display section 260 of the analysis unit 50.


Instead of adjustment of turning the holder 200 in accordance with the normal direction PX to the face 500f of the golf club 500, there may be an aspect in which an analysis program analyzes motion on the basis of the angle R formed between the target line direction (X direction) and the normal direction PX, or an aspect in which a correction process is performed in the process of correcting a position of the head 500h.


An apparatus performing the above-described technique may be implemented by a single apparatus or by a combination of a plurality of apparatuses, and thus various aspects may occur. For example, there may be an aspect in which the analysis processing function of the analysis unit 50 is realized by only the sensor unit 10.


Embodiment 5


FIG. 13 is a block diagram schematically illustrating a motion detection apparatus 1′ according to Embodiment 5, and is a diagram illustrating a relationship between the sensor unit 10 and an analysis unit 50′.



FIG. 14 is a block diagram illustrating details of the processing section 201 of the analysis unit 50′, and FIG. 15 is a block diagram illustrating details of a storage section 350 of the analysis unit 50′.



FIGS. 16A and 16B are diagrams schematically illustrating the sensor unit 10 of the motion detection apparatus 1′. FIG. 16A is a diagram illustrating a trajectory of the golf club 500 before being corrected, and FIG. 16B is a diagram illustrating a trajectory of the golf club 500 after being corrected. FIG. 17 is a flowchart illustrating a flow of a process in the motion analysis method.


Hereinafter, a description will be made of an analysis system according to the present embodiment, a recording medium recording the analysis program, and an analysis method with reference to the drawings. The same constituent elements as in the above-described embodiments are given the same reference numerals, and repeated description will be omitted.


As illustrated in FIG. 13, the motion detection apparatus 1′ is configured to include the sensor unit 10 and the analysis unit 50′.


The sensor unit 10 is configured to include a sensor section 100, a casing 130 accommodating the sensor section 100, and a holder 200.


The sensor section 100 is configured to include a notification portion 30, a sensor 110, an imaging portion 150, and a controller 120, and these constituent elements are provided in the same casing 130.


The analysis unit 50′ is configured to include a processing section (CPU) 201, a communication section 210, an operation section 220, the storage section 350, and a display section 260.


The analysis unit 50′ may be a personal computer, a high function mobile phone (smart phone), a multi-function portable terminal (tablet terminal) or the like.


The storage section 350 may be a ROM, a RAM, a nonvolatile memory, or the like, and stores programs with which the processing section 201 performs various computation processes or control processes, and various programs, data, and the like for realizing application functions. The storage section 350 also temporarily stores a program or data read therefrom, data which is input from the operation section 220, results of calculation performed by the processing section 201 according to the various programs or application functions, and the like.


The processing section 201 is configured to include a calculation portion 202, a determination portion 204, and an analysis portion 206. The processing section 201 performs various processes such as a computation process, an analysis process, and a determination process on the motion signal 70 which is received from the sensor unit 10 via the communication section 210 according to the program stored in the storage section 350.


In the processing section 201, the calculation portion 202 performs a calculation process on the motion signal 70 transmitted from the sensor unit 10. The determination portion 204 performs a determination of quality of an attachment position of the sensor unit 10 attached to the golf club 500 or various determinations associated with motion analysis, on the basis of results of the calculation process.


The processing section 201 transmits a trigger (result) signal 80 regarding a determination result or the like to the sensor unit 10, and transmits the trigger signal to the notification portion 30.


The calculation portion 202 performs a calculation process on the motion signal 70 transmitted from the sensor unit 10. The analysis portion 206 performs motion analysis of a measurement target on the basis of a calculation process result.


The determination portion 204 has a determination function of performing determination such as a determination of whether or not motion is detected or a determination of each timing of motion on the basis of a motion analysis result in the analysis portion 206.



FIG. 14 is a block diagram illustrating details of respective functions of the processing section 201. FIG. 15 is a diagram illustrating details of information stored in the storage section 350.


The calculation portion 202 includes an image processor 203, a position computer 304, a position corrector 205, and a speed computer 306. The analysis portion 206 includes a motion analysis information generator 208 having an analysis function. The position computer 304, the position corrector 205, the speed computer 306, and the motion analysis information generator 208 correspond to a correction portion having a correction function.


The storage section 350 stores a swing analysis program 251, a standing still state determination program 252, a distance calculation program 253, club specification information 254, and sensor attachment position information 255.


The standing still state determination program 252 and the distance calculation program 253 are sub-set programs called by the swing analysis program 251. A user may update or uninstall the swing analysis program 251, the standing still state determination program 252, and the distance calculation program 253 stored in the storage section 350. The user may install other sub-set programs as necessary. For example, there may be a sub-set program in which an image of a golf ball hit by swinging the golf club 500 is captured, image processing is performed on the captured image so that a state of collision with the face 500f is analyzed, and thus a flight direction or a flight distance of the golf ball is estimated.


The image processor 203 has an image processing function. In other words, the image processor 203 extracts an image signal from the motion signal 70, and applies, for example, a well-known pattern matching technique to an image based on the extracted image signal so as to extract a captured image of a region including the head 500h of the golf club 500 and a golf ball.


The image processor 203 performs image processing using a well-known edge extraction technique or the like on the extracted captured image so as to generate a processed image, analyzes the processed image, and calculates a distance between a predetermined portion (for example, the face 500f) of the head 500h and the golf ball which is a predetermined target object by counting the number of pixels therebetween. The image processor 203 outputs the calculated distance information or data of the captured image having undergone the image processing in response to a request from another functional constituent element.
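As an illustration of the pixel-counting step (and only that step), the following sketch measures the gap between the outermost detected edges along one scan row of an edge-extracted image; the function head_ball_distance, the single-row simplification, and the mm_per_pixel scale are assumptions rather than the disclosed implementation.

```python
import numpy as np

# Hypothetical sketch of the distance calculation of the image processor 203:
# along a scan row of the edge-extracted image crossing the face 500f and the
# ball, the number of pixels between the two outermost edges is counted and
# converted to a physical distance with an assumed pixel scale.
def head_ball_distance(edge_row, mm_per_pixel):
    """edge_row: 1-D binary array (1 = edge pixel) taken across the face and the ball."""
    edges = np.flatnonzero(edge_row)
    if edges.size < 2:
        return None                          # face or ball edge not found
    pixel_gap = int(edges[-1] - edges[0])    # pixel count between the outermost edges
    return pixel_gap * mm_per_pixel
```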


The position computer 304 performs a process of calculating a position (a coordinate of a position in an XYZ coordinate system) of the head 500h of the golf club 500 in a swing by using measured data output from the sensor unit 10.


The position computer 304 performs a process of calculating a position (a coordinate of a position in the XYZ coordinate system) of the grip 500g of the golf club 500 in a swing by using measured data output from the sensor unit 10. In the present embodiment, the XYZ coordinate system (global coordinate system) having a target line indicating a target hit ball direction as an X axis, an axis on a horizontal plane perpendicular to the X axis as a Y axis, and a vertically upward direction (a direction opposite to the gravitational direction) as a Z axis is defined.


Specifically, first, the position computer 304 computes an offset amount included in the measured data by using measured data (acceleration data and angular velocity data) during standing still (at address) of the user, stored in the storage section 350. Next, the position computer 304 subtracts the offset amount from the measured data after swing starting, stored in the storage section 350, so as to perform bias correction, and computes a position and an attitude (attitude angle) of the sensor unit 10 during a swing action of the user by using the bias-corrected measured data.


For example, the position computer 304 acquires the distance information output from the image processor 203, and determines a standing still state in an address state of the user in a case where a change in the distance is included in a predetermined range. The position computer 304 computes a position (initial position) of the sensor unit 10 in the XYZ coordinate system by using acceleration data measured by the three-axis acceleration sensors 112, and the club specification information 254 and the sensor attachment position information 255 stored in the storage section 350, and integrates subsequent acceleration data so as to compute changes in positions from the initial position of the sensor unit 10 in a time series.


The position computer 304 computes an attitude (initial attitude) of the sensor unit 10 during standing still (at address) of the user in the XYZ coordinate system by using acceleration data measured by the three-axis acceleration sensors 112, and then time-serially computes changes in attitudes from the initial attitude of the sensor unit 10 by performing rotation calculation using angular velocity data measured by the three-axis gyro sensors 114.
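The bias subtraction and integration described above might be sketched as follows; this illustrative code ignores gravity compensation and attitude propagation, which the position computer 304 would also have to handle, and the function integrate_positions is hypothetical.

```python
import numpy as np

# Hypothetical sketch of the core of the position computer 304: the offset
# measured during standing still (at address) is subtracted from each sample
# and the corrected acceleration is integrated twice to obtain positions
# relative to the initial position of the sensor unit 10.
def integrate_positions(accel_samples, accel_bias, dt, initial_position):
    bias = np.asarray(accel_bias, dtype=float)
    position = np.asarray(initial_position, dtype=float).copy()
    velocity = np.zeros(3)
    positions = [position.copy()]
    for a in accel_samples:
        a_corr = np.asarray(a, dtype=float) - bias   # bias correction
        velocity += a_corr * dt                      # integrate acceleration -> velocity
        position += velocity * dt                    # integrate velocity -> position
        positions.append(position.copy())
    return np.array(positions)
```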


The position corrector 205 performs a process of correcting position information of the head 500h of the golf club 500, acquired from data measured by the sensor unit 10, on the basis of a difference between a position of the head 500h of the golf club 500 during swing starting and a position of the head 500h of the golf club 500 at impact.


Here, with reference to FIGS. 16A and 16B, a description will be made of correction of a position of the head 500h of the golf club 500.



FIG. 16A illustrates trajectories (trajectories of the head 500h and the grip 500g) of the golf club 500 drawn by using a position of the head 500h before being corrected, obtained through computation, and FIG. 16B illustrates trajectories of the golf club 500 drawn by using a position of the head 500h after being corrected.


In FIGS. 16A and 16B, the reference signs S1, HP1, and GP1 respectively indicate positions of the shaft 500s, the head 500h, and the grip 500g during swing starting, and the reference signs S2, HP2, and GP2 respectively indicate positions of the shaft 500s, the head 500h, and the grip 500g at impact.


In FIGS. 16A and 16B, the position HP1 of the head 500h during swing starting matches the origin (0,0,0) of the XYZ coordinate system. A dashed line HL1 and a solid line HL2 respectively indicate trajectories of the head 500h during a backswing and a downswing, and a dashed line GL1 and a solid line GL2 respectively indicate trajectories of the grip 500g during a backswing and a downswing. A connection point between the dashed line HL1 and the solid line HL2 and a connection point between the dashed line GL1 and the solid line GL2 respectively correspond to a position of the head 500h and a position of the grip 500g at a top of the swing (when a swing direction is changed).


Since the head 500h is located slightly in front of a ball during swing starting, and comes into contact with the ball at impact, positions of the head 500h during swing starting and at impact may be substantially the same as each other in an actual swing. However, as illustrated in FIG. 16A, the position HP2 of the head 500h at impact, obtained through computation, is located at a position slightly deviated from the position HP1 of the head 500h during swing starting due to the influence of an integration error or the like of an acceleration or an angular velocity. In other words, the trajectory illustrated in FIG. 16A is slightly different from a trajectory of an actual swing.


Therefore, for example, if a position of the head 500h at one of the time of swing starting and the time of impact is corrected to match a position at the other thereof under the premise that positions of the head 500h during swing starting and at impact are substantially the same as each other in an actual swing in FIG. 16A, as illustrated in FIG. 16B, the positions of the head 500h during swing starting and at impact are the same as each other, and thus trajectories closer to the actual swing than the trajectories illustrated in FIG. 16A can be obtained.


If the correction is performed by using a position of the head 500h right before impact, that is, a distance XL between the head 500h and the golf ball, output from the image processor 203, it is possible to correct an error with higher accuracy than in the above description.


Referring to FIG. 14 again, the position corrector 205 analyzes the captured image output from the image processor 203 so as to calculate the distance XL between the head 500h of the golf club 500 and the golf ball at address. Here, by using a position of the head 500h of the golf club 500 at one of the time of swing starting and the time of impact, the position corrector 205 corrects a position of the head 500h of the golf club 500 at the other thereof by also taking into consideration the calculated distance XL.
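For illustration, the correction using the distance XL might be sketched as follows; the assumption that the face reaches the ball after moving XL along the target line (X axis) and the linear distribution of the residual deviation are simplifications not specified by the embodiment.

```python
import numpy as np

# Hypothetical sketch of the position corrector 205 when the address distance
# XL output from the image processor 203 is available: the expected impact
# position is the swing-start position shifted by XL along the X axis, and the
# residual deviation of the computed impact position is distributed over the
# samples up to impact.
def correct_with_distance(head_positions, impact_index, distance_xl):
    positions = np.asarray(head_positions, dtype=float).copy()
    if impact_index <= 0:
        return positions
    expected_impact = positions[0].copy()
    expected_impact[0] += distance_xl                    # ball assumed XL ahead along the X axis
    deviation = positions[impact_index] - expected_impact
    for i in range(1, impact_index + 1):
        positions[i] -= deviation * (i / impact_index)   # distribute the residual error
    return positions
```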


The speed computer 306 corrects speed information of the head 500h of the golf club 500, acquired from the measured data in the sensor unit 10, on the basis of a difference between the position of the head 500h of the golf club 500 during swing starting and the position of the head 500h of the golf club 500 at impact. In the present embodiment, a speed of the head 500h is calculated by using time-series information regarding corrected positions of the head 500h generated by the position corrector 205.
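A minimal sketch of deriving the head speed from the corrected time-series positions is shown below; the function head_speed and the simple finite-difference scheme are assumptions.

```python
import numpy as np

# Hypothetical sketch of the speed computer 306: the head speed is obtained by
# differentiating the corrected time-series positions of the head 500h
# produced by the position corrector 205.
def head_speed(corrected_positions, dt):
    positions = np.asarray(corrected_positions, dtype=float)
    velocities = np.diff(positions, axis=0) / dt   # (N-1, 3) velocity vectors
    return np.linalg.norm(velocities, axis=1)      # scalar speed for each interval
```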


The motion analysis information generator 208 performs a process of performing swing analysis by using the corrected position information or the corrected speed information, and generating motion analysis information which is information indicating an analysis result. For example, the motion analysis information generator 208 generates trajectory information (image data) indicating movement of the golf club 500 in a predetermined swing period by using time-series information regarding positions of various portions of the golf club 500, generated by the position corrector 205.


For example, the motion analysis information generator 208 may sequentially connect positions (coordinates) of the head 500h from the time of swing starting to the time of impact, and, similarly, may sequentially connect positions (coordinates) of the grip 500g from the time of swing starting to the time of impact, so as to generate trajectory information including trajectories (HL1 and HL2 in FIG. 16B) of the head and trajectories (GL1 and GL2 in FIG. 16B) of the grip from the time of swing starting to the time of impact. The motion analysis information generator 208 may correct the generated trajectory information on the basis of the motion signal 70. In other words, correction targets in the present embodiment may be at least one of motion information such as an acceleration or an angular velocity based on the motion signal 70, and information regarding an analyzed speed, trajectory or the like.
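For illustration, connecting the corrected positions into trajectory information might be sketched as follows; the function build_trajectories and the dictionary output are hypothetical.

```python
# Hypothetical sketch of the motion analysis information generator 208:
# trajectory information is built by sequentially connecting the corrected
# head and grip positions from the time of swing starting to the time of
# impact (HL1/HL2 and GL1/GL2 in FIG. 16B).
def build_trajectories(head_positions, grip_positions, impact_index):
    head_trajectory = [tuple(p) for p in head_positions[: impact_index + 1]]
    grip_trajectory = [tuple(p) for p in grip_positions[: impact_index + 1]]
    return {"head": head_trajectory, "grip": grip_trajectory}
```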


<Motion Analysis Method>


A motion analysis method of the present embodiment includes a measurement preparation process, a motion measurement process, a transmission process of transmitting the motion signal 70 obtained in the motion measurement process to the analysis unit 50′, and an analysis process of analyzing the motion signal 70 transmitted in the transmission process. The motion analysis method includes a standing still state notification process of performing a notification of completion of the measurement preparation process, a measurement completion notification process of performing a notification of completion of the motion measurement process, and a transmission completion notification process of performing a notification of completion of transmission of the motion signal 70 from the sensor unit 10 to the analysis unit 50′.


Each process will be described for each step with reference to a flowchart illustrated in FIG. 17 with respect to the motion analysis method of the present embodiment using the motion detection apparatus 1′. Regarding the description of the motion analysis method, the motion analysis method in which the motion detection apparatus 1′ is applied to the golf club 500 will be described.


<Measurement Preparation Process>


The measurement preparation process is a process of preparing for measurement of motion, and is a process of measuring a bias of the sensor 110 before starting motion (swing).


Here, the bias collectively indicates a zero bias in an initial state in which angular velocity is zero before the user starts motion, and a drift caused by an external factor such as a power source fluctuation or a temperature change.


In the measurement preparation process, in step S10, the analysis unit 50′ acquires the motion signal 70 in a case of a standing still state (so-called address state) in which the user holds the golf club 500 and stands still. The measurement preparation process includes an imaging process of the imaging portion 150 capturing an image of the head 500h of the golf club 500.


In the measurement preparation process, in step S20, the calculation portion 202 performs a calculation process on the motion signal 70 acquired by the analysis unit 50′.


In the present embodiment, in step S20, the standing still state determination program 252 and the distance calculation program 253 stored in the storage section 350 are read and executed.


In other words, the distance calculation program 253 extracts an image signal from the motion signal 70 calculated by the calculation portion 202 in step S20, and extracts a captured image of a region including the head 500h of the golf club 500 and a golf ball at a predetermined time interval in an image processing process of processing the extracted image signal. The distance calculation program 253 performs image processing on the extracted captured image so as to calculate the distance XL between the head 500h and the golf ball, and record the calculated distance information in the storage section 350.


The standing still state determination program 252 reads the distance information which is stored in a time series, and examines a change in the distance information associated with time elapse. For example, in a case where a change amount of the distance XL is equal to or less than a predetermined reference value, and a state in which the change amount is equal to or less than the predetermined reference value lasts for a predetermined time (for example, 3 seconds), it is determined that the golf club 500 is in a standing still state.
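The standing-still check described above might be sketched as follows; the function is_standing_still, the sampling interval dt, and the reference change value are hypothetical placeholders for the quantities handled by the standing still state determination program 252.

```python
# Hypothetical sketch of the check performed by the standing still state
# determination program 252: the club is judged to stand still when the
# head-to-ball distance XL changes by no more than a reference value for a
# sustained period (3 seconds in the example above).
def is_standing_still(distances, dt, reference_change, hold_time_s=3.0):
    """distances: time series of the distance XL sampled every dt seconds."""
    needed = max(1, int(round(hold_time_s / dt)))
    still = 0
    for prev, cur in zip(distances, distances[1:]):
        still = still + 1 if abs(cur - prev) <= reference_change else 0
        if still >= needed:
            return True
    return False
```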


In the measurement preparation process, in step S30, in a case where it is determined that the golf club 500 is in a standing still state (YES), the process proceeds to a notification of “standing still state detection” in step S41, and the motion signal 70 at that time is stored in the storage section 350 as a bias value.


In a case where it is determined that the golf club 500 is not in a standing still state in step S30 (NO), a notification of a “standing still state detection error” is performed in step S42, and the process returns to step S10 so as to be performed again from acquisition of the motion signal 70 in a standing still state.


The detection of a standing still state is not limited to analyzing an output signal from the sensor 110, and may have an aspect of determining a standing still state by processing an image captured by the imaging portion 150.


<Standing Still State Notification Process>


The standing still state notification process is a process of performing a notification of a determination result of whether or not the golf club 500 and the user holding the golf club 500 are in a standing still (address) state on the basis of the motion signal 70 in the above-described measurement preparation process.


In the standing still state notification process, in a case where it is determined that the golf club 500 and the user holding the golf club 500 are in a standing still state on the basis of the motion signal 70 in step S30, a notification of “standing still state detection” is performed in step S41. The notification is also a notification that the measurement target has started motion.


In the standing still state notification process, in a case where it is determined that the golf club 500 and the user holding the golf club 500 are not in a standing still state on the basis of the motion signal 70 in step S30, a notification of a “standing still state detection error” is performed in step S42.


The standing still state notification process is performed by the light emitter 132 provided in the notification portion 30. Here, the notification of standing still state detection in the notification portion 30 is performed by using blinking and light emission colors of the light emitter 132. The notification portion 30 may change a light emission color and a blinking pattern according to the information of which the user is notified.


The notification of “standing still state detection” in step S41 is performed by using a light emission color and a notification (blinking) pattern of the light emitter 132. A light emission color and a notification pattern corresponding to “standing still state detection” are set in advance.


The notification of “standing still state detection error” in step S42 is performed through light emission of the light emitter 132 so as to be different from the notification of “standing still state detection”. A light emission color and a notification pattern corresponding to “standing still state detection error” are set in advance. Consequently, the user is notified of “standing still state detection error” and is also prompted to maintain a standing still (address) state.


<Motion Measurement Process>


The motion measurement process is a process of measuring motion (swing) of the user holding the golf club 500. The motion measurement process is a process in which the sensor 110 mounted in the sensor unit 10 measures motion (swing) of the user.


In the motion measurement process, in step S50, an acceleration or the like associated with the motion of the user is acquired from the sensor 110 as the motion signal 70.


<Transmission Process>


The transmission process is a process of transmitting, to the analysis unit 50′, the motion signal 70 which is acquired in the motion measurement process and is based on the motion (swing) of the user holding the golf club 500.


In the transmission process, in step S60, the acquired motion signal 70 is transmitted from the sensor unit 10 to the analysis unit 50′.


In the transmission process, in step S70, the determination portion 204 performs a second determination of whether or not an error (for example, an over-range or a missing sample) is included in the motion signal 70 transmitted to the analysis unit 50′ in step S60. In step S70, the determination portion 204 also determines whether or not the acceleration or the like associated with the motion exceeds a preset value, on the basis of the motion signal 70 transmitted to the analysis unit 50′ in step S60.


The error determination is performed through comparison with a normal motion signal 70 recorded in the storage section 350 in advance as a threshold value. The determination of the acceleration or the like associated with the motion is performed through comparison with a motion signal 70 recorded in the storage section 350 in advance as the threshold value. This determination may also be performed by using any value of the motion signal 70 of the user, such as its maximum value or minimum value, as the threshold value.


In a case where it is determined that an error is not included in the motion signal 70 in step S70, or the motion signal 70 exceeds the preset threshold value (satisfies the condition of the threshold value) (YES), the process proceeds to a notification of “favorable measurement” in step S81. In a case where it is determined that an error is included in the motion signal 70 in step S70 (NO), the process proceeds to a notification of “measurement error” in step S82, and also returns to step S10 so that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state.


The notification of “favorable measurement” in step S81 is performed by using a light emission color and a notification (blinking) pattern of the light emitter 132. A light emission color and a notification pattern corresponding to “favorable measurement” are set in advance.


The notification of “measurement error” in step S82 is performed by using a light emission color and a notification (blinking) pattern of the light emitter 132. A light emission color and a notification pattern corresponding to “measurement error” are set in advance. Consequently, the user is notified of “measurement error”, and it is also prompted that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state in step S10.


In the transmission process, in step S90, the determination portion 204 determines whether or not transmission of the motion signal 70 transmitted to the analysis unit 50′ in step S60 is completed. The determination of transmission completion is performed by receiving the start parity and the end parity which the data processing portion 120A of the sensor unit 10 adds to the motion signal 70 (packet data). In a case where the end parity is received before a predetermined time stored in the storage section 350 in advance elapses after the start parity is received in step S90 (YES), transmission completion is determined, and thus the process proceeds to a notification of “transmission completion” in step S101. In a case where the end parity is not received before the predetermined time elapses (NO), the process proceeds to a notification of “transmission error” in step S102, and also returns to step S10 so that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state.


The notification of “transmission completion” in step S101 is performed by using a light emission color and a notification (blinking) pattern. A light emission color and a notification pattern corresponding to “transmission completion” are set in advance. Consequently, the user is notified of “transmission completion”.


The notification of “transmission error” in step S102 is performed by using a light emission color and a notification pattern corresponding to “transmission error” and set in advance. Consequently, the user is notified of “transmission error”, and it is also prompted that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state in step S10.


<Analysis Process>


The analysis process is a process of analyzing the motion signal 70 transmitted to the analysis unit 50′, acquired in the motion measurement process, and based on the motion (swing) of the user holding the golf club 500.


In the analysis process, the motion signal 70 transmitted to the analysis unit 50′ in step S110 and based on the motion (swing) of the user, is analyzed according to the swing analysis program 251 stored in the storage section 350. In the analysis process, an analysis result is displayed (output) on the display section 260. The above-described correction process of correcting a position of the head 500h of the golf club 500 may be executed by the swing analysis program 251.


A technique of analyzing a swing motion on the basis of output signals (acceleration data and angular velocity data) from the sensor 110, included in the motion signal 70, may employ a technique disclosed in a patent publication (for example, JP-A-2014-90773).


In the analysis process, in step S120, the determination portion 204 determines the analysis result in step S110.


In the analysis process, in step S120, the determination portion 204 compares the analysis result of the motion signal 70 analyzed in step S110 with an analysis result (hereinafter, referred to as a “standard analysis result”) within a predetermined range recorded in the storage section 350 in advance, so as to determine whether or not the analysis result is included in the range of the standard analysis result.


In a case where it is determined that the analysis result is included in the range of the standard analysis result in step S120 (YES), the process proceeds to a notification of “analysis completion” in step S131. In a case where it is determined that the analysis result is not included in the range of the standard analysis result in step S120 (NO), the process proceeds to a notification of “analysis error” in step S132, and also returns to step S10 so that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state.


The notification of “analysis completion” in step S131 is performed by using a light emission color and a notification (blinking) pattern. A light emission color and a notification pattern corresponding to “analysis completion” are set in advance. Consequently, the user is notified of “analysis completion”.


The notification of “analysis error” in step S132 is performed by using a light emission color and a notification (blinking) pattern. A light emission color and a notification pattern corresponding to “analysis error” are set in advance. Consequently, the user is notified of “analysis error”, and it is also prompted that motion analysis is performed again from acquisition of the motion signal 70 in a standing still state in step S10.


In the motion analysis method, a series of processes are finished when the notification of “analysis completion” in step S131 is performed.


In the above-described motion analysis method, the respective steps after the notification of “standing still state detection” in step S41 are continuously performed. In the above-described motion analysis method, the notifications in the respective steps after the notification of “standing still state detection” in step S41 may be omitted or added as appropriate.


According to Embodiment 5, the following effects are achieved in addition to the effects achieved by the above-described embodiments.


According to Embodiment 5, image processing is performed on an image of the head 500h of the golf club 500 captured by the imaging portion 150 of the sensor section 100, so that information regarding a standing still state of the head 500h at address or information regarding a distance between the head 500h and a golf ball is acquired. Since a swing of the golf club 500 is analyzed by using the acquired information, it is possible to analyze the swing motion with high accuracy.


An apparatus performing the above-described technique may be implemented by a single apparatus, and may be implemented by a combination of a plurality of apparatuses, and thus various aspects may occur.


The embodiments of the invention have been described with reference to the drawings, but a specific configuration is not limited to the embodiments, and design change and the like may occur within the scope without departing from the spirit of the invention. For example, there may be an aspect in which the motion detection apparatuses 1 and 1′ have the motion state analysis function of the analysis units 50 and 50′.


Each functional element illustrated in FIGS. 1, 2 and 8 indicates a functional configuration realized in cooperation between hardware and software, and a specific mounting aspect is not particularly limited. Therefore, there may be a configuration in which individually corresponding hardware is not necessarily mounted in each functional unit, and a single processor executes a program so as to realize functions of a plurality of functional units. In the above-described embodiments, some functions realized by software may be realized by hardware, or some functions realized by hardware may be realized by software. Specific detailed configurations of other respective units of the motion detection apparatuses 1 and 1′ may also be arbitrarily changed within the scope without departing from the spirit of the invention.


REFERENCE SIGNS LIST




  • 1 AND 1′ MOTION DETECTION APPARATUS,


  • 10 SENSOR UNIT,


  • 30 NOTIFICATION PORTION,


  • 50 AND 50′ ANALYSIS UNIT,


  • 70 MOTION SIGNAL,


  • 80 TRIGGER SIGNAL,


  • 100 SENSOR SECTION,


  • 110 SENSOR,


  • 112x, 112y, AND 112z ACCELERATION SENSOR,


  • 114x, 114y, AND 114z ANGULAR VELOCITY SENSOR,


  • 120 CONTROLLER,


  • 120A DATA PROCESSING PORTION,


  • 120B POWER SOURCE PORTION,


  • 120C COMMUNICATION PORTION,


  • 130 CASING,


  • 132 LIGHT EMITTER,


  • 132a FIRST LIGHT EMITTER,


  • 132b SECOND LIGHT EMITTER,


  • 150 IMAGING PORTION,


  • 152 CASING,


  • 153 PROJECTION PORTION,


  • 154 LENS PORTION,


  • 155 PATTERN IMAGE,


  • 156 CAMERA,


  • 157 CONTROLLER,


  • 158 COMMUNICATION DEVICE,


  • 160 HOLDER,


  • 200 HOLDER,


  • 201 PROCESSING SECTION,


  • 202 CALCULATION PORTION,


  • 204 DETERMINATION PORTION,


  • 206 ANALYSIS PORTION,


  • 210 COMMUNICATION SECTION,


  • 220 OPERATION SECTION,


  • 230 ROM,


  • 240 RAM,


  • 250 NONVOLATILE MEMORY,


  • 260 DISPLAY SECTION,


  • 304 POSITION COMPUTER,


  • 306 SPEED COMPUTER,


  • 350 STORAGE SECTION,


  • 500 GOLF CLUB,


  • 500g GRIP,


  • 500h HEAD,


  • 500s SHAFT,


  • 500f FACE (FACE SURFACE),

  • M SUBJECT,

  • e VISUAL LINE


Claims
  • 1. A detection apparatus comprising: a sensor portion that is attached to an exercise appliance, and detects a swing motion of the exercise appliance; and an image capturing portion that captures an image of a part of interest, wherein the sensor portion and the image capturing portion are accommodated in the same casing.
  • 2. The detection apparatus according to claim 1, further comprising: a notification portion that performs a notification of a motion state of the exercise appliance which is analyzed on the basis of at least one of an output signal from the sensor portion and a captured image obtained by the image capturing portion.
  • 3. The detection apparatus according to claim 1, wherein the part of interest is a ball hitting portion for hitting a ball through the swing motion of the exercise appliance.
  • 4. The detection apparatus according to claim 3, wherein the motion state includes a standing still state of the exercise appliance which is determined on the basis of a plurality of captured images obtained by capturing images of the ball hitting portion with the passage of time.
  • 5. The detection apparatus according to claim 4, wherein the standing still state is a state in which the exercise appliance stands still before the swing motion is started with the exercise appliance.
  • 6. The detection apparatus according to claim 1, wherein the exercise appliance is a golf club, and the sensor portion and the image capturing portion are attached to a shaft or a grip of the golf club.
  • 7. (canceled)
  • 8. The detection apparatus according to claim 1, wherein the exercise appliance includes the part of interest, and wherein the detection apparatus further includes a determination portion that extracts a predetermined reference image from a captured image of the part of interest obtained by the image capturing portion, and determines the quality of an attachment position where the sensor portion is attached to the exercise appliance on the basis of the extracted predetermined reference image.
  • 9. A motion analysis system comprising: the detection apparatus according to claim 1; and an analysis unit that analyzes the motion state of the exercise appliance on the basis of a captured image obtained by the image capturing portion and an output signal from the sensor portion.
  • 10. The motion analysis system according to claim 9, wherein the detection apparatus transmits the captured image and the output signal to the analysis unit, and wherein the analysis unit performs analysis on the basis of the captured image and the output signal so as to output a trigger signal indicating the motion state of the exercise appliance.
  • 11. The motion analysis system according to claim 9, wherein the analysis unit determines standing-still of the exercise appliance on the basis of the captured image, and analyzes the swing motion on the basis of the output signal.
  • 12. A detection system comprising: an image capturing portion that captures an image of a part of interest of an exercise appliance to which a sensor portion detecting a swing motion is attached; and a determination portion that extracts a predetermined reference image from a captured image of the part of interest obtained by the image capturing portion, and determines the quality of an attachment position where the sensor portion is attached to the exercise appliance on the basis of the extracted predetermined reference image.
  • 13. The detection system according to claim 12, wherein the image capturing portion captures an image of the part of interest from a direction of viewing the part of interest from the sensor portion attached to the exercise appliance.
  • 14. The detection system according to claim 12, wherein the determination portion determines the quality of the attachment position on the basis of a difference between a direction specified on the basis of the predetermined reference image and a preset reference direction.
  • 15. The detection system according to claim 12, further comprising: a notification portion that performs a notification of a result determined by the determination portion.
  • 16. The detection system according to claim 12, wherein the predetermined reference image is an image of a reference mark provided on the exercise appliance.
  • 17. The detection system according to claim 12, further comprising: a projection portion that projects the predetermined reference image onto the part of interest.
  • 18. The detection system according to claim 12, further comprising: a sensor unit; and an analysis unit that is connected to the sensor unit through communication with the sensor unit, wherein the sensor unit includes the sensor portion and the image capturing portion, and wherein the analysis unit includes the determination portion.
  • 19. An analysis system comprising: a sensor portion that is attached to an exercise appliance and detects motion information of the exercise appliance; an image capturing portion that captures an image of a location including a part of interest of the exercise appliance; an image processing portion that acquires distance information regarding a distance between the part of interest and a predetermined target object on the basis of a captured image obtained by the image capturing portion; an analysis portion that analyzes a swing motion of the exercise appliance on the basis of the motion information; and a correction portion that corrects at least one of the motion information and an analysis result in the analysis portion by using the distance information.
  • 20. The analysis system according to claim 19, wherein the image processing portion analyzes an image having undergone image processing, and calculates the distance information by counting the number of pixels forming the image.
  • 21. The analysis system according to claim 19, wherein the analysis portion analyzes trajectory information regarding a trajectory along which the exercise appliance is moved on the basis of the motion information, and wherein the correction portion corrects the trajectory information by using the distance information.
  • 22. The analysis system according to claim 19, wherein the analysis portion determines whether or not the exercise appliance stands still on the basis of the distance information in each of a plurality of captured images obtained with the passage of time.
  • 23. The analysis system according to claim 19, wherein the part of interest is a ball hitting portion for hitting the target object through a swing motion of the exercise appliance.
  • 24. The analysis system according to claim 19, further comprising: a sensor unit that is attached to the exercise appliance; and an analysis unit that performs communication with the sensor unit, wherein the sensor unit includes the sensor portion and the image capturing portion, wherein the analysis unit includes the image processing portion, the correction portion, and the analysis portion, and wherein the motion information and the captured images are transmitted from the sensor unit to the analysis unit through the communication.
  • 25. A recording medium recording an analysis program causing a computer to execute: an image processing function of acquiring distance information regarding a distance between a part of interest of an exercise appliance and a predetermined target object on the basis of a captured image of the part of interest obtained by an image capturing portion, the exercise appliance being attached with a sensor portion detecting motion information; an analysis function of analyzing a swing motion of the exercise appliance on the basis of the motion information; and a correction function of correcting at least one of the motion information and an analysis result of the swing motion by using the distance information.
  • 26. An analysis method comprising: an image capturing process of capturing an image of a part of interest of an exercise appliance, the exercise appliance being attached with a sensor portion detecting motion information; an image processing process of acquiring distance information regarding a distance between the part of interest and a predetermined target object on the basis of a captured image of the part of interest; an analysis process of analyzing a swing motion of the exercise appliance on the basis of the motion information; and a correction process of correcting at least one of the motion information and an analysis result of the swing motion by using the distance information.
Priority Claims (3)
Number Date Country Kind
2015-005652 Jan 2015 JP national
2015-005653 Jan 2015 JP national
2015-005654 Jan 2015 JP national
PCT Information
Filing Document Filing Date Country Kind
PCT/JP2016/000110 1/12/2016 WO 00