This application is based upon and claims the benefit of priority from the prior Japanese Patent Application Nos. 2011-181795 and 2012-179920, filed Aug. 23, 2011 and Aug. 14, 2012, respectively, the entire contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to a musical instrument that generates electronic sound and a light-emission controller used in this musical instrument.
2. Related Art
Conventionally, musical instruments have been proposed that detect the music playing movements of a player and generate electronic sound in response. For example, a musical instrument (air drum) has been known that generates a percussion instrument sound with only a stick-shaped component. With this musical instrument, a sensor is provided to the stick-shaped component, the sensor detects the music playing movement of a player holding the component by hand and waving it, and a percussion instrument sound is generated accordingly.
According to such a musical instrument, musical notes of this instrument can be generated without requiring a real instrument; therefore, it enables the enjoyment of music playing without being subject to limitations in the music playing location or music playing space.
In regards to such a musical instrument, an instrument game device is proposed in FIG. 1 of Japanese Patent No. 3599115, for example, that is configured so as to capture an image of a music playing movement of a player using a stick-shaped component, while displaying a composite image combining this music playing movement and a virtual image showing an instrument set on a monitor, and generates a predetermined musical note depending on position information of the stick-shaped component and the virtual instrument set.
However, with the musical instrument capturing an image of a player and generating musical notes, the music-playing component must be identifiable from the captured image. More specifically, the position coordinates of a portion of the music-playing component contacting the virtual instrument must be specified in the captured image.
In this respect, with the instrumental game device described in Japanese Patent No. 3599115, it is configured so that a lamp is provided to a leading end of a penlight used by the player (FIG. 4), and the portion contacting the virtual instrument is distinguished by specifying the position coordinates of this lamp.
As a result, an electrical source for switching on the lamp is required in the music-playing component (penlight). However, in order to realize the aforementioned characteristic of not being subject to limitations in the music playing location and music playing space, it is necessary to provide, inside the music-playing component, an electrical source that does not receive electrical power by wires, such as a battery. In addition, in view of the characteristic of being held and played by the player, the weight of the music-playing component must be curbed to a certain extent, and thus simply providing a large battery in order to enable use over a long time period is not preferable.
In this regard, with the instrumental game device of Japanese Patent No. 3599115, switch-on control of this lamp is in no way taken into account, and thus a further improvement has been demanded from the viewpoint of a reduction in the electricity consumption of the music-playing component.
The present invention has been made by taking such demands into account, and has an object of providing a musical instrument and light-emission controller that realize a reduction in the electricity consumption of a music-playing component, in a musical instrument that generates musical notes based on the position coordinates of a light-emitting part of the music-playing component in image-capture space.
In order to achieve the above-mentioned object, a musical instrument according to an aspect of the present invention includes: a music-playing component to be held by a player, including a light-emitting part that emits light and switches off; an image-capturing device that captures an image of an image-capture space that contains the player holding the music-playing component; a sound generating device that generates sound based on a position of the light-emitting part while it is emitting light in the image-capture space captured by the image-capturing device; a detector that detects start and end of a down swing movement of the music-playing component by the player; and a light-emission controller that (a) controls the light-emitting part to emit light when the detector detects the start of the down swing movement, and (b) controls the light-emitting part to switch off when the detector detects the end of the down swing movement.
In addition, a light-emission controller according to an aspect of the present invention includes: a detector that detects start and end of a movement provided to a component having a light-emitting part; and a control unit that (a) switches on the light-emitting part in response to the start of the movement detected by the detector, and (b) switches off the light-emitting part in response to the end of the movement detected by the detector.
Furthermore, in a control method of a musical instrument according to an aspect of the present invention, the musical instrument comprising (i) a music-playing component to be held by a player and including a light-emitting part that emits light and switches off, (ii) an image-capturing device, and (iii) a sound generating device, the method includes the steps of: capturing an image of an image-capture space that contains the player holding the music-playing component by the image-capturing device; generating sound by the sound generating device based on a position of the light-emitting part while it is emitting light in the image-capture space captured by the image-capturing device; detecting start and end of a down swing movement of the music-playing component by the player; and controlling the light-emitting part (a) to emit light when the start of the down swing movement is detected in the detecting step, and (b) to switch off when the end of the down swing movement is detected in the detecting step.
Hereinafter, embodiments of the present invention will be explained while referencing the drawings.
First, an overview of a musical instrument 1 as an embodiment of the present invention will be explained while referencing
As shown in
The sticks 10 are members of stick shape extending in a longitudinal direction, and correspond to the music-playing component of the present invention. A player performs a music playing movement by holding one end (base side) of the stick 10 in the hand and making up swing and down swing movements about the wrist or the like. Various sensors such as an acceleration sensor are provided in the other end (leading end side) of the stick 10 in order to detect such a music playing movement of the player. Then, based on the music playing movement detected by the various sensors, the stick 10 sends a Note-on-Event to the center unit 30.
In addition, a marker 15 (refer to
The camera unit 20 is an optical camera that captures, at a predetermined frame rate, images of the player carrying out music playing movements while holding the sticks 10, and corresponds to an image capturing device of the present invention. The camera unit 20 specifies the position coordinates, within image capturing space, of the marker 15 while it is emitting light, and transmits the position coordinates to the center unit 30.
Upon receiving a Note-on-Event from the stick 10, the center unit 30 generates a predetermined musical note in response to the position coordinate data of the marker 15 during reception. More specifically, the center unit 30 stores position coordinate data of a virtual drum set D shown in
In the musical instrument 1 of the present embodiment, the sticks 10 perform light-emission control and switch-off control of the marker 15 so as to decrease electricity consumption. More specifically, although it is necessary to generate musical notes when the sticks 10 strike a virtual instrument in the musical instrument 1, in a percussion instrument the striking by the stick 10 is generally carried out when the stick 10 is swung down, and is not carried out when the stick 10 is swung up.
Therefore, the sticks 10 of the present embodiment realize a reduction in electricity consumption by performing light-emission control of the marker 15 on the condition of detecting the start of a down swing movement, and subsequently performing switch-off control of the marker 15 on the condition of detecting the end of the down swing movement and the start of an up swing movement. It should be noted that light-emission control refers to control causing the marker 15 to emit light and control to maintain a light-emitting state. However, the light-emitting state is not limited to a state of always emitting light, and includes a state of temporarily switching off, as in blinking. In addition, switch-off control refers to control to switch off light emission of the marker 15 and control to maintain the switched-off state.
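For illustration only, the down-swing-on/up-swing-off policy described above can be sketched as a small state machine; the class name and event labels below are hypothetical and not part of the embodiment.

```python
class MarkerController:
    """Sketch of the on/off policy: light the marker only while a
    down swing is in progress, since that is when the position of
    the marker is needed for sound generation."""

    def __init__(self):
        self.lit = False

    def on_motion_event(self, event):
        # 'event' is a hypothetical label produced by motion analysis
        # of the sensor values (not an API of the embodiment).
        if event == "down_swing_start":
            self.lit = True   # light-emission control
        elif event == "up_swing_start":
            self.lit = False  # switch-off control
        return self.lit


ctrl = MarkerController()
states = [ctrl.on_motion_event(e)
          for e in ("down_swing_start", "up_swing_start", "down_swing_start")]
print(states)  # [True, False, True]
```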
Hereinafter, an embodiment of the present invention will be specifically explained.
First, the configurations of the sticks 10, camera unit 20 and center unit 30 configuring the musical instrument 1 of the present invention will be explained while referencing
As shown in
The CPU 11 executes control of the entire stick 10; for example, in addition to detecting the attitude of the stick 10 and performing shot detection and movement detection based on the sensor values outputted from the motion sensor unit 14, it also performs control such as light emission and switch-off of the marker 15. At this time, the CPU 11 reads marker characteristic information from the ROM 12, and performs light-emission control of the marker 15 in accordance with this marker characteristic information. In addition, the CPU 11 performs communication control with the center unit 30 via the data communication unit 16.
The ROM 12 stores processing programs for various processing executed by the CPU 11. In addition, the ROM 12 stores the marker characteristic information used in the light-emission control of the marker 15. Herein, the camera unit 20 must distinguish between the marker 15 of the stick 10A (first marker) and the marker 15 of the stick 10B (second marker). Marker characteristic information is information for the camera unit 20 to distinguish between the first marker and the second marker. For example, in addition to the shape, size, color, chroma, or brightness during light emission, it is possible to use the blinking speed or the like during light emission.
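As an illustration of how marker characteristic information might let the camera unit tell the two markers apart, the record layout and matching function below are hypothetical; the text only specifies which attributes (shape, size, color, chroma, brightness, blink speed) may be used.

```python
# Hypothetical encoding of marker characteristic information.
FIRST_MARKER = {"stick": "10A", "color": "red", "blink_hz": 0}
SECOND_MARKER = {"stick": "10B", "color": "blue", "blink_hz": 5}

def identify(detected_color, profiles):
    """Return the stick whose marker characteristic matches what the
    camera unit observed, or None if no profile matches."""
    for profile in profiles:
        if profile["color"] == detected_color:
            return profile["stick"]
    return None

print(identify("blue", [FIRST_MARKER, SECOND_MARKER]))  # 10B
```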
The CPU 11 of the stick 10A and the CPU 11 of the stick 10B read respectively different marker characteristic information, and perform light-emission control of the respective markers.
The RAM 13 stores the values acquired or generated in processing such as various sensor values outputted by the motion sensor unit 14.
The motion sensor unit 14 is various sensors for detecting the state of the stick 10, and outputs predetermined sensor values. Herein, an acceleration sensor, angular velocity sensor, magnetic sensor, or the like can be used as the sensors configuring the motion sensor unit 14, for example.
A three-axis sensor that outputs the acceleration occurring in each of the three axis directions of the X axis, Y axis and Z axis can be employed as the acceleration sensor. It should be noted that, as shown in
In addition, a sensor equipped with a gyroscope can be employed as the angular velocity sensor, for example. Herein, as shown in
Herein, the rotation angle 301 in the Y axis direction is the rotation angle about a front-back axis viewed from the player when the player holds the stick 10; therefore, it can be referred to as the roll angle. The roll angle corresponds to the angle 302 showing how much the X-Y plane has been tilted relative to the X axis, and is produced by the player holding the stick 10 in a hand and causing it to rotate left and right about the wrist.
In addition, the rotation angle 311 in the X axis direction is the rotation angle about a left-right axis viewed from the player when the player holds the stick 10; therefore, it can be referred to as the pitch angle. The pitch angle corresponds to the angle 312 showing how much the X-Y plane is tilted relative to the Y axis, and is produced by the player holding the stick 10 in a hand and waving the wrist in a vertical direction.
It should be noted that, although an illustration is omitted, the angular velocity sensor may be configured to jointly output the rotational angle in the Z axis direction as well. At this time, the rotation angle in the Z axis direction basically has the same property as the rotation angle 311 in the X axis direction, and is a pitch angle produced by the player holding the stick 10 in a hand, and waving the wrist in the vertical direction.
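To make the roll and pitch angles concrete, the sketch below estimates them from a static three-axis accelerometer reading using the common gravity-vector tilt technique. This is shown only as an illustration; the embodiment obtains these angles from the angular velocity sensor, and the function here is an assumption.

```python
import math

def roll_pitch_from_gravity(ax, ay, az):
    """Estimate roll and pitch (degrees) from a static accelerometer
    reading: roll is rotation about the front-back axis, pitch about
    the left-right axis, matching the angles described above."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

# Stick held level: gravity lies entirely on the Z axis, so both
# tilt angles are (numerically) zero.
roll, pitch = roll_pitch_from_gravity(0.0, 0.0, 1.0)
print(roll, pitch)
```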
In addition, a sensor capable of outputting a magnetic sensor value in the three axis directions of the X axis, Y axis and Z axis shown in
The motion sensor unit 14 (in detail, the CPU 11 receiving sensor values from the motion sensor unit 14) uses such various sensors to detect the state of the stick 10 being held by the player (which can also be called the music playing state of the player). As one example, the CPU 11 detects the striking timing of a virtual instrument by the stick 10 (shot timing) based on the acceleration outputted by the acceleration sensor (or a sensor composite value). In addition, the CPU 11 detects down swing and up swing movements of the stick 10 based on the sensor values outputted from each sensor.
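The "sensor composite value" is not given a formula in this description; one plausible interpretation, shown here purely for illustration, is the magnitude of the three-axis acceleration vector.

```python
import math

def sensor_composite(ax, ay, az):
    """Magnitude of the three-axis acceleration vector -- one
    plausible 'sensor composite value' (an assumption; the
    embodiment does not spell out the formula)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

# The composite value is independent of how the stick is oriented:
# a 3-4-0 reading has the same magnitude as a 0-0-5 reading.
print(sensor_composite(3.0, 4.0, 0.0))  # 5.0
```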
Referring back to
The data communication unit 16 performs predetermined wireless communication with at least the center unit 30.
The predetermined wireless communication may be configured to be performed by any method, and in the present embodiment, wireless communication with the center unit 30 is performed by way of infrared communication. It should be noted that the data communication unit 16 may be configured to perform wireless communication with the camera unit 20, and may be configured to perform wireless communication with the stick 10A and the stick 10B.
The explanation for the configuration of the stick 10 is as given above. Next, the configuration of the camera unit 20 will be explained while referencing
The camera unit 20 is configured to include a CPU 21, ROM 22, RAM 23, a marker detector 24, and data communication unit 25.
The CPU 21 executes control of the overall camera unit 20. For example, based on position coordinate data of the marker 15 detected by the marker detector 24 and marker characteristic information, the CPU 21 performs control to calculate the position coordinate data of each of the markers 15 (first marker and second marker) of the sticks 10A and 10B. In addition, the CPU 21 performs communication control to transmit the calculated position coordinate data and the like to the center unit 30 via the data communication unit 25.
The ROM 22 stores processing programs of various processing executed by the CPU 21. The RAM 23 stores values acquired or generated in the processing such as position coordinate data of the marker 15 detected by the marker detector 24. In addition, the RAM 23 jointly stores the marker characteristic information of each of the sticks 10A and 10B received from the center unit 30.
The marker detector 24 is an optical camera, for example, and captures images of the player carrying out music playing movements while holding the sticks 10 at a predetermined frame rate. In addition, the marker detector 24 outputs image capture data of each frame to the CPU 21. It should be noted that, although the camera unit 20 is configured to specify the position coordinates of the marker 15 of the stick 10 within image capture space, specifying of the position coordinates of the marker 15 may be performed by the marker detector 24, or may be performed by the CPU 21. Similarly, the marker characteristic information of the captured marker 15 also may be specified by the marker detector 24, or may be specified by the CPU 21.
The data communication unit 25 performs predetermined wireless communication (e.g., infrared communication) with at least the center unit 30. It should be noted that the data communication unit 25 may be configured to perform wireless communication with the sticks 10.
The explanation for the configuration of the camera unit 20 is as given above. Next, the configuration of the center unit 30 will be explained while referencing
The center unit 30 is configured to include a CPU 31, ROM 32, RAM 33, a switch operation detection circuit 34, a display circuit 35, a sound generating device 36, and a data communication unit 37.
The CPU 31 executes control of the overall center unit 30. For example, based on the shot detection received from the stick 10 and the position coordinates of the marker 15 received from the camera unit 20, the CPU 31 performs control such as to generate predetermined musical notes. In addition, the CPU 31 performs communication control with the sticks 10 and the camera unit 20 via the data communication unit 37.
The ROM 32 stores processing programs of various processing executed by the CPU 31. In addition, the ROM 32 stores, in association with position coordinates and the like, the waveform data of wind instruments such as the flute, saxophone and trumpet, keyboard instruments such as the piano, stringed instruments such as the guitar, and percussion instruments such as the bass drum, hi-hat, snare, cymbal and tom.
When the CPU 31 reads the waveform data stored in the ROM 32 in association with the position coordinates of the marker 15 upon shot detection (i.e., upon Note-on-Event reception), a musical note in accordance with the music playing movement of the player is generated.
The RAM 33 stores values acquired or generated in processing such as the state of the stick 10 received from the stick 10 (shot detection, etc.) and the position coordinates of the marker 15 received from the camera unit 20.
The switch operation detection circuit 34 is connected with a switch 341, and receives input information through this switch 341. The input information includes a change in the volume of a musical note generated or tone of a musical note generated, a switch in the display of the display device 351, and the like, for example.
In addition, the display circuit 35 is connected with a display device 351, and performs display control of the display device 351.
In accordance with an instruction from the CPU 31, the sound generating device 36 reads waveform data from the ROM 32, generates musical note data and converts the musical note data into an analog signal, and then generates musical notes from a speaker, which is not illustrated.
In addition, the data communication unit 37 performs predetermined wireless communication (e.g., infrared communication) with the sticks 10 and the camera unit 20.
The configurations of the sticks 10, camera unit 20 and center unit 30 configuring the musical instrument 1 have been explained in the foregoing. Next, processing of the musical instrument 1 will be explained while referencing
As shown in
Upon reading the marker characteristic information, the CPU 11 stores this marker characteristic information in the RAM 13, and transmits to the center unit 30 via the data communication unit 16 (Step S2). At this time, the CPU 11 transmits the marker characteristic information to the center unit 30 to be associated with identifying information (stick identifying information) that can distinguish each of the sticks 10A and 10B.
Next, the CPU 11 reads motion sensor information from the motion sensor unit 14, i.e. sensor values outputted by various sensors, and stores the information in the RAM 13 (Step S3). Subsequently, the CPU 11 performs attitude detection processing of the stick 10 based on the motion sensor information thus read (Step S4). In the attitude detection processing, the CPU 11 detects the attitude of the stick 10, e.g., displacements or the like in the tilt, roll angle and pitch angle of the stick 10, based on the motion sensor information.
Next, the CPU 11 performs shot detection processing based on the motion sensor information (Step S5). Herein, in a case of a player carrying out music playing using the sticks 10, generally, movements similar to those used to strike an actual instrument (e.g., drums) are performed. With such (music playing) movements, the player first swings up the stick 10, and then swings it down towards a virtual instrument. Then, just before knocking the stick 10 against the virtual instrument, the player applies a force trying to stop the movement of the stick 10. At this time, the player assumes that a musical note will be generated at the moment the stick 10 is knocked against the virtual instrument; therefore, it is desirable to be able to generate a musical note at the timing assumed by the player. Therefore, in the present embodiment, it is configured so as to generate a musical note at the moment the player knocks the stick against the surface of a virtual instrument, or a short time before then.
Herein, an example of the generation timing of a musical note using the stick 10 will be explained while referencing
Even in a state in which the stick 10 is standing still (portion represented by “a” in
In the standing still state, when the player raises the stick 10 in an up swing movement, the stick moves further in the direction opposite to the gravitational acceleration. As a result, the acceleration applied to the stick 10 increases in the negative direction. Subsequently, when the raising speed is decreased in an effort to bring the stick to a standstill, the upward acceleration decreases, and the acceleration in the negative direction of the stick detected by the motion sensor unit 14 decreases (portion represented by “b” in
When the stick 10 reaches the top of the up swing movement, the player performs a down swing movement with the stick 10. With the down swing movement, the stick 10 comes to move downwards; therefore, the acceleration applied to the stick 10 increases in the positive direction, exceeding the acceleration in the negative direction detected against the gravitational acceleration. Subsequently, since the player decreases the acceleration in the downward direction for the purpose of a shot, the acceleration applied to the stick 10 increases in the negative direction. In this period, after the timing at which the down swing movement reaches the highest speed, a state is re-entered in which only the gravitational acceleration acts on the stick 10 (portion represented by “c” in
Thereafter, when the player further applies the acceleration in the up swing direction to the stick 10 with the purpose of a shot, the applied acceleration increases in the negative direction. Then, when the shot ends, the stick 10 comes to stand still again, and returns to a state in which the acceleration in the negative direction, countering the direction of the gravitational acceleration, is detected (portion represented by “d” in
In the present embodiment, after the down swing movement has been performed, the moment at which the acceleration in the up swing direction is applied is detected as the moment when the player knocks the stick 10 against a surface of a virtual instrument. In other words, in the portion represented by “d” in
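The acceleration profile described above (standstill, up swing, down swing, braking for the shot) can be scanned for the shot moment as sketched below. The thresholds, sign convention (positive for the down swing direction), and sample trace are all invented for illustration.

```python
def detect_shot(samples, down_th=0.5, up_th=-0.5):
    """Illustrative shot detector over a vertical-acceleration trace.

    A shot is flagged at the first sample where the acceleration
    swings back in the up-swing (negative) direction *after* a down
    swing was seen -- matching the moment labeled 'd' above."""
    swinging_down = False
    for i, a in enumerate(samples):
        if a > down_th:
            swinging_down = True          # down swing in progress
        elif swinging_down and a < up_th:
            return i                      # shot moment detected
    return None

trace = [0.0, -0.8, -0.2, 1.2, 0.9, -0.9, 0.0]
print(detect_shot(trace))  # 5
```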
Treating this shot detection timing as the sound generation timing, when the CPU 11 of the stick 10 determines that the aforementioned sound generation timing has arrived, it generates a Note-on-Event and transmits it to the center unit 30. Sound generation processing is thereby executed in the center unit 30, and a musical note is generated.
Returning back to
Next, the CPU 11 performs processing to detect information (hereinafter referred to as action information) indicating a predetermined movement (action) of the player based on the motion sensor information, i.e. action detection processing (Step S6). Next, the CPU 11 transmits information detected in the processing of Steps S4 to S6, i.e. attitude information, shot information and action information, to the center unit 30 via the data communication unit 16 (Step S7). At this time, the CPU 11 transmits the attitude information, shot information and action information to the center unit 30 to be associated with stick identifying information.
Next, the CPU 11 performs marker switch on/off processing (Step S8), advances to the processing of Step S3, and repeatedly executes this and following processing. Herein, although marker switch on/off processing will be explained in detail with
Next, marker switch on/off processing will be explained while referencing
First, the CPU 11 of the stick 10 determines whether or not a down swing has been detected based on the motion sensor information and attitude information, shot information, action information, etc. (Step S11). At this time, in a case of having detected a down swing, the CPU 11 performs switch-on processing of the marker 15 (Step S12), and marker switch on/off processing ends.
On the other hand, in a case of not having detected a down swing, the CPU 11 determines whether or not an up swing has been detected based on the motion sensor information, attitude information, shot information, action information, etc. (Step S13). At this time, in a case of an up swing having been detected, the CPU 11 performs switch-off processing of the marker 15 (Step S14), and the marker switch on/off processing ends. On the other hand, in a case of not having detected an up swing, the CPU 11 ends the marker switch on/off processing.
Herein, the detection of a down swing and up swing of Step S11 and Step S13 can be performed by any method, e.g., the acceleration of the stick 10 in a vertical direction of the stick 10 can be used. The detection of a down swing and an up swing by the CPU 11 will be explained hereinafter, taking a case of the change in acceleration of the motion sensor unit 14 in the vertical direction expressing a change such as that shown in
Start of an up swing movement defines the timing of shot detection. More specifically, in the portion represented by “d” in
In addition, the start of a down swing movement is defined in the portion represented by “c” in
It should be noted that, in the present embodiment, it is configured so as to detect down swing and up swing movements based on the acceleration in the vertical direction detected by the motion sensor unit 14 (acceleration sensor). However, as another example, it may be configured so as to use the attitude information of the stick 10 in the detection of down swing and up swing movements. Herein, displacement in the pitch angle can be used as attitude information. For example, the CPU 11 detects down swing start in a case of the pitch angle having displaced downwards. In addition, the CPU 11 detects up swing start in a case of the pitch angle having displaced upwards or in a case of displacement of the pitch angle downwards having ended.
Furthermore, it may be configured so that the detection of down swing and up swing movements is performed by the camera unit 20. More specifically, it may be configured so that the camera unit 20 distinguishes the activity of the hand of the player from a captured image, and detects down swing and up swing movements. In this case, it may be configured so that the stick 10 receives this detection information from the camera unit 20.
Detection of down swing and up swing movements in Step S11 and Step S13 can be performed according to various methods.
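The attitude-based alternative mentioned above (using pitch-angle displacement) can be sketched as follows; the dead-band value and function are illustrative assumptions, not part of the embodiment.

```python
def classify_swing(prev_pitch, pitch, eps=1.0):
    """Classify motion from pitch-angle displacement (degrees):
    a downward displacement indicates down swing start, an upward
    one indicates up swing start. 'eps' is an illustrative dead
    band to ignore small jitter."""
    if pitch < prev_pitch - eps:
        return "down_swing"
    if pitch > prev_pitch + eps:
        return "up_swing"
    return "none"

print(classify_swing(30.0, 10.0))  # down_swing
print(classify_swing(10.0, 30.0))  # up_swing
```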
It should be noted that, since the sticks 10 and camera unit 20 are asynchronous, the camera unit 20 may not perform suitable image capturing according to the timing at which the marker 15 switches off. Therefore, with the sticks 10, it may be configured so that the switch-off timing of the marker 15 is delayed by one captured frame of the camera unit 20. It is thereby possible with the camera unit 20 to specify the position coordinates of the marker 15 during shot timing, irrespective of the timing shift, which occurs asynchronously between the stick 10 and camera unit 20.
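The one-frame switch-off delay described above amounts to keeping the marker lit for one extra captured frame after each on-to-off transition. The embodiment states the idea but not a mechanism, so the per-frame representation below is an assumption.

```python
def apply_one_frame_off_delay(marker_states):
    """Given per-frame desired marker states (True = lit), keep the
    marker lit one extra frame after each on->off transition, so an
    asynchronous camera still captures the marker at shot timing."""
    out = list(marker_states)
    for i in range(1, len(out)):
        if marker_states[i - 1] and not marker_states[i]:
            out[i] = True  # postpone switch-off by one frame
    return out

print(apply_one_frame_off_delay([True, True, False, False]))
# [True, True, True, False]
```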
As shown in
Next, the CPU 21 performs first marker detection processing (Step S23) and second marker detection processing (Step S24). In the respective processing, the CPU 21 acquires, and stores in the RAM 23, marker detection information such as of the position coordinates, size and angle of the marker 15 (first marker) of the stick 10A and the marker 15 (second marker) of the stick 10B, detected by the marker detector 24. At this time, the marker detector 24 detects marker detection information for the markers 15 while emitting light.
Next, the CPU 21 transmits the marker detection information acquired in Step S23 and Step S24 to the center unit 30 via the data communication unit 25 (Step S25), and then advances to the processing of Step S23.
As shown in
Next, the CPU 31 receives marker detection information of each of the first marker and second marker from the camera unit 20, and stores the information in the RAM 33 (Step S33). In addition, the CPU 31 receives attitude information, shot information and action information associated with the stick identifying information from each of the sticks 10A, 10B, and stores in the RAM 33 (Step S34).
Next, the CPU 31 determines whether or not there is a shot (Step S35). In this processing, the CPU 31 determines the presence of a shot according to whether or not a Note-on-Event is received from the sticks 10. At this time, in a case of having determined that there is a shot, the CPU 31 performs shot processing (Step S36). In shot processing, the CPU 31 reads waveform data corresponding to the position coordinates, size, angle, etc. included in the marker detection information from the ROM 32, and outputs the data to the sound generating device 36 along with volume data included in the Note-on-Event. Then, the sound generating device 36 generates a corresponding musical note based on the accepted waveform data.
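The lookup performed in the shot processing, from marker position coordinates to the waveform of the virtual instrument at that position, can be illustrated as below. The pad layout, names, and coordinates are invented; the actual virtual drum set D and its stored coordinates are not specified here.

```python
# Hypothetical layout of a virtual drum set: each pad is an
# axis-aligned rectangle in image-capture space.
VIRTUAL_PADS = [
    {"name": "snare",  "x": (100, 200), "y": (300, 400)},
    {"name": "cymbal", "x": (250, 350), "y": (100, 200)},
]

def waveform_for(px, py):
    """Return the instrument whose pad contains the marker position,
    mirroring the waveform lookup done in shot processing."""
    for pad in VIRTUAL_PADS:
        (x0, x1), (y0, y1) = pad["x"], pad["y"]
        if x0 <= px <= x1 and y0 <= py <= y1:
            return pad["name"]
    return None

print(waveform_for(150, 350))  # snare
```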
After Step S36, or in a case of determining NO in Step S35, the CPU 31 determines whether or not there is an action based on the action information received from the sticks 10 (Step S37). At this time, in a case of having determined that there is an action, the CPU 31 performs action processing based on the received action information (Step S38), and advances to the processing of Step S33. On the other hand, in a case of having determined there is no action, the CPU 31 advances to the processing of Step S33.
The configuration and processing of the musical instrument 1 of the present embodiment have been explained in the foregoing. According to such a musical instrument 1, the occurrence of an event (shot) for which position coordinate data of the marker 15 is necessary is estimated, the marker 15 is switched on in advance, and the switching on of the marker 15 is ended when this event ends. Since the marker 15 is switched on only for the period during which its position coordinate data is required, the electricity consumption of the stick 10 can be reduced compared to a case of the marker always being switched on, and it is possible to realize prolonged powering of the stick 10 and a weight reduction.
In addition, a visual rendered effect can be expected by the switch on/off movement of the marker 15 in connection with event (shot) occurrence, whereby it is possible to achieve an improvement in the performance of music using the stick 10.
Although an embodiment of the present invention has been explained in the foregoing, the embodiment is merely an exemplification, and is not to limit the technical scope of the present invention. The present invention can adopt various other embodiments, and further, various modifications such as omissions and substitutions can be made thereto within a scope that does not deviate from the gist of the present invention. These embodiments and modifications thereof are included in the scope and gist of the invention described in the present disclosure, and are included in the invention described in the accompanying claims and the scope of equivalents thereof.
In the above embodiment, a virtual drum set D (refer to
In addition, any processing among the processing configured to be performed by the sticks 10, camera unit 20 and center unit 30 in the above-mentioned embodiment may be configured to be performed by other units (sticks 10, camera unit 20 and center unit 30). For example, it may be configured so that processing such as shot detection, which has been configured to be performed by the CPU 11 of the stick 10, is performed by the center unit 30.
In addition, in the above-mentioned embodiment, light-emission control of the marker 15 possessed by the stick 10 has been explained. However, the invention is not limited to the sticks 10, and it may be configured so as to perform the light-emission control of the present invention on another component having a light-emitting part. In other words, the present invention can be applied to a light-emission controller that detects the start or end of a movement provided to a component having a light-emitting part, switches on the light-emitting part in response to detection of the start of the movement, and switches off the light-emitting part in response to detection of the end of the movement. At this time, detection of the start and end of the movement can be performed by a CPU (detector) based on values detected by various sensors (motion sensor unit). In addition, light-emission control responsive to the start and end of the movement can be performed by a CPU (controller) as well.
In the above-mentioned embodiment, light-emission and switch-off control of the marker in response to detecting the start and the end of a down swing movement has been explained. However, the controller of the light-emitting part may also control the luminance (brightness) of the marker. For example, the controller may (a) control the marker to emit relatively high-intensity light when the start of the down swing movement is detected, and (b) control the marker to emit relatively low-intensity light when the end of the down swing movement is detected. With this manner of control, an effect similar to that of the above-mentioned embodiment can be achieved as well.
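A sketch of this luminance variant; the 8-bit intensity levels and event labels are illustrative assumptions, not values from the embodiment.

```python
def marker_intensity(event, high=255, low=16):
    """Variant described above: instead of fully switching the marker
    on and off, drive it at high intensity during the down swing and
    at low intensity otherwise."""
    if event == "down_swing_start":
        return high
    if event == "down_swing_end":
        return low
    return None

print(marker_intensity("down_swing_start"))  # 255
print(marker_intensity("down_swing_end"))    # 16
```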
| Number | Date | Country | Kind |
|---|---|---|---|
| 2011-181795 | Aug 2011 | JP | national |
| 2012-179920 | Aug 2012 | JP | national |