The present invention relates to a skill determination program and the like.
A user performing training in various sports fields sometimes receives coaching from an expert. In general, the expert coaches the user based on intuition or the like gained from past coaching experience. Recently, the expert sometimes also coaches the user by displaying a video obtained by shooting the user on a display and causing the user to confirm his own form.
In addition, the expert also gives the user advice by causing the user to wear an acceleration sensor, performing sensing, and referring to numerical data obtained from a result of the sensing during training. When the advice is given on the basis of such numerical data, it is possible to perform coaching that is convincing to the user.
Incidentally, dedicated application software configured to evaluate a specific motion of the user by focusing on the specific motion has been sold in some popular sports. Hereinafter, the dedicated application software will be referred to as a dedicated application. The user can confirm his own skill regarding a specific skill using the above-described dedicated application even when there is no expert. For example, when a dedicated application configured to evaluate swing speed of a bat is used, the user can know his own swing speed.
Patent Literature 1: Japanese Laid-open Patent Publication No. 2004-313479
Patent Literature 2: Japanese Laid-open Patent Publication No. 2008-236124
However, the above-described related art has a problem in that it is generally difficult to automatically determine the user's skill.
For example, coaching based on sensing of the user presupposes the presence of the expert, since the expert sets the start and end timings of the sensing and analyzes the numerical data; thus, it is difficult for the user to use the data directly and automatically.
In addition, the dedicated application can be easily used directly by the user, but only enables the evaluation of content limited to some sports and lacks general versatility.
According to an aspect of the embodiment of the invention, a non-transitory computer-readable recording medium stores therein a skill determination program that causes a computer to execute a process including: first determining each second frame corresponding to phase types from second frames having position information of a feature point, which corresponds to a predetermined part or a joint of a body of a second user, based on phase definition information in which a plurality of first frames having position information of a feature point corresponding to a predetermined part or a joint of a body of a first user and the phase types corresponding to the plurality of first frames, respectively, are associated with each other; and second determining a feature amount of a motion, an attitude, or the joint of the second user derived from the feature point included in the second frame determined for each of the phase types based on skill definition information defined by associating a feature amount of a motion, an attitude, or the joint of the first user derived from the feature point included in each of the plurality of first frames, a determination reference of a skill, and a phase type serving as a determination target with each other.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention.
Hereinafter, embodiments of a skill determination program, a skill determination method, a skill determination device, and a server disclosed in the present application will be described in detail with reference to the drawings. Incidentally, the invention is not limited to the embodiments.
A configuration of a skill determination device according to a first embodiment will be described.
The motion sensors 10a and 10b will be collectively referred to as a motion sensor 10 if appropriate. The motion sensor 10 is a sensor that detects a motion of a man or an object. For example, the motion sensor 10 detects three-dimensional coordinates of a feature point of a man and outputs sensing information in which the feature point and the three-dimensional coordinates are associated with each other to the skill determination device 100. Here, the feature point of the man corresponds to, for example, the man's head, neck, back, waist or other joint parts.
The motion sensor 10 may output the sensing information using any related art. For example, the motion sensor 10 corresponds to a reflective-type MA motion sensor or a light-receiving type thermal sensor. Alternatively, the sensing information may be extracted by causing the man to wear a three-axis acceleration sensor or a three-axis gyro sensor.
The sensing unit 110a is a processing unit that acquires the sensing information of the user who is coached by an expert from the motion sensor 10. In the following description, the user coached by the expert will be simply referred to as the user. The sensing unit 110a successively acquires the sensing information from the motion sensor 10 as frame data and outputs the information to the phase determination unit 150.
The sensing unit 110b is a processing unit that acquires the sensing information of the expert from the motion sensor 10. The sensing unit 110b successively acquires the sensing information from the motion sensor 10 as frame data and outputs the information to the model data generation unit 130. A data structure of the frame data of the expert is the same as the data structure of the frame data illustrated in
Hereinafter, the frame data included in motion data of the expert will be referred to as first frame data for convenience of description. The frame data included in motion data of the user will be referred to as second frame data.
The storage unit 120 has model data 120a and skill determination definition data 120b. The storage unit 120 corresponds to a storage device such as a semiconductor memory device, for example, a random access memory (RAM) or a flash memory, or a hard disk drive (HDD).
The model data 120a is information in which the first frame data and types of phases of the first frame data are associated with each other. The model data 120a corresponds to phase definition information. For example, the types of the phases include “start”, “backswing”, “top”, “impact”, “follow”, “end” and the like. The model data 120a is generated by the model data generation unit 130 to be described later.
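Although the first embodiment does not prescribe a concrete data format, the following is a minimal sketch, in Python, of how the model data 120a could be represented as an association between first frame data and phase types. The dictionary layout, joint names, frame numbers, and coordinate values are hypothetical and introduced only for illustration.

```python
# Hypothetical in-memory representation of the model data 120a (phase definition
# information): feature-point coordinates per first frame, plus the phase type
# that the expert assigned to selected frame numbers.
model_data = {
    "frames": {
        # frame number: {joint name: (x, y, z) coordinates}
        0:  {"head": (0.02, 1.62, 0.00), "waist": (0.00, 0.95, 0.05)},
        30: {"head": (0.05, 1.60, 0.02), "waist": (0.01, 0.94, 0.06)},
        # ... remaining first frame data ...
    },
    "phases": {
        # frame number: phase type set by the expert
        0: "start", 30: "backswing", 55: "top", 80: "impact", 100: "follow", 120: "end",
    },
}
```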
In the metadata 30 of
The skill determination definition data 120b is information that defines a feature amount of a motion of the user derived from feature points included in a plurality of second frame data, a determination reference of the skill, and a type of a phase serving as a determination target in an associated manner.
The target phase is information which specifies the phase type serving as a skill determination target. The comparative position is information which defines a type of the second frame data to be compared and a position of the feature point. The reference parameter and the reference value are numerical values which are used in the case of determining whether the skill is good or bad.
A description will be given regarding a record with a module name “PositionChecker” and a skill determination name “head movement” in
The comparative positions are “start: the head, and current: the head”. Thus, a position of a feature point as a comparison source is a position of a feature point of the head of the initial second frame data among the second frame data having the phase type of “start”. A position of a feature point as a comparison destination is a position of a feature point of the head of the current second frame data.
The reference parameters are “8, 10 and 20”. When a difference of a position of the feature point serving as the comparison target is “smaller than 8 cm”, the determination result is defined as “Excellent”. When the difference of the position of the feature point serving as the comparison target is “8 cm or larger and smaller than 10 cm”, the determination result is defined as “Good”. When the difference of the position of the feature point serving as the comparison target is “10 cm or larger and smaller than 20 cm”, the determination result is defined as “Bad”.
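Although the embodiment does not define source code for the “PositionChecker” module, the following is a minimal Python sketch of the head-movement evaluation described above. The joint key “head”, the metric units, and the function name are assumptions; the thresholds of 8, 10 and 20 cm are the reference parameters given above.

```python
import math

def position_checker(start_frame, current_frame, thresholds=(8.0, 10.0, 20.0)):
    """Evaluate the "head movement" skill between the initial 'start' frame and
    the current frame; thresholds are the reference parameters in centimeters."""
    # Feature-point positions are assumed to be (x, y, z) tuples in meters.
    diff_cm = math.dist(start_frame["head"], current_frame["head"]) * 100.0
    excellent, good, bad = thresholds
    if diff_cm < excellent:
        return "Excellent"
    if diff_cm < good:
        return "Good"
    if diff_cm < bad:
        return "Bad"
    return None  # differences of 20 cm or larger are not defined in the record
```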
Next, a description will be given regarding a record with a module name “AngleChecker(1)” and a skill determination name “left elbow angle”. The skill determination of “left elbow angle” determines whether the angle of the user's left elbow during the swing of the driver is good or bad. The target phase is “start-impact”. The phase types of the second frame data serving as the determination target are “start, backswing, top, and impact” as illustrated in
The comparative positions are “current: the left shoulder, current: the left elbow, and current: the left wrist”. Thus, the angle formed by the line segment passing through the feature point of the left shoulder and the feature point of the left elbow of the current second frame data and the straight line passing through the feature point of the left elbow and the feature point of the left wrist becomes the determination target.
The reference parameters are “(135-180), (130-135, 180-190), and (110-130, 190-200)”. When the formed angle is included in “135-180”, the determination result is defined as “Excellent”. When the formed angle is included in “130-135 and 180-190”, the determination result is defined as “Good”. When the formed angle is included in “110-130 and 190-200”, the determination result is defined as “Bad”.
Next, a description will be given regarding a record with a module name “AngleChecker(2)” and a skill determination name “forward inclined attitude”. The skill determination of “forward inclined attitude” determines whether the forward inclined attitude of the user during the swing of the driver is good or bad. The target phase is “start-impact”. The phase types of the second frame data serving as the determination target are “start, backswing, top, and impact” as illustrated in
The comparative positions are “current: the head, and current: the waist”. Thus, an angle formed by a line segment passing through the feature point of the head of the current second frame data and a feature point of the waist and the perpendicular line becomes the determination target.
The reference parameters are “(25-40), (20-25, 40-55) and (8-20, 55-60)”. When the formed angle is included in “25-40”, the determination result is defined as “Excellent”. When the formed angle is included in “20-25 and 40-55”, the determination result is defined as “Good”. When the formed angle is included in “8-20 and 55-60”, the determination result is defined as “Bad”.
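The two angle evaluations described above can be sketched in Python as follows. This is only an illustration, not the claimed implementation: the joint keys, the assumption that the y axis is vertical, and the helper names are hypothetical, while the Excellent/Good/Bad ranges correspond to the reference parameters given above.

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 3-D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def left_elbow_angle(frame):
    # AngleChecker(1): angle at the left elbow between the shoulder-elbow and
    # elbow-wrist segments of the current second frame data.
    s, e, w = frame["left_shoulder"], frame["left_elbow"], frame["left_wrist"]
    return angle_between([a - b for a, b in zip(s, e)],
                         [a - b for a, b in zip(w, e)])

def forward_tilt_angle(frame):
    # AngleChecker(2): angle between the head-waist line and the vertical axis
    # (the y axis is assumed to be vertical).
    h, w = frame["head"], frame["waist"]
    return angle_between([a - b for a, b in zip(h, w)], [0.0, 1.0, 0.0])

def grade(angle, excellent, good, bad):
    """Map an angle onto Excellent/Good/Bad using lists of (low, high) ranges,
    e.g. grade(a, [(135, 180)], [(130, 135), (180, 190)], [(110, 130), (190, 200)])."""
    for result, ranges in (("Excellent", excellent), ("Good", good), ("Bad", bad)):
        if any(lo <= angle <= hi for lo, hi in ranges):
            return result
    return None  # outside all defined ranges
```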
The description will be given returning to
For example, the model data generation unit 130 displays the first frame data included in the motion data of the expert on the display device and receives, from an input device, correspondences between the first frame data and the phase types. Here, the display device and the input device are not illustrated.
The expert operates the input device and inputs a relationship between the first frame data and the phase type to the model data generation unit 130. The model data generation unit 130 generates the metadata 30 illustrated in
The skill determination definition unit 140 is a processing unit that generates the skill determination definition data 120b. The skill determination definition unit 140 saves the generated skill determination definition data 120b in the storage unit 120.
For example, the skill determination definition unit 140 displays a setting screen for skill determination definition on the display device and receives the information relating to the skill determination definition from the input device. The expert operates the input device to input the information relating to the skill determination definition. The skill determination definition unit 140 generates the skill determination definition data 120b based on the information relating to the skill determination definition and saves the generated data in the storage unit 120.
The expert operates the input device to input the correspondence relationship between the frame number and the phase type to the phase setting screen 50a while referring to the playback screen 50c and the frame number display screen 50d. For example, the model data generation unit 130 generates the model data 120a based on the information input to the phase setting screen 50a.
The expert operates the input device to select any checker name on the skill setting screen 50b. Hereinafter, a description will be given regarding the respective parameter setting screens to be displayed when “PositionChecker”, “AngleChecker(1)”, and “AngleChecker(2)” are selected.
In the example illustrated in
For example, the skill determination definition unit 140 generates the record, which corresponds to the module name “PositionChecker” in
In the example illustrated in
For example, the skill determination definition unit 140 generates the record, which corresponds to the module name “AngleChecker(1)” in
In the example illustrated in
For example, the skill determination definition unit 140 generates the record, which corresponds to the module name “AngleChecker(2)” in
The description will be given returning to
Hereinafter, an example of a process of the phase determination unit 150 will be described in detail. The phase determination unit 150 saves the second frame data of the user in a memory and executes a correction process, feature amount calculation, and frame matching in this order.
A description will be given regarding an example of the correction process which is executed by the phase determination unit 150. The phase determination unit 150 performs vertical axis correction of the second frame data. For example, when the motion sensor 10 is an installation-type sensor, there is a case where an installation position or an installation angle is different from that of the previous environment, and the vertical axis of the second frame data is corrected in accordance with the installation environment. For example, the phase determination unit 150 displays the second frame data on the display device to receive input of correction information from the user and cause the vertical axis of the second frame data to correspond to the perpendicular line. In addition, the phase determination unit 150 may perform correction such that a direction of the user corresponds to a direction facing the front. After receiving the input of the correction information, the phase determination unit 150 may perform the vertical axis correction or the direction correction of the remaining second frame data using the corresponding correction information.
The phase determination unit 150 also executes the correction process for suppressing a variation in the position of the feature point of the second frame data. For example, the phase determination unit 150 suppresses the variation by setting an average value of the positions of the feature points of the previous and subsequent second frame data as the position of the feature point of the second frame data therebetween. In addition, the phase determination unit 150 may remove a noise component included in the second frame data using a low-pass filter.
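As a minimal illustration of the variation-suppression correction described above, the following Python sketch replaces each feature-point position with the average of its positions in the previous and subsequent second frame data. The list-of-dictionaries frame representation is an assumption.

```python
def smooth_feature_points(frames):
    """Suppress jitter by replacing each feature point with the average of its
    positions in the previous and subsequent frames (the first and last frames
    are left unchanged).  `frames` is a list of {joint: (x, y, z)} dictionaries."""
    smoothed = [dict(f) for f in frames]
    for i in range(1, len(frames) - 1):
        for joint in frames[i]:
            prev_p, next_p = frames[i - 1][joint], frames[i + 1][joint]
            smoothed[i][joint] = tuple((a + b) / 2.0 for a, b in zip(prev_p, next_p))
    return smoothed
```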
A description will be given regarding an example of the feature amount calculation which is executed by the phase determination unit 150. In the first embodiment, it is assumed to calculate the three-dimensional coordinates of the feature point with respect to the respective joints included in the second frame data as the feature amount, for example.
A description will be given regarding an example of another feature amount calculation which is executed by the phase determination unit 150. When focusing on a joint that characterizes the swing unique to each sport, the phase determination unit 150 may calculate the three-dimensional coordinates, speed, and acceleration of the hand, the waist, the finger, or the like as the feature amount.
When calculating general data that is not unique to each sport as the feature amount, the phase determination unit 150 may calculate the three-dimensional coordinates, speed, and acceleration of all joints as the feature amount. In addition, the phase determination unit 150 may calculate the position of the center of gravity of all joint positions, and the like, as the feature amount.
A description will be given regarding an example of the frame matching which is executed by the phase determination unit 150.
In regard to Formula (1), x_{S0,j}, y_{S0,j} and z_{S0,j} are the three-dimensional coordinates of the feature point of a certain joint (the joint corresponding to the numerical value of j) of the second frame data S0. Reference signs x_{ti,j}, y_{ti,j} and z_{ti,j} are the three-dimensional coordinates of the feature point of the joint corresponding to the numerical value of j of the first frame data ti. Reference sign n is the number of feature points of the joints. The phase determination unit 150 specifies the set of the second frame data S0 and the first frame data for which the joint average distance is the minimum among the respective joint average distances calculated by Formula (1).
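The drawing containing Formula (1) is not reproduced here; based on the description above, the joint average distance between the second frame data S0 and the first frame data ti can be reconstructed as follows (a reconstruction from the text, not the original drawing):

```latex
\mathrm{dist}(S_0, t_i) \;=\; \frac{1}{n} \sum_{j=1}^{n}
  \sqrt{(x_{S_0 j} - x_{t_i j})^2 + (y_{S_0 j} - y_{t_i j})^2 + (z_{S_0 j} - z_{t_i j})^2}
```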
In the example illustrated in
Meanwhile, the phase determination unit 150 corrects a value of the joint average distance using a weight when calculating the joint average distance using Formula (1). For example, the phase determination unit 150 may correct the joint average distance by dividing the joint average distance by the weight.
For example, first frame data matched with the second frame data S0 is highly likely to be first frame data that is close to first frame data matched with the previous second frame data S−1. For example, when the first frame data matched with the second frame data S−1 is set as the first frame data t4, the first frame data matched with the second frame data S0 is highly likely to be the first frame data which is after and close to the first frame data t4.
Thus, among the first frame data after the first frame data matched with the second frame data S−1, a larger weight is set for first frame data closer to that matched first frame data. In addition, a smaller weight is set for first frame data before the matched first frame data, even if it is close to the matched first frame data, because the possibility of a match is low.
It is possible to prevent the first frame data to be subjected to matching from being out of order or jumping by correcting the value of the joint average distance calculated by Formula (1) using the weight.
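The weighted matching described above can be sketched in Python as follows. The concrete weight function is an assumption introduced for illustration; the embodiment only specifies that the joint average distance is divided by a weight that is larger for first frames just after the previously matched first frame and smaller for first frames before it.

```python
import math

def joint_average_distance(frame_a, frame_b):
    """Average Euclidean distance over the n joint feature points (Formula (1))."""
    joints = frame_a.keys()
    return sum(math.dist(frame_a[j], frame_b[j]) for j in joints) / len(joints)

def match_first_frame(second_frame, first_frames, prev_match_index):
    """Return the index of the first frame data matched with the second frame.
    The weight below is a hypothetical choice: first frames shortly after the
    previously matched first frame get a larger weight, first frames before it
    get a smaller one, and the distance is divided by the weight as described."""
    def weight(i):
        if i <= prev_match_index:
            return 0.5                              # before the matched frame
        return 1.0 + 1.0 / (i - prev_match_index)   # closer after it: larger weight

    corrected = [joint_average_distance(second_frame, f) / weight(i)
                 for i, f in enumerate(first_frames)]
    return corrected.index(min(corrected))
```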
The phase determination unit 150 repeatedly executes the above-described process with respect to the other second frame data to determine the first frame data corresponding to each of the second frame data and determine the phase type of each of the second frame data. The phase determination unit 150 extracts the second frame data having the phase types between start and end among the second frame data and saves the extracted data as the motion data in a file and outputs the data to the output unit 170.
The skill determination unit 160 is a processing unit which determines the user's skill for each of the phase types based on the skill determination definition data 120b and the feature amount of the motion, the attitude, and the joint of the user derived from the feature point included in the second frame data which is extracted for each of the phase types.
An example of a process of the skill determination unit 160 will be described. The skill determination unit 160 generates a determination module with reference to the skill determination definition data 120b. For example, the skill determination unit 160 generates the “PositionChecker” module, the “AngleChecker(1)” module, and the “AngleChecker(2)” module in the example illustrated in
The skill determination unit 160 outputs the second frame data received from the phase determination unit 150 to the corresponding module based on the phase type corresponding to this second frame data and the skill determination definition data 120b. When receiving the second frame data, the module outputs a result of determination of the user's skill based on the data defined in the skill determination definition data 120b.
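A minimal Python sketch of this dispatch is given below. The field names of the definition records and the form of the checker callables are assumptions; in practice each module may also hold comparison-source frames such as the initial “start” frame.

```python
def determine_skill(second_frame, phase_type, definitions, modules):
    """Dispatch one second frame to every checker module whose target phases
    include the frame's phase type.

    definitions: records of the skill determination definition data 120b, here
    assumed to carry "module", "skill_name" and "target_phases" fields.
    modules: mapping from module name to a checker callable such as the
    position or angle checkers sketched above."""
    results = {}
    for record in definitions:
        if phase_type in record["target_phases"]:
            results[record["skill_name"]] = modules[record["module"]](second_frame)
    return results
```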
As illustrated in
For example, when the formed angle is included in “135-180”, the “AngleChecker(1)” module 160b outputs the determination result “Excellent”. When the formed angle is included in “130-135 and 180-190”, the “AngleChecker(1)” module 160b outputs the determination result “Good”. When the formed angle is included in “110-130 and 190-200”, the “AngleChecker(1)” module 160b outputs the determination result “Bad”.
Incidentally, the “AngleChecker(1)” module 160b may output a comment in addition to the determination result. For example, the “AngleChecker(1)” module 160b outputs a comment “slightly bent” when the formed angle is included in “130-135” and outputs a comment “slightly extended” when the formed angle is included in “180-190”. The “AngleChecker(1)” module 160b outputs a comment “bent too much” when the formed angle is included in “110-130” and outputs a comment “extended too much” when the formed angle is included in “190-200”.
As illustrated in
For example, when the formed angle is included in “25-40”, the “AngleChecker(2)” module 160c outputs the determination result “Excellent”. When the formed angle is included in “20-25 and 40-55”, the “AngleChecker(2)” module 160c outputs the determination result “Good”. When the formed angle is included in “8-20 and 55-60”, the “AngleChecker(2)” module 160c outputs the determination result “Bad”.
Incidentally, the “AngleChecker(2)” module 160c may output a comment in addition to the determination result. For example, the “AngleChecker(2)” module 160c outputs a comment “slightly upright” when the formed angle is included in “20-25” and outputs a comment “slightly stooping” when the formed angle is included in “40-55”. The “AngleChecker(2)” module 160c outputs a comment “upright too much” when the formed angle is included in “8-20” and outputs a comment “stooping too much” when the formed angle is included in “55-60”.
Continuously,
The “PositionChecker” module 160a performs the user's skill determination by evaluating the second frame data Sm based on the comparative position and the reference parameter and outputs the determination result. For example, the “PositionChecker” module 160a outputs the determination result “Excellent” when a difference between the position of the feature point of the head of the second frame data S0 and the position of the feature point of the head of the second frame data Sm is “smaller than 8 cm”. The “PositionChecker” module 160a outputs the determination result “Good” when the difference between the position of the feature point of the head of the second frame data S0 and the position of the feature point of the head of the second frame data Sm is “8 cm or larger and smaller than 10 cm”. The “PositionChecker” module 160a outputs the determination result “Bad” when the difference between the position of the feature point of the head of the second frame data S0 and the position of the feature point of the head of the second frame data Sm is “10 cm or larger and smaller than 20 cm”.
The “PositionChecker” module 160a may output a comment in addition to the determination result. For example, the “PositionChecker” module 160a outputs a comment “moving too much” when the difference between the position of the feature point of the head of the second frame data S0 and the position of the feature point of the head of the second frame data Sm is “10 cm or larger”.
Incidentally, the target phase of “AngleChecker(1)” is “start-impact” as illustrated in
The skill determination unit 160 outputs information of the determination result in which the determination results of the respective modules and the second frame data are associated with each other to the output unit 170.
The output unit 170 is a processing unit which outputs the determination result of the skill determination unit 160 through image information, voice information, or physical stimulation with respect to the user. Hereinafter, an example of a process of the output unit 170 will be described.
A description will be given regarding a display screen which is generated by the output unit 170 based on the determination result of the skill determination unit 160.
The output unit 170 displays the second frame data of the user, acquired from the skill determination unit 160, on the own data screen 200a. In addition, the output unit 170 may perform the display such that the respective feature points of the second frame data, the feature amount of the motion, the attitude, and the joint, and the comment are associated with each other. In the example illustrated in
The output unit 170 displays the first frame data of the model data 120a on the model data screen 200b. For example, the output unit 170 causes the phase type of the first frame data to correspond to the phase type of the second frame data that is being displayed on the own data screen 200a. In addition, the output unit 170 may perform the display such that the respective feature points of the first frame data and the feature amount of the motion, the attitude, and the joint are associated with each other. In the example illustrated in
The output unit 170 displays the skill determination result, acquired from the skill determination unit 160, on the skill determination screen 200c. For example, the output unit 170 may perform the display such that the skill determination name, the determination result, the feature amount of the motion, the attitude, and the joint, and the comment are associated with each other.
The output unit 170 may update the display screens 200a, 200b and 200c illustrated in
In addition, the output unit 170 may change the phases of the second frame data in the own data screen 200a of the display screen 200 in
The output unit 170 may generate a display screen such that the respective phase types, the second frame data, and the determination result are associated with each other.
Next, a description will be given regarding a process when the output unit 170 outputs the voice information based on the determination result. The output unit 170 may output a skill result or a point that needs to be fixed as voice in accordance with the determination result output from the skill determination unit 160. For example, the output unit 170 outputs voice, such as “the head moves too much”, when the determination result relating to the head movement is Bad. Which voice is to be output for which determination result is set in a table or the like in advance, and the output unit 170 outputs the voice based on the corresponding table.
Next, a description will be given regarding a process when the output unit 170 notifies the user of the determination result through the physical stimulation. The user wears a device including a small motor, and the output unit 170 causes the small motor to operate in accordance with the determination result. For example, the output unit 170 changes the number of rotations of the small motor in accordance with the various determination results “Excellent, Good and Bad”. The number of rotations of the small motor with respect to the determination result is set in a table or the like in advance, and the output unit 170 causes the small motor to rotate based on the corresponding table.
The user wears a cooling device such as a Peltier element, and the output unit 170 cools the cooling device in accordance with the determination result. For example, the output unit 170 changes the temperature of the cooling device in accordance with the various determination results “Excellent, Good and Bad”. The temperature of the cooling device with respect to the determination result is set in a table or the like in advance, and the output unit 170 controls the cooling device based on the corresponding table.
The user wears a device that causes a low-frequency current to flow, and the output unit 170 causes the device to generate the current in accordance with the determination result. For example, the output unit 170 changes the magnitude of the current in accordance with the various determination results “Excellent, Good and Bad”. The magnitude of the current with respect to the determination result is set in a table or the like in advance, and the output unit 170 controls the device based on the corresponding table.
Further, the user may wear a power suit or an artificial muscle, and the output unit 170 may cause the power suit or the artificial muscle to move in accordance with the motion data of the expert so that the user may experience the motion of the expert.
Next, a processing procedure of the skill determination device 100 according to the first embodiment will be described.
The skill determination unit 160 generates the module based on the skill determination definition data 120b and sets a parameter (Step S103). The phase determination unit 150 starts to acquire the motion data of the user (Step S104). The skill determination device 100 determines whether an end event has been detected via the input device (Step S105).
When the skill determination device 100 detects the end event via the input device (Yes in Step S105), the phase determination unit 150 executes a process of ending the acquisition of the user's motion data (Step S106).
On the contrary, when the end event has not been detected via the input device (No in Step S105), the skill determination device 100 determines whether an event of the skill determination process has been detected via the input device (Step S107). When the skill determination device 100 has detected the event of the skill determination process via the input device (Yes in Step S107), the skill determination unit 160 executes the skill determination process (Step S108) and transitions to Step S105. Incidentally, the event of the skill determination process is an event that is generated when the sensing unit 110a acquires the frame data from the motion sensor.
On the other hand, when the skill determination device 100 has detected the event of a playback process via the input device (No in Step S107), the output unit 170 of the skill determination device 100 executes the playback process (Step S109) and transitions to Step S105.
Next, the processing procedure of the skill determination process illustrated in Step S108 will be described.
The phase determination unit 150 calculates the feature amount of the second frame data (Step S203). The phase determination unit 150 extracts the first frame data whose feature amount is closest to that of the second frame data from the model data 120a (Step S204). The phase determination unit 150 extracts the phase type corresponding to the first frame data and assigns the extracted phase type to the second frame data (Step S205). Incidentally, the phase types include not only the phase types that have one-to-one correspondence with the frame numbers such as “start” and “backswing” as illustrated in reference sign 50a in
The phase determination unit 150 determines whether the extracted phase type is between “start” and “end” (Step S206). When the phase type is not between “start” and “end” (No in Step S206), the phase determination unit 150 removes the second frame data stored in Step S201 from the memory (Step S213) and ends the skill determination process.
On the other hand, when the phase type is between “start” and “end” (Yes in Step S206), the skill determination unit 160 outputs the second frame data to the module corresponding to the phase type and determines the skill (Step S207). The output unit 170 outputs the determination result (Step S208).
The phase determination unit 150 determines whether the extracted phase type is “end” (Step S209). When the extracted phase type is not “end” (No in Step S209), the phase determination unit 150 ends the skill determination process.
On the other hand, when the extracted phase type is “end” (Yes in Step S209), the phase determination unit 150 transitions to Step S210. The phase determination unit 150 extracts a series of the second frame data having the phase types between “start” and “end” from the second frame data saved in the memory and saves the extracted data as the motion data in the file (Step S210). The output unit 170 outputs the determination result of the series of the motion data (Step S211), removes the second frame data stored in Step S201 from the memory (Step S212), and ends the skill determination process.
Next, a description will be given regarding a processing procedure of a setting process which is executed by the model data generation unit 130 and the skill determination definition unit 140.
When an event of motion data acquisition selection has been detected (motion data acquisition selection in Step S301), the skill determination device 100 transitions to Step S302. The model data generation unit 130 or the skill determination definition unit 140 acquires the motion data, saves the motion data in the memory (Step S302), and transitions to Step S301.
When an event of phase setting selection has been detected (phase setting selection in Step S301), the skill determination device 100 transitions to Step S303. The model data generation unit 130 saves the phase type and the frame number in the memory (Step S303) and transitions to Step S301.
When an event of skill determination definition selection has been detected (skill determination definition selection in Step S301), the skill determination device 100 transitions to Step S304. The skill determination definition unit 140 saves the module name, the skill determination name, and the parameter definition, which are to be used, in the memory (Step S304) and transitions to Step S301.
When the skill determination device 100 has detected an event of save selection (save selection in Step S301), the model data generation unit 130 outputs the motion data and the correspondence relationship between the phase type and the frame number saved in the memory into the file (Step S305). The skill determination definition unit 140 outputs the skill determination definition data 120b saved in the memory into the file (Step S306) and transitions to Step S301.
Next, an effect of the skill determination device 100 according to the first embodiment will be described. The skill determination device 100 extracts the second frame data corresponding to each of the phase types from the motion data of the user and determines the user's skill with respect to the second frame data for each of the phase types based on the skill determination definition data 120b. Thus, according to the skill determination device 100, it is possible to automatically determine the user's skill in a versatile manner.
For example, the skill determination definition unit 140 appropriately updates the skill determination definition data 120b to perform the skill determination based on the information from the input device, and the skill determination unit 160 performs the skill determination based on the corresponding skill determination definition data 120b. Since the logic for skill determination is embedded in the related art, a skill determination target is fixed, but the skill determination definition data 120b can be appropriately updated by the skill determination definition unit 140, and accordingly, it is possible to enhance general versatility. In addition, the skill determination definition data 120b is defined as a combination of the module and the parameter definition, and thus, it is easy to reuse the module and the parameter definition which are defined in another target.
In addition, the skill determination device 100 displays the skill determination result associated with the motion data of the user and the motion data of the expert on the display screen. Accordingly, the user can know a point that needs to be improved by himself even if there is no expert around. In addition, it is possible to grasp the point that needs to be improved through the playback after performing a swing, without checking the screen every time. In addition, it is possible to compare a difference between the user and the expert and harness the difference for improvement of the skill. In addition, the second frame data of the user is managed in association with the phase type, and thus, can be easily handled in the case of performing analysis and the like.
The skill determination device 100 determines the phase type corresponding to the second frame data by comparing the model data 120a and the respective second frame data included in the motion data of the user and extracts the second frame data for each of the phase types. For example, the skill determination device 100 specifies the first frame data corresponding to the second frame data using a degree of similarity of each of the first frame data with respect to the second frame data, and determines the phase type corresponding to the first frame data as a type of the second frame data. Thus, it is possible to accurately determine the type of the second frame data of the user and to improve the skill determination accuracy.
When performing the matching, the skill determination device 100 corrects the degree of similarity between the first frame data and the second frame data using the weight. This weight is determined using
The skill determination device 100 outputs the skill determination result through the image information, the voice information, or the physical stimulation with respect to the user. Accordingly, it is possible to support improvement of the user's skill using various notification methods.
Meanwhile, the above-described process of the skill determination device 100 is exemplary. Hereinafter, another process which is executed by the skill determination device 100 will be described.
For example, a server on a network may be configured to serve the function of the skill determination device 100. The server acquires and stores the motion data from terminal devices of the user and the expert. In addition, the model data 120a and the skill determination definition data 120b are saved in the server. When being accessed by the user's terminal device, the server displays the motion data and the skill determination result on a web screen of the terminal device.
The skill determination definition unit 140 receives the input from the input device and generates the skill determination definition data 120b, but may automatically generate the skill determination definition data 120b based on the model data 120a. For example, the skill determination definition unit 140 analyzes the model data 120a and sets a result obtained from statistical processing, such as an average value, of parameters of the skill determination as the skill determination definition data 120b. For example, the average value of the amount of movement of the feature point of the head of the first frame data included in the model data 120a is set as α. In this case, the skill determination definition unit 140 sets the reference parameters of the record having the module name “PositionChecker” as “α, α+a and α+2a”. Reference sign a is a numerical value that is appropriately set.
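A minimal Python sketch of this automatic parameter derivation is given below; the use of the head movement average as α and of a fixed step a follows the example above, while the joint key and the statistic taken over the frames are assumptions.

```python
import math

def auto_reference_parameters(first_frames, step):
    """Derive PositionChecker reference parameters from the model data 120a:
    alpha is the average movement (in cm) of the head feature point relative to
    the initial first frame, and the thresholds are alpha, alpha + step,
    alpha + 2 * step (step corresponds to the appropriately set value a)."""
    start = first_frames[0]["head"]
    movements = [math.dist(start, f["head"]) * 100.0 for f in first_frames[1:]]
    alpha = sum(movements) / len(movements)
    return (alpha, alpha + step, alpha + 2 * step)
```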
The skill determination unit 160 determines the skill for each of the second frame data and outputs the determination result, but is not limited thereto. The skill determination unit 160 may convert the respective skill determination results into a total score and display the total score. As illustrated in
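The conversion into a total score is not specified in detail; the following Python sketch assumes a simple point table for the respective determination results, which is a hypothetical choice.

```python
# Hypothetical point values assigned to the respective determination results.
POINTS = {"Excellent": 3, "Good": 1, "Bad": 0}

def total_score(results):
    """results: list of determination result strings, e.g. ["Excellent", "Good"]."""
    return sum(POINTS[r] for r in results)
```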
The skill determination unit 160 may perform the skill determination process with respect to motion data of the model data as well as the motion data of the user. In this case, the output unit 170 may display a difference between a determination result with respect to the motion data of the model data 120a and the determination result with respect to the motion data of the user.
A way of presenting the motion data output by the output unit 170 is not necessarily fixed. For example, the output unit 170 may receive an operation from the input device and display the motion data at different angles such as a back surface, a side surface and an upper side. In addition, the output unit 170 may change a way of presenting the motion data of the model data 120a interlocking with a change of a way of presenting the motion data of the user.
Meanwhile, the case of performing the skill determination of golf has been described in the above-described embodiment, but the skill determination device 100 according to the first embodiment can also be applied to sports other than golf. For example, the skill determination device 100 can also be applied to tennis, athletics, dance, a way of using cooking equipment, playing of a musical instrument, and the like.
Next, a description will be given regarding an example of a computer that executes a skill determination program which implements the same function as the skill determination device 100 illustrated in the above-described embodiment.
As illustrated in
The secondary storage unit 307 includes a skill determination program 307a, a skill setting program 307b, and various types of data 307c. The skill determination program 307a performs processes corresponding to the phase determination unit 150, the skill determination unit 160, and the output unit 170 of
Incidentally, each of the programs 307a and 307b is not necessarily stored in the secondary storage unit 307 from the beginning. For example, the respective programs may be stored in “portable physical media”, such as a flexible disk (FD), a CD-ROM, a DVD disk, a magneto-optical disk, and an IC card, which are inserted into the computer 300. Then, the computer 300 may be configured to read and execute the respective programs 307a and 307b.
The user terminal 400 has the same function as the skill determination device 100 illustrated in the above-described first embodiment. The user terminal 400 is a processing unit which determines the user's skill by acquiring the motion data of the user and notifies the server 500 of a result of the skill determination. Incidentally, the user terminal 400 may be connected to the skill determination device 100 and acquire the determination result of the user's skill from the skill determination device 100 as a connection destination.
The user operates the user terminal 400 to access the server 500 and refers to the past skill determination result stored in the server 500.
The server 500 receives information on the skill determination result from the user terminal 400 and holds the information. In addition, when receiving the access with respect to the information on the skill determination result from the user terminal, the server 500 notifies the user terminal 400 of the skill determination result.
Here, the server 500 displays an advertisement banner directly or indirectly relating to the type of sport that has been subjected to the skill determination on a display screen of the user terminal 400 when notifying the user terminal 400 of the information on the skill determination result. In addition, the server 500 displays information on a product on the display screen in accordance with the determination result of the user's skill. For example, the server 500 displays a golf-related advertisement banner when the type of sport is golf and notifies the user terminal 400 of information on a golf product suited to the skill. Incidentally, the user terminal 400 may be similarly notified of the advertisement banner or the information on the product in the case of baseball, tennis, athletics, dance, a way of using cooking equipment, or playing of a musical instrument other than golf.
The communication unit 510 is a processing unit which executes data communication with the respective user terminals 400 via the network 50. The communication unit 510 corresponds to a communication device. The control unit 530 to be described later exchanges data with the respective user terminals 400 via the communication unit 510.
The storage unit 520 includes a skill determination result table 520a, a personal information table 520b, a banner information table 520c, and a product table 520d. The storage unit 520 corresponds to a storage device such as a semiconductor memory device, for example, a random access memory (RAM), a read only memory (ROM), a flash memory, and the like.
The skill determination result table 520a is a table which holds the information on the skill determination result to be notified by the user terminal 400.
The personal information table 520b is a table which holds the user's personal information.
The banner information table 520c is a table which holds information relating to the advertisement banner to be displayed on the display screen of the user terminal.
The product table 520d is a table which defines a product in accordance with the determination result of the user's skill.
The control unit 530 includes an acquisition unit 530a, a reception unit 530b, a retrieval unit 530c, and a screen generation unit 530d. The control unit 530 corresponds to an integrated device, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or the like. In addition, the control unit 530 corresponds to an electronic circuit, for example, a CPU, a micro processing unit (MPU), or the like.
The acquisition unit 530a is an acquisition unit which acquires information relating to the skill determination result from the user terminal 400. For example, the information relating to the skill determination result is information in which the user identification information, the item, the skill determination result, and the motion data are associated with each other. The acquisition unit 530a associates the user identification information, the item, and the skill determination result with each other and stores the resultant in the skill determination result table 520a.
When storing the information relating to the skill determination result in the skill determination result table 520a, the acquisition unit 530a adds the information on the skill determination result to the record if a record for the same set of the user identification information and the item has already been stored.
The reception unit 530b is a processing unit which receives access with respect to information on the past skill determination result, which has been stored in the skill determination result table 520a, from the user terminal 400. For example, the user terminal 400 is configured to notify the server 500 of the user identification information when performing the access with respect to the information on the past skill determination result. The reception unit 530b outputs the user identification information received from the user terminal 400 to the retrieval unit 530c.
The retrieval unit 530c is a processing unit which retrieves the past skill determination result corresponding to the user identification information, the advertisement banner information relating to the item serving as the skill determination target, and the product in accordance with the determination result of the user's skill. In the following description, the past skill determination result corresponding to the user identification information will be referred to as history information if appropriate. The retrieval unit 530c outputs the retrieved history information, the advertisement banner information, and the information on the product to the screen generation unit 530d. Hereinafter, an example of a process of the retrieval unit 530c will be described.
First, an example of a process of retrieving the history information in the retrieval unit 530c will be described. The retrieval unit 530c compares the user identification information acquired from the reception unit 530b and the skill determination result table 520a and retrieves the item and the skill determination result corresponding to the user identification information. The retrieval unit 530c outputs the information on the retrieved skill determination result to the screen generation unit 530d as the history information. The item retrieved by the retrieval unit 530c is used in the case of retrieving the advertisement banner information which will be described later. In addition, the information on the skill determination result retrieved by the retrieval unit 530c is used in the case of retrieving the product information which will be described later.
Continuously, an example of a process of retrieving the advertisement banner information in the retrieval unit 530c will be described. The retrieval unit 530c compares the user identification information and the personal information table 520b and specifies the gender and the age corresponding to the user identification information. Then, the retrieval unit 530c compares a set of the item, the gender, and the age with the condition of the banner information table 520c and specifies the corresponding record. The retrieval unit 530c outputs the advertisement banner information included in the specified record to the screen generation unit 530d.
For example, the retrieval unit 530c specifies a record on the first row in
Continuously, an example of a process of retrieving the product information in the retrieval unit 530c will be described. The retrieval unit 530c compares a set of the item and the skill determination result with the condition of the product table 520d and specifies the corresponding record. The retrieval unit 530c outputs the product information included in the specified record to the screen generation unit 530d. Examples of the product information include the product name and the comment. When a plurality of skill determination results are present with respect to the same user identification information, the retrieval unit 530c retrieves the product information using the latest skill determination result.
For example, it is assumed that the item is “golf”, and the determination result of waist rotation is “45°±α” and the determination result of the right knee angle is “10°±α” in the phase “impact” included in the skill determination result. In this case, the retrieval unit 530c retrieves a set of the product name and the comment included in a record on the first row of
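The retrieval of the advertisement banner information and of the product information can be sketched in Python as follows. The table schemas (field names, age ranges, and the mapping from phase and parameter to a measured value) are assumptions; the actual tables are defined in the drawings, which are not reproduced here.

```python
def retrieve_banner(banner_table, item, gender, age):
    """banner_table rows (assumed schema): {"item", "gender", "age_range",
    "banner"}.  Returns the advertisement banner information whose condition
    matches the set of item, gender and age."""
    for row in banner_table:
        low, high = row["age_range"]
        if row["item"] == item and row["gender"] == gender and low <= age <= high:
            return row["banner"]
    return None

def retrieve_product(product_table, item, determination_result):
    """product_table rows (assumed schema): {"item", "phase", "parameter",
    "range", "product", "comment"}.  determination_result maps
    (phase, parameter) pairs, e.g. ("impact", "waist rotation"), to measured
    values taken from the latest skill determination result."""
    for row in product_table:
        value = determination_result.get((row["phase"], row["parameter"]))
        if row["item"] == item and value is not None:
            low, high = row["range"]
            if low <= value <= high:
                return row["product"], row["comment"]
    return None
```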
The screen generation unit 530d is a processing unit which generates a display screen to be displayed on the screen of the user terminal 400. For example, the screen generation unit 530d generates the display screen by arranging the information on the skill determination result, the advertisement banner information, and the product information, which are acquired from the retrieval unit 530c, on the screen. The screen generation unit 530d notifies the user terminal 400, who accessed the skill determination result, of information on the generated display screen.
Next, an example of a processing procedure of the server 500 according to the second embodiment will be described.
The retrieval unit 530c retrieves the advertisement banner information with reference to the banner information table 520c (Step S303). The retrieval unit 530c retrieves the product information with reference to the product table 520d (Step S304).
The screen generation unit 530d of the server 500 generates the display screen on which the history information, the advertisement banner information, and the product information are arranged (Step S305). The screen generation unit 530d transmits the information on the display screen to the user terminal 400 (Step S306).
Next, an effect of the server 500 according to the second embodiment will be described. The server 500 displays the advertisement banner directly or indirectly relating to the item that has been subjected to the skill determination on the display screen of the user terminal 400 when notifying the user terminal 400 of the information on the past skill determination result. In addition, the server 500 displays information on a product on the display screen in accordance with the determination result of the user's skill. Accordingly, it is possible to present directly or indirectly related advertisements to the user who refers to the skill determination result. In addition, it is possible to recommend the product information in accordance with the skill determination result.
Meanwhile, the reception unit 530b of the server 500 may notify the user terminal 400 of billing information in accordance with the amount of data when the information on the skill determination result from the user terminal 400 is stored in the skill determination result table 520a. For example, the reception unit 530b acquires the amount of data stored in the skill determination result table for each piece of the user identification information. The reception unit 530b notifies the user terminal 400 corresponding to the user identification information whose amount of data exceeds a threshold that the amount of data to be uploaded exceeds the threshold and that further uploads will be charged. When the user agrees to payment in response to such notification, the reception unit 530b receives a payment method and saves the method in association with the user identification information.
Incidentally, when notifying the server 500 of the information relating to the skill determination result, the user operating the user terminal 400 may select the data to be notified instead of notifying the server 500 of the entire information relating to the skill determination result. For example, when it is determined that the amount of data relating to the motion data is large, the user can save the amount of data by notifying the server 500 only of the skill determination result. In addition, the user can reduce the amount of data by notifying the server 500 only of a snapshot instead of the motion data.
The user terminal 400 has the same function as the skill determination device 100 illustrated in the above-described first embodiment. The user terminal 400 is a processing unit which determines the user's skill by acquiring motion data of the user and notifies the server 600 of a result of the skill determination. Incidentally, the user terminal 400 may be connected to the skill determination device 100 and acquire the determination result of the user's skill from the skill determination device 100 as a connection destination.
The server 600 may classify users into a plurality of groups based on features of the users. When receiving the skill determination result from the user terminal 400, the server 600 determines the group to which the user serving as a target of this skill determination belongs and notifies user terminals of the respective users included in the determined group of information on the skill determination result.
The communication unit 610 is a processing unit which executes data communication with the respective user terminals 400 via the network 50. The communication unit 610 corresponds to a communication device. The control unit 630 to be described later exchanges data with the respective user terminals 400 via the communication unit 610.
The storage unit 620 includes a skill determination result table 620a, a personal information table 620b, and a group management table 620c. The storage unit 620 corresponds to a storage device such as a semiconductor memory device, for example, a RAM, a ROM, a flash memory, and the like.
The skill determination result table 620a is a table which holds the skill determination result to be notified by the user terminal 400. A data structure of the skill determination result table 620a is the same as the data structure of the skill determination result table 520a illustrated in
The personal information table 620b is a table which holds the user's personal information.
The group management table 620c is a table which holds information of the group to which the user belongs.
The control unit 630 includes an acquisition unit 630a, a classification unit 630b, and a social networking service (SNS) providing unit 630c. The SNS providing unit 630c corresponds to a notification unit. The control unit 630 corresponds to an integrated device, for example, an ASIC, an FPGA, or the like. In addition, the control unit 630 corresponds to an electronic circuit, for example, a CPU, an MPU, or the like.
The acquisition unit 630a is an acquisition unit which acquires information relating to the skill determination result from the user terminal 400. For example, the information relating to the skill determination result is information in which user identification information, an item, and the skill determination result are associated with each other. The acquisition unit 630a associates the user identification information, the item, and the skill determination result with each other and stores the resultant in the skill determination result table 620a.
When storing the information relating to the skill determination result in the skill determination result table 620a, the acquisition unit 630a adds the information on the skill determination result to the existing record if a record for the same set of the user identification information and the item has already been stored.
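For illustration only, a minimal Python sketch of this record-update behavior is given below; the in-memory table layout, the field names, and the helper function are assumptions introduced for the example and are not part of the embodiments.

# Illustrative sketch only: one possible in-memory layout for the skill
# determination result table 620a, keyed by (user_id, item). The field
# names and sample values are assumptions for this example.
skill_determination_result_table = {}

def store_result(user_id, item, result):
    """Append a skill determination result, reusing the existing record
    when one already exists for the same (user_id, item) pair."""
    key = (user_id, item)
    record = skill_determination_result_table.setdefault(key, [])
    record.append(result)  # add the new result to the existing record

store_result("U101", "golf swing", {"head": "Good", "left arm": "Excellent"})
store_result("U101", "golf swing", {"head": "Bad", "left arm": "Good"})
print(skill_determination_result_table)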
The classification unit 630b is a processing unit which classifies the user identification information into groups for each of features of users with reference to the personal information table 620b. The classification unit 630b associates the group identification information and the user identification information which belongs to the group of the group identification information with each other based on a classified result and registers the resultant in the group management table 620c.
Hereinafter, an example of a process of the classification unit 630b will be described. For example, the classification unit 630b specifies the user identification information belonging to the same school with reference to the personal information table 620b and classifies the specified user identification information into the same group. Alternatively, the classification unit 630b specifies the user identification information belonging to the same driving range with reference to the personal information table 620b and classifies the specified user identification information into the same group.
The above-described classification process is exemplary, and the classification unit 630b may classify the user identification information into a group of the same age, a group of family members, or a group coached by the same coach. In addition, the classification unit 630b may classify the user identification information with the same skill level into the same group with reference to the skill determination result table 620a. For example, the classification unit 630b may obtain a score for each piece of user identification information by summing scores assigned in accordance with "Good", "Bad", and "Excellent" included in the skill determination results and classify the user identification information having similar scores into the same group.
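A minimal Python sketch of the score-based grouping described above is given below for illustration; the particular score weights assigned to "Excellent", "Good", and "Bad" and the bucket width used to judge that scores are similar are assumptions chosen for this example.

# Illustrative sketch only: grouping users whose summed scores are similar.
# The score weights and the bucket width are assumptions for this example;
# the embodiments do not fix particular values.
SCORE = {"Excellent": 3, "Good": 2, "Bad": 1}

def total_score(results):
    """Sum the scores over all per-part determinations of one user."""
    return sum(SCORE[r] for r in results)

def classify_by_score(user_results, bucket_width=5):
    """Place users whose totals fall in the same bucket into one group."""
    groups = {}
    for user_id, results in user_results.items():
        bucket = total_score(results) // bucket_width
        groups.setdefault(bucket, []).append(user_id)
    return groups

user_results = {
    "U101": ["Excellent", "Good", "Good"],
    "U103": ["Good", "Good", "Good"],
    "U114": ["Bad", "Bad", "Good"],
}
print(classify_by_score(user_results))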
The SNS providing unit 630c is a processing unit which provides the SNS to the respective user terminals 400. For example, the SNS providing unit 630c causes the information on the skill determination result or other information such as an electronic bulletin board to be shared among the users of the user identification information that belongs to the same group with reference to the group management table 620c.
For example, when the information on the skill determination result is registered in the skill determination result table 620a, the SNS providing unit 630c specifies, based on the group management table 620c, the user identification information that belongs to the same group as the user identification information corresponding to the skill determination result. The SNS providing unit 630c notifies the user terminal 400 corresponding to the specified user identification information of the information on the skill determination result registered in the skill determination result table 620a. The SNS providing unit 630c may specify the address of the user identification information which belongs to the group with reference to the personal information table 620b and notify the information on the skill determination result using the address as a destination.
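For illustration, a minimal Python sketch of this group lookup and notification is given below; the table contents, the addresses, and the notify() stub are assumptions introduced for this example.

# Illustrative sketch only: looking up the members of the group that a
# reporting user belongs to and addressing a notification to each member.
# The table contents and the notify() stub are assumptions for the example.
group_management_table = {"G101": ["U101", "U103", "U114"]}
personal_information_table = {"U101": "u101@example.com",
                              "U103": "u103@example.com",
                              "U114": "u114@example.com"}

def notify(address, payload):
    print(f"notify {address}: {payload}")  # stand-in for actual delivery

def share_result(user_id, result):
    """Send a newly registered result to every member of the user's group."""
    for group_id, members in group_management_table.items():
        if user_id in members:
            for member in members:
                notify(personal_information_table[member], result)

share_result("U103", {"item": "golf swing", "result": "Good"})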
For example, the user identification information U101, U103 and U114 belong to the group having the group identification information “G101” with reference to
Next, an example of a processing procedure of the server 600 according to the third embodiment will be described.
The SNS providing unit 630c of the server 600 determines whether the skill determination result has been received (Step S402). When the skill determination result has not been received (No in Step S402), the SNS providing unit 630c transitions to Step S402 again.
When the skill determination result has been received (Yes in Step S402), the SNS providing unit 630c stores the received skill determination result in the skill determination result table 620a (Step S403). The SNS providing unit 630c specifies the group to which the user identification information corresponding to the received skill determination result belongs (Step S404).
The SNS providing unit 630c specifies the user identification information that belongs to the specified group and notifies the user terminal 400 corresponding to the specified user identification information of the skill determination result (Step S405).
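A minimal Python sketch of the loop of Steps S402 to S405 is given below for illustration; the in-memory queue and tables stand in for the communication unit 610 and the storage unit 620 and are assumptions of this example.

# Illustrative sketch only of Steps S402 to S405: the server repeatedly waits
# for a result, stores it, resolves the sender's group, and notifies the
# members. The in-memory queue and tables are assumptions for this example.
from collections import deque

incoming = deque([{"user_id": "U103", "item": "golf swing", "result": "Good"}])
result_table = []
groups = {"G101": ["U101", "U103", "U114"]}

def run_once():
    if not incoming:                       # No in Step S402: nothing received
        return False
    result = incoming.popleft()            # Yes in Step S402
    result_table.append(result)            # Step S403: store in table 620a
    for members in groups.values():        # Step S404: find the user's group
        if result["user_id"] in members:
            for member in members:         # Step S405: notify each member
                print(f"notify {member}: {result}")
    return True

while run_once():
    pass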
Next, an effect of the server 600 according to the third embodiment will be described. The server 600 classifies the user identification information into the plurality of groups based on features of the users. When receiving the skill determination result from the user terminal 400, the server 600 determines the group to which the user serving as the skill determination target belongs and notifies the user terminals of the respective pieces of user identification information included in the determined group of the information on the skill determination result. Thus, it is possible to facilitate communication among the respective users who belong to the same group. For example, advice for improving the skill can be exchanged on the social network. In addition, the above-described process can be used as an online site or an offline point of contact indirectly relating to solicitation for the item, an event, or a meeting.
The user terminal 400 has the same function as the skill determination device 100 illustrated in the above-described first embodiment. The user terminal 400 is a processing unit which acquires motion data of the user, determines the user's skill, and notifies the server 700 of a result of the skill determination. Incidentally, the user terminal 400 may be connected to the skill determination device 100 and acquire the determination result of the user's skill from the skill determination device 100 to which it is connected.
In addition, the user operates the user terminal 400 to acquire model data of his favorite expert from the server 700 and determines the user's skill using the acquired model data of the expert.
The server 700 is a server which manages a plurality of types of the expert model data. When being accessed by the user terminal 400, the server 700 displays the plurality of types of the expert model data and receives selection of any expert model data. When receiving the selection of the expert model data, the server 700 notifies the user terminal 400 of the model data for which the selection has been received. Incidentally, the server 700 may select the expert model data suitable for the user and notify the user terminal 400 of the selected expert model data.
The communication unit 710 is a processing unit which executes data communication with the respective user terminals 400 via the network 50. The communication unit 710 corresponds to a communication device. The control unit 730 to be described later exchanges data with the respective user terminals 400 via the communication unit 710.
The storage unit 720 includes a skill determination result table 720a, a personal information table 720b, and an expert data table 720c. The storage unit 720 corresponds to a storage device such as a semiconductor memory device, for example, a RAM, a ROM, a flash memory, and the like.
The skill determination result table 720a is a table which holds the skill determination result to be notified by the user terminal 400. A data structure of the skill determination result table 720a is the same as the data structure of the skill determination result table 520a illustrated in
The personal information table 720b is a table which holds the user's personal information. A data structure of the personal information table 720b is the same as the data structure of the personal information table 520b illustrated in
The expert data table 720c is a table which holds information relating to the expert model data.
The control unit 730 includes an acquisition unit 730a, a reception unit 730b, a selection unit 730c, and a notification unit 730d. The control unit 730 corresponds to an integrated device, for example, an ASIC, an FPGA, and the like. In addition, the control unit 730 corresponds to an electronic circuit, for example, a CPU, an MPU, and the like.
The acquisition unit 730a is an acquisition unit which acquires information relating to the skill determination result from the user terminal 400. For example, the information relating to the skill determination result is information in which user identification information, an item, and the skill determination result are associated with each other. The acquisition unit 730a associates the user identification information, the item, and the skill determination result with each other and stores the resultant in the skill determination result table 720a.
When storing the information relating to the skill determination result in the skill determination result table 720a, the acquisition unit 730a adds the information on the skill determination result to the existing record if a record for the same set of the user identification information and the item has already been stored.
The reception unit 730b executes the following operation when receiving an access request relating to the expert model data from the user terminal 400. The reception unit 730b causes the user terminal 400 to display a screen in which the expert profile information, the model data, and the evaluation value stored in the expert data table 720c are associated with each other. When receiving the selection of the model data from the user who operates the user terminal 400, the reception unit 730b notifies the user terminal 400 of the model data for which the selection has been received. Incidentally, the notification unit 730d to be described later may notify the user terminal 400 of the model data for which the selection has been received.
In addition, when information on an evaluation value with respect to the expert model data is received from the user terminal 400, the reception unit 730b updates the evaluation value. For example, the reception unit 730b may update the evaluation value of the expert data table 720c by averaging evaluation values from the respective user terminals 400 corresponding to an expert.
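For illustration, a minimal Python sketch of such an averaging update is given below; keeping a vote count alongside a running average is an assumption of this example, since the embodiment only states that the evaluation values from the respective user terminals are averaged.

# Illustrative sketch only: maintaining a running average of the evaluation
# values that user terminals report for one expert. The table layout and the
# vote counter are assumptions for this example.
expert_data_table = {"E001": {"evaluation": 0.0, "votes": 0}}

def update_evaluation(expert_id, new_value):
    entry = expert_data_table[expert_id]
    entry["votes"] += 1
    # incremental mean: avg += (x - avg) / n
    entry["evaluation"] += (new_value - entry["evaluation"]) / entry["votes"]

for value in (4.0, 5.0, 3.0):
    update_evaluation("E001", value)
print(expert_data_table["E001"]["evaluation"])  # 4.0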
Although the description has been given regarding the case where the user operating the user terminal selects the model data in the above-described example, the server 700 may select the model data suitable for the user and notify the user terminal 400 of the selected model data.
The selection unit 730c is a processing unit which selects the model data suitable for the user. For example, the selection unit 730c acquires the user identification information from the user terminal 400 and acquires the gender, the age, the height, and the weight corresponding to the user identification information from the personal information table 720b. In the following description, the gender, the age, the height, and the weight which correspond to the user identification information will be collectively referred to as user profile information if appropriate.
The notification unit 730d is a processing unit which notifies the user terminal 400 serving as a request source of the model data acquired from the selection unit 730c.
Subsequently, an example of a process of the selection unit 730c will be described. The selection unit 730c selects the profile information in the expert data table 720c which is the most similar to the user profile information. The selection unit 730c selects the model data of the record corresponding to the selected profile information and outputs the selected model data to the notification unit 730d.
The selection unit 730c may select the similar profile information in any manner. For example, the selection unit 730c compares the user profile information and the expert profile information, assigns scores depending on whether the genders match and on the age difference, the height difference, and the weight difference, and selects the profile information having the highest total score as the similar profile information. For example, a predetermined score is assigned when the genders match each other, and no score is assigned when the genders do not match each other. In regard to the age difference, the height difference, and the weight difference, a larger score is assigned to a smaller difference.
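A minimal Python sketch of this profile scoring is given below for illustration; the particular weights and the inverse-difference scoring rule are assumptions of this example, since the embodiment only states that a matching gender receives a predetermined score and that a smaller difference receives a larger score.

# Illustrative sketch only: scoring how well each expert profile matches the
# user profile and selecting the most similar one. Weights and the scoring
# rule are assumptions for this example.
def profile_score(user, expert):
    score = 10 if user["gender"] == expert["gender"] else 0
    score += 10 / (1 + abs(user["age"] - expert["age"]))
    score += 10 / (1 + abs(user["height"] - expert["height"]))
    score += 10 / (1 + abs(user["weight"] - expert["weight"]))
    return score

def most_similar(user, experts):
    """Return the expert profile with the highest total score."""
    return max(experts, key=lambda e: profile_score(user, e))

user = {"gender": "M", "age": 30, "height": 175, "weight": 70}
experts = [
    {"name": "A", "gender": "M", "age": 45, "height": 180, "weight": 80},
    {"name": "B", "gender": "M", "age": 31, "height": 176, "weight": 72},
]
print(most_similar(user, experts)["name"])  # "B"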
Further, the selection unit 730c may acquire motion data of the user from the user terminal 400 and acquire the model data, which is the most similar to the acquired motion data, from the model data of the expert data table 720c. For example, the selection unit 730c selects the model data which is the most similar to the motion data of the user through skill determination by performing the same process as that of the skill determination device 100, which has been described in the first embodiment, based on the motion data of the user and the respective model data of the expert data table 720c. For example, the selection unit 730c may perform the skill determination and select the model data with the largest number of “Excellent” as the most similar model data.
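For illustration, a minimal Python sketch of selecting the model data with the largest number of "Excellent" determinations is given below; run_skill_determination() is a hypothetical stand-in for the skill determination process of the first embodiment and is stubbed here with fixed data.

# Illustrative sketch only: choosing the expert model data under which the
# user's motion receives the most "Excellent" determinations. The stub simply
# returns canned results; the real process compares per-phase feature amounts
# against the skill definition information derived from the model data.
def run_skill_determination(motion_data, model_data):
    return model_data["sample_results"]  # stub for this example

def select_model(motion_data, models):
    def excellent_count(model):
        results = run_skill_determination(motion_data, model)
        return sum(1 for r in results if r == "Excellent")
    return max(models, key=excellent_count)

models = [
    {"expert": "A", "sample_results": ["Good", "Excellent", "Bad"]},
    {"expert": "B", "sample_results": ["Excellent", "Excellent", "Good"]},
]
print(select_model(None, models)["expert"])  # "B" (motion data unused by stub)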
In addition, the selection unit 730c may associate the model data of which the user terminal 400 is notified and the skill determination result based on the model data with each other and store the resultant in the skill determination result table 720a. The selection unit 730c determines whether the user's skill has been improved based on the skill determination result stored in the skill determination result table 720a by repeatedly executing the above-described process. For example, the selection unit 730c determines that the user and the expert have good chemistry when the number of “Excellent” increases by comparing the past skill determination result and the current skill determination result that correspond to the same user identification information. In this case, the selection unit 730c continuously notifies the user terminal 400 of the same expert model data. In addition, the selection unit 730c may perform correction to increase the evaluation value of the expert when the number of “Excellent” increases.
On the other hand, the selection unit 730c determines that the user and the expert have bad chemistry when the number of "Excellent" decreases or does not change by comparing the past skill determination result and the current skill determination result that correspond to the same user identification information. In this case, the selection unit 730c notifies the user terminal 400 of the model data of another expert. In addition, the selection unit 730c may perform correction to decrease the evaluation value of the expert when the number of "Excellent" decreases or does not change.
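A minimal Python sketch of this comparison of past and current "Excellent" counts is given below for illustration; the fixed adjustment step applied to the evaluation value is an assumption of this example.

# Illustrative sketch only: comparing the past and current numbers of
# "Excellent" for the same user to decide whether to keep or switch the
# expert model data and whether to raise or lower the expert's evaluation.
def review_model(past_results, current_results, evaluation, delta=0.1):
    past = sum(1 for r in past_results if r == "Excellent")
    current = sum(1 for r in current_results if r == "Excellent")
    if current > past:
        # good chemistry: keep the same model and raise its evaluation
        return "keep", evaluation + delta
    # otherwise: switch to another expert model and lower the evaluation
    return "switch", evaluation - delta

print(review_model(["Good", "Good"], ["Excellent", "Good"], 4.0))  # keep
print(review_model(["Excellent"], ["Good"], 4.0))                  # switch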
Next, an example of a processing procedure of the server 700 according to the fourth embodiment will be described.
Here, when the user directly selects the expert model data (Yes in Step S502), the server 700 transitions to Step S503. On the contrary, when the user does not directly select the expert model data (No in Step S502), the server 700 transitions to Step S506.
The process from Step S503 will be described. The reception unit 730b of the server 700 displays the expert profile information, the model data, and the evaluation value in an associated manner on the user terminal 400 (Step S503).
The reception unit 730b determines whether the selection of the model data has been received (Step S504). When the selection of the model data has not been received (No in Step S504), the reception unit 730b transitions to Step S504 again.
On the other hand, when the selection of the model data has been received (Yes in Step S504), the reception unit 730b notifies the user terminal 400 of the model data for which the selection has been received (Step S505).
The process from Step S506 will be described. The selection unit 730c of the server 700 acquires the user identification information from the user terminal 400 (Step S506). The selection unit 730c selects the model data suitable for the user (Step S507). The server 700 notifies the user terminal 400 of the model data selected by the selection unit 730c (Step S508).
Next, an effect of the server 700 according to the fourth embodiment will be described. The server 700 displays the expert model data on the user terminal 400 and notifies the user terminal 400 of the model data for which the selection has been received when receiving the selection of the expert model data. Alternatively, the server 700 selects the expert model data suitable for the user and notifies the user terminal 400 of the selected expert model data. Thus, the user can perform the skill determination using the expert model data suitable for himself.
According to an embodiment of the invention, generally, it is possible to automatically determine the user's skill.
Next, a description will be given of an example of the hardware configuration of a computer that executes a control program which implements the same functions as the servers 500, 600, and 700 illustrated in the above-described embodiments.
As illustrated in
The hard disk device 807 includes a control program 807a. The CPU 801 reads the control program 807a and loads the program onto the RAM 806. The control program 807a functions as a control process 806a. For example, the control process 806a corresponds to the acquisition unit 530a, the reception unit 530b, the retrieval unit 530c, and the screen generation unit 530d illustrated in
Incidentally, the control program 807a is not necessarily stored in the hard disk device 807 from the beginning. For example, the program may be stored in "portable physical media", such as a flexible disk (FD), a CD-ROM, a DVD, a magneto-optical disk, and an IC card, which are inserted into the computer 800. Then, the computer 800 may be configured to read and execute the control program 807a.
All examples and conditional language provided herein are intended for the pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although one or more embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
This application is a continuation application, under 35 U.S.C. § 111(a), of International Application PCT/JP2015/077851, filed on Sep. 30, 2015, which claims foreign priority benefit to Japanese patent application No. 2014-208981, filed Oct. 10, 2014, and designating the U.S., the entire contents of which are incorporated herein by reference.