This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2015-228982, filed on Nov. 24, 2015, the entire contents of which are incorporated herein by reference.
The embodiments discussed herein are related to a non-transitory computer-readable storage medium, an evaluation method, and an evaluation device.
Technologies are known for scoring a dance of a person and notifying the person of a scoring result. As an example of a technology related to scoring and evaluating a dance of a person, there is a technology related to a game in which a part of a person's body is moved to a song. In the technology, in cases in which a part of a body of a player performing the game has moved at a speed greater than or equal to a reference speed, a game play of the player is evaluated based on a determination as to whether or not a substantially motionless state of the body part is maintained for a reference period.
Related technologies are disclosed in Japanese Laid-open Patent Publication No. 7-50825, Japanese Laid-open Patent Publication No. 2000-237455, and Japanese Laid-open Patent Publication No. 2013-154125.
According to an aspect of the invention, a non-transitory computer-readable storage medium stores an evaluation program that causes a computer to execute a process, the process comprising: obtaining captured images captured by an imaging device; displaying the captured images on a display device while superimposing, on the captured images, a display that indicates a separation between a plurality of set areas set in the captured images; and detecting timings at which each of a plurality of persons beats a rhythm by analyzing the captured images, each of the plurality of persons being included in a corresponding one of the plurality of set areas.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
Note that a case in which a dance performed by plural persons is evaluated is conceivable. However, a proper evaluation may be obstructed when an evaluation of a dance performed by plural persons is attempted based on movements of body parts of players. For example, if plural persons dance at the same time in an imaging area, the evaluation is liable to be affected by a motion of a person with a large build and a motion of a person close to the imaging device. Furthermore, positions of respective persons may overlap with each other within the imaging area, thereby preventing the dance of each person from being properly evaluated. Thus, in order to properly evaluate a dance performed by each person, it is desirable that a separate area in which each person performs a dance is provided within the imaging area.
In one aspect, an object is to provide an evaluation program, an evaluation method, and an evaluation device capable of letting each person recognize a target area in which an evaluation of a dance is performed.
Hereinafter, embodiments of an evaluation program, an evaluation method, and an evaluation device, disclosed in the present application, will be described in detail, based on drawings. Note that the disclosed technology is not limited by the present embodiments. In addition, individual embodiments illustrated as follows may be combined as appropriate to the extent no inconsistency arises.
An evaluation device 10 illustrated in an example of
The evaluation device 10 illustrated in the example of
The reason for extracting a timing at which a motion amount of a person temporarily decreases as the timing at which the person beats time is that a person temporarily stops a motion when beating time, thereby causing the motion amount to temporarily decrease. Here, the term “rhythm” means, for example, regularity of tempo. The term “tempo” means, for example, an interval between beats. Next, the evaluation device 10 compares the tempo indicated by the extracted timings with the reference tempo, which is the tempo serving as a reference, so as to evaluate the tempo of a person's motion. The evaluation device 10 thereby extracts a timing at which a person beats time and evaluates the tempo of a motion of the person, without performing recognition processing for recognizing a human face, parts of a human body, or instruments, namely, recognition processing with a high processing volume (a high processing load). Accordingly, the evaluation device 10 enables the tempo of a person's motion to be evaluated in a simple manner.
In the present embodiment, the evaluation device 10 extracts a timing at which a person beats time in each of the plural evaluation target areas divided by an area control unit 14a, described later. In addition, the evaluation device 10 causes an indication of the partitioning into the divided evaluation target areas to be displayed superimposed on the displayed captured image. In this manner, according to the present embodiment, a motion of each person in each of the plural areas set in the imaging area is analyzed, and an indication of the separation into the divided areas is displayed superimposed on the display device. This accordingly enables each person to recognize the target area for the evaluation of a dance.
The input unit 11 inputs various kinds of information to the control unit 14. For example, when the input unit 11 receives an instruction to perform evaluation processing, described later, from a user who uses the evaluation device 10, the input unit 11 inputs the received instruction to the control unit 14. Examples of devices for the input unit 11 may include a mouse, a keyboard, and a network card used for receiving various kinds of information transmitted by other devices, not illustrated, and entering the received information to the control unit 14.
The output unit 12 outputs various kinds of information. For example, when the output unit 12 receives an evaluation result of a tempo of a motion of a person from an output control unit 14d, described later, the output unit 12 displays the received evaluation result or transmits the received evaluation result to a mobile terminal held by the user or to an external monitor. Examples of devices for the output unit 12 may include a monitor, a network card used for transmitting various kinds of information from the control unit 14 to other non-illustrated devices, and the like.
The storage unit 13 stores various kinds of information. The storage unit 13 stores, for example, moving image data 13a, timing data 13b, music tempo data 13c, and evaluation data 13d.
The moving image data 13a is moving image data containing plural frames obtained as a result of image-capturing plural dancing persons using the camera 21. Examples of such plural persons may include persons who sing to a song played by a karaoke device in a karaoke box and at the same time dance to the played song. Note that the plural frames contained in the moving image data 13a are obtained by continuous image-capturing with the camera 21, and each of the frames is an example of a captured image.
In addition, the frame 15 is divided by a dividing line 601 into a left side evaluation target area 701 (hereinafter, also referred to as an “area A”) and a right side evaluation target area 702 (hereinafter, also referred to as an “area B”). The dividing line 601 is an example of a display that indicates a separation between the set areas. The player A 401 is positioned in the left side evaluation target area 701, and the player B 402 is positioned in the right side evaluation target area 702, out of the divided imaging areas. Note that while any given value may be employed as the frame rate of the moving image data 13a, a frame rate of 30 fps (frames per second) is used in the following explanation.
The timing data 13b is data that indicates a time (timing) at which a player who dances beats time. For example, when a player in the moving image data 13a sings and dances to a played song in a karaoke box, the dance is started as the song begins. Thus, an elapsed time from the start of the song and the dance is an example of such a time.
For example, in the first record of the timing data 13b illustrated in the example in
The music tempo data 13c is data indicating the reference tempo. The reference tempo is acquired from sound information by the evaluation unit 14c, described later. Here, examples of the sound information may include sound collected by a non-illustrated microphone, a song played by a karaoke device, and audio data acquired in synchronization with the moving image data 13a in video data recorded using a non-illustrated video camera. In addition, musical instrument digital interface (MIDI) data may be used as an example of the sound information.
The evaluation data 13d is an evaluation result of a tempo of a motion of each player evaluated by the evaluation unit 14c, described later. The evaluation result will be described later.
The storage unit 13 is, for example, a semiconductor memory element such as a flash memory or a storage device such as a hard disk or an optical disk.
The control unit 14 includes an internal memory for storing a program that specifies various kinds of processing procedures, and control data, based on which the control unit 14 performs various kinds of processing. As illustrated in
The area control unit 14a is a processing unit that divides a captured image into plural evaluation target areas according to the number of players included in the captured image and generates dividing lines for the plural evaluation target areas.
Description follows regarding an embodiment of the area control unit 14a. First, the area control unit 14a determines the number of players included in the captured image. The area control unit 14a, for example, is able to identify the number of players included in the captured image, using the number of persons entered or selected by the user. Note that, in the present embodiment, the number of persons to dance may be selected from one up to a maximum of four; however, the embodiment is not limited thereto.
A configuration that uses the number of persons entered or selected by the user is described with reference to
As illustrated in
The area control unit 14a, upon receiving the instruction regarding the selection of the number of persons from the input unit 11, divides a frame into plural evaluation target areas according to the entered number of persons. The area control unit 14a, for example, divides the frame into areas of equal width according to the number of persons entered.
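As a minimal sketch of this equal-width division (the function name `divide_into_areas` and the rounding of pixel boundaries are illustrative assumptions, not part of the embodiment), the area boundaries might be computed as follows:

```python
def divide_into_areas(frame_width, num_players):
    """Divide a frame of the given pixel width into equal-width
    evaluation target areas, one per player, returning a list of
    (left, right) pixel bounds for each area."""
    if not 1 <= num_players <= 4:
        # The present embodiment allows one to four players.
        raise ValueError("number of players must be between 1 and 4")
    step = frame_width / num_players
    return [(round(i * step), round((i + 1) * step))
            for i in range(num_players)]
```

For example, a 640-pixel-wide frame divided for two players yields the area A bounds (0, 320) and the area B bounds (320, 640).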
The area control unit 14a further generates a display to indicate the respective divided evaluation target areas and outputs the display to the output control unit 14d, so that the display is superimposed on the captured image. As illustrated in
A configuration in which the area control unit 14a determines the number of players included in the captured image, based on an input or selection by the user, has been described, however a configuration is not limited thereto. For example, a configuration may be such that the area control unit 14a determines the number of players using a facial recognition technology or an object recognition technology.
In cases in which the number of players is identified using object recognition technology instead of facial recognition technology, players to be included in the captured image each hold or wear an object to serve as a target of recognition, such as, for example, a wrist band or a musical instrument.
Note that a configuration may be such that the players are instructed by the area control unit 14a to enter the respective evaluation target areas.
In the present embodiment, the area control unit 14a changes the display of the evaluation target areas and the dividing lines according to the identified number of players.
Note that the area control unit 14a may be configured such that, when “OK” is selected by the players, a message prompting the players to adjust their standing positions is displayed in cases in which more than one player is standing in a single evaluation target area or a player is standing on a dividing line. Alternatively, the area control unit 14a may be configured to start the next processing at the point in time when it is confirmed that each player stands in the respective evaluation target area, without displaying the message and selectable options.
Note that, in a configuration using facial recognition technology or object recognition technology, the area control unit 14a may be configured such that the evaluation target areas 701 to 703 are not preset. For example, the area control unit 14a may set areas within a certain range from the facial recognition area 421 illustrated in
Returning to the description of
Description of an embodiment of the acquisition unit 14b follows. For example, the acquisition unit 14b acquires the moving image data 13a stored in the storage unit 13, when an instruction to perform the evaluation processing, described later, is input from the input unit 11.
Next, for each of the divided area A 701 and area B 702, the acquisition unit 14b acquires a difference between a frame and a frame image-captured before the current frame, using a background difference method, for each of the frames contained in the moving image indicated by the moving image data 13a. For example, the acquisition unit 14b acquires, for each of the plural frames, a difference between the frame and a frame obtained by accumulating frames image-captured before the current frame, using a known function related to accumulation of background statistics.
A description follows regarding processing in the acquisition unit 14b that uses the function related to accumulation of background statistics. For each of the corresponding evaluation target areas, the acquisition unit 14b compares a frame with background information obtained from frames that have been image-captured before the current frame, and generates a binarized image based on a change in luminance. Note that the information generated here is, for example, information in which a pixel with a change in luminance less than or equal to a threshold value is replaced by a black pixel and a pixel with a change in luminance greater than the threshold value is replaced by a white pixel; however, the information is not limited thereto. The acquisition unit 14b may generate an image other than a binarized image with black and white pixels, as long as, in the information provided, it is possible to identify whether a change in luminance is less than or equal to a threshold value, or greater than the threshold value.
In this manner, in the present embodiment, the background difference amount in each of the divided evaluation target areas is used as an index indicating amount of movement by each player. For example, the acquisition unit 14b calculates, as the motion amount of the player A 401, the total number of white pixels included in a binarized image in the area A 701 on the left side of the imaging areas illustrated in the example of
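A minimal sketch of this binarization and white-pixel count follows (the function name `motion_amounts`, the list-of-rows image representation, and the default threshold of 30 are assumptions for illustration; the embodiment's accumulation of background statistics is abstracted into a single `background` image):

```python
def motion_amounts(frame, background, areas, threshold=30):
    """Binarize the per-pixel luminance change between a frame and the
    accumulated background (white where the change exceeds the threshold),
    then count the white pixels falling within each evaluation target
    area as that area's motion amount (background difference amount)."""
    amounts = []
    for left, right in areas:
        white = 0
        for row_f, row_b in zip(frame, background):
            for x in range(left, right):
                if abs(row_f[x] - row_b[x]) > threshold:
                    white += 1  # pixel whose luminance change exceeds the threshold
        amounts.append(white)
    return amounts
```

The returned list holds one background difference amount per evaluation target area, so each player's motion amount is obtained independently of the other areas.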
In this way, for each of the evaluation target areas, the acquisition unit 14b acquires the background difference amount for each of the frames as the motion amount of each player. Then, for each frame, the acquisition unit 14b associates the background difference amount with a frame number.
In this way, for each of the plural frames, the acquisition unit 14b acquires a difference between the frame and a frame obtained by accumulating frames image-captured before the current frame, for each of the evaluation target areas.
Note that the acquisition unit 14b may also acquire a difference between a frame and a frame image-captured before the current frame, or a difference between a frame and a frame obtained by accumulating frames image-captured before the current frame, by using a codebook method.
Returning to the description of
A description follows regarding an embodiment of the evaluation unit 14c. For example, the evaluation unit 14c detects a frame having a smaller background difference amount than the background difference amount of an immediately preceding frame, also having a smaller background difference amount than the background difference amount of an immediately following frame, based on information in which frame number and background difference amount are associated with each other by the acquisition unit 14b.
When, as illustrated in the graph of the example of
Then, the evaluation unit 14c detects the time at which the detected frames are image-captured, as respective timings at which the amount of a temporal change in frames temporarily decreases. For example, the evaluation unit 14c detects the time at which the frames with frame numbers “4”, “6”, “10”, “18”, and “20” are respectively image-captured as the timings at which the amount of temporal change in frames temporarily decreases. In addition, the evaluation unit 14c also detects, for example, the time at which the frames with frame numbers “25”, “33”, “38”, “40”, and “47”, are respectively image-captured as the timings at which the amount of temporal change in frames temporarily decreases. In the present embodiment, the evaluation unit 14c also detects timings in the area B 702 on the right side of the diagram in
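The detection described above is a search for local minima in the sequence of background difference amounts. A minimal sketch under that reading (the function name `detect_candidate_frames` is an assumption) might look like:

```python
def detect_candidate_frames(amounts):
    """Detect indices of frames whose background difference amount is
    smaller than both the amount of the immediately preceding frame and
    the amount of the immediately following frame (local minima)."""
    return [i for i in range(1, len(amounts) - 1)
            if amounts[i] < amounts[i - 1] and amounts[i] < amounts[i + 1]]
```

Each returned index corresponds to a timing at which the amount of temporal change in the frames temporarily decreases; the detection is run separately on the amount sequence of each evaluation target area.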
In addition, based on the detected timings, the evaluation unit 14c extracts a motion in which a player included in a frame beats time, or a timing at which the player beats time. In the present embodiment, the evaluation unit 14c individually extracts this timing for each player in the respective evaluation target areas.
The evaluation unit 14c extracts, for example, the following timing from detected timings. Namely, for each of the evaluation target areas, the evaluation unit 14c extracts a frame satisfying a predetermined condition out of frames image-captured at the time of detection, and extracts the time when the frame was image-captured as timing at which a player included in the frame beats time.
Here, a description follows regarding an example of a method used by the evaluation unit 14c to extract a frame satisfying a predetermined condition. The evaluation unit 14c, for example, selects, one by one, frames corresponding to the timings of respective detection (frames image-captured at the timing of detection) as an extraction candidate frame. Then, the evaluation unit 14c performs the following processing each time the evaluation unit 14c selects an extraction candidate. Namely, the evaluation unit 14c determines whether or not the background difference amount decreases starting from a frame a predetermined number of frames before the extraction candidate frame through the extraction candidate frame, and the background difference amount increases starting from the extraction candidate frame through a frame a predetermined number of frames after the extraction candidate frame.
When the evaluation unit 14c determines that the background difference amount decreases starting from the frame the predetermined number of frames before the extraction candidate frame through the extraction candidate frame, and the background difference amount increases starting from the extraction candidate frame through the frame the predetermined number of frames after the extraction candidate frame, the evaluation unit 14c performs the following processing. Namely, the evaluation unit 14c extracts the time at which the extraction candidate frame was image-captured as the timing at which a player included in the frame beats time. In other words, the evaluation unit 14c extracts a motion of beating time performed by a player included in the extraction candidate frame, out of motions of respective players indicated in the plural frames. Then, the evaluation unit 14c performs the above-mentioned processing on all frames corresponding to the respective detected timings.
A description follows regarding a case in which, in the area A 701 on the left side illustrated in
In addition, the evaluation unit 14c extracts a motion of beating time, performed by a player included in the frame with the frame number “25”, out of motions of players indicated in each of the plural frames. Regarding the above-mentioned predetermined numbers, the predetermined number of frames before the extraction candidate frame and the predetermined number of frames after the extraction candidate frame may be set to different values. An embodiment in which the predetermined number of frames before the extraction candidate frame is set to “5” and the predetermined number of frames after the extraction candidate frame is set to “1” may be considered as an example.
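The extraction condition above can be sketched as follows (the function name `extract_beat_frames` is an assumption; the defaults use the example values of “5” frames before and “1” frame after the extraction candidate frame):

```python
def extract_beat_frames(amounts, candidates, before=5, after=1):
    """From the candidate frame indices, keep those where the background
    difference amount decreases monotonically over the `before` preceding
    frames and increases monotonically over the `after` following frames;
    these are extracted as timings at which the player beats time."""
    beats = []
    for c in candidates:
        if c - before < 0 or c + after >= len(amounts):
            continue  # not enough surrounding frames to test the condition
        decreasing = all(amounts[i] > amounts[i + 1]
                         for i in range(c - before, c))
        increasing = all(amounts[i] < amounts[i + 1]
                         for i in range(c, c + after))
        if decreasing and increasing:
            beats.append(c)
    return beats
```

The frame times corresponding to the returned indices (index divided by the 30 fps frame rate) would then be registered as the timings at which time is beaten.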
In addition, out of the timings at which the respective plural frames are image-captured, the evaluation unit 14c associates a time to beat time with “time beaten” and registers the association in the timing data 13b, as illustrated in
In this manner, the timing data 13b, in which various kinds of information are registered, is used to evaluate, for example, a rhythm of a player, indicated by the timings at which the player beats time. For each of all the frames, the evaluation unit 14c registers in the timing data 13b either “time to beat time” and “time beaten” associated with each other, or “time not to beat time” and “time not beaten” associated with each other.
In the example of
In addition, the evaluation unit 14c performs evaluation related to the tempi of the motions of the respective players according to a comparison between a reference tempo and tempi indicated by motions of beating time, performed by players included in respective evaluation target areas in frames, or timings at which the players beat time, the tempi being extracted based on the plural frames. Furthermore, the evaluation unit 14c performs evaluation related to the motions of the respective players, based on a tempo extracted from a reproduced song (music) and on timings at which the respective players keep rhythm and which are acquired from frames including, as image-capturing targets, the respective players singing to the reproduced song.
The evaluation unit 14c acquires, from the timing data 13b, time of a timing at which a player beats time. In addition, the evaluation unit 14c acquires the reference tempo from the sound information. The evaluation unit 14c performs the following processing on sound information including, for example, a voice of a player who sings and dances to the reproduced song collected by a non-illustrated microphone provided in a karaoke box and the reproduced song, and the like. Namely, the evaluation unit 14c acquires the reference tempo using technologies such as beat tracking and rhythm recognition. To perform the beat tracking and the rhythm recognition, for example, technologies described in a non-patent literature are available (Takeda, Haruto, “2-4 Audio Alignment, Beat Tracking, and Rhythm Recognition”, the Institute of Electronics, Information and Communication Engineers, “Knowledge Base”, Volume 2, Section 9, Chapter 2, page 17 to page 20, online, searched on Dec. 17, 2013, the URL http://www.ieice-hbkb.org/portal/doc_557.html). Alternatively, the evaluation unit 14c may acquire the reference tempo from MIDI data corresponding to the reproduced song. In addition, the evaluation unit 14c stores, as the music tempo data 13c, the acquired reference tempo in the storage unit 13.
In addition, the evaluation unit 14c performs a comparison between a timing of a beat in the reference tempo indicated by the music tempo data 13c and a timing at which a player beats time, acquired from the timing data 13b.
The evaluation unit 14c compares timings by using, for example, the timing at which the player beats time, as a reference.
In the example of
The evaluation unit 14c calculates the difference and adds points corresponding to the difference to the score, with respect to all of the timings at which the player beats time. Note that the score is set to 0 points at the start of the evaluation processing. In addition, the first threshold and the second threshold are not limited to the above-mentioned values, and any given values may be adopted as the first threshold and the second threshold.
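A minimal sketch of this per-timing scoring follows. The function name `score_timing` is an assumption; the thresholds of 0.1 and 0.3 seconds are taken from the examples in the second embodiment (“0 seconds” yields “Excellent!”, “0.1 seconds” yields “Good!”, “0.3 seconds” yields “Bad!”), the “Excellent!” value of 2 points follows the explanation of Expression (1), and the 1-point and 0-point values for “Good!” and “Bad!” are assumptions:

```python
def score_timing(difference, first_threshold=0.1, second_threshold=0.3):
    """Score one beat timing by the absolute difference (in seconds)
    between the player's timing and the nearest beat in the reference
    tempo, returning the judgment message and the points to add."""
    if difference < first_threshold:
        return "Excellent!", 2
    if difference < second_threshold:
        return "Good!", 1
    return "Bad!", 0
```

Summing the returned points over all timings at which the player beats time yields the value of the score used in Expression (1).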
In the example of
Note that the evaluation unit 14c may compare timings using the timing of a beat in the reference tempo as a reference. At that time, as a timing indicated by the reference tempo used for evaluation, a timing between timings acquired from the sound information, what is referred to as a backbeat, may be added. This thereby enables a rhythm of a player who beats time at the timing of a backbeat to be appropriately evaluated. By taking into consideration the fact that it is more difficult to beat time with a backbeat than to beat time at a timing acquired from the sound information (a downbeat), a mode may be adopted in which a higher score is added when a timing at which a player beats time matches a backbeat than the score to be added when the timing matches a downbeat.
When the evaluation unit 14c has added points to the score for all the timings at which the player beats time, or for the timings of all the beats in the reference tempo, the evaluation unit 14c calculates an evaluation by using the score. The evaluation unit 14c may use, for example, the score itself as an evaluation, or may calculate scored points on a 100-point scale, based on the following Expression (1).
Scored Points (Out of 100 points)=Basic Points+(Value of Score/(Number of Beats×“Excellent” Points))×(100−Basic Points) (1)
In the above-mentioned Expression (1), the “basic points” indicate the minimum points that can be acquired, such as 50 points. The “number of beats” indicates the number of all the timings at which the player beats time or the number of timings of all the beats in the reference tempo. The “Excellent” points are “2”. Accordingly, in Expression (1), the denominator in the fractional term corresponds to the maximum acquirable score. In addition, when all the timings are judged “Excellent!”, Expression (1) is calculated to be 100 points. Moreover, in Expression (1), 50 points are provided even when all the timings are judged “Bad!”, thereby enabling the motivation of the player who dances to be maintained.
In addition, when Expression (1) is used, the evaluation unit 14c is capable of calculating a score such that the value of the score increases with an increase in the number of timings at which the player beats time and an increase in the number of timings at which a difference from the timing indicated by the reference tempo is smaller than a predetermined value. This enables the tempo of the motion of the player to be evaluated from the viewpoint of whether the timings at which the player beats time match the respective timings indicated by the reference tempo. Note that the above-mentioned Expression (1) is just an example, and the evaluation unit 14c may use another mathematical expression in which points increase in response to the number of evaluations of “Excellent!”.
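Expression (1) can be sketched directly (the function name `scored_points` is an assumption; the defaults use the basic points of 50 and the “Excellent” points of 2 described above):

```python
def scored_points(score, number_of_beats, basic_points=50, excellent_points=2):
    """Compute the scored points on a 100-point scale per Expression (1):
    Basic Points + (Score / (Number of Beats x "Excellent" Points))
                 x (100 - Basic Points)."""
    maximum_score = number_of_beats * excellent_points  # maximum acquirable score
    return basic_points + (score / maximum_score) * (100 - basic_points)
```

For example, with 10 beats all judged “Excellent!” the score is 20, giving 100 points; a score of 0 still gives the basic 50 points.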
Next, the evaluation unit 14c stores the calculated evaluation in the storage unit 13 as the evaluation data 13d and transmits the evaluation to the output control unit 14d. In addition, at the time when all evaluation is finished, for example, at the time when a song ends, the evaluation unit 14c aggregates the evaluation results of the respective players, generates information to be displayed on a result display screen, described later, and outputs the information to the output control unit 14d.
Returning to the description of
In addition, the output control unit 14d performs control so as to output an evaluation result serving as a result of an evaluation, received from the evaluation unit 14c. The output control unit 14d transmits the evaluation result to, for example, the output unit 12 so that the output unit 12 outputs the evaluation result.
A description follows regarding an example of an evaluation result with reference to
In addition, the output control unit 14d performs a control such that an indication of the reference tempo and of the timing at which each player beats time is displayed. For example, as illustrated by a symbol 901 in
As illustrated by the symbol 901 in
In
In addition, at the time when, for example, a song ends, the output control unit 14d performs a control so as to output the result display screen using the information input from the evaluation unit 14c.
Returning to the description of
Flow of Processing
Next, a description follows regarding a flow of processing performed by the evaluation device 10 according to the first embodiment.
First, the area control unit 14a determines the number of players included in a captured image (step S101). Next, according to the identified number of persons, the area control unit 14a sets evaluation target areas (step S103) and causes dividing lines to be displayed superimposed on the captured image (step S105).
Next, the area control unit 14a waits until reception of input of “OK” as illustrated by a symbol 502 in
Next, the evaluation unit 14c detects a timing at which an amount of a temporal change in continuously image-captured frames temporarily decreases (step S203). Then, based on the timing of the detection, the evaluation unit 14c extracts a motion of beating time, performed by each of the players included in a frame, or a timing at which the relevant player beats time (step S204).
Next, the evaluation unit 14c registers, in the timing data 13b illustrated in
The acquisition unit 14b repeats the evaluation processing in step S121 until all evaluation is finished (step S131: No). When all the evaluation is finished (step S131: Yes), the evaluation unit 14c outputs information indicating the evaluation results to the output control unit 14d. The output control unit 14d performs control so as to output a result display screen (step S141), and the processing ends.
As described above, in the present embodiment, the evaluation device divides an imaging area into plural evaluation target areas and evaluates a motion of one player for each of the evaluation target areas. This enables a motion of each player to be adequately evaluated without being affected by a motion of a specific player, since the evaluation device does not evaluate motions of plural players.
If the evaluation device performs recognition processing for identifying individual players in order to detect motions of plural players within the same area, the processing load of the evaluation device may become high; furthermore, if the image-capturing accuracy of the evaluation device is low or the processing capacity of the evaluation device is low, the evaluation device may be prevented from performing a sufficient evaluation. In the present embodiment, the evaluation device detects a motion of a single player in each of the divided evaluation target areas. Accordingly, the evaluation device is capable of evaluating dances of plural persons in a simple manner, without performing recognition processing with a high processing volume (a high processing load).
Furthermore, in the evaluation device in the present embodiment, an indication of separation into each divided area is displayed superimposed on the display device. This, accordingly, enables each person to recognize target areas for the evaluation of a dance.
Note that, in the above-mentioned first embodiment, a configuration is described in which the evaluation device evaluates whether a timing at which a player beats time and a timing indicated by the reference tempo match each other. However, the evaluation device is not limited thereto. The evaluation device may, for example, separate a time period into plural intervals and evaluate, for each of the intervals, whether the number of timings at which a player beats time and the number of timings indicated by the reference tempo match each other. In addition, the evaluation device may evaluate, for example, whether the amount of a player's motion matches a melody characteristic indicated by the reference tempo and expressed as, for example, "aggressive" or "slow".
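The interval-based variation above can be sketched as follows. This is a minimal sketch under assumptions: the interval length of 2.0 seconds and the function name are illustrative, not taken from the embodiment.

```python
# A minimal sketch of the interval-based evaluation: separate the time
# period into intervals and check, per interval, whether the number of the
# player's beat timings equals the number of reference-tempo timings.
# The 2.0-second interval length is an assumption for illustration.

def count_match_per_interval(player_timings, reference_timings,
                             period_end, interval=2.0):
    """Return one boolean per interval indicating whether the counts match."""
    results = []
    start = 0.0
    while start < period_end:
        end = start + interval
        n_player = sum(1 for t in player_timings if start <= t < end)
        n_ref = sum(1 for t in reference_timings if start <= t < end)
        results.append(n_player == n_ref)
        start = end
    return results
```

Unlike the per-beat comparison, this variation tolerates small timing offsets within an interval as long as the player beats time the expected number of times.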
By outputting a visual effect according to an evaluation when displaying the evaluation of a player's dance, the evaluation device is able to cause the player to clearly recognize the evaluation and to provide a service with a high entertainment property. Therefore, a description follows regarding a configuration, as a second embodiment, in which the evaluation device outputs a visual effect according to an evaluation of a player's dance.
In the present embodiment, for example, according to an evaluation result of each of the players, the output control unit 14d of the evaluation device 10 causes a visual effect indicating the corresponding evaluation result to be output in the vicinity of the respective player.
The output control unit 14d outputs a message 811 of "Excellent!" when the difference between a timing at which the player A 401 beats time and the timing of a beat in the reference tempo is, for example, "0 seconds". Furthermore, the output control unit 14d causes a large star-shaped visual effect 801 to be output around the player A 401 according to the evaluation result of "Excellent!"
In addition, the output control unit 14d outputs a message 812 of “Good!” when a difference between a timing at which the player B 402 beats time and a timing of a beat in the reference tempo is, for example, “0.1 seconds”. Furthermore, the output control unit 14d causes a small star-shaped visual effect 802 to be output around the player B 402. In the same way, when a difference between a timing at which the player C 403 beats time and a timing of a beat in the reference tempo is, for example, “0.3 seconds”, the output control unit 14d causes a message 813 of “Bad!” to be output. In this case, the output control unit 14d causes no visual effect to be output around the player C 403.
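The grading described above can be sketched as a mapping from a timing difference to an evaluation message. The specific thresholds below are assumptions; the embodiment states only the example differences of 0, 0.1, and 0.3 seconds, and the sketch is merely consistent with those examples.

```python
# Hypothetical sketch of mapping a timing difference to an evaluation
# message, consistent with the examples above (0 s -> "Excellent!",
# 0.1 s -> "Good!", 0.3 s -> "Bad!"). The threshold values themselves
# are assumptions, not taken from the embodiment.

def grade_beat(player_time: float, reference_time: float) -> str:
    """Return the evaluation message for one beat."""
    diff = abs(player_time - reference_time)
    if diff <= 0.05:
        return "Excellent!"
    if diff <= 0.2:
        return "Good!"
    return "Bad!"
```

The output control unit would then select the message and the visual effect (large star, small star, or none) from the returned grade.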
In this way, the output control unit 14d visually depicts an evaluation result based on, for example, the number and size of stars. This enables each of the players to recognize the evaluation result at a glance. Note that, in the example illustrated in
In addition, a configuration may be such that, instead of displaying new images according to evaluation results, the output control unit 14d performs its control such that images already displayed are replaced by other images according to evaluation results. For example, a configuration may be such that the output control unit 14d performs its control such that a star-shaped image flows in according to the tempo, and changes the star-shaped image to an image of a music note if an evaluation result of "Excellent!" is obtained at a predetermined beat timing.
Note that a configuration may be such that the output control unit 14d controls the output of visual effects based not only on an evaluation result for the timing of one beat in the reference tempo, but also on evaluation results for the timings of plural beats in the reference tempo. For example, the output control unit 14d may associate specific portions of a graphic with the respective evaluation results for the timings of, for example, four beats. Each time an "Excellent!" judgement is obtained at the timing of one beat, the output control unit 14d may cause the portion of the graphic associated with the timing of that beat to be displayed. According to such a configuration, the output control unit 14d may perform its control such that a specific graphic is completed when "Excellent!" judgements are obtained at all the beat timings.
Note that in a configuration in which the output control unit 14d outputs a portion of the graphic according to a result of keeping a rhythm, a configuration that assigns portions to the respective players may be adopted. In a configuration outputting a graphic to be completed with, for example, three beats, the area control unit 14a performs its control so as to display the first portion of the graphic when the player A 401 keeps the first rhythm. In the same way, the area control unit 14a performs its control so as to display the second portion of the graphic when the player B 402 keeps the second rhythm, and the third portion of the graphic when the player C 403 keeps the third rhythm. According to such a configuration, the players not only compete with each other; another objective of the game, completing a graphic through cooperation, is presented to them.
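The graphic-completion control above can be sketched as follows. The class and method names are hypothetical; the sketch only illustrates the bookkeeping of which portions have been revealed, whether by one player's successive beats or by portions assigned to different players.

```python
# Hypothetical sketch of the graphic-completion control: each portion of a
# graphic is associated with one beat (or one player's beat), is revealed
# when that beat is kept, and the graphic is complete once every portion
# has been shown. Names are illustrative, not from the embodiment.

class GraphicCompletion:
    def __init__(self, num_portions: int = 3):
        # revealed[i] becomes True once portion i has been earned
        self.revealed = [False] * num_portions

    def on_rhythm_kept(self, portion_index: int) -> None:
        """Reveal the portion associated with the beat that was kept."""
        self.revealed[portion_index] = True

    def is_complete(self) -> bool:
        return all(self.revealed)
```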
A configuration may be such that the output control unit 14d performs its control so that a displayed graphic gradually disappears at a timing at which, for example, four beats finish, and a portion of the next graphic is displayed alongside.
Note that in each of the examples illustrated in
In addition, a configuration may be such that the output control unit 14d causes characters to be displayed on a screen and causes the evaluation result of each player to be visually displayed based on a motion of the corresponding character.
Since the characters move in tune with the evaluations of the respective players in this way, it is easier for each player to visually recognize the evaluation of his or her dance.
Note that the effect that the output control unit 14d outputs according to an evaluation is not limited to a visual effect. For example, a configuration may be such that the output control unit 14d performs its control so as to output a sound for the reference tempo in tune with the music and to output a different sound when a dance of a player matches the reference tempo, thereby notifying the player of an evaluation. In addition, a configuration may be such that, when a dance matches the reference tempo, the output control unit 14d uses an effect based on a tactile sensation, such as causing the corresponding one of the wrist bands 451 to 453 worn by the players to vibrate, thereby notifying the corresponding player of an evaluation. Furthermore, a configuration may be such that the output control unit 14d combines the visual effect, the sound-based effect, and the tactile effect to notify a player of an evaluation result. This enables a player to recognize an evaluation even when it is difficult for the player, who is dancing, to view the screen.
As described above, by outputting a visual effect, an acoustic effect, a tactile effect, or a combination thereof according to an evaluation result of a player, the evaluation device is able to provide a service in which the player easily recognizes the evaluation result and which has a high game property.
If, in each of the above-mentioned embodiments, a player crosses a dividing line and enters the evaluation target area of another player, or players are too close to each other, it may become difficult to properly evaluate the motions of the respective players. Therefore, a configuration may be considered in which, when a player moves across a dividing line or gets too close to another player, the evaluation device issues to the player a warning prompting the player to return to the original position.
In the present embodiment, the area control unit 14a in the evaluation device 10 detects whether or not each of the players moves across the indication partitioning the respective evaluation target areas. When detecting that one of the players has moved across a partitioning indication, the area control unit 14a outputs, to the output control unit 14d, an instruction to cause a display to be displayed on the screen, the display notifying that an evaluation target area has been trespassed.
In
A description follows regarding a configuration of detecting that players are too close to each other with reference to
Note that a configuration may be such that the area control unit 14a causes the output control unit 14d to output a visual effect in a form that a player more easily notices, such as changing the color of the screen or blinking the screen. In addition, a configuration may be such that, when a player moves across a dividing line, the area control unit 14a outputs, to the evaluation unit 14c, an instruction to subtract points from the evaluation of the relevant player.
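The two warning conditions above can be sketched as simple position checks. This is an illustrative sketch, not the embodiment's actual implementation; the use of horizontal pixel coordinates and the minimum-distance threshold are assumptions.

```python
# Illustrative sketch of the two warning conditions: a player crossing the
# dividing line of the assigned evaluation target area, and two players
# being too close to each other. Coordinates and the 50-pixel threshold
# are assumptions for this example.

def crossed_boundary(player_x: float, area_left: float, area_right: float) -> bool:
    """True if the player's position lies outside the assigned area."""
    return not (area_left <= player_x <= area_right)

def too_close(x1: float, x2: float, min_distance: float = 50.0) -> bool:
    """True if two players are closer than the minimum allowed distance."""
    return abs(x1 - x2) < min_distance
```

When either check is true, the warning display (or point subtraction) described above would be triggered for the relevant player.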
According to the present embodiment, it is possible to inhibit the dances of the respective players from being improperly evaluated in cases in which plural players enter one evaluation target area or players get too close to each other.
In each of the above-mentioned embodiments, a configuration may be such that, in addition to individually evaluating motions of respective players, the evaluation device evaluates whether or not players beat time at the same timing, in other words, whether or not the motions of the respective players are synchronized with one another. A configuration in which the evaluation device assigns a high evaluation when the motions of, for example, all the respective players are synchronized with one another may be adopted. In the present embodiment, synchronization determination processing for identifying the degree of synchronization between motions of respective players will be described. In the present embodiment, the degree of synchronization between motions of respective players is expressed as a “synchronization rate” in some cases.
In the present embodiment, in addition to individually evaluating dances of respective players, the evaluation unit 14c evaluates whether or not players beat time at the same timing.
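The synchronization determination can be sketched as follows. This is a hypothetical sketch: the tolerance value and the definition of the rate as a matched fraction of one player's timings are assumptions made for illustration.

```python
# Hypothetical sketch of the synchronization determination: the fraction
# of one player's beat timings that another player matches within a small
# tolerance. The 0.1-second tolerance is an assumption for illustration.

def synchronization_rate(timings_a, timings_b, tolerance=0.1):
    """Fraction of player A's beat timings matched by player B."""
    if not timings_a:
        return 0.0
    matched = sum(
        1 for ta in timings_a
        if any(abs(ta - tb) <= tolerance for tb in timings_b)
    )
    return matched / len(timings_a)
```

A high evaluation could then be assigned when the rate is close to 1 for all pairs of players.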
At points illustrated by a symbol 954 in
According to the present embodiment, in addition to the competition between the players, an element in which the players cooperate with each other is added. Therefore, it is possible to enhance the game property and to facilitate dances with a feeling of uniformity. Note that a configuration may be such that, even if beats are acquired from the persons at the same timing, no score is added if the relevant timing differs from the correct timing, for example, when evaluation results of "Excellent!" are not obtained.
Embodiments related to the disclosed device are described above. However, the present technology may be implemented in various different forms other than the above-mentioned embodiments.
For example, as described above, the evaluation device 10 (hereinafter, simply referred to as an evaluation device in some cases) may extract a rhythm of a player in synchronization with a karaoke device provided in a karaoke box. For example, the evaluation device 10 may extract a rhythm of a player in real time in synchronization with the karaoke device. Here, the term “real time” includes, for example, a form of serially performing processing on input frames and sequentially outputting processing results.
Upon receiving the message indicating that it is a timing to start the reproduction of the song, the evaluation device transmits, to the camera 43, an instruction to start image-capturing. Upon receiving the instruction to start image-capturing, the camera 43 starts image-capturing the player A 401 and the player B 402 who are present in an imaging area, and the camera 43 sequentially transmits, to the evaluation device, frames of the moving image data 13a obtained by the image-capturing.
In addition, sound information collected by the microphone 42, including the reproduced song and the voices of the players who are singing and dancing to the reproduced song, is sequentially transmitted to the evaluation device via the karaoke device 41. Note that such sound information is output in parallel with the frames of the moving image data 13a.
Upon receiving the frames transmitted by the camera 43, the evaluation device performs the above-mentioned various kinds of processing on the received frames. In addition, the evaluation device extracts timings at which the respective player A 401 and player B 402 beat time, and the evaluation device registers various kinds of information in the timing data 13b. In addition, upon receiving the sound information from the karaoke device 41, the evaluation device acquires the reference tempo from the received sound information. In addition, the evaluation device performs the above-mentioned evaluation and transmits an evaluation result to the karaoke device 41.
Upon receiving the evaluation result, the karaoke device 41 causes the received evaluation result to be displayed on the monitor 44. Thus, the player A 401 and the player B 402 are able to recognize the evaluation result. Note that the evaluation device 10 is able to cause the evaluation result to be displayed on the monitor 44 in real time. Therefore, according to the system 40, it is possible to swiftly output the evaluation result.
In addition, upon being notified by the karaoke device 41 of the message indicating that it is a timing to end the reproduction of the song, the evaluation device transmits, to the camera 43, an instruction to stop image-capturing. Upon receiving the instruction to stop image-capturing, the camera 43 stops the image-capturing.
As described above, in the system 40, the evaluation device is able to output the evaluation result in synchronization with the karaoke device 41 provided in the karaoke box.
In addition, a server provided outside of the karaoke box may have the same functions as the various kinds of functions included in the evaluation device, and the server may output an evaluation result.
Upon receiving the instruction to start image-capturing, the camera 53 starts image-capturing the player A 401 and the player B 402 who are present in an imaging area, and the camera 53 sequentially transmits, to the karaoke device 51, frames of the moving image data 13a obtained by the image-capturing. Upon receiving the frames transmitted by the camera 53, the karaoke device 51 sequentially transmits the received frames to the server 54 via a network 80. In addition, the karaoke device 51 sequentially transmits, to the server 54 via the network 80, sound information collected by the microphone 52, including the reproduced song and the voices of the players who are singing and dancing to the reproduced song. Note that such sound information is output in parallel with the frames of the moving image data 13a.
The server 54 performs, on the frames transmitted by the karaoke device 51, the same processing as the above-mentioned various kinds of processing performed by the evaluation device. In addition, the server 54 extracts timings at which the player A 401 and the player B 402 each beat time, and the server 54 registers various kinds of information in the timing data 13b. In addition, upon receiving the sound information from the karaoke device 51, the server 54 acquires the reference tempo from the received sound information. Then, the server 54 performs the above-mentioned evaluation and transmits an evaluation result to the mobile terminal 55 held by the player A 401 and the mobile terminal 56 held by the player B 402, via the network 80 and a base station 81.
Upon receiving the evaluation result, the mobile terminals 55 and 56 cause the received evaluation result to be displayed on displays in the respective mobile terminals 55 and 56. Thus, the player A 401 and the player B 402 are able to recognize the evaluation result from the mobile terminal 55 held by the player A 401 and the mobile terminal 56 held by the player B 402, respectively.
In addition, according to various kinds of loads, various usage situations, and so forth, processing operations at respective steps in individual processing operations described in the embodiments may be subdivided or integrated as desired. Furthermore, a step may be omitted.
In addition, according to various kinds of loads, various usage situations, and so forth, the order of processing operations at individual steps in each of processing operations described in the embodiments may be changed.
In addition, each of the configuration components of each of the devices illustrated in the drawings is functional and conceptual and does not have to be physically configured as illustrated in the drawings. Namely, the specific states of distribution or integration of the individual devices are not limited to those illustrated in the drawings, and all or part of the individual devices may be functionally or physically integrated or distributed in any given units according to various kinds of loads and various usage situations. The camera 43 described in the embodiment may be connected to, for example, the karaoke device 41 so as to be communicable with the evaluation device via the karaoke device 41. In addition, the functions of the karaoke device 41 and the evaluation device described in the embodiments may be implemented by, for example, a single computer.
Note that the separation between areas and the display dividing the areas may be fixed or may fluctuate (move). A configuration may be such that, according to the evaluations of the players, the partitioning display moves so that, for example, the area of a player whose score is high becomes wider. This makes it possible to provide a service with a high game property.
In addition, in the third embodiment, a configuration in which a warning is issued when a player enters the evaluation target area of another player is described. However, a configuration may be such that, in contrast, the area control unit 14a issues, for example, an instruction prompting a player to move into another evaluation target area. If a configuration is adopted in which the area control unit 14a adds a score when a player moves into another evaluation target area within a specified time period, an active dance with a high game property can be supported. Note that, since the processing load becomes high or false recognition occurs in some cases when detecting the motion of a player who is moving across areas, a configuration may be adopted in which no motion detection is performed during a specified time period and the positions of the respective players are detected when the specified time period ends.
Evaluation Program
A computer program prepared in advance is executed by a computer system such as a personal computer or a workstation, thereby realizing the various kinds of processing performed by the evaluation device 10 described in each of the above-mentioned embodiments. Therefore, an explanation follows, with reference to
As illustrated in
In the ROM 320, a basic program such as an operating system (OS) is stored. In addition, in the HDD 330, an evaluation program 330a to exert the same functions as those of the area control unit 14a, the acquisition unit 14b, the evaluation unit 14c, and the output control unit 14d illustrated in the above-mentioned embodiments is preliminarily stored. In addition, in the HDD 330, the moving image data 13a, the timing data 13b, the music tempo data 13c, and the evaluation data 13d are preliminarily stored.
The CPU 310 reads the evaluation program 330a from the HDD 330 and executes it. The CPU 310 reads the moving image data 13a, the timing data 13b, the music tempo data 13c, and the evaluation data 13d from the HDD 330 and stores them in the RAM 340. Furthermore, the CPU 310 executes the evaluation program 330a by using the various kinds of data stored in the RAM 340. Note that not all the data have to be stored in the RAM 340; only the data to be used for processing has to be stored in the RAM 340.
Note that the above-mentioned evaluation program 330a does not have to be stored in the HDD 330 from the start. The evaluation program 330a may be stored in a “portable physical medium” to be inserted into the computer 300, such as, for example, a flexible disk (FD), a CD-ROM, a DVD disk, a magneto-optical disk, or an IC card. Then, the computer 300 may read and execute the evaluation program 330a from one of these.
Furthermore, the evaluation program 330a may be stored in advance in “another computer (or a server)” or the like connected to the computer 300 via a public line, the Internet, a LAN, a WAN, or the like. In addition, the computer 300 may read and execute the evaluation program 330a from one of these.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
Number | Date | Country | Kind |
---|---|---|---|
2015-228982 | Nov 2015 | JP | national |