METHOD AND APPARATUS FOR SHOWING SPECIAL EFFECT, ELECTRONIC DEVICE, AND COMPUTER-READABLE MEDIUM

Information

  • Patent Application Publication Number
    20230254436
  • Date Filed
    May 27, 2021
  • Date Published
    August 10, 2023
Abstract
The present disclosure provides a special effect showing method and an apparatus thereof, an electronic device, and a computer-readable medium. The method includes: opening a special effect showing interface and turning on a video capturing apparatus; obtaining a music feature of background music in the special effect showing interface and second special effect elements generated according to the music feature, showing the second special effect elements in a preset order, and controlling the second special effect elements to move in the special effect showing interface according to the music feature; identifying a target object in a video captured by the video capturing apparatus and controlling a first special effect element in the special effect showing interface to move according to a movement of the target object; and triggering a special effect when the first special effect element and the second special effect elements satisfy a preset condition.
Description

This application claims the priority to Chinese Patent Application No. 202010699334.8, filed on Jul. 17, 2020, the entire disclosure of which is incorporated herein by reference as part of the disclosure of this application.


TECHNICAL FIELD

The present disclosure relates to the technical field of human-computer interaction, and in particular, to a method for showing a special effect and an apparatus thereof, an electronic device, and a computer-readable medium.


BACKGROUND

With the development of science and technology, the techniques of human-computer interaction have become increasingly mature. Many applications may involve human-computer interaction, and human-computer interaction in entertainment scenarios can bring more joy to users.


SUMMARY

An embodiment of the present disclosure provides a method for showing a special effect, and the method includes:


opening a special effect showing interface based on a triggering operation and turning on a video capturing apparatus, wherein a first special effect element is shown in the special effect showing interface;


obtaining a music feature of background music in the special effect showing interface and a plurality of second special effect elements generated according to the music feature, showing the second special effect elements in a preset order, and controlling the second special effect elements to move in the special effect showing interface according to the music feature;


identifying a target object in a video captured by the video capturing apparatus and controlling the first special effect element to move in the special effect showing interface according to a movement of the target object; and


triggering a special effect when the first special effect element and the second special effect elements satisfy a preset condition.


Another embodiment of the present disclosure provides an apparatus for showing a special effect, and the apparatus includes:


an opening module, configured to open a special effect showing interface based on a triggering operation and turn on a video capturing apparatus, wherein a first special effect element is shown in the special effect showing interface;


a special effect element showing module, configured to obtain a music feature of background music in the special effect showing interface and a plurality of second special effect elements generated according to the music feature, show the second special effect elements in a preset order, and control the second special effect elements to move in the special effect showing interface according to the music feature;


a special effect element controlling module, configured to identify a target object in a video captured by the video capturing apparatus and control the first special effect element to move in the special effect showing interface according to a movement of the target object; and


a special effect showing module, configured to trigger a special effect when the first special effect element and the second special effect elements satisfy a preset condition.


Still another embodiment of the present disclosure provides an electronic device, and the electronic device includes:


one or more processors;


a memory; and


one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, and the one or more applications are configured to perform the above-mentioned method for showing a special effect.


A further embodiment of the present disclosure provides a computer-readable medium, the computer-readable medium is configured to store computer instructions, and the computer instructions, when executed on a computer, cause the computer to perform the above-mentioned method for showing a special effect.





BRIEF DESCRIPTION OF THE DRAWINGS

In order to clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings that need to be used in the description of the embodiments of the present disclosure are briefly described below.



FIG. 1 is a flowchart of a method for showing a special effect provided by an embodiment of the present disclosure;



FIG. 2 is a schematic diagram of showing a first special effect element provided by an embodiment of the present disclosure;



FIG. 3 is a schematic diagram of showing a background image provided by an embodiment of the present disclosure;



FIG. 4-a is a schematic diagram of showing a first special effect element and a second special effect element provided by an embodiment of the present disclosure;



FIG. 4-b is a schematic diagram of showing a target object corresponding to a first special effect element provided by an embodiment of the present disclosure;



FIG. 4-c is a schematic diagram of showing a first special effect element corresponding to a target object provided by an embodiment of the present disclosure;



FIG. 5 is a schematic diagram of showing a plurality of second special effect elements with different lengths provided by an embodiment of the present disclosure;



FIG. 6 is a schematic diagram of showing second special effect elements at different showing position heights provided by an embodiment of the present disclosure;



FIG. 7 is a flowchart of a method for controlling a movement of a first special effect element provided by an embodiment of the present disclosure;



FIG. 8 is a schematic diagram of showing a preset region provided by an embodiment of the present disclosure;



FIG. 9 is a schematic diagram of displaying a target object provided by an embodiment of the present disclosure;



FIG. 10 is a schematic diagram of showing a special effect provided by an embodiment of the present disclosure;



FIG. 11 is a schematic diagram of showing a preset region as an entire special effect showing interface provided by an embodiment of the present disclosure;



FIG. 12 is a flowchart of a method for music selecting provided by an embodiment of the present disclosure;



FIG. 13 is a schematic structural diagram of an apparatus for showing a special effect provided by an embodiment of the present disclosure; and



FIG. 14 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.





The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent when taken in conjunction with the accompanying drawings and with reference to the following detailed description. Throughout the drawings, the same or similar reference numerals refer to the same or similar elements. It should be understood that the drawings are schematic and that the components and elements are not necessarily drawn to scale.


DETAILED DESCRIPTION

The embodiments of the present disclosure will be described below in more detail with reference to the accompanying drawings. While the accompanying drawings show some embodiments of the present disclosure, it should be understood that the present disclosure may be implemented in various forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure can be understood more thoroughly and comprehensively. It should be understood that the drawings and embodiments of the present disclosure are merely illustrative and not meant to limit the protection scope of the present disclosure.


It should be understood that steps described in the method implementations of the present disclosure may be performed in different orders and/or concurrently. In addition, the method implementations may include additional steps and/or the steps shown may be omitted. The scope of the present disclosure is not limited in this aspect.


As used herein, the term “include” and variants thereof are open-ended terms and should be construed as “including but not limited to.” The term “based on” means “at least partially based on.” The term “an embodiment” represents “at least one embodiment.” The term “another embodiment” represents “at least one further embodiment.” The term “some embodiments” represents “at least some embodiments.” The relevant definitions of other terms will be given in the following descriptions.


It needs to be noted that terms such as “first,” “second,” or the like mentioned in the present disclosure are only used for distinguishing between apparatuses, modules, or units, and are not intended to require that these apparatuses, modules, or units be different, nor to define the order or mutual dependence of the functions performed by these apparatuses, modules, or units.


It needs to be noted that terms such as “a/an” and “a plurality of” used herein are illustrative and non-limiting. It should be understood by a person skilled in the art that “a/an” shall be construed as “one or more” unless specified in the context otherwise.


Names of messages or information exchanged between a plurality of apparatuses in the embodiments of the present disclosure are only used for the purpose of description and not meant to limit the scope of these messages or information.


The inventors have noticed that human-computer interaction techniques in some entertainment scenarios simply obtain a user operation and make a simple response according to that operation. Such techniques cannot provide much joy of human-computer interaction and cannot show rich special effects. Moreover, there are no human-computer interaction scenarios that incorporate musical special effects, and special effects cannot be shown based on music characteristics. Therefore, such techniques have a narrow range of practical use.


Thus, the techniques of human-computer interaction can only show single and uninteresting special effects and need to be improved.


At least one embodiment of the present disclosure is intended to solve at least one of the technical defects described above, especially the technical problem of single and uninteresting showing of special effects in the techniques of human-computer interaction.


In at least one embodiment of the present disclosure, a music feature of background music of a special effect showing interface is obtained, and second special effect elements are generated based on the music feature. The second special effect elements are controlled to move in the special effect showing interface according to the music feature. The first special effect element is controlled to move in the special effect showing interface by obtaining a movement of a target object in a target trail direction. A special effect is triggered when the first special effect element and the second special effect elements satisfy a preset condition.


An embodiment of the present disclosure provides a method for showing a special effect. As illustrated in FIG. 1, the method includes the following steps.


Step S101: opening a special effect showing interface based on a triggering operation and turning on a video capturing apparatus, where a first special effect element is shown in the special effect showing interface.


Step S102: obtaining a music feature of background music in the special effect showing interface and a plurality of second special effect elements generated according to the music feature, showing the second special effect elements in a preset order, and controlling the second special effect elements to move in the special effect showing interface according to the music feature.


Step S103: identifying a target object in a video captured by the video capturing apparatus and controlling the first special effect element to move in the special effect showing interface according to a movement of the target object.


Step S104: triggering a special effect when the first special effect element and the second special effect elements satisfy a preset condition.


In the embodiments of the present disclosure, the provided method for showing a special effect is applied to a terminal device, such as a mobile terminal, a computer device (e.g., a desktop computer, a notebook computer, an all-in-one computer, etc.), or the like. The mobile terminal may include mobile devices such as a smart phone, a palm computer, a tablet computer, a wearable device with a display screen, etc. The video capturing apparatus described above may be a video capturing apparatus that the terminal device is equipped with or an external video capturing apparatus connected to the terminal device. Alternatively, the method for showing a special effect provided in the embodiments of the present disclosure may be a method for showing a special effect in an application installed on the terminal device.


It needs to be noted that in the present disclosure, the “special effect” refers to a special effect that normally does not appear in real life and is, for example, produced by computer software. The special effect includes, for example, a visual special effect, an audio special effect, or the like.


In the embodiments of the present disclosure, the first special effect element and the second special effect elements are elements for showing a special effect. The second special effect elements are generated and controlled based on the music feature, and the first special effect element is controlled by the movement of the target object to trigger the special effect. The first special effect element and the second special effect elements may include, but are not limited to, picture elements such as an illustration, a dynamic picture, or the like. For example, the first special effect element is a virtual tree hole, while the second special effect elements are virtual animals. When the first special effect element and the second special effect elements satisfy the preset condition, e.g., a virtual animal entering the virtual tree hole, the special effect is triggered.


The showing of the special effect may also be related to the music, and the incorporation of music elements brings better experience for the user.


In the embodiments of the present disclosure, the target object is an object in the video captured by the video capturing apparatus, which may be any object having an identifiable feature. For the embodiments of the present disclosure, to enhance the human-computer interaction experience of the user, a feature object of a human body is taken as an example for description. The target object may include a face feature object, such as a nose, an eye, a mouth, or the like, and may also include feature objects of other parts, such as a hand or a finger.


For the embodiments of the present disclosure, a specific embodiment is taken as an example for ease of illustration. In this embodiment, the method for showing a special effect is carried out in the terminal device. In one embodiment, the special effect showing interface is opened based on the triggering operation, and the special effect showing interface may be a display interface (e.g., a game interface or a shooting interface) in a certain application installed on the terminal device. The triggering operation may include a user tapping on a particular button to enter the special effect showing interface, a user using a particular speech to enter the special effect showing interface, or a user using a particular expression to enter the special effect showing interface, which will not be limited in the present disclosure. The special effect showing interface is as illustrated in FIG. 2. In the embodiment of the present disclosure, the first special effect element 202 is shown in the special effect showing interface 201, where the first special effect element may include various static or dynamic picture elements. Optionally, when entering the special effect showing interface, a background may also be displayed in the special effect showing interface. As illustrated in FIG. 3, a background image 203 is further provided in the special effect showing interface 201. The background image 203 may be, for example, a background image including user interface (UI) elements such as lawns and flowers, so as to provide a better visual experience for the user, and the embodiments of the present disclosure are not limited in this aspect.


In the embodiment of the present disclosure, Step S102 includes: obtaining the music feature of the background music in the special effect showing interface and the plurality of second special effect elements generated according to the music feature, showing the second special effect elements in the preset order, and controlling the second special effect elements to move in the special effect showing interface according to the music feature. In the embodiments of the present disclosure, for ease of illustration, taking the above specific embodiment as an example, the background music may be played in the special effect showing interface. For each piece of background music, a server or the terminal device may extract a music feature of the background music in advance and generate a plurality of second special effect elements according to the music feature. In addition, for each piece of background music, the terminal device may further extract the music feature of the background music in real time and generate a plurality of second special effect elements according to the music feature. For ease of illustration, taking a piece of background music as an example, the number of the second special effect elements corresponding to the music feature of the piece of background music, a length of each second special effect element shown in the special effect showing interface, a position height of each second special effect element present in the special effect showing interface, and a movement velocity of each second special effect element in the special effect showing interface are all related to the music feature of the piece of background music. The second special effect elements are shown in the preset order, and the movement of each second special effect element in the special effect showing interface is controlled according to the music feature. As illustrated in FIG. 4-a, a first special effect element A (i.e., the first special effect element 202) and a plurality of second special effect elements C (i.e., the second special effect elements 204) are provided in the special effect showing interface 201. Each of the second special effect elements 204 enters the special effect showing interface 201 in a preset order from the side opposite to the position of the first special effect element A.
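By way of a non-limiting illustration, the following sketch shows one possible way of generating the second special effect elements from a pre-extracted music feature, where one element is produced per feature music segment in play order. The field names, scaling constants, and function names (e.g., FeatureSegment, generate_second_effect_elements) are illustrative assumptions rather than part of the disclosure.

```python
# Minimal sketch: generating second special effect elements from a music feature.
# All fields and scale factors below are illustrative assumptions.
from dataclasses import dataclass
from typing import List


@dataclass
class FeatureSegment:
    start_time: float  # seconds into the background music
    duration: float    # play duration of the segment, in seconds
    pitch: float       # representative pitch of the segment (e.g., a MIDI note number)
    beats: int         # number of beats in the segment


@dataclass
class SecondEffectElement:
    spawn_time: float  # when the element enters the special effect showing interface
    length: float      # shown length, positively related to the segment's play duration
    height: float      # position height, positively related to the segment's pitch
    velocity: float    # movement velocity, positively related to the segment's beats


def generate_second_effect_elements(segments: List[FeatureSegment]) -> List[SecondEffectElement]:
    """One second special effect element is generated per feature music segment."""
    elements = []
    for seg in sorted(segments, key=lambda s: s.start_time):  # preset order = play order
        elements.append(SecondEffectElement(
            spawn_time=seg.start_time,
            length=80.0 * seg.duration,   # illustrative scale factor
            height=4.0 * seg.pitch,       # illustrative scale factor
            velocity=30.0 * seg.beats,    # illustrative scale factor
        ))
    return elements
```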


Optionally, after the special effect showing interface is opened, the terminal device may also play the above-mentioned background music for the user. The background music may be selected by the user or configured by default. Other function options may also be configured in the special effect showing interface to better meet the demands of the user. For example, operation options such as pause and start may be configured. The user may tap on the pause button to pause the movement of the second special effect element in the current interface or may tap on the start button to allow the second special effect element to continue moving in the current interface.


In the embodiment of the present disclosure, Step S103 includes: identifying the target object in the video captured by the video capturing apparatus and controlling the first special effect element to move in the special effect showing interface according to the movement of the target object. In the embodiment of the present disclosure, for ease of illustration, the target object is a feature object of the user's face, for example, the nose. The video capturing apparatus is a camera of the terminal device. After the camera captures an image of the user's head, the user's nose in the image of the user's head is identified, and the movement of the first special effect element 202 in the special effect showing interface 201 is controlled according to the movement of the user's nose. As illustrated in FIG. 4-b, the first special effect element 202 and the plurality of second special effect elements 204 are displayed in the special effect showing interface 201. When the target object is identified by the terminal device, the position of the first special effect element in the special effect showing interface is adjusted according to the position of the target object in the special effect showing interface. In one embodiment, the terminal device may adjust the position height of the first special effect element in the special effect showing interface according to the position height of the target object in the special effect showing interface. For example, as illustrated in FIG. 4-b, when the terminal device detects that the nose is in the middle position of the special effect showing interface 201, the first special effect element 202 is adjusted to be displayed in the middle of the special effect showing interface 201. As illustrated in FIG. 4-c, when the terminal device detects that the nose moves to the lower portion of the special effect showing interface, the first special effect element 202 is adjusted to be displayed in the lower portion of the special effect showing interface 201.
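As a non-limiting illustration of this step, the sketch below adjusts the position height of the first special effect element according to the detected position height of the target object; the assumption is that some detector supplies the nose's vertical coordinate in interface coordinates (or None when no face is found), and the clamping to a preset region is also an illustrative assumption.

```python
# Minimal sketch: following the target object (e.g., the nose) vertically,
# clamped to the preset region of the special effect showing interface.
from typing import Optional


def update_first_element_y(current_y: float,
                           nose_y: Optional[float],
                           region_top: float,
                           region_bottom: float) -> float:
    """Adjust the first special effect element's position height to the target object's."""
    if nose_y is None:       # target object not identified in this frame
        return current_y     # keep the previous position
    return max(region_top, min(region_bottom, nose_y))
```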


In the embodiment of the present disclosure, Step S104 includes triggering the special effect when the first special effect element and the second special effect elements meet the preset condition. In the embodiment of the present disclosure, for ease of illustration, taking the above specific embodiment as an example, the plurality of second special effect elements are controlled to move in the special effect showing interface according to the music feature, and the first special effect element is controlled, by the user using the target object (e.g., the nose), to move in the special effect showing interface. The special effect is triggered when the first special effect element and the second special effect elements meet the preset condition. In one embodiment, the special effect of the first special effect element coming into contact with the second special effect element is triggered when the first special effect element is in contact with the second special effect element. For example, when the first special effect element is a tree hole and the second special effect elements are animals, an animation effect of the animal entering the tree hole is triggered when an animal comes into contact with the tree hole, and simultaneously, special effects of graphic elements such as scores, music notes and celebration graphics may be displayed, or a sound effect may be played, or the sound associated with the second special effect element may be played. For example, when the second special effect element is an animal element, the sound of the animal or the like may be played. The present disclosure has no particular limitation on this.
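As a non-limiting illustration, one possible way to model the preset condition "the first special effect element is in contact with a second special effect element" is an overlap test between axis-aligned bounding boxes; the Box layout and the triggered action in the sketch below are illustrative assumptions.

```python
# Minimal sketch: contact test between the first special effect element (tree hole)
# and a second special effect element (animal), modeled as axis-aligned boxes.
from dataclasses import dataclass


@dataclass
class Box:
    x: float
    y: float
    width: float
    height: float


def in_contact(a: Box, b: Box) -> bool:
    """Axis-aligned bounding-box overlap test."""
    return (a.x < b.x + b.width and b.x < a.x + a.width and
            a.y < b.y + b.height and b.y < a.y + a.height)


def maybe_trigger_effect(tree_hole: Box, animal: Box) -> None:
    if in_contact(tree_hole, animal):
        # e.g., play the "animal enters the tree hole" animation, show score and
        # note graphics, and play the sound associated with the animal element
        print("special effect triggered: animal enters the tree hole")
```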


In the embodiment of the present disclosure, optionally, there may be a plurality of first special effect elements and a plurality of target objects accordingly, where one first special effect element corresponds to one target object. Different target objects control different first special effect elements to move, respectively. Thus, for example, interaction among a plurality of users can be realized. For ease of illustration, taking a specific embodiment as an example, when the special effect showing interface is opened, a game mode may be selected, such as a two-player/multiple-player mode. In this case, two or more first special effect elements may be displayed accordingly in the special effect showing interface. The terminal device then turns on the video capturing apparatus to identify the target object (e.g., a nose) in the video. When a plurality of users are present in the video captured by the video capturing apparatus, the plurality of target objects (e.g., the noses) are identified, and the movement trail of each nose is identified. The corresponding first special effect element is controlled to move in the special effect showing interface according to the movement trail of each nose such that each first special effect element comes into contact with the second special effect element, and different special effects are shown according to contact results. In another embodiment, in a case where the terminal device turns on the video capturing apparatus to identify a plurality of target objects present in the video, the corresponding number of first special effect elements may be displayed accordingly in the special effect showing interface, and the movement trail of each target object is identified. The corresponding first special effect element is controlled to move in the special effect showing interface according to the movement trail of each target object such that each first special effect element comes into contact with the second special effect element, and different special effects are shown according to the contact results. In the embodiment of the present disclosure, a plurality of users may control, on the same device, the first special effect elements to come into contact with the second special effect elements, respectively, thereby enhancing the competitive nature of the game.
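As a non-limiting illustration of the two-player/multi-player case, the sketch below pairs each identified target object with its own first special effect element. The index-based pairing and the names used are illustrative assumptions; a real implementation would additionally track each target object's identity across frames.

```python
# Minimal sketch: pairing each detected target object (e.g., each nose) with one
# first special effect element in a multi-player mode.
from typing import Dict, List, Tuple

Point = Tuple[float, float]  # (x, y) in interface coordinates


def assign_targets_to_elements(nose_positions: List[Point],
                               element_ids: List[str]) -> Dict[str, Point]:
    """Pair the i-th detected target object with the i-th first special effect element."""
    pairs = {}
    for element_id, nose in zip(element_ids, nose_positions):
        pairs[element_id] = nose
    return pairs
```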


According to the embodiments of the present disclosure, the music feature of the background music in the special effect showing interface is obtained, and the second special effect elements are generated based on the music feature. The second special effect elements are controlled to move in the special effect showing interface according to the music feature. The first special effect element is controlled to move in the special effect showing interface by obtaining the movement of the target object in the target trail direction. The special effect is triggered when the first special effect element and the second special effect elements meet the preset condition. In addition, showing of a special effect according to the embodiments of the present disclosure is related to the music, and the incorporation of music elements brings better experience for the user. Specifically, the user uses the nose to control the first special effect element to move in the special effect showing interface, allowing more second special effect elements to come into contact with the first special effect element and obtain scores. Thus, the interaction and game experience of the user can be further improved.


The embodiment of the present disclosure provides a possible implementation, and in this implementation, the music feature includes the number of feature music segments of the background music, where the number of the second special effect elements corresponds to the number of the feature music segments.


In the embodiment of the present disclosure, the music feature of the background music of the special effect showing interface includes the number of feature music segments, where the feature music segments may include information segments capable of characterizing the features of music elements included in the music, and the music elements may include, but are not limited to, the duration of the music, characters and notes included in the music, or the like. In an embodiment, the feature music segments may be divided according to the characters included in the music. For example, for the music of a Chinese song, a music segment corresponding to one Chinese character in the lyrics of the music may be regarded as one feature music segment. In an embodiment, the feature music segments may be divided according to the duration. For example, the music may be divided into a plurality of music segments according to a certain duration. In this case, one music segment may be regarded as one feature music segment. Alternatively, for different pieces of music, feature music segments may be divided in different manners according to fast and slow music rhythms. For example, for two pieces of music with the same duration, the music with a fast rhythm may have more feature music segments than the music with a slow rhythm. Correspondingly, the music with the fast music rhythm may correspond to more second special effect elements.
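As a non-limiting illustration, the sketch below shows the two segmentation strategies mentioned above: one feature music segment per lyric character, or segments of a fixed duration. The timestamped-lyric input format is an illustrative assumption; in practice such data might be prepared on a server in advance.

```python
# Minimal sketch: dividing background music into feature music segments,
# either per lyric character or per fixed duration.
from typing import List, Tuple

# (character, start_time_in_seconds, end_time_in_seconds)
TimedCharacter = Tuple[str, float, float]


def segments_by_character(timed_lyrics: List[TimedCharacter]) -> List[Tuple[float, float]]:
    """One feature music segment per character in the lyrics."""
    return [(start, end) for _, start, end in timed_lyrics]


def segments_by_duration(total_duration: float, segment_length: float) -> List[Tuple[float, float]]:
    """Divide the music into segments of a fixed duration."""
    segments = []
    start = 0.0
    while start < total_duration:
        segments.append((start, min(start + segment_length, total_duration)))
        start += segment_length
    return segments
```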


The music feature of the background music includes the number of feature music segments of the background music, and the number of the second special effect elements corresponds to the number of the feature music segments. Specifically, one feature music segment may correspond to one second special effect element. Taking the song “Two Tigers” as an example, the song includes 32 Chinese characters in total, and the feature music segments may be divided according to the characters. Thus, the song may include 32 feature music segments, and 32 second special effect elements may be generated correspondingly. That is, one Chinese character corresponds to one second special effect element. When a plurality of second special effect elements are shown in the special effect showing interface, the plurality of second special effect elements are shown in sequence according to the music feature of the song “Two Tigers.” The user may use the target object such as the nose to control the first special effect element to come into contact with the second special effect element, so as to trigger the special effect of contact. Further, special effects such as scores, musical special effects, or the like may be triggered.


According to the embodiments of the present disclosure, the second special effect elements are generated according to the number of the feature music segments of the music, and the special effect showing process is related to the music. Thus, the user experience and interaction joy can be improved.


The embodiments of the present disclosure provide a possible implementation, and in this implementation, the music feature further includes a play duration of each feature music segment. For each second special effect element, a length of the second special effect element shown in the special effect showing interface is positively related to the play duration of the feature music segment corresponding to the second special effect element.


In the embodiments of the present disclosure, the music feature further includes the play duration of each feature music segment. For each second special effect element, the length of the second special effect element shown in the special effect showing interface is positively related to the play duration of the feature music segment corresponding to the second special effect element, and the longer the play duration of the feature music segment, the greater the length of the corresponding second special effect element shown in the special effect showing interface. For example, as illustrated in FIG. 5, for three second special effect elements generated from “run so fast” in the song “Two Tigers,” the length of the second special effect element 501 corresponding to “fast” is greater than the lengths of the second special effect element 502 and the second special effect element 503 corresponding to “run” and “so,” respectively. The ratio of the play duration of the feature music segment to the length of the corresponding second special effect element shown in the special effect showing interface may be preset, which will not be limited herein. When the second special effect elements are shown in the special effect showing interface, the lengths of different second special effect elements in the special effect showing interface may differ. When the user uses the target object, such as the nose, to control the first special effect element to come into contact with the second special effect elements, the contact duration of each second special effect element with the first special effect element may be different. Thus, more diverse playing methods can be provided so that the joy of the user can be enhanced.
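As a non-limiting illustration of this positive relation, the sketch below maps a segment's play duration to the element's shown length with a preset ratio; the constant used is an illustrative assumption.

```python
# Minimal sketch: shown length is positively related to the play duration of the
# corresponding feature music segment, via a preset ratio.
LENGTH_PER_SECOND = 120.0  # interface length units per second of play duration (assumed)


def element_length(play_duration: float) -> float:
    return LENGTH_PER_SECOND * play_duration
```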


In the embodiments of the present disclosure, the length of the corresponding second special effect element in the special effect showing interface is controlled by the play duration of the feature music segment, thereby allowing for richer visual effects on the interface, more diverse playing methods, and better user experience.


The embodiments of the present disclosure provide a possible implementation, and in this implementation, the music feature further includes a pitch of each feature music segment. For each second special effect element, a position height of the second special effect element shown in the special effect showing interface is positively related to the pitch of the feature music segment corresponding to the second special effect element.


In the embodiments of the present disclosure, the music feature further includes the pitch of each feature music segment. The pitch refers to the pitch of a note corresponding to the feature music segment. If the feature music segment corresponds to one note, the pitch of the feature music segment is the pitch of the note. If the feature music segment corresponds to a plurality of notes, the pitch of the feature music segment is an average value of the pitches of the plurality of notes. The position height of each second special effect element shown in the special effect showing interface is positively related to the pitch of the feature music segment. For example, as illustrated in FIG. 6, the second special effect elements corresponding to the feature music segments with different pitches are different in position height in the special effect showing interface, and the pitch of the feature music segment corresponding to the second special effect element 601 is higher than that of the feature music segment corresponding to the second special effect element 602. Thus, as illustrated in FIG. 6, the position height of the second special effect element 601 in the special effect showing interface is greater than that of the second special effect element 602 in the special effect showing interface. When showing a special effect, the second special effect elements are different in position height in the special effect showing interface. When the user uses the nose to control the first special effect element to come into contact with the second special effect element, the user may move the nose up and down, so that the interaction experience of the user can be improved.
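As a non-limiting illustration, the sketch below averages the pitches of a segment's notes and maps that pitch linearly onto the interface height; the linear mapping and its range are illustrative assumptions.

```python
# Minimal sketch: position height is positively related to the pitch of the
# corresponding feature music segment (average of its notes' pitches).
from typing import List


def segment_pitch(note_pitches: List[float]) -> float:
    """Average pitch of the notes covered by the feature music segment."""
    return sum(note_pitches) / len(note_pitches)


def element_height(pitch: float, min_pitch: float, max_pitch: float,
                   interface_height: float) -> float:
    """Map pitch linearly onto the interface height: higher pitch, higher position."""
    if max_pitch == min_pitch:           # degenerate case: place at mid-height
        return interface_height / 2
    ratio = (pitch - min_pitch) / (max_pitch - min_pitch)
    return ratio * interface_height
```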


In the embodiments of the present disclosure, the position height of the corresponding second special effect element in the special effect showing interface is determined according to the pitch of the feature music segment. The user may continuously control the first special effect element to move up and down in the special effect showing interface. Thus, the interaction experience of the user and the joy of playing the game can be improved.


The embodiments of the present disclosure provide a possible implementation, and in this implementation, the music feature further includes the number of beats of each feature music segment. For each second special effect element, a movement velocity of the second special effect element in the special effect showing interface is positively related to the number of beats of the feature music segment corresponding to the second special effect element.


In the embodiments of the present disclosure, the music feature further includes the number of beats of each feature music segment. The movement velocity of each second special effect element in the special effect showing interface is positively related to the number of beats of the feature music segment corresponding to the second special effect element. The more the beats in a unit time period, the greater the movement velocity of the corresponding second special effect element in the special effect showing interface. When showing a special effect, second special effect elements are set to be different in movement velocity according to the rhythm of the background music. The user may use the nose to control the first special effect element to come into contact with the second special effect element, so that the joy of interaction in the game can be improved.
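As a non-limiting illustration of this positive relation, the sketch below derives the movement velocity from the number of beats per unit time of the corresponding feature music segment; the base speed constant is an illustrative assumption.

```python
# Minimal sketch: movement velocity is positively related to the number of beats
# of the corresponding feature music segment (more beats per second, faster movement).
BASE_SPEED = 40.0  # interface units per second, per beat-per-second (assumed)


def element_velocity(beats: int, segment_duration: float) -> float:
    beats_per_second = beats / segment_duration
    return BASE_SPEED * beats_per_second
```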


In the embodiments of the present disclosure, the movement velocity of the second special effect element in the special effect showing interface is controlled according to the number of beats of the feature music segment. With different pieces of music, the second special effect elements are different in movement velocity with different game difficulties. Thus, the game playing methods can be increased, and the user experience can be improved.


The embodiments of the present disclosure provide a possible implementation, and in this implementation, the preset order includes a play order of the feature music segments. In the embodiments of the present disclosure, the order of the second special effect elements appearing in the special effect showing interface is the play order of the feature music segments corresponding to the second special effect elements. For ease of illustration, taking the above specific embodiment for example, the second special effect elements are controlled to appear in the special effect showing interface according to the play order of the music, ensuring that the user can control the first special effect element to come into contact with the second special effect element and guaranteeing that the game proceeds smoothly.


The embodiments of the present disclosure provide a possible implementation, and in this implementation, as illustrated in FIG. 7, controlling the first special effect element to move in the special effect showing interface according to the movement of the target object includes: obtaining a movement trail of the target object, and controlling the first special effect element to move in the special effect showing interface according to the movement trail of the target object. In one embodiment, the terminal device may obtain the movement trail of the target object and control the first special effect element to move in the special effect showing interface according to the movement trail of the target object, thereby controlling the first special effect element to come into contact with the second special effect element.


In one embodiment, controlling the first special effect element to move in the special effect showing interface according to the movement of the target object includes:


Step S701: obtaining a movement trail of the target object in a target direction; and


Step S702: controlling the first special effect element to move in the special effect showing interface according to the movement trail of the target object in the target direction.


In the embodiments of the present disclosure, the target direction may include a movement direction of the first special effect element, where the movement direction of the first special effect element may be perpendicular to that of the second special effect element. For example, when the first special effect element moves in a vertical direction, the movement direction of the second special effect element is a horizontal direction, and in this case, the target direction is the vertical direction. If the first special effect element moves in the horizontal direction, the movement direction of the second special effect element is the vertical direction, and in this case, the target direction is the horizontal direction. In addition, in an embodiment, the target direction may include a movement direction of the first special effect element, where the movement direction of the first special effect element may be consistent with that of the second special effect element. For example, when the first special effect element moves in a vertical direction, the movement direction of the second special effect element may also be the vertical direction. When the first special effect element moves in a horizontal direction, the movement direction of the second special effect element may also be the horizontal direction. The present disclosure has no particular limitation on this.


As an embodiment of the present disclosure, the terminal device firstly determines the target direction, then obtains the movement trail of the target object in the target direction in the video captured by the video capturing apparatus, and controls the first special effect element to move in the special effect showing interface based on the movement trail. For ease of illustration, taking one of the embodiments described above as an example for description, as illustrated in FIG. 8, the first special effect element 801 may move in a preset region 802. Assuming that the target direction is the vertical direction, when the user uses the target object (e.g., the nose) to control the first special effect element to move in the vertical direction, the terminal device identifies the movement trail of the nose in the vertical direction and then controls the first special effect element to come into contact with the second special effect element according to the movement trail of the nose in the vertical direction.
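As a non-limiting illustration of Steps S701 and S702, the sketch below keeps only the component of the target object's movement trail along the target direction (assumed here to be the vertical direction) and uses it to drive the first special effect element within the preset region; the trail format and clamping are illustrative assumptions.

```python
# Minimal sketch: driving the first special effect element from the target
# object's movement trail in the target direction (assumed vertical), clamped
# to the preset region.
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in interface coordinates


def trail_in_vertical_direction(trail: List[Point]) -> List[float]:
    """Keep only the vertical component of the target object's movement trail."""
    return [y for _, y in trail]


def drive_first_element(element_y: float, trail: List[Point],
                        region_top: float, region_bottom: float) -> float:
    vertical_trail = trail_in_vertical_direction(trail)
    if not vertical_trail:
        return element_y
    target_y = vertical_trail[-1]  # follow the most recent position in the trail
    return max(region_top, min(region_bottom, target_y))
```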


A person skilled in the art should understand that the size and position of the preset region in FIG. 8 are merely illustrative, the specific size of the preset region and the position thereof in the special effect showing interface may be set according to practical situations, and the present disclosure is not limited in this aspect.


In the embodiments of the present disclosure, the movement direction of the first special effect element is determined as the target direction, and the movement trail of the target object in the target direction is obtained. The first special effect element is controlled to move based on the movement trail, ensuring that the game proceeds normally and improving the joy and experience of user interaction.


The embodiments of the present disclosure provide a possible implementation, and in this implementation, after identifying the target object in the video captured by the video capturing apparatus, the method further includes: displaying the target object in the special effect showing interface, and adjusting the position of the first special effect element in the special effect showing interface according to the position of the target object in the special effect showing interface.


In an embodiment, the terminal device may adjust the position height of the first special effect element in the special effect showing interface according to the position height of the target object in the special effect showing interface, so that the position of the first special effect element in the special effect showing interface is on the same horizontal line as the target object. In an embodiment of the present disclosure, as illustrated in FIG. 9, when the special effect showing interface is opened, the terminal device displays the video captured by the video capturing apparatus in the special effect showing interface, where the video includes the target object. Moreover, the position of the first special effect element in the special effect showing interface is adjusted according to the position of the target object in the video. For ease of illustration, taking the above specific embodiment as an example, as illustrated in FIG. 9, the video 902 captured by the video capturing apparatus is displayed in the special effect showing interface 901, where the video 902 includes the target object 903 which may be a user's nose. The position of the first special effect element 904 in the special effect showing interface is then adjusted to be on the same horizontal line as the target object according to the position of the target object 903 in the video. For example, the position of the first special effect element 904 when it just appears in the special effect showing interface 901 is shown by dotted lines. The first special effect element 904 is then adjusted to a position on the same horizontal line as the target object 903, thereby allowing the user to use the target object to control the movement of the first special effect element.


In the embodiments of the present disclosure, the target object is displayed in the special effect showing interface so that the user can see the position of the target object conveniently, and the position of the first special effect element in the special effect showing interface is adjusted according to the initial position of the target object, thereby facilitating the setting of the initial interface and improving the user experience.


The embodiments of the present disclosure provide a possible implementation, and in this implementation, triggering the special effect when the first special effect element and the second special effect elements satisfy the preset condition includes: triggering a first special effect when the second special effect element moves to a preset region in the special effect showing interface and comes into contact with the first special effect element; and triggering a second special effect when the second special effect element moves to the preset region in the special effect showing interface and is not in contact with the first special effect element, where the preset region is a region in which the first special effect element is capable of moving.


In the embodiments of the present disclosure, the preset region may be a region in which the first special effect element is capable of moving, and the preset region may be a region with a preset size on one side of the special effect showing interface. As illustrated in FIG. 10, for ease of illustration, taking the above specific embodiment as an example, the first special effect element 1002 (i.e., the first special effect element A) is displayed in the special effect showing interface 1001. The first special effect element 1002 is capable of moving in the preset region 1003. In the special effect showing interface 1001, a plurality of second special effect elements 1004 generated according to the music feature are shown, and the second special effect elements 1004 move toward the first special effect element 1002 from the side opposite to the position of the first special effect element 1002. The user uses the target object such as the nose to control the first special effect element 1002 to move in the preset region 1003. When the second special effect element 1004 moves to the preset region 1003, if the user uses the nose to control the first special effect element 1002 to come into contact with the second special effect element 1004, the first special effect is triggered. When the second special effect element 1004 moves to the preset region 1003, if the user fails to use the nose to control the first special effect element 1002 to come into contact with the second special effect element 1004, the second special effect is triggered.


For example, showing the first special effect may include playing first pre-configured music, such as a sound or background music associated with the second special effect element, and may also include playing a first pre-configured animation. Taking, for example, a case in which the first special effect element is the tree hole element while the second special effect elements are animal elements, the first pre-configured animation may include special effects such as dazzling light being given out around the tree hole and the animal disappearing after it enters the tree hole. Showing the second special effect may include playing second pre-configured music or playing a second pre-configured animation. Taking, for example, a case in which the first special effect element is the tree hole element while the second special effect elements are animal elements, the second pre-configured music may include, for example, the sound of the animal bumping into a wall or the sobbing of the animal, and the second pre-configured animation may include animation effects such as the animal stopping moving, revolving stars appearing above the head of the animal, and after a period of time, the animal and the stars above the head of the animal disappearing simultaneously. Alternatively, when the user uses the nose to move the first special effect element to come into contact with the second special effect element within the preset region, the terminal device may set bonus points for the user and record the scores of the user in the whole song. With different pieces of background music, the second special effect elements are different in movement velocity in the special effect showing interface, so that different game difficulties are provided, thereby providing more choices and challenges for the user.
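As a non-limiting illustration of this hit/miss logic, the sketch below resolves each second special effect element once it is inside the preset region: contact with the first special effect element triggers the first special effect and bonus points, and no contact triggers the second special effect. The one-dimensional contact test and the scoring value are illustrative assumptions.

```python
# Minimal sketch: triggering the first or second special effect when a second
# special effect element has moved into the preset region, depending on whether
# it is in contact with the first special effect element.
from dataclasses import dataclass


@dataclass
class Interval:
    low: float
    high: float

    def overlaps(self, other: "Interval") -> bool:
        return self.low < other.high and other.low < self.high


def resolve_element(element_y: Interval, first_element_y: Interval,
                    in_preset_region: bool, score: int) -> int:
    """Return the updated score and trigger the corresponding special effect."""
    if not in_preset_region:
        return score                 # the element has not reached the preset region yet
    if element_y.overlaps(first_element_y):
        print("first special effect: animal enters the tree hole")   # plus animation/sound
        return score + 100           # illustrative bonus points
    print("second special effect: animal bumps into the wall")       # plus animation/sound
    return score
```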


Alternatively, in the method for showing a special effect provided by the embodiments of the present disclosure, the movement range (i.e., the preset region) of the first special effect element may include the entire special effect showing interface or a part of the special effect showing interface. As illustrated in FIG. 11, in the special effect showing interface 1101, the first special effect element 1102 (i.e., the first special effect element A) may move arbitrarily in the preset region 1103, and the movement trail thereof is controlled by the movement trail of the target object. The second special effect elements 1104 (i.e., the second special effect elements C) may appear from one side of the special effect showing interface. The user controls the first special effect element 1102 to move in the preset region 1103 to come into contact with the second special effect elements 1104. Certainly, the specific size of the preset region and the position thereof in the special effect showing interface may be set according to practical situations. With preset regions of different sizes and positions, the user may control the first special effect element to move along different trails, thereby allowing the user to experience more joy in the game.


In the embodiments of the present disclosure, different special effects are provided when the first special effect element is or is not in contact with the second special effect elements. When the user uses the target object such as the nose to control the first special effect element to come into contact with the second special effect elements or fails to bring the first special effect element into contact with the second special effect elements, different special effects are triggered to be shown, thereby providing a richer experience for the user.


In a possible implementation provided in an embodiment of the present disclosure, as illustrated in FIG. 12, the method further includes:


Step S1201: receiving a music selecting operation by a user; and


Step S1202: determining the background music for the special effect showing interface according to the music selecting operation.


In the embodiments of the present disclosure, the user may select different pieces of background music for the game, which correspond to different game difficulties. When the user wants to select different pieces of background music, the user can perform the music selecting operation on the special effect showing interface. When receiving the music selecting operation, the terminal device determines the background music for the special effect showing interface according to the music selecting operation. Moreover, for different pieces of background music, the number of the second special effect elements generated by the terminal device, and the showing lengths, position heights and movement velocities thereof in the special effect showing interface may also be different. For example, the user taps on a music selecting button on the special effect showing interface to display a music selecting interface in the special effect showing interface. There are multiple pieces of music in the music selecting interface for selection, and the user may tap on one piece of music to determine the music as the background music for the special effect showing interface. In an embodiment, for each piece of background music, the server or the terminal device may extract the music feature of the background music in advance and generate multiple second special effect elements according to the music feature. In addition, for each piece of background music, the terminal device may further extract the music feature of the background music in real time and generate multiple second special effect elements according to the music feature.
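As a non-limiting illustration of Steps S1201 and S1202, the sketch below handles a music selecting operation by looking up the chosen piece, extracting (or fetching) its music feature, and regenerating the second special effect elements. The function names and the callable parameters are illustrative assumptions standing in for the feature-extraction and element-generation steps described above.

```python
# Minimal sketch: determining the background music from a music selecting
# operation and rebuilding the second special effect elements accordingly.
from typing import Callable, Dict, List


def on_music_selected(selected_title: str,
                      music_library: Dict[str, bytes],
                      extract_music_feature: Callable[[bytes], List],
                      generate_second_effect_elements: Callable[[List], List]) -> List:
    """Determine the background music and regenerate its second special effect elements."""
    background_music = music_library[selected_title]          # the user's chosen piece
    feature = extract_music_feature(background_music)          # pre-extracted or real-time
    return generate_second_effect_elements(feature)
```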


In the embodiments of the present disclosure, users may select different pieces of background music according to their own preferences and thus select different game difficulties accordingly, thereby enriching the game experience.


In the embodiments of the present disclosure, the music feature of the background music of the special effect showing interface is obtained, and the second special effect elements are generated based on the music feature. The second special effect elements are controlled to move in the special effect showing interface according to the music feature. The first special effect element is controlled to move in the special effect showing interface by obtaining the movement of the target object in the target trail direction. The special effects are triggered when the first special effect element and the second special effect elements meet the preset condition. The showing of the special effect is related to the music, and the incorporation of music elements brings better experience for the user. Specifically, the user uses the target object to control the first special effect element to move in the special effect showing interface, thereby ensuring that more second special effect elements come into contact with the first special effect element to gain scores and further improving the game experience of the user.


The embodiments of the present disclosure provide an apparatus for showing a special effect. As illustrated in FIG. 13, the apparatus 130 for showing a special effect may include an interface opening module 1301, a special effect element showing module 1302, a special effect element controlling module 1303, and a special effect showing module 1304.


The interface opening module 1301 is configured to open a special effect showing interface based on a triggering operation and turn on a video capturing apparatus, where a first special effect element is shown in the special effect showing interface.


The special effect element showing module 1302 is configured to


obtain a music feature of background music in the special effect showing interface and a plurality of second special effect elements generated according to the music feature, show the second special effect elements in a preset order, and control the second special effect elements to move in the special effect showing interface according to the music feature.


The special effect element controlling module 1303 is configured to identify a target object in a video captured by the video capturing apparatus and control the first special effect element to move in the special effect showing interface according to a movement of the target object.


The special effect showing module 1304 is configured to trigger a special effect when the first special effect element and the second special effect elements satisfy a preset condition.


Optionally, the music feature includes the number of feature music segments of the background music, where the number of the second special effect elements corresponds to the number of the feature music segments.


Optionally, the music feature further includes a play duration of each feature music segment. For each second special effect element, a length of the second special effect element shown in the special effect showing interface is positively related to the play duration of the feature music segment corresponding to the second special effect element.


Optionally, the music feature further includes a pitch of each feature music segment. For each second special effect element, a position height of the second special effect element shown in the special effect showing interface is positively related to the pitch of the feature music segment corresponding to the second special effect element.


Optionally, the music feature further includes the number of beats of each feature music segment. For each second special effect element, a movement velocity of the second special effect element in the special effect showing interface is positively related to the number of beats of the feature music segment corresponding to the second special effect element.


Optionally, the preset order includes a play order of the feature music segments.
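For illustration only, the optional mappings above (one second special effect element per feature music segment, with shown length, position height, and movement velocity positively related to the segment's play duration, pitch, and number of beats, generated in the segments' play order) could be sketched as follows. The structure names and linear coefficients are assumptions introduced here, not part of the disclosure; any monotonically increasing mapping would satisfy the positive correlations. The input is assumed to be a list of objects with the per-segment fields from the earlier sketch.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SecondEffectElement:
    length: float    # shown length, positively related to the segment's play duration
    height: float    # position height, positively related to the segment's pitch
    velocity: float  # movement velocity, positively related to the segment's number of beats

def build_second_elements(segments) -> List[SecondEffectElement]:
    """Create one second special effect element per feature music segment.

    `segments` is any iterable of objects exposing duration_s, pitch_hz, and
    beat_count (e.g., the MusicSegmentFeature sketch above). The coefficients
    are placeholders that only make the positive correlations explicit.
    """
    return [
        SecondEffectElement(
            length=20.0 * seg.duration_s,
            height=0.1 * seg.pitch_hz,
            velocity=5.0 * seg.beat_count,
        )
        for seg in segments  # the segments' play order is preserved
    ]
```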


Optionally, when controlling the first special effect element to move in the special effect showing interface according to the movement of the target object, the special effect element controlling module 1303 may be configured to obtain a movement trail of the target object in a target direction and control the first special effect element to move in the special effect showing interface according to the movement trail of the target object in the target direction.


Optionally, the apparatus for showing a special effect provided by the embodiments of the present disclosure further includes a target object showing module, and the target object showing module is configured to display the target object in the special effect showing interface and adjust a position of the first special effect element in the special effect showing interface according to a position of the target object in the special effect showing interface.
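As an illustrative sketch only, the two behaviors above (following the movement trail of the target object in a target direction, and anchoring the first special effect element to the target object's displayed position) can both be expressed as a per-frame position update along that direction. The coordinate convention and the smoothing factor below are assumptions introduced for the example:

```python
def update_first_element_x(first_element_x: float,
                           target_object_x: float,
                           smoothing: float = 0.3) -> float:
    """Move the first special effect element along the target (here, horizontal)
    direction toward the detected target object's position.

    `smoothing` in (0, 1] controls how closely the element follows the movement
    trail; 1.0 snaps it directly to the target object's current position.
    """
    return first_element_x + smoothing * (target_object_x - first_element_x)

# Hypothetical per-frame usage:
# element_x = update_first_element_x(element_x, detected_target_x)
```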


Optionally, the special effect showing module 1304 may be configured to trigger a first special effect when the second special effect element moves to a preset region in the special effect showing interface and contacts with the first special effect element, and trigger a second special effect when the second special effect element moves to the preset region in the special effect showing interface and is not in contact with the first special effect element, where the preset region is a region in which the first special effect element is capable of moving.
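A minimal sketch of this triggering rule, assuming one-dimensional positions and treating interval overlap as "contact" (both assumptions introduced here for illustration), might look like the following. For one second special effect element at a given instant, it checks whether the element has reached the preset region and whether it touches the first special effect element, and returns which special effect should be triggered:

```python
def overlaps(a_start: float, a_end: float, b_start: float, b_end: float) -> bool:
    """1-D interval overlap, used here as a stand-in for 'contact'."""
    return a_start < b_end and b_start < a_end

def pick_special_effect(second_start: float, second_end: float,
                        first_start: float, first_end: float,
                        region_start: float, region_end: float) -> str:
    """Decide which special effect to trigger for one second special effect element."""
    if not overlaps(second_start, second_end, region_start, region_end):
        return "none"           # the element has not reached the preset region yet
    if overlaps(second_start, second_end, first_start, first_end):
        return "first_effect"   # e.g., play the first pre-configured sound effect/animation
    return "second_effect"      # e.g., play the second pre-configured sound effect/animation
```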


Optionally, the first special effect includes playing a first pre-configured sound effect, and/or playing a first pre-configured animation. The second special effect includes playing a second pre-configured sound effect, and/or playing a second pre-configured animation.


Optionally, the first pre-configured sound effect includes the background music.


Optionally, the apparatus for showing a special effect provided by the embodiments of the present disclosure further includes a music selecting module, and the music selecting module is configured to receive a music selecting operation by a user and determine the background music for the special effect showing interface according to the music selecting operation.


The apparatus for showing a special effect provided by the embodiments of the present disclosure can implement the method for showing a special effect described in the above embodiments based on the same implementation principle, and the details are not repeated herein.


In the embodiments of the present disclosure, the music feature of the background music of the special effect showing interface is obtained, and the second special effect elements are generated based on the music feature. The second special effect elements are controlled to move in the special effect showing interface according to the music feature. The first special effect element is controlled to move in the special effect showing interface by obtaining the movement trail of the target object in the target direction. A special effect is triggered when the first special effect element and the second special effect elements satisfy the preset condition. The showing of the special effect is related to the music, and the incorporation of music elements brings a better experience for the user.


The user uses the target object to control the first special effect element to move in the special effect showing interface, so that more second special effect elements can come into contact with the first special effect element to gain scores, which further improves the game experience of the user.


The modules described above may be implemented as software components executed on one or more general-purpose processors, or as hardware for executing certain functions or a combination thereof, such as a programmable logic device and/or an application-specific integrated circuit. In some embodiments, these modules may be embodied in the form of software products that may be stored on a non-transitory storage medium. The non-transitory storage medium includes instructions, and the instructions, upon execution, cause a computer device (e.g., a personal computer, a server, a network device, a mobile terminal, etc.) to carry out the method described in the embodiments of the present disclosure. In an embodiment, the modules described above may also be implemented on a single device or distributed across a plurality of devices. The functions of these modules may be combined with one another, or each module may be further divided into a plurality of submodules.



FIG. 14 is a schematic structural diagram of an electronic device 1400 adapted to implement the embodiments of the present disclosure. The electronic device in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as a mobile phone, a notebook computer, a digital broadcasting receiver, a personal digital assistant (PDA), a portable Android device (PAD), a portable media player (PMP), a vehicle-mounted terminal (e.g., a vehicle-mounted navigation terminal), and a wearable device with a display screen, and fixed terminals such as a digital TV and a desktop computer. The electronic device illustrated in FIG. 14 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.


The electronic device includes a memory and a processor. The memory is configured to store programs for performing the method according to the method embodiments described above. The processor is configured to execute the programs stored in the memory to realize the functions of the embodiments of the present disclosure described above and/or other desired functions. The processor described here may be referred to as the processing apparatus 1401 described below. The memory may include at least one of a read-only memory (ROM) 1402, a random access memory (RAM) 1403, or a storage apparatus 1408, which will be described in detail below.


As illustrated in FIG. 14, the electronic device 1400 may include a processing apparatus (e.g., a central processing unit, a graphics processing unit, etc.) 1401, which can perform various suitable actions and processing according to the program stored on the read-only memory (ROM) 1402 or the program loaded from the storage apparatus 1408 into the random access memory (RAM) 1403. The RAM 1403 further stores various programs and data required for operations of the electronic device 1400. The processing apparatus 1401, the ROM 1402, and the RAM 1403 are interconnected by means of a bus 1404. An input/output (I/O) interface 1405 is also connected to the bus 1404.


Usually, the following apparatuses may be connected to the I/O interface 1405: an input apparatus 1406 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, and a gyroscope; an output apparatus 1407 including, for example, a liquid crystal display (LCD), a loudspeaker, and a vibrator; a storage apparatus 1408 including, for example, a magnetic tape and a hard disk; and a communication apparatus 1409. The communication apparatus 1409 may allow the electronic device 1400 to be in wireless or wired communication with other devices to exchange data. While FIG. 14 illustrates the electronic device 1400 having various apparatuses, it should be understood that not all of the illustrated apparatuses are necessarily implemented or included; more or fewer apparatuses may alternatively be implemented or included.


Particularly, according to the embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, the embodiments of the present disclosure include a computer program product, which includes a computer program carried by a computer-readable medium. The computer program includes program codes for carrying out the method shown in the flowchart. In such an embodiment, the computer program may be downloaded online through the communication apparatus 1409 and installed, or installed from the storage apparatus 1408, or installed from the ROM 1402. When the computer program is executed by the processing apparatus 1401, the functions defined in the method of the embodiments of the present disclosure are executed.


It should be noted that the computer-readable medium described above in the present disclosure may be a computer-readable signal medium or a computer-readable medium, or any combination thereof. For example, the computer-readable medium may be, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or any combination thereof. More specific examples of the computer-readable medium may include, but are not limited to, an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any appropriate combination thereof. In the present disclosure, the computer-readable medium may be any tangible medium including or storing a program that can be used by or in combination with an instruction execution system, apparatus or device. In the present disclosure, the computer-readable signal medium may include a data signal that propagates in a baseband or as a part of a carrier and carries computer-readable program codes. The data signal propagating in such a manner may take a plurality of forms, including but not limited to an electromagnetic signal, an optical signal, or any appropriate combination thereof. The computer-readable signal medium may also be any computer-readable medium other than the computer-readable medium described above, and may send, propagate or transmit a program used by or in combination with an instruction execution system, apparatus or device. The program codes included on the computer-readable medium may be transmitted by using any suitable medium, including but not limited to an electric wire, a fiber-optic cable, radio frequency (RF) and the like, or any appropriate combination thereof.


In some embodiments, a client and a server may communicate by means of any network protocol currently known or to be developed in the future, such as the HyperText Transfer Protocol (HTTP), and may be interconnected through digital data communication in any form or medium (e.g., a communication network). Examples of the communication network include a local area network (LAN), a wide area network (WAN), an internetwork (e.g., the Internet), a peer-to-peer network (e.g., an ad hoc peer-to-peer network), and any network currently known or to be developed in the future.


The above-mentioned computer-readable medium may be included in the electronic device described above, or may exist alone without being assembled with the electronic device.


The above-mentioned computer-readable medium may carry one or more programs which, when executed by the electronic device, cause the electronic device to perform the method described in the embodiments described above. In an embodiment, the above-mentioned one or more programs, when executed by the electronic device, cause the electronic device to perform: opening a special effect showing interface based on a triggering operation and turning on a video capturing apparatus, wherein a first special effect element is shown in the special effect showing interface; obtaining a music feature of background music in the special effect showing interface and a plurality of second special effect elements generated according to the music feature, showing the second special effect elements in a preset order, and controlling the second special effect elements to move in the special effect showing interface according to the music feature; identifying a target object in a video captured by the video capturing apparatus and controlling the first special effect element to move in the special effect showing interface according to a movement of the target object; and triggering a special effect when the first special effect element and the second special effect elements satisfy a preset condition.


Computer program codes for performing the operations in the present disclosure may be written in one or more programming languages or a combination thereof. The programming languages include but are not limited to object-oriented programming languages, such as Java, Smalltalk, and C++, and conventional procedural programming languages, such as C or similar programming languages. The program codes can be executed fully on a user's computer, executed partially on a user's computer, executed as an independent software package, executed partially on a user's computer and partially on a remote computer, or executed fully on a remote computer or a server. In the case that a remote computer is involved, the remote computer may be connected to the user's computer via any type of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, via the Internet by using an Internet service provider).


The flowcharts and block diagrams in the accompanying drawings illustrate system architectures, functions and operations that may be implemented by the system, method and computer program product according to the embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment or a part of code, and the module, the program segment or the part of code includes one or more executable instructions for implementing specified logic functions. It should also be noted that in some alternative implementations, the functions marked in the blocks may also occur in an order different from the order designated in the accompanying drawings. For example, two consecutive blocks may actually be executed substantially in parallel, or may sometimes be executed in a reverse order, depending on the functions involved. It should also be noted that each block in the flowcharts and/or block diagrams and combinations of the blocks in the flowcharts and/or block diagrams may be implemented by a dedicated hardware-based system for executing specified functions or operations, or may be implemented by a combination of dedicated hardware and computer instructions.


Related modules or units described in the embodiments of the present disclosure may be implemented by software, or may be implemented by hardware.


The functions described above herein may be performed at least partially by one or more hardware logic components. For example, exemplary types of hardware logic components that can be used include, without limitation, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), an application specific standard product (ASSP), a system on chip (SOC), a complex programmable logic device (CPLD), etc.


In the context of the present disclosure, a machine-readable medium may be a tangible medium that may include or store a program for use by or in combination with an instruction execution system, apparatus or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or any appropriate combination thereof. More specific examples of the machine-readable storage medium may include an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any appropriate combination thereof.


According to one or more embodiments provided by the present disclosure, a method for showing a special effect is provided, and the method includes:


opening a special effect showing interface based on a triggering operation and turning on a video capturing apparatus, wherein a first special effect element is shown in the special effect showing interface;


obtaining a music feature of background music in the special effect showing interface and a plurality of second special effect elements generated according to the music feature, showing the second special effect elements in a preset order, and controlling the second special effect elements to move in the special effect showing interface according to the music feature;


identifying a target object in a video captured by the video capturing apparatus and controlling the first special effect element to move in the special effect showing interface according to a movement of the target object; and


triggering a special effect when the first special effect element and the second special effect elements satisfy a preset condition.


Further, the music feature comprises a total number of feature music segments of the background music, and a total number of the second special effect elements corresponds to the number of the feature music segments.


Further, the music feature further comprises a play duration of each of the feature music segments; and for each second special effect element, a length of the each second special effect element shown in the special effect showing interface is positively related to a play duration of a feature music segment corresponding to the each second special effect element.


Further, the music feature further comprises a pitch of each of the feature music segments; and for each second special effect element, a position height of the each second special effect element shown in the special effect showing interface is positively related to a pitch of a feature music segment corresponding to the each second special effect element.


Further, the music feature further comprises a total number of beats of each of the feature music segments; and for each second special effect element, a movement velocity of the each second special effect element in the special effect showing interface is positively related to the number of beats of a feature music segment corresponding to the each second special effect element.


Further, the preset order comprises a play order of the feature music segments.


Further, controlling the first special effect element to move in the special effect showing interface according to the movement of the target object comprises: obtaining a movement trail of the target object in a target direction; and controlling the first special effect element to move in the special effect showing interface according to the movement trail of the target object in the target direction.


Further, after identifying the target object in the video captured by the video capturing apparatus, the method further comprises: displaying the target object in the special effect showing interface, and adjusting a position of the first special effect element in the special effect showing interface according to an initial position of the target object in the special effect showing interface.


Further, triggering the special effect when the first special effect element and the second special effect elements satisfy the preset condition comprises: triggering a first special effect when the second special effect element moves to a preset region in the special effect showing interface and contacts with the first special effect element; and triggering a second special effect when the second special effect element moves to the preset region in the special effect showing interface and is not in contact with the first special effect element, wherein the preset region is a region in which the first special effect element is capable of moving.


Further, the first special effect comprises: playing a first pre-configured sound effect, and/or playing a first pre-configured animation; and the second special effect comprises: playing a second pre-configured sound effect, and/or playing a second pre-configured animation.


Further, the first pre-configured sound effect is the background music.


Further, the method further includes: receiving a music selecting operation by a user; and determining the background music for the special effect showing interface according to the music selecting operation.


According to one or more embodiments provided by the present disclosure, an apparatus for showing a special effect is provided, and the apparatus for showing a special effect includes:


an opening module, configured to open a special effect showing interface based on a triggering operation and turn on a video capturing apparatus, wherein a first special effect element is shown in the special effect showing interface;


a special effect element showing module, configured to obtain a music feature of background music in the special effect showing interface and a plurality of second special effect elements generated according to the music feature, show the second special effect elements in a preset order, and control the second special effect elements to move in the special effect showing interface according to the music feature;


a special effect element controlling module, configured to identify a target object in a video captured by the video capturing apparatus and control the first special effect element to move in the special effect showing interface according to a movement of the target object; and


a special effect showing module, configured to trigger a special effect when the first special effect element and the second special effect elements satisfy a preset condition.


Optionally, the music feature comprises a total number of feature music segments of the background music, and a total number of the second special effect elements corresponds to the number of the feature music segments.


Optionally, the music feature further comprises a play duration of each of the feature music segments; and for each second special effect element, a length of the each second special effect element shown in the special effect showing interface is positively related to a play duration of a feature music segment corresponding to the each second special effect element.


Optionally, the music feature further comprises a pitch of each of the feature music segments; and for each second special effect element, a position height of the each second special effect element shown in the special effect showing interface is positively related to a pitch of a feature music segment corresponding to the each second special effect element.


Optionally, the music feature further comprises a total number of beats of each of the feature music segments; and for each second special effect element, a movement velocity of the each second special effect element in the special effect showing interface is positively related to the number of beats of a feature music segment corresponding to the each second special effect element.


Optionally, the preset order comprises a play order of the feature music segments.


Optionally, when controlling the first special effect element to move in the special effect showing interface according to the movement of the target object, the special effect element controlling module is configured to obtain a movement trail of the target object in a target direction and control the first special effect element to move in the special effect showing interface according to the movement trail of the target object in the target direction.


Optionally, the apparatus for showing a special effect provided by the embodiments of the present disclosure further includes a target object showing module, and the target object showing module is configured to display the target object in the special effect showing interface and adjust a position of the first special effect element in the special effect showing interface according to an initial position of the target object in the special effect showing interface.


Optionally, the special effect showing module is specifically configured to trigger a first special effect when the second special effect element moves to a preset region in the special effect showing interface and contacts with the first special effect element, and trigger a second special effect when the second special effect element moves to the preset region in the special effect showing interface and is not in contact with the first special effect element, wherein the preset region is a region in which the first special effect element is capable of moving.


Optionally, the first special effect comprises: playing a first pre-configured sound effect, and/or playing a first pre-configured animation; and the second special effect comprises: playing a second pre-configured sound effect, and/or playing a second pre-configured animation.


Optionally, the first pre-configured sound effect is the background music.


Optionally, the apparatus for showing a special effect provided by the embodiments of the present disclosure further includes a music selecting module, and the music selecting module is configured to receive a music selecting operation by a user and determine the background music for the special effect showing interface according to the music selecting operation.


According to one or more embodiments provided by the present disclosure, an electronic device is provided, and the electronic device includes: one or more processors; a memory; and one or more applications. The one or more applications are stored in the memory and configured to be executed by the one or more processors, and the one or more applications are configured to perform the above-mentioned method for showing a special effect.


According to one or more embodiments provided by the present disclosure, a computer-readable medium is provided, and the computer-readable medium is configured to store computer instructions. The computer instructions, when executed on a computer, cause the computer to perform the above-mentioned method for showing a special effect.


The foregoing are merely descriptions of the preferred embodiments of the present disclosure and the explanations of the technical principles involved. It should be understood by those skilled in the art that the scope of the disclosure involved herein is not limited to the technical solutions formed by a specific combination of the technical features described above, and shall cover other technical solutions formed by any combination of the technical features described above or equivalent features thereof without departing from the concept of the present disclosure. For example, the technical features described above may be mutually replaced with the technical features having similar functions disclosed herein (but not limited thereto) to form new technical solutions.


In addition, while operations have been described in a particular order, it shall not be construed as requiring that such operations are performed in the stated specific order or sequence. Under certain circumstances, multitasking and parallel processing may be advantageous. Similarly, while some specific implementation details are included in the above discussions, these shall not be construed as limitations to the scope of the present disclosure. Some features described in the context of a separate embodiment may also be combined in a single embodiment. Rather, various features described in the context of a single embodiment may also be implemented separately or in any appropriate sub-combination in a plurality of embodiments.


Although the present subject matter has been described in a language specific to structural features and/or logical method actions, it should be understood that the subject matter defined in the appended claims is not necessarily limited to the particular features and actions described above. Rather, the particular features and actions described above are merely exemplary forms for implementing the claims.

Claims
  • 1. A method for showing a special effect, comprising: opening a special effect showing interface based on a triggering operation and turning on a video capturing apparatus, wherein a first special effect element is shown in the special effect showing interface; obtaining a music feature of background music in the special effect showing interface and a plurality of second special effect elements generated according to the music feature, showing the second special effect elements in a preset order, and controlling the second special effect elements to move in the special effect showing interface according to the music feature; identifying a target object in a video captured by the video capturing apparatus and controlling the first special effect element to move in the special effect showing interface according to a movement of the target object; and triggering a special effect when the first special effect element and the second special effect elements satisfy a preset condition.
  • 2. The method according to claim 1, wherein the music feature comprises a total number of feature music segments of the background music, and a total number of the second special effect elements corresponds to the number of the feature music segments.
  • 3. The method according to claim 2, wherein the music feature further comprises a play duration of each of the feature music segments; and for each second special effect element, a length of the each second special effect element shown in the special effect showing interface is positively related to a play duration of a feature music segment corresponding to the each second special effect element.
  • 4. The method according to claim 2, wherein the music feature further comprises a pitch of each of the feature music segments; and for each second special effect element, a position height of the each second special effect element shown in the special effect showing interface is positively related to a pitch of a feature music segment corresponding to the each second special effect element.
  • 5. The method according to claim 2, wherein the music feature further comprises a total number of beats of each of the feature music segments; and for each second special effect element, a movement velocity of the each second special effect element in the special effect showing interface is positively related to the number of beats of a feature music segment corresponding to the each second special effect element.
  • 6. The method according to claim 2, wherein the preset order comprises a play order of the feature music segments.
  • 7. The method according to claim 1, wherein controlling the first special effect element to move in the special effect showing interface according to the movement of the target object comprises: obtaining a movement trail of the target object in a target direction; and controlling the first special effect element to move in the special effect showing interface according to the movement trail of the target object in the target direction.
  • 8. The method according to claim 1, wherein after identifying the target object in the video captured by the video capturing apparatus, the method further comprises: displaying the target object in the special effect showing interface, and adjusting a position of the first special effect element in the special effect showing interface according to an initial position of the target object in the special effect showing interface.
  • 9. The method according to claim 1, wherein triggering the special effect when the first special effect element and the second special effect elements satisfy the preset condition comprises: triggering a first special effect when the second special effect element moves to a preset region in the special effect showing interface and contacts with the first special effect element; and triggering a second special effect when the second special effect element moves to the preset region in the special effect showing interface and is not in contact with the first special effect element, wherein the preset region is a region in which the first special effect element is capable of moving.
  • 10. The method according to claim 9, wherein the first special effect comprises: playing a first pre-configured sound effect; and/or playing a first pre-configured animation; and the second special effect comprises: playing a second pre-configured sound effect; and/or playing a second pre-configured animation.
  • 11. The method according to claim 10, wherein the first pre-configured sound effect is the background music.
  • 12. The method according to claim 1, further comprising: receiving a music selecting operation by a user; and determining the background music for the special effect showing interface according to the music selecting operation.
  • 13. An apparatus for showing a special effect, comprising: an opening module, configured to open a special effect showing interface based on a triggering operation and turn on a video capturing apparatus, wherein a first special effect element is shown in the special effect showing interface; a special effect element showing module, configured to obtain a music feature of background music in the special effect showing interface and a plurality of second special effect elements generated according to the music feature, show the second special effect elements in a preset order, and control the second special effect elements to move in the special effect showing interface according to the music feature; a special effect element controlling module, configured to identify a target object in a video captured by the video capturing apparatus and control the first special effect element to move in the special effect showing interface according to a movement of the target object; and a special effect showing module, configured to trigger a special effect when the first special effect element and the second special effect elements satisfy a preset condition.
  • 14. An electronic device, comprising: one or more processors; a memory; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, and the one or more applications are configured to perform the method for showing a special effect according to claim 1.
  • 15. A computer-readable medium for storing computer instructions, wherein the computer instructions, when executed on a computer, cause the computer to perform the method for showing a special effect according to claim 1.
  • 16. The apparatus according to claim 13, wherein the music feature comprises a total number of feature music segments of the background music, and a total number of the second special effect elements corresponds to the number of the feature music segments.
  • 17. The apparatus according to claim 16, wherein the music feature further comprises a play duration of each of the feature music segments; and for each second special effect element, a length of the each second special effect element shown in the special effect showing interface is positively related to a play duration of a feature music segment corresponding to the each second special effect element.
  • 18. The apparatus according to claim 16, wherein the music feature further comprises a pitch of each of the feature music segments; and for each second special effect element, a position height of the each second special effect element shown in the special effect showing interface is positively related to a pitch of a feature music segment corresponding to the each second special effect element.
  • 19. The apparatus according to claim 16, wherein the music feature further comprises a total number of beats of each of the feature music segments; and for each second special effect element, a movement velocity of the each second special effect element in the special effect showing interface is positively related to the number of beats of a feature music segment corresponding to the each second special effect element.
  • 20. The apparatus according to claim 16, wherein the preset order comprises a play order of the feature music segments.
Priority Claims (1)
Number Date Country Kind
202010699334.8 Jul 2020 CN national
PCT Information
Filing Document Filing Date Country Kind
PCT/CN2021/096349 5/27/2021 WO