This application is a continuation of International Application No. PCT/CN2022/097104, filed on Jun. 6, 2022, which claims priority to Chinese Patent Application No. 202110657137.4, titled “INTERACTION METHOD AND APPARATUS, MEDIUM, AND ELECTRONIC DEVICE”, filed on Jun. 11, 2021 with the China National Intellectual Property Administration, both of which are incorporated herein by reference in their entireties.
The present disclosure relates to the field of computer technology, in particular to an interaction method and apparatus, a medium and an electronic device.
In the conventional mobile Internet business, content production and interaction based on content consumption products have become increasingly frequent. Through content consumption products, users can obtain news information, entertainment information, updates from friends, and the like, and express their interest in the viewed content through interactions such as likes. However, such interactions are generally uniform and do not reflect the specific application scenario of the viewed content.
A purpose of the present disclosure is to provide an interaction method and apparatus, a medium and an electronic device, to solve the aforementioned interaction problems associated with application scenarios. The specific solutions are as follows.
In a first aspect, an interaction method is provided according to an implementation of the present disclosure. The method includes:
acquiring a target object and application scenario information associated with the target object, and presenting the target object on a first interface;
generating an interactive control in a first interactive area of the first interface, based on the application scenario information, where the interactive control is associated with the application scenario information; and
changing the interactive control from a first state to a second state, in response to a triggering on the interactive control.
In an implementation, the application scenario information is determined by at least one of:
determining the application scenario information based on content information in the target object; and
determining the application scenario information based on a preset time.
In an implementation, the content information includes at least one of image information, audio information and text information.
In an implementation, the changing the interactive control from a first state to a second state includes at least one of:
changing the interactive control from a static state to a dynamic state;
changing the interactive control from a first image into a second image;
changing the interactive control from a first shape to a second shape; and
changing the interactive control from a first size to a second size.
In an implementation, the method further includes: playing audio associated with the application scenario information, in response to the triggering on the interactive control.
In an implementation, the method further includes: generating first information associated with the application scenario information, in response to a triggering on the interactive control, where the first information is displayed on a second interface, the second interface is an interactive interface on a client of a target user, and the target user is a poster of the target object.
In an implementation, the first interface further includes a second interactive area, and the method further includes:
generating second information associated with the application scenario information, in response to input information of the second interactive area; and
sending the second information to display the second information on a second interface, wherein the second interface is an interactive interface on a client of a target user, and the target user is a poster of the target object.
In an implementation, the input information includes at least one piece of preset input information associated with the application scenario information, and the second information includes preset information corresponding to the preset input information.
In a second aspect, an interaction apparatus is provided according to an implementation of the present disclosure. The apparatus includes:
an acquisition unit configured to acquire a target object and application scenario information associated with the target object, and present the target object on a first interface;
a generation unit configured to generate an interactive control in a first interactive area of the first interface based on the application scenario information, where the interactive control is associated with the application scenario information; and
a changing unit configured to change the interactive control from a first state to a second state, in response to a triggering on the interactive control.
In a third aspect, a computer readable storage medium is provided according to an implementation of the present disclosure. The computer readable storage medium stores a computer program thereon. The computer program, when executed by a processor, implements the interaction method according to any one of the above implementations.
In a fourth aspect, an electronic device is provided according to an implementation of the present disclosure. The electronic device includes: one or more processors; and a storage apparatus configured to store one or more programs; where the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the interaction method according to any one of the above implementations.
Compared with the conventional technology, the above solutions of the embodiments of the present disclosure have at least the following beneficial effects.
An interaction method and apparatus, a medium and an electronic device based on application scenario information are provided according to the present disclosure. A target object including application scenario information is acquired, and a first interactive control is generated in a first interactive area of a first interface based on the application scenario information, where the first interactive control is associated with the application scenario information. The first interactive control is changed from a first state to a second state in response to a triggering on the first interactive control.
The drawings herein are incorporated into the specification, constitute a part of the specification, and show embodiments that comply with the present disclosure. The drawings and the specification are used as a whole to explain the principle of the present disclosure. Apparently, the drawings in the following description are only some embodiments of the present disclosure, and those skilled in the art can obtain other drawings according to these drawings without creative efforts.
In order to make the purpose, technical solutions and advantages of the present disclosure clearer, the present disclosure will be further described in detail below in conjunction with the accompanying drawings. Apparently, the described embodiments are only some embodiments, rather than all embodiments, of the present disclosure. Based on the embodiments of the present disclosure, all other embodiments obtained by those skilled in the art without making creative efforts fall within the protection scope of the present disclosure.
Terms used in the embodiments of the present disclosure are for the purpose of describing specific embodiments only and are not intended to limit the present disclosure. The singular forms “a/an”, “said” and “the” used in the embodiments of the present disclosure and the appended claims are also intended to include plural forms, unless the context clearly indicates otherwise. “A plurality of” or “multiple” generally means at least two.
It should be understood that the term “and/or” used herein merely describes an association relationship between associated objects, and indicates that three relationships may exist. For example, “A and/or B” may indicate the following three cases: A exists alone, A and B exist simultaneously, and B exists alone. In addition, the character “/” herein generally indicates an “or” relationship between the contextual objects.
It should be understood that although the terms first, second, third and the like may be used to describe some features in the embodiments of the present disclosure, these features should not be limited to these terms. These terms are used only to distinguish these features from each other. For example, without departing from the scope of the embodiments of the present disclosure, a first feature may also be called a second feature; similarly, the second feature may also be called the first feature.
Depending on the context, the word “if” or the phrase “in a case” used hereinafter may be interpreted as “when” or “upon” or “in response to determining” or “in response to detecting”. Similarly, depending on the context, the phrase “if it is determined” or “if it is detected (the stated condition or event)” may be interpreted as “when it is determined” or “in response to the determination” or “when it is detected (the stated condition or event)” or “in response to detection of (the stated condition or event)”.
It should also be noted that terms such as “include” and “comprise” or any other variants thereof are intended to cover a non-exclusive inclusion. Therefore, an article or a device including a series of elements includes not only the disclosed elements but also other elements that are not clearly enumerated, or further includes inherent elements of the article or device. Unless expressly limited, the statement “including a . . . ” does not exclude the existence of other similar elements in the article or device in addition to the enumerated elements.
Optional embodiments of the present disclosure will be described in detail below in conjunction with the drawings.
An interaction method is provided according to the present disclosure, which is applied to a consumption side of the target object. The interaction method may include steps S202 to S206 as follows.
In Step S202, a target object and application scenario information associated with the target object are acquired, and the target object is presented on a first interface.
The application scenario information is information that matches the content of the target object. As an example, in the case that the content of the target object is a birthday video or picture file, the application scenario information is determined to be a birthday scenario. As another example, in the case that the content of the target object is a video or picture file related to the Mid-Autumn Festival reunion, the application scenario information is determined to be a Mid-Autumn Festival scenario.
In an optional implementation, the application scenario information is determined by: determining the application scenario information of the target object based on content information in the target object.
In an optional implementation, the content information includes image information, audio information, and/or text information. The text information includes topic information and/or title information. These types of information may be applied in combination or separately. The application scenario can be identified more accurately when multiple pieces of information are applied in combination.
In the case that the content information includes image information, a key frame in the image information may be extracted under the permission of the user, and image recognition can be performed on the key frame, such as identifying an element with a preset image tag in the image information. For example, it is identified whether an element such as a moon cake appears in the image information. When an element such as a moon cake is identified, the application scenario may be basically determined as a Mid-Autumn Festival scenario.
In the case that the content information includes audio information, the audio content in the audio information may be extracted under the permission of the user, and voice recognition can be performed on the audio content. For example, when a song or music such as “Happy Birthday to you” is identified in the audio content, the application scenario may be basically determined as a birthday scenario.
In the case that the content information includes text information, the text information may be extracted from the first text area 303 under the permission of the user, and the application scenario information may be identified based on the text information.
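For illustration only, the following Python sketch shows one non-limiting way such content-based determination may be implemented. The image tags, audio transcript and text are assumed to be produced upstream by any suitable image-recognition and voice-recognition components, and the keyword table and function names are hypothetical.

# Illustrative sketch only: the recognizer outputs and keyword table are
# assumptions, not a required implementation.
SCENARIO_KEYWORDS = {
    "birthday": ["birthday", "cake", "candle", "happy birthday to you"],
    "mid_autumn": ["mid-autumn", "moon cake", "reunion"],
    "dragon_boat": ["dragon boat", "rice dumpling"],
}

def determine_scenario(image_tags, audio_transcript, text_info):
    """Score each scenario by keyword hits across all content signals."""
    corpus = " ".join(image_tags + [audio_transcript, text_info]).lower()
    scores = {
        scenario: sum(keyword in corpus for keyword in keywords)
        for scenario, keywords in SCENARIO_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

# A video whose key frame is recognized as a cake and whose audio contains
# the song "Happy Birthday to you" is classified as a birthday scenario.
print(determine_scenario(["cake", "candle"], "happy birthday to you", "#party"))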
In an optional implementation, the application scenario information may further be determined by: determining the application scenario information of the target object based on a preset time.
The preset time includes a festival preset time determined based on a general calendar. The calendar may be acquired through, but not limited to, the communication network of the terminal, and the festival preset time may be acquired through the calendar application program of the terminal. The preset time may correspond to a traditional festival such as the Spring Festival, the Mid-Autumn Festival, National Day, the Dragon Boat Festival and the like. The preset time may further include an anniversary preset time determined based on a date inputted by user A (the poster of the target object), such as a birthday, a wedding anniversary or another specific date, which is acquired only after the active permission of the user is obtained. Based on a time tag of a traditional festival or an anniversary, the application scenario information of the target object may be automatically identified.
In an optional implementation, determining the application scenario information of the target object based on the preset time and determining the application scenario information of the target object based on the content information in the target object may be combined, to determine the application scenario information more accurately. For example, based on the combination of the image information of a cake and the time information of a birthday date, the target object is determined to be in a birthday application scenario. Other combinations will not be enumerated herein.
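Continuing the illustrative sketch above, time-based determination and its combination with content-based determination might look as follows; the festival calendar, the user-permitted date table and the combination rule are all assumptions.

import datetime

# Hypothetical table of fixed-date festivals, keyed by (month, day).
FESTIVAL_CALENDAR = {(10, 1): "national_day"}

def scenario_from_time(today, user_dates):
    """Determine a scenario from the preset time, if today matches one."""
    key = (today.month, today.day)
    return FESTIVAL_CALENDAR.get(key) or user_dates.get(key)

def combined_scenario(content_scenario, time_scenario):
    """Prefer agreement of both signals; otherwise fall back to either one."""
    if content_scenario and content_scenario == time_scenario:
        return content_scenario  # highest confidence: both signals agree
    return content_scenario or time_scenario

today = datetime.date(2021, 6, 11)
time_scenario = scenario_from_time(today, {(6, 11): "birthday"})  # permitted date
print(combined_scenario("birthday", time_scenario))  # birthday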
In Step S204, a first interactive control is generated in a first interactive area of the first interface based on the application scenario information, where the first interactive control is associated with the application scenario information.
In an optional implementation, the application scenario information includes birthday scenario information. The first interactive control includes multiple cake icons corresponding to the birthday scenario information that are selectable for displaying.

In an optional implementation, the application scenario information includes wedding anniversary scenario information. The first interactive control includes multiple heart icons corresponding to the wedding anniversary scenario information that are selectable for displaying.
In an optional implementation, the application scenario information includes Mid-Autumn Festival scenario information. The first interactive control includes moon cake icons corresponding to the Mid-Autumn Festival scenario information.
As an optional implementation, the application scenario information includes Dragon Boat Festival scenario information. The first interactive control includes a rice dumpling icon corresponding to the Dragon Boat Festival scenario information.
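For illustration only, the mapping from the identified application scenario to the first interactive control might be sketched as follows; the icon names and the control representation are hypothetical.

# Hypothetical mapping from the identified scenario to control icon(s).
SCENARIO_CONTROLS = {
    "birthday": ["cake_icon_1", "cake_icon_2", "cake_icon_3"],
    "wedding_anniversary": ["heart_icon_1", "heart_icon_2"],
    "mid_autumn": ["moon_cake_icon"],
    "dragon_boat": ["rice_dumpling_icon"],
}

def generate_interactive_control(scenario):
    """Build the first interactive control for the first interactive area."""
    icons = SCENARIO_CONTROLS.get(scenario, ["like_icon"])  # default control
    return {"area": "first_interactive_area", "icons": icons, "state": "first"}

print(generate_interactive_control("birthday"))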
The first interactive control, which is automatically generated, matches the application scenario information of the target object. Thus, the form of interaction between user B (a consumer of the target object) and user A is enriched, the interactive information expression between user B and user A at special moments is improved, the interactive enthusiasm and interest of both parties are increased, and the overall interactive activity of client users is elevated.
In Step S206, the first interactive control is changed from a first state to a second state in response to a triggering instruction on the first interactive control.
In an optional implementation, the method further includes: playing audio associated with the application scenario information, in response to the triggering on the interactive control. The process of playing the audio associated with the application scenario information may be performed at the same time as the interactive control changes from the first state to the second state, or after the interactive control has changed from the first state to the second state, which is not limited here. In addition, the process of playing the audio associated with the application scenario information and the process of changing from the first state to the second state have the same triggering condition, or may respond to different triggering conditions respectively, which will not be limited here.
In an optional implementation, when the first interactive control is a cake icon corresponding to the birthday scenario information, the second state is a flashing state.
In this embodiment, before the first interactive control is touched, the birthday cake icon is lightless, which is in a static state. When the first interactive control is touched, the candles on the birthday cake are lit, and a flashing dynamic effect is formed, to show the effect of the birthday cake more realistically, thereby expressing a real scenario of user B sending birthday wishes to user A. In addition, the flashing state may further be accompanied by the flashing halo around the birthday cake, to make the dynamic effect more obvious.
In an optional implementation, the method further includes: changing a data tag adjacent to the first interactive area from first data to second data, in response to the triggering instruction on the first interactive control.
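A non-limiting sketch of the response to a triggering instruction described above (state change, optional audio playing, and data-tag update) follows; the class structure and the audio placeholder are assumptions.

class InteractiveControl:
    """Minimal model of the first interactive control and its data tag."""

    def __init__(self, scenario):
        self.scenario = scenario
        self.state = "static"  # first state, e.g. an unlit cake icon
        self.like_count = 0    # the adjacent data tag (first data)

    def on_trigger(self):
        """Respond to a triggering instruction on the control."""
        self.state = "flashing"     # second state, e.g. lit, flashing candles
        self.like_count += 1        # data tag: first data -> second data
        self.play_scenario_audio()  # may also run after the state change

    def play_scenario_audio(self):
        # Placeholder only; a real client would play e.g. a birthday tune.
        print(f"playing audio for the {self.scenario} scenario")

control = InteractiveControl("birthday")
control.on_trigger()
print(control.state, control.like_count)  # flashing 1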
In an optional implementation, the above method may further include step S208. In S208, in response to the triggering instruction on the first interactive control, first information associated with the application scenario information is sent to the target user. The target user is a poster of the target object, such as the above user A. The first information includes interactive information and preset information corresponding to the interactive control. For example, the first information includes likes information and greeting information.
When user B sends a triggering instruction in the first interactive area, the client may automatically respond to the triggering instruction and send the first information corresponding to the triggering instruction to user A. For example, when user B clicks on the birthday cake, it indicates that user B intends to send greetings to user A. In this case, the client may automatically send the first information to user A. The first information includes two parts. One part is the interactive information corresponding to the interactive control. For example, no matter what the application scenario is, “XXX gave likes to you” or “XXX liked your work” may be displayed. The other part is the preset information associated with the application scenario, such as “Happy birthday to you”, “Happy Spring Festival to you”, “Happy Mid-Autumn Festival to you” and so on. Compared with the traditional approach of only pushing the interactive information corresponding to the interactive control, the first information pushed in this embodiment includes both the interactive information and the preset information associated with the application scenario. In this way, the interactive information expression between user B and user A at special moments can be improved, the interactive enthusiasm and interest of both parties can be enhanced, and the overall interactive activity of client users can be improved.
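For illustration only, composing the first information from its two parts might be sketched as follows; the greeting table and function name are hypothetical.

# Hypothetical preset greetings keyed by application scenario.
PRESET_GREETINGS = {
    "birthday": "Happy birthday to you",
    "spring_festival": "Happy Spring Festival to you",
    "mid_autumn": "Happy Mid-Autumn Festival to you",
}

def build_first_information(sender, scenario):
    """First information = interactive part + scenario-specific preset part."""
    interactive_info = f"{sender} gave likes to you"  # scenario-independent
    preset_info = PRESET_GREETINGS.get(scenario, "")  # scenario-dependent
    return {"interactive": interactive_info, "preset": preset_info}

print(build_first_information("user B", "birthday"))
# {'interactive': 'user B gave likes to you', 'preset': 'Happy birthday to you'}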
In an optional implementation, the first interface 300 further includes a second interactive area 304. The method further includes step S210: generating second information associated with the application scenario information, in response to acquiring input information of the second interactive area; and sending the second information to a target user, to display the second information on a second interface, where the second interface is an interactive interface of the target user, and the target user is a poster of the target object.
In an optional implementation, the input information includes at least one piece of preset input information associated with the application scenario information. The second information includes preset information corresponding to the preset input information. The preset input information may be displayed in the second interactive area 304. Optionally, the preset input information includes pattern information or text information.
In an optional implementation, when the preset input information includes pattern information, the preset information includes text information, or combination information of a text and a pattern.
Since the application scenario of the target object has been identified in advance, when information is input in the input box, preset input information may be automatically provided as a candidate input pattern, in order to improve the efficiency of inputting information and simplify the input content. As an example, in the case that the application scenario is a birthday scenario, when the cursor of the user stays in the input box, the birthday cake icon is displayed automatically. At this time, user B may easily input information simply by clicking the birthday cake icon. Correspondingly, the second information matching the input birthday cake icon may be text information or combination information of the text and the pattern.
The second information may be text information. After user B enters the birthday cake icon, the text information may be automatically matched and sent to user A. For example, “Happy Birthday to you” or other text information matching the birthday cake icon may be sent. In this implementation, the input of information may be simplified and the greeting content transmitted by text information may be accurately expressed, which improves the convenience of interaction.
The second information may also be combination information of the text and the pattern. After user B enters the birthday cake icon, the text information may be automatically matched, which is followed by pattern information. Then, the combination information of the text and the pattern is sent to user A. As an example, “Happy birthday to you+cake icon” or other combination information of the text and the pattern matching the birthday cake icon is sent. In this implementation, the input of information may be simplified and the diverse greeting content delivered by text information and icon information may be accurately expressed, thereby improving the convenience and enjoyment of interaction.
In an optional implementation, the input information includes text information, and the second information includes combination information of a text and a pattern.
In an embodiment, the input information may be text information, and user B may send more diversified greeting content. Correspondingly, the second information is the combination information of the text and the pattern. As an example, after user B enters the greeting information “sincerely wish XXX a happy birthday”, the client may automatically generate the text information, followed by the automatically matched pattern information. The combination information of the text and the pattern is sent to user A. For example, “sincerely wish XXX a happy birthday+cake icon” or other combination information of the text and the pattern that matches the birthday cake icon may be sent. In this embodiment, the diversity of input text content is increased. In addition, the icon associated with the application scenario is automatically matched, improving the diversity and enjoyment of interaction.
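A non-limiting sketch of generating the second information from input in the second interactive area follows; the icon-to-text matching rules are assumptions.

# Hypothetical matches between preset input icons and greeting text.
ICON_TO_TEXT = {"cake_icon": "Happy Birthday to you"}
SCENARIO_TO_ICON = {"birthday": "cake_icon"}

def build_second_information(scenario, input_info):
    """Generate the second information from input in the second area."""
    if input_info in ICON_TO_TEXT:
        # Pattern input: auto-match the text, then append the icon.
        return f"{ICON_TO_TEXT[input_info]} + {input_info}"
    # Text input: append an icon automatically matched to the scenario.
    icon = SCENARIO_TO_ICON.get(scenario)
    return f"{input_info} + {icon}" if icon else input_info

print(build_second_information("birthday", "cake_icon"))
# Happy Birthday to you + cake_icon
print(build_second_information("birthday", "sincerely wish XXX a happy birthday"))
# sincerely wish XXX a happy birthday + cake_icon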
In this embodiment, with the interaction method based on the application scenario information, the interaction process is associated with the application scenario information of the target object, thereby adding the interaction process in the specific application scenarios and improving the enthusiasm of both interacting parties. In addition, since the interactive information can automatically configure standard interactive content based on the application scenario information, the interactions are simpler and more convenient, to further improve the enthusiasm for the interactions.
According to another specific implementation of the present disclosure, an interaction method is provided, which is applied to the client of user A as described above. That is, the interaction method is applied to a poster side of the target object. In this embodiment, terms with the same names/meanings are explained in the same way as in the above embodiments and have the same technical effects, which will not be repeated here. The method specifically includes steps S502 to S506 as follows.
In step S502, a target object is posted, where the target object includes application scenario information.
The target object may include long video files, short video files, dynamic picture files, static picture files, text files, and the like. The target object may be produced in advance, that is, before the target object is posted, a to-be-posted target object is produced based on the target object production process of the client. In this embodiment, in order to facilitate rapid production of the target object based on the application scenario, a one-click video-making template provided by the client may be used. The one-click video-making template works as follows.
User A manually selects a to-be-posted application scenario, such as the birthday scenario, the Mid-Autumn Festival scenario, the Spring Festival scenario, etc. The application scenario may also be automatically pushed to a posting interface of user A, based on the preset time of user A.
Based on the application scenario, a specific target object is selected from the application scenario templates provided by the system. The target object may include basic content to be sent directly. For example, after selecting the birthday scenario, the target object may be a target object automatically generated in the birthday scenario. The target object may include title information and topic information which are automatically generated and match the birthday scenario. User A may also edit the title information and topic information to generate title information and topic information that meet personal requirements.
After the target object is produced, the target object may be posted by clicking “Post”.
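For illustration only, the one-click template flow on the poster side might be sketched as follows; the template contents and function names are hypothetical.

# Hypothetical scenario templates with auto-generated title and topic.
SCENARIO_TEMPLATES = {
    "birthday": {"title": "Birthday party", "topic": "#HappyBirthday"},
    "mid_autumn": {"title": "Family reunion", "topic": "#MidAutumn"},
}

def make_and_post(scenario, media, title=None, topic=None):
    """Build a target object from a scenario template, then post it."""
    template = SCENARIO_TEMPLATES[scenario]
    target_object = {
        "media": media,
        "title": title or template["title"],  # user A may edit these fields
        "topic": topic or template["topic"],
        "scenario": scenario,  # carried along as application scenario info
    }
    print("posted:", target_object)
    return target_object

make_and_post("birthday", "birthday_video.mp4")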
In step S504, a second interactive control is generated in a third interactive area of a third interface, based on the application scenario information, where the second interactive control is associated with the application scenario information.
In an optional implementation, the application scenario information includes birthday scenario information. The second interactive control includes multiple candidate cake icons which are selectable for displaying and correspond to the birthday scenario information.
In an optional implementation, the application scenario information includes wedding anniversary scenario information. The second interactive control includes multiple heart icons which are selectable for displaying and correspond to the wedding anniversary scenario information.
In an optional implementation, the application scenario information includes Mid-Autumn Festival scenario information. The second interactive control includes a moon cake icon corresponding to the Mid-Autumn Festival scenario information.
In an optional implementation, the application scenario information includes Dragon Boat Festival scenario information. The second interactive control includes a rice dumpling icon corresponding to the Dragon Boat Festival scenario information.
The second interactive control is displayed on the third interface, and the first interactive control is synchronously displayed on the first interface. Dynamic changes of the two controls are independent. When different users perform the operations on the respective terminal interfaces, the states of the corresponding interactive controls may change independently.
In Step S506, the first information is displayed on the second interface, in response to receiving the first information.
When user B sends a triggering instruction in the first interactive area, the client may automatically respond to the triggering instruction and send the first information corresponding to the triggering instruction to user A. For example, when user B lights the birthday cake, it indicates that user B intends to send greetings to user A. In this case, the client may automatically send the first information to user A. User A receives the first information, and the first information is displayed on the second interface 400.
In an optional implementation, the second interactive control is changed from a third state to a fourth state, in response to a triggering on the second interactive control. The interactive control is changed from the third state to the fourth state by at least one of: changing the interactive control from a static state to a dynamic state; changing the interactive control from a third image into a fourth image; changing the interactive control from a third shape to a fourth shape; and changing the interactive control from a third size to a fourth size.
In an optional implementation, the method further includes: playing audio associated with the application scenario information, in response to the triggering on the interactive control. The process of playing the audio associated with the application scenario information may be performed at the same time as the interactive control changes from the third state to the fourth state, or after the interactive control has changed from the third state to the fourth state, which is not limited here. In addition, the process of playing the audio associated with the application scenario information and the process of changing from the third state to the fourth state have the same triggering condition, or may respond to different triggering conditions respectively, which is not limited here.
In an optional implementation, when the second interactive control is a cake icon corresponding to the birthday scenario information, the fourth state is a flashing state.
In this embodiment, before the second interactive control is touched, the birthday cake icon is lightless, which is in a static state. When the second interactive control is touched, the candles on the birthday cake are lit, and a flashing dynamic effect is formed, to show the effect of the birthday cake more realistically, thereby expressing a real scenario of user A sending birthday wishes to himself. In addition, the flashing state may further be accompanied by the flashing halo around the birthday cake, so as to make the dynamic effect more obvious.
In an optional implementation, the method further includes: displaying the second information on a second interface, in response to receiving the second information, wherein the second information is associated with the application scenario information, the second information includes text information or combination information of a text and a pattern.
The second information may be the text information. After user B inputs the birthday cake icon, the text information may be automatically matched and sent to user A. User A receives the second information, and the received second information is displayed on the second interface 400. For example, text information such as “Happy Birthday to you” that matches the birthday cake icon is sent. In this embodiment, user B may transmit the information to user A with a simple input and accurately express the greeting content transmitted by the text information, which improves the interaction convenience.
The second information may be the combination information of the text and the pattern. After user B inputs the birthday cake icon, the text information may be automatically matched, which is followed by pattern information. The combination information of the text and the pattern is sent to user A, and the received second information is displayed on the second interface 400. For example, “Happy Birthday to you +cake icon” or other combination information of the text and the pattern matching the birthday cake icon may be displayed. In this embodiment, user B may transmit the information to user A with a simple input and accurately express the diverse greeting content delivered by text information and icon information, thereby improving the convenience and enjoyment of interaction.
In this embodiment, with the interaction method based on the application scenario information, the interaction process is associated with the application scenario information of the target object, thereby adding the interaction process in the specific application scenarios and improving the enthusiasm of both interacting parties. In addition, since the interactive information can automatically configure standard interactive content based on the application scenario information, the interactions are simpler and more convenient, to further improve the enthusiasm for the interactions.
An interaction apparatus based on application scenario information is provided according to an embodiment of the present disclosure, which is applied to a consumption side of the target object. The apparatus includes the following units.
An acquisition unit 702 is configured to acquire a target object and application scenario information associated with the target object, and present the target object on the first interface.
A first generation unit 704 is configured to generate a first interactive control in a first interactive area of the first interface based on the application scenario information, where the first interactive control is associated with the application scenario information.
A changing unit 706 is configured to change the first interactive control from a first state to a second state in response to a triggering instruction on the first interactive control.
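For illustration only, the cooperation of the three units might be sketched as follows; the unit interfaces are assumptions and do not limit the claimed structure.

class InteractionApparatus:
    """Minimal sketch composing units 702, 704 and 706."""

    def __init__(self):
        self.control = None

    def acquisition_unit(self, source):  # unit 702
        """Acquire the target object and its application scenario information."""
        return source["object"], source["scenario"]

    def generation_unit(self, scenario):  # unit 704
        """Generate the first interactive control in the first interactive area."""
        self.control = {"scenario": scenario, "state": "first"}

    def changing_unit(self):  # unit 706
        """Change the control from the first state to the second state."""
        self.control["state"] = "second"

apparatus = InteractionApparatus()
_, scenario = apparatus.acquisition_unit({"object": "video", "scenario": "birthday"})
apparatus.generation_unit(scenario)
apparatus.changing_unit()
print(apparatus.control)  # {'scenario': 'birthday', 'state': 'second'}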
In an optional implementation, the interaction apparatus further includes a determination unit configured to determine the application scenario information of the target object based on content information in the target object.
In an optional implementation, the content information includes image information, audio information, and/or text information. These types of information may be applied in combination or separately. The application scenario may be identified more accurately by applying multiple pieces of information in combination.
The determination unit is further configured to extract a key frame in the image information, and perform image recognition on the key frame, in a case that the content information includes image information.
The determination unit is further configured to extract audio content in the audio information, and perform voice recognition on the audio content, in a case that the content information includes audio information.
The determination unit is further configured to extract topic information from a topic information area, and identify the application scenario information based on the topic information, in a case that the content information includes the topic information.
The determination unit is further configured to extract title information from a title information area, and identify the application scenario information based on the title information, in a case that the content information includes the title information.
In an optional implementation, the determination unit is further configured to determine the application scenario information of the target object based on a preset time.
In an optional implementation, determining the application scenario information of the target object based on the preset time and determining the application scenario information of the target object based on the content information in the target object may be combined, to determine the application scenario information more accurately. For example, based on a combination of the image information of a cake and the time information of a birthday date, the target object may be determined to be in a birthday application scenario, which will not be enumerated herein.
In an optional implementation, the application scenario information includes birthday scenario information, and the first interactive control includes multiple cake icons corresponding to the birthday scenario information that are selectable for displaying.
In an optional implementation, the application scenario information includes wedding anniversary scenario information, and the first interactive control includes multiple heart icons corresponding to the wedding anniversary scenario information that are selectable for displaying.
In an optional implementation, the application scenario information includes Mid-Autumn Festival scenario information, and the first interactive control includes a moon cake icon corresponding to the Mid-Autumn Festival scenario information.
In an optional implementation, the application scenario information includes Dragon Boat Festival scenario information, and the first interactive control includes a rice dumpling icon corresponding to the Dragon Boat Festival scenario information.
In an optional implementation, the interactive control is changed from the first state to the second state by at least one of: changing the interactive control from a static state to a dynamic state; changing the interactive control from a first image into a second image; changing the interactive control from a first shape to a second shape; and changing the interactive control from a first size to a second size.
In an optional implementation, the changing unit 706 is further configured to play audio associated with the application scenario information in response to a triggering on the interactive control.
In an optional implementation, in a case that the first interactive control is a cake icon corresponding to the birthday scenario information, the second state is a flashing state.
In an optional implementation, the interaction apparatus further includes a changing unit configured to change a data tag adjacent to the first interactive area from first data to second data, in response to the triggering instruction on the first interactive control.
In an optional implementation, the interaction apparatus further includes a first sending unit configured to generate first information associated with the application scenario information, in response to the triggering on the interactive control, to display the first information on a second interface, where the second interface is an interactive interface of a client of a target user, the target user is a poster of the target object, and the first information includes likes information and greeting information.
In an optional implementation, the interaction apparatus further includes a second sending unit, and the first interface 300 further includes a second interactive area 304. The second sending unit is configured to generate second information associated with the application scenario information, in response to the input information in the second interactive area 304, and send the second information to display the second information on the second interface 400, where the second interface 400 is an interactive interface of a client of a target user, and the target user is a poster of the target object.
In an optional implementation, in a case that the input information includes pattern information, the second information includes text information or combination information of a text and a pattern.
In an optional implementation, in a case that the input information includes text information, the second information includes combination information of a text and a pattern.
In this embodiment, with the interaction apparatus based on the application scenario information, the interaction process is associated with the application scenario information of the target object, thereby adding the interaction process in the specific application scenarios and improving the enthusiasm of both interacting parties. In addition, since the interactive information can automatically configure standard interactive content based on the application scenario information, the interactions are simpler and more convenient, to further improve the enthusiasm for the interactions.
An interaction apparatus is provided according to another embodiment of the present disclosure, which is applied to a poster side of the target object. The apparatus includes the following units.
A posting unit 802 is configured to post a target object, where the target object includes application scenario information.

A second generation unit 804 is configured to generate a second interactive control in a third interactive area of the third interface, based on the application scenario information, where the second interactive control is associated with the application scenario information.

A display unit 806 is configured to display first information on a second interface in response to receiving the first information.
In an optional implementation, the application scenario information includes birthday scenario information, and the second interactive control includes multiple cake icons corresponding to the birthday scenario information that are selectable for displaying.

In an optional implementation, the application scenario information includes wedding anniversary scenario information, and the second interactive control includes multiple heart icons corresponding to the wedding anniversary scenario information that are selectable for displaying.
In an optional implementation, the application scenario information includes Mid-Autumn Festival scenario information, and the second interactive control includes a moon cake icon corresponding to the Mid-Autumn Festival scenario information.
In an optional implementation, the application scenario information includes Dragon Boat Festival scenario information, and the second interactive control includes a rice dumpling icon corresponding to the Dragon Boat Festival scenario information.
In an optional implementation, the second interactive control is changed from a third state to a fourth state, in response to a triggering on the second interactive control.
In an optional implementation, the display unit 806 is further configured to play audio associated with the application scenario information, in response to the triggering on the interactive control. The process of playing the audio associated with the application scenario information may be performed at the same time as the process of changing the interactive control from the third state to the fourth state, or after the interactive control has changed from the third state to the fourth state, which is not limited here. In addition, the process of playing the audio associated with the application scenario information and the process of changing from the third state to the fourth state may have the same triggering condition, or may respond to different triggering conditions respectively, which is not limited here.
In an optional implementation, in a case that the second interactive control is a cake icon corresponding to the birthday scenario information, the fourth state is a flashing state.

In an optional implementation, the display unit 806 is further configured to display second information on the second interface, in response to receiving the second information, where the second information is associated with the application scenario information, and the second information includes text information or combination information of a text and a pattern.
In this embodiment, with the interaction apparatus based on the application scenario information, the interaction process is associated with the application scenario information of the target object, thereby adding the interaction process in the specific application scenarios and improving the enthusiasm of both interacting parties. In addition, since the interactive information can automatically configure standard interactive content based on the application scenario information, the interactions are simpler and more convenient, to further improve the enthusiasm for the interactions.
A non-transitory computer storage medium is provided according to an embodiment of the present disclosure. The computer storage medium stores computer executable instructions, and the computer executable instructions may implement the steps of the method described in the above embodiments.
A schematic structural diagram of an electronic device suitable for implementing the embodiments of the present disclosure is described below. The electronic device includes a processing apparatus 901 (such as a central processing unit or a graphics processor), which may perform various appropriate actions and processing based on a program stored in a read-only memory (ROM) 902 or a program loaded from a storage apparatus 908 into a random access memory (RAM) 903. Various programs and data required for the operations of the electronic device are further stored in the RAM 903. The processing apparatus 901, the ROM 902 and the RAM 903 are connected to each other through a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
Generally, the following apparatuses may be connected to the I/O interface 905: an input apparatus 906 such as a touch screen, a touch panel, a keyboard, a mouse, a camera, a microphone, an accelerometer and a gyroscope; an output apparatus 907 such as a liquid crystal display (LCD), a loudspeaker and a vibrator; a storage apparatus 908 such as a magnetic tape and a hard disk; and a communication apparatus 909. The communication apparatus 909 may allow the electronic device to communicate with other devices in a wired or wireless manner to exchange data. Although an electronic device having various apparatuses is described, it should be understood that not all of the illustrated apparatuses are required to be implemented or included; more or fewer apparatuses may alternatively be implemented or included.
Particularly, according to the embodiments of the present disclosure, the process described above in conjunction with flowcharts may be implemented as a computer software program. For example, a computer program product is further provided according to an embodiment of the present disclosure, including a computer program carried on a computer readable medium. The computer program includes the program codes for implementing the methods as shown in the flowcharts. In the embodiment, the computer program may be downloaded and installed from the network via the communication apparatus 909, or installed from the storage apparatus 908, or installed from the ROM 902. When the computer program is executed by the processing apparatus 901, the functions defined in the methods according to the embodiments of the present disclosure are performed.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium, a computer readable storage medium or any combination thereof. The computer readable storage medium may be, but is not limited to, a system, an apparatus or a device in an electric, magnetic, optical, electromagnetic, infrared or semiconductor form, or any combination thereof. More specific examples of the computer readable storage medium may include, but are not limited to: an electric connection with one or more wires, a portable computer magnetic disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read only memory (EPROM or a flash memory), an optical fiber, a portable compact disk read only memory (CD-ROM), an optical storage device, a magnetic storage device or any appropriate combination thereof. In the present disclosure, the computer readable storage medium may be a tangible medium including or storing programs. The programs may be used by an instruction execution system, apparatus or device, or may be used in combination with the instruction execution system, apparatus or device. In the present disclosure, a computer readable signal medium may include a data signal in a baseband or propagated as a part of a carrier, and the computer readable signal medium carries computer readable program codes. The propagated data signal may be in a variety of forms, including but not limited to an electromagnetic signal, an optical signal or any appropriate combination thereof. The computer readable signal medium may further be any computer readable medium other than the computer readable storage medium. The computer readable signal medium may send, propagate or transmit programs used by the instruction execution system, apparatus or device or programs used in combination with the instruction execution system, apparatus or device. The program code embodied in the computer readable medium may be transmitted via any appropriate medium, including but not limited to an electric wire, an optical fiber, radio frequency (RF) or any appropriate combination thereof.
The above computer readable medium may be included in the electronic device above or may stand alone without being assembled into the electronic device.
In the embodiments of the present disclosure, computer program codes for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof. The programming languages include, but are not limited to, object-oriented programming languages such as Java, Smalltalk and C++, and conventional procedural programming languages such as “C” or similar programming languages. The program codes may be executed entirely on a user computer, executed as an independent software package, executed partially on the user computer and partially on a remote computer, or executed entirely on the remote computer or a server. In a case involving the remote computer, the remote computer may be connected to the user computer via any type of network, such as a local area network (LAN) or a wide area network (WAN). Alternatively, the remote computer may be connected to an external computer (for example, by means of an Internet connection provided by an Internet service provider).
The flowcharts and block diagrams in the drawings illustrate architectures, functions and operations which may be implemented by the system, method and computer program product according to the embodiments of the present disclosure. Each block in the flowcharts or the block diagram may represent a module, a program segment or part of codes including executable instruction(s) for implementing specified logic functions. It should be noted that, in some alternative implementations, the functions marked in blocks may be performed in an order different from the order shown in the drawings. For example, two blocks shown in succession may actually be executed in parallel, or sometimes may be executed in a reverse order, which depends on the functions involved. It should also be noted that each block in the block diagram and/or flow chart and a combination of the blocks in the block diagram and/or flow chart may be implemented by a dedicated hardware-based system that performs specified functions or operations, or may be realized by a combination of dedicated hardware and computer instructions.
The units mentioned in the description of the embodiments of the present disclosure may be implemented by means of software or by means of hardware. The name of a unit does not constitute a limitation on the unit itself in some cases.
Foreign Application Priority Data

Number | Date | Country | Kind
202110657137.4 | Jun. 2021 | CN | national

Related U.S. Application Data

Relation | Number | Date | Country
Parent | PCT/CN2022/097104 | Jun. 2022 | US
Child | 18533706 | | US