This application relates to the field of artificial intelligence technologies, including a method for determining a recommendation indicator of resource information.
With the rapid development of artificial intelligence (AI) technologies, intelligent recommendation systems have been widely used in various fields in recent years, for example to provide users with appropriate content in fields such as e-commerce, advertising, and social media. In related art, a recommendation system usually adopts an "ignoring" or "blacklist" processing mechanism for negative feedback information (i.e., negative feedback of a user on recommendation information). "Ignoring" means directly disregarding the negative feedback information of the user, and the blacklist processing mechanism means that recommendation information for which a user has given negative feedback will no longer be recommended to the user. Both mechanisms result in low recommendation performance of the recommendation system.
Aspects of this disclosure include a method, an apparatus, and a non-transitory computer-readable storage medium for determining one or more recommendation indicators of resource information, which can improve recommendation performance of a recommendation system.
Examples of technical solutions of this disclosure may be implemented as follows:
An aspect of this disclosure provides a method for determining recommendation indicators. A first bipartite graph is constructed based on interaction feature data that indicate interactions between a plurality of objects and a plurality of pieces of resource information. The first bipartite graph includes (i) a plurality of graph nodes and (ii) a positive connecting edge. The plurality of graph nodes includes an object node for each of the plurality of objects and a resource information node for each of the plurality of pieces of the resource information. The positive connecting edge is between a first object node of the plurality of object nodes and a first resource information node of the plurality of resource information nodes. The positive connecting edge indicates a positive feedback feature between a first object of the plurality of objects that corresponds to the first object node and a first piece of resource information of the plurality of pieces of resource information that corresponds to the first resource information node. A second bipartite graph is constructed based on the interaction feature data. The second bipartite graph includes the plurality of graph nodes and a negative connecting edge between a second object node of the plurality of object nodes and a second resource information node of the plurality of resource information nodes. The negative connecting edge indicates a negative feedback feature between a second object of the plurality of objects that corresponds to the second object node and a second piece of resource information of the plurality of pieces of resource information that corresponds to the second resource information node. Comprehensive embedding vector representations of the plurality of graph nodes are determined based on the first bipartite graph and the second bipartite graph. 
The recommendation indicator for each of the plurality of pieces of resource information with respect to each of the plurality of objects is determined based on the comprehensive embedding vector representation of the object node corresponding to the respective object and the comprehensive embedding vector representation of the resource information node corresponding to the respective piece of resource information.
An aspect of this disclosure provides an apparatus. The apparatus includes processing circuitry configured to construct a first bipartite graph based on interaction feature data that indicate interactions between a plurality of objects and a plurality of pieces of resource information. The first bipartite graph includes (i) a plurality of graph nodes and (ii) a positive connecting edge. The plurality of graph nodes includes an object node for each of the plurality of objects and a resource information node for each of the plurality of pieces of the resource information. The positive connecting edge is between a first object node of the plurality of object nodes and a first resource information node of the plurality of resource information nodes. The positive connecting edge indicates a positive feedback feature between a first object of the plurality of objects that corresponds to the first object node and a first piece of resource information of the plurality of pieces of resource information that corresponds to the first resource information node. The processing circuitry is configured to construct a second bipartite graph based on the interaction feature data. The second bipartite graph includes the plurality of graph nodes and a negative connecting edge between a second object node of the plurality of object nodes and a second resource information node of the plurality of resource information nodes. The negative connecting edge indicates a negative feedback feature between a second object of the plurality of objects that corresponds to the second object node and a second piece of resource information of the plurality of pieces of resource information that corresponds to the second resource information node. The processing circuitry is configured to determine comprehensive embedding vector representations of the plurality of graph nodes based on the first bipartite graph and the second bipartite graph. 
The processing circuitry is configured to determine a recommendation indicator for each of the plurality of pieces of resource information with respect to each of the plurality of objects based on the comprehensive embedding vector representation of the object node corresponding to the respective object and the comprehensive embedding vector representation of the resource information node corresponding to the respective piece of resource information.
An aspect of this disclosure provides a non-transitory computer-readable storage medium storing instructions which when executed by a processor cause the processor to perform any of the methods of this disclosure.
In the aspects of this disclosure, the negative feedback information of the object for the resource information is not ignored or discarded, but the first bipartite graph and the second bipartite graph are respectively constructed by using the positive feedback information and the negative feedback information, then the comprehensive embedding vector representation of each graph node in the bipartite graph is determined, and then prediction of the recommendation indicator of each piece of resource information for each object is implemented based on the comprehensive embedding vector representation of each object node and the comprehensive embedding vector representation of each resource information node, so as to recommend the resource information to the object based on the recommendation indicator. In other words, in aspects of this disclosure, the positive feedback information and the negative feedback information of the objects for the resource information are comprehensively considered, so that the recommendation indicator of the resource information for the object can be more accurately determined, thereby improving accuracy of recommendation based on the recommendation indicator, and then improving the recommendation performance of the recommendation system.
The following description provides specific details of various aspects of this disclosure, so that a person skilled in the art can understand and implement the various aspects of this disclosure. The technical solutions of this disclosure may be implemented without some of these details. In some cases, some structures or functions are not shown or described in detail in this disclosure, to prevent unnecessary description from obscuring the aspects of this disclosure. The terms used in this disclosure are to be understood in their broadest reasonable manner, even when used in combination with specific aspects of this disclosure. Further, the descriptions of the terms are provided as examples only and are not intended to limit the scope of the disclosure.
Artificial intelligence (AI) involves theories, methods, technologies, and application systems that use a digital computer or a machine controlled by a digital computer to simulate, extend, and expand human intelligence, perceive an environment, obtain knowledge, and use the knowledge to obtain an optimal result. In other words, AI is a comprehensive technology in computer science that attempts to understand the essence of intelligence and produce a new intelligent machine that can react in a manner similar to human intelligence. AI studies the design principles and implementation methods of various intelligent machines, to enable the machines to have the functions of perception, reasoning, and decision-making.
The AI technology is a comprehensive discipline covering a wide range of fields, including both hardware-level and software-level technologies. Basic AI technologies include sensors, dedicated AI chips, cloud computing, distributed storage, big data processing, operating/interaction systems, and electromechanical integration. AI software technologies mainly include major directions such as computer vision, speech processing, natural language processing, machine learning/deep learning, autonomous driving, intelligent traffic, and automatic control.
Machine learning (ML) is a multi-field interdiscipline relating to a plurality of disciplines such as probability theory, statistics, approximation theory, convex analysis, and algorithm complexity theory. ML specializes in studying how a computer simulates or implements human learning behavior to obtain new knowledge or skills and reorganize existing knowledge structures, so as to keep improving its performance. ML is the core of AI, is a basic way to make computers intelligent, and is applied to various fields of AI. ML and deep learning include technologies such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, and learning from demonstrations.
Based on this, the aspects of this disclosure provide a method and an apparatus for determining a recommendation indicator of resource information, an electronic device, a computer storage medium, and a computer program product, which can improve recommendation performance of a recommendation system. Descriptions are provided below.
One or more modules, submodules, and/or units of the apparatus can be implemented by processing circuitry, software, or a combination thereof, for example. The term module (and other similar terms such as unit, submodule, etc.) in this disclosure may refer to a software module, a hardware module, or a combination thereof. A software module (e.g., computer program) may be developed using a computer programming language and stored in memory or non-transitory computer-readable medium. The software module stored in the memory or medium is executable by a processor to thereby cause the processor to perform the operations of the module. A hardware module may be implemented using processing circuitry, including at least one processor and/or memory. Each hardware module can be implemented using one or more processors (or processors and memory). Likewise, a processor (or processors and memory) can be used to implement one or more hardware modules. Moreover, each module can be part of an overall module that includes the functionalities of the module. Modules can be combined, integrated, separated, and/or duplicated to support various applications. Also, a function being performed at a particular module can be performed at one or more other modules and/or by one or more other devices instead of or in addition to the function performed at the particular module. Further, modules can be implemented across multiple devices and/or other components local or remote to one another. Additionally, modules can be moved from one device and added to another device, and/or can be included in both devices.
The use of “at least one of” or “one of” in the disclosure is intended to include any one or a combination of the recited elements. For example, references to at least one of A, B, or C; at least one of A, B, and C; at least one of A, B, and/or C; and at least one of A to C are intended to include only A, only B, only C or any combination thereof. References to one of A or B and one of A and B are intended to include A or B or (A and B). The use of “one of” does not preclude any combination of the recited elements when applicable, such as when the elements are not mutually exclusive.
In an implementation scenario, the terminal may be equipped with a client supporting information recommendation, a user (for example, an operation and maintenance person of a recommendation system) may trigger an information recommendation instruction for resource information on the client of the terminal, and the terminal transmits an information recommendation request to the server in response to the information recommendation instruction. The server is configured to: receive the information recommendation request transmitted by the terminal, and construct a first bipartite graph based on interactive feature data of a plurality of objects for a plurality of pieces of resource information in response to the information recommendation request, the first bipartite graph including a plurality of graph nodes and at least one positive connecting edge, the plurality of graph nodes including an object node of each object and a resource information node of each piece of the resource information, and the positive connecting edge indicating a positive feedback feature of an object of a connected object node of the positive connecting edge for resource information of a connected resource information node of the positive connecting edge; construct a second bipartite graph based on the interactive feature data, the second bipartite graph including a plurality of graph nodes and at least one negative connecting edge, the negative connecting edge indicating a negative feedback feature of an object of a connected object node of the negative connecting edge for resource information of a connected resource information node of the negative connecting edge; determine a comprehensive embedding vector representation of each graph node based on the first bipartite graph and the second bipartite graph; determine a recommendation indicator of each piece of the resource information for each object based on a comprehensive embedding vector representation of each object node and a 
comprehensive embedding vector representation of each resource information node; and recommend information of each piece of resource information to each object based on the recommendation indicator of each piece of resource information for each object.
The method for determining a recommendation indicator of resource information provided in the aspects of this disclosure may be implemented by various electronic devices. For example, the method may be implemented by a terminal alone, may be implemented by a server alone, or may be implemented by the terminal and the server collaboratively. The method for determining a recommendation indicator of resource information provided in the aspects of this disclosure may be applied to various scenarios, including but not limited to a cloud technology, AI, intelligent traffic, assisted driving, a game, an audio/video, and the like.
All operations in the method for determining a recommendation indicator of resource information described in the aspects of this disclosure may be performed by the server, or may be performed by the terminal. Alternatively, some operations in the method for determining a recommendation indicator of resource information are performed by the server, and some other operations are performed by the terminal. In other words, which operations in the method for determining a recommendation indicator of resource information are performed by the server and which operations are performed by the terminal are not limited in the aspects of this disclosure.
For the sake of simplicity, detailed description is provided below by using an example in which the method for determining a recommendation indicator of resource information is performed by a server.
S210: Construct a first bipartite graph based on interactive feature data of a plurality of objects for a plurality of pieces of resource information, the first bipartite graph including a plurality of graph nodes and at least one positive connecting edge, the plurality of graph nodes including an object node of each object and a resource information node of each piece of the resource information, and the positive connecting edge indicating a positive feedback feature of an object of a connected object node of the positive connecting edge for resource information of a connected resource information node of the positive connecting edge. For example, a first bipartite graph is constructed based on interaction feature data that indicate interactions between a plurality of objects and a plurality of pieces of resource information. The first bipartite graph includes (i) a plurality of graph nodes and (ii) a positive connecting edge. The plurality of graph nodes includes an object node for each of the plurality of objects and a resource information node for each of the plurality of pieces of the resource information. The positive connecting edge is between a first object node of the plurality of object nodes and a first resource information node of the plurality of resource information nodes. The positive connecting edge indicates a positive feedback feature between a first object of the plurality of objects that corresponds to the first object node and a first piece of resource information of the plurality of pieces of resource information that corresponds to the first resource information node.
S220: Construct a second bipartite graph based on the interactive feature data, the second bipartite graph including a plurality of graph nodes and at least one negative connecting edge, the negative connecting edge indicating a negative feedback feature of an object of a connected object node of the negative connecting edge for resource information of a connected resource information node of the negative connecting edge. For example, a second bipartite graph is constructed based on the interaction feature data. The second bipartite graph includes the plurality of graph nodes and a negative connecting edge between a second object node of the plurality of object nodes and a second resource information node of the plurality of resource information nodes. The negative connecting edge indicates a negative feedback feature between a second object of the plurality of objects that corresponds to the second object node and a second piece of resource information of the plurality of pieces of resource information that corresponds to the second resource information node.
S230: Determine a comprehensive embedding vector representation of each graph node based on the first bipartite graph and the second bipartite graph. For example, comprehensive embedding vector representations of the plurality of graph nodes are determined based on the first bipartite graph and the second bipartite graph.
S240: Determine a recommendation indicator of each piece of the resource information for each object based on a comprehensive embedding vector representation of each object node and a comprehensive embedding vector representation of each resource information node. For example, the recommendation indicator for each of the plurality of pieces of resource information with respect to each of the plurality of objects is determined based on the comprehensive embedding vector representation of the object node corresponding to the respective object and the comprehensive embedding vector representation of the resource information node corresponding to the respective piece of resource information.
The “object” mentioned herein may be a user of a terminal device, and the “resource information” may be a to-be-recommended object that has an association relationship with the user. For example, the “resource information” includes, but is not limited to, a to-be-recommended object that has been recommended to the user, a to-be-recommended object that has been clicked/tapped by the user, or a to-be-recommended object that the user clicks/taps and is intended to obtain. The resource information includes, but is not limited to, information such as a commodity, an article, a video, and an audio. The interactive feature data herein refer to data of interactive behaviors (for example, historical interactive behaviors) of a plurality of objects for a plurality of pieces of resource information. The interactive behaviors include, but are not limited to, clicking/tapping, giving a review, giving a like, sharing, obtaining or downloading based on resource information, and the like. Based on this, in S210 and S220, the first bipartite graph and the second bipartite graph are constructed based on the interactive feature data of the plurality of objects for the plurality of pieces of resource information. Herein, the first bipartite graph includes a plurality of graph nodes and at least one positive connecting edge, and the second bipartite graph also includes the plurality of graph nodes. The second bipartite graph further includes at least one negative connecting edge. The plurality of graph nodes include the object node of each object and the resource information node of each piece of resource information. The positive connecting edge indicates a positive feedback feature of the object of the connected object node of the positive connecting edge for the resource information of the connected resource information node of the positive connecting edge.
The negative connecting edge indicates a negative feedback feature of the object of the connected object node of the negative connecting edge for the resource information of the connected resource information node of the negative connecting edge.
The “positive feedback feature” mentioned herein refers to a positive feedback feature generated by an object toward resource information. For example, the object receives a certain piece of resource information, and then clicks/taps a link related to the resource information, or performs an operation of downloading or obtaining, or gives feedback information such as a favorable review, a like, or a high score. This type of data is the positive feedback feature. The “negative feedback feature” mentioned herein refers to a negative feedback feature of the object toward the resource information. For example, after the resource information is exposed to the object, the object does not click/tap a link related to the resource information, or skips the resource information, or even gives feedback information such as a bad review or a low score. This type of data is the negative feedback feature.
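The classification of interactive behaviors into positive and negative feedback described above can be sketched as follows. This is an illustrative sketch only, not part of the claimed method: the record format (object_id, resource_id, feedback) and the sign convention for feedback are assumptions made for the example.

```python
def build_bipartite_edges(interactions):
    """Split interaction records into positive and negative connecting edges.

    Each record is assumed to be (object_id, resource_id, feedback), where
    feedback > 0 denotes a positive feedback feature (e.g., a click or a like)
    and feedback < 0 denotes a negative feedback feature (e.g., a skip or a
    bad review).
    """
    positive_edges, negative_edges = [], []
    for object_id, resource_id, feedback in interactions:
        if feedback > 0:
            positive_edges.append((object_id, resource_id))
        elif feedback < 0:
            negative_edges.append((object_id, resource_id))
    return positive_edges, negative_edges


interactions = [("u1", "r1", 1), ("u1", "r2", -1), ("u2", "r1", 1)]
pos_edges, neg_edges = build_bipartite_edges(interactions)
# pos_edges -> [("u1", "r1"), ("u2", "r1")]; neg_edges -> [("u1", "r2")]
```

The positive edge list then defines the connecting edges of the first bipartite graph, and the negative edge list defines those of the second bipartite graph.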
The “bipartite graph” mentioned herein is a special graph structure. A vertex set in the graph structure may be divided into two subsets that do not intersect each other, and the two vertices attached to each edge in the graph structure belong to different subsets. In other words, no two vertices within the same subset are adjacent.
The bipartite graph mentioned herein may be represented by a corresponding adjacency matrix. The adjacency matrix is an N*N matrix, N being a quantity of graph nodes in the bipartite graph. The adjacency matrix is difficult to use directly as a feature space of a large graph. Therefore, graph node attributes need to be transformed into a vector with a smaller dimension. The vector is also referred to as an embedding vector representation. In the foregoing aspects of this disclosure, the first bipartite graph and the second bipartite graph include the same node information. In other words, the first bipartite graph and the second bipartite graph include the same object nodes and resource information nodes. The difference between the first bipartite graph and the second bipartite graph is that their edges represent different relationships. An edge between an object node and a resource information node in the first bipartite graph is defined as a positive connecting edge, which is configured to indicate the positive feedback feature of the object of the connected object node of the positive connecting edge for the resource information of the connected resource information node of the positive connecting edge. An edge between an object node and a resource information node in the second bipartite graph is defined as a negative connecting edge, which is configured to indicate the negative feedback feature of the object of the connected object node of the negative connecting edge for the resource information of the connected resource information node of the negative connecting edge.
After the first bipartite graph and the second bipartite graph are constructed based on the interactive feature data, in S230, the comprehensive embedding vector representation of each graph node is determined based on the first bipartite graph and the second bipartite graph, namely, the comprehensive embedding vector representation of each object node and the comprehensive embedding vector representation of each resource information node. Then in S240, a recommendation indicator of each piece of the resource information for each object is determined based on the comprehensive embedding vector representation of each object node and the comprehensive embedding vector representation of each resource information node. The “recommendation indicator” herein refers to an indicator and a corresponding indicator value. A recommendation indicator of a piece of resource information for an object is configured for measuring a recommendation possibility (or a recommendation probability) of recommending the resource information to the object. For example, the indicator value of the recommendation indicator may be characterized by a recommendation score value of recommending the resource information to the object (the recommendation score value is positively correlated with the recommendation possibility). In this way, the resource information may be recommended to the object through the recommendation score value of each piece of resource information for the object (for example, the resource information whose recommendation score value is higher than a score threshold is recommended to the object, or a certain amount of resource information with the recommendation score value ranking high in descending order is recommended to the object). 
In an example, the recommendation indicator includes, but is not limited to a degree of interest of the object for the resource information (which may be characterized by a recommendation score value) and an interaction probability of the object for the resource information (which may be characterized by the recommendation score value such as a clicking/tapping probability, a review probability, a sharing probability, a high scoring probability with a score higher than a score threshold, or a liking probability).
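As a minimal sketch of the scoring and selection described above: one common choice, assumed here for illustration only (the disclosure does not fix a particular scoring function), is to take the recommendation score value as the inner product of the two comprehensive embedding vector representations and recommend the resource information whose score exceeds a threshold.

```python
import numpy as np


def recommend(object_embedding, resource_embeddings, score_threshold):
    """Return indices of resource information whose recommendation score
    value for the object exceeds the score threshold.

    The score is assumed to be the inner product of the object's and the
    resource's comprehensive embedding vector representations.
    """
    scores = resource_embeddings @ object_embedding  # one score per resource
    return [i for i, s in enumerate(scores) if s > score_threshold]


object_embedding = np.array([1.0, 0.0])
resource_embeddings = np.array([[0.9, 0.1],
                                [0.2, 0.8],
                                [0.7, 0.3]])
recommended = recommend(object_embedding, resource_embeddings, 0.5)
# recommended -> [0, 2]
```

A top-k variant (recommending the highest-ranked resources in descending score order) follows the same pattern with `np.argsort` instead of a threshold.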
Through application of the foregoing aspects, the negative feedback information of the object for the resource information is not ignored or discarded, but the first bipartite graph and the second bipartite graph are respectively constructed by using the positive feedback information and the negative feedback information, then the comprehensive embedding vector representation of each graph node in the bipartite graph is determined by using a graph neural network model, and then prediction of the recommendation indicator of each piece of resource information for each object is implemented based on the comprehensive embedding vector representation of each object node and the comprehensive embedding vector representation of each resource information node, so as to recommend the resource information to the object based on the recommendation indicator. In other words, in the aspects of this disclosure, the positive feedback information and the negative feedback information of the objects for the resource information are comprehensively considered, so that the recommendation indicator of the resource information for the object can be more accurately determined, thereby improving accuracy of recommendation based on the recommendation indicator, and then improving the recommendation performance of the recommendation system and user experience of receiving recommended resource information.
In some aspects, S230 described above may be implemented through the following operations. S2301: Determine a first embedding vector representation and a second embedding vector representation of each graph node based on the first bipartite graph and the second bipartite graph. S2302: Splice, for each graph node, the first embedding vector representation and the second embedding vector representation of the graph node, to obtain a comprehensive embedding vector representation of the graph node.
Herein, the splicing process in S2302 may be addition splicing, multiplication splicing, or the like. S2301 may be implemented by a pre-trained graph neural network model. To be specific, the graph neural network model is invoked based on the first bipartite graph and the second bipartite graph, to determine the first embedding vector representation and the second embedding vector representation of each graph node. In some aspects, the graph neural network model includes a first graph neural network submodel and a second graph neural network submodel. Based on this, as shown in
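The splicing variants named above (and plain concatenation) can be sketched as follows; this is an illustrative example, and the mode names are assumptions for the sketch rather than terms of the disclosure.

```python
import numpy as np


def splice(z_first, z_second, mode="concat"):
    """Combine the first and second embedding vector representations of a
    graph node into its comprehensive embedding vector representation."""
    if mode == "concat":   # concatenation splicing
        return np.concatenate([z_first, z_second])
    if mode == "add":      # addition splicing
        return z_first + z_second
    if mode == "mul":      # multiplication (element-wise) splicing
        return z_first * z_second
    raise ValueError(f"unknown splicing mode: {mode}")


z_first = np.array([1.0, 2.0])
z_second = np.array([3.0, 4.0])
# concat -> [1, 2, 3, 4]; add -> [4, 6]; mul -> [3, 8]
```

Note that concatenation doubles the embedding dimension, while addition and multiplication splicing preserve it.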
The first graph neural network submodel and the second graph neural network submodel may include any appropriate graph convolutional neural network. In some aspects, the first graph neural network submodel may be based on light graph convolution (LGC). For example, the first graph neural network submodel may include K (K is an integer greater than 1) graph convolutional layers. Each graph convolutional layer is configured to generate a first intermediate embedding vector representation. Based on this, as shown in
Herein, the first initial embedding vector representation in S510 may be preset. In S520, an input to the first graph convolutional layer is the first adjacency matrix and the first initial embedding vector representation, and an output thereof is the first intermediate embedding vector representation of the first graph convolutional layer. In S530, an input to the kth graph convolutional layer is an output of the (k−1)th graph convolutional layer and the first adjacency matrix, and an output thereof is the first intermediate embedding vector representation of the kth graph convolutional layer. In S540, the first intermediate embedding vector representation outputted by each of the K graph convolutional layers can be obtained by traversing k, that is, K first intermediate embedding vector representations are obtained. In S550, the average value of the K first intermediate embedding vector representations outputted by the K graph convolutional layers and the first initial embedding vector representation is calculated to obtain the first embedding vector representation of the graph node. In this way, a feature expression capability of the obtained first embedding vector representation can be improved through feature extraction by the multiple graph convolutional layers, thereby improving accuracy of a recommendation indicator determined based on the first embedding vector representation.
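The S510-S550 pipeline can be sketched as follows, under the assumption (made for this example) that each graph convolutional layer multiplies the previous layer's output by a normalized first adjacency matrix, in the style of light graph convolution:

```python
import numpy as np


def first_embedding(A_norm, E0, K):
    """Sketch of S510-S550: propagate the first initial embedding E0 through
    K graph convolutional layers and average the initial embedding together
    with the K first intermediate representations (K + 1 terms in total)."""
    layer_outputs = []
    E = E0
    for _ in range(K):
        E = A_norm @ E               # kth first intermediate representation
        layer_outputs.append(E)
    return (E0 + sum(layer_outputs)) / (K + 1)


A_norm = np.array([[0.0, 1.0],
                   [1.0, 0.0]])      # toy normalized first adjacency matrix
E0 = np.array([[1.0],
               [3.0]])               # first initial embedding, one row per node
E_first = first_embedding(A_norm, E0, K=2)
# E_first -> [[5/3], [7/3]]
```

Averaging rather than keeping only the last layer's output retains both low-order and high-order neighborhood information in the first embedding vector representation.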
As described above, the first bipartite graph includes a positive connecting edge indicating a positive feedback feature of an object corresponding to an object node for resource information corresponding to a resource information node. Therefore, the first adjacency matrix of the first bipartite graph may also be referred to as a positive adjacency matrix. The positive adjacency matrix A+ may be expressed as A+ = [[0, R+], [(R+)T, 0]],
where R+ is an object-resource information interaction matrix including the positive feedback feature of the object for the resource information, R+ is an M*N matrix, M is a quantity of object nodes, and N is a quantity of resource information nodes. If an object interacts with a certain piece of resource information, a corresponding element in the matrix R+ is 1; otherwise, the element is 0. Similarly, the adjacency matrix of the second bipartite graph may be referred to as a negative adjacency matrix. The negative adjacency matrix A− may be expressed as A− = [[0, R−], [(R−)T, 0]],
where R− is an object-resource information interaction matrix including a negative feedback feature of the object for the resource information, R− is an M*N matrix, M is the quantity of object nodes, and N is the quantity of resource information nodes.
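Assembling the block adjacency matrix of a bipartite graph from an M*N interaction matrix R may be sketched as follows. The [[0, R], [RT, 0]] block layout is the standard construction for a bipartite graph; the function name and toy data below are illustrative assumptions.

```python
import numpy as np

def bipartite_adjacency(r):
    """Build the (M+N) x (M+N) block adjacency [[0, R], [R^T, 0]] of a
    bipartite graph from an M x N object-resource interaction matrix R,
    where element (u, i) of R is 1 if object u interacted with
    resource i and 0 otherwise."""
    m, n = r.shape
    top = np.hstack([np.zeros((m, m)), r])        # object rows
    bottom = np.hstack([r.T, np.zeros((n, n))])   # resource rows
    return np.vstack([top, bottom])

# R+ for 2 objects and 3 pieces of resource information.
r_pos = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0]])
a_pos = bipartite_adjacency(r_pos)   # positive adjacency A+
```

The same helper applied to R− yields the negative adjacency A−; the resulting matrix is symmetric because every connecting edge is undirected.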
In some aspects, a convolution operation of each graph convolutional layer in the foregoing first graph neural network submodel may be expressed as the following formula: E(k) = (D−1/2A+D−1/2)E(k−1), where E(k) is the first intermediate embedding vector representation outputted by the kth graph convolutional layer, E(0) is the first initial embedding vector representation, and D is a degree matrix of the positive adjacency matrix A+.
In an example,
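Light graph convolution commonly weights the adjacency matrix by the symmetric normalization D−1/2AD−1/2 before propagation. A sketch of that normalization, under the assumption that the first graph neural network submodel uses it, is:

```python
import numpy as np

def sym_normalize(a):
    """Symmetrically normalize an adjacency matrix: D^{-1/2} A D^{-1/2},
    where D is the diagonal degree matrix of A. Isolated nodes
    (degree 0) are mapped to all-zero rows rather than dividing by 0."""
    deg = a.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    return d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]

# 3-node star graph: node 0 connected to nodes 1 and 2.
a = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])
a_norm = sym_normalize(a)
```

Each edge (u, i) receives the weight 1/sqrt(deg(u) * deg(i)), so the edge from the degree-2 hub to a leaf is scaled to 1/sqrt(2).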
In some aspects, a second graph neural network submodel includes a multilayer perceptron (MLP). As shown in
Herein, in S710, the second bipartite graph includes a negative connecting edge indicating a negative feedback feature of an object corresponding to an object node for resource information corresponding to a resource information node. Therefore, an adjacency matrix of the second bipartite graph may be referred to as a negative adjacency matrix. The negative adjacency matrix A− may be expressed as A− = [[0, R−], [(R−)T, 0]],
where R− is an object-resource information interaction matrix including the negative feedback feature of the object for the resource information, R− is an M*N matrix, M is a quantity of object nodes, and N is a quantity of resource information nodes. In S720, the second embedding vector representation Z− of each graph node may be obtained through the following formula:
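Because the exact MLP formula is not reproduced here, the following is only a hypothetical sketch of one plausible reading of S720, in which the MLP maps the aggregation A−Z+ of each node's negative neighborhood to the second embedding vector representation. The layer sizes, weights, and activation function are all assumptions for illustration.

```python
import numpy as np

def mlp_negative_embeddings(a_neg, z_pos, w1, b1, w2, b2):
    """Hypothetical sketch of S720: aggregate each node's negative
    neighborhood (A- @ Z+) and pass it through a two-layer perceptron
    to obtain the second embedding vector representation Z-.
    The actual formula in the disclosure may differ."""
    h = np.tanh(a_neg @ z_pos @ w1 + b1)   # hidden layer (assumed tanh)
    return h @ w2 + b2                     # output layer -> Z-

rng = np.random.default_rng(1)
a_neg = rng.integers(0, 2, size=(5, 5)).astype(float)  # toy A-
z_pos = rng.standard_normal((5, 8))                    # first embeddings Z+
w1, b1 = rng.standard_normal((8, 16)), np.zeros(16)
w2, b2 = rng.standard_normal((16, 8)), np.zeros(8)
z_neg = mlp_negative_embeddings(a_neg, z_pos, w1, b1, w2, b2)
```

In practice the weights w1, b1, w2, b2 would be learned during training of the graph neural network model rather than drawn at random.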
In this way, a comprehensive embedding vector representation of each graph node may be obtained based on the obtained first embedding vector representation and second embedding vector representation of the graph node. In other words, the first embedding vector representation Z+ and the second embedding vector representation Z− of each graph node are spliced to obtain the comprehensive embedding vector representation Z of the graph node, i.e., Z = Z+∥Z−. A recommendation indicator may then be predicted based on the comprehensive embedding vector representation of each graph node. Herein, because the two embedding vector representations of the graph node are obtained in different manners, splicing them can improve a feature expression effect of the graph node, thereby improving accuracy of the recommendation indicator determined based on the comprehensive embedding vector representation, and thus improving recommendation performance of a recommendation system.
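The splicing Z = Z+∥Z− described above can be sketched as a per-node concatenation of the two representations; the matrix sizes below are illustrative.

```python
import numpy as np

# Splicing: concatenate the first and second embedding vector
# representations of each graph node along the feature axis, Z = Z+ || Z-.
rng = np.random.default_rng(2)
z_pos = rng.standard_normal((5, 8))   # first embeddings Z+ (5 nodes)
z_neg = rng.standard_normal((5, 8))   # second embeddings Z-
z = np.concatenate([z_pos, z_neg], axis=1)  # comprehensive embeddings Z
```

Each node's comprehensive representation therefore has twice the dimensionality, with the first half carrying the positive-feedback features and the second half the negative-feedback features.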
As shown in
Still referring to
In some aspects, the foregoing graph neural network model is obtained by training an initial graph neural network model. A training process of the initial graph neural network model includes: training the initial graph neural network model by using the positive connecting edge of the first bipartite graph and the negative connecting edge of the second bipartite graph, until a loss function of the initial graph neural network model converges to a minimum value, to obtain the trained graph neural network model. In the foregoing aspects, the graph neural network model includes the first graph neural network submodel and the second graph neural network submodel. The first graph neural network submodel and the second graph neural network submodel may be treated as a whole and trained simultaneously.
In some aspects, a loss function of the graph neural network model includes at least one of a Bayesian personalized ranking (BPR) loss function and a cosine loss function. In an example, a loss function L of the graph neural network model may include a BPR loss function Lbpr and a cosine loss function Lcos. The loss function L for the graph neural network model may be expressed as the following formula: L = Lbpr + λ1Lcos,
where λ1 is a hyper-parameter configured for adjusting impact of the different loss functions. Herein, the BPR loss function Lbpr is Lbpr = −log σ(yu,i − yu,j), where σ is the sigmoid function, yu,i = zuTzi, zu is a comprehensive embedding vector representation of an object node u, and zi is a comprehensive embedding vector representation of a resource information node i of resource information interacting with an object corresponding to the object node u. Similarly, yu,j = zuTzj, where zj is a comprehensive embedding vector representation of a resource information node j of resource information without interaction with the object corresponding to the object node u.
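The BPR loss for a single (object, interacted resource, non-interacted resource) triple may be sketched as follows; the embedding vectors are toy values chosen only so that the interacted resource scores higher than the non-interacted one.

```python
import numpy as np

def bpr_loss(z_u, z_i, z_j):
    """BPR loss -log sigmoid(y_ui - y_uj) for one training triple,
    where y_ui = z_u^T z_i is the inner product of the comprehensive
    embeddings of object u and interacted resource i, and y_uj is the
    same score for a non-interacted resource j."""
    y_ui = z_u @ z_i
    y_uj = z_u @ z_j
    return -np.log(1.0 / (1.0 + np.exp(-(y_ui - y_uj))))

z_u = np.array([1.0, 0.0])   # object node embedding
z_i = np.array([2.0, 0.0])   # interacted resource: score 2.0
z_j = np.array([0.5, 0.0])   # non-interacted resource: score 0.5
loss = bpr_loss(z_u, z_i, z_j)
```

Minimizing this loss pushes the score of the interacted resource above that of the non-interacted one; swapping i and j yields a larger loss, as the assertion below confirms.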
Herein, the cosine loss function Lcos may be configured for predicting a positive feedback or a negative feedback of the object to the resource information. For the object node u and the resource information node i, the comprehensive embedding vector representation zu corresponding to the object node and the comprehensive embedding vector representation zi corresponding to the resource information node may be obtained, so that a cosine of an included angle between zu and zi may be calculated. The cosine loss function Lcos may be expressed as:
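The cosine of the included angle between zu and zi may be computed as follows. This is an illustrative sketch of only the cosine itself; the exact form of Lcos built on this cosine is not reproduced here.

```python
import numpy as np

def cosine(z_u, z_i):
    """Cosine of the included angle between the comprehensive embedding
    vector representations of an object node and a resource node."""
    return float(z_u @ z_i / (np.linalg.norm(z_u) * np.linalg.norm(z_i)))

z_u = np.array([1.0, 0.0])
z_i = np.array([1.0, 1.0])   # 45-degree angle to z_u
c = cosine(z_u, z_i)
```

The value lies in [−1, 1]; a value near 1 indicates aligned embeddings (suggesting positive feedback), while a value near −1 indicates opposed embeddings.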
In some other aspects, the loss function L for the graph neural network model further includes a regularization term Lreg. To be specific, the loss function L may be expressed as:
In an example, the method for determining a recommendation indicator of resource information provided in the aspects of this disclosure may be applied to a resource information recommendation system. To verify the effect of the method, the graph neural network model adopted in the aspects of this disclosure and other models adopted in related resource information recommendation systems are tested on four different datasets (a dataset 1 to a dataset 4). The models adopted in the related resource information recommendation systems include a BPR matrix factorization (BPRMF) network, a neural matrix factorization (NeuMF) network, a neural graph collaborative filtering (NGCF) network, a disentangled graph collaborative filtering (DGCF) network, a light graph convolution network (LightGCN), and a sinusoidal representation network (SiReN). Measurement indicators include the accuracy rate (P@10, P@15, P@20), the recall rate (R@10, R@15, R@20), and the normalized discounted cumulative gain (nDCG@10, nDCG@15, nDCG@20) of the top ten, fifteen, and twenty pieces of recommended resource information, respectively.
An aspect of this disclosure further provides an apparatus for determining a recommendation indicator of resource information. As shown in
In some aspects, the embedding vector representation determination unit 1100b is further configured to determine a first embedding vector representation and a second embedding vector representation of each graph node based on the first bipartite graph and the second bipartite graph; and splice, for each graph node, the first embedding vector representation and the second embedding vector representation of the graph node, to obtain a comprehensive embedding vector representation of the graph node.
In some aspects, the embedding vector representation determination unit 1100b is further configured to invoke, based on the first bipartite graph and the second bipartite graph, a graph neural network model to determine the first embedding vector representation and the second embedding vector representation of each graph node.
In some aspects, the graph neural network model includes a first graph neural network submodel and a second graph neural network submodel. The embedding vector representation determination unit 1100b is further configured to invoke, based on the first bipartite graph, the first graph neural network submodel to determine the first embedding vector representation of each graph node; and invoke, based on the second bipartite graph and the first embedding vector representation of each graph node, the second graph neural network submodel to determine the second embedding vector representation of each graph node.
In some aspects, the first graph neural network submodel includes K graph convolutional layers. The embedding vector representation determination unit 1100b is further configured to perform the following processing for each graph node: obtaining a first adjacency matrix of the first bipartite graph and a first initial embedding vector representation of the graph node; invoking a first graph convolutional layer of the K graph convolutional layers, and outputting a first intermediate embedding vector representation of the graph node in the first graph convolutional layer based on the first adjacency matrix and the first initial embedding vector representation; invoking a kth graph convolutional layer of the K graph convolutional layers, and outputting a first intermediate embedding vector representation of the graph node in the kth graph convolutional layer based on the first adjacency matrix and a first intermediate embedding vector representation of the graph node in a (k−1)th graph convolutional layer; traversing k to obtain the first intermediate embedding vector representation of the graph node in each of the graph convolutional layers, K and k both being integers greater than 1, and k being less than or equal to K; and determining an average value of K first intermediate embedding vector representations outputted by the K graph convolutional layers and the first initial embedding vector representation, and using the average value as the first embedding vector representation of the graph node.
In some aspects, the second graph neural network submodel includes an MLP. The embedding vector representation determination unit 1100b is further configured to obtain a second adjacency matrix of the second bipartite graph; and invoke the MLP to determine the second embedding vector representation of each graph node based on the second adjacency matrix and the first embedding vector representation of each graph node.
In some aspects, the embedding vector representation determination unit 1100b is further configured to train the initial graph neural network model by using the positive connecting edge of the first bipartite graph and the negative connecting edge of the second bipartite graph, until a loss function of the initial graph neural network model reaches a minimum value.
In some aspects, the recommendation indicator determination unit 1100c is further configured to determine an inner product of the comprehensive embedding vector representation of each object node and the comprehensive embedding vector representation of each resource information node; and determine the recommendation indicator of each piece of the resource information for each object based on the inner product.
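The inner-product computation of the recommendation indicators described above may be sketched as a single matrix product; the matrix sizes and random values are illustrative assumptions.

```python
import numpy as np

# Recommendation indicators as inner products between the comprehensive
# embedding vector representation of every object node and that of every
# resource information node.
rng = np.random.default_rng(3)
z_objects = rng.standard_normal((2, 4))    # M = 2 object nodes
z_resources = rng.standard_normal((3, 4))  # N = 3 resource nodes
scores = z_objects @ z_resources.T         # M x N indicator matrix
top1 = scores.argmax(axis=1)               # best-scored resource per object
```

Row u of the score matrix ranks all pieces of resource information for object u, so the highest-scoring entries can be selected as the recommendations.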
An aspect of this disclosure further provides an electronic device. The electronic device includes: a memory (e.g., a non-transitory computer-readable storage medium), configured to store a computer-executable instruction; and a processor (e.g., processing circuitry), configured to perform the operations in the method according to any one of the foregoing aspects when the computer-executable instruction is executed by the processor.
Particularly, the method described above with reference to the flowcharts may be implemented as a computer program. For example, an aspect of this disclosure provides a computer program product, the computer program product including a computer program carried on a computer-readable medium, the computer program including program code for performing at least one operation of the method for determining a recommendation indicator of resource information in the foregoing aspects.
An aspect of this disclosure further provides one or more computer-readable storage media, having a computer-readable instruction stored therein, the computer-readable instruction, when executed, implementing the method for determining a recommendation indicator of resource information provided in the aspects of this disclosure. Each operation of the method for determining a recommendation indicator of resource information may be converted into a computer-readable instruction through programming, and then stored in a computer-readable storage medium such as a non-transitory computer-readable storage medium. When such a computer-readable storage medium is read or accessed by an electronic device or a computer, the computer-readable instruction therein is executed by a processor on the electronic device or the computer to implement the method for determining a recommendation indicator of resource information.
The example electronic device 1210 shown in
The processing system 1211 is configured to perform one or more operations by using hardware. Therefore, the processing system 1211 is shown as including a hardware element 1214 that may be configured as a processor, a functional block, or the like. The hardware element may include implementation in hardware as an application specific integrated circuit (ASIC) or another logic device formed by one or more semiconductors. The hardware element 1214 is not limited by a material from which the hardware element is formed or a processing mechanism adopted therein. For example, a processor may include (a plurality of) semiconductors and/or transistors (for example, an electronic integrated circuit (IC)). In such a context, a processor-executable instruction may be an electronically executable instruction.
The computer-readable medium 1212 is shown as including a memory/storage apparatus 1215. The memory/storage apparatus 1215 represents a memory/storage capacity associated with one or more computer-readable media. The memory/storage apparatus 1215 may include a volatile medium (such as a random access memory (RAM)) and/or a non-volatile medium (such as a read-only memory (ROM), a flash memory, an optical disc, or a magnetic disk). The memory/storage apparatus 1215 may include a fixed medium (such as a RAM, a ROM, and a fixed hard disk drive) and a removable medium (such as a flash memory, a removable hard disk drive, or an optical disc). The computer-readable medium 1212 may be configured in various other manners further described below. One or more I/O interfaces 1213 are configured to allow a user to input a command and information to the electronic device 1210 by using various input devices, and in some aspects, further allow presenting the information to the user and/or another assembly or device by using various output devices. An example of an input device includes a keyboard, a cursor control device (for example, a mouse), a microphone (for example, configured for voice input), a scanner, a touch function (for example, a capacitive sensor or another sensor configured to detect physical touch), a camera (for example, a motion that does not involve touch may be detected as a gesture by using a visible or invisible wavelength (such as an infrared frequency)), or the like. An example of an output device includes a display device (for example, a display or a projector), a speaker, a printer, a network card, a tactile response device, or the like. Therefore, the electronic device 1210 may be configured in various manners as further described below to support user interaction.
The electronic device 1210 further includes an application 1216. The application 1216 may be, for example, a software example of the apparatus 1100 for determining a recommendation indicator of resource information described with reference to
Various technologies may be described herein in the general context of software and hardware elements or program modules. The modules include a routine, a program, an object, an element, an assembly, a data structure, and the like that execute a particular task or implement a particular abstract data type. The terms “module”, “function”, and “assembly” used herein refer to software, firmware, hardware, or a combination thereof. The features of the technologies described herein are platform-independent, which means that the technologies may be implemented on various computing platforms having various processors.
The implementation of the described modules and technologies may be stored on a form of computer-readable medium, or transmitted across the form of computer-readable medium. The computer-readable medium may include various media that may be accessed by the electronic device 1210. As an example rather than a limitation, the computer-readable medium may include “a computer-readable storage medium” and “a computer-readable signal medium”.
Contrary to simple signal transmission, a carrier wave, or a signal, the “computer-readable storage medium” refers to a medium and/or a device that can store information permanently, and/or a tangible storage device. Therefore, the computer-readable storage medium refers to a non-signal bearing medium. The computer-readable storage medium includes hardware such as volatile and nonvolatile media, removable and non-removable media, and/or storage devices implemented by using a method or technology suitable for storage of information (such as a computer-readable instruction, a data structure, a program module, a logic element/circuit or other data). An example of the computer-readable storage medium may include, but is not limited to, a RAM, a ROM, an electrically erasable programmable ROM (EEPROM), a flash memory or another memory technology, a CD-ROM, a digital versatile disk (DVD) or another optical storage apparatus, a hard disk, a cassette tape, a magnetic tape, a magnetic disk storage apparatus or another magnetic storage device, or another storage device, a tangible medium, or a product suitable for storing expected information and accessible by a computer.
The “computer-readable signal medium” refers to a signal bearing medium configured to transmit an instruction to hardware of the electronic device 1210, for example, through a network. A signal medium may typically embody the computer-readable instruction, the data structure, the program module, or other data in a modulated data signal such as a carrier wave, a data signal, or another transmission mechanism. The signal medium further includes any information transmission medium. The term “modulated data signal” refers to such a signal. One or more features of the signal are set or changed to encode information into the signal. As an example rather than a limitation, a communication medium includes, for example, a wired network or a direct-connected wired medium, and a wireless medium such as an acoustic medium, an RF medium, an infrared medium, and another wireless medium.
As described above, the hardware element 1214 and the computer-readable medium 1212 represent instructions, modules, programmable device logic, and/or fixed device logic implemented in the form of hardware, which may be configured to implement at least some aspects of the technologies described herein in some aspects. A hardware element may include an integrated circuit or a system on chip, the ASIC, a field programmable gate array (FPGA), a complex programmable logic device (CPLD), and another implementation in silicon or an assembly of another hardware device. In such a context, the hardware element may serve as a processing device for performing a program task defined by the instruction, the module, and/or logic embodied by the hardware element, and a hardware device configured to store an instruction for execution, for example, the computer-readable storage medium previously described.
The foregoing combination may also be configured for implementing various technologies and modules described herein. Therefore, the software, the hardware, or the program module, and another program module may be implemented as one or more instructions and/or logics on a form of computer-readable storage medium and/or embodied by one or more hardware elements 1214. The electronic device 1210 may be configured to implement specific instructions and/or functions corresponding to software and/or hardware modules. Therefore, for example, through use of the computer-readable storage medium and/or the hardware element 1214 of the processing system, a module may be implemented at least partially in hardware as a module executable by the electronic device 1210 as software. The instructions and/or functions may be executed/operated by one or more products (for example, one or more electronic devices 1210 and/or processing systems 1211) to implement the technologies, modules, and examples described herein.
In various implementations, the electronic device 1210 may adopt various different configurations. For example, the electronic device 1210 may be implemented as a computer-type device including a personal computer, a desktop computer, a multi-screen computer, a laptop computer, a netbook, and the like. The electronic device 1210 may further be implemented as a mobile apparatus-type device including mobile devices including a mobile phone, a portable music player, a portable game device, a tablet computer, a multi-screen computer, and the like. The electronic device 1210 may further be implemented as a television-type device, which includes a device having or connected to a large screen in a casual viewing environment. The devices include a television, a set-top box, a game console, and the like.
The technologies described herein may be supported by the various configurations of the electronic device 1210, and are not limited to specific examples of the technologies described herein. The functions may further be implemented in whole or in part on the “cloud” 1220 by using a distributed system such as a platform 1222 described below. The cloud 1220 includes and/or represents the platform 1222 configured for a resource 1224. The platform 1222 abstracts underlying functions of hardware (for example, a server) and software resources of the cloud 1220. The resource 1224 may include other applications and/or data that may be used during execution of computer processing on a server remote from the electronic device 1210. The resource 1224 may further include services provided through the Internet and/or through a subscriber network such as a cellular or Wi-Fi network.
The platform 1222 may abstract resources and functions to connect the electronic device 1210 to another electronic device. The platform 1222 may further be configured to abstract a hierarchy of resources to provide a corresponding level of scale for encountered demand for the resource 1224 implemented through the platform 1222. Therefore, in an interconnected device aspect, the implementation of the functions described herein may be distributed throughout a system 1200. For example, the functions may be partially implemented on the electronic device 1210 and through the platform 1222 that abstracts the functions of the cloud 1220.
For the sake of clarity, the aspects of this disclosure are described based on different functional units. However, it is to be apparent that functionality of each functional unit may be implemented in a single unit, implemented in a plurality of units, or implemented as a part of another functional unit without departing from this application. For example, the functionality described as being performed by a single unit may be performed by a plurality of different units. Therefore, a reference to a specific functional unit is only regarded as a reference to an appropriate unit configured to provide the described functionality, rather than indicative of a strict logical or physical structure or organization. Therefore, this application may be implemented in a single unit, or may be physically and functionally distributed between different units and circuits.
Although the terms such as “first”, “second”, and “third” in this specification may be configured for describing various devices, elements, components, or parts, the devices, elements, components, or parts are not to be limited by such terms. The terms are only configured for distinguishing one device, element, component, or part from another device, element, component, or part.
Although this disclosure has been described in combination with some aspects, this disclosure is not intended to be limited to the specific forms set forth herein. On the contrary, the scope of this disclosure is limited only by the appended claims. Additionally, although individual features may be included in different claims, these features may possibly be advantageously combined, and their inclusion in different claims does not imply that a combination of the features is not feasible and/or advantageous. An order of the features in the claims does not imply any specific order in which the features need to be operated. In addition, in the claims, the word "comprising" does not exclude another element, and the term "a" or "one" does not exclude a plurality.
| Number | Date | Country | Kind |
|---|---|---|---|
| 202310108533.0 | Jan 2023 | CN | national |
The present application is a continuation of International Application No. PCT/CN2023/130455, filed on Nov. 8, 2023, which claims priority to Chinese Patent Application No. 202310108533.0, filed on Jan. 17, 2023. The entire disclosures of the prior applications are hereby incorporated by reference.
| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/CN2023/130455 | Nov 2023 | WO |
| Child | 19050865 | | US |