CN115935067A - Article recommendation method integrating semantics and structural view for socialized recommendation - Google Patents

Article recommendation method integrating semantics and structural view for socialized recommendation

Info

Publication number
CN115935067A
Authority
CN
China
Prior art keywords
user
item
article
node
formula
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211590930.8A
Other languages
Chinese (zh)
Inventor
袁昆
姜元仲
孙见山
姜元春
钱洋
柴一栋
刘业政
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei University of Technology
Original Assignee
Hefei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei University of Technology filed Critical Hefei University of Technology
Priority to CN202211590930.8A priority Critical patent/CN115935067A/en
Publication of CN115935067A publication Critical patent/CN115935067A/en
Pending legal-status Critical Current

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a social-recommendation-oriented item recommendation method that fuses semantic and structural views, comprising the following steps: 1, defining the implicit relations in social recommendation; 2, constructing a heterogeneous information network and defining meta-paths; 3, extracting the local rating prediction; 4, extracting the global rating prediction; 5, fusing the models; 6, constructing constraints on user rating behavior; and 7, training the models to obtain a trained deep graph model and a trained wide linear attention model. The method maintains recommendation accuracy and stability under varying degrees of social-relation distribution imbalance and sparsity.

Description

Article recommendation method fusing semantic and structural views for social recommendation
Technical Field
The invention belongs to the field of social recommender systems, and in particular relates to an item recommendation method based on a model that fuses semantic and structural views for social recommendation.
Background
Social platforms are now flourishing and the connections between people have grown denser, so modeling and analyzing the relations between platform users and goods to achieve more accurate social recommendation has become increasingly important. Unlike traditional collaborative recommendation algorithms, social recommender systems rest on a body of social-influence theory. The main ideas are that users with explicit social relations often have similar preferences and that a user's choices may also be influenced by relatives and friends; the main method is to use the relations in the social network as auxiliary information to improve recommendation accuracy. Social recommendation has long permeated many aspects of life, for example product recommendation in e-commerce platforms and friend recommendation in social platforms. At present, improvement of social recommender system performance is limited mainly by two factors: the imbalance and sparsity of the social-relation distribution, and differences in rating behavior. Mining the large amount of latent data while preserving the relative positions of users and items, and exploiting the explicit and implicit relations between users and goods more efficiently and reasonably, is therefore the key to improving the robustness of a social recommender system. The recommendation methods related to this model fall into three categories: 1. classical social recommendation methods, mainly co-factorization methods, ensemble methods, and regularization methods, which use only a small amount of implicit data, so that many high-quality implicit relations remain undiscovered; 2. graph-model-based social recommender systems, which build high-order relation models on graph neural networks and make better use of diverse data, but still lack deeper mining of implicit relations and ignore the influence of user rating behavior on recommendation results; 3. multi-view social recommendation methods, which aim to mine data information from multiple angles, since different interaction structures contribute differently to rating prediction.
Disclosure of Invention
To overcome the deficiencies of the prior art, the invention provides an item recommendation method fusing semantic and structural views for social recommendation, so that the accuracy and stability of item recommendation can be maintained under varying degrees of social-relation distribution imbalance and sparsity.
To achieve this purpose, the invention adopts the following technical scheme:
The invention relates to a social-recommendation-oriented item recommendation method fusing semantic and structural views, characterized by comprising the following steps:
Step 1: define the implicit relations in social recommendation:
Step 1.1: let U = {u_1, …, u_i, …, u_M} denote the set of users, where u_i is any i-th user and 1 ≤ i ≤ M, and let V = {v_1, …, v_a, …, v_N} denote the set of items, where v_a is any a-th item and 1 ≤ a ≤ N;
let the user rating matrix R = {r_ia}_{M×N} record the ratings given by all users in the user set U to all items in the item set V, where r_ia is the rating of any i-th user u_i on any a-th item v_a;
let the user social matrix S = {s_ik}_{M×M} record whether each user in the user set U follows the other users, where s_ik indicates whether any i-th user u_i follows any k-th user u_k: if the i-th user u_i follows the k-th user u_k, then s_ik = 1; otherwise s_ik = 0;
let the social friend set of the i-th user u_i be F_U(i) = {u_k | s_ik = 1}, with 1 ≤ k ≤ M and i ≠ k;
Step 1.2: let the implicit social friend set of the i-th user u_i be H_U(i) = {u_ij | |{u_k : s_ik = 1 ∩ s_jk = 1}| ≥ τ}, where u_ij denotes an implicit social friend of the i-th user u_i, s_ik = 1 ∩ s_jk = 1 means that the k-th user u_k followed by the i-th user u_i is also followed by the j-th user u_j, and τ is a truncation threshold with τ ≥ 1, 1 ≤ j ≤ M, and j ≠ k;
Obtain the user set U_ab ⊆ U of users who have rated both the a-th item v_a and the b-th item v_b. Using formula (1), obtain the rating similarity sim_ab^i between the a-th item v_a and the b-th item v_b for the i-th user u_i in the user set U_ab:
[formula (1) is reproduced only as an image in the original publication]
In formula (1), r_ib is the rating of the i-th user u_i on any b-th item v_b, with 1 ≤ b ≤ N and a ≠ b.
Using formula (2), obtain the cumulative rating similarity sim_ab between the a-th item v_a and the b-th item v_b over all users in the user set U_ab, and thereby the set of cumulative rating-similarity values between the a-th item v_a and the other items:
sim_ab = Σ_{u_i ∈ U_ab} sim_ab^i    (2)
Sort the related items in descending order of cumulative rating similarity to obtain the implicit item relation set H_V(a) of the a-th item v_a; a sketch of these constructions follows.
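For concreteness, the two implicit-relation constructions of step 1 can be sketched in Python as follows. This is a minimal illustration assuming dense numpy arrays R (ratings, with 0 meaning unrated) and S (social links); the per-user similarity of formula (1) is left as a pluggable function, since the patent gives that formula only as an image, and the decaying-difference default used here is an assumption.

```python
import numpy as np

def implicit_social_friends(S, tau=1):
    """H_U(i): users u_j sharing at least tau common followees with u_i."""
    M = S.shape[0]
    common = S @ S.T  # common[i, j] = number of k with s_ik = 1 and s_jk = 1
    return {i: [j for j in range(M) if j != i and common[i, j] >= tau]
            for i in range(M)}

def implicit_item_relations(R, top_n=20, sim_fn=None):
    """H_V(a): items ranked by the cumulative rating similarity of formula (2)."""
    if sim_fn is None:
        # stand-in for formula (1), which the patent shows only as an image;
        # here: similarity decays with the absolute rating difference
        sim_fn = lambda r_ia, r_ib: 1.0 / (1.0 + abs(r_ia - r_ib))
    M, N = R.shape
    H_V = {}
    for a in range(N):
        sims = []
        for b in range(N):
            if b == a:
                continue
            U_ab = [i for i in range(M) if R[i, a] > 0 and R[i, b] > 0]
            sim_ab = sum(sim_fn(R[i, a], R[i, b]) for i in U_ab)  # formula (2)
            sims.append((b, sim_ab))
        sims.sort(key=lambda t: -t[1])         # descending cumulative similarity
        H_V[a] = [b for b, _ in sims[:top_n]]  # truncate to the top items
    return H_V
```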
Step 2: construct the heterogeneous information network and define the meta-paths (a data-structure sketch follows the definitions below):
Take each user and each item as nodes; take each user's social friend set and implicit social friend set as the explicit and implicit relations between user nodes, and take each item's implicit item relation set as the implicit relation between item nodes; construct the edges between nodes accordingly, forming a heterogeneous information network HIN.
Define five user meta-paths, comprising three single-hop user meta-paths and two double-hop user meta-paths. The three single-hop user meta-paths, each formed by two nodes and the edge connecting them, are: user-item, user-user, and user-implicit user. User-item runs from a user node to an item node along their connecting edge; user-user runs from one user node to another user node along their connecting edge; user-implicit user runs from a user node to an implicit-user node along their connecting edge.
The two double-hop user meta-paths, each formed by three nodes and the two edges connecting them, are: user-user-item and user-implicit user-item. User-user-item runs from a user node to another user node and from that second user node on to an item node; user-implicit user-item runs from a user node to an implicit-user node and from that implicit-user node on to an item node.
Define three item meta-paths: item-user, item-implicit item, and item-implicit item-user. Item-user runs from an item node to a user node along their connecting edge; item-implicit item runs from an item node to an implicit-item node along their connecting edge; item-implicit item-user runs from an item node to an implicit-item node and from that implicit-item node on to a user node.
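One plain way to materialize the heterogeneous information network and its meta-paths is as typed adjacency lists, composing single-hop relations into the double-hop meta-paths. The sketch below is illustrative only; the relation names and the class itself are not from the patent.

```python
from collections import defaultdict

class HIN:
    """Heterogeneous information network with typed edges (illustrative).
    Relations follow step 2: 'rated' (user-item), 'friend' (user-user),
    'implicit_friend' (user-implicit user), 'implicit_item' (item-implicit item)."""
    def __init__(self):
        self.adj = defaultdict(set)  # (relation, node) -> neighbor set

    def add_edge(self, relation, src, dst):
        self.adj[(relation, src)].add(dst)

    def metapath_neighbors(self, node, relations):
        """Follow a meta-path given as a sequence of relations, e.g.
        ['friend', 'rated'] for user-user-item or
        ['implicit_friend', 'rated'] for user-implicit user-item."""
        frontier = {node}
        for rel in relations:
            frontier = {n for cur in frontier for n in self.adj[(rel, cur)]}
        return frontier

# Example: items reachable from user 3 along the user-user-item meta-path:
# hin.metapath_neighbors(('user', 3), ['friend', 'rated'])
```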
Step 3: extract the local rating prediction:
Step 3.1: randomly remove some nodes and their associated edges from the heterogeneous information network HIN to obtain a preprocessed heterogeneous information network HIN';
Step 3.2: use an embedding layer to obtain, from HIN', the feature vector p_i ∈ R^{d'} of the i-th user u_i and the feature vector q_a ∈ R^{d'} of the a-th item v_a, where d' is the feature-vector dimension;
Step 3.3: build a deep graph model composed of an attention-based graph convolutional network, a user local-feature extraction module, an item local-feature extraction module, and a local prediction module;
Step 3.3.1: input p_i and q_a into the attention-based graph convolutional network for processing, obtaining the output vectors of the five user meta-paths, denoted here e_i^UV (user-item), e_i^UU (user-user), e_i^UUV (user-user-item), e_i^UH (user-implicit user), and e_i^UHV (user-implicit user-item);
Step 3.3.2: the user local-feature extraction module processes the output vectors of the five user meta-paths by formula (2) to obtain the local feature embedding h_i^local of the i-th user u_i:
h_i^local = MLP_user(e_i^UV ⊕ e_i^UU ⊕ e_i^UUV ⊕ e_i^UH ⊕ e_i^UHV)    (2)
In formula (2), ⊕ denotes the concatenation operation and MLP_user denotes the multilayer feed-forward neural network of the user local-feature extraction module;
Step 3.3.3: input the feature vector p_i of the i-th user u_i and the feature vector q_a of the a-th item v_a into the attention-based graph convolutional network for processing, obtaining the output vectors of the three item meta-paths, denoted here e_a^VU (item-user), e_a^VH (item-implicit item), and e_a^VHU (item-implicit item-user);
Step 3.3.4: the item local-feature extraction module processes the output vectors of the three item meta-paths by formula (3) and outputs the local feature embedding h_a^local of the a-th item v_a:
h_a^local = MLP_item(e_a^VU ⊕ e_a^VH ⊕ e_a^VHU)    (3)
In formula (3), MLP_item denotes the multilayer feed-forward neural network of the item local-feature extraction module;
Step 3.3.5: input h_i^local and h_a^local into the local prediction module and obtain, by formula (4), the local rating prediction r̂_ia^local of the i-th user u_i on the a-th item v_a:
r̂_ia^local = MLP_deep(h_i^local ⊕ h_a^local)    (4)
In formula (4), MLP_deep denotes the multilayer feed-forward neural network of the local prediction module;
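The concatenate-then-MLP shape of formulas (2)-(4) can be sketched in PyTorch as below. The attention-based graph convolution that produces the meta-path embeddings is abstracted away, and the layer widths are assumptions; only the concatenation operator and the three MLPs are taken from the text.

```python
import torch
import torch.nn as nn

def mlp(dims):
    layers = []
    for d_in, d_out in zip(dims, dims[1:]):
        layers += [nn.Linear(d_in, d_out), nn.LeakyReLU(0.2)]
    return nn.Sequential(*layers[:-1])  # no activation after the last layer

class LocalPredictor(nn.Module):
    """Steps 3.3.2-3.3.5: fuse meta-path embeddings into a local rating."""
    def __init__(self, d=80):
        super().__init__()
        self.mlp_user = mlp([5 * d, 2 * d, d])  # five user meta-path vectors
        self.mlp_item = mlp([3 * d, 2 * d, d])  # three item meta-path vectors
        self.mlp_deep = mlp([2 * d, d, 1])      # local prediction head

    def forward(self, user_paths, item_paths):
        h_user = self.mlp_user(torch.cat(user_paths, dim=-1))  # formula (2)
        h_item = self.mlp_item(torch.cat(item_paths, dim=-1))  # formula (3)
        # formula (4): local rating from the concatenated local embeddings
        return self.mlp_deep(torch.cat([h_user, h_item], dim=-1)).squeeze(-1)
```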
Step 4: extract the global rating prediction:
Step 4.1: according to the implicit social friend set H_U(i) of the i-th user u_i, use an embedding layer to obtain from HIN the latent feature vector p'_i ∈ R^{d'} and the latent influence vector x_i ∈ R^{d'} of the i-th user u_i; according to the implicit item relation set H_V(a) of the a-th item v_a, use the embedding layer to obtain from HIN the latent feature vector q'_a ∈ R^{d'} and the latent influence vector y_a ∈ R^{d'} of the a-th item v_a;
Step 4.2: build a wide linear attention model composed of a user global-feature extraction module, an item global-feature extraction module, and a global prediction module;
Step 4.2.1: based on the three single-hop user meta-paths, the user global-feature extraction module obtains the global feature embedding h_i^global of the i-th user u_i through formulas (5)-(8):
[formulas (5)-(8) are reproduced only as images in the original publication]
In formula (8), α_ik ∈ α is the attention weight of the i-th user u_i on the k-th user u_k; β_ij ∈ β is the attention weight of the i-th user u_i on the j-th user u_ij among the implicit social friends; γ_ia ∈ γ is the attention weight of the i-th user u_i on the a-th item v_a; R_V(i) ∈ R is the set of items the i-th user u_i has rated; W_1, W_2, W_3, W_4, W_5, W_6 are six trainable parameter matrices and b_1, b_2, b_3 are three bias vectors; σ is an activation function, softmax denotes the normalization function, and T denotes transposition;
Step 4.2.2: the item global-feature extraction module obtains the global feature embedding h_a^global of the a-th item v_a by formula (9):
[formula (9) is reproduced only as an image in the original publication]
In formula (9), η_ab ∈ η is the attention weight of the a-th item v_a on the b-th implicit similar item v_b;
Step 4.2.3: input h_i^global and h_a^global into the global prediction module and obtain, by formula (10), the global rating prediction r̂_ia^global of the i-th user u_i on the a-th item v_a:
[formula (10) is reproduced only as an image in the original publication]
In formula (10), b_i is the user bias of the i-th user u_i, b_a is the item bias of the a-th item v_a, and μ is the global mean of all users' ratings over all items;
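Because formulas (5)-(10) survive only as images, the following sketch shows one conventional reading of a single-layer ("wide") attention aggregation over a user's neighbors and of a biased global prediction; the exact combination the patent uses may differ, and the inner-product form of formula (10) is an assumption.

```python
import torch
import torch.nn as nn

class WideAttention(nn.Module):
    """One attention head over a user's neighbors (friends, implicit friends,
    or rated items), in the spirit of formulas (5)-(8); illustrative only."""
    def __init__(self, d=80):
        super().__init__()
        self.W_self, self.W_nbr = nn.Linear(d, d), nn.Linear(d, d)
        self.v = nn.Linear(d, 1, bias=False)

    def forward(self, h_self, h_nbrs):  # h_self: (d,), h_nbrs: (n, d)
        scores = self.v(torch.sigmoid(self.W_self(h_self) + self.W_nbr(h_nbrs)))
        alpha = torch.softmax(scores, dim=0)  # normalized attention weights
        return (alpha * h_nbrs).sum(dim=0)    # weighted neighbor aggregation

def global_prediction(h_user, h_item, b_i, b_a, mu):
    """Formula (10), read as a biased inner product (assumption)."""
    return h_user @ h_item + b_i + b_a + mu
```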
Step 5: obtain the rating prediction r̂_ia of the i-th user u_i on the a-th item v_a by formula (11), which weights the local and global predictions:
r̂_ia = λ · r̂_ia^local + (1 - λ) · r̂_ia^global    (11)
In formula (11), λ is the rating weight coefficient;
Step 6: construct the constraint on user rating behavior:
Step 6.1: define the rating triple (u, r, v) of a user on an item, where u ∈ U denotes a user entity, v ∈ V denotes an item entity, and r ∈ R denotes a rating relation; compute feature vectors by the TransH algorithm, comprising the user entity feature vector e_u, the item entity feature vector e_v, and the rating relation feature vector e_r.
Obtain, by formulas (12) and (13), the projections of e_u and e_v onto the hyperplane of relation r, namely the user entity feature projection vector e_u^⊥ and the item entity feature projection vector e_v^⊥:
e_u^⊥ = e_u - w_r^T e_u w_r    (12)
e_v^⊥ = e_v - w_r^T e_v w_r    (13)
In formulas (12) and (13), w_r is the normal vector of the hyperplane;
Step 6.2: construct the scoring function f(u, r, v) of the rating triple (u, r, v) by formula (14):
f(u, r, v) = ||e_u^⊥ + e_r - e_v^⊥||_2^2    (14)
Step 6.3: construct the marginal loss function L_KG by formula (15):
L_KG = Σ_(u,r,v) Σ_(u,r,v') [f(u, r, v) + m - f(u, r, v')]_+    (15)
In formula (15), m is the margin, [f(·)]_+ denotes max(0, f(·)), v' ∈ V denotes another item entity, (u, r, v') denotes a false triple generated by replacing v with v', and f(u, r, v') is the scoring function of the false triple (u, r, v');
Step 7: construct the loss function L_D of the deep graph model:
[the expression for L_D is reproduced only as an image in the original publication]
where λ_1 and λ_2 are regularization parameters, P is the matrix formed by all user feature vectors, and Q is the matrix formed by all item feature vectors;
construct the loss function L_W of the wide linear attention model:
[the expression for L_W is reproduced only as an image in the original publication]
where λ_3 is a regularization parameter, P' is the matrix formed by all users' latent feature vectors, Q' is the matrix formed by all items' latent feature vectors, X is the matrix formed by all users' latent influence vectors, and Y is the matrix formed by all items' latent influence vectors;
train the deep graph model and the wide linear attention model separately by gradient descent, computing the loss functions L_D and L_W respectively and updating the model parameters until the losses converge, thereby obtaining a trained deep graph model and a trained wide linear attention model.
Step 8: for a given target user, input the item set, the user rating matrix, the target user's social friend set, and the implicit social friend set into the trained deep graph model and the trained wide linear attention model respectively, obtaining the target user's local and global rating predictions; compute the target user's rating of each item by formula (11) and select the K items with the highest predicted ratings to recommend to the target user, completing the item recommendation for the target user.
The invention also relates to an electronic device comprising a memory and a processor, wherein the memory stores a program that supports the processor in executing the item recommendation method, and the processor is configured to execute the program stored in the memory.
The invention also relates to a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, performs the steps of the item recommendation method.
Compared with the prior art, the invention has the following beneficial effects:
1. By mining the implicit relations between users and items, the method alleviates, to a degree, the imbalance and sparsity of the social-relation distribution. Weighting the global and local rating predictions reduces the bias and overfitting that noisy nodes introduce into the prediction results, while constraining user ratings preserves the relative positions of users and items and better reflects user preferences. Together these improve the model's performance and overall generalization, give the model a more complete understanding of the user-item relations, and further improve the accuracy and stability of the recommendation results.
2. The invention proposes a model that fuses semantic and structural views to predict user rating behavior. By introducing the definition of implicit relations, it constructs a heterogeneous information network; it extracts the explicit and implicit relation features of users and items with two models of different views; it fuses the global and local feature information captured by the two models; and it incorporates the user rating constraint, deepening the understanding of user preferences and helping improve the performance of the overall recommender system.
Drawings
FIG. 1 is a diagram of the heterogeneous information network of the method of the present invention;
FIG. 2 shows the deep graph model of the method of the present invention;
FIG. 3 shows the wide linear attention model of the method of the present invention;
FIG. 4 is a flowchart of the method of the present invention.
Detailed Description
In this embodiment, a social-recommendation-oriented item recommendation method fusing semantic and structural views proceeds, as shown in FIG. 4, by the following steps:
Step 1: define the implicit relations in social recommendation:
Step 1.1: let U = {u_1, …, u_i, …, u_M} denote the set of users, where u_i is any i-th user and 1 ≤ i ≤ M, and let V = {v_1, …, v_a, …, v_N} denote the set of items, where v_a is any a-th item and 1 ≤ a ≤ N;
let the user rating matrix R = {r_ia}_{M×N} record the ratings given by all users in the user set U to all items in the item set V, where r_ia is the rating of any i-th user u_i on any a-th item v_a;
let the user social matrix S = {s_ik}_{M×M} record whether each user in the user set U follows the other users, where s_ik indicates whether any i-th user u_i follows any k-th user u_k: if the i-th user u_i follows the k-th user u_k, then s_ik = 1; otherwise s_ik = 0;
let the social friend set of the i-th user u_i be F_U(i) = {u_k | s_ik = 1}, with 1 ≤ k ≤ M and i ≠ k;
Step 1.2: let the implicit social friend set of the i-th user u_i be H_U(i) = {u_ij | |{u_k : s_ik = 1 ∩ s_jk = 1}| ≥ τ}, where u_ij denotes an implicit social friend of the i-th user u_i, s_ik = 1 ∩ s_jk = 1 means that the k-th user u_k followed by the i-th user u_i is also followed by the j-th user u_j, and τ is a truncation threshold with τ ≥ 1, 1 ≤ j ≤ M, and j ≠ k. The implicit social friend set can be interpreted as follows: users who share a common followed user may have similar preferences, and a larger τ means more common followed users are required to establish an implicit user relation.
Obtain the user set U_ab ⊆ U of users who have rated both the a-th item v_a and the b-th item v_b. Using formula (1), obtain the rating similarity sim_ab^i between the a-th item v_a and the b-th item v_b for the i-th user u_i in the user set U_ab:
[formula (1) is reproduced only as an image in the original publication]
In formula (1), r_ib is the rating of the i-th user u_i on any b-th item v_b, with 1 ≤ b ≤ N and a ≠ b.
Using formula (2), obtain the cumulative rating similarity sim_ab between the a-th item v_a and the b-th item v_b over all users in the user set U_ab, and thereby the set of cumulative rating-similarity values between the a-th item v_a and the other items:
sim_ab = Σ_{u_i ∈ U_ab} sim_ab^i    (2)
Sort the related items in descending order of cumulative rating similarity to obtain the implicit item relation set H_V(a) of the a-th item v_a. The implicit item relation set can be interpreted as follows: the more similar the ratings that common users give to two items, the more similar the two items are.
Step 2, constructing a heterogeneous information network and defining a meta path:
as shown in fig. 1, each user and each article are respectively used as nodes, the social friend set and the recessive social friend set of each user are used as explicit and implicit relations between nodes of each user, and the recessive article relation set of each article is used as the implicit relation between nodes of each article, so that edges between the nodes are constructed, and a heterogeneous information network HIN is formed;
defining five user element paths including three user single-hop neighbor element paths and two user double-hop neighbor element paths; the three user single-hop neighbor paths formed by two nodes and one edge connected with the two nodes comprise: user-item, user-user, user-implicit user; wherein, the user-article represents the user node to the article node and one edge connected with the article node; user-user means from one user node to another user node and an edge connected thereto; the user-hidden user represents one user node to another hidden user node and one edge connected with the user node;
two kinds of user double-hop neighbor paths consisting of three nodes and two edges connected with the three nodes comprise: user-item, user-implicit user-item; the user-object represents a user node to another user node and one edge connected with the user node, the second user node to an object node and one edge connected with the object node, the user-hidden user-object represents a user node to a hidden user node and one edge connected with the hidden user node, and the hidden user node to an object node and one edge connected with the hidden user node;
defining three item meta-paths including: item-user, item-covert item-user; the system comprises an article node, a user node, an article-implicit article node, a user node and a connecting edge of the user node, the article-implicit article node, the connecting edge of the implicit article node and the user node, the article-implicit article-user node, the user node and the connecting edge of the user node, wherein the article-user represents an article node to a user node and a connecting edge of the user node;
Step 3: extract the local rating prediction:
Step 3.1: randomly remove some nodes and their associated edges from the heterogeneous information network HIN to obtain a preprocessed heterogeneous information network HIN'. In a specific implementation, after randomly removing nodes and their associated edges with DropNode, the capacity of the user-item bipartite graph is 30 and the capacity of user social friends is 20, and the top 20 users and items are selected as the implicit user and item relations; a minimal sketch of the DropNode step follows.
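A minimal DropNode-style preprocessing over an explicit node and edge list might look as follows; the drop ratio is an assumption, as the patent does not state one.

```python
import random

def drop_nodes(nodes, edges, drop_ratio=0.1, seed=0):
    """Randomly remove nodes and every edge incident to them (DropNode)."""
    rng = random.Random(seed)
    dropped = set(rng.sample(list(nodes), int(len(nodes) * drop_ratio)))
    kept_nodes = [n for n in nodes if n not in dropped]
    kept_edges = [(u, v) for (u, v) in edges
                  if u not in dropped and v not in dropped]
    return kept_nodes, kept_edges
```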
Step 3.2: use an embedding layer to obtain, from HIN', the feature vector p_i ∈ R^{d'} of the i-th user u_i and the feature vector q_a ∈ R^{d'} of the a-th item v_a, where d' is the feature-vector dimension;
Step 3.3: as shown in FIG. 2, build a deep graph model composed of an attention-based graph convolutional network, a user local-feature extraction module, an item local-feature extraction module, and a local prediction module. For the graph convolutional network, the embedding size is 80, the dropout ratio is 0.5, and the slope of the LeakyReLU activation function is 0.2; a sketch of such an attention layer follows.
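The patent does not spell out the attention-based graph convolution; a GAT-style layer using the embodiment's settings (embedding size 80, dropout 0.5, LeakyReLU slope 0.2) would look roughly like this dense-adjacency sketch.

```python
import torch
import torch.nn as nn

class AttnGraphConv(nn.Module):
    """GAT-style attention convolution (sketch of the layer used in step 3.3)."""
    def __init__(self, d=80, dropout=0.5, slope=0.2):
        super().__init__()
        self.W = nn.Linear(d, d, bias=False)
        self.a = nn.Linear(2 * d, 1, bias=False)
        self.drop = nn.Dropout(dropout)
        self.act = nn.LeakyReLU(slope)

    def forward(self, h, adj):
        """h: (n, d) node features; adj: (n, n) 0/1 adjacency with self-loops."""
        z = self.W(h)                                    # (n, d)
        n = z.size(0)
        pairs = torch.cat([z.unsqueeze(1).expand(n, n, -1),
                           z.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = self.act(self.a(pairs)).squeeze(-1)          # (n, n) attention logits
        e = e.masked_fill(adj == 0, float('-inf'))       # attend to neighbors only
        alpha = self.drop(torch.softmax(e, dim=-1))      # normalized weights
        return alpha @ z                                 # aggregated messages
```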
Step 3.3.1: input p_i and q_a into the attention-based graph convolutional network for processing, obtaining the output vectors of the five user meta-paths, denoted here e_i^UV (user-item), e_i^UU (user-user), e_i^UUV (user-user-item), e_i^UH (user-implicit user), and e_i^UHV (user-implicit user-item);
Step 3.3.2: the user local-feature extraction module processes the output vectors of the five user meta-paths by formula (2) to obtain the local feature embedding h_i^local of the i-th user u_i:
h_i^local = MLP_user(e_i^UV ⊕ e_i^UU ⊕ e_i^UUV ⊕ e_i^UH ⊕ e_i^UHV)    (2)
In formula (2), ⊕ denotes the concatenation operation and MLP_user denotes the multilayer feed-forward neural network of the user local-feature extraction module;
Step 3.3.3: input the feature vector p_i of the i-th user u_i and the feature vector q_a of the a-th item v_a into the attention-based graph convolutional network for processing, obtaining the output vectors of the three item meta-paths, denoted here e_a^VU (item-user), e_a^VH (item-implicit item), and e_a^VHU (item-implicit item-user);
Step 3.3.4: the item local-feature extraction module processes the output vectors of the three item meta-paths by formula (3) and outputs the local feature embedding h_a^local of the a-th item v_a:
h_a^local = MLP_item(e_a^VU ⊕ e_a^VH ⊕ e_a^VHU)    (3)
In formula (3), MLP_item denotes the multilayer feed-forward neural network of the item local-feature extraction module;
step 3.3.5, mixing
Figure BDA00039942992000000913
Figure BDA00039942992000000914
Inputting into a local prediction module, and obtaining the ith user u by using an equation (4) i For the a-th item v a Is predicted to be->
Figure BDA00039942992000000915
Figure BDA00039942992000000916
In formula (4), MLP deep Representing a multi-layer feedforward neural network in a local prediction module;
Step 4: extract the global rating prediction:
Step 4.1: according to the implicit social friend set H_U(i) of the i-th user u_i, use an embedding layer to obtain from HIN the latent feature vector p'_i ∈ R^{d'} and the latent influence vector x_i ∈ R^{d'} of the i-th user u_i; according to the implicit item relation set H_V(a) of the a-th item v_a, use the embedding layer to obtain from HIN the latent feature vector q'_a ∈ R^{d'} and the latent influence vector y_a ∈ R^{d'} of the a-th item v_a;
Step 4.2: as shown in FIG. 3, build a wide linear attention model composed of a user global-feature extraction module, an item global-feature extraction module, and a global prediction module;
Step 4.2.1: based on the three single-hop user meta-paths, the user global-feature extraction module obtains the global feature embedding h_i^global of the i-th user u_i through formulas (5)-(8):
[formulas (5)-(8) are reproduced only as images in the original publication]
In formula (8), α_ik ∈ α is the attention weight of the i-th user u_i on the k-th user u_k; β_ij ∈ β is the attention weight of the i-th user u_i on the j-th user u_ij among the implicit social friends; γ_ia ∈ γ is the attention weight of the i-th user u_i on the a-th item v_a; R_V(i) ∈ R is the set of items the i-th user u_i has rated; W_1, W_2, W_3, W_4, W_5, W_6 are six trainable parameter matrices and b_1, b_2, b_3 are three bias vectors; σ is an activation function, softmax denotes the normalization function, and T denotes transposition;
Step 4.2.2: the item global-feature extraction module obtains the global feature embedding h_a^global of the a-th item v_a by formula (9):
[formula (9) is reproduced only as an image in the original publication]
In formula (9), η_ab ∈ η is the attention weight of the a-th item v_a on the b-th implicit similar item v_b;
Step 4.2.3: input h_i^global and h_a^global into the global prediction module and obtain, by formula (10), the global rating prediction r̂_ia^global of the i-th user u_i on the a-th item v_a:
[formula (10) is reproduced only as an image in the original publication]
In formula (10), b_i is the user bias of the i-th user u_i, b_a is the item bias of the a-th item v_a, and μ is the global mean of all users' ratings over all items. In a specific implementation, the TrustSVD algorithm can be used directly, with h_i^global and h_a^global combined linearly.
Step 5: obtain the rating prediction r̂_ia of the i-th user u_i on the a-th item v_a by formula (11), which weights the local and global predictions:
r̂_ia = λ · r̂_ia^local + (1 - λ) · r̂_ia^global    (11)
In formula (11), λ is the rating weight coefficient;
Step 6: construct the constraint on user rating behavior:
Step 6.1: define the rating triple (u, r, v) of a user on an item, where u ∈ U denotes a user entity, v ∈ V denotes an item entity, and r ∈ R denotes a rating relation; compute feature vectors by the TransH algorithm, comprising the user entity feature vector e_u, the item entity feature vector e_v, and the rating relation feature vector e_r.
Obtain, by formulas (12) and (13), the projections of e_u and e_v onto the hyperplane of relation r, namely the user entity feature projection vector e_u^⊥ and the item entity feature projection vector e_v^⊥:
e_u^⊥ = e_u - w_r^T e_u w_r    (12)
e_v^⊥ = e_v - w_r^T e_v w_r    (13)
In formulas (12) and (13), w_r is the normal vector of the hyperplane, and T denotes transposition;
Step 6.2: construct the scoring function f(u, r, v) of the rating triple (u, r, v) by formula (14):
f(u, r, v) = ||e_u^⊥ + e_r - e_v^⊥||_2^2    (14)
Step 6.3: construct the marginal loss function L_KG by formula (15):
L_KG = Σ_(u,r,v) Σ_(u,r,v') [f(u, r, v) + m - f(u, r, v')]_+    (15)
In formula (15), m is the margin, [f(·)]_+ denotes max(0, f(·)), v' ∈ V denotes another item entity, (u, r, v') denotes a false triple generated by replacing v with v', and f(u, r, v') is the scoring function of the false triple (u, r, v'). In a specific implementation, false triples are generated by a rating-based cross-sampling substitution: all false items/users in a false triple are sampled from rated items/users whose ratings are lower than the rating in the true triple; a sketch of this constraint follows.
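The TransH projection, triple scoring, and marginal loss of step 6, together with the rating-based negative sampling described above, can be sketched as follows; the margin value and the exact reduction over triples are assumptions where the original is ambiguous.

```python
import torch
import torch.nn.functional as F

def project(e, w_r):
    """Project an entity embedding onto the hyperplane with unit normal w_r
    (formulas (12)/(13)): e_perp = e - (w_r . e) w_r."""
    w = F.normalize(w_r, dim=-1)
    return e - (e * w).sum(-1, keepdim=True) * w

def transh_score(e_u, e_r, e_v, w_r):
    """Formula (14): squared distance on the relation hyperplane."""
    return ((project(e_u, w_r) + e_r - project(e_v, w_r)) ** 2).sum(-1)

def margin_loss(pos_score, neg_score, margin=1.0):
    """Formula (15): [f(u,r,v) + m - f(u,r,v')]_+ ; margin value assumed."""
    return F.relu(pos_score + margin - neg_score).sum()

# Negative triples (u, r, v') are drawn, per the embodiment, from items the
# user has rated lower than the rating r of the true triple.
```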
Step 7: construct the loss function L_D of the deep graph model:
[the expression for L_D is reproduced only as an image in the original publication]
where λ_1 and λ_2 are regularization parameters, P is the matrix formed by all user feature vectors, and Q is the matrix formed by all item feature vectors;
construct the loss function L_W of the wide linear attention model:
[the expression for L_W is reproduced only as an image in the original publication]
where λ_3 is a regularization parameter, P' is the matrix formed by all users' latent feature vectors, Q' is the matrix formed by all items' latent feature vectors, X is the matrix formed by all users' latent influence vectors, and Y is the matrix formed by all items' latent influence vectors. In a specific implementation, the deep graph model and the wide linear attention model are given separate embedding parameters to increase the flexibility of the fused model; the learning rate of the deep graph model is 0.001, the learning rate of the wide linear attention model is 0.05, and the regularization parameters are λ_1 = 2, λ_2 = 0.0001, and λ_3 = 0.05.
Train the deep graph model and the wide linear attention model separately by gradient descent, computing the loss functions L_D and L_W respectively and updating the model parameters until the losses converge, thereby obtaining a trained deep graph model and a trained wide linear attention model; a sketch of this training loop follows.
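The separate gradient-descent training of step 7 with the embodiment's learning rates might be organized as below. The loss methods on the two models are stand-ins, since the expressions for L_D and L_W appear only as images; the optimizer choice is likewise an assumption.

```python
import torch

def train(deep_model, wide_model, batches, epochs=100):
    opt_d = torch.optim.SGD(deep_model.parameters(), lr=0.001)  # deep graph model
    opt_w = torch.optim.SGD(wide_model.parameters(), lr=0.05)   # wide linear attention
    for _ in range(epochs):
        for batch in batches:
            # L_D: rating loss plus L2 terms weighted by lambda_1, lambda_2 (assumed API)
            loss_d = deep_model.loss(batch, lam1=2.0, lam2=1e-4)
            opt_d.zero_grad(); loss_d.backward(); opt_d.step()
            # L_W: rating loss plus L2 term weighted by lambda_3 (assumed API)
            loss_w = wide_model.loss(batch, lam3=0.05)
            opt_w.zero_grad(); loss_w.backward(); opt_w.step()
```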
Step 8: as shown in FIG. 4, for a given target user, input the item set, the user rating matrix, the target user's social friend set, and the implicit social friend set into the trained deep graph model and the trained wide linear attention model respectively, obtaining the local and global rating predictions; compute the target user's rating of each item by formula (11) and select the K items with the highest predicted ratings to recommend to the target user, completing the item recommendation for the target user; a sketch of this top-K selection follows.
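Step 8 then reduces to scoring every candidate item with the fused prediction of formula (11) and keeping the top K, as in this sketch; the convex λ-weighting mirrors the reconstruction of formula (11) above.

```python
import heapq

def recommend(user, items, deep_predict, wide_predict, lam=0.5, K=10):
    """Fuse local and global predictions (formula (11)) and return the top-K items."""
    scored = []
    for item in items:
        r_local = deep_predict(user, item)   # trained deep graph model
        r_global = wide_predict(user, item)  # trained wide linear attention model
        scored.append((lam * r_local + (1 - lam) * r_global, item))
    top = heapq.nlargest(K, scored, key=lambda t: t[0])
    return [item for _, item in top]
```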
In this embodiment, an electronic device comprises a memory and a processor, the memory storing a program that supports the processor in executing the item recommendation method, and the processor being configured to execute the program stored in the memory.
In this embodiment, a computer-readable storage medium stores a computer program, and the computer program, when executed by a processor, performs the steps of the item recommendation method.
In conclusion, building on earlier social recommendation models, the method further strengthens robustness and generalization through a model that fuses semantic and structural views; it jointly considers feature learning based on explicit and implicit relations, model fusion, and the relative positions of user ratings, and it maintains recommendation accuracy and stability under varying degrees of social-relation imbalance and sparsity.

Claims (3)

1. A social-recommendation-oriented item recommendation method fusing semantic and structural views, characterized by comprising the following steps:
Step 1: define the implicit relations in social recommendation:
Step 1.1: let U = {u_1, …, u_i, …, u_M} denote the set of users, where u_i is any i-th user and 1 ≤ i ≤ M, and let V = {v_1, …, v_a, …, v_N} denote the set of items, where v_a is any a-th item and 1 ≤ a ≤ N;
let the user rating matrix R = {r_ia}_{M×N} record the ratings given by all users in the user set U to all items in the item set V, where r_ia is the rating of any i-th user u_i on any a-th item v_a;
let the user social matrix S = {s_ik}_{M×M} record whether each user in the user set U follows the other users, where s_ik indicates whether any i-th user u_i follows any k-th user u_k: if the i-th user u_i follows the k-th user u_k, then s_ik = 1; otherwise s_ik = 0;
let the social friend set of the i-th user u_i be F_U(i) = {u_k | s_ik = 1}, with 1 ≤ k ≤ M and i ≠ k;
Step 1.2: let the implicit social friend set of the i-th user u_i be H_U(i) = {u_ij | |{u_k : s_ik = 1 ∩ s_jk = 1}| ≥ τ}, where u_ij denotes an implicit social friend of the i-th user u_i, s_ik = 1 ∩ s_jk = 1 means that the k-th user u_k followed by the i-th user u_i is also followed by the j-th user u_j, and τ is a truncation threshold with τ ≥ 1, 1 ≤ j ≤ M, and j ≠ k;
obtain the user set U_ab ⊆ U of users who have rated both the a-th item v_a and the b-th item v_b; using formula (1), obtain the rating similarity sim_ab^i between the a-th item v_a and the b-th item v_b for the i-th user u_i in the user set U_ab:
[formula (1) is reproduced only as an image in the original publication]
In formula (1), r_ib is the rating of the i-th user u_i on any b-th item v_b, with 1 ≤ b ≤ N and a ≠ b;
using formula (2), obtain the cumulative rating similarity sim_ab between the a-th item v_a and the b-th item v_b over all users in the user set U_ab, and thereby the set of cumulative rating-similarity values between the a-th item v_a and the other items:
sim_ab = Σ_{u_i ∈ U_ab} sim_ab^i    (2)
sort the related items in descending order of cumulative rating similarity to obtain the implicit item relation set H_V(a) of the a-th item v_a;
Step 2: construct the heterogeneous information network and define the meta-paths:
take each user and each item as nodes; take each user's social friend set and implicit social friend set as the explicit and implicit relations between user nodes, and take each item's implicit item relation set as the implicit relation between item nodes; construct the edges between nodes accordingly, forming a heterogeneous information network HIN;
define five user meta-paths, comprising three single-hop user meta-paths and two double-hop user meta-paths; the three single-hop user meta-paths, each formed by two nodes and the edge connecting them, are: user-item, user-user, and user-implicit user, wherein user-item runs from a user node to an item node along their connecting edge, user-user runs from one user node to another user node along their connecting edge, and user-implicit user runs from a user node to an implicit-user node along their connecting edge;
the two double-hop user meta-paths, each formed by three nodes and the two edges connecting them, are: user-user-item and user-implicit user-item, wherein user-user-item runs from a user node to another user node and from that second user node on to an item node, and user-implicit user-item runs from a user node to an implicit-user node and from that implicit-user node on to an item node;
define three item meta-paths: item-user, item-implicit item, and item-implicit item-user, wherein item-user runs from an item node to a user node along their connecting edge, item-implicit item runs from an item node to an implicit-item node along their connecting edge, and item-implicit item-user runs from an item node to an implicit-item node and from that implicit-item node on to a user node;
Step 3: extract the local rating prediction:
Step 3.1: randomly remove some nodes and their associated edges from the heterogeneous information network HIN to obtain a preprocessed heterogeneous information network HIN';
Step 3.2: use an embedding layer to obtain, from HIN', the feature vector p_i ∈ R^{d'} of the i-th user u_i and the feature vector q_a ∈ R^{d'} of the a-th item v_a, where d' is the feature-vector dimension;
Step 3.3: build a deep graph model composed of an attention-based graph convolutional network, a user local-feature extraction module, an item local-feature extraction module, and a local prediction module;
Step 3.3.1: input p_i and q_a into the attention-based graph convolutional network for processing, obtaining the output vectors of the five user meta-paths, denoted here e_i^UV (user-item), e_i^UU (user-user), e_i^UUV (user-user-item), e_i^UH (user-implicit user), and e_i^UHV (user-implicit user-item);
Step 3.3.2: the user local-feature extraction module processes the output vectors of the five user meta-paths by formula (2) to obtain the local feature embedding h_i^local of the i-th user u_i:
h_i^local = MLP_user(e_i^UV ⊕ e_i^UU ⊕ e_i^UUV ⊕ e_i^UH ⊕ e_i^UHV)    (2)
In formula (2), ⊕ denotes the concatenation operation and MLP_user denotes the multilayer feed-forward neural network of the user local-feature extraction module;
Step 3.3.3: input the feature vector p_i of the i-th user u_i and the feature vector q_a of the a-th item v_a into the attention-based graph convolutional network for processing, obtaining the output vectors of the three item meta-paths, denoted here e_a^VU (item-user), e_a^VH (item-implicit item), and e_a^VHU (item-implicit item-user);
Step 3.3.4: the item local-feature extraction module processes the output vectors of the three item meta-paths by formula (3) and outputs the local feature embedding h_a^local of the a-th item v_a:
h_a^local = MLP_item(e_a^VU ⊕ e_a^VH ⊕ e_a^VHU)    (3)
In formula (3), MLP_item denotes the multilayer feed-forward neural network of the item local-feature extraction module;
Step 3.3.5: input h_i^local and h_a^local into the local prediction module and obtain, by formula (4), the local rating prediction r̂_ia^local of the i-th user u_i on the a-th item v_a:
r̂_ia^local = MLP_deep(h_i^local ⊕ h_a^local)    (4)
In formula (4), MLP_deep denotes the multilayer feed-forward neural network of the local prediction module;
Step 4: extract the global rating prediction:
Step 4.1: according to the implicit social friend set H_U(i) of the i-th user u_i, use an embedding layer to obtain from HIN the latent feature vector p'_i ∈ R^{d'} and the latent influence vector x_i ∈ R^{d'} of the i-th user u_i; according to the implicit item relation set H_V(a) of the a-th item v_a, use the embedding layer to obtain from HIN the latent feature vector q'_a ∈ R^{d'} and the latent influence vector y_a ∈ R^{d'} of the a-th item v_a;
Step 4.2: build a wide linear attention model composed of a user global-feature extraction module, an item global-feature extraction module, and a global prediction module;
Step 4.2.1: based on the three single-hop user meta-paths, the user global-feature extraction module obtains the global feature embedding h_i^global of the i-th user u_i through formulas (5)-(8):
[formulas (5)-(8) are reproduced only as images in the original publication]
In formula (8), α_ik ∈ α is the attention weight of the i-th user u_i on the k-th user u_k; β_ij ∈ β is the attention weight of the i-th user u_i on the j-th user u_ij among the implicit social friends; γ_ia ∈ γ is the attention weight of the i-th user u_i on the a-th item v_a; R_V(i) ∈ R is the set of items the i-th user u_i has rated; W_1, W_2, W_3, W_4, W_5, W_6 are six trainable parameter matrices and b_1, b_2, b_3 are three bias vectors; σ is an activation function, softmax denotes the normalization function, and T denotes transposition;
Step 4.2.2: the item global-feature extraction module obtains the global feature embedding h_a^global of the a-th item v_a by formula (9):
[formula (9) is reproduced only as an image in the original publication]
In formula (9), η_ab ∈ η is the attention weight of the a-th item v_a on the b-th implicit similar item v_b;
Step 4.2.3: input h_i^global and h_a^global into the global prediction module and obtain, by formula (10), the global rating prediction r̂_ia^global of the i-th user u_i on the a-th item v_a:
[formula (10) is reproduced only as an image in the original publication]
In formula (10), b_i is the user bias of the i-th user u_i, b_a is the item bias of the a-th item v_a, and μ is the global mean of all users' ratings over all items;
Step 5: obtain the rating prediction r̂_ia of the i-th user u_i on the a-th item v_a by formula (11), which weights the local and global predictions:
r̂_ia = λ · r̂_ia^local + (1 - λ) · r̂_ia^global    (11)
In formula (11), λ is the rating weight coefficient;
Step 6: construct the constraint on user rating behavior:
Step 6.1: define the rating triple (u, r, v) of a user on an item, where u ∈ U denotes a user entity, v ∈ V denotes an item entity, and r ∈ R denotes a rating relation; compute feature vectors by the TransH algorithm, comprising the user entity feature vector e_u, the item entity feature vector e_v, and the rating relation feature vector e_r;
obtain, by formulas (12) and (13), the projections of e_u and e_v onto the hyperplane of relation r, namely the user entity feature projection vector e_u^⊥ and the item entity feature projection vector e_v^⊥:
e_u^⊥ = e_u - w_r^T e_u w_r    (12)
e_v^⊥ = e_v - w_r^T e_v w_r    (13)
In formulas (12) and (13), w_r is the normal vector of the hyperplane;
Step 6.2: construct the scoring function f(u, r, v) of the rating triple (u, r, v) by formula (14):
f(u, r, v) = ||e_u^⊥ + e_r - e_v^⊥||_2^2    (14)
Step 6.3: construct the marginal loss function L_KG by formula (15):
L_KG = Σ_(u,r,v) Σ_(u,r,v') [f(u, r, v) + m - f(u, r, v')]_+    (15)
In formula (15), m is the margin, [f(·)]_+ denotes max(0, f(·)), v' ∈ V denotes another item entity, (u, r, v') denotes a false triple generated by replacing v with v', and f(u, r, v') is the scoring function of the false triple (u, r, v');
Step 7: construct the loss function L_D of the deep graph model:
[the expression for L_D is reproduced only as an image in the original publication]
where λ_1 and λ_2 are regularization parameters, P is the matrix formed by all user feature vectors, and Q is the matrix formed by all item feature vectors;
construct the loss function L_W of the wide linear attention model:
[the expression for L_W is reproduced only as an image in the original publication]
where λ_3 is a regularization parameter, P' is the matrix formed by all users' latent feature vectors, Q' is the matrix formed by all items' latent feature vectors, X is the matrix formed by all users' latent influence vectors, and Y is the matrix formed by all items' latent influence vectors;
train the deep graph model and the wide linear attention model separately by gradient descent, computing the loss functions L_D and L_W respectively and updating the model parameters until the losses converge, thereby obtaining a trained deep graph model and a trained wide linear attention model;
Step 8: for a given target user, input the item set, the user rating matrix, the target user's social friend set, and the implicit social friend set into the trained deep graph model and the trained wide linear attention model respectively, obtaining the target user's local and global rating predictions; compute the target user's rating of each item by formula (11) and select the K items with the highest predicted ratings to recommend to the target user, completing the item recommendation for the target user.
2. An electronic device comprising a memory and a processor, characterized in that the memory is configured to store a program that supports the processor in performing the item recommendation method of claim 1, and the processor is configured to execute the program stored in the memory.
3. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, carries out the steps of the item recommendation method of claim 1.
CN202211590930.8A 2022-12-12 2022-12-12 Article recommendation method integrating semantics and structural view for socialized recommendation Pending CN115935067A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211590930.8A CN115935067A (en) 2022-12-12 2022-12-12 Article recommendation method integrating semantics and structural view for socialized recommendation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211590930.8A CN115935067A (en) 2022-12-12 2022-12-12 Article recommendation method integrating semantics and structural view for socialized recommendation

Publications (1)

Publication Number Publication Date
CN115935067A 2023-04-07

Family

ID=86551893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211590930.8A Pending CN115935067A (en) 2022-12-12 2022-12-12 Article recommendation method integrating semantics and structural view for socialized recommendation

Country Status (1)

Country Link
CN (1) CN115935067A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116167828A (en) * 2023-04-25 2023-05-26 江苏亿友慧云软件股份有限公司 Article recommendation method based on graph cooperation and contrast learning


Similar Documents

Publication Publication Date Title
CN111428147B (en) Social recommendation method of heterogeneous graph volume network combining social and interest information
CN109299396B (en) Convolutional neural network collaborative filtering recommendation method and system fusing attention model
CN111222332B (en) Commodity recommendation method combining attention network and user emotion
CN108648049B (en) Sequence recommendation method based on user behavior difference modeling
CN109785062B (en) Hybrid neural network recommendation system based on collaborative filtering model
CN110807154A (en) Recommendation method and system based on hybrid deep learning model
CN110674850A (en) Image description generation method based on attention mechanism
CN104598611B (en) The method and system being ranked up to search entry
CN111881342A (en) Recommendation method based on graph twin network
CN109087178A (en) Method of Commodity Recommendation and device
Ortega et al. Providing reliability in recommender systems through Bernoulli matrix factorization
CN108647800B (en) Online social network user missing attribute prediction method based on node embedding
CN113761359B (en) Data packet recommendation method, device, electronic equipment and storage medium
CN114358657B (en) Post recommendation method and device based on model fusion
CN111695024A (en) Object evaluation value prediction method and system, and recommendation method and system
CN114298783A (en) Commodity recommendation method and system based on matrix decomposition and fusion of user social information
CN115618101A (en) Streaming media content recommendation method and device based on negative feedback and electronic equipment
Liphoto et al. A survey on recommender systems
CN115935067A (en) Article recommendation method integrating semantics and structural view for socialized recommendation
Hazrati et al. Entity representation for pairwise collaborative ranking using restricted Boltzmann machine
Nazari et al. Scalable and data-independent multi-agent recommender system using social networks analysis
CN112784177A (en) Spatial distance adaptive next interest point recommendation method
CN114842247B (en) Characteristic accumulation-based graph convolution network semi-supervised node classification method
Zhang et al. Probabilistic matrix factorization recommendation of self-attention mechanism convolutional neural networks with item auxiliary information
CN115829683A (en) Power integration commodity recommendation method and system based on inverse reward learning optimization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination