CN118094004A - Click rate prediction method and system based on knowledge enhancement and interest evolution - Google Patents


Info

Publication number
CN118094004A
CN118094004A (application number CN202410226217.8A)
Authority
CN
China
Prior art keywords: interest, vector, segment, user, potential
Prior art date
Legal status (assumed, not a legal conclusion)
Pending
Application number
CN202410226217.8A
Other languages
Chinese (zh)
Inventor
陈羽中
陈仕杰
陈子阳
Current Assignee
Fuzhou University
Original Assignee
Fuzhou University
Priority date
Filing date
Publication date
Application filed by Fuzhou University
Priority claimed from CN202410226217.8A
Publication of CN118094004A

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to a click rate prediction method and system based on knowledge enhancement and interest evolution, wherein the method comprises the following steps: Step A: collecting behavior data of users, including user IDs, target item IDs, item types and interaction times, and constructing a click rate prediction training set; Step B: training a deep learning network model G based on knowledge enhancement and interest evolution using the training set; Step C: inputting user and item data into the trained deep learning network model G and outputting the probability that the current user clicks the target item. The method and the system help to improve the accuracy of click rate prediction.

Description

Click rate prediction method and system based on knowledge enhancement and interest evolution
Technical Field
The invention belongs to the technical field of recommendation systems, and particularly relates to a click rate prediction method and a click rate prediction system based on knowledge enhancement and interest evolution.
Background
Click rate prediction is a key module of a recommendation system. It predicts the probability that a given user will click on a given item and ranks candidate items from high to low probability, generating a list of target items the user is likely to be interested in. This serves the user better by shielding them from massive amounts of information and helping them find valuable content. In many applications, click-through rate is a key business metric, and even minor improvements can translate into a significant increase in overall revenue when the user base is large. In click rate prediction, user profiles and large volumes of user behavior logs (browsing, adding to cart, purchasing, rating, and so on) implicitly encode the user's interests, so modeling behavior is an important topic in click rate prediction. However, rich behavior data also poses many challenges for click rate prediction models, and how to use such data to improve model performance has become a popular research direction.
However, while existing models achieve performance improvements to some extent, they still have several major weaknesses. First, existing models simply feed all items of the entire user behavior sequence into a GRU, yet interest evolution does not exist between every pair of adjacent items; it should instead be captured between items that are actually related. Second, they only consider fine-grained interest evolution at the item level while ignoring coarse-grained interest evolution at the entity level, which can represent the user's overall interest evolution. In addition, existing models typically use only the original vector of the target item, whereas the influence of the user neighbors around the target item node can reflect the different interests that different users have in the same item. Finally, each user has unique interests, and it is important for the model to mine them. A knowledge graph can flexibly model comprehensive auxiliary data, storing external heterogeneous knowledge as (head entity, relation, tail entity) triples; because of this rich external knowledge, combining a knowledge graph can enrich the representations of users and items and improve the interpretability of recommendations.
Disclosure of Invention
The invention aims to provide a click rate prediction method and a click rate prediction system based on knowledge enhancement and interest evolution, which are beneficial to improving the accuracy of click rate prediction.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows: a click rate prediction method based on knowledge enhancement and interest evolution comprises the following steps:
Step A: collecting behavior data of a user, including a user ID, a target object ID, object types and interaction time, and constructing a click rate prediction training set;
and (B) step (B): training a deep learning network model G based on knowledge enhancement and interest evolution by using a training set;
Step C: and inputting the user and article data into the trained deep learning network model G, and outputting the click probability of the current user on the target article.
Further, the step B specifically includes the following steps:
Step B1: constructing a knowledge graph G kg, an article co-occurrence graph G cf and a user-article interaction graph G interact; each sample in the training set contains a user ID, a target item ID, and a sequence of actions of the user;
Step B2: segmenting a sample in the training set according to the knowledge graph G kg to obtain an interest segment sequence consisting of a plurality of interest segments Sequence of interest segment/>Divided into sequences of segments of strong interest/>And weak interest segment sequence/>Inputting the vector e s of the strong interest segment sequence and the vector e w of the weak interest segment sequence into an embedding layer; all entities in the knowledge graph are added into an entity sequence S entity and then input into an embedded layer, so as to obtain a vector e e of the entities; in the knowledge graph G kg and the item co-occurrence graph G cf, a semantic-based potential interest segment sequence/>, based on the shortest path between each item in the sample and the target item, is constructedAnd similarity-based potential interest segment sequences/>Inputting the vector e k of the potential interest segment sequence based on the semantics and the vector e c of the potential interest segment sequence based on the similarity into an embedding layer; constructing a unique interest segment sequence/>, according to the occurrence frequency of the entity in the knowledge graph G kg Inputting the vector e unique into the embedded layer to obtain a vector e unique of the unique interest segment sequence; inputting the user ID and the target object ID into the embedded layer to obtain e u and e t;
Step B3: vector each item within a single segment of high interest and segment of low interest And/>Inputting the information into GRU, capturing fine-granularity interest evolution of user in interest segment, and updating the item vectors to/>, respectivelyAnd/>For each interest segment, dividing according to the relation in the knowledge graph, and for each relation, carrying out entity vector/>, corresponding to the interest segmentSequencing according to the first article of the interest segments, inputting the first article into the GRU, capturing coarse-granularity interest evolution of the user among the interest segments, and capturing entity vectors/>Updated as/>Will/>For each item vector/>, as a query, in a corresponding interest segmentAttention is paid to calculate the relevance of each object and entity to obtain Duan Naju aggregate vectors/>, of the strong interest segmentsAnd Duan Naju aggregate vector/>, of the weak interest segment
Step B4: according to the user-object interaction graph G interact, using the GCN, vector e u of the user interacted with the target object is aggregated into target vector e t, and the vector of the target object is updatedVector/>, object itemDuan Naju aggregate vector/>, as a query, for each strong interest segmentPerforming attention calculation to obtain a vector u s with strong interest; strong interest vector u s and target item vector/>Stitching is performed as a query, and Duan Naju joint vectors/>, for each weak interest segmentPerforming attention calculation to obtain a vector u w of weak interest; vector/>, object itemAs a query, performing double-layer attention on a vector e unique of the unique interest segment sequence to obtain a vector u unique of the unique interest;
step B5: vector of target articles As a query, performing a double-layer attention mechanism and a multi-head attention mechanism on vectors e k and e c of the potential interest segment sequence to obtain vectors u k and u c of the potential interest vector; performing a contrast mechanism on u k and u c to capture complementary information and distinguishing information between two potential segments of interest; taking two potential interests from the same user as a pair of positive samples, and regarding the potential interests of different users as negative samples to obtain a contrast loss L cl;
Step B6: the vector e u of the user and the vector of the target object obtained in the steps B2, B4 and B5 are calculated The strong interest vector u s, the weak interest vector u w, the potential interest vectors u k and u c and the unique interest vector u unique are spliced together and input into a multi-layer perceptron to perform click rate prediction, the predicted click rate is obtained, and the prediction loss is calculated by using a cross entropy function; then calculating the gradient of each parameter in the deep learning network model by a back propagation method according to the target loss function, and updating each parameter by a random gradient descent method;
Step B7: and when the loss value generated by the deep learning network model is smaller than a set threshold value or the maximum iteration number is reached, training of the deep learning network model G is terminated.
Further, the step B1 specifically includes the following steps:
Step B11: constructing a knowledge graph G kg for all the articles by using the Freebase, if mapping is available, mapping the articles to Freebase entities by title matching, and if the articles and the entities have a certain relation, an edge is arranged between the articles and the entities;
Step B12: according to the interaction between the user and the articles, an article co-occurrence graph G cf is constructed, and if the two articles occur simultaneously in the behavior sequence of the user, an edge is arranged between the two articles;
step B13: based on the interactions between the user and the item, a user-item interaction graph G interact is constructed, with an edge between the user and the item if there is an interaction between the user and the item.
Further, the step B2 specifically includes the following steps:
Step B21: according to the knowledge graph G kg obtained in the step B11, storing the articles with edges of the same entity in the user behavior sequence into an interest segment, sorting the articles in the interest segment according to the interaction time of the user and the articles from small to large, traversing the single user behavior sequence, and obtaining an interest segment sequence comprising a plurality of interest segments
Step B22: segmenting the interest segments obtained in the step B21 according to the number of the articles in the segments, wherein the number of the articles is larger than a threshold value tau and is regarded as a strong interest segment, and the rest is regarded as a weak interest segment; adding a strong interest segment and a weak interest segment to a strong interest segment sequence, respectivelyAnd weak interest segment sequence/>Inputting the vector e s of the strong interest segment sequence and the vector e w of the weak interest segment sequence into an embedding layer; all entities in the knowledge graph are added into an entity sequence S entity and then input into an embedded layer, so as to obtain a vector e e of the entities;
Step B23: finding the shortest path between each article in the sample and the target article according to the knowledge graph G kg obtained in the step B11, adding nodes appearing in the path into a potential interest segment, and adding the potential interest segment into a semantic-based potential interest segment sequence Inputting the vector into an embedding layer to obtain a vector e k of a potential interest segment sequence based on the semantics;
Step B24: according to the item co-occurrence graph G cf obtained in the step B12, finding the shortest path between each item in the sample and the target item, adding nodes appearing in the path into a potential interest segment, and adding the potential interest segment into a potential interest segment sequence based on similarity Inputting the vector into an embedding layer to obtain a vector e c of the potential interest segment sequence based on the similarity;
step B25: according to the knowledge graph G kg obtained in the step B11, ordering all entities from small to large according to the occurrence frequency to obtain an entity sequence S entity, traversing the entity sequence S entity, selecting an interest segment corresponding to the entity to appear in a user behavior sequence, taking the interest segment as a unique interest segment of a user, and adding the unique interest segment into the unique interest segment sequence Until unique segment sequence/>The method comprises the steps of inputting mu interest segments into an embedded layer to obtain a vector e unique of a unique interest segment sequence;
Step B26: the user ID and the target item ID are entered into the embedded layer, resulting in e u and e t.
Further, the step B3 specifically includes the following steps:
Step B31: vector each item within a single segment of high interest and segment of low interest And/>Input into GRU, capture fine granularity interest evolution of user in interest segment, and update the vector of jth article in ith interest segment to/>, respectivelyAnd
Step B32: dividing each interest segment according to the relation in the knowledge graph; for each relation, the entity vector corresponding to the interest segmentSequencing according to the first article of the interest segments, inputting the first article into the GRU, capturing coarse-granularity interest evolution of the user among the interest segments, and vector/>, of the ith entityUpdated as/>
Step B33: the step B22 is carried outFor each item vector/>, as a query, in a corresponding interest segmentAttention is paid to calculate the relevance of each object and entity to obtain Duan Naju aggregate vectors/>, of the strong interest segmentsAnd Duan Naju aggregate vector/>, of the weak interest segment
Wherein,And/>Representing the attention coefficients, W 1 and W 2 represent trainable parameters, N s represents the number of items in the strong interest segment, and N w represents the number of items in the weak interest segment.
Further, the step B4 specifically includes the following steps:
Step B41: according to the user-object interaction graph G interact obtained in the step B13, using GCN, aggregating the vector e u of the user interacted with the object into the object vector e t to obtain the updated vector of the object
Step B42: the target object vector obtained in the step B41Duan Naju aggregate vector/>, as a query, for each strong interest segmentAnd (3) performing attention calculation to obtain a vector u s with strong interest:
Wherein, Representing the attention coefficient, W 2 representing the trainable parameter, L s representing the number of segments of strong interest;
Step B43: the strong interest vector u s obtained in the step B42 and the target object vector obtained in the step B41 are combined Stitching is performed as a query, and Duan Naju joint vectors/>, for each weak interest segmentAnd (3) performing attention calculation to obtain a vector u w of weak interest:
Wherein, Representing the attention coefficient, W 4 representing the trainable parameter, L w representing the number of weak segments of interest;
Step B44: the target object vector obtained in the step B41 As a query, for each item vector/>, in a unique interest segmentAttention is paid to calculate the correlation between each object and the target object to obtain Duan Naju aggregate vectors/>, of the unique interest segments
Wherein,Representing an attention coefficient, W 5 representing a trainable parameter, N unique representing the number of items within a unique segment of interest;
Step B45: the target object vector obtained in the step B41 Duan Naju aggregate vector/>, as a query, for each unique segment of interestPerforming attention calculation to obtain a vector u unique of unique interests:
Wherein, Representing the attention factor, W 6 representing the trainable parameter, L unique representing the number of unique segments of interest.
Further, the step B5 specifically includes the following steps:
Step B51: the target object vector obtained in the step B41 As a query, vector/>, for each semantic-based potential interest segmentPerforming attention calculation to obtain Duan Naju joint vectors/>, based on semantic potential interest segments
Wherein,Representing an attention coefficient, W 7 representing a trainable parameter, N k representing a number of items within the semantic-based potential segment of interest;
Step B52: vector Duan Naju of all semantic-based potential interest segments Splicing to obtain x k, inputting the x k into the multi-head self-attention to obtain a Duan Naju combined vector H k of the thinned semantic-based potential interest segment sequence;
Where head h represents the output of the h attention function, d h represents the dimension of each head, n head represents the number of attention functions, and W 8 represents the trainable parameters;
Step B53: the target object vector obtained in the step B41 As a query, vector/>, for each semantic-based potential interest segmentPerforming attention calculation to obtain a semantic-based potential interest vector u k:
Wherein, Representing the attention coefficient, W 9 representing the trainable parameter, L k representing the number of potential segments of interest based on semantics;
step B54: the target object vector obtained in the step B41 As a query, for each item vector/>, in a similarity-based segment of potential interestPerforming attention calculation to obtain the vector/>, based on the similarity, of the potential interest segment
Wherein,Representing an attention coefficient, W 10 representing a trainable parameter, N c representing a number of items within the semantic-based potential segment of interest;
Step B55: vector Duan Naju of all similarity-based potential interest segments Splicing to obtain x c, inputting the x c into the multi-head self-attention to obtain a Duan Naju combined vector H c of the thinned potential interest segment sequence based on similarity;
Where head h represents the output of the h attention function, d h represents the dimension of each head, n head represents the number of attention functions, and W 11 represents the trainable parameters;
step B56: the target object vector obtained in the step B41 As a query, vector/>, for each similarity-based potential interest segmentAttention calculations are made to arrive at a representation u c of potential interest vectors based on similarity:
Wherein, Representing the attention coefficient, W 12 representing the trainable parameter, L c representing the number of potential segments of interest based on semantics;
step B57: taking interests of the same user from two potential interest segments as a pair of positive samples, treating interests of different users as a pair of negative samples, and calculating a contrast loss L cl:
Where N u denotes the number of users, Representing the semantic-based potential interests of the remaining users in a batch.
Further, the step B6 specifically includes the following steps:
Step B61: vector e u of user and vector of target object And the strong interest vector u s, the weak interest vector representation u w, the potential interest vector vectors u k and u c and the unique interest vector u unique obtained in the steps B2, B4 and B5 are spliced together and input into a multi-layer perceptron MLP for click rate prediction to obtain a predicted click rate:
Step B62: calculating a prediction loss by using a cross entropy function L target, updating a learning rate by using a gradient optimization algorithm Adam, and updating model parameters by using back propagation iteration to train a model by using a minimum loss function; the model total loss is the loss weighted addition:
Lall=Ltarget+λLcl
Where N represents the number of samples, y i is the corresponding label, λ is the hyper-parameter, L target is the loss function of click rate prediction, and L cl is the contrast loss function of the potential interest vector.
The invention also provides a click rate prediction system adopting the above method, which comprises:
The training set construction module is used for collecting behavior data of a user and constructing a click rate prediction training set;
The model training module is used for training a deep learning network model G based on knowledge enhancement and interest evolution; and
The click rate prediction module is used for receiving user and item data, inputting them into the trained deep learning network model G, and outputting the probability that the current user clicks the target item.
Compared with the prior art, the invention has the following beneficial effects: the method subdivides the user's interests into strong interests, weak interests, semantic-based potential interests, similarity-based potential interests and unique interests according to the knowledge graph, the item co-occurrence graph and the user-item interaction graph, extracting the user's interests from different angles while capturing the user's interest evolution at both fine and coarse granularity. In addition, by enriching the target item vector and introducing a contrastive learning task, the method captures the complementary relation between the two potential interest vectors obtained from the knowledge graph and the item co-occurrence graph respectively, so that the user's interests are captured comprehensively, items that the user has not interacted with but that match the user's interests are introduced, and the model can learn more interaction information.
Drawings
FIG. 1 is a flow chart of a method implementation of an embodiment of the present invention;
FIG. 2 is a schematic diagram of a deep learning network model in an embodiment of the invention;
fig. 3 is a schematic diagram of a system structure according to an embodiment of the present invention.
Detailed Description
The invention will be further described with reference to the accompanying drawings and examples.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present application. As used herein, the singular is also intended to include the plural unless the context clearly indicates otherwise, and furthermore, it is to be understood that the terms "comprises" and/or "comprising" when used in this specification are taken to specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof.
As shown in fig. 1, the present embodiment provides a click rate prediction method based on knowledge enhancement and interest evolution, which includes the following steps:
step A: and collecting behavior data of the user, including user ID, target object ID, object type and interaction time, and constructing a click rate prediction training set.
And (B) step (B): the deep learning network model G based on knowledge enhancement and interest evolution is trained using a training set.
Step C: and inputting the user and article data into the trained deep learning network model G, and outputting the click probability of the current user on the target article.
The architecture of the deep learning network model in this embodiment is shown in fig. 2. In this embodiment, the step B specifically includes the following steps:
Step B1: constructing a knowledge graph G kg, an article co-occurrence graph G cf and a user-article interaction graph G interact; each sample in the training set contains a user ID, a target item ID, and a sequence of actions of the user.
The step B1 specifically comprises the following steps:
Step B11: constructing a knowledge graph G kg for all the articles by using the Freebase, if mapping is available, mapping the articles to Freebase entities by title matching, and if the articles and the entities have a certain relation, an edge is arranged between the articles and the entities;
Step B12: according to the interaction between the user and the articles, an article co-occurrence graph G cf is constructed, and if the two articles occur simultaneously in the behavior sequence of the user, an edge is arranged between the two articles;
step B13: based on the interactions between the user and the item, a user-item interaction graph G interact is constructed, with an edge between the user and the item if there is an interaction between the user and the item.
Step B2: segmenting a sample in the training set according to the knowledge graph G kg to obtain an interest segment sequence consisting of a plurality of interest segmentsSequence of interest segment/>Divided into sequences of segments of strong interest/>And weak interest segment sequence/>Inputting the vector e s of the strong interest segment sequence and the vector e w of the weak interest segment sequence into an embedding layer; all entities in the knowledge graph are added into an entity sequence S entity and then input into an embedded layer, so as to obtain a vector e e of the entities; in the knowledge graph G kg and the item co-occurrence graph G cf, a semantic-based potential interest segment sequence/>, based on the shortest path between each item in the sample and the target item, is constructedAnd similarity-based potential interest segment sequences/>Inputting the vector e k of the potential interest segment sequence based on the semantics and the vector e c of the potential interest segment sequence based on the similarity into an embedding layer; constructing a unique interest segment sequence/>, according to the occurrence frequency of the entity in the knowledge graph G kg Inputting the vector e unique into the embedded layer to obtain a vector e unique of the unique interest segment sequence; the user ID and the target item ID are entered into the embedded layer, resulting in e u and e t.
The step B2 specifically comprises the following steps:
Step B21: according to the knowledge graph G kg obtained in the step B11, storing the articles with edges of the same entity in the user behavior sequence into an interest segment, sorting the articles in the interest segment according to the interaction time of the user and the articles from small to large, traversing the single user behavior sequence, and obtaining an interest segment sequence comprising a plurality of interest segments
Step B22: segmenting the interest segments obtained in the step B21 according to the number of the articles in the segments, wherein the number of the articles is larger than a threshold value tau and is regarded as a strong interest segment, and the rest is regarded as a weak interest segment; adding a strong interest segment and a weak interest segment to a strong interest segment sequence, respectivelyAnd weak interest segment sequence/>Inputting the vector e s of the strong interest segment sequence and the vector e w of the weak interest segment sequence into an embedding layer; all entities in the knowledge graph are added into an entity sequence S entity and then input into an embedded layer, so as to obtain a vector e e of the entities;
Step B23: finding the shortest path between each article in the sample and the target article according to the knowledge graph G kg obtained in the step B11, adding nodes appearing in the path into a potential interest segment, and adding the potential interest segment into a semantic-based potential interest segment sequence Inputting the vector into an embedding layer to obtain a vector e k of a potential interest segment sequence based on the semantics;
Step B24: according to the item co-occurrence graph G cf obtained in the step B12, finding the shortest path between each item in the sample and the target item, adding nodes appearing in the path into a potential interest segment, and adding the potential interest segment into a potential interest segment sequence based on similarity Inputting the vector into an embedding layer to obtain a vector e c of the potential interest segment sequence based on the similarity;
step B25: according to the knowledge graph G kg obtained in the step B11, ordering all entities from small to large according to the occurrence frequency to obtain an entity sequence S entity, traversing the entity sequence S entity, selecting an interest segment corresponding to the entity to appear in a user behavior sequence, taking the interest segment as a unique interest segment of a user, and adding the unique interest segment into the unique interest segment sequence Until unique segment sequence/>The method comprises the steps of inputting mu interest segments into an embedded layer to obtain a vector e unique of a unique interest segment sequence;
Step B26: the user ID and the target item ID are entered into the embedded layer, resulting in e u and e t.
Step B3: vector each item within a single segment of high interest and segment of low interestAnd/>Inputting the information into GRU, capturing fine-granularity interest evolution of user in interest segment, and updating the item vectors to/>, respectivelyAnd/>For each interest segment, dividing according to the relation in the knowledge graph, and for each relation, carrying out entity vector/>, corresponding to the interest segmentSequencing according to the first article of the interest segments, inputting the first article into the GRU, capturing coarse-granularity interest evolution of the user among the interest segments, and capturing entity vectors/>Updated as/>Will/>For each item vector/>, as a query, in a corresponding interest segmentAttention is paid to calculate the relevance of each object and entity to obtain Duan Naju aggregate vectors/>, of the strong interest segmentsAnd Duan Naju aggregate vector/>, of the weak interest segment
The step B3 specifically comprises the following steps:
Step B31: vector each item within a single segment of high interest and segment of low interest And/>Input into GRU, capture fine granularity interest evolution of user in interest segment, and update the vector of jth article in ith interest segment to/>, respectivelyAnd
Step B32: dividing each interest segment according to the relation in the knowledge graph; for each relation, the entity vector corresponding to the interest segmentSequencing according to the first article of the interest segments, inputting the first article into the GRU, capturing coarse-granularity interest evolution of the user among the interest segments, and vector/>, of the ith entityUpdated as/>
Step B33: the step B22 is carried outFor each item vector/>, as a query, in a corresponding interest segmentAttention is paid to calculate the relevance of each object and entity to obtain Duan Naju aggregate vectors/>, of the strong interest segmentsAnd Duan Naju aggregate vector/>, of the weak interest segment
Wherein,And/>Representing the attention coefficients, W 1 and W 2 represent trainable parameters, N s represents the number of items in the strong interest segment, and N w represents the number of items in the weak interest segment.
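The attention formulas referenced above are not reproduced here; the PyTorch sketch below gives one plausible reading of steps B31 and B33, in which a GRU refines the item vectors of a segment and the segment's entity vector serves as the attention query. The module name, the tensor shapes and the single projection standing in for W_1 or W_2 are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SegmentInterestEvolution(nn.Module):
    """Fine-grained interest evolution inside a segment (step B31) followed by
    entity-guided attention aggregation (step B33)."""
    def __init__(self, dim):
        super().__init__()
        self.gru = nn.GRU(dim, dim, batch_first=True)    # evolution over the item order
        self.proj = nn.Linear(dim, dim, bias=False)       # plays the role of W_1 / W_2

    def forward(self, item_vecs, entity_vec):
        # item_vecs: (batch, seg_len, dim)  items of one segment, time-ordered
        # entity_vec: (batch, dim)          segment entity vector used as the query
        hidden, _ = self.gru(item_vecs)                                   # updated item vectors
        scores = (self.proj(hidden) * entity_vec.unsqueeze(1)).sum(-1)    # (batch, seg_len)
        alpha = F.softmax(scores, dim=-1)                                 # attention coefficients
        return (alpha.unsqueeze(-1) * hidden).sum(1)                      # intra-segment aggregation vector
```

The same module would be instantiated separately for strong and weak interest segments (separate parameters for W_1 and W_2), and a second GRU run over the segment-level entity vectors would give the coarse-grained evolution of step B32.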
Step B4: according to the user-object interaction graph G interact, using the GCN, vector e u of the user interacted with the target object is aggregated into target vector e t, and the vector of the target object is updatedVector/>, object itemDuan Naju aggregate vector/>, as a query, for each strong interest segmentPerforming attention calculation to obtain a vector u s with strong interest; strong interest vector u s and target item vector/>Stitching is performed as a query, and Duan Naju joint vectors/>, for each weak interest segmentPerforming attention calculation to obtain a vector u w of weak interest; vector/>, object itemAs a query, a double-layer attention is performed on the vector e unique of the unique segment sequence, resulting in a vector u unique of unique interest.
The step B4 specifically comprises the following steps:
Step B41: according to the user-object interaction graph G interact obtained in the step B13, using GCN, aggregating the vector e u of the user interacted with the object into the object vector e t to obtain the updated vector of the object
Step B42: the target object vector obtained in the step B41Duan Naju aggregate vector/>, as a query, for each strong interest segmentAnd (3) performing attention calculation to obtain a vector u s with strong interest: /(I)
Wherein,Representing the attention coefficient, W 2 representing the trainable parameter, L s representing the number of segments of strong interest;
Step B43: the strong interest vector u s obtained in the step B42 and the target object vector obtained in the step B41 are combined Stitching is performed as a query, and Duan Naju joint vectors/>, for each weak interest segmentAnd (3) performing attention calculation to obtain a vector u w of weak interest:
Wherein, Representing the attention coefficient, W 4 representing the trainable parameter, L w representing the number of weak segments of interest;
Step B44: the target object vector obtained in the step B41 As a query, for each item vector/>, in a unique interest segmentAttention is paid to calculate the relativity of each article and the target article, and the Duan Naju total vector of the unique interest segment is obtained
Wherein,Representing an attention coefficient, W 5 representing a trainable parameter, N unique representing the number of items within a unique segment of interest;
Step B45: the target object vector obtained in the step B41 Duan Naju aggregate vector/>, as a query, for each unique segment of interestPerforming attention calculation to obtain a vector u unique of unique interests:
Wherein, Representing the attention factor, W 6 representing the trainable parameter, L unique representing the number of unique segments of interest.
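As a non-limiting sketch of steps B41 to B45, the code below enriches the target item vector with one mean-aggregation, GCN-style layer over G_interact and pools segment vectors with dot-product attention; the layer structure, shapes and module names are assumptions rather than the patented formulas.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetEnrichAndPool(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.gcn = nn.Linear(2 * dim, dim)           # one GCN-style layer over G_interact
        self.att = nn.Linear(dim, dim, bias=False)   # stands in for W_2 / W_4 / W_6

    def enrich_target(self, target_vec, neighbour_user_vecs):
        # Step B41: aggregate the vectors of users who interacted with the target item.
        # target_vec: (batch, dim), neighbour_user_vecs: (batch, n_users, dim)
        agg = neighbour_user_vecs.mean(dim=1)
        return torch.relu(self.gcn(torch.cat([target_vec, agg], dim=-1)))

    def pool_segments(self, query, segment_vecs):
        # Steps B42/B43/B45: attention of the query over intra-segment aggregation vectors.
        # query: (batch, dim), segment_vecs: (batch, n_seg, dim)
        scores = (self.att(segment_vecs) * query.unsqueeze(1)).sum(-1)   # (batch, n_seg)
        alpha = F.softmax(scores, dim=-1)
        return (alpha.unsqueeze(-1) * segment_vecs).sum(1)
```

In step B43 the query would be formed from the concatenation of u_s and the updated target item vector; the sketch keeps a single query tensor for brevity.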
Step B5: vector of target articlesAs a query, performing a double-layer attention mechanism and a multi-head attention mechanism on vectors e k and e c of the potential interest segment sequence to obtain vectors u k and u c of the potential interest vector; performing a contrast mechanism on u k and u c to capture complementary information and distinguishing information between two potential segments of interest; two potential interests from the same user are taken as a pair of positive samples, and the potential interests of different users are taken as negative samples, resulting in a contrast loss L cl.
The step B5 specifically comprises the following steps:
Step B51: the target object vector obtained in the step B41 As a query, vector/>, for each semantic-based potential interest segmentPerforming attention calculation to obtain Duan Naju joint vectors/>, based on semantic potential interest segments
Wherein,Representing an attention coefficient, W 7 representing a trainable parameter, N k representing a number of items within the semantic-based potential segment of interest;
Step B52: vector Duan Naju of all semantic-based potential interest segments Splicing to obtain x k, inputting the x k into the multi-head self-attention to obtain a Duan Naju combined vector H k of the thinned semantic-based potential interest segment sequence;
Where head h represents the output of the h attention function, d h represents the dimension of each head, n head represents the number of attention functions, and W 8 represents the trainable parameters;
Step B53: the target object vector obtained in the step B41 As a query, vector/>, for each semantic-based potential interest segmentPerforming attention calculation to obtain a semantic-based potential interest vector u k:
Wherein, Representing the attention coefficient, W 9 representing the trainable parameter, L k representing the number of potential segments of interest based on semantics;
step B54: the target object vector obtained in the step B41 As a query, for each item vector/>, in a similarity-based segment of potential interestPerforming attention calculation to obtain the vector/>, based on the similarity, of the potential interest segment/>
Wherein,Representing an attention coefficient, W 10 representing a trainable parameter, N c representing a number of items within the semantic-based potential segment of interest;
Step B55: vector Duan Naju of all similarity-based potential interest segments Splicing to obtain x c, inputting the x c into the multi-head self-attention to obtain a Duan Naju combined vector H c of the thinned potential interest segment sequence based on similarity;
Where head h represents the output of the h attention function, d h represents the dimension of each head, n head represents the number of attention functions, and W 11 represents the trainable parameters;
step B56: the target object vector obtained in the step B41 As a query, vector/>, for each similarity-based potential interest segmentAttention calculations are made to arrive at a representation u c of potential interest vectors based on similarity:
Wherein, Representing the attention coefficient, W 12 representing the trainable parameter, L c representing the number of potential segments of interest based on semantics;
step B57: taking interests of the same user from two potential interest segments as a pair of positive samples, treating interests of different users as a pair of negative samples, and calculating a contrast loss L cl:
Where N u denotes the number of users, Representing the semantic-based potential interests of the remaining users in a batch.
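Step B57 treats the two potential interest vectors of one user as a positive pair and those of other users in the batch as negatives; since the exact formula is not reproduced here, the sketch below uses a symmetric InfoNCE-style loss as one consistent interpretation, with the temperature tau_cl an assumed hyper-parameter.

```python
import torch
import torch.nn.functional as F

def potential_interest_contrastive_loss(u_k, u_c, tau_cl=0.1):
    """u_k, u_c: (batch, dim) semantic-based and similarity-based potential interest
    vectors of the same batch of users; row i of u_k and row i of u_c form a
    positive pair, all other rows act as negatives."""
    u_k = F.normalize(u_k, dim=-1)
    u_c = F.normalize(u_c, dim=-1)
    logits = u_k @ u_c.t() / tau_cl                  # (batch, batch) pairwise similarities
    labels = torch.arange(u_k.size(0), device=u_k.device)
    # symmetric InfoNCE: match each u_k to its own u_c and vice versa
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))
```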
Step B6: the vector e u of the user and the vector of the target object obtained in the steps B2, B4 and B5 are calculatedThe strong interest vector u s, the weak interest vector u w, the potential interest vectors u k and u c and the unique interest vector u unique are spliced together and input into a multi-layer perceptron to perform click rate prediction, the predicted click rate is obtained, and the prediction loss is calculated by using a cross entropy function; and then calculating the gradient of each parameter in the deep learning network model by a back propagation method according to the target loss function, and updating each parameter by a random gradient descent method.
The step B6 specifically comprises the following steps:
Step B61: vector e u of user and vector of target object And the strong interest vector u s, the weak interest vector representation u w, the potential interest vector vectors u k and u c and the unique interest vector u unique obtained in the steps B2, B4 and B5 are spliced together and input into a multi-layer perceptron MLP for click rate prediction to obtain a predicted click rate:
Step B62: calculating a prediction loss by using a cross entropy function L target, updating a learning rate by using a gradient optimization algorithm Adam, and updating model parameters by using back propagation iteration to train a model by using a minimum loss function; the model total loss is the loss weighted addition:
Lall=Ltarget+λLcl
Where N represents the number of samples, y i is the corresponding label, λ is the hyper-parameter, L target is the loss function of click rate prediction, and L cl is the contrast loss function of the potential interest vector.
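A minimal sketch of steps B61 and B62 follows, assuming seven concatenated vectors of equal dimension and a two-layer MLP with sigmoid output; the hidden size and layer count are placeholders, and the total loss follows L_all = L_target + λ·L_cl with L_target the binary cross-entropy.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ClickPredictor(nn.Module):
    def __init__(self, dim, hidden=128):
        super().__init__()
        # e_u, updated e_t, u_s, u_w, u_k, u_c, u_unique -> 7 vectors of size dim
        self.mlp = nn.Sequential(
            nn.Linear(7 * dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, e_u, e_t_upd, u_s, u_w, u_k, u_c, u_unique):
        x = torch.cat([e_u, e_t_upd, u_s, u_w, u_k, u_c, u_unique], dim=-1)
        return torch.sigmoid(self.mlp(x)).squeeze(-1)   # predicted click probability

def total_loss(pred, label, l_cl, lam):
    # L_all = L_target + lambda * L_cl
    l_target = F.binary_cross_entropy(pred, label.float())
    return l_target + lam * l_cl
```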
Step B7: and when the loss value generated by the deep learning network model is smaller than a set threshold value or the maximum iteration number is reached, training of the deep learning network model G is terminated.
As shown in fig. 3, the present embodiment further provides a click rate prediction system adopting the above method, including:
The training set construction module is used for collecting behavior data of a user and constructing a click rate prediction training set;
the model training module is used for training a deep learning network model G based on knowledge enhancement and interest evolution; and
The click rate prediction module is used for receiving user and item data, inputting them into the trained deep learning network model G, and outputting the probability that the current user clicks the target item.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention and is not intended to limit the invention in any way. Any person skilled in the art may modify or vary the disclosed technical content to obtain equivalent embodiments; however, any simple modification, equivalent change or variation of the above embodiments made according to the technical substance of the present invention still falls within the protection scope of the technical solution of the present invention.

Claims (9)

1. A click rate prediction method based on knowledge enhancement and interest evolution, characterized by comprising the following steps:
Step A: collecting behavior data of a user, including a user ID, a target object ID, object types and interaction time, and constructing a click rate prediction training set;
and (B) step (B): training a deep learning network model G based on knowledge enhancement and interest evolution by using a training set;
Step C: and inputting the user and article data into the trained deep learning network model G, and outputting the click probability of the current user on the target article.
2. The click-through rate prediction method based on knowledge enhancement and interest evolution according to claim 1, wherein the step B specifically comprises the following steps:
Step B1: constructing a knowledge graph G kg, an article co-occurrence graph G cf and a user-article interaction graph G interact; each sample in the training set contains a user ID, a target item ID, and a sequence of actions of the user;
Step B2: segmenting a sample in the training set according to the knowledge graph G kg to obtain an interest segment sequence consisting of a plurality of interest segments Sequence of interest segment/>Divided into sequences of segments of strong interest/>And weak interest segment sequence/>Inputting the vector e s of the strong interest segment sequence and the vector e w of the weak interest segment sequence into an embedding layer; all entities in the knowledge graph are added into an entity sequence S entity and then input into an embedded layer, so as to obtain a vector e e of the entities; in the knowledge graph G kg and the item co-occurrence graph G cf, a semantic-based potential interest segment sequence/>, based on the shortest path between each item in the sample and the target item, is constructedAnd similarity-based potential interest segment sequences/>Inputting the vector e k of the potential interest segment sequence based on the semantics and the vector e c of the potential interest segment sequence based on the similarity into an embedding layer; constructing a unique interest segment sequence/>, according to the occurrence frequency of the entity in the knowledge graph G kg Inputting the vector e unique into the embedded layer to obtain a vector e unique of the unique interest segment sequence; inputting the user ID and the target object ID into the embedded layer to obtain e u and e t;
Step B3: vector each item within a single segment of high interest and segment of low interest And/>Inputting the information into GRU, capturing fine-granularity interest evolution of user in interest segment, and updating the item vectors to/>, respectivelyAnd/>For each interest segment, dividing according to the relation in the knowledge graph, and for each relation, carrying out entity vector/>, corresponding to the interest segmentSequencing according to the first article of the interest segments, inputting the first article into the GRU, capturing coarse-granularity interest evolution of the user among the interest segments, and capturing entity vectors/>Updated toWill/>For each item vector/>, as a query, in a corresponding interest segmentAttention is paid to calculate the relevance of each object and entity to obtain Duan Naju aggregate vectors/>, of the strong interest segmentsAnd Duan Naju aggregate vector/>, of the weak interest segment
Step B4: according to the user-object interaction graph G interact, using the GCN, vector e u of the user interacted with the target object is aggregated into target vector e t, and the vector of the target object is updatedVector/>, object itemDuan Naju aggregate vector/>, as a query, for each strong interest segmentPerforming attention calculation to obtain a vector u s with strong interest; strong interest vector u s and target item vector/>Stitching is performed as a query, and Duan Naju joint vectors/>, for each weak interest segmentPerforming attention calculation to obtain a vector u w of weak interest; vector/>, object itemAs a query, performing double-layer attention on the vector u unique of the unique interest segment sequence to obtain a vector u unique of the unique interest;
step B5: vector of target articles As a query, performing a double-layer attention mechanism and a multi-head attention mechanism on vectors e k and e c of the potential interest segment sequence to obtain vectors u k and u c of the potential interest vector; performing a contrast mechanism on u k and u c to capture complementary information and distinguishing information between two potential segments of interest; taking two potential interests from the same user as a pair of positive samples, and regarding the potential interests of different users as negative samples to obtain a contrast loss L cl;
Step B6: the vector e u of the user and the vector of the target object obtained in the steps B2, B4 and B5 are calculated The strong interest vector u s, the weak interest vector u w, the potential interest vectors u k and u c and the unique interest vector u unique are spliced together and input into a multi-layer perceptron to perform click rate prediction, the predicted click rate is obtained, and the prediction loss is calculated by using a cross entropy function; then calculating the gradient of each parameter in the deep learning network model by a back propagation method according to the target loss function, and updating each parameter by a random gradient descent method;
Step B7: and when the loss value generated by the deep learning network model is smaller than a set threshold value or the maximum iteration number is reached, training of the deep learning network model G is terminated.
3. The click-through rate prediction method based on knowledge enhancement and interest evolution according to claim 2, wherein the step B1 specifically comprises the following steps:
Step B11: constructing a knowledge graph G kg for all the articles by using the Freebase, if mapping is available, mapping the articles to Freebase entities by title matching, and if the articles and the entities have a certain relation, an edge is arranged between the articles and the entities;
Step B12: according to the interaction between the user and the articles, an article co-occurrence graph G cf is constructed, and if the two articles occur simultaneously in the behavior sequence of the user, an edge is arranged between the two articles;
step B13: based on the interactions between the user and the item, a user-item interaction graph G interact is constructed, with an edge between the user and the item if there is an interaction between the user and the item.
4. The click-through rate prediction method based on knowledge enhancement and interest evolution according to claim 3, wherein the step B2 specifically comprises the following steps:
Step B21: according to the knowledge graph G kg obtained in the step B11, storing the articles with edges of the same entity in the user behavior sequence into an interest segment, sorting the articles in the interest segment according to the interaction time of the user and the articles from small to large, traversing the single user behavior sequence, and obtaining an interest segment sequence comprising a plurality of interest segments
Step B22: segmenting the interest segments obtained in the step B21 according to the number of the articles in the segments, wherein the number of the articles is larger than a threshold value tau and is regarded as a strong interest segment, and the rest is regarded as a weak interest segment; adding a strong interest segment and a weak interest segment to a strong interest segment sequence, respectivelyAnd weak interest segment sequence/>Inputting the vector e s of the strong interest segment sequence and the vector e w of the weak interest segment sequence into an embedding layer; all entities in the knowledge graph are added into an entity sequence S entity and then input into an embedded layer, so as to obtain a vector e e of the entities;
Step B23: finding the shortest path between each article in the sample and the target article according to the knowledge graph G kg obtained in the step B11, adding nodes appearing in the path into a potential interest segment, and adding the potential interest segment into a semantic-based potential interest segment sequence Inputting the vector into an embedding layer to obtain a vector e k of a potential interest segment sequence based on the semantics;
Step B24: according to the item co-occurrence graph G cf obtained in the step B12, finding the shortest path between each item in the sample and the target item, adding nodes appearing in the path into a potential interest segment, and adding the potential interest segment into a potential interest segment sequence based on similarity Inputting the vector into an embedding layer to obtain a vector e c of the potential interest segment sequence based on the similarity;
step B25: according to the knowledge graph G kg obtained in the step B11, ordering all entities from small to large according to the occurrence frequency to obtain an entity sequence S entity, traversing the entity sequence S entity, selecting an interest segment corresponding to the entity to appear in a user behavior sequence, taking the interest segment as a unique interest segment of a user, and adding the unique interest segment into the unique interest segment sequence Until unique segment sequence/>The method comprises the steps of inputting mu interest segments into an embedded layer to obtain a vector e unique of a unique interest segment sequence;
Step B26: the user ID and the target item ID are entered into the embedded layer, resulting in e u and e t.
5. The click-through rate prediction method based on knowledge enhancement and interest evolution according to claim 4, wherein the step B3 specifically comprises the following steps:
Step B31: vector each item within a single segment of high interest and segment of low interest And/>Input into GRU, capture fine granularity interest evolution of user in interest segment, and update the vector of jth article in ith interest segment to/>, respectivelyAnd/>
Step B32: dividing each interest segment according to the relation in the knowledge graph; for each relation, the entity vector corresponding to the interest segmentSequencing according to the first article of the interest segments, inputting the first article into the GRU, capturing coarse-granularity interest evolution of the user among the interest segments, and vector/>, of the ith entityUpdated as/>
Step B33: the step B22 is carried outFor each item vector/>, as a query, in a corresponding interest segmentAttention is paid to calculate the relevance of each object and entity to obtain Duan Naju aggregate vectors/>, of the strong interest segmentsAnd Duan Naju aggregate vector/>, of the weak interest segment
Wherein,And/>Representing the attention coefficients, W 1 and W 2 represent trainable parameters, N s represents the number of items in the strong interest segment, and N w represents the number of items in the weak interest segment.
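A minimal PyTorch-style sketch of the query-guided attention pooling used in step B33 (and reused by the later target-guided aggregations); the bilinear scoring form is an assumption, since the claims only name the attention coefficients and the trainable parameters W_1 and W_2.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionPooling(nn.Module):
    """Aggregate a set of vectors into one vector, weighted by relevance to a query."""
    def __init__(self, dim=64):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)   # trainable parameter (e.g. W_1 or W_2)

    def forward(self, query, keys):                # query: (dim,), keys: (N, dim)
        scores = keys @ self.W(query)              # relevance of each item to the query
        alpha = F.softmax(scores, dim=0)           # attention coefficients
        return (alpha.unsqueeze(-1) * keys).sum(dim=0)   # intra-segment aggregation vector

# Step B33: query = entity vector of the segment, keys = GRU-updated item vectors.
```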
6. The click-through rate prediction method based on knowledge enhancement and interest evolution according to claim 5, wherein the step B4 specifically comprises the following steps:
Step B41: according to the user-object interaction graph G interact obtained in the step B13, using GCN, aggregating the vector e u of the user interacted with the object into the object vector e t to obtain the updated vector of the object
Step B42: the target object vector obtained in the step B41Duan Naju aggregate vectors for each segment of strong interest as a queryAnd (3) performing attention calculation to obtain a vector u s with strong interest:
Wherein, Representing the attention coefficient, W 2 representing the trainable parameter, L s representing the number of segments of strong interest;
Step B43: the strong interest vector u s obtained in the step B42 and the target object vector obtained in the step B41 are combined Stitching is performed as a query, and Duan Naju joint vectors/>, for each weak interest segmentAnd (3) performing attention calculation to obtain a vector u w of weak interest:
Wherein, Representing the attention coefficient, W 4 representing the trainable parameter, L w representing the number of weak segments of interest;
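Steps B42 and B43 reuse the same attention pooling over segment-level aggregation vectors; the sketch below reuses the AttentionPooling class from the sketch after step B33 and assumes a linear projection to bring the concatenated query in B43 back to the embedding dimension (the projection is an assumption, not named in the claims).

```python
import torch
import torch.nn as nn

dim = 64
pool = AttentionPooling(dim)                      # from the sketch after step B33
proj = nn.Linear(2 * dim, dim)                    # assumed projection for the concatenated query

e_t_updated = torch.randn(dim)                    # updated target item vector (step B41)
strong_aggs = torch.randn(5, dim)                 # strong-segment aggregation vectors, L_s = 5
weak_aggs = torch.randn(8, dim)                   # weak-segment aggregation vectors, L_w = 8

u_s = pool(e_t_updated, strong_aggs)              # step B42: strong interest vector
query_w = proj(torch.cat([u_s, e_t_updated]))     # step B43: [u_s ; target item] as query
u_w = pool(query_w, weak_aggs)                    # step B43: weak interest vector
```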
Step B44: the target object vector obtained in the step B41 For each item vector in a unique interest segment as a queryAttention is paid to calculate the relativity of each article and the target article, and the Duan Naju total vector of the unique interest segment is obtained
Wherein,Representing an attention coefficient, W 5 representing a trainable parameter, N unique representing the number of items within a unique segment of interest;
Step B45: the target object vector obtained in the step B41 Duan Naju aggregate vector/>, as a query, for each unique segment of interestPerforming attention calculation to obtain a vector u unique of unique interests:
Wherein, Representing the attention factor, W 6 representing the trainable parameter, L unique representing the number of unique segments of interest.
7. The click-through rate prediction method based on knowledge enhancement and interest evolution according to claim 6, wherein the step B5 specifically comprises the following steps:
Step B51: the target object vector obtained in the step B41 As a query, vector/>, for each semantic-based potential interest segmentPerforming attention calculation to obtain Duan Naju joint vectors/>, based on semantic potential interest segments
Wherein,Representing an attention coefficient, W 7 representing a trainable parameter, N k representing a number of items within the semantic-based potential segment of interest;
Step B52: vector Duan Naju of all semantic-based potential interest segments Splicing to obtain x k, inputting the x k into the multi-head self-attention to obtain a Duan Naju combined vector H k of the thinned semantic-based potential interest segment sequence;
Where head h represents the output of the h attention function, d h represents the dimension of each head, n head represents the number of attention functions, and W 8 represents the trainable parameters;
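A minimal PyTorch-style sketch of the multi-head self-attention refinement in steps B52 and B55, using torch.nn.MultiheadAttention as a stand-in; the exact head arithmetic and projections of the claims are not reproduced.

```python
import torch
import torch.nn as nn

class SegmentRefiner(nn.Module):
    """Refine a sequence of segment aggregation vectors with multi-head self-attention."""
    def __init__(self, dim=64, n_head=4):
        super().__init__()
        self.mha = nn.MultiheadAttention(embed_dim=dim, num_heads=n_head, batch_first=True)

    def forward(self, x):                          # x: (batch, L, dim) stacked segment vectors
        refined, _ = self.mha(x, x, x)             # self-attention: query = key = value = x
        return refined                             # H_k or H_c, one refined vector per segment

# Example usage: refiner = SegmentRefiner(); H_k = refiner(x_k.unsqueeze(0)).squeeze(0)
```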
Step B53: the target object vector obtained in the step B41 As a query, vector/>, for each semantic-based potential interest segmentPerforming attention calculation to obtain a semantic-based potential interest vector u k:
Wherein, Representing the attention coefficient, W 9 representing the trainable parameter, L k representing the number of potential segments of interest based on semantics;
Step B54: take the updated target item vector obtained in step B41 as the query and perform attention over each item vector in a similarity-based potential interest segment to obtain the intra-segment aggregation vector of the similarity-based potential interest segment, where the attention coefficients weight the items, W_10 is a trainable parameter, and N_c is the number of items in a similarity-based potential interest segment;
Step B55: vector Duan Naju of all similarity-based potential interest segments Splicing to obtain x c, inputting the x c into the multi-head self-attention to obtain a Duan Naju combined vector H c of the thinned potential interest segment sequence based on similarity;
Wherein nhea d denotes the output of the h-th attention function, d h denotes the dimension of each head, n head denotes the number of attention functions, and W 11 denotes the trainable parameter;
Step B56: take the updated target item vector obtained in step B41 as the query and perform attention over the vector of each similarity-based potential interest segment to obtain the similarity-based potential interest vector u_c, where the attention coefficients weight the segments, W_12 is a trainable parameter, and L_c is the number of similarity-based potential interest segments;
Step B57: treat the two potential interest vectors of the same user (semantics-based and similarity-based) as a positive pair and the potential interests of different users as negative pairs, and compute the contrastive loss L_cl, where N_u is the number of users and the negatives are the semantics-based potential interests of the remaining users in the same batch.
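A minimal sketch of the in-batch contrastive loss in step B57, written in the InfoNCE style; the temperature, the cosine-similarity normalization, and the cross-entropy formulation are assumptions, since the claims do not give the exact formula.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(u_k, u_c, temperature=0.1):
    """In-batch contrastive loss: for each user, (u_k, u_c) is a positive pair and the
    semantics-based vectors of the other users in the batch serve as negatives."""
    u_k = F.normalize(u_k, dim=-1)                           # (N_u, dim)
    u_c = F.normalize(u_c, dim=-1)                           # (N_u, dim)
    logits = u_c @ u_k.t() / temperature                     # similarity of every (c, k) pair
    labels = torch.arange(u_k.size(0), device=u_k.device)    # positives lie on the diagonal
    return F.cross_entropy(logits, labels)
```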
8. The click-through rate prediction method based on knowledge enhancement and interest evolution according to claim 7, wherein the step B6 specifically comprises the following steps:
Step B61: vector e u of user and vector of target object And the strong interest vector u s, the weak interest vector representation u w, the potential interest vector vectors u k and u c and the unique interest vector u unique obtained in the steps B2, B4 and B5 are spliced together and input into a multi-layer perceptron MLP for click rate prediction to obtain a predicted click rate:
Step B62: calculating a prediction loss by using a cross entropy function L target, updating a learning rate by using a gradient optimization algorithm Adam, and updating model parameters by using back propagation iteration to train a model by using a minimum loss function; the model total loss is the loss weighted addition:
Lall=Ltarget+λLcl
Where N represents the number of samples, y i is the corresponding label, λ is the hyper-parameter, L target is the loss function of click rate prediction, and L cl is the contrast loss function of the potential interest vector.
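A minimal PyTorch-style sketch of steps B61 and B62: an MLP prediction head over the concatenated interest vectors and the weighted total loss; the layer sizes, the sigmoid output, and the use of binary cross-entropy are assumptions, and contrastive_loss refers to the sketch after step B57.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CTRHead(nn.Module):
    """Concatenate all interest vectors and predict the click probability."""
    def __init__(self, dim=64, n_vectors=7):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(n_vectors * dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, e_u, e_t, u_s, u_w, u_k, u_c, u_unique):
        x = torch.cat([e_u, e_t, u_s, u_w, u_k, u_c, u_unique], dim=-1)
        return torch.sigmoid(self.mlp(x)).squeeze(-1)        # predicted click-through rate

def total_loss(y_pred, y_true, u_k, u_c, lam=0.1):
    l_target = F.binary_cross_entropy(y_pred, y_true)        # cross-entropy prediction loss
    l_cl = contrastive_loss(u_k, u_c)                        # from the sketch after step B57
    return l_target + lam * l_cl                             # L_all = L_target + λ·L_cl

# Training step (sketch): optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# loss = total_loss(...); optimizer.zero_grad(); loss.backward(); optimizer.step()
```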
9. A click rate prediction system employing the method of any one of claims 1-8, comprising:
The training set construction module is used for collecting behavior data of a user and constructing a click rate prediction training set;
the model training module is used for training a deep learning network model G based on knowledge enhancement and interest evolution; and
The click rate prediction module is used for receiving user and article data, inputting the user and article data into the trained deep learning network model G, and outputting the click probability of the current user on the target article.