CN109492229A - Cross-domain sentiment classification method and related apparatus - Google Patents
Cross-domain sentiment classification method and related apparatus
- Publication number: CN109492229A
- Application number: CN201811406037.9A
- Authority
- CN
- China
- Prior art keywords
- comment text
- text data
- hidden state
- word sequence
- vector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/30—Semantic analysis
Abstract
The present invention provides a cross-domain sentiment classification method and a related apparatus. The method includes: obtaining target review text data and extracting the Aspect information in the target review text data; obtaining a review word sequence and an Aspect word sequence from the target review text data and the Aspect information; and inputting the review word sequence and the Aspect word sequence into a pre-trained model. The model is built with an interactive attention network from review text data and the Aspect information corresponding to the review text data, and is trained on both labelled and unlabelled review text data, so that it has both a domain classification capability and a sentiment classification capability. The result output by the model, indicating the sentiment category of the target review text data, is then obtained. The present invention improves the accuracy of cross-domain sentiment classification.
Description
Technical field
The present invention relates to the technical fields of machine learning and text data mining, and in particular to a cross-domain sentiment classification method and a related apparatus.
Background art
Sentiment analysis, or opinion mining, is the mining and assessment of the opinions, emotions, and attitudes that people express towards entities such as products, services, and organizations. The rapid growth of the field has been driven by social media on the web, such as product reviews and short text descriptions. In recent years, sentiment analysis has become one of the most active research areas in natural language processing, with extensive work in data mining, Web mining, text mining, and information retrieval.
Taking products as an example, review information about a product appears in forums on the web, but most reviews on the Internet are unlabelled, which means traditional supervised machine learning methods cannot be applied well. To resolve this contradiction between abundant data and scarce labels, cross-domain sentiment classification has been proposed and widely studied.
Cross-domain sentiment classification mainly addresses the case where one domain (the target domain) lacks labelled data: a domain with sufficient label information (the source domain) is introduced to train a model, so that, through the learning and transfer of knowledge between domains, the unlabelled data in the target domain can be effectively classified as expressing positive or negative sentiment.
Currently, cross-domain sentiment classification methods mainly fall into the following two categories:
1) Manual extraction and analysis of shared features based on traditional machine learning.
Work based on traditional machine learning aims to mine the relationship between domains and to formalize this relationship as shared features. To make them more interpretable in the text domain, researchers usually treat these shared features as the sentiment vocabulary common to the domains. Analysis in previous work has demonstrated that some knowledge shared between different domains can indeed help improve cross-domain sentiment transfer.
2) Automatic identification, extraction, and analysis of shared features based on neural networks.
Neural cross-domain sentiment classification combines the common feature extraction of traditional methods with the power of deep learning, using different neural network structures (such as memory networks and adversarial networks) to automatically extract the shared features between domains. These methods not only extract the shared features between different domains more comprehensively, but also enhance the interpretability of sentiment transfer between domains.
However, the applicant of the present invention has found that, although both kinds of methods can effectively extract the shared features between different domains and complete cross-domain sentiment transfer, neither fully takes into account the aspect features of a text. These aspect features have a considerable impact on the result of sentiment classification, and ignoring them can make the classification result inaccurate. For example, a review of a product may contain descriptions of many aspects of the product, and different aspects necessarily influence the overall sentiment differently; if the features of these different aspects are given identical weights, the accuracy of judging the sentiment orientation contained in the review is greatly reduced. Similarly, across different domains there are bound to be similar pieces of aspect information (for example, reviews of different categories of goods may all comment on "appearance"), and this aspect information also has different impacts in different domains.
Summary of the invention
In view of this, the present invention provides a cross-domain sentiment classification method and a related apparatus, to improve the accuracy of cross-domain sentiment classification. The technical solution is as follows:
According to one aspect of the present invention, a cross-domain sentiment classification method is provided, comprising:
obtaining target review text data, and extracting the aspect (Aspect) information in the target review text data, wherein the target review text data are unlabelled;
obtaining, from the target review text data and the Aspect information, the review word sequence and the Aspect word sequence corresponding to the target review text data;
inputting the review word sequence and the Aspect word sequence into a pre-trained model, where the model is built with an interactive attention network from review text data and the Aspect information corresponding to the review text data, and is trained on both labelled and unlabelled review text data, so that it has both a domain classification capability and a sentiment classification capability; and
obtaining the result output by the model indicating the sentiment category of the target review text data.
Optionally, the model is trained as follows:
obtaining multiple pieces of review text data from a source domain and a target domain respectively, using the obtained review text data as sample review text data to be trained on, and extracting the Aspect information in each piece of review text data, where the Aspect information corresponds one-to-one with the review text data; some review text data in the source domain are labelled, some are unlabelled, and the review text data in the target domain are unlabelled;
creating a training task from the source domain to the target domain, the training task comprising target sample review text data, which include the labelled sample review text data in the source domain, the unlabelled sample review text data in the source domain, and the unlabelled sample review text data in the target domain;
obtaining, with a word embedding method, the semantic vector representation of the review text corresponding to the target sample review text data and the vector representation of the Aspect word sequence;
feeding the semantic vector representation of the review text through a Bi-LSTM model to obtain the hidden-state representation of the review text, and feeding the vector representation of the Aspect word sequence through the Bi-LSTM model to obtain the hidden-state representation of the Aspect word sequence;
pooling the hidden-state representation of the review text and the hidden-state representation of the Aspect word sequence respectively, to obtain the pooled hidden-state vector of the review text and the pooled hidden-state vector of the Aspect word sequence;
obtaining the final representation of the target sample review text data from the hidden-state representation of the review text and the pooled hidden-state vector of the Aspect word sequence;
obtaining the final representation of the Aspect information from the hidden-state representation of the Aspect word sequence and the pooled hidden-state vector of the review text;
training domain classification on the final representation S_r of the target sample review text data with the formula y'_d = softmax(W_d·G(S_r) + b_d), wherein G(x) = x in the forward pass and ∂G/∂x = −λI in the backward pass; and
training sentiment classification on the final representation A_r of the Aspect information with the formula y'_s = softmax(W_s·A_r + b_s).
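The pairing of G(x) = x in the forward pass with a negated derivative in the backward pass is a gradient-reversal layer. A minimal pure-Python sketch of its behaviour follows; it is a standard DANN-style layer assumed to match the patent's G, since the original formula images did not survive extraction:

```python
def grl_forward(x):
    """Gradient Reversal Layer, forward pass: G(x) = x (identity)."""
    return x

def grl_backward(grad, lam=1.0):
    """Backward pass: the gradient from the domain classifier is
    multiplied by -lambda, pushing the shared features to become
    domain-indistinguishable (assumed DANN-style behaviour)."""
    return [-lam * g for g in grad]

features = [0.2, -0.5]
out = grl_forward(features)               # unchanged on the way forward
rev = grl_backward([1.0, -2.0], lam=0.5)  # flipped and scaled on the way back
```

Because the forward pass is the identity, the domain classifier sees the features unchanged, while the reversed gradient trains the feature extractor adversarially against it.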
Optionally, feeding the semantic vector representation of the review text through the Bi-LSTM model to obtain the hidden-state representation of the review text, and feeding the vector representation of the Aspect word sequence through the Bi-LSTM model to obtain the hidden-state representation of the Aspect word sequence, comprises:
taking the semantic vector representation of the review text as the input of the Bi-LSTM model, and iteratively updating the vector unit sequence c = {c_1, c_2, c_3, ..., c_n} and the hidden state h = {h_1, h_2, h_3, ..., h_n}, where t is any number from 1 to n and n is a positive integer, the iterative update being:
i_t = σ(W_i·[h_{t-1}; v_t] + b_i)
f_t = σ(W_f·[h_{t-1}; v_t] + b_f)
o_t = σ(W_o·[h_{t-1}; v_t] + b_o)
c_t = f_t·c_{t-1} + i_t·tanh(W_c·[h_{t-1}; v_t] + b_c)
h_t = o_t·tanh(c_t)
where i_t, f_t, and o_t are the input gate, forget gate, and output gate in the t-th iteration, v_t is the semantic vector representation of the review text, c_t is the memory cell, h_t is the final state output, i.e. the hidden-state representation, σ(·) and tanh(·) are activation functions, and W_* and b_* are the weight matrices and bias terms respectively.
Optionally, the hidden-state representation of the review text is h^c = {h_1^c, ..., h_n^c}, the hidden-state representation of the Aspect word sequence is h^a = {h_1^a, ..., h_m^a}, the pooled hidden-state vector of the review text is P_c, the pooled hidden-state vector of the Aspect word sequence is P_a, and n and m are positive integers;
obtaining the final representation of the target sample review text data from the hidden-state representation of the review text and the pooled hidden-state vector of the Aspect word sequence comprises:
concatenating the hidden-state representation h^c of the review text with the pooled hidden-state vector P_a of the Aspect word sequence to obtain a new representation vector h' = {[h_1^c; P_a], ..., [h_n^c; P_a]};
computing the weight of each hidden vector with the formula α_i = exp(γ(h_i^c, P_a)) / Σ_{j=1..n} exp(γ(h_j^c, P_a)), where γ is the attention score function; and
multiplying each hidden vector by its weight and accumulating the products, to obtain the final representation S_r of the target sample review text data, where S_r = Σ_{i=1..n} α_i·h_i^c.
Optionally, obtaining the final representation of the Aspect information from the hidden-state representation of the Aspect word sequence and the pooled hidden-state vector of the review text comprises:
computing the attention weight of each Aspect hidden vector with the formula β_i = exp(γ(h_i^a, P_c)) / Σ_{j=1..m} exp(γ(h_j^a, P_c)); and
multiplying each Aspect hidden vector by its weight and accumulating the products, to obtain the final representation A_r of the Aspect information, where A_r = Σ_{i=1..m} β_i·h_i^a.
Optionally, the training method of the model further comprises:
training y'_s and y'_d with the following cross-entropy losses: L_sen = −(1/N_l)·Σ y_s·log y'_s and L_dom = −(1/N_d)·Σ y_d·log y'_d, where y_s and y_d are the true sentiment and domain labels respectively, N_l is the number of labelled review text data in the source domain, and N_d is the number of all review text data in the source domain and the target domain; and
judging whether the value of L meets a preset condition, where L = L_sen + L_dom + ρL_reg, ρ is a regularization parameter, and L_reg is a regularization term;
when the value of L meets the preset condition, the classification accuracy of the model is high.
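The combined objective L = L_sen + L_dom + ρL_reg can be sketched in pure Python as follows. The exact regularization term is not given in the extracted text, so an L2 penalty on the weights is assumed here for illustration:

```python
import math

def cross_entropy(y_true, y_pred):
    """-sum_k y_k * log(y'_k) for one example with a one-hot true label."""
    return -sum(t * math.log(p) for t, p in zip(y_true, y_pred))

def total_loss(sen_pairs, dom_pairs, weights, rho=0.01):
    """L = L_sen + L_dom + rho * L_reg: average sentiment loss over the
    labelled source examples, average domain loss over all examples, and
    an L2 penalty standing in for L_reg (assumed form)."""
    l_sen = sum(cross_entropy(t, p) for t, p in sen_pairs) / len(sen_pairs)
    l_dom = sum(cross_entropy(t, p) for t, p in dom_pairs) / len(dom_pairs)
    l_reg = sum(w * w for w in weights)
    return l_sen + l_dom + rho * l_reg

# One labelled sentiment example and one domain example, both maximally
# uncertain predictions, no regularization:
loss = total_loss([([1.0, 0.0], [0.5, 0.5])],
                  [([0.0, 1.0], [0.5, 0.5])],
                  weights=[1.0], rho=0.0)
```

With both predictions at (0.5, 0.5), each cross-entropy term equals ln 2, so the total is 2·ln 2.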
Optionally, the training method of the model further comprises:
selecting unlabelled test review text data, and extracting test Aspect information from the test review text data;
inputting the test review text data and the test Aspect information into the model;
obtaining the classification result output by the model; and
verifying the accuracy of the model according to the classification result output by the model.
Optionally, the source domain comprises the Amazon platform, and the target domain comprises a crowdfunding platform;
obtaining multiple pieces of review text data from the source domain comprises: obtaining the review text data of different goods from the Amazon platform; and
obtaining multiple pieces of review text data from the target domain comprises: extracting the review text data of projects from the published project data on the crowdfunding platform.
According to another aspect of the present invention, a cross-domain sentiment classification apparatus is provided, comprising:
a first obtaining unit, configured to obtain target review text data and extract the aspect (Aspect) information in the target review text data, wherein the target review text data are unlabelled;
a second obtaining unit, configured to obtain, from the target review text data and the Aspect information, the review word sequence and the Aspect word sequence corresponding to the target review text data;
a first input unit, configured to input the review word sequence and the Aspect word sequence into a pre-trained model, where the model is built with an interactive attention network from review text data and the Aspect information corresponding to the review text data, and is trained on both labelled and unlabelled review text data, so that it has both a domain classification capability and a sentiment classification capability; and
a third obtaining unit, configured to obtain the result output by the model indicating the sentiment category of the target review text data.
According to yet another aspect of the present invention, a model training apparatus is provided, comprising:
a fourth obtaining unit, configured to obtain multiple pieces of review text data from a source domain and a target domain respectively, use the obtained review text data as sample review text data to be trained on, and extract the Aspect information in each piece of review text data, where the Aspect information corresponds one-to-one with the review text data; some review text data in the source domain are labelled, some are unlabelled, and the review text data in the target domain are unlabelled;
a training task creating unit, configured to create a training task from the source domain to the target domain, the training task comprising target sample review text data, which include the labelled sample review text data in the source domain, the unlabelled sample review text data in the source domain, and the unlabelled sample review text data in the target domain;
a fifth obtaining unit, configured to obtain, with a word embedding method, the semantic vector representation of the review text corresponding to the target sample review text data and the vector representation of the Aspect word sequence;
a learning unit, configured to feed the semantic vector representation of the review text through a Bi-LSTM model to obtain the hidden-state representation of the review text, and to feed the vector representation of the Aspect word sequence through the Bi-LSTM model to obtain the hidden-state representation of the Aspect word sequence;
a pooling unit, configured to pool the hidden-state representation of the review text and the hidden-state representation of the Aspect word sequence respectively, to obtain the pooled hidden-state vector of the review text and the pooled hidden-state vector of the Aspect word sequence;
a sixth obtaining unit, configured to obtain the final representation of the target sample review text data from the hidden-state representation of the review text and the pooled hidden-state vector of the Aspect word sequence, and to obtain the final representation of the Aspect information from the hidden-state representation of the Aspect word sequence and the pooled hidden-state vector of the review text; and
a training unit, configured to train domain classification on the final representation S_r of the target sample review text data with the formula y'_d = softmax(W_d·G(S_r) + b_d), wherein G(x) = x in the forward pass and ∂G/∂x = −λI in the backward pass, and to train sentiment classification on the final representation A_r of the Aspect information with the formula y'_s = softmax(W_s·A_r + b_s).
In the cross-domain sentiment classification method and related apparatus provided by the present invention, the review word sequence corresponding to the obtained target review text data, together with the Aspect word sequence corresponding to the aspect (Aspect) information extracted from the target review text data, is input into a pre-trained model, and the result output by the model, indicating the sentiment category of the target review text data, is obtained directly, thereby realizing the sentiment classification of the target review text data.
The model in the present invention is built with an interactive attention network from review text data and the Aspect information corresponding to the review text data, and is trained on both labelled and unlabelled review text data; it has both a domain classification capability (judging whether the target review text data come from the source domain or the target domain) and a sentiment classification capability (judging whether the sentiment orientation of the target review text data is positive or negative). By classifying the domain, the model in the present invention can learn the similar shared features between different domains, and then use the learnt shared features to carry out cross-domain sentiment transfer and make classification predictions on the unlabelled data of the target domain, greatly improving the accuracy of cross-domain sentiment classification compared with the prior art.
Brief description of the drawings
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from the provided drawings without creative effort.
Fig. 1 is a flowchart of a cross-domain sentiment classification method provided by the present invention;
Fig. 2 is a flowchart of a model training method provided by the present invention;
Fig. 3 is a structural schematic diagram of a cross-domain sentiment classification apparatus provided by the present invention;
Fig. 4 is a structural schematic diagram of a model training apparatus provided by the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
As shown in Fig. 1, the present invention provides a cross-domain sentiment classification method, which may include:
Step 101: obtain target review text data, and extract the Aspect information in the target review text data, wherein the target review text data are unlabelled.
Specifically, the present invention may obtain the target review text data from, for example, a crowdfunding platform; the target review text data may specifically be the review information of published projects on the crowdfunding platform and/or the review information of new projects not yet released on the crowdfunding platform.
For the obtained target review text data, the present invention further extracts the Aspect information from them.
Step 102: obtain, from the target review text data and the Aspect information, the review word sequence and the Aspect word sequence corresponding to the target review text data.
The obtained target review text data and Aspect information are converted into vector-representation form. Specifically, the present invention can represent the target review text data and the Aspect information in a unified mathematical form: the target review text data are represented as a review word sequence, and the Aspect information is represented as an Aspect word sequence.
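As an illustration, the conversion of a review and its extracted Aspect phrase into the two word sequences can be sketched as follows. A simple whitespace tokenizer is an assumption here; the patent does not specify a tokenization scheme:

```python
def to_word_sequences(review_text, aspect_text):
    """Turn a review and its extracted Aspect phrase into word sequences.

    Lower-cased whitespace splitting stands in for whatever tokenization
    the trained model actually uses (an assumption for illustration).
    """
    comment_words = review_text.lower().split()  # s = {w1, ..., wn}
    aspect_words = aspect_text.lower().split()   # a = {w1, ..., wm}
    return comment_words, aspect_words

comment_seq, aspect_seq = to_word_sequences(
    "The battery life of this camera is great", "battery life")
```

Both sequences are later mapped to vectors, so they must use one consistent vocabulary.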
Step 103: input the review word sequence and the Aspect word sequence into the pre-trained model.
Step 104: obtain the result output by the model indicating the sentiment category of the target review text data.
The review word sequence corresponding to the obtained target review text data, and the Aspect word sequence corresponding to the Aspect information extracted from the target review text data, are input into the pre-trained model; by means of the model, the result indicating the sentiment category of the target review text data is obtained directly, realizing the sentiment classification of the target review text data.
The model provided by the present invention is built with an interactive attention network from (multi-domain) review text data and the Aspect information corresponding to the review text data, and is trained on both labelled and unlabelled review text data; it has both a domain classification capability and a sentiment classification capability. In practical application, the model uses the attention mechanism combined with a transfer neural network to obtain the shared features between different domains, and then uses the learnt shared features to carry out cross-domain sentiment transfer and make classification predictions on the unlabelled data of the target domain, greatly improving the accuracy of the cross-domain sentiment classification result compared with the prior art.
The model in the present invention is trained with the method shown in Fig. 2. Specifically, the model training method provided by the present invention may include:
Step 201: obtain multiple pieces of review text data from a source domain and a target domain respectively, use the obtained review text data as sample review text data to be trained on, and extract the Aspect information in each piece of review text data, where the Aspect information corresponds one-to-one with the review text data; some review text data in the source domain are labelled, some are unlabelled, and the review text data in the target domain are unlabelled.
In the present invention, the source domain refers to a domain with sufficient label information; the source domain contains a large amount of review text data, some of which are labelled and some of which are unlabelled. Specifically, the source domain is, for example, the Amazon platform, from which the present invention can obtain the review text data of different goods.
In the present invention, the target domain refers to a domain lacking label information; the review text data in the target domain are unlabelled. Specifically, the target domain is, for example, a crowdfunding platform, from whose published project data the present invention can extract the review text data of projects.
In the present invention, all the review text data obtained from the source domain and the target domain are processed as follows: the obtained review text data are used as the sample review text data to be trained on, the Aspect information in each piece of review text data is extracted, and the extracted Aspect information is guaranteed to correspond one-to-one with the review text data.
Step 202: create a training task from the source domain to the target domain, the training task comprising target sample review text data, which include the labelled sample review text data in the source domain, the unlabelled sample review text data in the source domain, and the unlabelled sample review text data in the target domain.
Specifically, the review text data obtained from the Amazon platform can be divided into different domains (such as the Book, DVD, Electronics, and Kitchen domains) according to the category IDs on the Amazon platform, while the review text data of the projects extracted from the crowdfunding platform form a separate domain.
First, the present invention creates a training task from the source domain to the target domain; the training task includes target sample review text data, which comprise the labelled sample review text data in the source domain, the unlabelled sample review text data in the source domain, and the unlabelled sample review text data X_t in the target domain.
For each piece of target sample review text data, assume it contains n words, so that the target sample review text data are denoted s = {w_1, w_2, w_3, ..., w_n}; the corresponding Aspect word sequence contains m words and is denoted a = {w_1, w_2, w_3, ..., w_m}.
Step 203: using a word embedding method, obtain the semantic vector representation of the review text corresponding to the target sample review text data and the vector representation of the Aspect word sequence.
For each created training task from the source domain to the target domain, the present invention can use a word embedding method to convert the target sample review text data in each training task into vector representations. The word embedding method is, for example, the neural word embedding method word2vec.
Specifically, in the present invention, the target sample review text data in each training task and the corresponding Aspect word sequence are represented in a unified mathematical form: the target sample review text data are represented as s = {w_1, w_2, w_3, ..., w_n}, and the Aspect word sequence as a = {w_1, w_2, w_3, ..., w_m}. Applying the neural word embedding method to s = {w_1, w_2, w_3, ..., w_n} yields the semantic vector representation of the review text corresponding to the target sample review text data; similarly, applying the neural word embedding method to a = {w_1, w_2, w_3, ..., w_m} yields the vector representation of the Aspect word sequence corresponding to the target sample review text data.
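The embedding step itself is a per-word table lookup. The toy table below uses random vectors purely to illustrate the lookup; in the patent a trained method such as word2vec supplies the actual vectors:

```python
import random

def build_embeddings(vocab, dim=4, seed=0):
    """Toy word-embedding table mapping each word to a fixed dense
    vector. Random values stand in for trained word2vec embeddings
    (an assumption for illustration only)."""
    rng = random.Random(seed)
    return {w: [rng.uniform(-1, 1) for _ in range(dim)] for w in vocab}

def embed(words, table):
    """s = {w1, ..., wn} -> sequence of vectors: one lookup per word."""
    return [table[w] for w in words]

table = build_embeddings(["battery", "life", "great"])
vectors = embed(["battery", "life"], table)  # Aspect word sequence a
```

The same table is used for the review word sequence and the Aspect word sequence, so both live in the same vector space before entering the Bi-LSTM.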
Step 204: feed the semantic vector representation of the review text through the Bi-LSTM model to obtain the hidden-state representation of the review text, and feed the vector representation of the Aspect word sequence through the Bi-LSTM model to obtain the hidden-state representation of the Aspect word sequence.
After obtaining the semantic vector representation of the review text corresponding to the target sample review text data and the vector representation of the Aspect word sequence, the present invention uses a Bi-LSTM (Bidirectional Long Short-Term Memory) model to learn the rich semantic information contained in the words and the representation of their hidden-state vectors.
Specifically, the semantic vector representation of the review text is taken as the input of the Bi-LSTM model, and the vector unit sequence c = {c_1, c_2, c_3, ..., c_n} and the hidden state h = {h_1, h_2, h_3, ..., h_n} are iteratively updated, where t is any number from 1 to n and n is a positive integer.
In the iterative update, the hidden-state vector h_t in each iteration is jointly determined by the previous hidden-state vector h_{t-1} and the current vector unit c_t. The specific update is as follows:
i_t = σ(W_i·[h_{t-1}; v_t] + b_i)
f_t = σ(W_f·[h_{t-1}; v_t] + b_f)
o_t = σ(W_o·[h_{t-1}; v_t] + b_o)
c_t = f_t·c_{t-1} + i_t·tanh(W_c·[h_{t-1}; v_t] + b_c)
h_t = o_t·tanh(c_t)
where i_t, f_t, and o_t are the input gate, forget gate, and output gate in the t-th iteration, v_t is the semantic vector representation of the review text, c_t is the memory cell, h_t is the final state output, i.e. the hidden-state representation, sigmoid(·) and tanh(·) are activation functions used to enhance the expressiveness of the model, and W_* and b_* are the weight matrices and bias terms respectively.
The hidden-state representation h^c = {h_1^c, ..., h_n^c} of the review text is thereby computed.
Analogously, the vector representation of the Aspect word sequence is taken as the input of the Bi-LSTM model, the vector unit sequence c = {c_1, c_2, c_3, ..., c_m} and the hidden state h = {h_1, h_2, h_3, ..., h_m} are iteratively updated, and the hidden-state representation h^a = {h_1^a, ..., h_m^a} of the Aspect word sequence is computed.
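One step of the recurrence above can be sketched in pure Python. Scalars replace the weight matrices for readability, and adding h_{t-1} to v_t stands in for the concatenation [h_{t-1}; v_t]; these simplifications are assumptions for illustration only:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(v_t, h_prev, c_prev, W, b):
    """One LSTM cell update (one direction of the Bi-LSTM), following
    the gate equations above. W and b hold one weight/bias per gate."""
    z = h_prev + v_t                        # simplified mix of h_{t-1} and v_t
    i_t = sigmoid(W["i"] * z + b["i"])      # input gate
    f_t = sigmoid(W["f"] * z + b["f"])      # forget gate
    o_t = sigmoid(W["o"] * z + b["o"])      # output gate
    c_hat = math.tanh(W["c"] * z + b["c"])  # candidate memory
    c_t = f_t * c_prev + i_t * c_hat        # memory cell update
    h_t = o_t * math.tanh(c_t)              # hidden state h_t
    return h_t, c_t

W = {k: 0.5 for k in "ifoc"}
b = {k: 0.0 for k in "ifoc"}
h1, c1 = lstm_step(1.0, 0.0, 0.0, W, b)  # first step from zero state
```

A bidirectional model runs this recurrence once left-to-right and once right-to-left and concatenates the two hidden states per position.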
Step 205: pool the hidden-state representation of the review text and the hidden-state representation of the Aspect word sequence respectively, to obtain the pooled hidden-state vector of the review text and the pooled hidden-state vector of the Aspect word sequence.
After obtaining the hidden-state representation h^c = {h_1^c, ..., h_n^c} of the review text and the hidden-state representation h^a = {h_1^a, ..., h_m^a} of the Aspect word sequence, and in order to further let the Aspect information and the review text information interact, the hidden-state vectors output by the Bi-LSTM model (i.e. h^c and h^a) must first be pooled. Pooling is a nonlinear down-sampling method that reduces the size of the feature representation while retaining the important information.
Specifically, the pooled feature vectors can be obtained by the following Mean-Pooling method: P_c = (1/n)·Σ_{i=1..n} h_i^c and P_a = (1/m)·Σ_{i=1..m} h_i^a, where n and m are positive integers, P_c is the pooled hidden-state vector of the review text, and P_a is the pooled hidden-state vector of the Aspect word sequence.
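The Mean-Pooling step reduces a whole sequence of hidden states to a single vector by averaging component-wise; a minimal sketch:

```python
def mean_pool(hidden_states):
    """Mean-Pooling over a sequence of hidden-state vectors:
    P = (1/n) * sum_t h_t, one averaged vector per sequence."""
    n = len(hidden_states)
    dim = len(hidden_states[0])
    return [sum(h[j] for h in hidden_states) / n for j in range(dim)]

# Two 2-dimensional hidden states -> one pooled 2-dimensional vector
p_c = mean_pool([[1.0, 2.0], [3.0, 4.0]])
```

The same function produces P_c from h^c and P_a from h^a; only the input sequence differs.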
Step 206: obtain the final representation of the target sample comment text data from the hidden-state representation of the comment text and the pooled vector representation of the Aspect word sequence.
In practical applications, when a user judges the sentiment of a comment text, different words contribute differently to the overall sentiment. For example, if sentiment-bearing words such as "good" or "bad" appear in a comment text, their influence on the target is clearly higher than that of ordinary words. Likewise, the influence of the same word may differ across product domains. For example, in the electronics domain the influence of the Aspect "appearance" is far smaller than its influence in the clothing domain. To address this, the invention uses an attention mechanism to assign different weights to the different words in the comment text, and additionally adds an interaction mechanism in the attention layer, so that the Aspect information and the comment-text information are exploited more fully and effectively.
Specifically, the invention concatenates each hidden-state vector h_i^c of the comment text with the pooled vector h̄^a of the Aspect word sequence, obtaining a new characterization vector, and then computes the attention weight α_i of each hidden vector, for example as
α_i = exp(γ(h_i^c, h̄^a)) / Σ_{j=1}^{n} exp(γ(h_j^c, h̄^a)),
where γ(·) is a score function such as γ(h_i^c, h̄^a) = tanh(W·[h_i^c; h̄^a] + b). Finally, the invention multiplies each hidden vector by its attention weight and accumulates the results, obtaining the final representation of the target sample comment text data, S_r = Σ_{i=1}^{n} α_i·h_i^c.
It is worth noting that α_i greatly improves the interpretability of the method: α_i helps to extract the semantically rich words that play an important role in sentiment classification, and thus aids the prediction in sentiment transfer.
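The attention step above can be sketched in NumPy as follows; the score weights W_att and all dimensions are toy assumptions for illustration, not the patent's actual parameters:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(2)
n, d = 5, 4
h_comment = rng.normal(size=(n, d))   # hidden states of the comment words
pooled_aspect = rng.normal(size=d)    # pooled Aspect vector

# Concatenate each comment hidden state with the pooled Aspect vector,
# score it with a small tanh layer, and normalise with softmax.
W_att = rng.normal(size=2 * d)        # illustrative score weights
scores = np.array([np.tanh(W_att @ np.concatenate([h, pooled_aspect]))
                   for h in h_comment])
alpha = softmax(scores)               # attention weights, sum to 1

# Final comment representation S_r: weighted sum of the hidden states.
S_r = (alpha[:, None] * h_comment).sum(axis=0)
```

Inspecting `alpha` directly is what gives the interpretability mentioned above: the largest weights mark the words the model treats as most sentiment-bearing.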
Step 207: obtain the final representation of the Aspect information from the hidden-state representation of the Aspect word sequence and the pooled vector representation of the comment text.
For the Aspect, the invention computes the attention weight β_i of each Aspect word, for example as
β_i = exp(γ(h_i^a, h̄^c)) / Σ_{j=1}^{m} exp(γ(h_j^a, h̄^c)),
and multiplies each Aspect hidden vector by its attention weight and accumulates the results, obtaining the final representation of the Aspect information, A_r = Σ_{i=1}^{m} β_i·h_i^a.
It is worth noting that, when generating β_i, the invention does not simply concatenate the pooled vector representation of the comment text with the hidden-state representation of the Aspect word sequence; instead, it takes the inner product of the pooled comment vector with each vector in the hidden-state representation of the Aspect word sequence, i.e. γ(h_i^a, h̄^c) = h_i^a·h̄^c. The invention thereby makes full use of both the Aspect information and the text itself, yielding a higher sentiment classification accuracy.
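A sketch of this inner-product attention in NumPy (toy dimensions and random vectors; only the scoring rule differs from the comment-side attention):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(3)
m, d = 3, 4
h_aspect = rng.normal(size=(m, d))    # hidden states of the Aspect words
pooled_comment = rng.normal(size=d)   # pooled comment vector

# Score each Aspect hidden state by its inner product with the pooled
# comment vector (no concatenation here), then normalise with softmax.
beta = softmax(h_aspect @ pooled_comment)

# Final Aspect representation A_r: weighted sum of the Aspect states.
A_r = (beta[:, None] * h_aspect).sum(axis=0)
```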
At this point, the invention has obtained the final representation S_r of the target sample comment text data (i.e., the comment text supplemented by the Aspect information) and the final representation A_r of the Aspect information (i.e., the Aspect supplemented by the comment text). Next, the invention trains the classifiers.
To make the trained model more transferable, the invention involves two tasks in one model: a domain classification task and a sentiment classification task. The domain classification task judges which domain a comment text belongs to, and the sentiment classification task judges its sentiment polarity. In terms of data, the invention uses the source-domain data X_s together with all the target-domain data X_t for the domain classification training, and uses the labelled source-domain data for the sentiment classification training.
Step 208: train the domain classifier on the final representation of the target sample comment text data.
Specifically, for the domain classification training, the invention uses a gradient reversal layer (GRL) so that the two tasks can be trained jointly, while also ensuring that the domain classifier learns the features shared between the two domains. The GRL acts as the identity in the forward pass, G(x) = x, and multiplies the gradient by -λ in the backward pass, dG/dx = -λI. The invention passes S_r through the gradient reversal layer to obtain G(S_r), and then uses G(S_r) as the input of a softmax(·) layer to obtain the domain label prediction, for example y'_d = softmax(W_d·G(S_r) + b_d).
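The gradient reversal layer can be sketched as a pair of functions; this is a minimal illustration of the identity-forward / negated-backward behaviour (in a real framework it would be a custom autograd op), with `lam` as the assumed reversal strength:

```python
import numpy as np

def grl_forward(x):
    """Forward pass: identity, G(x) = x. Features reach the domain
    classifier unchanged."""
    return x

def grl_backward(grad_output, lam=1.0):
    """Backward pass: the gradient is multiplied by -lambda, so the
    feature extractor is pushed towards domain-invariant features
    while the domain classifier itself trains normally."""
    return -lam * grad_output

x = np.array([0.5, -1.2, 3.0])
y = grl_forward(x)                             # unchanged features
g = grl_backward(np.ones_like(x), lam=0.1)     # reversed gradient
```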
Step 209: train the sentiment classifier on the final representation of the Aspect information.
For the sentiment classification training, the invention combines the final representation S_r of the target sample comment text data with the final representation A_r of the Aspect information, and realises the sentiment prediction jointly through a softmax(·) layer, for example y'_s = softmax(W_s·[S_r; A_r] + b_s). In this way the Aspect information, which is usually under-exploited in the comment, can be put to better use, improving the accuracy of sentiment classification.
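A sketch of this joint softmax prediction in NumPy (the classifier weights W_s, b_s and all dimensions are illustrative assumptions):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(4)
d, n_classes = 4, 2
S_r = rng.normal(size=d)   # final comment representation
A_r = rng.normal(size=d)   # final Aspect representation

# Sentiment prediction from the concatenated representation [S_r; A_r]
# through one softmax layer.
W_s = rng.normal(size=(n_classes, 2 * d))
b_s = np.zeros(n_classes)
y_hat = softmax(W_s @ np.concatenate([S_r, A_r]) + b_s)
```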
Further, considering that the model of the invention is multi-task in nature, the invention trains it by a combination of independent training and joint training. Specifically, in the independent training of the attention model, the domain classifier first needs to learn the features shared between the different domains, while the sentiment classifier needs to learn the features that matter most to sentiment classification. The invention trains them independently by minimising the following loss functions:
L_sen = -(1/N_s)·Σ y_s·log y'_s, L_dom = -(1/N_d)·Σ y_d·log y'_d,
where y_s and y_d are the true sentiment and domain labels respectively, N_s is the number of labelled comment text data in the source domain, and N_d is the number of all comment text data in the source and target domains.
Then, through joint training, the model simultaneously learns the features that are both shared between the two domains and important to sentiment classification. Here the invention also introduces a regularization term to prevent the model from overfitting during training:
L = L_sen + L_dom + ρ·L_reg,
where ρ is the regularization parameter and L_reg is the regularization term. L measures the difference between the true labels and the predicted labels.
The invention thus judges the classification accuracy of the model by checking whether the value of L satisfies a preset condition, for example whether the value of L is below a preset threshold. Specifically, when the minimised loss L satisfies the preset condition, the model has reached a good convergence.
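The combined objective can be computed as sketched below; the predicted distributions, labels and parameter vector are toy assumptions, and L_reg is taken as an L2 penalty for illustration:

```python
import numpy as np

def cross_entropy(y_true, y_pred):
    """Mean negative log-likelihood of the true class."""
    eps = 1e-12
    return -np.mean(np.log(y_pred[np.arange(len(y_true)), y_true] + eps))

rng = np.random.default_rng(5)
# Toy predicted distributions and true labels (illustrative only).
p_sen = np.array([[0.8, 0.2], [0.3, 0.7]]); y_sen = np.array([0, 1])
p_dom = np.array([[0.6, 0.4], [0.6, 0.4], [0.2, 0.8]]); y_dom = np.array([0, 0, 1])
params = rng.normal(size=10)           # stand-in for model weights

L_sen = cross_entropy(y_sen, p_sen)    # sentiment loss (labelled source data)
L_dom = cross_entropy(y_dom, p_dom)    # domain loss (source + target data)
L_reg = np.sum(params ** 2)            # L2 regulariser (assumed form)
rho = 0.01
L = L_sen + L_dom + rho * L_reg        # total loss to minimise
```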
As a preferred embodiment, after the model is trained, the invention may first test the trained model. Specifically, the invention may select test comment text data without labels and extract test Aspect information from the test comment text data; then input the test comment text data and the test Aspect information into the model; obtain the classification result output by the model; and verify the accuracy of the model according to the classification result output by the model.
That is, the invention verifies the sentiment classification accuracy of the model with unlabelled comment text data from certain target domains. Preferably, the invention may select the comment text data of projects on the external crowdfunding platform Indiegogo (these comment text data carry no labels) to verify the sentiment classification accuracy of the model.
The model of the invention, based on (multi-domain) comment text data and Aspect information data, learns the features shared between different domains by combining an attention mechanism with a transfer neural network, and then uses these features to perform cross-domain sentiment transfer, making classification predictions on the unlabelled data in the target domain. Compared with the prior art, this greatly improves the accuracy of cross-domain sentiment classification results.
Based on the cross-domain sentiment classification method provided above, the invention also provides a cross-domain sentiment classification device. As shown in figure 3, the device comprises:
a first acquisition unit 10 for obtaining target comment text data and extracting the Aspect information in the target comment text data, wherein the target comment text data carry no labels;
a second acquisition unit 20 for obtaining, according to the target comment text data and the Aspect information, the comment word sequence and the Aspect word sequence corresponding to the target comment text data;
a first input unit 30 for inputting the comment word sequence and the Aspect word sequence into a pre-trained model, the model being obtained by modelling comment text data and Aspect information corresponding to the comment text data with an interactive attention network, and then training on comment text data with labels and comment text data without labels, the model having both a domain classification capability and a sentiment classification capability; and
a third acquisition unit 40 for obtaining the result output by the model that indicates the sentiment category of the target comment text data.
And as shown in figure 4, the invention also provides a model training apparatus, comprising:
a fourth acquisition unit 100 for obtaining multiple comment text data from a source domain and a target domain respectively, taking the acquired comment text data as the sample comment text data to be trained, and extracting the Aspect information in each comment text data, wherein the Aspect information corresponds one-to-one with the comment text data, part of the comment text data in the source domain carry labels and part carry no labels, and the comment text data in the target domain carry no labels;
a training-task creating unit 200 for creating a training task from the source domain to the target domain, the training task comprising target sample comment text data, which include the labelled sample comment text data in the source domain, the unlabelled sample comment text data in the source domain, and the unlabelled sample comment text data in the target domain;
a fifth acquisition unit 300 for obtaining, using a word embedding method, the semantic vector representation of the comment text and the vector representation of the Aspect word sequence corresponding to the target sample comment text data;
a learning unit 400 for passing the semantic vector representation of the comment text through a Bi-LSTM model to obtain the hidden-state representation of the comment text, and passing the vector representation of the Aspect word sequence through the Bi-LSTM model to obtain the hidden-state representation of the Aspect word sequence;
a pooling unit 500 for pooling the hidden-state representation of the comment text and the hidden-state representation of the Aspect word sequence respectively, to obtain the pooled vector representation of the comment text and the pooled vector representation of the Aspect word sequence;
a sixth acquisition unit 600 for obtaining the final representation of the target sample comment text data from the hidden-state representation of the comment text and the pooled vector representation of the Aspect word sequence, and obtaining the final representation of the Aspect information from the hidden-state representation of the Aspect word sequence and the pooled vector representation of the comment text; and
a training unit 700 for training the domain classifier on the final representation of the target sample comment text data, using the gradient reversal layer G(x) = x, and training the sentiment classifier on the final representation of the Aspect information.
It should be noted that the embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to each other. The device embodiments, being basically similar to the method embodiments, are described relatively briefly; for the relevant details, see the description of the method embodiments.
Finally, it should also be noted that, herein, relational terms such as "first" and "second" are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or further includes elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or device that includes the element.
The cross-domain sentiment classification method and related devices provided by this application have been described in detail above. Specific examples have been used herein to explain the principles and implementations of this application, and the description of the above embodiments is only intended to help understand the methods of this application and their core ideas. Meanwhile, for those of ordinary skill in the art, there will be changes in the specific implementation and the scope of application in accordance with the ideas of this application. In summary, the contents of this specification should not be construed as limiting this application.
Claims (10)
1. A cross-domain sentiment classification method, characterized by comprising:
obtaining target comment text data, and extracting the Aspect information in the target comment text data, wherein the target comment text data carry no labels;
obtaining, according to the target comment text data and the Aspect information, the comment word sequence and the Aspect word sequence corresponding to the target comment text data;
inputting the comment word sequence and the Aspect word sequence into a pre-trained model, the model being obtained by modelling comment text data and Aspect information corresponding to the comment text data with an interactive attention network, and by training on comment text data with labels and comment text data without labels, the model having a domain classification capability and a sentiment classification capability; and
obtaining the result output by the model that indicates the sentiment category of the target comment text data.
2. The method according to claim 1, characterized in that the model is trained as follows:
obtaining multiple comment text data from a source domain and a target domain respectively, taking the acquired comment text data as the sample comment text data to be trained, and extracting the Aspect information in each comment text data, wherein the Aspect information corresponds one-to-one with the comment text data, part of the comment text data in the source domain carry labels and part carry no labels, and the comment text data in the target domain carry no labels;
creating a training task from the source domain to the target domain, the training task comprising target sample comment text data, which include the labelled sample comment text data in the source domain, the unlabelled sample comment text data in the source domain, and the unlabelled sample comment text data in the target domain;
obtaining, using a word embedding method, the semantic vector representation of the comment text and the vector representation of the Aspect word sequence corresponding to the target sample comment text data;
passing the semantic vector representation of the comment text through a Bi-LSTM model to obtain the hidden-state representation of the comment text, and passing the vector representation of the Aspect word sequence through the Bi-LSTM model to obtain the hidden-state representation of the Aspect word sequence;
pooling the hidden-state representation of the comment text and the hidden-state representation of the Aspect word sequence respectively, to obtain the pooled vector representation of the comment text and the pooled vector representation of the Aspect word sequence;
obtaining the final representation of the target sample comment text data from the hidden-state representation of the comment text and the pooled vector representation of the Aspect word sequence;
obtaining the final representation of the Aspect information from the hidden-state representation of the Aspect word sequence and the pooled vector representation of the comment text;
training the domain classifier on the final representation of the target sample comment text data, using the gradient reversal layer G(x) = x; and
training the sentiment classifier on the final representation of the Aspect information.
3. The method according to claim 2, characterized in that said passing the semantic vector representation of the comment text through the Bi-LSTM model to obtain the hidden-state representation of the comment text, and passing the vector representation of the Aspect word sequence through the Bi-LSTM model to obtain the hidden-state representation of the Aspect word sequence, comprises:
taking the semantic vector representation of the comment text as the input of the Bi-LSTM model, and iteratively updating the vector unit sequence c = {c_1, c_2, c_3, ..., c_n} and the hidden states h = {h_1, h_2, h_3, ..., h_n}, wherein t is any number from 1 to n, n is a positive integer, and the iterative update comprises:
i_t = sigmoid(W_i·[h_{t-1}; v_t] + b_i)
f_t = sigmoid(W_f·[h_{t-1}; v_t] + b_f)
o_t = sigmoid(W_o·[h_{t-1}; v_t] + b_o)
c_t = f_t·c_{t-1} + i_t·tanh(W_c·[h_{t-1}; v_t] + b_c)
h_t = o_t·tanh(c_t)
wherein i_t, f_t and o_t are the input gate, forget gate and output gate of the t-th iteration respectively, v_t is the semantic vector representation of the comment text at step t, c_t is the memory cell, h_t is the final state output, i.e. the hidden-state representation, tanh(·) is an activation function, and W_* and b_* are the weight matrices and bias terms respectively.
4. The method according to claim 2, characterized in that:
the hidden-state representation of the comment text is denoted h^c = {h_1^c, ..., h_n^c}, the hidden-state representation of the Aspect word sequence is denoted h^a = {h_1^a, ..., h_m^a}, the pooled vector representation of the comment text is denoted h̄^c, and the pooled vector representation of the Aspect word sequence is denoted h̄^a, where n and m are positive integers; and
obtaining the final representation of the target sample comment text data from the hidden-state representation of the comment text and the pooled vector representation of the Aspect word sequence comprises:
concatenating the hidden-state representation of the comment text with the pooled vector representation of the Aspect word sequence to obtain a new characterization vector;
computing the attention weight α_i of each hidden vector; and
multiplying each hidden vector by its attention weight and accumulating the results to obtain the final representation S_r of the target sample comment text data, S_r = Σ_i α_i·h_i^c.
5. The method according to claim 4, characterized in that obtaining the final representation of the Aspect information from the hidden-state representation of the Aspect word sequence and the pooled vector representation of the comment text comprises:
computing the attention weight β_i of each Aspect word; and
multiplying each Aspect hidden vector by its attention weight and accumulating the results to obtain the final representation A_r of the Aspect information, A_r = Σ_i β_i·h_i^a.
6. The method according to claim 2, characterized in that the training of the model further comprises:
training y'_s and y'_d respectively with their loss functions, wherein y_s and y_d are the true sentiment and domain labels respectively, the sentiment loss is computed over the labelled comment text data in the source domain, and N_d is the number of all comment text data in the source and target domains;
judging whether the value of L satisfies a preset condition, wherein L = L_sen + L_dom + ρ·L_reg, ρ is a regularization parameter and L_reg is a regularization term; and
when the value of L satisfies the preset condition, concluding that the classification accuracy of the model is high.
7. The method according to any one of claims 2-6, characterized in that the training of the model further comprises:
selecting test comment text data without labels, and extracting test Aspect information from the test comment text data;
inputting the test comment text data and the test Aspect information into the model;
obtaining the classification result output by the model; and
verifying the accuracy of the model according to the classification result output by the model.
8. The method according to any one of claims 2-6, characterized in that the source domain comprises the Amazon platform and the target domain comprises a crowdfunding platform;
obtaining multiple comment text data from the source domain comprises obtaining the comment text data of different commodities from the Amazon platform; and
obtaining multiple comment text data from the target domain comprises extracting project comment text data from the published project data on the crowdfunding platform.
9. A cross-domain sentiment classification device, characterized by comprising:
a first acquisition unit for obtaining target comment text data and extracting the Aspect information in the target comment text data, wherein the target comment text data carry no labels;
a second acquisition unit for obtaining, according to the target comment text data and the Aspect information, the comment word sequence and the Aspect word sequence corresponding to the target comment text data;
a first input unit for inputting the comment word sequence and the Aspect word sequence into a pre-trained model, the model being obtained by modelling comment text data and Aspect information corresponding to the comment text data with an interactive attention network, and then training on comment text data with labels and comment text data without labels, the model having a domain classification capability and a sentiment classification capability; and
a third acquisition unit for obtaining the result output by the model that indicates the sentiment category of the target comment text data.
10. A model training apparatus, characterized by comprising:
a fourth acquisition unit for obtaining multiple comment text data from a source domain and a target domain respectively, taking the acquired comment text data as the sample comment text data to be trained, and extracting the Aspect information in each comment text data, wherein the Aspect information corresponds one-to-one with the comment text data, part of the comment text data in the source domain carry labels and part carry no labels, and the comment text data in the target domain carry no labels;
a training-task creating unit for creating a training task from the source domain to the target domain, the training task comprising target sample comment text data, which include the labelled sample comment text data in the source domain, the unlabelled sample comment text data in the source domain, and the unlabelled sample comment text data in the target domain;
a fifth acquisition unit for obtaining, using a word embedding method, the semantic vector representation of the comment text and the vector representation of the Aspect word sequence corresponding to the target sample comment text data;
a learning unit for passing the semantic vector representation of the comment text through a Bi-LSTM model to obtain the hidden-state representation of the comment text, and passing the vector representation of the Aspect word sequence through the Bi-LSTM model to obtain the hidden-state representation of the Aspect word sequence;
a pooling unit for pooling the hidden-state representation of the comment text and the hidden-state representation of the Aspect word sequence respectively, to obtain the pooled vector representation of the comment text and the pooled vector representation of the Aspect word sequence;
a sixth acquisition unit for obtaining the final representation of the target sample comment text data from the hidden-state representation of the comment text and the pooled vector representation of the Aspect word sequence, and obtaining the final representation of the Aspect information from the hidden-state representation of the Aspect word sequence and the pooled vector representation of the comment text; and
a training unit for training the domain classifier on the final representation of the target sample comment text data, using the gradient reversal layer G(x) = x, and training the sentiment classifier on the final representation of the Aspect information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811406037.9A CN109492229B (en) | 2018-11-23 | 2018-11-23 | Cross-domain emotion classification method and related device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109492229A true CN109492229A (en) | 2019-03-19 |
CN109492229B CN109492229B (en) | 2020-10-27 |
Family
ID=65697705
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811406037.9A Active CN109492229B (en) | 2018-11-23 | 2018-11-23 | Cross-domain emotion classification method and related device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109492229B (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110245299A (en) * | 2019-06-19 | 2019-09-17 | 中国人民解放军国防科技大学 | Sequence recommendation method and system based on dynamic interaction attention mechanism |
CN110674849A (en) * | 2019-09-02 | 2020-01-10 | 昆明理工大学 | Cross-domain emotion classification method based on multi-source domain integrated migration |
CN110688832A (en) * | 2019-10-10 | 2020-01-14 | 河北省讯飞人工智能研究院 | Comment generation method, device, equipment and storage medium |
CN110704622A (en) * | 2019-09-27 | 2020-01-17 | 北京明略软件***有限公司 | Text emotion classification method and device and electronic equipment |
CN111134666A (en) * | 2020-01-09 | 2020-05-12 | 中国科学院软件研究所 | Emotion recognition method of multi-channel electroencephalogram data and electronic device |
CN111159400A (en) * | 2019-12-19 | 2020-05-15 | 苏州大学 | Product comment emotion classification method and system |
CN111339752A (en) * | 2020-02-18 | 2020-06-26 | 哈尔滨工业大学 | Evaluation object-oriented emotion analysis method for multi-task joint learning |
CN111428039A (en) * | 2020-03-31 | 2020-07-17 | 中国科学技术大学 | Cross-domain emotion classification method and system of aspect level |
WO2021109671A1 (en) * | 2019-12-02 | 2021-06-10 | 广州大学 | Fine-granularity sentiment analysis method supporting cross-language transfer |
CN113536080A (en) * | 2021-07-20 | 2021-10-22 | 湖南快乐阳光互动娱乐传媒有限公司 | Data uploading method and device and electronic equipment |
CN113806545A (en) * | 2021-09-24 | 2021-12-17 | 重庆理工大学 | Comment text emotion classification method based on label description generation |
CN114116959A (en) * | 2021-10-21 | 2022-03-01 | 吉林大学 | Method and device for analyzing aspect level emotion and terminal |
CN117112757A (en) * | 2023-08-23 | 2023-11-24 | 人民网股份有限公司 | Comment generation method and device based on text data |
CN114116959B (en) * | 2021-10-21 | 2024-07-16 | 吉林大学 | Aspect-level emotion analysis method and device and terminal |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105930411A (en) * | 2016-04-18 | 2016-09-07 | 苏州大学 | Classifier training method, classifier and sentiment classification system |
US9690772B2 (en) * | 2014-12-15 | 2017-06-27 | Xerox Corporation | Category and term polarity mutual annotation for aspect-based sentiment analysis |
US20170193397A1 (en) * | 2015-12-30 | 2017-07-06 | Accenture Global Solutions Limited | Real time organization pulse gathering and analysis using machine learning and artificial intelligence |
CN107092596A (en) * | 2017-04-24 | 2017-08-25 | 重庆邮电大学 | Text emotion analysis method based on attention CNNs and CCR |
CN108363753A (en) * | 2018-01-30 | 2018-08-03 | 南京邮电大学 | Comment text sentiment classification model is trained and sensibility classification method, device and equipment |
Non-Patent Citations (5)
Title |
---|
KAI ZHANG等: "Interactive Attention Transfer Network for Cross-domain Sentiment Classification", 《THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE》 * |
SUSANTI GOJALI等: "Aspect Based Sentiment Analysis for Review Rating Prediction", 《2016 INTERNATIONAL CONFERENCE ON ADVANCED INFORMATICS: CONCEPTS, THEORY AND APPLICATION (ICAICTA)》 * |
WANG XIAOYU: "Research and Implementation of Tag Extraction for Online Comments", 《China Master's Theses Full-text Database, Information Science and Technology》 *
HU CHAOJU ET AL.: "Aspect-Specific Sentiment Analysis Based on Deep-Attention LSTM", 《Application Research of Computers》 *
CHEN LONG ET AL.: "Progress in Sentiment Classification Research", 《Journal of Computer Research and Development》 *
CN117112757A (en) * | 2023-08-23 | 2023-11-24 | 人民网股份有限公司 | Comment generation method and device based on text data |
CN117112757B (en) * | 2023-08-23 | 2024-03-08 | 人民网股份有限公司 | Comment generation method and device based on text data |
Also Published As
Publication number | Publication date |
---|---|
CN109492229B (en) | 2020-10-27 |
Similar Documents
Publication | Title |
---|---|
CN109492229A (en) | Cross-domain sentiment classification method and related apparatus |
Hitron et al. | Can children understand machine learning concepts? The effect of uncovering black boxes | |
Dieffenbacher | Fashion thinking: Creative approaches to the design process | |
Durupinar et al. | Perform: Perceptual approach for adding ocean personality to human motion using laban movement analysis | |
CN110674410B (en) | User portrait construction and content recommendation method, device and equipment | |
Landriscina | Simulation and learning | |
Rybicki et al. | Computational stylistics and text analysis | |
Chan | Style and creativity in design | |
Oliveira et al. | Co-PoeTryMe: interactive poetry generation | |
Lin et al. | Usability of affective interfaces for a digital arts tutoring system | |
Hong et al. | Tower of babel: A crowdsourcing game building sentiment lexicons for resource-scarce languages | |
Nguyen et al. | Seagull: A bird’s-eye view of the evolution of technical games research | |
Stemle et al. | Using language learner data for metaphor detection | |
Cobos et al. | Moods in MOOCs: Analyzing emotions in the content of online courses with edX-CAS | |
Chai et al. | DWES: a dynamic weighted evaluation system for scratch based on computational thinking | |
Jiménez et al. | Sentiment Analysis of Student Surveys--A Case Study on Assessing the Impact of the COVID-19 Pandemic on Higher Education Teaching. | |
Blunden | The sweet spot? Writing for a reading age of 12 | |
Jormanainen | Supporting teachers in unpredictable robotics learning environments | |
Alharbi et al. | Data-Driven analysis of engagement in gamified learning environments: A methodology for real-time measurement of MOOCs | |
Hadiana | Kansei based interface design analysis of open source e-Learning system for high education | |
Shim et al. | Multi-Converging Educational Program for Design with the usage of 3D Printer: Targeted for Middle School Students | |
Moutinho et al. | Innovative research methodologies in management | |
Viliunas et al. | Shape-finding in biophilic architecture: application of AI-based tool | |
Doja | Recommender system for personalized adaptive e-learning platforms to enhance learning capabilities of learners based on their learning style and knowledge level | |
Sterman | Process-Sensitive Creativity Support Tools |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||