CN116306677A - Emotion analysis method, system and equipment based on enhancement of neural topic model - Google Patents
Emotion analysis method, system and equipment based on enhancement of neural topic model
- Publication number: CN116306677A (application CN202310565799.8A)
- Authority: CN (China)
- Prior art keywords: model, text, representation, sentence, training
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F40/30: Handling natural language data; semantic analysis
- G06F40/284: Handling natural language data; lexical analysis, e.g. tokenisation or collocates
- G06N3/04: Neural networks; architecture, e.g. interconnection topology
- G06N3/08: Neural networks; learning methods
- Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The invention discloses an emotion analysis method, system and equipment based on enhancement of a neural topic model. The method comprises the following steps: S1, performing a data preprocessing operation on the pre-training text, and acquiring the bag-of-words representation of the text and the word vector representations of the words in the text; S2, inputting the bag-of-words representation of the text into a neural topic model to obtain a global topic representation of the text, thereby completing training of the neural topic model; S3, learning the vector representation of each sentence, splicing an attention mechanism onto the Bi-GRU model, and attaching an emotion polarity model, thereby completing training of the Bi-GRU model, the attention mechanism and the emotion polarity model; S4, when emotion analysis is performed on a text to be analyzed, preprocessing the text, obtaining its global topic representation, fusing the global topic representation into the text-level global representation, and then feeding the result into the emotion polarity model to obtain the emotion analysis result.
Description
Technical Field
The invention belongs to the technical field of data mining, and particularly relates to an emotion analysis method, system and equipment based on enhancement of a neural topic model.
Background
With the development of social media, internet users generate a large number of messages every day, such as microblogs, WeChat posts and review comments on shopping platforms, which contain people's opinions on current affairs, evaluations of products, and the like. Because these texts lack structure and are enormous in number, automatically analyzing such subjective text is an important subject in the field of natural language processing, especially emotion analysis and topic mining of subjective text. The objects of emotion analysis are of many types and have complex structures. Analyzing these texts at scale using machine learning techniques is of great significance to fields such as social media and government management.
Potential applications of emotion analysis and topic mining include understanding user behavior and analyzing users' purchasing behavior. Emotion analysis and topic mining of user-generated content face mainly the following difficulties: the data content is sparse, structure is missing, and large amounts of high-quality labeled data are lacking. Current research has shifted from shallow machine learning models to deep learning, building large-scale pre-training models; likewise, topic model mining has developed from machine learning to deep learning. Most existing emotion analysis focuses on local text and lacks global representation learning; the invention therefore provides a solution based on combining a topic model, an emotion analysis model and a deep learning pre-training model.
In view of the foregoing, there is a need for a new text-level emotion analysis method that solves the above problems.
Disclosure of Invention
The invention mainly aims to provide an emotion analysis method, system and equipment based on enhancement of a neural topic model, which help to detect the emotion polarity of text accurately.
In order to achieve the above object, the present invention provides an emotion analysis method based on enhancement of a neural topic model, comprising the following steps:
S1, preprocessing: performing a data preprocessing operation on the pre-training text; respectively acquiring the bag-of-words representation of the preprocessed text and the word vector representations of the words in the text;
S2, training the neural topic model: inputting the bag-of-words representation of the text into the neural topic model to obtain a global topic representation of the text, thereby completing training of the neural topic model for subsequent fusion of topic knowledge;
S3, training the emotion analysis model: inputting each sentence of the word-vector-represented text into the BERT pre-training model and learning the vector representation of each sentence for subsequent learning of the text-level representation;
inputting the vector representation of each sentence into the Bi-GRU model to obtain the features of the sentence;
splicing an attention mechanism onto the Bi-GRU model, and through the attention mechanism feeding the sentence features combined with the global topic representation into the emotion polarity model to obtain an emotion analysis result;
thereby completing training of the Bi-GRU model, the attention mechanism and the emotion polarity model;
S4, when emotion analysis is performed on a text to be analyzed, preprocessing the text to be analyzed, outputting the global topic representation of the text from the neural topic model, obtaining from the Bi-GRU model the text-level global representation fused with the global topic representation, and then feeding it into the emotion polarity model to obtain the emotion analysis result.
Further, the data preprocessing operation includes the steps of:
performing word segmentation and sentence segmentation on the pre-training text, and saving the sentence information and the word information of each sentence; removing invalid information from the text to be processed; and performing normalization processing on the segmented words.
Preferably, the text to be processed is segmented into words and sentences using the Stanford NLP toolkit, and the invalid information includes spaces, punctuation marks, line breaks and the like; the normalization processing includes lemmatization and stemming, which are used to merge words that have identical senses but different forms.
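By way of illustration only, the preprocessing stage (sentence segmentation, word segmentation, removal of invalid information) can be sketched in plain Python. The description names the Stanford NLP toolkit, which is not reproduced here; the regular expressions below are a crude stand-in, and lemmatization and stemming are omitted:

```python
import re

def preprocess(text):
    # Split into sentences on terminal punctuation -- a simple stand-in
    # for the Stanford NLP sentence splitter named in the description.
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    # Word segmentation plus removal of invalid information (spaces,
    # punctuation, line breaks): keep only lowercase alphanumeric tokens.
    return [re.findall(r"[a-z0-9]+", s.lower()) for s in sentences]

print(preprocess("The phone is great! Battery lasts,\nreally long."))
# [['the', 'phone', 'is', 'great'], ['battery', 'lasts', 'really', 'long']]
```

The sentence information and the word information of each sentence are thus both preserved, as the description requires.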
Further, S1 specifically includes the following steps:
representing the normalized words as a discretized bag-of-words representation for learning the topic representation;
acquiring the word vector representation of each word in each sentence, the word vectors of each sentence forming a matrix input for learning the feature representation of the sentence.
Preferably, the bag-of-words model adopts the bag-of-words representation tool of gensim, and the word vector representation calls the Word2Vec representation of gensim.
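For concreteness, the discretized bag-of-words representation that gensim's `Dictionary`/`doc2bow` pair produces can be mimicked in a few lines of plain Python. This mirrors gensim's `(word_id, count)` output format but is not the library itself:

```python
from collections import Counter

def build_vocab(docs):
    # Analogue of gensim's corpora.Dictionary: assign each word an
    # integer id in order of first appearance across the corpus.
    vocab = {}
    for doc in docs:
        for w in doc:
            vocab.setdefault(w, len(vocab))
    return vocab

def doc2bow(doc, vocab):
    # Discretized bag-of-words representation: sorted (word_id, count)
    # pairs, the same shape as gensim's Dictionary.doc2bow output.
    return sorted(Counter(vocab[w] for w in doc if w in vocab).items())

docs = [["good", "battery", "good"], ["bad", "battery"]]
vocab = build_vocab(docs)
print(doc2bow(docs[0], vocab))  # [(0, 2), (1, 1)]
```

The bag-of-words vectors feed the neural topic model, while the per-word vectors (Word2Vec in the description) feed the sentence encoder.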
Further, S2 includes the following steps:
the discretized bag-of-words representation is input into the neural topic model, which encodes it into an intermediate representation, namely the global topic representation, through a VAE model based on a Gaussian distribution, and reconstructs the text, learning topic associations between texts and capturing global information between sentences, thereby completing training of the neural topic model.
Preferably, the bag-of-words-represented text x is input into the neural topic model, and the neural topic model solves the parameters μ and σ of the Gaussian distribution through the following formulas, thereby completing training of the neural topic model:
(μ,logσ)=Encoder(x),ε~N(0,I); (1)
z=μ+σ*ε; (2)
where μ is the mean of the Gaussian distribution, σ is the standard deviation of the Gaussian distribution, ε is a standard normal noise sample, and z is the Gaussian-based intermediate representation, i.e., the global topic representation.
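Equations (1)-(2) are the standard VAE reparameterization trick. A minimal NumPy sketch of the sampling step follows; the encoder network that produces μ and logσ from x, and the decoder that reconstructs the text, are assumed and omitted:

```python
import numpy as np

def reparameterize(mu, log_sigma, rng):
    # z = mu + sigma * eps with eps ~ N(0, I), as in equations (1)-(2).
    # sigma = exp(log_sigma) because the encoder outputs log(sigma),
    # which keeps sigma positive without constraining the network.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(log_sigma) * eps

rng = np.random.default_rng(0)
mu = np.zeros(4)
z = reparameterize(mu, np.full(4, -20.0), rng)  # tiny sigma, so z is close to mu
print(np.allclose(z, mu, atol=1e-6))  # True
```

Sampling through this deterministic function of (μ, logσ, ε) is what lets gradients flow back into the encoder during training.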
Further, the BERT model has a hidden size of 768; the word vector matrix of each sentence is input into the BERT model, and the representation of the first symbol [CLS] in the last hidden layer of the BERT model is taken as the sequence representation s_i of the whole sentence, which is fed into the Bi-GRU model to obtain the features of the sentence.
Further, S3 specifically includes the following steps: the learned sentence sequence s_i is encoded with the Bi-GRU model to obtain h_i for each sentence, capturing the context information of the entire text; h_i is the combination of the forward and backward hidden states.
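A minimal NumPy sketch of the Bi-GRU encoding of the sentence sequence follows. The single-layer formulation, parameter shapes, and use of the final states are simplifying assumptions made for brevity:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(h, x, P):
    # One GRU update: update gate z, reset gate r, candidate state h_tilde.
    z = sigmoid(P["Wz"] @ x + P["Uz"] @ h)
    r = sigmoid(P["Wr"] @ x + P["Ur"] @ h)
    h_tilde = np.tanh(P["Wh"] @ x + P["Uh"] @ (r * h))
    return (1 - z) * h + z * h_tilde

def bigru(X, Pf, Pb, d):
    # Run one GRU forward and one backward over the sequence and
    # concatenate the final hidden states -- the combined forward and
    # backward state h_i of the description.
    hf = np.zeros(d)
    hb = np.zeros(d)
    for x in X:
        hf = gru_step(hf, x, Pf)
    for x in reversed(X):
        hb = gru_step(hb, x, Pb)
    return np.concatenate([hf, hb])

rng = np.random.default_rng(0)
d, k = 3, 5  # hidden size, input (sentence-vector) size
make = lambda: {n: rng.standard_normal((d, k if n[0] == "W" else d))
                for n in ("Wz", "Uz", "Wr", "Ur", "Wh", "Uh")}
X = [rng.standard_normal(k) for _ in range(4)]  # a toy 4-sentence text
h = bigru(X, make(), make(), d)
print(h.shape)  # (6,)
```

In the method itself the inputs X are the BERT-derived sentence vectors s_i rather than random toys.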
h_i and the global topic representation are combined and input into the attention mechanism model to learn a topic-aware semantic representation of the text, which is finally fed into a softmax function to learn the emotion polarity of the text.
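One plausible way to combine the sentence states h_i with the global topic representation is a bilinear attention score; the exact attention formula is not fixed by the description, so the parameterization below is an assumption:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def topic_attention(H, z, W):
    # H: (n, d) matrix of sentence states h_i; z: (k,) global topic
    # representation; W: (d, k) learned bilinear weights (assumed).
    # Score each sentence against the topic, normalize the scores into
    # attention weights, and take the weighted sum of sentence states.
    alpha = softmax(H @ (W @ z))  # (n,) attention weights, sum to 1
    return alpha @ H              # (d,) topic-aware text representation

rng = np.random.default_rng(1)
H = rng.standard_normal((3, 4))  # 3 sentences, 4-dim Bi-GRU states
z = rng.standard_normal(2)       # 2-dim global topic representation
W = rng.standard_normal((4, 2))
v = topic_attention(H, z, W)
print(v.shape)  # (4,)
```

The resulting vector v is the topic-aware text-level representation that the emotion polarity model consumes.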
Further, the emotion polarity model adopts a softmax function, through which the emotion polarity of the text to be processed is predicted.
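The final prediction is then a linear layer followed by softmax. In the sketch below the three-way negative/neutral/positive label set and the toy weights are assumptions; the description only specifies "emotion polarity":

```python
import numpy as np

LABELS = ("negative", "neutral", "positive")  # assumed label set

def predict_polarity(v, W, b):
    # v: topic-aware text representation; W, b: trained softmax-layer
    # parameters. Returns the predicted label and the full distribution.
    logits = v @ W + b
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return LABELS[int(np.argmax(p))], p

v = np.array([1.0, 0.0, 0.0])
W = np.eye(3) * 2.0  # toy weights chosen so class 0 wins for this v
b = np.zeros(3)
label, probs = predict_polarity(v, W, b)
print(label)  # negative
```

During training, W and b are fitted jointly with the Bi-GRU and attention parameters via the cross-entropy loss on labeled texts.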
Based on the same inventive concept, the invention also provides an emotion analysis system based on enhancement of the neural topic model, comprising:
a preprocessing module: performing a data preprocessing operation on the pre-training text or the text to be analyzed; respectively acquiring the bag-of-words representation of the preprocessed text and the word vector representations of the words in the text;
a neural topic model module: inputting the bag-of-words representation of the text into the neural topic model to obtain the global topic representation of the text; the neural topic model is trained on the pre-training text;
a BERT pre-training model module: inputting each sentence of the word-vector-represented text into the BERT pre-training model and learning the vector representation of each sentence;
a Bi-GRU model module: splicing an attention mechanism onto the Bi-GRU model, and inputting the vector representation of each sentence into the Bi-GRU model to obtain the features of the sentence; the Bi-GRU model is trained on the pre-training text;
an emotion polarity model module: combining the sentence features with the global topic representation through the attention mechanism and feeding them into the emotion polarity model, which processes the text-level global representation obtained by the Bi-GRU model module to obtain the emotion analysis result; the attention mechanism and the emotion polarity model are trained on the pre-training text.
Based on the same inventive concept, the invention also provides an emotion analysis device based on enhancement of the neural topic model, comprising at least one computing device, the computing device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program, when loaded into the processor, implements the above emotion analysis method based on enhancement of the neural topic model.
The beneficial effects of the invention are as follows: the method combines the neural topic model and the Bi-GRU model into a semi-supervised method, thereby alleviating the difficulty of data annotation; by exploiting the unsupervised nature of the neural topic model to mine global information, the learning capacity of the emotion analysis model for analyzing emotion polarity is enhanced; the invention has good flexibility and extensibility, and emotion analysis precision can be further improved by tuning the network parameters to suit a particular working environment; experiments show that the model performs better in emotion analysis than state-of-the-art emotion analysis techniques, and results on rich data sets also demonstrate the generality of the proposed emotion analysis method.
Drawings
FIG. 1 is a schematic overall flow chart of an emotion analysis method based on neural topic model enhancement according to an embodiment of the present invention.
FIG. 2 is a model diagram of an emotion analysis method based on neural topic model enhancement in an embodiment of the present invention.
FIG. 3 is a schematic illustration of the neural topic model of an emotion analysis method based on neural topic model enhancement according to an embodiment of the present invention.
Detailed Description
The method can be applied to text emotion analysis tasks such as social media texts and review texts; it segments the text, re-represents it, inputs it into a pre-training language model, and finally outputs the emotion classification and topic of the text. The method can be executed by the pre-training language model and the neural topic model module, which can be implemented in software, and can also be applied to emotion analysis tasks such as dialogue systems. FIG. 1 shows the flow diagram provided by the invention. Taking the specific implementation shown by the model in FIG. 2 as an example, the method comprises the following steps:
firstly, sentence segmentation and word segmentation are carried out on comment texts by using a Stanford NLP tool, and sentence information and word information of each sentence are saved. Meanwhile, the invalid information in the removed text includes punctuation marks and the like. And meanwhile, the words are subjected to standardized processing, including part-of-speech reduction and stem extraction, and the method is mainly used for learning a neural topic model.
Then, the normalized words are represented as a discretized bag-of-words model, for which the bag-of-words representation tool of gensim is selected; meanwhile, the word vector representation of each word in each sentence is obtained, where the word vectors call the Word2Vec representation of gensim. The word vectors of each sentence form a matrix input for the BERT pre-training language model to learn the feature representation of the sentence.
Next, the bag-of-words-represented text x is input into the neural topic model, whose framework is shown in FIG. 3. The neural topic model mainly solves the parameters μ and σ of the Gaussian distribution, with the specific formulas (μ, logσ) = Encoder(x), ε ~ N(0, I); z = μ + σ*ε, where μ is the mean of the Gaussian distribution, σ is its standard deviation, ε is a standard normal noise sample, and z is the Gaussian-based intermediate representation, i.e., the global topic representation. The neural topic model then reconstructs the text as x̂. This completes the training of the neural topic model, which learns the topic representation of the text.
The BERT model contains an encoder of 12 Transformer blocks with 12 self-attention heads and a hidden size of 768. The BERT model accepts input sequences of no more than 512 tokens and outputs semantic representations of the sequences. To obtain sentence features for subsequent text modeling, the representation of the first symbol [CLS] in the last hidden layer of the BERT model is taken as the representation s_i of the whole sequence. As illustrated in FIG. 2, x_i1 to x_iT denote the 1st to T-th words in sentence i, h_i1 to h_iT denote the intermediate representations corresponding to the words x_i1 to x_iT, and s_i denotes the representation of the i-th sentence.
The learned sentence sequence s_i is encoded by the Bi-GRU model to obtain the hidden state h_i of each sentence; h_i is the combination of the forward and backward hidden states and is used to capture the context information of the entire text. Meanwhile, the neural topic model assumes that every document has a K-dimensional topic representation, each topic being a distributed representation; h_i is combined with the global topic representation and input into the attention mechanism, which learns the topic-aware semantic representation of the text. In this way, the parameters of the Bi-GRU and the attention mechanism are trained cooperatively with the pre-trained topic representation, using the vector representations learned by BERT. Finally, a softmax function is attached to predict the final emotion classification, completing the parameter training of the softmax function.
When emotion analysis is performed on a text to be analyzed, the learned neural topic model and emotion analysis model can be used to analyze the emotion polarity of each text and, at the same time, obtain the topic representation of the text. The text to be analyzed is preprocessed, i.e., the review text is segmented into sentences and words, invalid information is removed, and the words are normalized, after which the bag-of-words representation and the word vector representations of the words in the text are obtained; then the global topic representation of the text is output from the neural topic model, the sentence states h_i are obtained from the Bi-GRU model, and the text-level global representation fusing h_i with the global topic representation is fed into the emotion polarity model to obtain the emotion analysis result.
The method combines the neural topic model and the Bi-GRU model into a semi-supervised method, thereby alleviating the problem of sparse text content; combining the neural topic model, the pre-training model and the attention mechanism enhances the feature learning capability of the model and benefits both emotion polarity analysis and topic mining.
The invention has good flexibility and extensibility, and higher analysis precision or mining quality can be obtained in different working environments by tuning the network parameters to suit a particular working environment.
Results on a large number of public data sets demonstrate the generality of the proposed emotion analysis method.
The above embodiments are only for illustrating the technical solution of the present invention and not for limiting it; although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications and equivalents may be made without departing from the spirit and scope of the technical solution of the present invention.
Claims (10)
1. A method of emotion analysis based on neural topic model enhancement, the method comprising the steps of:
S1, preprocessing: performing a data preprocessing operation on the pre-training text; respectively acquiring the bag-of-words representation of the preprocessed text and the word vector representations of the words in the text;
S2, training the neural topic model: inputting the bag-of-words representation of the text into the neural topic model to obtain a global topic representation of the text, thereby completing training of the neural topic model;
S3, training the emotion analysis model: inputting each sentence of the word-vector-represented text into the BERT pre-training model and learning the vector representation of each sentence for subsequent learning of the text-level representation;
inputting the vector representation of each sentence into the Bi-GRU model to obtain the features of the sentence;
splicing an attention mechanism onto the Bi-GRU model, and through the attention mechanism feeding the sentence features combined with the global topic representation into the emotion polarity model to obtain an emotion analysis result;
thereby completing training of the Bi-GRU model, the attention mechanism and the emotion polarity model;
S4, when emotion analysis is performed on a text to be analyzed, preprocessing the text to be analyzed, outputting the global topic representation of the text from the neural topic model, obtaining from the Bi-GRU model the text-level global representation fused with the global topic representation, and then feeding it into the emotion polarity model to obtain the emotion analysis result.
2. The emotion analysis method based on enhancement of a neural topic model according to claim 1, wherein the data preprocessing operation includes the steps of:
performing word segmentation and sentence segmentation on the pre-training text, and saving the sentence information and the word information of each sentence; removing invalid information from the text to be processed; and performing normalization processing on the segmented words.
3. The emotion analysis method based on enhancement of a neural topic model according to claim 2, characterized in that:
the text to be processed is segmented into words and sentences using the Stanford NLP toolkit; the invalid information includes spaces, punctuation marks and line breaks; the normalization processing includes lemmatization and stemming.
4. A neural topic model-based enhanced emotion analysis method as claimed in claim 2 or 3, characterized in that: s1 specifically comprises the following steps:
representing the normalized words as a discretized bag-of-words representation;
acquiring the word vector representation of each word in each sentence, the word vectors of each sentence forming a matrix input.
5. The neural topic model-based enhanced emotion analysis method of claim 4, wherein:
the bag-of-words model selects the bag-of-words representation tool of gensim; the word vector representation calls the Word2Vec representation of gensim.
6. The neural topic model-based enhanced emotion analysis method of claim 1, wherein: s2 comprises the following steps:
the discretized bag-of-words representation is input into the neural topic model, which encodes it into an intermediate representation, namely the global topic representation, through a VAE model based on a Gaussian distribution, and reconstructs the text, thereby completing training of the neural topic model.
7. The neural topic model-based enhanced emotion analysis method of claim 6, wherein:
inputting the bag-of-words-represented text x into the neural topic model, the neural topic model solving the parameters μ and σ of the Gaussian distribution through the following formulas, thereby completing training of the neural topic model:
(μ,logσ)=Encoder(x),ε~N(0,I); (1)
z=μ+σ*ε; (2)
where μ is the mean of the Gaussian distribution, σ is the standard deviation of the Gaussian distribution, ε is a standard normal noise sample, and z is the Gaussian-based intermediate representation, i.e., the global topic representation.
8. The neural topic model-based enhanced emotion analysis method of claim 4, wherein:
the BERT model has a hidden size of 768; the word vector matrix of each sentence is input into the BERT model, and the representation of the first symbol [CLS] in the last hidden layer of the BERT model is taken as the sequence representation s_i of the whole sentence, which is fed into the Bi-GRU model to obtain the features of the sentence;
the learned sentence sequence s_i is encoded with the Bi-GRU model to obtain h_i for each sentence, capturing the context information of the entire text; h_i is the combination of the forward and backward hidden states;
the emotion polarity model adopts a softmax function, through which the emotion polarity of the text to be processed is predicted.
9. An emotion analysis system based on neural topic model enhancement, comprising:
a preprocessing module: performing a data preprocessing operation on the pre-training text or the text to be analyzed; respectively acquiring the bag-of-words representation of the preprocessed text and the word vector representations of the words in the text;
a neural topic model module: inputting the bag-of-words representation of the text into the neural topic model to obtain the global topic representation of the text; the neural topic model is trained on the pre-training text;
a BERT pre-training model module: inputting each sentence of the word-vector-represented text into the BERT pre-training model and learning the vector representation of each sentence;
a Bi-GRU model module: splicing an attention mechanism onto the Bi-GRU model, and inputting the vector representation of each sentence into the Bi-GRU model to obtain the features of the sentence; the Bi-GRU model is trained on the pre-training text;
an emotion polarity model module: combining the sentence features with the global topic representation through the attention mechanism and feeding them into the emotion polarity model, which processes the text-level global representation obtained by the Bi-GRU model module to obtain the emotion analysis result; the attention mechanism and the emotion polarity model are trained on the pre-training text.
10. An emotion analysis device based on neural topic model enhancement, characterized in that: it comprises at least one computing device, the computing device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program, when loaded into the processor, implements the emotion analysis method based on neural topic model enhancement according to any of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310565799.8A CN116306677B (en) | 2023-05-19 | 2023-05-19 | Emotion analysis method, system and equipment based on enhancement of neural topic model |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310565799.8A CN116306677B (en) | 2023-05-19 | 2023-05-19 | Emotion analysis method, system and equipment based on enhancement of neural topic model |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116306677A true CN116306677A (en) | 2023-06-23 |
CN116306677B CN116306677B (en) | 2024-01-26 |
Family
ID=86801747
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310565799.8A Active CN116306677B (en) | 2023-05-19 | 2023-05-19 | Emotion analysis method, system and equipment based on enhancement of neural topic model |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116306677B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110362817A (en) * | 2019-06-04 | 2019-10-22 | 中国科学院信息工程研究所 | A kind of viewpoint proneness analysis method and system towards product attribute |
CN114186062A (en) * | 2021-12-13 | 2022-03-15 | 安徽大学 | Text classification method based on graph neural network topic model |
CN116108840A (en) * | 2023-02-16 | 2023-05-12 | 北京工业大学 | Text fine granularity emotion analysis method, system, medium and computing device |
- 2023-05-19: CN application CN202310565799.8A granted as patent CN116306677B (active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110362817A (en) * | 2019-06-04 | 2019-10-22 | 中国科学院信息工程研究所 | A kind of viewpoint proneness analysis method and system towards product attribute |
CN114186062A (en) * | 2021-12-13 | 2022-03-15 | 安徽大学 | Text classification method based on graph neural network topic model |
CN116108840A (en) * | 2023-02-16 | 2023-05-12 | 北京工业大学 | Text fine granularity emotion analysis method, system, medium and computing device |
Also Published As
Publication number | Publication date |
---|---|
CN116306677B (en) | 2024-01-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |